WorldWideScience

Sample records for curve based additive

  1. Utilization of curve offsets in additive manufacturing

    Science.gov (United States)

    Haseltalab, Vahid; Yaman, Ulas; Dolen, Melik

    2018-05-01

    Curve offsets are utilized in different fields of engineering and science. Additive manufacturing, which has lately become an explicit requirement in the manufacturing industry, utilizes curve offsets widely. One of the necessities of offsetting is scaling, which is required if there is shrinkage after fabrication or if the surface quality of the resulting part is unacceptable, so some post-processing is indispensable. The major application of curve offsets in additive manufacturing processes, however, is the generation of head trajectories. In a point-wise AM process, a correct tool-path in each layer can reduce costs considerably and increase the surface quality of the fabricated parts. In this study, different curve offset generation algorithms are analyzed to show their capabilities and disadvantages through some test cases, and improvements on their drawbacks are suggested.
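
    As an informal illustration of the head-trajectory idea above, the sketch below generates successive inward offsets of one layer contour; the shapely library, the square contour and the bead width are assumptions for the example and are not taken from the paper.

```python
# Minimal sketch (not from the paper): inward offsets of a closed contour
# for contour-parallel tool-path generation, using the shapely library.
from shapely.geometry import Polygon

# Hypothetical square layer contour, 20 x 20 mm
layer = Polygon([(0, 0), (20, 0), (20, 20), (0, 20)])

bead_width = 0.8  # assumed extrusion width in mm
offsets = []
current = layer
while True:
    current = current.buffer(-bead_width, join_style=2)  # mitred inward offset
    if current.is_empty:
        break
    offsets.append(current)

for i, ring in enumerate(offsets):
    print(f"offset {i}: area = {ring.area:.2f} mm^2")
```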

  2. Quantum field theories on algebraic curves. I. Additive bosons

    International Nuclear Information System (INIS)

    Takhtajan, Leon A

    2013-01-01

    Using Serre's adelic interpretation of cohomology, we develop a 'differential and integral calculus' on an algebraic curve X over an algebraically closed field k of constants of characteristic zero, define algebraic analogues of additive multi-valued functions on X and prove the corresponding generalized residue theorem. Using the representation theory of the global Heisenberg algebra and lattice Lie algebra, we formulate quantum field theories of additive and charged bosons on an algebraic curve X. These theories are naturally connected with the algebraic de Rham theorem. We prove that an extension of global symmetries (Witten's additive Ward identities) from the k-vector space of rational functions on X to the vector space of additive multi-valued functions uniquely determines these quantum theories of additive and charged bosons.

  3. ECM using Edwards curves

    DEFF Research Database (Denmark)

    Bernstein, Daniel J.; Birkner, Peter; Lange, Tanja

    2013-01-01

    -arithmetic level are as follows: (1) use Edwards curves instead of Montgomery curves; (2) use extended Edwards coordinates; (3) use signed-sliding-window addition-subtraction chains; (4) batch primes to increase the window size; (5) choose curves with small parameters and base points; (6) choose curves with large...

  4. Projection-based curve clustering

    International Nuclear Information System (INIS)

    Auder, Benjamin; Fischer, Aurelie

    2012-01-01

    This paper focuses on unsupervised curve classification in the context of nuclear industry. At the Commissariat a l'Energie Atomique (CEA), Cadarache (France), the thermal-hydraulic computer code CATHARE is used to study the reliability of reactor vessels. The code inputs are physical parameters and the outputs are time evolution curves of a few other physical quantities. As the CATHARE code is quite complex and CPU time-consuming, it has to be approximated by a regression model. This regression process involves a clustering step. In the present paper, the CATHARE output curves are clustered using a k-means scheme, with a projection onto a lower dimensional space. We study the properties of the empirically optimal cluster centres found by the clustering method based on projections, compared with the 'true' ones. The choice of the projection basis is discussed, and an algorithm is implemented to select the best projection basis among a library of orthonormal bases. The approach is illustrated on a simulated example and then applied to the industrial problem. (authors)
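
    A minimal sketch of the projection-plus-k-means idea described above, using a synthetic set of curves, a fixed cosine basis and scikit-learn's k-means; the CATHARE data and the search over a library of orthonormal bases are not reproduced.

```python
# Minimal sketch (assumptions: synthetic curves, one fixed Fourier-type basis,
# and scikit-learn's k-means stand in for the CATHARE outputs and basis search).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)                      # common time grid
curves = np.array([np.sin(2 * np.pi * (1 + k % 3) * t) + 0.1 * rng.standard_normal(t.size)
                   for k in range(90)])             # 90 synthetic output curves

# Project each curve onto the first d functions of an orthonormal cosine basis
d = 8
basis = np.array([np.ones_like(t)] +
                 [np.sqrt(2) * np.cos(2 * np.pi * j * t) for j in range(1, d)])
coeffs = curves @ basis.T / t.size                  # projection coefficients

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(coeffs)
print(np.bincount(labels))
```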

  5. MICA: Multiple interval-based curve alignment

    Science.gov (United States)

    Mann, Martin; Kahle, Hans-Peter; Beck, Matthias; Bender, Bela Johannes; Spiecker, Heinrich; Backofen, Rolf

    2018-01-01

    MICA enables the automatic synchronization of discrete data curves. To this end, characteristic points of the curves' shapes are identified. These landmarks are used within a heuristic curve registration approach to align profile pairs by mapping similar characteristics onto each other. In combination with a progressive alignment scheme, this enables the computation of multiple curve alignments. Multiple curve alignments are needed to derive meaningful representative consensus data of measured time or data series. MICA was already successfully applied to generate representative profiles of tree growth data based on intra-annual wood density profiles or cell formation data. The MICA package provides a command-line and graphical user interface. The R interface enables the direct embedding of multiple curve alignment computation into larger analysis pipelines. Source code, binaries and documentation are freely available at https://github.com/BackofenLab/MICA

  6. Effects of solvent additive on “s-shaped” curves in solution-processed small molecule solar cells

    Directory of Open Access Journals (Sweden)

    John A. Love

    2016-11-01

    A novel molecular chromophore, p-SIDT(FBTThCA82, is introduced as an electron-donor material for bulk heterojunction (BHJ) solar cells with broad absorption and near ideal energy levels for use in combination with common acceptor materials. It is found that films cast from chlorobenzene yield devices with strongly s-shaped current–voltage curves, drastically limiting performance. We find that addition of the common solvent additive diiodooctane, in addition to facilitating crystallization, leads to improved vertical phase separation. This yields much better performing devices, with improved curve shape, demonstrating the importance of morphology control in BHJ devices and improving the understanding of the role of solvent additives.

  7. Development of a statistically-based lower bound fracture toughness curve (K_IR curve)

    International Nuclear Information System (INIS)

    Wullaert, R.A.; Server, W.L.; Oldfield, W.; Stahlkopf, K.E.

    1977-01-01

    A program of initiation fracture toughness measurements on fifty heats of nuclear pressure vessel production materials (including weldments) was used to develop a methodology for establishing a revised reference toughness curve. The new methodology was statistically developed and provides a predefined confidence limit (or tolerance limit) for fracture toughness based upon many heats of a particular type of material. Overall reference curves were developed for seven specific materials using large specimen static and dynamic fracture toughness results. The heat-to-heat variation was removed by normalizing both the fracture toughness and temperature data with the precracked Charpy tanh curve coefficients for each particular heat. The variance and distribution about the curve were determined, and lower bounds of predetermined statistical significance were drawn based upon a Pearson distribution in the lower shelf region (since the data were skewed to high values) and a t-distribution in the transition temperature region (since the data were normally distributed)

  8. Bootstrap-based procedures for inference in nonparametric receiver-operating characteristic curve regression analysis.

    Science.gov (United States)

    Rodríguez-Álvarez, María Xosé; Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Tahoces, Pablo G

    2018-03-01

    Prior to using a diagnostic test in a routine clinical setting, the rigorous evaluation of its diagnostic accuracy is essential. The receiver-operating characteristic curve is the measure of accuracy most widely used for continuous diagnostic tests. However, the possible impact of extra information about the patient (or even the environment) on diagnostic accuracy also needs to be assessed. In this paper, we focus on an estimator for the covariate-specific receiver-operating characteristic curve based on direct regression modelling and nonparametric smoothing techniques. This approach defines the class of generalised additive models for the receiver-operating characteristic curve. The main aim of the paper is to offer new inferential procedures for testing the effect of covariates on the conditional receiver-operating characteristic curve within the above-mentioned class. Specifically, two different bootstrap-based tests are suggested to check (a) the possible effect of continuous covariates on the receiver-operating characteristic curve and (b) the presence of factor-by-curve interaction terms. The validity of the proposed bootstrap-based procedures is supported by simulations. To facilitate the application of these new procedures in practice, an R-package, known as npROCRegression, is provided and briefly described. Finally, data derived from a computer-aided diagnostic system for the automatic detection of tumour masses in breast cancer is analysed.

  9. Structural Acoustic Physics Based Modeling of Curved Composite Shells

    Science.gov (United States)

    2017-09-19

    NUWC-NPT Technical Report 12,236, 19 September 2017. Structural Acoustic Physics-Based Modeling of Curved Composite Shells, Rachel E. Hesse. The objective of the study was to use physics-based modeling (PBM) to investigate wave propagation through curved shells subjected to acoustic excitation.

  10. Modelling of acid-base titration curves of mineral assemblages

    Directory of Open Access Journals (Sweden)

    Stamberg Karel

    2016-01-01

    The modelling of acid-base titration curves of mineral assemblages was studied with respect to obtaining the basic parameters of their surface sites. The known modelling approaches, component additivity (CA) and generalized composite (GC), and three types of different assemblages (fucoidic sandstones, sedimentary rock-clay and bentonite-magnetite samples) were used. In contrast to the GC approach, which could be applied without difficulty, the problem with the CA approach lay in the credibility and accessibility of the parameters characterizing the individual mineralogical components.

  11. Point- and curve-based geometric conflation

    KAUST Repository

    López-Vázquez, C.; Manso Callejo, M.A.

    2013-01-01

    Geometric conflation is the process undertaken to modify the coordinates of features in dataset A in order to match corresponding ones in dataset B. The overwhelming majority of the literature considers the use of points as features to define the transformation. In this article we present a procedure to consider one-dimensional curves also, which are commonly available as Global Navigation Satellite System (GNSS) tracks, routes, coastlines, and so on, in order to define the estimate of the displacements to be applied to each object in A. The procedure involves three steps, including the partial matching of corresponding curves, the computation of some analytical expression, and the addition of a correction term in order to satisfy basic cartographic rules. A numerical example is presented. © 2013 Copyright Taylor and Francis Group, LLC.

  12. Compact Hilbert Curve Index Algorithm Based on Gray Code

    Directory of Open Access Journals (Sweden)

    CAO Xuefeng

    2016-12-01

    The Hilbert curve has the best clustering among the various kinds of space-filling curves and has been used as an important tool in the design of discrete global grid spatial indexes. However, there is considerable redundancy in the standard Hilbert curve index when the data set has large differences between dimensions. In this paper, the construction features of the Hilbert curve are analyzed on the basis of the Gray code, and a compact Hilbert curve index algorithm is then put forward, in which the redundancy problem is avoided while the clustering property of the Hilbert curve is preserved. Finally, experimental results show that the compact Hilbert curve index outperforms the standard Hilbert index: their computational complexity is nearly equivalent, but tests on real data sets show that coding time and storage space decrease by 40% and the speedup ratio of sorting is nearly 4.3.
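
    The sketch below shows only the binary-reflected Gray code on which the Hilbert curve construction mentioned above rests; the compact Hilbert index itself is not reproduced.

```python
# Minimal sketch (illustration only): the reflected Gray code used in the
# Hilbert curve construction; the compact index itself is not shown here.
def to_gray(i: int) -> int:
    """Binary-reflected Gray code of i."""
    return i ^ (i >> 1)

def from_gray(g: int) -> int:
    """Inverse of to_gray."""
    i = 0
    while g:
        i ^= g
        g >>= 1
    return i

for i in range(8):
    g = to_gray(i)
    assert from_gray(g) == i
    print(f"{i:03b} -> {g:03b}")
```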

  13. Analysis of velocity planning interpolation algorithm based on NURBS curve

    Science.gov (United States)

    Zhang, Wanjun; Gao, Shanping; Cheng, Xiyan; Zhang, Feng

    2017-04-01

    To reduce the interpolation time and the maximum interpolation error caused by velocity planning in NURBS (Non-Uniform Rational B-Spline) interpolation, this paper proposes a velocity planning interpolation algorithm based on NURBS curves. Firstly, a second-order Taylor expansion is applied to the curve parameter of the NURBS representation. Then, the velocity planning scheme is combined with NURBS curve interpolation. Finally, simulation results show that the proposed NURBS curve interpolator meets the high-speed and high-accuracy interpolation requirements of CNC systems.
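
    A minimal sketch of second-order Taylor parameter interpolation at a constant feedrate, with a circle standing in for the NURBS curve and numerical derivatives; the feedrate and cycle time are assumed values, not the paper's.

```python
# Minimal sketch (not the paper's implementation): second-order Taylor update of
# the curve parameter for constant-feedrate interpolation of a parametric curve.
import numpy as np

def curve(u):                      # placeholder parametric curve C(u)
    return np.array([np.cos(2 * np.pi * u), np.sin(2 * np.pi * u)])

def derivatives(u, h=1e-5):        # numerical C'(u) and C''(u)
    c0, cp, cm = curve(u), curve(u + h), curve(u - h)
    return (cp - cm) / (2 * h), (cp - 2 * c0 + cm) / h**2

V, T = 50.0, 0.001                 # assumed feedrate [mm/s] and cycle time [s]
u = 0.0
points = []
while u < 1.0:
    d1, d2 = derivatives(u)
    n1 = np.linalg.norm(d1)
    # u_{k+1} = u_k + V*T/|C'| - (V*T)^2 (C'.C'') / (2 |C'|^4)
    u += V * T / n1 - (V * T) ** 2 * np.dot(d1, d2) / (2 * n1 ** 4)
    points.append(curve(min(u, 1.0)))

print(f"{len(points)} interpolated points")
```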

  14. Extended-Search, Bézier Curve-Based Lane Detection and Reconstruction System for an Intelligent Vehicle

    Directory of Open Access Journals (Sweden)

    Xiaoyun Huang

    2015-09-01

    To improve the real-time performance and detection rate of a Lane Detection and Reconstruction (LDR) system, an extended-search-based lane detection method and a Bézier curve-based lane reconstruction algorithm are proposed in this paper. The extended-search-based lane detection method is designed to search boundary blocks from the initial position, in an upwards direction and along the lane, with small search areas including continuous search, discontinuous search and bending search in order to detect different lane boundaries. The Bézier curve-based lane reconstruction algorithm is employed to describe a wide range of lane boundary forms with comparatively simple expressions. In addition, two Bézier curves are adopted to reconstruct the lanes' outer boundaries with large curvature variation. The lane detection and reconstruction algorithm — including determination of the initial blocks, extended search, binarization processing and lane boundary fitting in different scenarios — is verified in road tests. The results show that this algorithm is robust against different shadows and illumination variations; the average processing time per frame is 13 ms. Significantly, it presents an 88.6% high detection rate on curved lanes with large or variable curvatures, where the accident rate is higher than that of straight lanes.

  15. MOND rotation curves for spiral galaxies with Cepheid-based distances

    NARCIS (Netherlands)

    Bottema, R; Pestana, JLG; Rothberg, B; Sanders, RH

    2002-01-01

    Rotation curves for four spiral galaxies with recently determined Cepheid-based distances are reconsidered in terms of modified Newtonian dynamics (MOND). For two of the objects, NGC 2403 and NGC 7331, the rotation curves predicted by MOND are compatible with the observed curves when these galaxies

  16. Comparison of wind turbines based on power curve analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-02-01

    In the study measured power curves for 46 wind turbines were analyzed with the purpose to establish the basis for a consistent comparison of the efficiency of the wind turbines. Emphasis is on wind turbines above 500 kW rated power, with power curves measured after 1994 according to international recommendations. The available power curves fulfilling these requirements were smoothened according to a procedure developed for the purpose in such a way that the smoothened power curves are equally representative as the measured curves. The resulting smoothened power curves are presented in a standardized format for the subsequent processing. Using wind turbine data from the power curve documentation the analysis results in curves for specific energy production (kWh/m²/yr) versus specific rotor load (kW/m²) for a range of mean wind speeds. On this basis generalized curves for specific annual energy production versus specific rotor load are established for a number of generalized wind turbine concepts. The 46 smoothened standardized power curves presented in the report, the procedure developed to establish them, and the results of the analysis based on them aim at providers of measured power curves as well as users of them including manufacturers, advisors and decision makers. (au)
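
    A minimal sketch of how a power curve can be turned into annual and specific energy production figures; the toy 500 kW power curve, rotor diameter, Rayleigh wind distribution and site mean wind speed are illustrative assumptions, not data from the report.

```python
# Minimal sketch (illustrative numbers, not from the report): annual and specific
# energy production from a power curve and a Rayleigh wind speed distribution.
import numpy as np

v = np.arange(0.0, 26.0, 1.0)                             # wind speed bins [m/s]
p = np.clip(500.0 * ((v - 3.0) / 9.0) ** 3, 0.0, 500.0)   # toy 500 kW power curve
p[v < 3.0] = 0.0                                          # below cut-in
rotor_area = np.pi * (39.0 / 2.0) ** 2                    # assumed 39 m rotor [m^2]

v_mean = 7.0                                              # site mean wind speed [m/s]
rayleigh = (np.pi * v / (2 * v_mean**2)) * np.exp(-np.pi * v**2 / (4 * v_mean**2))
aep_kwh = np.trapz(p * rayleigh, v) * 8760.0              # annual energy [kWh/yr]

print(f"AEP  = {aep_kwh:,.0f} kWh/yr")
print(f"SEP  = {aep_kwh / rotor_area:,.0f} kWh/m^2/yr")
print(f"load = {500.0 / rotor_area:.3f} kW/m^2")
```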

  17. The Use of Statistically Based Rolling Supply Curves for Electricity Market Analysis: A Preliminary Look

    Energy Technology Data Exchange (ETDEWEB)

    Jenkin, Thomas J [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Larson, Andrew [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Ruth, Mark F [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; King, Ben [U.S. Department of Energy]; Spitsen, Paul [U.S. Department of Energy]

    2018-03-27

    In light of the changing electricity resource mixes across the United States, an important question in electricity modeling is how additions and retirements of generation, including additions in variable renewable energy (VRE) generation could impact markets by changing hourly wholesale energy prices. Instead of using resource-intensive production cost models (PCMs) or building and using simple generator supply curves, this analysis uses a 'top-down' approach based on regression analysis of hourly historical energy and load data to estimate the impact of supply changes on wholesale electricity prices, provided the changes are not so substantial that they fundamentally alter the market and dispatch-order driven behavior of non-retiring units. The rolling supply curve (RSC) method used in this report estimates the shape of the supply curve that fits historical hourly price and load data for given time intervals, such as two-weeks, and then repeats this on a rolling basis through the year. These supply curves can then be modified on an hourly basis to reflect the impact of generation retirements or additions, including VRE and then reapplied to the same load data to estimate the change in hourly electricity price. The choice of duration over which these RSCs are estimated has a significant impact on goodness of fit. For example, in PJM in 2015, moving from fitting one curve per year to 26 rolling two-week supply curves improves the standard error of the regression from $16/MWh to $6/MWh and the R-squared of the estimate from 0.48 to 0.76. We illustrate the potential use and value of the RSC method by estimating wholesale price effects under various generator retirement and addition scenarios, and we discuss potential limits of the technique, some of which are inherent. The ability to do this type of analysis is important to a wide range of market participants and other stakeholders, and it may have a role in complementing use of or providing

  18. Reliability Based Geometric Design of Horizontal Circular Curves

    Science.gov (United States)

    Rajbongshi, Pabitra; Kalita, Kuldeep

    2018-06-01

    Geometric design of horizontal circular curve primarily involves with radius of the curve and stopping sight distance at the curve section. Minimum radius is decided based on lateral thrust exerted on the vehicles and the minimum stopping sight distance is provided to maintain the safety in longitudinal direction of vehicles. Available sight distance at site can be regulated by changing the radius and middle ordinate at the curve section. Both radius and sight distance depend on design speed. Speed of vehicles at any road section is a variable parameter and therefore, normally the 98th percentile speed is taken as the design speed. This work presents a probabilistic approach for evaluating stopping sight distance, considering the variability of all input parameters of sight distance. It is observed that the 98th percentile sight distance value is much lower than the sight distance corresponding to 98th percentile speed. The distribution of sight distance parameter is also studied and found to follow a lognormal distribution. Finally, the reliability based design charts are presented for both plain and hill regions, and considering the effect of lateral thrust.
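
    A minimal Monte Carlo sketch of the comparison described above, with assumed parameter distributions and conventional design values; it is not the paper's data or model.

```python
# Minimal sketch (assumed distributions and design values, not the paper's data):
# Monte Carlo stopping sight distance (SSD) on level grade, comparing the 98th
# percentile of the SSD distribution with the deterministic SSD computed at the
# 98th-percentile speed using conventional design values.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
g = 9.81
speed = rng.normal(22.0, 3.0, n)              # operating speed [m/s]
t_r = rng.lognormal(np.log(2.0), 0.25, n)     # perception-reaction time [s]
f = rng.normal(0.35, 0.04, n)                 # longitudinal friction coefficient

ssd = speed * t_r + speed**2 / (2.0 * g * f)  # probabilistic SSD [m]

v98 = np.percentile(speed, 98)
ssd_design = v98 * 2.5 + v98**2 / (2.0 * g * 0.30)   # conservative design values
print(f"98th percentile of the SSD distribution: {np.percentile(ssd, 98):.0f} m")
print(f"SSD at the 98th-percentile speed (design values): {ssd_design:.0f} m")
```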

  19. Linear Titration Curves of Acids and Bases.

    Science.gov (United States)

    Joseph, N R

    1959-05-29

    The Henderson-Hasselbalch equation, by a simple transformation, becomes pH - pK = pA - pB, where pA and pB are the negative logarithms of acid and base concentrations. Sigmoid titration curves then reduce to straight lines; titration curves of polyelectrolytes, to families of straight lines. The method is applied to the titration of the dipeptide glycyl aminotricarballylic acid, with four titrable groups. Results are expressed as Cartesian and d'Ocagne nomograms. The latter is of a general form applicable to polyelectrolytes of any degree of complexity.
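
    A small numerical check of the linear form pH - pK = pA - pB, using acetic acid's pKa as an illustrative value and ignoring activity corrections and water autoionization.

```python
# Minimal sketch: the linearised Henderson-Hasselbalch relation pH - pK = pA - pB
# for a monoprotic weak acid (activity corrections and autoionisation ignored).
import numpy as np

pKa = 4.76                                     # acetic acid, for illustration
c_total = 0.10                                 # total acid concentration [mol/L]
frac_titrated = np.linspace(0.05, 0.95, 10)    # fraction neutralised

acid = c_total * (1.0 - frac_titrated)         # remaining HA
base = c_total * frac_titrated                 # conjugate base A-
pA, pB = -np.log10(acid), -np.log10(base)

pH = pKa + np.log10(base / acid)               # Henderson-Hasselbalch
# The sigmoid pH curve becomes a straight line of unit slope in these coordinates:
for x, y in zip(pA - pB, pH - pKa):
    print(f"pA - pB = {x:+.3f}   pH - pK = {y:+.3f}")
```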

  1. Design of airborne imaging spectrometer based on curved prism

    Science.gov (United States)

    Nie, Yunfeng; Xiangli, Bin; Zhou, Jinsong; Wei, Xiaoxiao

    2011-11-01

    A novel moderate-resolution imaging spectrometer spanning the visible to near-infrared wavelength range with a spectral resolution of 10 nm, which combines curved prisms with the Offner configuration, is introduced. Compared to conventional imaging spectrometers based on a dispersive prism or diffractive grating, this design possesses the characteristics of small size, compact structure, low mass, and little spectral line curve (smile) and spectral band curve (keystone or frown). Besides, the use of compound curved prisms with two or more different materials can greatly reduce the nonlinearity inevitably brought by prismatic dispersion. The utilization ratio of light radiation is much higher than that of an imaging spectrometer of the same type based on the combination of a diffractive grating and concentric optics. In this paper, the Seidel aberration theory of the curved prism and the optical principles of the Offner configuration are explained first. Then the optical design layout of the spectrometer is presented, and the performance of this design, including spot diagram and MTF, is analyzed. Furthermore, several types of telescope matching this system are provided. This work provides an innovative perspective on the optical system design of airborne spectral imagers and can therefore offer theoretical guidance for imaging spectrometers of the same kind.

  2. Simulation-optimization model of reservoir operation based on target storage curves

    Directory of Open Access Journals (Sweden)

    Hong-bin Fang

    2014-10-01

    This paper proposes a new storage allocation rule based on target storage curves. Joint operating rules are also proposed to solve the operation problems of a multi-reservoir system with joint demands and water transfer-supply projects. The joint operating rules include a water diversion rule to determine the amount of diverted water in a period, a hedging rule based on an aggregated reservoir to determine the total release from the system, and a storage allocation rule to specify the release from each reservoir. A simulation-optimization model was established to optimize the key points of the water diversion curves, the hedging rule curves, and the target storage curves using the improved particle swarm optimization (IPSO) algorithm. The multi-reservoir water supply system located in Liaoning Province, China, including a water transfer-supply project, was employed as a case study to verify the effectiveness of the proposed joint operating rules and target storage curves. The results indicate that the proposed operating rules are suitable for the complex system. The storage allocation rule based on target storage curves shows an improved performance with regard to system storage distribution.

  3. Using Spreadsheets to Produce Acid-Base Titration Curves.

    Science.gov (United States)

    Cawley, Martin James; Parkinson, John

    1995-01-01

    Describes two spreadsheets for producing acid-base titration curves, one uses relatively simple cell formulae that can be written into the spreadsheet by inexperienced students and the second uses more complex formulae that are best written by the teacher. (JRH)

  4. Learning curves in health professions education.

    Science.gov (United States)

    Pusic, Martin V; Boutis, Kathy; Hatala, Rose; Cook, David A

    2015-08-01

    Learning curves, which graphically show the relationship between learning effort and achievement, are common in published education research but are not often used in day-to-day educational activities. The purpose of this article is to describe the generation and analysis of learning curves and their applicability to health professions education. The authors argue that the time is right for a closer look at using learning curves, given their desirable properties, to inform both self-directed instruction by individuals and education management by instructors. A typical learning curve is made up of a measure of learning (y-axis), a measure of effort (x-axis), and a mathematical linking function. At the individual level, learning curves make manifest a single person's progress towards competence, including his/her rate of learning, the inflection point where learning becomes more effortful, and the remaining distance to mastery attainment. At the group level, overlaid learning curves show the full variation of a group of learners' paths through a given learning domain. Specifically, they make overt the difference between time-based and competency-based approaches to instruction. Additionally, instructors can use learning curve information to more accurately target educational resources to those who most require them. The learning curve approach requires a fine-grained collection of data that will not be possible in all educational settings; however, the increased use of an assessment paradigm that explicitly includes effort and its link to individual achievement could result in increased learner engagement and more effective instructional design.

  5. Qualitative Comparison of Contraction-Based Curve Skeletonization Methods

    NARCIS (Netherlands)

    Sobiecki, André; Yasan, Haluk C.; Jalba, Andrei C.; Telea, Alexandru C.

    2013-01-01

    In recent years, many new methods have been proposed for extracting curve skeletons of 3D shapes, using a mesh-contraction principle. However, it is still unclear how these methods perform with respect to each other, and with respect to earlier voxel-based skeletonization methods, from the viewpoint

  6. Eyewitness identification: Bayesian information gain, base-rate effect equivalency curves, and reasonable suspicion.

    Science.gov (United States)

    Wells, Gary L; Yang, Yueran; Smalarz, Laura

    2015-04-01

    We provide a novel Bayesian treatment of the eyewitness identification problem as it relates to various system variables, such as instruction effects, lineup presentation format, lineup-filler similarity, lineup administrator influence, and show-ups versus lineups. We describe why eyewitness identification is a natural Bayesian problem and how numerous important observations require careful consideration of base rates. Moreover, we argue that the base rate in eyewitness identification should be construed as a system variable (under the control of the justice system). We then use prior-by-posterior curves and information-gain curves to examine data obtained from a large number of published experiments. Next, we show how information-gain curves are moderated by system variables and by witness confidence and we note how information-gain curves reveal that lineups are consistently more proficient at incriminating the guilty than they are at exonerating the innocent. We then introduce a new type of analysis that we developed called base rate effect-equivalency (BREE) curves. BREE curves display how much change in the base rate is required to match the impact of any given system variable. The results indicate that even relatively modest changes to the base rate can have more impact on the reliability of eyewitness identification evidence than do the traditional system variables that have received so much attention in the literature. We note how this Bayesian analysis of eyewitness identification has implications for the question of whether there ought to be a reasonable-suspicion criterion for placing a person into the jeopardy of an identification procedure. (c) 2015 APA, all rights reserved.
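
    A minimal sketch of the underlying Bayesian update; the hit and false-identification rates and the range of base rates are illustrative assumptions, not figures from the article.

```python
# Minimal sketch (illustrative rates, not the article's data): posterior probability
# of culprit presence after a suspect identification, as a function of the base
# rate of culprit-present lineups, using Bayes' rule.
import numpy as np

hit_rate = 0.50          # P(identify suspect | culprit present)  - assumed
false_id_rate = 0.10     # P(identify suspect | culprit absent)   - assumed

base_rates = np.linspace(0.05, 0.95, 10)     # prior P(culprit present)
posterior = (hit_rate * base_rates) / (
    hit_rate * base_rates + false_id_rate * (1.0 - base_rates))

for prior, post in zip(base_rates, posterior):
    print(f"prior = {prior:.2f}  ->  posterior after ID = {post:.2f}  "
          f"(information gain = {post - prior:+.2f})")
```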

  7. A graph-based method for fitting planar B-spline curves with intersections

    Directory of Open Access Journals (Sweden)

    Pengbo Bo

    2016-01-01

    The problem of fitting B-spline curves to planar point clouds is studied in this paper. A novel method is proposed to deal with the most challenging case where multiple intersecting curves or curves with self-intersection are necessary for shape representation. A method based on Delaunay triangulation of the data points is developed to identify connected components, which is also capable of removing outliers. A skeleton representation is utilized to represent the topological structure, which is further used to create a weighted graph for deciding the merging of curve segments. Different from existing approaches, which utilize local shape information near intersections, our method considers the shape characteristics of curve segments in a larger scope and is thus capable of giving more satisfactory results. By fitting each group of data points with a B-spline curve, we solve the problem of curve structure reconstruction from point clouds, as well as the vectorization of simple line drawing images by reconstructing the drawn lines.

  8. Laffer Curves and Home Production

    Directory of Open Access Journals (Sweden)

    Kotamäki Mauri

    2017-06-01

    In the earlier related literature, the consumption tax rate Laffer curve is found to be strictly increasing (see Trabandt and Uhlig, 2011). In this paper, a general equilibrium macro model is augmented by introducing a substitute for private consumption in the form of home production. The introduction of home production brings about an additional margin of adjustment – an increase in the consumption tax rate not only decreases labor supply and reduces the consumption tax base but also allows a substitution of market goods with home-produced goods. The main objective of this paper is to show that, after the introduction of home production, the consumption tax Laffer curve exhibits an inverse U-shape. Also the income tax Laffer curves are significantly altered. The result shown in this paper casts doubt on some of the earlier results in the literature.

  9. Theory of titration curves-VII The properties of derivative titration curves for strong acid-strong base and other isovalent ion-combination titrations.

    Science.gov (United States)

    Meites, T; Meites, L

    1970-06-01

    This paper deals with isovalent ion-combination titrations based on reactions that can be represented by the equation M(n+) + X(n-) --> MX, where the activity of the product MX is invariant throughout a titration, and with the derivative titration curves obtained by plotting d[M(+)]/df versus f for such titrations. It describes some of the ways in which such curves can be obtained; it compares and contrasts them both with potentiometric titration curves, which resemble them in shape, and with segmented titration curves, from which they are derived; and it discusses their properties in detail.

  10. The effect of additional equilibrium stress functions on the three-node hybrid-mixed curved beam element

    International Nuclear Information System (INIS)

    Kim, Jin Gon; Park, Yong Kuk

    2008-01-01

    To develop an effective hybrid-mixed element, it is extremely critical how the stress field is assumed. This research article demonstrates the effect of additional equilibrium stress functions in enhancing the numerical performance of the locking-free three-node hybrid-mixed curved beam element proposed in Saleeb and Chang's previous work. It is exceedingly complicated or even infeasible to determine stress functions that fully satisfy both the equilibrium conditions and the suppression of kinematic deformation modes in the three-node hybrid-mixed formulation. Accordingly, additional stress functions that partially or fully satisfy the equilibrium conditions are incorporated in this study. Several numerical examples for static and dynamic problems confirm that the newly proposed element with these additional stress functions is highly effective regardless of the slenderness ratio and curvature of arches in static and dynamic analyses.

  11. Integrated analysis on static/dynamic aeroelasticity of curved panels based on a modified local piston theory

    Science.gov (United States)

    Yang, Zhichun; Zhou, Jian; Gu, Yingsong

    2014-10-01

    A flow field modified local piston theory, which is applied to the integrated analysis on static/dynamic aeroelastic behaviors of curved panels, is proposed in this paper. The local flow field parameters used in the modification are obtained by CFD technique which has the advantage to simulate the steady flow field accurately. This flow field modified local piston theory for aerodynamic loading is applied to the analysis of static aeroelastic deformation and flutter stabilities of curved panels in hypersonic flow. In addition, comparisons are made between results obtained by using the present method and curvature modified method. It shows that when the curvature of the curved panel is relatively small, the static aeroelastic deformations and flutter stability boundaries obtained by these two methods have little difference, while for curved panels with larger curvatures, the static aeroelastic deformation obtained by the present method is larger and the flutter stability boundary is smaller compared with those obtained by the curvature modified method, and the discrepancy increases with the increasing of curvature of panels. Therefore, the existing curvature modified method is non-conservative compared to the proposed flow field modified method based on the consideration of hypersonic flight vehicle safety, and the proposed flow field modified local piston theory for curved panels enlarges the application range of piston theory.

  12. Feature Extraction from 3D Point Cloud Data Based on Discrete Curves

    Directory of Open Access Journals (Sweden)

    Yi An

    2013-01-01

    Reliable feature extraction from 3D point cloud data is an important problem in many application domains, such as reverse engineering, object recognition, industrial inspection, and autonomous navigation. In this paper, a novel method is proposed for extracting the geometric features from 3D point cloud data based on discrete curves. We extract the discrete curves from 3D point cloud data and research the behaviors of chord lengths, angle variations, and principal curvatures at the geometric features in the discrete curves. Then, the corresponding similarity indicators are defined. Based on the similarity indicators, the geometric features can be extracted from the discrete curves, which are also the geometric features of 3D point cloud data. The threshold values of the similarity indicators are taken from [0,1], which characterize the relative relationship and make the threshold setting easier and more reasonable. The experimental results demonstrate that the proposed method is efficient and reliable.
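
    A minimal sketch of two of the indicators mentioned above, chord lengths and angle variations, computed for a synthetic discrete curve.

```python
# Minimal sketch (synthetic polyline): chord lengths and angle variations along a
# discrete curve, two of the indicators examined above.
import numpy as np

pts = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.1], [3.0, 1.0], [3.2, 2.0], [3.2, 3.0]])

chords = np.diff(pts, axis=0)                       # consecutive chord vectors
chord_len = np.linalg.norm(chords, axis=1)          # chord lengths

unit = chords / chord_len[:, None]
cos_turn = np.clip(np.einsum('ij,ij->i', unit[:-1], unit[1:]), -1.0, 1.0)
turn_angle = np.degrees(np.arccos(cos_turn))        # angle variation at interior points

print("chord lengths :", chord_len.round(3))
print("turning angles:", turn_angle.round(1))
```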

  13. Thermodynamic Activity-Based Progress Curve Analysis in Enzyme Kinetics.

    Science.gov (United States)

    Pleiss, Jürgen

    2018-03-01

    Macrokinetic Michaelis-Menten models based on thermodynamic activity provide insights into enzyme kinetics because they separate substrate-enzyme from substrate-solvent interactions. Kinetic parameters are estimated from experimental progress curves of enzyme-catalyzed reactions. Three pitfalls are discussed: deviations between thermodynamic and concentration-based models, product effects on the substrate activity coefficient, and product inhibition. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Experience Curves: A Tool for Energy Policy Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Neij, Lena; Helby, Peter [Lund Univ. (Sweden). Environmental and Energy Systems Studies]; Dannemand Andersen, Per; Morthorst, Poul Erik [Risø National Laboratory, Roskilde (Denmark)]; Durstewitz, Michael; Hoppe-Kilpper, Martin [Inst. fuer Solare Energieversorgungstechnik e.V., Kassel (DE)]; and others

    2003-07-01

    The objective of the project, Experience curves: a tool for energy policy assessment (EXTOOL), was to analyse the experience curve as a tool for the assessment of energy policy measures. This is of special interest, since the use of experience curves for the assessment of energy policy measures requires the development of the established experience curve methodology. This development raises several questions which have been addressed and analysed in this project. The analysis is based on case studies of wind power, an area with considerable experience in technology development, deployment and policy measures. Therefore, a case study based on wind power provides a good opportunity to study the usefulness of experience curves as a tool for the assessment of energy policy measures. However, the results are discussed in terms of using experience curves for the assessment of any energy technology. The project shows that experience curves can be used to assess the effect of combined policy measures in terms of cost reductions. Moreover, the results of the project show that experience curves could be used to analyse international 'learning systems', i.e. cost reductions brought about by the development of wind power and policy measures used in other countries. Nevertheless, the use of experience curves for the assessment of policy programmes has several limitations. First, the analysis and assessment of policy programmes cannot be achieved unless relevant experience curves based on good data can be developed. The authors are of the opinion that only studies that provide evidence of the validity, reliability and relevance of experience curves should be taken into account in policy making. Second, experience curves provide an aggregated picture of the situation and more detailed analysis of various sources of cost reduction, and cost reductions resulting from individual policy measures, requires additional data and analysis tools. Third, we do not recommend the use of

  15. Prediction of Pressing Quality for Press-Fit Assembly Based on Press-Fit Curve and Maximum Press-Mounting Force

    Directory of Open Access Journals (Sweden)

    Bo You

    2015-01-01

    In order to predict the pressing quality of precision press-fit assembly, press-fit curves and the maximum press-mounting force of press-fit assemblies were investigated by finite element analysis (FEA). The analysis was based on a 3D Solidworks model using the real dimensions of the microparts and the subsequent FEA model that was built using ANSYS Workbench. The press-fit process could thus be simulated on the basis of static structure analysis. To verify the FEA results, experiments were carried out using a press-mounting apparatus. The results show that the press-fit curves obtained by FEA agree closely with the curves obtained using the experimental method. In addition, the maximum press-mounting force calculated by FEA agrees with that obtained by the experimental method, with the maximum deviation being 4.6%, a value that can be tolerated. The comparison shows that the press-fit curve and maximum press-mounting force calculated by FEA can be used for predicting the pressing quality during precision press-fit assembly.

  16. A standard curve based method for relative real time PCR data processing

    Directory of Open Access Journals (Sweden)

    Krause Andreas

    2005-03-01

    Background: Currently real time PCR is the most precise method by which to measure gene expression. The method generates a large amount of raw numerical data and processing may notably influence final results. The data processing is based either on standard curves or on PCR efficiency assessment. At the moment, the PCR efficiency approach is preferred in relative PCR whilst the standard curve is often used for absolute PCR. However, there are no barriers to employ standard curves for relative PCR. This article provides an implementation of the standard curve method and discusses its advantages and limitations in relative real time PCR. Results: We designed a procedure for data processing in relative real time PCR. The procedure completely avoids PCR efficiency assessment, minimizes operator involvement and provides a statistical assessment of intra-assay variation. The procedure includes the following steps. (I) Noise is filtered from raw fluorescence readings by smoothing, baseline subtraction and amplitude normalization. (II) The optimal threshold is selected automatically from regression parameters of the standard curve. (III) Crossing points (CPs) are derived directly from coordinates of points where the threshold line crosses fluorescence plots obtained after the noise filtering. (IV) The means and their variances are calculated for CPs in PCR replicas. (V) The final results are derived from the CPs' means. The CPs' variances are traced to results by the law of error propagation. A detailed description and analysis of this data processing is provided. The limitations associated with the use of parametric statistical methods and amplitude normalization are specifically analyzed and found fit to the routine laboratory practice. Different options are discussed for aggregation of data obtained from multiple reference genes. Conclusion: A standard curve based procedure for PCR data processing has been compiled and validated. It illustrates that
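
    A minimal sketch of the standard-curve step (converting crossing points to relative amounts); the dilution series and CP readings are invented numbers, and the noise filtering and error propagation of the full procedure are not reproduced.

```python
# Minimal sketch (synthetic numbers): relative quantification from crossing points
# via a standard curve, instead of an explicit PCR-efficiency estimate.
import numpy as np

# Standard dilution series: log10 of relative input amount vs. measured CP
log_amount = np.array([0.0, -1.0, -2.0, -3.0, -4.0])
cp_standards = np.array([15.1, 18.5, 21.8, 25.2, 28.6])     # assumed readings

slope, intercept = np.polyfit(cp_standards, log_amount, 1)  # CP -> log10(amount)

def relative_amount(cp):
    return 10 ** (slope * cp + intercept)

cp_target, cp_reference = 22.4, 19.7                        # unknown sample CPs
ratio = relative_amount(cp_target) / relative_amount(cp_reference)
print(f"target/reference expression ratio = {ratio:.3f}")
```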

  17. Modelling and assessment of urban flood hazards based on rainfall intensity-duration-frequency curves reformation

    OpenAIRE

    Ghazavi, Reza; Moafi Rabori, Ali; Ahadnejad Reveshty, Mohsen

    2016-01-01

    The design storm estimated from rainfall intensity–duration–frequency (IDF) curves is an important parameter for the hydrologic planning of urban areas. The main aim of this study was to estimate rainfall intensities of the Zanjan city watershed based on the overall relationship of the rainfall IDF curves and an appropriate model of hourly rainfall estimation (the Sherman method and the Ghahreman and Abkhezr method). The hydrologic and hydraulic impacts of the change in rainfall IDF curves on flood properties were evaluated via Stormw...

  18. Prediction of flow boiling curves based on artificial neural network

    International Nuclear Information System (INIS)

    Wu Junmei; Xi'an Jiaotong Univ., Xi'an; Su Guanghui

    2007-01-01

    The effects of the main system parameters on flow boiling curves were analyzed by using an artificial neural network (ANN) based on a database selected from the 1960s. The input parameters of the ANN are system pressure, mass flow rate, inlet subcooling, wall superheat and steady/transition boiling, and the output parameter is heat flux. The results obtained by the ANN show that the heat flux increases with increasing inlet subcooling for all heat transfer modes. Mass flow rate has no significant effect on nucleate boiling curves. The transition boiling and film boiling heat fluxes increase with an increase of mass flow rate. Pressure plays a predominant role and improves heat transfer in all boiling regions except film boiling. There are slight differences between the steady and the transient boiling curves in all boiling regions except the nucleate one. (authors)
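
    A minimal sketch in the spirit of the ANN described above, with a small scikit-learn network trained on synthetic data standing in for the original network and the 1960s database.

```python
# Minimal sketch (synthetic data): a small feed-forward network mapping
# (pressure, mass flow rate, inlet subcooling, wall superheat) to heat flux;
# scikit-learn stands in for the original ANN and database.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.uniform([0.1, 100.0, 5.0, 10.0], [10.0, 2000.0, 60.0, 600.0], (500, 4))
# toy target: an arbitrary smooth surrogate for heat flux [kW/m^2]
y = 50.0 * X[:, 0] + 0.05 * X[:, 1] + 2.0 * X[:, 2] + 300.0 * np.log1p(X[:, 3])

scaler = StandardScaler().fit(X)
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
model.fit(scaler.transform(X), y)

sample = np.array([[7.0, 1000.0, 20.0, 100.0]])   # p, G, dT_sub, dT_wall (assumed)
print(f"predicted heat flux: {model.predict(scaler.transform(sample))[0]:.1f} kW/m^2")
```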

  19. Investigation of the bases for use of the K_Ic curve

    International Nuclear Information System (INIS)

    McCabe, D.E.; Nanstad, R.K.; Rosenfield, A.R.; Marschall, C.W.; Irwin, G.R.

    1991-01-01

    Title 10 of the Code of Federal Regulations, Part 50 (10CFR50), Appendix G, establishes the bases for setting allowable pressure and temperature limits on reactors during heatup and cooldown operation. Both the K_Ic and K_Ia curves are utilized in prescribed ways to maintain reactor vessel structural integrity in the presence of an assumed or actual flaw and operating stresses. Currently, the code uses the K_Ia curve, normalized to the RT_NDT, to represent the fracture toughness trend for unirradiated and irradiated pressure vessel steels. Although this is clearly a conservative policy, it has been suggested that the K_Ic curve is the more appropriate for application to a non-accident operating condition. A number of uncertainties have been identified, however, that might convert normal operating transients into a dynamic loading situation. Those include the introduction of running cracks from local brittle zones, crack pop-ins, reduced toughness from arrested cleavage cracks, description of the K_Ic curve for irradiated materials, and other related unresolved issues relative to elastic-plastic fracture mechanics. Some observations and conclusions can be made regarding various aspects of those uncertainties and they are discussed in this paper. A discussion of further work required and under way to address the remaining uncertainties is also presented.

  20. Analysis of possibilities of early diagnostics criteria for Parkinson's disease based on analysis of the input-output curve

    Directory of Open Access Journals (Sweden)

    Janković Marko

    2013-01-01

    In this paper, we analyze the possibilities of the diagnosis of Parkinson's disease at an early stage, based on characteristics of the input-output curve. The input-output (IO) curve was analyzed in two ways: we analyzed the gain of the curve for low-level transcranial stimulation and we analyzed the overall 'quality' of the IO curve. The calculation of the 'quality' of the curve is based on basic concepts from quantum mechanics and the calculation of Tsallis entropy.
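
    A minimal sketch of the Tsallis entropy calculation mentioned above; the input-output curve samples are assumed values.

```python
# Minimal sketch: Tsallis entropy of a normalised distribution derived from
# motor-evoked-potential amplitudes along an input-output curve (values assumed).
import numpy as np

def tsallis_entropy(p, q=2.0):
    """S_q = (1 - sum_i p_i^q) / (q - 1), reducing to Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    if np.isclose(q, 1.0):
        nz = p[p > 0]
        return -np.sum(nz * np.log(nz))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

mep_amplitudes = np.array([0.05, 0.08, 0.15, 0.40, 0.90, 1.60, 2.10, 2.30])
print(f"S_2 of the IO curve samples: {tsallis_entropy(mep_amplitudes, q=2.0):.3f}")
```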

  1. Part 5: Receiver Operating Characteristic Curve and Area under the Curve

    Directory of Open Access Journals (Sweden)

    Saeed Safari

    2016-04-01

    Multiple diagnostic tools are used by emergency physicians every day. In addition, new tools are evaluated to obtain more accurate methods and reduce the time or cost of conventional ones. In the previous parts of this educational series, we described the diagnostic performance characteristics of diagnostic tests including sensitivity, specificity, positive and negative predictive values, and likelihood ratios. The receiver operating characteristic (ROC) curve is a graphical presentation of screening characteristics. The ROC curve is used to determine the best cutoff point and compare two or more tests or observers by measuring the area under the curve (AUC). In this part of our educational series, we explain the ROC curve and two methods to determine the best cutoff value.
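
    A minimal sketch of an ROC curve, its AUC and a cutoff chosen by the Youden index (one common cutoff criterion), using simulated test results and scikit-learn.

```python
# Minimal sketch (simulated test values): ROC curve, AUC, and a best cutoff
# chosen by the Youden index.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
disease = np.r_[np.ones(200), np.zeros(400)]
score = np.r_[rng.normal(3.0, 1.0, 200), rng.normal(1.5, 1.0, 400)]  # test result

fpr, tpr, thresholds = roc_curve(disease, score)
auc = roc_auc_score(disease, score)
youden_j = tpr - fpr
best = np.argmax(youden_j)

print(f"AUC = {auc:.3f}")
print(f"best cutoff (Youden) = {thresholds[best]:.2f}, "
      f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
```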

  2. Effects of La and Ce Addition on the Modification of Al-Si Based Alloys

    Directory of Open Access Journals (Sweden)

    Emad M. Elgallad

    2016-01-01

    This study focuses on the effects of the addition of rare earth metals (mainly lanthanum and cerium) on the eutectic Si characteristics in Al-Si based alloys. Based on the solidification curves and microstructural examination of the corresponding alloys, it was found that addition of La or Ce increases the alloy melting temperature and the Al-Si eutectic temperature, with an Al-Si recalescence of 2-3°C, and the appearance of post-α-Al peaks attributed to precipitation of rare earth intermetallics. Addition of La or Ce to Al-(7–13)% Si causes only partial modification of the eutectic Si particles. Lanthanum has a high affinity to react with Sr, which weakens the modification efficiency of the latter. Cerium, however, has a high affinity for Ti, forming a large amount of sludge. Due to the large difference in the length of the eutectic Si particles in the same sample, the normal use of standard deviation in this case is meaningless.

  3. Model-based methodology to develop the isochronous stress-strain curves for modified 9Cr steels

    International Nuclear Information System (INIS)

    Kim, Woo Gon; Yin, Song Nan; Kim, Sung Ho; Lee, Chan Bock; Jung, Ik Hee

    2008-01-01

    Since high temperature materials are designed with a target life based on a specified amount of allowable strain and stress, their Isochronous Stress-Strain Curves (ISSC) are needed to avoid an excessive deformation during the intended service life. In this paper, a model-based methodology to develop the isochronous curves for a G91 steel is described. Creep strain-time curves were reviewed for typical high-temperature materials, and Garofalo's model, which conforms well to the primary and secondary creep stages, was found appropriate for the G91 steel. Procedures to obtain the instantaneous elastic-plastic strain, ε_i, are given in detail. Also, to accurately determine the P1, P2 and P3 parameters in Garofalo's model, a Nonlinear Least Square Fitting (NLSF) method was adopted and found useful. The long-term creep curves for the G91 steel can be modeled by Garofalo's model, and the long-term ISSCs can be developed using the modeled creep curves.
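
    A minimal sketch of fitting a Garofalo-type creep curve, eps(t) = eps_i + P1(1 - exp(-P2 t)) + P3 t, by nonlinear least squares on synthetic data; scipy stands in for the paper's NLSF implementation.

```python
# Minimal sketch (synthetic data): nonlinear least-squares fit of a Garofalo-type
# creep curve, eps(t) = eps_i + P1*(1 - exp(-P2*t)) + P3*t.
import numpy as np
from scipy.optimize import curve_fit

def garofalo(t, eps_i, p1, p2, p3):
    return eps_i + p1 * (1.0 - np.exp(-p2 * t)) + p3 * t

t = np.linspace(0.0, 2000.0, 80)                       # time [h]
rng = np.random.default_rng(0)
eps = garofalo(t, 0.002, 0.004, 0.01, 1.5e-6) + rng.normal(0, 5e-5, t.size)

popt, _ = curve_fit(garofalo, t, eps, p0=[1e-3, 1e-3, 1e-2, 1e-6])
print("eps_i, P1, P2, P3 =", popt)
```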

  4. Going Beyond, Going Further: The Preparation of Acid-Base Titration Curves.

    Science.gov (United States)

    McClendon, Michael

    1984-01-01

    Background information, list of materials needed, and procedures used are provided for a simple technique for generating mechanically plotted acid-base titration curves. The method is suitable for second-year high school chemistry students. (JN)

  5. Automated pavement horizontal curve measurement methods based on inertial measurement unit and 3D profiling data

    Directory of Open Access Journals (Sweden)

    Wenting Luo

    2016-04-01

    Pavement horizontal curves are designed to serve as a transition between straight segments, and their presence may cause a series of driving-related safety issues for motorists and drivers. As traditional methods for curve geometry investigation are recognized to be time consuming, labor intensive, and inaccurate, this study attempts to develop a method that can automatically conduct horizontal curve identification and measurement at the network level. The digital highway data vehicle (DHDV) was utilized for data collection, in which three Euler angles, driving speed, and acceleration of the survey vehicle were measured with an inertial measurement unit (IMU). The 3D profiling data used for cross slope calibration were obtained with PaveVision3D Ultra technology at 1 mm resolution. In this study, the curve identification was based on the variation of the heading angle, and the curve radius was calculated with a kinematic method, a geometry method, and a lateral acceleration method. In order to verify the accuracy of the three methods, an analysis of variance (ANOVA) test was applied using the control variable of the curve radius measured by field test. Based on the measured curve radius, a curve safety analysis model was used to predict the crash rates and safe driving speeds at horizontal curves. Finally, a case study on a 4.35 km road segment demonstrated that the proposed method could efficiently conduct network level analysis.
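
    A minimal sketch of the kinematic and lateral-acceleration radius estimates mentioned above, with assumed single-point measurements; the geometry method and the ANOVA comparison are not shown.

```python
# Minimal sketch (single assumed measurement): the kinematic and
# lateral-acceleration estimates of horizontal curve radius from IMU data.
speed = 20.0          # vehicle speed [m/s]
yaw_rate = 0.05       # heading-angle rate from the IMU [rad/s]
a_lateral = 1.05      # measured lateral acceleration [m/s^2]

r_kinematic = speed / yaw_rate          # R = v / (d(heading)/dt)
r_lateral = speed ** 2 / a_lateral      # R = v^2 / a_lat

print(f"kinematic estimate  : {r_kinematic:.0f} m")
print(f"lateral-acc estimate: {r_lateral:.0f} m")
```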

  6. A versatile curve-fit model for linear to deeply concave rank abundance curves

    NARCIS (Netherlands)

    Neuteboom, J.H.; Struik, P.C.

    2005-01-01

    A new, flexible curve-fit model for linear to concave rank abundance curves was conceptualized and validated using observational data. The model links the geometric-series model and log-series model and can also fit deeply concave rank abundance curves. The model is based – in an unconventional way

  7. Temporal Drivers of Liking Based on Functional Data Analysis and Non-Additive Models for Multi-Attribute Time-Intensity Data of Fruit Chews.

    Science.gov (United States)

    Kuesten, Carla; Bi, Jian

    2018-06-03

    Conventional drivers of liking analysis was extended with a time dimension into temporal drivers of liking (TDOL) based on functional data analysis methodology and non-additive models for multiple-attribute time-intensity (MATI) data. The non-additive models, which consider both direct effects and interaction effects of attributes to consumer overall liking, include Choquet integral and fuzzy measure in the multi-criteria decision-making, and linear regression based on variance decomposition. Dynamics of TDOL, i.e., the derivatives of the relative importance functional curves were also explored. Well-established R packages 'fda', 'kappalab' and 'relaimpo' were used in the paper for developing TDOL. Applied use of these methods shows that the relative importance of MATI curves offers insights for understanding the temporal aspects of consumer liking for fruit chews.

  8. Enhancement of global flood damage assessments using building material based vulnerability curves

    Science.gov (United States)

    Englhardt, Johanna; de Ruiter, Marleen; de Moel, Hans; Aerts, Jeroen

    2017-04-01

    This study discusses the development of an enhanced approach for flood damage and risk assessments using vulnerability curves that are based on building material information. The approach draws upon common practices in earthquake vulnerability assessments and is an alternative to the land-use or building-occupancy approach in flood risk assessment models. The approach is of particular importance for studies where there is a large variation in building material, such as large scale studies or studies in developing countries. A case study of Ethiopia is used to demonstrate the impact of the different methodological approaches on direct damage assessments due to flooding. Generally, flood damage assessments use damage curves for different land-use or occupancy types (i.e. urban or residential and commercial classes). However, these categories do not necessarily relate directly to vulnerability to damage by flood waters. For this, the construction type and building material may be more important, as is the case in earthquake risk assessments. For this study, we use building material classification data of the PAGER project to define new building-material-based vulnerability classes for flood damage. This approach will be compared to the widely applied land-use based vulnerability curves such as those used by De Moel et al. (2011). The case of Ethiopia demonstrates and compares the feasibility of this novel flood vulnerability method at a country level, which holds the potential to be scaled up to a global level. The study shows that flood vulnerability based on building material also allows for better differentiation between flood damage in urban and rural settings, opening doors to better links to poverty studies when such exposure data are available. Furthermore, this new approach paves the road to the enhancement of multi-risk assessments, as the method enables the comparison of vulnerability across different natural hazard types that also use material-based vulnerability curves.

  9. Satellite altimetry based rating curves throughout the entire Amazon basin

    Science.gov (United States)

    Paris, A.; Calmant, S.; Paiva, R. C.; Collischonn, W.; Silva, J. S.; Bonnet, M.; Seyler, F.

    2013-05-01

    The Amazonian basin is the largest hydrological basin in the world. In recent years, the basin has experienced an unusual succession of extreme droughts and floods, whose origin is still a matter of debate. Yet the amount of data available is poor, over both time and space scales, due to factors like the basin's size and access difficulty. One of the major challenges is to obtain discharge series distributed over the entire basin. Satellite altimetry can be used to improve our knowledge of the hydrological stream flow conditions in the basin, through rating curves. Rating curves are mathematical relationships between stage and discharge at a given place. The common way to determine the parameters of the relationship is to compute the non-linear regression between the discharge and stage series. In this study, the discharge data were obtained by simulation over the entire basin using the MGB-IPH model with TRMM Merge input rainfall data and assimilation of gage data, run from 1998 to 2010. The stage dataset is made of ~800 altimetry series at ENVISAT and JASON-2 virtual stations. Altimetry series span between 2002 and 2010. In the present work we present the benefits of using stochastic methods instead of probabilistic ones to determine a dataset of rating curve parameters which are consistent throughout the entire Amazon basin. The rating curve parameters have been computed using a parameter optimization technique based on a Markov Chain Monte Carlo sampler and a Bayesian inference scheme. This technique provides an estimate of the best parameters for the rating curve, but also their posterior probability distribution, allowing the determination of a credibility interval for the rating curve. The error in the discharge estimates from the MGB-IPH model is also included in the rating curve determination. These MGB-IPH errors come from either errors in the discharge derived from the gage readings or errors in the satellite rainfall estimates. The present
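
    The record describes estimating rating-curve parameters with an MCMC sampler and Bayesian inference. The following is a minimal random-walk Metropolis sketch for a rating curve of the form Q = a(Z - Z0)^b with a simple 10% Gaussian error model and synthetic stage-discharge data; the priors, error model and numbers are illustrative assumptions, not those of the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stage/discharge pairs around a "true" rating curve Q = a (Z - Z0)^b
a_true, b_true, z0_true = 20.0, 1.8, 1.5
Z = rng.uniform(2.0, 10.0, size=200)
Q = a_true * (Z - z0_true) ** b_true * (1 + 0.1 * rng.normal(size=Z.size))

def log_post(theta):
    """Log-posterior with flat priors on (a, b, Z0) and a 10% Gaussian error model."""
    a, b, z0 = theta
    if a <= 0 or b <= 0 or z0 >= Z.min():
        return -np.inf
    q_hat = a * (Z - z0) ** b
    sigma = 0.1 * q_hat
    return -0.5 * np.sum(((Q - q_hat) / sigma) ** 2 + np.log(sigma ** 2))

# Random-walk Metropolis sampler
theta = np.array([10.0, 1.5, 0.5])          # starting point
step = np.array([0.5, 0.02, 0.05])          # proposal scales
samples, lp = [], log_post(theta)
for _ in range(20000):
    prop = theta + step * rng.normal(size=3)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
samples = np.array(samples[5000:])          # discard burn-in
print("posterior medians (a, b, Z0):", np.median(samples, axis=0))
print("95% credible interval for Z0:", np.percentile(samples[:, 2], [2.5, 97.5]))
```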

  10. High resolution melt curve analysis based on methylation status for human semen identification.

    Science.gov (United States)

    Fachet, Caitlyn; Quarino, Lawrence; Karnas, K Joy

    2017-03-01

    A high resolution melt curve assay to differentiate semen from blood, saliva, urine, and vaginal fluid based on methylation status at the Dapper Isoform 1 (DACT1) gene was developed. Stains made from blood, saliva, urine, semen, and vaginal fluid were obtained from volunteers and DNA was isolated using either organic extraction (saliva, urine, and vaginal fluid) or Chelex® 100 extraction (blood and semen). Extracts were then subjected to bisulfite modification in order to convert unmethylated cytosines to uracil, consequently creating sequences whose amplicons have melt curves that vary depending on their initial methylation status. When primers designed to amplify the promoter region of the DACT1 gene were used, DNA from semen samples was distinguishable from other fluids by having a statistically significantly lower melting temperature. The assay was found to be sperm-specific, since semen from a vasectomized man produced a melting temperature similar to the non-semen body fluids. Blood and semen stains stored up to 5 months and tested at various intervals showed little variation in melt temperature, indicating that the methylation status was stable during the course of the study. The assay is a more viable method for forensic science practice than most molecular-based methods for body fluid stain identification since it is time-efficient and utilizes instrumentation common to forensic biology laboratories. In addition, the assay is advantageous over traditional presumptive chemical methods for body fluid identification since results are confirmatory and the assay offers the possibility of multiplexing, which may test for multiple body fluids simultaneously.

  11. Proposal of fatigue crack growth rate curve in air for nickel-base alloys used in BWR

    International Nuclear Information System (INIS)

    Ogawa, Takuya; Itatani, Masao; Nagase, Hiroshi; Aoike, Satoru; Yoneda, Hideki

    2013-01-01

    When defects are detected in nuclear components in Japan, a structural integrity assessment should be performed for the technical judgment on continued service based on the Rules on Fitness-for-Service for Nuclear Power Plants of the Japan Society of Mechanical Engineers Code (JSME FFS Code). Fatigue crack growth analysis is required when cyclic loading is applied to the components. Recently, a fatigue crack growth rate curve in air for nickel-base alloy weld metal used in BWRs was proposed by the authors, and it was adopted as a code case of the JSME FFS Code to evaluate embedded flaws. In this study, the fatigue crack growth behavior of the heat-affected zone (HAZ) of nickel-base alloys in air was investigated, and a unified fatigue crack growth rate curve in air for the HAZ and weld metal of nickel-base alloys used in BWRs was evaluated. As a result, it was found that the curve for weld metal could be applied as a curve for both the HAZ and the weld metal, since a moderately conservative assessment of the fatigue crack growth rate of the HAZ is possible with the weld metal curve in the Paris region. The threshold value of the stress intensity factor range (ΔKth) is determined to be 3.0 MPa√m based on the fatigue crack growth rate of the HAZ. (author)

  12. Statistical data processing of mobility curves of univalent weak bases

    Czech Academy of Sciences Publication Activity Database

    Šlampová, Andrea; Boček, Petr

    2008-01-01

    Roč. 29, č. 2 (2008), s. 538-541 ISSN 0173-0835 R&D Projects: GA AV ČR IAA400310609; GA ČR GA203/05/2106 Institutional research plan: CEZ:AV0Z40310501 Keywords : mobility curve * univalent weak bases * statistical evaluation Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 3.509, year: 2008

  13. Trajectory Optimization of Spray Painting Robot for Complex Curved Surface Based on Exponential Mean Bézier Method

    Directory of Open Access Journals (Sweden)

    Wei Chen

    2017-01-01

    Full Text Available Automated tool trajectory planning for spray painting robots is still a challenging problem, especially for a large complex curved surface. This paper presents a new method of trajectory optimization for spray painting robots based on the exponential mean Bézier method. The definition and the three theorems of exponential mean Bézier curves are discussed. Then a spatial painting path generation method based on exponential mean Bézier curves is developed. A new simple algorithm for trajectory optimization on complex curved surfaces is introduced. A golden section method is adopted to calculate the values. The experimental results illustrate that the exponential mean Bézier curves enhanced the flexibility of the path planning, and that the trajectory optimization algorithm achieved satisfactory performance. This method can also be extended to other applications.

  14. An Enhanced Biometric Based Authentication with Key-Agreement Protocol for Multi-Server Architecture Based on Elliptic Curve Cryptography.

    Science.gov (United States)

    Reddy, Alavalapati Goutham; Das, Ashok Kumar; Odelu, Vanga; Yoo, Kee-Young

    2016-01-01

    Biometric based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. The careful investigation of this paper proves that Lu et al.'s protocol does not provide user anonymity or perfect forward secrecy and is susceptible to server and user impersonation attacks, man-in-the-middle attacks and clock synchronization problems. In addition, this paper proposes an enhanced biometric based authentication with key-agreement protocol for multi-server architecture based on elliptic curve cryptography using smartcards. We proved that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and the performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.'s protocol and existing similar protocols.
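
    For readers unfamiliar with the elliptic-curve primitives such protocols rely on, the snippet below shows a plain ECDH key agreement with the Python 'cryptography' package. It is only the underlying building block (shared-secret derivation followed by a key-derivation function), not the authors' biometric authentication protocol; the curve choice and the 'card'/'server' labels are illustrative.

```python
# A minimal ECDH key-agreement sketch using the 'cryptography' package; it
# illustrates the elliptic-curve primitive underlying protocols like the one
# above, not the authors' authentication scheme itself.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party (e.g. a smart card and a server) generates an ephemeral key pair
card_private = ec.generate_private_key(ec.SECP256R1())
server_private = ec.generate_private_key(ec.SECP256R1())

# Each side combines its private key with the other's public key
card_shared = card_private.exchange(ec.ECDH(), server_private.public_key())
server_shared = server_private.exchange(ec.ECDH(), card_private.public_key())
assert card_shared == server_shared        # both derive the same secret

# Derive a symmetric session key from the shared secret
session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                   salt=None, info=b"session key").derive(card_shared)
print(session_key.hex())
```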

  15. Laws governing the energy conversion of ionization curves

    International Nuclear Information System (INIS)

    Gorgoskii, V.I.

    1986-01-01

    The author attempts to determine whether ionization curves are structured or smooth, the cause of the smoothing of the curves, whether the curves can have maxima and why, how many maxima there are on the ionization curve, and which of these maxima is the fundamental one. The study shows that ionization curves without an additional maximum, i.e., with one fundamental maximum, can be obtained for potassium, rubidium, and cesium. This requires reduction of the density of the electrons in the stream and the density of the atoms of the target gas. It is also shown that in order to obtain ionization curves with additional maxima in the cases of neon, argon, and krypton, the measurements must be carried out at high densities of the electrons in the stream and of the atoms of the target gas

  16. Refined tropical curve counts and canonical bases for quantum cluster algebras

    DEFF Research Database (Denmark)

    Mandel, Travis

    We express the (quantizations of the) Gross-Hacking-Keel-Kontsevich canonical bases for cluster algebras in terms of certain (Block-Göttsche) weighted counts of tropical curves. In the process, we obtain via scattering diagram techniques a new invariance result for these Block-Göttsche counts....

  17. Study of zirconium-addition binary systems

    International Nuclear Information System (INIS)

    Wozniakova, B.; Kuchar, L.

    1975-01-01

    The solidus and liquidus curves of binary zirconium-addition systems are given. Most additions reduce the melting temperature of zirconium; the only known additions that increase the melting temperature are nitrogen, oxygen and hafnium. The transformation curves of the systems are also given, together with the elements which reduce or raise the temperature of the α-β transformation. From the Mendeleev table into which the solidus and liquidus curves of the binary systems are plotted, it is possible to predict the properties of unknown binary systems. For the calculation of the solidus and liquidus curves, 1860 degC was taken as the melting temperature of zirconium. For the calculation of the transformation curves, 865 degC was taken as the temperature of the α-β transformation. The equations of the solidus and liquidus curves and of the transformation curves of some Zr-addition systems are given. Also given are the calculated equilibrium distribution coefficients of additions in Zr, for both melting and transformation, together with their limit values for temperatures approximating the melting point or the transformation temperature of pure Zr, and the values pertaining to eutectic and peritectic or eutectoid and peritectoid temperatures. (J.B.)

  18. Analytical expression for initial magnetization curve of Fe-based soft magnetic composite material

    Energy Technology Data Exchange (ETDEWEB)

    Birčáková, Zuzana, E-mail: zuzana.bircakova@upjs.sk [Institute of Physics, Faculty of Science, Pavol Jozef Šafárik University, Park Angelinum 9, 04154 Košice (Slovakia); Kollár, Peter; Füzer, Ján [Institute of Physics, Faculty of Science, Pavol Jozef Šafárik University, Park Angelinum 9, 04154 Košice (Slovakia); Bureš, Radovan; Fáberová, Mária [Institute of Materials Research, Slovak Academy of Sciences, Watsonova 47, 04001 Košice (Slovakia)

    2017-02-01

    The analytical expression for the initial magnetization curve of an Fe-phenol-formaldehyde resin composite material was derived based on the already proposed ideas of the magnetization vector deviation function and the domain wall annihilation function, characterizing the reversible magnetization processes through the extent of deviation of magnetization vectors from the magnetic field direction and the irreversible processes through the effective numbers of movable domain walls, respectively. As specific dependences of these functions were observed for composite materials, the ideas were extended to meet the composites' special features, which are principally the much higher inner demagnetizing fields produced by magnetic poles on ferromagnetic particle surfaces. The proposed analytical expression enables us to find the relative extent of each type of magnetization process when magnetizing a specimen along the initial curve. - Highlights: • Analytical expression of the initial curve derived for SMC. • Initial curve described by elementary magnetization processes. • Influence of inner demagnetizing fields on magnetization process in SMC.

  19. Mannheim Curves in Nonflat 3-Dimensional Space Forms

    Directory of Open Access Journals (Sweden)

    Wenjing Zhao

    2015-01-01

    Full Text Available We consider the Mannheim curves in nonflat 3-dimensional space forms (Riemannian or Lorentzian) and we give the concept of Mannheim curves. In addition, we investigate the properties of nonnull Mannheim curves and their partner curves. We come to the conclusion that a necessary and sufficient condition is that a linear relationship with constant coefficients exists between the curvature and the torsion of the given original curves. In the case of null curves, we show that there are no null Mannheim curves in the 3-dimensional de Sitter space.

  20. Comparison of Paired ROC Curves through a Two-Stage Test.

    Science.gov (United States)

    Yu, Wenbao; Park, Eunsik; Chang, Yuan-Chin Ivan

    2015-01-01

    The area under the receiver operating characteristic (ROC) curve (AUC) is a popularly used index when comparing two ROC curves. Statistical tests based on it for analyzing the difference have been well developed. However, this index is less informative when two ROC curves cross and have similar AUCs. In order to detect differences between ROC curves in such situations, a two-stage nonparametric test that uses a shifted area under the ROC curve (sAUC), along with AUCs, is proposed for paired designs. The new procedure is shown, numerically, to be effective in terms of power under a wide range of scenarios; additionally, it outperforms two conventional ROC-type tests, especially when two ROC curves cross each other and have similar AUCs. Larger sAUC implies larger partial AUC at the range of low false-positive rates in this case. Because high specificity is important in many classification tasks, such as medical diagnosis, this is an appealing characteristic. The test also implicitly analyzes the equality of two commonly used binormal ROC curves at every operating point. We also apply the proposed method to synthesized data and two real examples to illustrate its usefulness in practice.
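
    As a rough numerical companion, the sketch below computes an empirical ROC curve, its full AUC, and a partial AUC restricted to low false-positive rates for two synthetic classifiers. The partial AUC is used here only as an illustrative proxy for the paper's shifted AUC (sAUC), whose exact definition and two-stage test are not reproduced; the data and cut-off are invented.

```python
import numpy as np

def roc_curve_points(scores, labels):
    """Empirical ROC curve: FPR and TPR at every threshold."""
    order = np.argsort(-scores)
    labels = np.asarray(labels)[order]
    tps = np.cumsum(labels)
    fps = np.cumsum(1 - labels)
    tpr = np.concatenate([[0.0], tps / labels.sum()])
    fpr = np.concatenate([[0.0], fps / (1 - labels).sum()])
    return fpr, tpr

def auc(fpr, tpr, fpr_max=1.0):
    """(Partial) area under the ROC curve up to fpr_max via the trapezoidal rule."""
    mask = fpr <= fpr_max
    x = np.append(fpr[mask], fpr_max)
    y = np.append(tpr[mask], np.interp(fpr_max, fpr, tpr))
    return np.trapz(y, x)

rng = np.random.default_rng(1)
labels = rng.integers(0, 2, size=500)
# Two hypothetical classifiers with similar overall AUC but different behaviour
s1 = labels + rng.normal(scale=1.0, size=500)
s2 = labels + rng.normal(scale=1.3, size=500) + 0.3
for name, s in [("test A", s1), ("test B", s2)]:
    fpr, tpr = roc_curve_points(s, labels)
    print(name, "AUC =", round(auc(fpr, tpr), 3),
          "partial AUC (FPR<=0.2) =", round(auc(fpr, tpr, 0.2), 3))
```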

  1. Curves and Abelian varieties

    CERN Document Server

    Alexeev, Valery; Clemens, C Herbert; Beauville, Arnaud

    2008-01-01

    This book is devoted to recent progress in the study of curves and abelian varieties. It discusses both classical aspects of this deep and beautiful subject as well as two important new developments, tropical geometry and the theory of log schemes. In addition to original research articles, this book contains three surveys devoted to singularities of theta divisors, of compactified Jacobians of singular curves, and of "strange duality" among moduli spaces of vector bundles on algebraic varieties.

  2. Probability- and curve-based fractal reconstruction on 2D DEM terrain profile

    International Nuclear Information System (INIS)

    Lai, F.-J.; Huang, Y.M.

    2009-01-01

    Data compression and reconstruction play important roles in information science and engineering. As part of this, image compression and reconstruction, which mainly deal with image data set reduction for storage or transmission and data set restoration with the least loss, remain a topic deserving a great deal of attention. In this paper we propose a new scheme, compared against the well-known Improved Douglas-Peucker (IDP) method, to extract characteristic or feature points of a two-dimensional digital elevation model (2D DEM) terrain profile in order to compress the data set. For reconstruction using fractal interpolation, we propose a probability-based method to speed up the fractal interpolation execution to a rate as high as three or even nine times the regular one. In addition, a curve-based method is proposed in the study to determine the vertical scaling factor, which strongly affects the generation of the interpolated data points, so as to significantly improve the reconstruction performance. Finally, an evaluation is made to show the advantage of employing the proposed new method to extract characteristic points in combination with our novel fractal interpolation scheme.
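
    A minimal sketch of the fractal interpolation step referred to above: given feature points extracted from a profile and a set of vertical scaling factors, an iterated function system of affine maps is built and its attractor is generated by random iteration (chaos game). The probability-based speed-up and the curve-based choice of the vertical scaling factor proposed in the paper are not reproduced; the points, scaling factors and uniform map selection are assumptions.

```python
import numpy as np

def fractal_interpolate(px, py, d, n_iter=20000, seed=0):
    """Random-iteration (chaos game) construction of a fractal interpolation
    function through the feature points (px, py) with vertical scaling factors d."""
    px, py, d = map(np.asarray, (px, py, d))
    N = len(px) - 1
    dx = px[-1] - px[0]
    # Coefficients of the N affine maps w_i(x, y) = (a x + e, c x + d y + f)
    a = (px[1:] - px[:-1]) / dx
    e = (px[-1] * px[:-1] - px[0] * px[1:]) / dx
    c = (py[1:] - py[:-1]) / dx - d * (py[-1] - py[0]) / dx
    f = (px[-1] * py[:-1] - px[0] * py[1:]) / dx - d * (px[-1] * py[0] - px[0] * py[-1]) / dx

    rng = np.random.default_rng(seed)
    x, y = px[0], py[0]
    pts = np.empty((n_iter, 2))
    for k in range(n_iter):
        i = rng.integers(N)                     # pick one affine map at random
        x, y = a[i] * x + e[i], c[i] * x + d[i] * y + f[i]
        pts[k] = x, y
    return pts[100:]                            # drop transient points

# Feature points extracted from a hypothetical terrain profile, plus the
# vertical scaling factors that control the roughness of the reconstruction
px = [0.0, 2.0, 5.0, 7.0, 10.0]
py = [10.0, 14.0, 9.0, 13.0, 11.0]
d = [0.3, -0.3, 0.3, -0.3]
profile = fractal_interpolate(px, py, d)
print(profile[:5])
```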

  3. Curve Evolution in Subspaces and Exploring the Metameric Class of Histogram of Gradient Orientation based Features using Nonlinear Projection Methods

    DEFF Research Database (Denmark)

    Tatu, Aditya Jayant

    This thesis deals with two unrelated issues, restricting curve evolution to subspaces and computing image patches in the equivalence class of Histogram of Gradient orientation based features using nonlinear projection methods. Curve evolution is a well known method used in various applications like...... tracking interfaces, active contour based segmentation methods and others. It can also be used to study shape spaces, as deforming a shape can be thought of as evolving its boundary curve. During curve evolution a curve traces out a path in the infinite dimensional space of curves. Due to application...... specific requirements like shape priors or a given data model, and due to limitations of the computer, the computed curve evolution forms a path in some finite dimensional subspace of the space of curves. We give methods to restrict the curve evolution to a finite dimensional linear or implicitly defined...

  4. Determination of critical nitrogen dilution curve based on stem dry matter in rice.

    Directory of Open Access Journals (Sweden)

    Syed Tahir Ata-Ul-Karim

    Full Text Available Plant analysis is a very promising diagnostic tool for assessment of crop nitrogen (N) requirements in the perspective of cost-effective and environment-friendly agriculture. Diagnosing the N nutritional status of the rice crop through plant analysis will give insights into optimizing N requirements of future crops. The present study was aimed at developing a new methodology for determining the critical nitrogen (Nc) dilution curve based on stem dry matter (SDM) and assessing its suitability to estimate the level of N nutrition for rice (Oryza sativa L.) in east China. Three field experiments with varied N rates (0-360 kg N ha^-1) using three Japonica rice hybrids, Lingxiangyou-18, Wuxiangjing-14 and Wuyunjing, were conducted in Jiangsu province of east China. SDM and stem N concentration (SNC) were determined during the vegetative stage for growth analysis. An Nc dilution curve based on SDM was described by the equation Nc = 2.17W^-0.27, with W being SDM in t ha^-1, when SDM ranged from 0.88 to 7.94 t ha^-1. However, for SDM < 0.88 t ha^-1, the constant critical value Nc = 1.76% SDM was applied. The curve was dually validated for N-limiting and non-N-limiting growth conditions. The N nutrition index (NNI) and accumulated N deficit (Nand) of the stem ranged from 0.57 to 1.06 and from 51.1 to -7.07 kg N ha^-1, respectively, during key growth stages under varied N rates in 2010 and 2011. The values of ΔN derived from either NNI or Nand could be used as references for N dressing management during rice growth. Our results demonstrated that the present curve well differentiated the conditions of limiting and non-limiting N nutrition in the rice crop. The SDM-based Nc dilution curve can be adopted as an alternate and novel approach for evaluating plant N status to support N fertilization decisions during the vegetative growth of Japonica rice in east China.
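
    The reported dilution curve lends itself to a very small computation: given stem dry matter and measured stem N concentration, the critical N concentration, the N nutrition index (NNI = measured/critical) and an accumulated N deficit can be evaluated. The sketch below assumes a common formulation of the deficit, Nand = (Nc - SNC)/100 × SDM, converted to kg N per hectare; the observations are invented.

```python
import numpy as np

def critical_n(sdm):
    """Critical stem N concentration (%) from the stem-dry-matter dilution curve
    reported above: Nc = 2.17 * W^(-0.27) for W >= 0.88 t/ha, else 1.76%."""
    sdm = np.asarray(sdm, dtype=float)
    return np.where(sdm < 0.88, 1.76, 2.17 * sdm ** (-0.27))

# Hypothetical field observations: stem dry matter (t/ha) and measured stem N (%)
sdm_obs = np.array([0.5, 1.2, 3.0, 6.5])
snc_obs = np.array([1.9, 1.6, 1.4, 1.1])

nc = critical_n(sdm_obs)
nni = snc_obs / nc                                   # N nutrition index
n_deficit = (nc - snc_obs) / 100 * sdm_obs * 1000    # accumulated N deficit, kg N/ha
print("Nc  :", np.round(nc, 2))
print("NNI :", np.round(nni, 2))
print("Nand:", np.round(n_deficit, 1), "kg N/ha")
```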

  5. An Approach of Estimating Individual Growth Curves for Young Thoroughbred Horses Based on Their Birthdays

    Science.gov (United States)

    ONODA, Tomoaki; YAMAMOTO, Ryuta; SAWAMURA, Kyohei; MURASE, Harutaka; NAMBO, Yasuo; INOUE, Yoshinobu; MATSUI, Akira; MIYAKE, Takeshi; HIRAI, Nobuhiro

    2014-01-01

    We propose an approach for estimating individual growth curves based on the birthday information of Japanese Thoroughbred horses, with consideration of the seasonal compensatory growth that is a typical characteristic of seasonally breeding animals. Compensatory growth patterns appear only during the winter and spring seasons in the life of growing horses, and the meeting point between winter and spring depends on the birthday of each horse. We previously developed new growth curve equations for Japanese Thoroughbreds adjusting for compensatory growth. Based on the equations, a parameter denoting the birthday information was added for the modeling of the individual growth curves for each horse by shifting the meeting points in the compensatory growth periods. A total of 5,594 and 5,680 body weight and age measurements of Thoroughbred colts and fillies, respectively, and 3,770 withers height and age measurements of both sexes were used in the analyses. The results of predicted error difference and Akaike Information Criterion showed that the individual growth curves using birthday information fit the body weight and withers height data better than those not using it. The individual growth curve for each horse would be a useful tool for the feeding management of young Japanese Thoroughbreds in compensatory growth periods. PMID:25013356

  6. THE CPA QUALIFICATION METHOD BASED ON THE GAUSSIAN CURVE FITTING

    Directory of Open Access Journals (Sweden)

    M.T. Adithia

    2015-01-01

    Full Text Available The Correlation Power Analysis (CPA) attack is an attack on cryptographic devices, especially smart cards. The results of the attack are correlation traces. Based on the correlation traces, an evaluation is done to observe whether significant peaks appear in the traces or not. The evaluation is done manually, by experts. If significant peaks appear then the smart card is not considered secure, since it is assumed that the secret key is revealed. We develop a method that objectively detects peaks and decides which peak is significant. We conclude that using the Gaussian curve fitting method, the subjective qualification of the peak significance can be objectified. Thus, better decisions can be taken by security experts. We also conclude that the Gaussian curve fitting method is able to show the influence of peak sizes, especially the width and height, on the significance of a particular peak.
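
    A minimal sketch of the Gaussian curve-fitting idea: locate the largest excursion in a (synthetic) correlation trace, fit a Gaussian to a window around it with scipy.optimize.curve_fit, and report its height relative to the off-peak noise. The trace, window size and height-to-noise ratio used as a significance indicator are illustrative assumptions, not the paper's qualification rule.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, mu, sigma, base):
    """Gaussian peak on a constant baseline."""
    return base + amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Hypothetical correlation trace: noise plus one peak at sample 300
rng = np.random.default_rng(7)
x = np.arange(1000)
trace = 0.02 * rng.normal(size=x.size) + gaussian(x, 0.25, 300, 8.0, 0.0)

# Fit a Gaussian around the largest sample and judge its significance
i0 = int(np.argmax(np.abs(trace)))
window = slice(max(i0 - 50, 0), min(i0 + 50, x.size))
p0 = [trace[i0], x[i0], 5.0, 0.0]
popt, pcov = curve_fit(gaussian, x[window], trace[window], p0=p0)
amp, mu, sigma, base = popt
noise = np.std(np.delete(trace, np.arange(window.start, window.stop)))
print(f"peak height {amp:.3f}, width {sigma:.1f} samples, "
      f"height/noise ratio {amp / noise:.1f}")
```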

  7. Probabilistic evaluation of design S-N curve and reliability assessment of ASME code-based evaluation

    International Nuclear Information System (INIS)

    Zhao Yongxiang

    1999-01-01

    A probabilistic evaluation approach for the design S-N curve and a reliability assessment approach for the ASME code-based evaluation are presented on the basis of Langer S-N model-based P-S-N curves. The P-S-N curves are estimated by a so-called general maximum likelihood method. This method can be applied to deal with the virtual stress amplitude-crack initiation life data, which have the character of double random variables. Investigation of a set of virtual stress amplitude-crack initiation life (S-N) data of 1Cr18Ni9Ti austenitic stainless steel welded joints reveals that the P-S-N curves can give a good prediction of the scatter regularity of the S-N data. The probabilistic evaluation of the design S-N curve with 0.9999 survival probability considers various uncertainties, besides the scatter of the S-N data, to an appropriate extent. The ASME code-based evaluation with a reduction factor of 20 on the mean life is much more conservative than that with a reduction factor of 2 on the stress amplitude. Evaluation of the latter at a 666.61 MPa virtual stress amplitude is equivalent to 0.999522 survival probability, and at a 2092.18 MPa virtual stress amplitude to 0.9999999995 survival probability. This means that the evaluation at low loading levels may be non-conservative and, in contrast, too conservative at high loading levels. The cause is that the reduction factors are constants and cannot take into account the general observation that the scatter of the N data increases as the loading level decreases. This indicates that it is necessary to apply the probabilistic approach to the evaluation of the design S-N curve

  8. The writhe of open and closed curves

    International Nuclear Information System (INIS)

    Berger, Mitchell A; Prior, Chris

    2006-01-01

    Twist and writhe measure basic geometric properties of a ribbon or tube. While these measures have applications in molecular biology, materials science, fluid mechanics and astrophysics, they are under-utilized because they are often considered difficult to compute. In addition, many applications involve curves with endpoints (open curves); but for these curves the definition of writhe can be ambiguous. This paper provides simple expressions for the writhe of closed curves, and provides a new definition of writhe for open curves. The open curve definition is especially appropriate when the curve is anchored at endpoints on a plane or stretches between two parallel planes. This definition can be especially useful for magnetic flux tubes in the solar atmosphere, and for isotropic rods with ends fixed to a plane

  9. Graphene based tunable fractal Hilbert curve array broadband radar absorbing screen for radar cross section reduction

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Xianjun, E-mail: xianjun.huang@manchester.ac.uk [School of Electrical and Electronic Engineering, University of Manchester, Manchester M13 9PL (United Kingdom); College of Electronic Science and Engineering, National University of Defense Technology, Changsha 410073 (China); Hu, Zhirun [School of Electrical and Electronic Engineering, University of Manchester, Manchester M13 9PL (United Kingdom); Liu, Peiguo [College of Electronic Science and Engineering, National University of Defense Technology, Changsha 410073 (China)

    2014-11-15

    This paper proposes a new type of graphene based tunable radar absorbing screen. The absorbing screen consists of a Hilbert curve metal strip array and a chemical vapour deposition (CVD) graphene sheet. The graphene based screen is not only tunable as the chemical potential of the graphene changes, but also has broadband effective absorption. The absorption bandwidth is from 8.9 GHz to 18.1 GHz, i.e., a relative bandwidth of more than 68%, at a chemical potential of 0 eV, which is significantly wider than if the graphene sheet had not been employed. As the chemical potential varies from 0 to 0.4 eV, the central frequency of the screen can be tuned from 13.5 GHz to 19.0 GHz. In the proposed structure, the Hilbert curve metal strip array was designed to provide multiple narrow-band resonances, whereas the graphene sheet directly underneath the metal strip array provides tunability and the required average surface resistance, so as to significantly extend the screen operation bandwidth by providing broadband impedance matching and absorption. In addition, the thickness of the screen has been optimized to achieve nearly the minimum thickness limitation for a nonmagnetic absorber. The working principle of this absorbing screen is studied in detail, and the performance under various incident angles is presented. This work extends applications of graphene into tunable microwave radar cross section (RCS) reduction applications.

  10. Graphene based tunable fractal Hilbert curve array broadband radar absorbing screen for radar cross section reduction

    International Nuclear Information System (INIS)

    Huang, Xianjun; Hu, Zhirun; Liu, Peiguo

    2014-01-01

    This paper proposes a new type of graphene based tunable radar absorbing screen. The absorbing screen consists of a Hilbert curve metal strip array and a chemical vapour deposition (CVD) graphene sheet. The graphene based screen is not only tunable as the chemical potential of the graphene changes, but also has broadband effective absorption. The absorption bandwidth is from 8.9 GHz to 18.1 GHz, i.e., a relative bandwidth of more than 68%, at a chemical potential of 0 eV, which is significantly wider than if the graphene sheet had not been employed. As the chemical potential varies from 0 to 0.4 eV, the central frequency of the screen can be tuned from 13.5 GHz to 19.0 GHz. In the proposed structure, the Hilbert curve metal strip array was designed to provide multiple narrow-band resonances, whereas the graphene sheet directly underneath the metal strip array provides tunability and the required average surface resistance, so as to significantly extend the screen operation bandwidth by providing broadband impedance matching and absorption. In addition, the thickness of the screen has been optimized to achieve nearly the minimum thickness limitation for a nonmagnetic absorber. The working principle of this absorbing screen is studied in detail, and the performance under various incident angles is presented. This work extends applications of graphene into tunable microwave radar cross section (RCS) reduction applications

  11. Diagnostic tests’ decision-making rules based upon analysis of ROC-curves

    Directory of Open Access Journals (Sweden)

    Л. В. Батюк

    2015-10-01

    Full Text Available In this paper we propose a model which substantiates diagnostic decision making based on the analysis of Receiver Operating Characteristic curves (ROC curves) and predicts optimal values of diagnostic indicators of biomedical information. To assess the quality of the test result prediction, the standard criteria of the sensitivity and specificity of the model were used. Values of these criteria were calculated for the cases when the sensitivity of the test was several times greater than its specificity, when the number of correct diagnoses was maximal, when the sensitivity of the test was equal to its specificity, and when the sensitivity of the test was several times greater than the specificity of the test. To assess the significance of the factor characteristics and to compare the prognostic characteristics of models, we used mathematical modeling and plotted the ROC curves. The optimal value of the diagnostic indicator was found to be achieved when the sensitivity of the test is equal to its specificity. The model was adapted to solve the case when the sensitivity of the test is greater than the specificity of the test.

  12. An Enhanced Biometric Based Authentication with Key-Agreement Protocol for Multi-Server Architecture Based on Elliptic Curve Cryptography

    Science.gov (United States)

    Reddy, Alavalapati Goutham; Das, Ashok Kumar; Odelu, Vanga; Yoo, Kee-Young

    2016-01-01

    Biometric based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. The careful investigation of this paper proves that Lu et al.’s protocol does not provide user anonymity or perfect forward secrecy and is susceptible to server and user impersonation attacks, man-in-the-middle attacks and clock synchronization problems. In addition, this paper proposes an enhanced biometric based authentication with key-agreement protocol for multi-server architecture based on elliptic curve cryptography using smartcards. We proved that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and the performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.’s protocol and existing similar protocols. PMID:27163786

  13. An Enhanced Biometric Based Authentication with Key-Agreement Protocol for Multi-Server Architecture Based on Elliptic Curve Cryptography.

    Directory of Open Access Journals (Sweden)

    Alavalapati Goutham Reddy

    Full Text Available Biometric based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. The careful investigation of this paper proves that Lu et al.'s protocol does not provide user anonymity or perfect forward secrecy and is susceptible to server and user impersonation attacks, man-in-the-middle attacks and clock synchronization problems. In addition, this paper proposes an enhanced biometric based authentication with key-agreement protocol for multi-server architecture based on elliptic curve cryptography using smartcards. We proved that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and the performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.'s protocol and existing similar protocols.

  14. Dealing with Non-stationarity in Intensity-Frequency-Duration Curve

    Science.gov (United States)

    Rengaraju, S.; Rajendran, V.; C T, D.

    2017-12-01

    Extremes like floods and droughts are becoming more frequent and more severe in recent times, which is generally attributed to climate change. One of the main concerns is whether present infrastructure like dams, storm water drainage networks, etc., which was designed following the so-called 'stationary' assumption, is capable of withstanding the expected severe extremes. The stationarity assumption considers that extremes are not changing with respect to time. However, recent studies have shown that climate change has altered climate extremes both temporally and spatially. Traditionally, the observed non-stationarity in extreme precipitation is incorporated in the extreme value distributions in terms of changing parameters. Nevertheless, this raises the question of which parameter needs to change, i.e. location, scale or shape, since one or more of these parameters may vary at a given location. Hence, this study aims to detect the changing parameters to reduce the complexity involved in the development of non-stationary IDF curves and to provide the uncertainty bound of the estimated return level using a Bayesian Differential Evolutionary Monte Carlo (DE-MC) algorithm. Firstly, the extreme precipitation series is extracted using a peak-over-threshold approach. Then, the time-varying parameter(s) is(are) detected for the extracted series using Generalized Additive Models for Location Scale and Shape (GAMLSS). The IDF curve is then constructed using the Generalized Pareto Distribution, incorporating non-stationarity only if the parameter(s) is(are) changing with respect to time; otherwise the IDF curve follows the stationary assumption. Finally, the posterior probability intervals of the estimated return level are computed through the Bayesian DE-MC approach and the non-stationarity-based IDF curve is compared with the stationarity-based IDF curve. The results of this study emphasize that the time-varying parameters also change spatially and that IDF curves should incorporate non-stationarity
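
    As a small companion to the peak-over-threshold step, the sketch below fits a stationary Generalized Pareto Distribution to threshold exceedances of a synthetic rainfall series and converts it into a T-year return level. The GAMLSS detection of time-varying parameters and the Bayesian DE-MC credibility intervals described in the record are not reproduced; the data, threshold choice and record length are assumptions.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(3)
# Hypothetical daily rainfall series (mm); only the tail matters here
rain = rng.gamma(shape=0.4, scale=8.0, size=40 * 365)

threshold = np.quantile(rain, 0.98)                 # peak-over-threshold cut-off
excess = rain[rain > threshold] - threshold

# Fit a Generalized Pareto Distribution to the exceedances (location fixed at 0)
shape, loc, scale = genpareto.fit(excess, floc=0)

# Return level for a T-year event under stationarity
T = 50                                              # years
lam = excess.size / 40                              # mean exceedances per year
p = 1.0 - 1.0 / (lam * T)                           # quantile within the GPD
return_level = threshold + genpareto.ppf(p, shape, loc=0, scale=scale)
print(f"GPD shape={shape:.3f}, scale={scale:.2f}, {T}-year level={return_level:.1f} mm")
```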

  15. Titration Curves: Fact and Fiction.

    Science.gov (United States)

    Chamberlain, John

    1997-01-01

    Discusses ways in which datalogging equipment can enable titration curves to be measured accurately and how computing power can be used to predict the shape of curves. Highlights include sources of error, use of spreadsheets to generate titration curves, titration of a weak acid with a strong alkali, dibasic acids, weak acid and weak base, and…

  16. Customized versus population-based growth curves: prediction of low body fat percent at term corrected gestational age following preterm birth.

    Science.gov (United States)

    Law, Tameeka L; Katikaneni, Lakshmi D; Taylor, Sarah N; Korte, Jeffrey E; Ebeling, Myla D; Wagner, Carol L; Newman, Roger B

    2012-07-01

    Compare customized versus population-based growth curves for identification of small-for-gestational-age (SGA) and body fat percent (BF%) among preterm infants. Prospective cohort study of 204 preterm infants classified as SGA or appropriate-for-gestational-age (AGA) by population-based and customized growth curves. BF% was determined by air-displacement plethysmography. Differences between groups were compared using bivariable and multivariable linear and logistic regression analyses. Customized curves reclassified 30% of the preterm infants as SGA. SGA infants identified by customized method only had significantly lower BF% (13.8 ± 6.0) than the AGA (16.2 ± 6.3, p = 0.02) infants and similar to the SGA infants classified by both methods (14.6 ± 6.7, p = 0.51). Customized growth curves were a significant predictor of BF% (p = 0.02), whereas population-based growth curves were not a significant independent predictor of BF% (p = 0.50) at term corrected gestational age. Customized growth potential improves the differentiation of SGA infants and low BF% compared with a standard population-based growth curve among a cohort of preterm infants.

  17. A New Curve Tracing Algorithm Based on Local Feature in the Vectorization of Paper Seismograms

    Directory of Open Access Journals (Sweden)

    Maofa Wang

    2014-02-01

    Full Text Available Historical paper seismograms are a very important source of information for earthquake monitoring and prediction. The vectorization of paper seismograms is an important problem to be resolved. Automatic tracing of waveform curves is a key technology for the vectorization of paper seismograms. It can transform an original scanned image into digital waveform data. Accurately tracing out all the key points of each curve in a seismogram is the foundation for the vectorization of paper seismograms. In this paper, we present a new curve tracing algorithm based on local features, and apply it to the automatic extraction of earthquake waveforms in paper seismograms.

  18. Electromechanical response of a curved piezoelectric nanobeam with the consideration of surface effects

    International Nuclear Information System (INIS)

    Yan Zhi; Jiang Liying

    2011-01-01

    This work investigates the electromechanical response of a curved piezoelectric nanobeam with the consideration of surface effects through the surface-layer-based model and the generalized Young-Laplace equations. For nanoscale piezoelectric structures, the surface effects also include surface piezoelectricity in addition to the residual surface stress and surface elasticity for elastic nanomaterials. A Euler-Bernoulli curved beam theory is used to get the explicit solutions for the electroelastic fields of a curved cantilever beam when subjected to mechanical and electrical loads. In order to apply the appropriate boundary conditions on the beam, effective axial force, shear force and moment are derived. The results indicate that the surface effects play a significant role in the electroelastic fields and the piezoelectric response of the curved piezoelectric nanobeam. It is also found that the coupling of the residual surface stress, the surface elasticity and the surface piezoelectricity may be dramatic despite that the influence of the individual one is small under some circumstances. This study is expected to be useful for design and applications of curved beam based piezoelectric nanodevices, such as the curved nanowires/nanobelts or nanorings as nanoswitches or nanoactuators for displacement control purpose.

  19. Strain- and stress-based forming limit curves for DP 590 steel sheet using Marciniak-Kuczynski method

    Science.gov (United States)

    Kumar, Gautam; Maji, Kuntal

    2018-04-01

    This article deals with the prediction of strain- and stress-based forming limit curves for advanced high strength steel DP590 sheet using the Marciniak-Kuczynski (M-K) method. Three yield criteria, namely von Mises, Hill's 48 and Yld2000-2d, and two hardening laws, i.e., the Hollomon power law and the Swift hardening law, were considered to predict the forming limit curves (FLCs) for DP590 steel sheet. The effects of the imperfection factor and the initial groove angle on the prediction of the FLC were also investigated. It was observed that the FLCs shifted upward as the imperfection factor value increased. The initial groove angle was found to have a significant effect on the limit strains on the left side of the FLC, and an insignificant effect on the right side of the FLC for a certain range of strain paths. The limit strains were calculated at zero groove angle for the right side of the FLC, and a critical groove angle was used for the left side of the FLC. The numerically predicted FLCs considering the different combinations of yield criteria and hardening laws were compared with published experimental FLC results for DP590 steel sheet. The FLC predicted using the combination of the Yld2000-2d yield criterion and the Swift hardening law was in better correlation with the experimental data. Stress-based forming limit curves (SFLCs) were also calculated from the limiting strain values obtained by the M-K model. Theoretically predicted SFLCs were compared with those obtained from the experimental forming limit strains. Stress-based forming limit curves were seen to represent the forming limits of DP590 steel sheet better than strain-based forming limit curves.

  20. Effect of β on Seismic Vulnerability Curve for RC Bridge Based on Double Damage Criterion

    International Nuclear Information System (INIS)

    Feng Qinghai; Yuan Wancheng

    2010-01-01

    In the analysis of seismic vulnerability curves based on a double damage criterion, both the randomness of the structural parameters and the randomness of the seismic input should be considered. Firstly, the distribution characteristics of structural capacity and seismic demand are obtained based on IDA and pushover analyses; secondly, the vulnerability of the bridge is obtained based on ANN and MC, and a vulnerability curve for this bridge and seismic input is drawn. Finally, the analysis of a continuous bridge is presented as an example, and a parametric analysis of the effect of β is carried out, which reflects the overall bridge vulnerability from the point of view of total probability; in order to reduce the discreteness, large values of β are suggested.

  1. Identifiability of altimetry-based rating curve parameters in function of river morphological parameters

    Science.gov (United States)

    Paris, Adrien; André Garambois, Pierre; Calmant, Stéphane; Paiva, Rodrigo; Walter, Collischonn; Santos da Silva, Joecila; Medeiros Moreira, Daniel; Bonnet, Marie-Paule; Seyler, Frédérique; Monnier, Jérôme

    2016-04-01

    Estimating river discharge for ungauged river reaches from satellite measurements is not straightforward, given the nonlinearity of flow behavior with respect to measurable and non-measurable hydraulic parameters. As a matter of fact, current satellite datasets do not give access to key parameters such as river bed topography and roughness. A unique set of almost one thousand altimetry-based rating curves was built by fitting ENVISAT and Jason-2 water stages to discharges obtained from the MGB-IPH rainfall-runoff model in the Amazon basin. These rated discharges were successfully validated against simulated discharges (Ens = 0.70) and in-situ discharges (Ens = 0.71) and are not mission-dependent. The rating curve is written Q = a(Z-Z0)^b*sqrt(S), with Z the water surface elevation and S its slope obtained from satellite altimetry, a and b the power-law coefficient and exponent, and Z0 the river bed elevation such that Q(Z0) = 0. For several river reaches in the Amazon basin where ADCP measurements are available, the Z0 values are fairly well validated, with a relative error lower than 10%. The present contribution aims at relating the identifiability and the physical meaning of a, b and Z0 given various hydraulic and geomorphologic conditions. Synthetic river bathymetries sampling a wide range of rivers and inflow discharges are used to perform twin experiments. A shallow water model is run to generate synthetic satellite observations, and then rating curve parameters are determined for each river section by means of an MCMC algorithm. Through the twin experiments, it is shown that a rating curve formulation with water surface slope, i.e. closer to the Manning equation form, improves parameter identifiability. The compensation between parameters is limited, especially for reaches with little water surface variability. Rating curve parameters are analyzed for riffles and pools, for small to large rivers, and for different river slopes and cross-section shapes. It is shown that the river bed

  2. Uncertainty estimation with bias-correction for flow series based on rating curve

    Science.gov (United States)

    Shao, Quanxi; Lerat, Julien; Podger, Geoff; Dutta, Dushmanta

    2014-03-01

    Streamflow discharge constitutes one of the fundamental data required to perform water balance studies and develop hydrological models. A rating curve, designed based on a series of concurrent stage and discharge measurements at a gauging location, provides a way to generate complete discharge time series of reasonable quality if sufficient measurement points are available. However, the associated uncertainty is frequently not available, even though it has a significant impact on hydrological modelling. In this paper, we identify the discrepancy of the hydrographers' rating curves used to derive the historical discharge data series and propose a modification by bias correction, which also takes the form of a power function like the traditional rating curve. In order to obtain the uncertainty estimation, we propose a further both-side Box-Cox transformation to stabilize the regression residuals as close to the normal distribution as possible, so that a proper uncertainty can be attached to the whole discharge series in the ensemble generation. We demonstrate the proposed method by applying it to gauging stations on the Flinders and Gilbert rivers in north-west Queensland, Australia.
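
    A minimal sketch of the bias-corrected, transformed-residual idea: fit a power-law rating curve, Box-Cox transform the observed and fitted discharges with a common λ, and use the spread of the transformed residuals to generate an ensemble member of the rated series. The single-λ treatment is a simplified stand-in for the paper's both-side Box-Cox transformation, and the stage-discharge data are synthetic.

```python
import numpy as np
from scipy.stats import boxcox
from scipy.special import inv_boxcox

rng = np.random.default_rng(5)
# Hypothetical concurrent stage (m) and gauged discharge (m^3/s) measurements
stage = rng.uniform(1.0, 6.0, size=80)
discharge = 15.0 * stage ** 2.1 * np.exp(rng.normal(scale=0.15, size=80))

# Fit a power-law rating curve in log space: log Q = log a + b log h
b, log_a = np.polyfit(np.log(stage), np.log(discharge), 1)
q_hat = np.exp(log_a) * stage ** b

# Transform both observed and fitted discharges so the residuals are closer
# to Gaussian, then characterise their spread in the transformed space
q_obs_t, lam = boxcox(discharge)
q_hat_t = (q_hat ** lam - 1) / lam if lam != 0 else np.log(q_hat)
resid = q_obs_t - q_hat_t
sigma = resid.std(ddof=1)

# Generate one ensemble member of the rated discharge series with uncertainty
ensemble = inv_boxcox(q_hat_t + rng.normal(scale=sigma, size=q_hat.size), lam)
print("lambda =", round(lam, 3), "residual std =", round(sigma, 3))
```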

  3. TELECOMMUNICATIONS INFRASTRUCTURE AND GDP /JIPP CURVE/

    Directory of Open Access Journals (Sweden)

    Mariana Kaneva

    2016-07-01

    Full Text Available The relationship between telecommunications infrastructure and economic activity is under discussion in many scientific papers. Most authors use the Jipp curve for research and analysis. Many doubts about the correctness of the Jipp curve arise when applying econometric models. The aim of this study is a review of the Jipp curve, refining the possibility of its application in modern conditions. The methodology used in the study is based on dynamic econometric models, including tests for nonstationarity and tests for causality. The focus of this study is directed at methodological problems in measuring the local density of different types of telecommunication networks. This study offers a specific methodology for assessing the Jipp law, through a VAR approach and Granger causality tests. It is proved that the mechanical substitution of momentary aggregated variables (such as the number of subscribers of a telecommunication network at the end of the year) and periodically aggregated variables (such as GDP per capita) into Jipp's curve is methodologically wrong. Researchers have to reconsider the relationship set in Jipp's curve by including additional variables that characterize the telecommunications sector and the economic activity in a particular country within a specified time period. GDP per capita should not be regarded as the single factor determining the local density of telecommunications infrastructure. New econometric models studying the relationship between investments in telecommunications infrastructure and economic development need not only be linear regression models, but may also be other econometric models. New econometric models should be proposed after testing and validation against sound economic theory and econometric methodology.
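
    To make the econometric machinery concrete, the sketch below runs an augmented Dickey-Fuller stationarity check and a Granger causality test with statsmodels on two hypothetical annual growth series (GDP per capita and telecommunication density). The data, lag order and variable names are invented; the study's actual VAR specification is not reproduced.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, grangercausalitytests

rng = np.random.default_rng(11)
# Hypothetical annual series: GDP per capita growth and telephone-density growth,
# where GDP growth leads density growth by one year
n = 60
gdp_growth = rng.normal(0.02, 0.01, size=n)
density_growth = 0.5 * np.roll(gdp_growth, 1) + rng.normal(0, 0.005, size=n)
df = pd.DataFrame({"density": density_growth[1:], "gdp": gdp_growth[1:]})

# Check (non-)stationarity of each series with the augmented Dickey-Fuller test
for col in df:
    stat, pval, *_ = adfuller(df[col])
    print(f"ADF p-value for {col}: {pval:.3f}")

# Does GDP growth Granger-cause telecom density growth? (second column -> first)
res = grangercausalitytests(df[["density", "gdp"]], maxlag=2)
```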

  4. Multi-binding site model-based curve-fitting program for the computation of RIA data

    International Nuclear Information System (INIS)

    Malan, P.G.; Ekins, R.P.; Cox, M.G.; Long, E.M.R.

    1977-01-01

    In this paper, a comparison will be made of model-based and empirical curve-fitting procedures. The implementation of a multiple binding-site curve-fitting model which will successfully fit a wide range of assay data, and which can be run on a mini-computer is described. The latter sophisticated model also provides estimates of binding site concentrations and the values of the respective equilibrium constants present: the latter have been used for refining assay conditions using computer optimisation techniques. (orig./AJ) [de

  5. Method of construction spatial transition curve

    Directory of Open Access Journals (Sweden)

    S.V. Didanov

    2013-04-01

    Full Text Available Purpose. The movement of rail transport (speed of rolling stock, traffic safety, etc.) is largely dependent on the quality of the track. Here a special role is played by the transition curve, which ensures a smooth transition from a straight to a circular section of the route. The article deals with the modeling of a spatial transition curve based on a parabolic distribution of the curvature and torsion. This is a continuation of research conducted by the authors regarding the spatial modeling of curved contours. Methodology. The spatial transition curve is constructed by numerical methods for solving nonlinear integral equations, where the initial data are the coordinates of the starting and ending points of the future curve, together with the inclination of the tangent and the deviation of the curve from the tangent plane at these points. The system solved by the numerical method consists of the partial derivatives of the equations with respect to the unknown parameters of the law of change of torsion and the length of the transition curve. Findings. The parametric equations of the spatial transition curve are calculated by finding the unknown coefficients of the parabolic distribution of the curvature and torsion, as well as the spatial length of the transition curve. Originality. A method for constructing the spatial transition curve is devised and, based on it, software for the geometric modeling of spatial transition curves of railway track with specified deviations of the curve from the tangent plane. Practical value. The resulting curve can be applied in any sector of the economy where it is necessary to ensure a smooth transition from a straight to a circular section of a curved spatial bypass. Examples are the transition curves in the construction of railway lines, roads, pipes, profiles, flat sections of the working blades of turbines and compressors, ships, planes, cars, etc.
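
    The forward construction behind such a curve can be illustrated compactly: prescribe parabolic curvature and torsion laws along the arc length and integrate the Frenet-Serret equations to obtain the spatial curve. The sketch below does only this forward step with scipy; the paper's inverse problem (solving nonlinear integral equations for the unknown law parameters from end-point data) is not reproduced, and the transition length, curvature and torsion values are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

L = 100.0                 # transition length (m)
k_end = 1.0 / 500.0       # target curvature at the end (1/m), e.g. a 500 m circular arc

def kappa(s):
    """Curvature rising from 0 to k_end along a parabolic law in arc length."""
    return k_end * (s / L) ** 2

def tau(s):
    """Torsion with a small parabolic distribution (purely illustrative)."""
    return 1e-4 * (s / L) ** 2

def frenet(s, y):
    """Frenet-Serret system: position r plus the moving frame (T, N, B)."""
    r, T, N, B = y[:3], y[3:6], y[6:9], y[9:12]
    drds = T
    dTds = kappa(s) * N
    dNds = -kappa(s) * T + tau(s) * B
    dBds = -tau(s) * N
    return np.concatenate([drds, dTds, dNds, dBds])

# Start at the origin, tangent along x, normal along y, binormal along z
y0 = np.concatenate([np.zeros(3), [1, 0, 0], [0, 1, 0], [0, 0, 1]])
sol = solve_ivp(frenet, (0.0, L), y0, dense_output=True, rtol=1e-9, atol=1e-12)
end = sol.y[:3, -1]
print("end point of the transition curve:", np.round(end, 3))
```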

  6. Development of theoretical oxygen saturation calibration curve based on optical density ratio and optical simulation approach

    Science.gov (United States)

    Jumadi, Nur Anida; Beng, Gan Kok; Ali, Mohd Alauddin Mohd; Zahedi, Edmond; Morsin, Marlia

    2017-09-01

    The implementation of a surface-based Monte Carlo simulation technique for oxygen saturation (SaO2) calibration curve estimation is demonstrated in this paper. Generally, the calibration curve is estimated either from empirical studies using animals as experimental subjects or derived from mathematical equations. However, the determination of the calibration curve using animals is time-consuming and requires expertise to conduct the experiment. Alternatively, optical simulation techniques have been used widely in the biomedical optics field due to their capability to reproduce real tissue behavior. The mathematical relationship between optical density (OD) and optical density ratio (ODR) associated with SaO2 during systole and diastole is used as the basis for obtaining the theoretical calibration curve. The optical properties corresponding to systolic and diastolic behavior were applied to the tissue model to mimic the optical properties of the tissues. Based on the absorbed ray flux at the detectors, the OD and ODR were successfully calculated. The simulated optical density ratios at every 20% interval of SaO2 are presented, with a maximum error of 2.17% when compared with a previous numerical simulation technique (MC model).

  7. Influence of pavement condition on horizontal curve safety.

    Science.gov (United States)

    Buddhavarapu, Prasad; Banerjee, Ambarish; Prozzi, Jorge A

    2013-03-01

    Crash statistics suggest that horizontal curves are the most vulnerable sites for crash occurrence. These crashes are often severe and many involve at least some level of injury due to the nature of the collisions. Ensuring the desired pavement surface condition is one potentially effective strategy to reduce the occurrence of severe accidents on horizontal curves. This study sought to develop crash injury severity models by integrating crash and pavement surface condition databases. It focuses on developing a causal relationship between pavement condition indices and severity level of crashes occurring on two-lane horizontal curves in Texas. In addition, it examines the suitability of the existing Skid Index for safety maintenance of two-lane curves. Significant correlation is evident between pavement condition and crash injury severity on two-lane undivided horizontal curves in Texas. Probability of a crash becoming fatal is appreciably sensitive to certain pavement indices. Data suggested that road facilities providing a smoother and more comfortable ride are vulnerable to severe crashes on horizontal curves. In addition, the study found that longitudinal skid measurement barely correlates with injury severity of crashes occurring on curved portions. The study recommends exploring the option of incorporating lateral friction measurement into Pavement Management System (PMS) databases specifically at curved road segments. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. Fabricating small-scale, curved, polymeric structures with convex and concave menisci through interfacial free energy equilibrium.

    Science.gov (United States)

    Cheng, Chao-Min; Matsuura, Koji; Wang, I-Jan; Kuroda, Yuka; LeDuc, Philip R; Naruse, Keiji

    2009-11-21

    Polymeric curved structures are widely used in imaging systems including optical fibers and microfluidic channels. Here, we demonstrate that small-scale, poly(dimethylsiloxane) (PDMS)-based, curved structures can be fabricated through controlling interfacial free energy equilibrium. Resultant structures have a smooth, symmetric, curved surface, and may be convex or concave in form based on surface tension balance. Their curvatures are controlled by surface characteristics (i.e., hydrophobicity and hydrophilicity) of the molds and semi-liquid PDMS. In addition, these structures are shown to be biocompatible for cell culture. Our system provides a simple, efficient and economical method for generating integrateable optical components without costly fabrication facilities.

  9. Consistent Valuation across Curves Using Pricing Kernels

    Directory of Open Access Journals (Sweden)

    Andrea Macrina

    2018-03-01

    The general problem of asset pricing when the discount rate differs from the rate at which an asset’s cash flows accrue is considered. A pricing kernel framework is used to model an economy that is segmented into distinct markets, each identified by a yield curve having its own market, credit and liquidity risk characteristics. The proposed framework precludes arbitrage within each market, while the definition of a curve-conversion factor process links all markets in a consistent arbitrage-free manner. A pricing formula is then derived, referred to as the across-curve pricing formula, which enables consistent valuation and hedging of financial instruments across curves (and markets. As a natural application, a consistent multi-curve framework is formulated for emerging and developed inter-bank swap markets, which highlights an important dual feature of the curve-conversion factor process. Given this multi-curve framework, existing multi-curve approaches based on HJM and rational pricing kernel models are recovered, reviewed and generalised and single-curve models extended. In another application, inflation-linked, currency-based and fixed-income hybrid securities are shown to be consistently valued using the across-curve valuation method.

  10. Development of probabilistic fatigue curve for asphalt concrete based on viscoelastic continuum damage mechanics

    Directory of Open Access Journals (Sweden)

    Himanshu Sharma

    2016-07-01

    Due to its roots in a fundamental thermodynamic framework, the continuum damage approach is popular for modeling asphalt concrete behavior. Currently used continuum damage models use mixture-averaged values for model parameters and assume a deterministic damage process. On the other hand, significant scatter is found in fatigue data generated even under extremely controlled laboratory testing conditions. Thus, currently used continuum damage models fail to account for the scatter observed in fatigue data. This paper illustrates a novel approach for probabilistic fatigue life prediction based on the viscoelastic continuum damage approach. Several specimens were tested for their viscoelastic properties and damage properties under a uniaxial mode of loading. The data thus generated were analyzed using viscoelastic continuum damage mechanics principles to predict fatigue life. Weibull (2-parameter and 3-parameter) and lognormal distributions were fitted to the fatigue life predicted using the viscoelastic continuum damage approach. It was observed that fatigue damage could be best described using the Weibull distribution when compared to the lognormal distribution. Due to its flexibility, the 3-parameter Weibull distribution was found to fit better than the 2-parameter Weibull distribution. Further, significant differences were found between the probabilistic fatigue curves developed in this research and the traditional deterministic fatigue curve. The proposed methodology combines the advantages of continuum damage mechanics as well as probabilistic approaches. These probabilistic fatigue curves can be conveniently used for reliability-based pavement design. Keywords: Probabilistic fatigue curve, Continuum damage mechanics, Weibull distribution, Lognormal distribution
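    The distribution-fitting step described above can be sketched with scipy; the fatigue lives below are made up, and only the generic fitting and comparison mechanics are shown, not the viscoelastic continuum damage analysis itself.

```python
import numpy as np
from scipy import stats

# Hypothetical fatigue lives (cycles to failure) predicted by a continuum damage analysis.
fatigue_life = np.array([1.2e5, 1.8e5, 2.1e5, 2.6e5, 3.0e5, 3.9e5, 4.4e5, 5.7e5])

# 2-parameter Weibull: location fixed at zero; 3-parameter: location also fitted.
shape2, _, scale2 = stats.weibull_min.fit(fatigue_life, floc=0)
shape3, loc3, scale3 = stats.weibull_min.fit(fatigue_life)
s, _, scale_ln = stats.lognorm.fit(fatigue_life, floc=0)

# Compare goodness of fit via the log-likelihood (higher is better).
loglik = lambda dist, params: np.sum(dist.logpdf(fatigue_life, *params))
print("Weibull-2p:", loglik(stats.weibull_min, (shape2, 0, scale2)))
print("Weibull-3p:", loglik(stats.weibull_min, (shape3, loc3, scale3)))
print("Lognormal :", loglik(stats.lognorm, (s, 0, scale_ln)))

# A probabilistic fatigue life at a chosen reliability level, e.g. 95 % survival.
n95 = stats.weibull_min.ppf(0.05, shape3, loc3, scale3)
print(f"Fatigue life at 95 % reliability = {n95:.3g} cycles")
```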

  11. A User Authentication Scheme Based on Elliptic Curves Cryptography for Wireless Ad Hoc Networks.

    Science.gov (United States)

    Chen, Huifang; Ge, Linlin; Xie, Lei

    2015-07-14

    The feature of non-infrastructure support in a wireless ad hoc network (WANET) makes it suffer from various attacks. Moreover, user authentication is the first safety barrier in a network. A mutual trust is achieved by a protocol which enables the communicating parties to authenticate each other and to exchange session keys at the same time. For the resource-constrained WANET, an efficient and lightweight user authentication scheme is necessary. In this paper, we propose a user authentication scheme based on the self-certified public key system and elliptic curve cryptography for a WANET. Using the proposed scheme, efficient two-way user authentication and secure session key agreement can be achieved. Security analysis shows that our proposed scheme is resilient to commonly known attacks. In addition, the performance analysis shows that our proposed scheme performs similarly to or better than some existing user authentication schemes.
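    The scheme itself (self-certified public keys over a WANET) is not reproduced here; the following is a minimal sketch of the elliptic-curve session key agreement step such schemes build on, using the Python cryptography package and an assumed NIST P-256 curve. In the actual protocol the exchanged public keys would additionally be authenticated via the self-certified key material, which this sketch omits.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates an ephemeral elliptic-curve key pair (curve choice is an assumption).
alice_priv = ec.generate_private_key(ec.SECP256R1())
bob_priv = ec.generate_private_key(ec.SECP256R1())

# Each side combines its own private key with the peer's public key (ECDH).
alice_shared = alice_priv.exchange(ec.ECDH(), bob_priv.public_key())
bob_shared = bob_priv.exchange(ec.ECDH(), alice_priv.public_key())
assert alice_shared == bob_shared

# Derive a symmetric session key from the shared secret.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"wanet-session").derive(alice_shared)
print(session_key.hex())
```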

  12. Lagrangian Curves on Spectral Curves of Monopoles

    International Nuclear Information System (INIS)

    Guilfoyle, Brendan; Khalid, Madeeha; Ramon Mari, Jose J.

    2010-01-01

    We study Lagrangian points on smooth holomorphic curves in TP^1 equipped with a natural neutral Kaehler structure, and prove that they must form real curves. By virtue of the identification of TP^1 with the space LE^3 of oriented affine lines in Euclidean 3-space, these Lagrangian curves give rise to ruled surfaces in E^3, which we prove have zero Gauss curvature. Each ruled surface is shown to be the tangent lines to a curve in E^3, called the edge of regression of the ruled surface. We give an alternative characterization of these curves as the points in E^3 where the number of oriented lines in the complex curve Σ that pass through the point is less than the degree of Σ. We then apply these results to the spectral curves of certain monopoles and construct the ruled surfaces and edges of regression generated by the Lagrangian curves.

  13. Greater Activity in the Frontal Cortex on Left Curves: A Vector-Based fNIRS Study of Left and Right Curve Driving.

    Directory of Open Access Journals (Sweden)

    Noriyuki Oka

    In the brain, the mechanisms of attention to the left and the right are known to be different. It is possible that brain activity when driving also differs with different horizontal road alignments (left or right curves), but little is known about this. We found driver brain activity to be different when driving on left and right curves, in an experiment using a large-scale driving simulator and functional near-infrared spectroscopy (fNIRS). The participants were fifteen healthy adults. We created a course simulating an expressway, comprising straight line driving and gentle left and right curves, and monitored the participants under driving conditions, in which they drove at a constant speed of 100 km/h, and under non-driving conditions, in which they simply watched the screen (visual task). Changes in hemoglobin concentrations were monitored at 48 channels including the prefrontal cortex, the premotor cortex, the primary motor cortex and the parietal cortex. From orthogonal vectors of changes in deoxyhemoglobin and changes in oxyhemoglobin, we calculated changes in cerebral oxygen exchange, reflecting neural activity, and statistically compared the resulting values from the right and left curve sections. Under driving conditions, there were no sites where cerebral oxygen exchange increased significantly more during right curves than during left curves (p > 0.05), but cerebral oxygen exchange increased significantly more during left curves (p < 0.05) in the right premotor cortex, the right frontal eye field and the bilateral prefrontal cortex. Under non-driving conditions, increases were significantly greater during left curves (p < 0.05) only in the right frontal eye field. Left curve driving was thus found to require more brain activity at multiple sites, suggesting that left curve driving may require more visual attention than right curve driving. The right frontal eye field was activated under both driving and non-driving conditions.
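    The exact vector construction used in the study is not given in the abstract; the sketch below assumes one common convention in which cerebral oxygen exchange (COE) is the component of the (oxyHb change, deoxyHb change) vector along deoxygenation, COE = (d_deoxy - d_oxy)/sqrt(2), and then compares left- and right-curve sections channel-wise with a paired test. All data are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects, n_channels = 15, 48

# Hypothetical per-subject hemoglobin changes for left- and right-curve sections.
d_oxy = {"left": rng.normal(0.4, 0.2, (n_subjects, n_channels)),
         "right": rng.normal(0.3, 0.2, (n_subjects, n_channels))}
d_deoxy = {"left": rng.normal(-0.1, 0.1, (n_subjects, n_channels)),
           "right": rng.normal(-0.05, 0.1, (n_subjects, n_channels))}

# Assumed orthogonal-vector definition of cerebral oxygen exchange.
coe = {side: (d_deoxy[side] - d_oxy[side]) / np.sqrt(2) for side in ("left", "right")}

# Channel-wise paired comparison of left- vs right-curve COE across subjects.
t_stat, p_val = stats.ttest_rel(coe["left"], coe["right"], axis=0)
print("channels with p < 0.05:", np.flatnonzero(p_val < 0.05))
```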

  14. Greater Activity in the Frontal Cortex on Left Curves: A Vector-Based fNIRS Study of Left and Right Curve Driving

    Science.gov (United States)

    Oka, Noriyuki; Yoshino, Kayoko; Yamamoto, Kouji; Takahashi, Hideki; Li, Shuguang; Sugimachi, Toshiyuki; Nakano, Kimihiko; Suda, Yoshihiro; Kato, Toshinori

    2015-01-01

    Objectives In the brain, the mechanisms of attention to the left and the right are known to be different. It is possible that brain activity when driving also differs with different horizontal road alignments (left or right curves), but little is known about this. We found driver brain activity to be different when driving on left and right curves, in an experiment using a large-scale driving simulator and functional near-infrared spectroscopy (fNIRS). Research Design and Methods The participants were fifteen healthy adults. We created a course simulating an expressway, comprising straight line driving and gentle left and right curves, and monitored the participants under driving conditions, in which they drove at a constant speed of 100 km/h, and under non-driving conditions, in which they simply watched the screen (visual task). Changes in hemoglobin concentrations were monitored at 48 channels including the prefrontal cortex, the premotor cortex, the primary motor cortex and the parietal cortex. From orthogonal vectors of changes in deoxyhemoglobin and changes in oxyhemoglobin, we calculated changes in cerebral oxygen exchange, reflecting neural activity, and statistically compared the resulting values from the right and left curve sections. Results Under driving conditions, there were no sites where cerebral oxygen exchange increased significantly more during right curves than during left curves (p > 0.05), but cerebral oxygen exchange increased significantly more during left curves (p < 0.05) in the right premotor cortex, the right frontal eye field and the bilateral prefrontal cortex. Under non-driving conditions, increases were significantly greater during left curves (p < 0.05) only in the right frontal eye field. Conclusions Left curve driving was thus found to require more brain activity at multiple sites, suggesting that left curve driving may require more visual attention than right curve driving. The right frontal eye field was activated under both driving and non-driving conditions.

  15. Hyper-and-elliptic-curve cryptography

    NARCIS (Netherlands)

    Bernstein, D.J.; Lange, T.

    2014-01-01

    This paper introduces ‘hyper-and-elliptic-curve cryptography’, in which a single high-security group supports fast genus-2-hyperelliptic-curve formulas for variable-base-point single-scalar multiplication (for example, Diffie–Hellman shared-secret computation) and at the same time supports fast

  16. Displacement sensing based on resonant frequency monitoring of electrostatically actuated curved micro beams

    International Nuclear Information System (INIS)

    Krakover, Naftaly; Krylov, Slava; Ilic, B Robert

    2016-01-01

    The ability to control nonlinear interactions of suspended mechanical structures offers a unique opportunity to engineer rich dynamical behavior that extends the dynamic range and ultimate device sensitivity. We demonstrate a displacement sensing technique based on resonant frequency monitoring of curved, doubly clamped, bistable micromechanical beams interacting with a movable electrode. In this configuration, the electrode displacement influences the nonlinear electrostatic interactions, effective stiffness and frequency of the curved beam. Increased sensitivity is made possible by dynamically operating the beam near the snap-through bistability onset. Various in-plane device architectures were fabricated from single crystal silicon and measured under ambient conditions using laser Doppler vibrometry. In agreement with the reduced order Galerkin-based model predictions, our experimental results show a significant resonant frequency reduction near critical snap-through, followed by a frequency increase within the post-buckling configuration. Interactions with a stationary electrode yield a voltage sensitivity up to ≈560 Hz V^-1 and results with a movable electrode allow motion sensitivity up to ≈1.5 Hz nm^-1. Our theoretical and experimental results collectively reveal the potential of displacement sensing using nonlinear interactions of geometrically curved beams near instabilities, with possible applications ranging from highly sensitive resonant inertial detectors to complex optomechanical platforms providing an interface between the classical and quantum domains. (paper)

  17. An explanation for the shape of nanoindentation unloading curves based on finite element simulation

    International Nuclear Information System (INIS)

    Bolshakov, A.; Pharr, G.M.

    1995-01-01

    Current methods for measuring hardness and modulus from nanoindentation load-displacement data are based on Sneddon's equations for the indentation of an elastic half-space by an axially symmetric rigid punch. Recent experiments have shown that nanoindentation unloading data are distinctly curved in a manner which is not consistent with either the flat punch or the conical indenter geometries frequently used in modeling, but are more closely approximated by a parabola of revolution. Finite element simulations for conical indentation of an elastic-plastic material are presented which corroborate the experimental observations, and from which a simple explanation for the shape of the unloading curve is derived. The explanation is based on the concept of an effective indenter shape whose geometry is determined by the shape of the plastic hardness impression formed during indentation
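    As a companion to the discussion of unloading-curve shape, here is a minimal sketch (with made-up data) of fitting nanoindentation unloading data to the power-law form P = alpha*(h - h_f)^m used in Oliver-Pharr-type analyses; a flat punch corresponds to m = 1, a cone to m = 2, and a parabola of revolution to m = 1.5. The numbers below are illustrative only.

```python
import numpy as np
from scipy.optimize import curve_fit

def unloading(h, alpha, h_f, m):
    """Power-law unloading curve P = alpha * (h - h_f)**m."""
    return alpha * np.clip(h - h_f, 0.0, None) ** m

# Hypothetical unloading data: depth h (nm) and load P (mN), with a little noise.
h = np.linspace(400, 1000, 25)
p = unloading(h, 2.0e-4, 350.0, 1.5)
p = p * (1 + 0.01 * np.random.default_rng(1).normal(size=h.size))

(alpha, h_f, m), _ = curve_fit(unloading, h, p, p0=(1e-4, 300.0, 1.4))
print(f"fitted exponent m = {m:.2f} (flat punch: 1, cone: 2, paraboloid: 1.5)")

# Contact stiffness S = dP/dh at maximum depth, the quantity used to extract modulus.
h_max = h[-1]
S = alpha * m * (h_max - h_f) ** (m - 1)
print(f"S = {S:.3f} mN/nm")
```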

  18. Curve aligning approach for gait authentication based on a wearable accelerometer

    International Nuclear Information System (INIS)

    Sun, Hu; Yuao, Tao

    2012-01-01

    Gait authentication based on a wearable accelerometer is a novel biometric which can be used for identity identification, medical rehabilitation and early detection of neurological disorders. The method used for matching gait patterns weighs heavily on authentication performance. In this paper, curve aligning is introduced as a new method for matching gait patterns, and it is compared with correlation and dynamic time warping (DTW). A support vector machine (SVM) is proposed to fuse the pattern-matching methods at the decision level. Accelerations collected from the ankles of 22 walking subjects are processed for authentication in our experiments. The fusion of curve aligning with backward–forward accelerations and DTW with vertical accelerations improves authentication performance substantially and consistently. This fusion algorithm is tested repeatedly. The mean and standard deviation of its equal error rate are 0.794% and 0.696%, respectively, whereas among all presented non-fusion algorithms the best one shows an EER of 3.03%. (paper)
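    As one of the pattern-matching methods compared above, DTW aligns two acceleration sequences recorded at different cadences; a minimal sketch with synthetic signals and an arbitrary acceptance threshold is given below (curve aligning itself and the SVM fusion are not reproduced).

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D acceleration sequences."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

# Hypothetical vertical-acceleration gait cycles: enrollment template vs probe.
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 2 * np.pi, 100)) + 0.05 * rng.normal(size=100)
probe = np.sin(np.linspace(0, 2 * np.pi, 90))  # same gait, slightly different cadence

score = dtw_distance(template, probe)
print("accept" if score < 5.0 else "reject", f"(DTW distance = {score:.2f})")
```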

  19. Inverse Diffusion Curves Using Shape Optimization.

    Science.gov (United States)

    Zhao, Shuang; Durand, Fredo; Zheng, Changxi

    2018-07-01

    The inverse diffusion curve problem focuses on automatic creation of diffusion curve images that resemble user provided color fields. This problem is challenging since the 1D curves have a nonlinear and global impact on resulting color fields via a partial differential equation (PDE). We introduce a new approach complementary to previous methods by optimizing curve geometry. In particular, we propose a novel iterative algorithm based on the theory of shape derivatives. The resulting diffusion curves are clean and well-shaped, and the final image closely approximates the input. Our method provides a user-controlled parameter to regularize curve complexity, and generalizes to handle input color fields represented in a variety of formats.

  20. A residual life prediction model based on the generalized σ-N curved surface

    Directory of Open Access Journals (Sweden)

    Zongwen AN

    2016-06-01

    In order to investigate the change rule of the residual life of a structure under random repeated load, firstly, starting from the statistical meaning of random repeated load, the joint probability density function of the maximum stress and minimum stress is derived based on the characteristics of order statistics (the maximum and minimum order statistics); then, based on the equation of the generalized σ-N curved surface and considering the influence of the number of load cycles on fatigue life, a relationship among minimum stress, maximum stress and residual life, that is, the σmin(n)-σmax(n)-Nr(n) curved surface model, is established; finally, the validity of the proposed model is demonstrated by a practical case. The result shows that the proposed model can reflect the influence of maximum stress and minimum stress on the residual life of a structure under random repeated load, which can provide a theoretical basis for life prediction and reliability assessment of structures.

  1. Stage discharge curve for Guillemard Bridge streamflow station based on rating curve method using historical flood event data

    International Nuclear Information System (INIS)

    Ros, F C; Sidek, L M; Desa, M N; Arifin, K; Tosaka, H

    2013-01-01

    Stage-discharge curves serve many purposes, from water quality and flood modelling studies to the projection of climate change scenarios. As the river bed often changes due to the annual monsoon seasons, which sometimes bring massive floods, the capacity of the river changes, causing shifting control to occur. This study proposes to use historical flood event data from 1960 to 2009 to calculate the stage-discharge curve of Guillemard Bridge, located on Sg. Kelantan. Regression analysis was done to check the quality of the data and examine the correlation between the two variables, Q and H. The mean values of the two variables were then used to find 'a', the difference between the zero gauge height and the level of zero flow, together with K and 'n', to fit the rating curve equation and finally plot the stage-discharge rating curve. Regression analysis of the historical flood data indicates that 91 percent of the original uncertainty has been explained by the analysis, with a standard error of 0.085.
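    A minimal sketch of the rating-curve fit described above, assuming the usual power-law form Q = K(H - a)^n and made-up stage/discharge pairs; the actual Guillemard Bridge values are not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

def rating_curve(h, k, a, n):
    """Stage-discharge rating curve Q = K * (H - a)**n."""
    return k * np.clip(h - a, 0.0, None) ** n

# Hypothetical flood-event observations: stage H (m) and discharge Q (m^3/s).
stage = np.array([2.1, 3.4, 4.8, 6.0, 7.5, 9.2, 11.0])
discharge = np.array([150., 420., 900., 1500., 2400., 3800., 5600.])

(k, a, n), _ = curve_fit(rating_curve, stage, discharge, p0=(50.0, 1.0, 2.0))
print(f"Q = {k:.1f} * (H - {a:.2f})^{n:.2f}")

# Coefficient of determination, analogous to the regression check in the study.
resid = discharge - rating_curve(stage, k, a, n)
r2 = 1 - np.sum(resid**2) / np.sum((discharge - discharge.mean())**2)
print(f"R^2 = {r2:.3f}")
```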

  2. Deposition Time and Thermal Cycles of Fabricating Thin-wall Steel Parts by Double Electrode GMAW Based Additive Manufacturing

    Directory of Open Access Journals (Sweden)

    Yang Dongqing

    2017-01-01

    The deposition time for fabricating a thin-wall part, as well as the peak temperature of the substrate during the process, was analyzed for double electrode gas metal arc welding (DE-GMAW) based additive manufacturing (AM). The total deposition time and the interlayer idle time of the manufacturing process decreased with increasing bypass current under the same interlayer temperature and the same deposition rate. The thermal cycling curves illustrated that the peak temperature of the substrate was lower in the DE-GMAW based AM under the same conditions. When depositing thin-wall parts, the DE-GMAW based AM can reduce the heat input to the substrate and improve the fabrication efficiency compared with the GMAW based AM.

  3. Implementation of the Master Curve method in ProSACC

    Energy Technology Data Exchange (ETDEWEB)

    Feilitzen, Carl von; Sattari-Far, Iradj [Inspecta Technology AB, Stockholm (Sweden)

    2012-03-15

    Cleavage fracture toughness data normally display a large amount of statistical scatter in the transition region. The cleavage toughness data in this region are specimen size-dependent and should be treated statistically rather than deterministically. The Master Curve methodology is a procedure for mechanical testing and statistical analysis of the fracture toughness of ferritic steels in the transition region. The methodology accounts for the temperature and size dependence of fracture toughness. Using the Master Curve methodology for evaluating the fracture toughness in the transition region removes the overconservatism that has been observed when using the ASME-KIC curve. One main advantage of the Master Curve methodology is the possibility of using small Charpy-size specimens to determine fracture toughness. A detailed description of the Master Curve methodology is given by Sattari-Far and Wallin [2005]. ProSACC is a suitable program for structural integrity assessments of components containing crack-like defects and for defect tolerance analysis. The program allows assessments to be conducted on deterministic or probabilistic grounds. The method utilized in ProSACC is based on the R6 method developed at Nuclear Electric plc, Milne et al [1988]. The basic assumption in this method is that fracture in a cracked body can be described by two parameters, Kr and Lr. The parameter Kr is the ratio between the stress intensity factor and the fracture toughness of the material. The parameter Lr is the ratio between the applied load and the plastic limit load of the structure. The ProSACC assessment results are therefore highly dependent on the fracture toughness value applied in the assessment. In this work, the main options of the Master Curve methodology are implemented in the ProSACC program. Different options for evaluating the Master Curve fracture toughness from standard fracture toughness testing data or impact testing data are considered. In addition, the

  4. Implementation of the Master Curve method in ProSACC

    International Nuclear Information System (INIS)

    Feilitzen, Carl von; Sattari-Far, Iradj

    2012-03-01

    Cleavage fracture toughness data normally display a large amount of statistical scatter in the transition region. The cleavage toughness data in this region are specimen size-dependent and should be treated statistically rather than deterministically. The Master Curve methodology is a procedure for mechanical testing and statistical analysis of the fracture toughness of ferritic steels in the transition region. The methodology accounts for the temperature and size dependence of fracture toughness. Using the Master Curve methodology for evaluating the fracture toughness in the transition region removes the overconservatism that has been observed when using the ASME-KIC curve. One main advantage of the Master Curve methodology is the possibility of using small Charpy-size specimens to determine fracture toughness. A detailed description of the Master Curve methodology is given by Sattari-Far and Wallin [2005]. ProSACC is a suitable program for structural integrity assessments of components containing crack-like defects and for defect tolerance analysis. The program allows assessments to be conducted on deterministic or probabilistic grounds. The method utilized in ProSACC is based on the R6 method developed at Nuclear Electric plc, Milne et al [1988]. The basic assumption in this method is that fracture in a cracked body can be described by two parameters, Kr and Lr. The parameter Kr is the ratio between the stress intensity factor and the fracture toughness of the material. The parameter Lr is the ratio between the applied load and the plastic limit load of the structure. The ProSACC assessment results are therefore highly dependent on the fracture toughness value applied in the assessment. In this work, the main options of the Master Curve methodology are implemented in the ProSACC program. Different options for evaluating the Master Curve fracture toughness from standard fracture toughness testing data or impact testing data are considered. In addition, the
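    The Master Curve itself is not spelled out in the two records above; the sketch below assumes the standard ASTM E1921 form of the median 1T fracture toughness, K_Jc(med) = 30 + 70*exp[0.019(T - T0)] in MPa*sqrt(m), with an illustrative reference temperature T0.

```python
import numpy as np

def master_curve_median(temperature_c, t0_c):
    """Median 1T fracture toughness (MPa*sqrt(m)) per the ASTM E1921 Master Curve form."""
    return 30.0 + 70.0 * np.exp(0.019 * (temperature_c - t0_c))

t0 = -60.0                      # illustrative reference temperature (deg C)
for t in np.arange(-150, 51, 25):
    print(f"T = {t:6.1f} C  ->  K_Jc(med) = {master_curve_median(t, t0):6.1f} MPa*sqrt(m)")
```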

  5. Nonparametric estimation of age-specific reference percentile curves with radial smoothing.

    Science.gov (United States)

    Wan, Xiaohai; Qu, Yongming; Huang, Yao; Zhang, Xiao; Song, Hanping; Jiang, Honghua

    2012-01-01

    Reference percentile curves represent the covariate-dependent distribution of a quantitative measurement and are often used to summarize and monitor dynamic processes such as human growth. We propose a new nonparametric method based on a radial smoothing (RS) technique to estimate age-specific reference percentile curves assuming the underlying distribution is relatively close to normal. We compared the RS method with both the LMS and the generalized additive models for location, scale and shape (GAMLSS) methods using simulated data and found that our method has smaller estimation error than the two existing methods. We also applied the new method to analyze height growth data from children being followed in a clinical observational study of growth hormone treatment, and compared the growth curves between those with growth disorders and the general population. Copyright © 2011 Elsevier Inc. All rights reserved.

  6. Developing Novel Reservoir Rule Curves Using Seasonal Inflow Projections

    Science.gov (United States)

    Tseng, Hsin-yi; Tung, Ching-pin

    2015-04-01

    Due to significant seasonal rainfall variations, reservoirs and their flexible operational rules are indispensable to Taiwan. Furthermore, with the intensifying impacts of climate change on extreme climate, the frequency of droughts in Taiwan has been increasing in recent years. Drought is a creeping phenomenon; its slow onset makes it difficult to detect at an early stage and causes delays in making the best decisions on allocating water. For these reasons, novel reservoir rule curves using projected seasonal streamflow are proposed in this study, which can potentially reduce the adverse effects of drought. This study is dedicated to establishing new rule curves which consider both currently available storage and anticipated monthly inflows with a lead time of two months to reduce the risk of water shortage. The monthly inflows are projected based on the seasonal climate forecasts from the Central Weather Bureau (CWB), in which a weather generation model is used to produce daily weather data for the hydrological component of the GWLF model. To incorporate future monthly inflow projections into the rule curves, this study designs a decision flow index which is a linear combination of currently available storage and inflow projections with a lead time of two months. By optimizing the linear coefficients of the decision flow index, the shape of the rule curves and the percentage of water supplied in each zone, the best rule curves to decrease water shortage risk and impacts can be developed. The Shimen Reservoir in northern Taiwan is used as a case study to demonstrate the proposed method. Existing rule curves (M5 curves) of the Shimen Reservoir are compared with two cases of new rule curves, including hindcast simulations and historic seasonal forecasts. The results show that the new rule curves can decrease the total water shortage ratio and, in addition, can allocate the shortage amount to preceding months to avoid extreme shortage events. Even though some uncertainties in
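    A minimal sketch of the decision flow index idea described above: a linear combination of current storage and two months of projected inflow is mapped to an operating zone. The weights, thresholds and supply fractions below are made-up placeholders, not values from the study.

```python
def decision_flow_index(storage, inflow_next, inflow_after, weights=(1.0, 0.8, 0.5)):
    """Linear combination of current storage and two months of projected inflow (weights assumed)."""
    w0, w1, w2 = weights
    return w0 * storage + w1 * inflow_next + w2 * inflow_after

# Hypothetical operating zones: (index threshold, fraction of planned supply released).
zones = [(300.0, 1.00), (200.0, 0.85), (100.0, 0.70), (0.0, 0.50)]

def supply_fraction(index):
    for threshold, fraction in zones:
        if index >= threshold:
            return fraction
    return zones[-1][1]

dfi = decision_flow_index(storage=120.0, inflow_next=60.0, inflow_after=40.0)  # million m^3
print(f"index = {dfi:.0f}, release {supply_fraction(dfi):.0%} of planned supply")
```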

  7. A Literature-Based Analysis of the Learning Curves of Laparoscopic Radical Prostatectomy

    Directory of Open Access Journals (Sweden)

    Daniel W. Good

    2014-05-01

    There is a trend for the increased adoption of minimally invasive techniques of radical prostatectomy (RP) – laparoscopic (LRP) and robotic assisted (RARP) – from the traditional open radical retropubic prostatectomy (ORP), popularised by Partin et al. Recently there has been a dramatic expansion in the rates of RARP being performed, and there have been many early reports postulating that the learning curve for RARP is shorter than for LRP. The aim of this study was to review the literature and analyse the length of the LRP learning curves for the various outcome measures: perioperative, oncologic, and functional outcomes. A broad search of the literature was performed in November 2013 using the PubMed database. Only studies of real patients and those from 2004 until 2013 were included; those on simulators were excluded. In total, 239 studies were identified, of which 13 were included. The learning curve is a heterogeneous entity, depending entirely on the criteria used to define it. There is evidence of multiple learning curves; however, the length of these depends on the definitions used by the authors. Few studies use the more rigorous definition of plateauing of the curve. The perioperative learning curve takes approximately 150-200 cases to plateau, the oncologic curve approximately 200 cases, and the functional learning curve up to 700 cases (700 for potency, 200 cases for continence). In this review, we have analysed the literature with respect to the learning curve for LRP. It is clear that the learning curve is long. This necessitates centralising LRP to high-volume centres such that surgeons, trainees, and patients are able to utilise the benefits of LRP.

  8. Section curve reconstruction and mean-camber curve extraction of a point-sampled blade surface.

    Directory of Open Access Journals (Sweden)

    Wen-long Li

    The blade is one of the most critical parts of an aviation engine, and a small change in the blade geometry may significantly affect the dynamic performance of the aviation engine. Rapid advancements in 3D scanning techniques have enabled the inspection of the blade shape using a dense and accurate point cloud. This paper proposes a new method for achieving two common tasks in blade inspection: section curve reconstruction and mean-camber curve extraction from the representation of a point cloud. Mathematical morphology is extended and applied to suppress the effect of measurement defects and to generate an ordered sequence of 2D measured points in the section plane. Then, the energy and distance are minimized to iteratively smooth the measured points, approximate the section curve and extract the mean-camber curve. In addition, a turbine blade is machined and scanned to observe the curvature variation, energy variation and approximation error, which demonstrates the effectiveness of the proposed method. The proposed method is simple to implement and can be applied in aviation casting-blade finish inspection, large forging-blade allowance inspection and vision-guided robot grinding localization.

  9. Experimental and simulated beam-foil decay curves for some transitions in Zn II

    International Nuclear Information System (INIS)

    Hultberg, S.; Liljeby, L.; Mannervik, S.; Veje, E.; Lindgaard, A.

    1980-01-01

    Experimental beam-foil decay curves for the 4s-4p, 4p-4d, 4d-4f, and the 4p-5s transitions in Zn II are compared to decay curves synthesized from transition probabilities calculated in the numerical Coulomb approximation and either measured initial level populations or population models. Good agreement exists between the experimental curves and those based on the measured initial level populations for the 5s, 4d, and 4f levels, while certain deviations are noted for the 4p term. None of the applied population models reproduces all experimental curves satisfactorily. In addition, lifetimes are determined experimentally for 7 terms in Zn II, and good agreement with the numerical Coulomb approximation lifetimes is generally found, except for some p terms. Beam-foil excitation-mechanism results for zinc are presented and compared to previous results from light projectiles. (Auth.)

  10. A residual life prediction model based on the generalized σ-N curved surface

    OpenAIRE

    Zongwen AN; Xuezong BAI; Jianxiong GAO

    2016-01-01

    In order to investigate the change rule of the residual life of a structure under random repeated load, firstly, starting from the statistical meaning of random repeated load, the joint probability density function of the maximum stress and minimum stress is derived based on the characteristics of order statistics (the maximum and minimum order statistics); then, based on the equation of the generalized σ-N curved surface, considering the influence of the number of load cycles on fatigue life, a relation...

  11. No evidence for an open vessel effect in centrifuge-based vulnerability curves of a long-vesselled liana (Vitis vinifera).

    Science.gov (United States)

    Jacobsen, Anna L; Pratt, R Brandon

    2012-06-01

    Vulnerability to cavitation curves are used to estimate xylem cavitation resistance and can be constructed using multiple techniques. It was recently suggested that a technique that relies on centrifugal force to generate negative xylem pressures may be susceptible to an open vessel artifact in long-vesselled species. Here, we used custom centrifuge rotors to measure different sample lengths of 1-yr-old stems of grapevine to examine the influence of open vessels on vulnerability curves, thus testing the hypothesized open vessel artifact. These curves were compared with a dehydration-based vulnerability curve. Although samples differed significantly in the number of open vessels, there was no difference in the vulnerability to cavitation measured on 0.14- and 0.271-m-long samples of Vitis vinifera. Dehydration and centrifuge-based curves showed a similar pattern of declining xylem-specific hydraulic conductivity (K(s)) with declining water potential. The percentage loss in hydraulic conductivity (PLC) differed between dehydration and centrifuge curves and it was determined that grapevine is susceptible to errors in estimating maximum K(s) during dehydration because of the development of vessel blockages. Our results from a long-vesselled liana do not support the open vessel artifact hypothesis. © 2012 The Authors. New Phytologist © 2012 New Phytologist Trust.

  12. A Practical Anodic and Cathodic Curve Intersection Model to Understand Multiple Corrosion Potentials of Fe-Based Glassy Alloys in OH- Contained Solutions.

    Science.gov (United States)

    Li, Y J; Wang, Y G; An, B; Xu, H; Liu, Y; Zhang, L C; Ma, H Y; Wang, W M

    2016-01-01

    A practical anodic and cathodic curve intersection model, consisting of an apparent anodic curve and an imaginary cathodic line, was proposed to explain the multiple corrosion potentials occurring in potentiodynamic polarization curves of Fe-based glassy alloys in alkaline solution. The apparent anodic curve was selected from the measured anodic curves. The imaginary cathodic line was obtained by linearly fitting the differences of the anodic curves and can be translated or rotated to predict the number and value of the corrosion potentials.

  13. Global experience curves for wind farms

    International Nuclear Information System (INIS)

    Junginger, M.; Faaij, A.; Turkenburg, W.C.

    2005-01-01

    In order to forecast the technological development and cost of wind turbines and the production costs of wind electricity, frequent use is made of the so-called experience curve concept. Experience curves of wind turbines are generally based on data describing the development of national markets, which causes a number of problems when they are applied to global assessments. To analyze global wind energy price development more adequately, we compose a global experience curve. First, the underlying factors for past and potential future price reductions of wind turbines are analyzed. Possible implications and pitfalls of applying the experience curve methodology are also assessed. Second, we present and discuss a new approach to establishing a global experience curve and thus a global progress ratio for the investment cost of wind farms. Results show that global progress ratios for wind farms may lie between 77% and 85% (with an average of 81%), which is significantly more optimistic than the progress ratios applied in most current scenario studies and integrated assessment models. While the findings are based on a limited amount of data, they may indicate faster price reduction opportunities than assumed so far. With this global experience curve we aim to improve the reliability of describing the speed with which global costs of wind power may decline.
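    A short sketch of the experience-curve relationship behind the quoted progress ratios, assuming the usual form C(x) = C0 * x^b with x the cumulative installed capacity; the progress ratio PR = 2^b is the cost remaining after each doubling. The capacity and cost values below are illustrative, not the study's data.

```python
import numpy as np

# Hypothetical cumulative installed capacity (MW) and specific investment cost (EUR/kW).
capacity = np.array([100., 300., 1000., 3000., 10000., 30000.])
cost = np.array([2200., 1600., 1150., 830., 590., 430.])

# Fit log(cost) = log(C0) + b * log(capacity); b is negative for falling costs.
b, log_c0 = np.polyfit(np.log(capacity), np.log(cost), 1)
progress_ratio = 2.0 ** b          # cost multiplier per doubling of cumulative capacity
learning_rate = 1.0 - progress_ratio

print(f"progress ratio = {progress_ratio:.0%}, learning rate = {learning_rate:.0%}")
```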

  14. Visual navigation using edge curve matching for pinpoint planetary landing

    Science.gov (United States)

    Cui, Pingyuan; Gao, Xizhen; Zhu, Shengying; Shao, Wei

    2018-05-01

    Pinpoint landing is challenging for future Mars and asteroid exploration missions. Vision-based navigation schemes based on feature detection and matching are practical and can achieve the required precision. However, existing algorithms are computationally prohibitive and utilize poor-performance measurements, which pose great challenges for the application of visual navigation. This paper proposes an innovative visual navigation scheme using crater edge curves during the descent and landing phase. In the algorithm, the edge curves of the craters tracked from two sequential images are utilized to determine the relative attitude and position of the lander through a normalized method. Then, considering the error accumulation of relative navigation, a method is developed that integrates the crater-based relative navigation method with a crater-based absolute navigation method, which identifies craters using a georeferenced database for continuous estimation of the absolute states. In addition, expressions for the relative state estimate bias are derived. Novel necessary and sufficient observability criteria based on error analysis are provided to improve the navigation performance, which hold true for similar navigation systems. Simulation results demonstrate the effectiveness and high accuracy of the proposed navigation method.

  15. A simple transformation for converting CW-OSL curves to LM-OSL curves

    DEFF Research Database (Denmark)

    Bulur, E.

    2000-01-01

    A simple mathematical transformation is introduced to convert OSL decay curves obtained in the conventional way to those obtained using a linear modulation technique, based on a linear increase of the stimulation light intensity during the OSL measurement. The validity of the transformation was tested using the IR-stimulated luminescence curves from feldspars, recorded using both the conventional and the linear modulation techniques. The transformation was further applied to green-light-stimulated OSL from K and Na feldspars. (C) 2000 Elsevier Science Ltd. All rights reserved.
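    The transformation itself is not reproduced in the record above; the sketch below assumes the commonly cited pseudo-LM-OSL form, in which a CW-OSL curve I(t) recorded over a total time P is re-plotted against u = sqrt(2tP) with the intensity scaled by u/P. The decay constant and curve shape are illustrative.

```python
import numpy as np

def cw_to_pseudo_lm(t, intensity, total_time):
    """Re-plot a CW-OSL decay curve as a pseudo-LM-OSL peak (assumed transformation)."""
    u = np.sqrt(2.0 * t * total_time)          # transformed time axis
    return u, intensity * u / total_time        # transformed intensity

# Hypothetical first-order CW-OSL decay recorded for P = 100 s.
P = 100.0
t = np.linspace(0.01, P, 1000)
i_cw = 1000.0 * np.exp(-t / 20.0)

u, i_lm = cw_to_pseudo_lm(t, i_cw, P)
print(f"pseudo-LM-OSL peak at u = {u[np.argmax(i_lm)]:.1f} s")
```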

  16. An extension of the receiver operating characteristic curve and AUC-optimal classification.

    Science.gov (United States)

    Takenouchi, Takashi; Komori, Osamu; Eguchi, Shinto

    2012-10-01

    While most proposed methods for solving classification problems focus on minimization of the classification error rate, we are interested in the receiver operating characteristic (ROC) curve, which provides more information about classification performance than the error rate does. The area under the ROC curve (AUC) is a natural measure for overall assessment of a classifier based on the ROC curve. We discuss a class of concave functions for AUC maximization in which a boosting-type algorithm including RankBoost is considered, and the Bayesian risk consistency and the lower bound of the optimum function are discussed. A procedure derived by maximizing a specific optimum function has high robustness, based on gross error sensitivity. Additionally, we focus on the partial AUC, which is the partial area under the ROC curve. For example, in medical screening, a high true-positive rate at a fixed low false-positive rate is preferable, and thus the partial AUC corresponding to lower false-positive rates is much more important than the remaining AUC. We extend the class of concave optimum functions for partial AUC optimality with the boosting algorithm. We investigated the validity of the proposed method through several experiments with data sets in the UCI repository.
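    The boosting algorithm itself is not shown here; the sketch below simply illustrates, with synthetic scores, how the full AUC and a partial AUC restricted to low false-positive rates can be evaluated with scikit-learn (roc_auc_score with max_fpr returns the standardized partial AUC).

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic binary labels and classifier scores (positives shifted upwards).
y = np.concatenate([np.zeros(500, dtype=int), np.ones(500, dtype=int)])
scores = np.concatenate([rng.normal(0.0, 1.0, 500), rng.normal(1.2, 1.0, 500)])

full_auc = roc_auc_score(y, scores)
partial_auc = roc_auc_score(y, scores, max_fpr=0.1)   # emphasis on FPR <= 0.1

print(f"AUC = {full_auc:.3f}, standardized partial AUC (FPR <= 0.1) = {partial_auc:.3f}")
```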

  17. The South Carolina bridge-scour envelope curves

    Science.gov (United States)

    Benedict, Stephen T.; Feaster, Toby D.; Caldwell, Andral W.

    2016-09-30

    The U.S. Geological Survey, in cooperation with the South Carolina Department of Transportation, conducted a series of three field investigations to evaluate historical, riverine bridge scour in the Piedmont and Coastal Plain regions of South Carolina. These investigations included data collected at 231 riverine bridges, which led to the development of bridge-scour envelope curves for clear-water and live-bed components of scour. The application and limitations of the South Carolina bridge-scour envelope curves were documented in four reports, each report addressing selected components of bridge scour. The current investigation (2016) synthesizes the findings of these previous reports into a guidance manual providing an integrated procedure for applying the envelope curves. Additionally, the investigation provides limited verification for selected bridge-scour envelope curves by comparing them to field data collected outside of South Carolina from previously published sources. Although the bridge-scour envelope curves have limitations, they are useful supplementary tools for assessing the potential for scour at riverine bridges in South Carolina.

  18. Assessing neural activity related to decision-making through flexible odds ratio curves and their derivatives.

    Science.gov (United States)

    Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Pardo-Vazquez, Jose L; Leboran, Victor; Molenberghs, Geert; Faes, Christel; Acuña, Carlos

    2011-06-30

    It is well established that neural activity is stochastically modulated over time. Therefore, direct comparisons across experimental conditions and determination of change points or maximum firing rates are not straightforward. This study sought to compare temporal firing probability curves that may vary across groups defined by different experimental conditions. Odds-ratio (OR) curves were used as a measure of comparison, and the main goal was to provide a global test to detect significant differences between such curves through the study of their derivatives. An algorithm is proposed that enables ORs based on generalized additive models, including factor-by-curve-type interactions, to be flexibly estimated. Bootstrap methods were used to draw inferences from the derivative curves, and binning techniques were applied to speed up computation in the estimation and testing processes. A simulation study was conducted to assess the validity of these bootstrap-based tests. This methodology was applied to study premotor ventral cortex neural activity associated with decision-making. The proposed statistical procedures proved very useful in revealing the neural activity correlates of decision-making in a visual discrimination task. Copyright © 2011 John Wiley & Sons, Ltd.

  19. Vertex algebras and algebraic curves

    CERN Document Server

    Frenkel, Edward

    2004-01-01

    Vertex algebras are algebraic objects that encapsulate the concept of operator product expansion from two-dimensional conformal field theory. Vertex algebras are fast becoming ubiquitous in many areas of modern mathematics, with applications to representation theory, algebraic geometry, the theory of finite groups, modular functions, topology, integrable systems, and combinatorics. This book is an introduction to the theory of vertex algebras with a particular emphasis on the relationship with the geometry of algebraic curves. The notion of a vertex algebra is introduced in a coordinate-independent way, so that vertex operators become well defined on arbitrary smooth algebraic curves, possibly equipped with additional data, such as a vector bundle. Vertex algebras then appear as the algebraic objects encoding the geometric structure of various moduli spaces associated with algebraic curves. Therefore they may be used to give a geometric interpretation of various questions of representation theory. The book co...

  20. Learning curves and long-term outcome of simulation-based thoracentesis training for medical students

    Science.gov (United States)

    2011-01-01

    Background Simulation-based medical education has been widely used in medical skills training; however, the effectiveness and long-term outcome of simulation-based training in thoracentesis requires further investigation. The purpose of this study was to assess the learning curve of simulation-based thoracentesis training, study skills retention and transfer of knowledge to a clinical setting following simulation-based education intervention in thoracentesis procedures. Methods Fifty-two medical students were enrolled in this study. Each participant performed five supervised trials on the simulator. Participant's performance was assessed by performance score (PS), procedure time (PT), and participant's confidence (PC). Learning curves for each variable were generated. Long-term outcome of the training was measured by retesting and clinical performance evaluation 6 months and 1 year, respectively, after initial training on the simulator. Results Significant improvements in PS, PT, and PC were noted among the first 3 to 4 test trials (p < 0.05), with no significant further improvement thereafter (p > 0.05). Clinical competency in thoracentesis was improved in participants who received simulation training relative to that of first year medical residents without such experience (p < 0.05). Conclusions Simulation-based thoracentesis training can significantly improve an individual's performance. The saturation of learning from the simulator can be achieved after four practice sessions. Simulation-based training can assist in long-term retention of skills and can be partially transferred to clinical practice. PMID:21696584

  1. p-Curve and p-Hacking in Observational Research.

    Science.gov (United States)

    Bruns, Stephan B; Ioannidis, John P A

    2016-01-01

    The p-curve, the distribution of statistically significant p-values of published studies, has been used to make inferences about the proportion of true effects and about the presence of p-hacking in the published literature. We analyze the p-curve for observational research in the presence of p-hacking. We show by means of simulations that even with minimal omitted-variable bias (e.g., unaccounted confounding), p-curves based on true effects and p-curves based on null effects with p-hacking cannot be reliably distinguished. We also demonstrate this problem using as a practical example the evaluation of the effect of malaria prevalence on economic growth between 1960 and 1996. These findings call into question recent studies that use the p-curve to infer that most published research findings in the medical literature and in a wide range of disciplines are based on true effects. p-values in observational research may need to be empirically calibrated to be interpretable with respect to the commonly used significance threshold of 0.05. Violations of randomization in experimental studies may also result in situations where the use of p-curves is similarly unreliable.
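    A minimal simulation sketch of the p-curve idea: significant p-values are right-skewed under a true effect and roughly flat under the null. The sample sizes and effect size below are arbitrary, and no p-hacking procedure is modelled.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def significant_pvalues(effect, n_studies=20000, n_per_group=30):
    """p-values of two-sample t-tests that reached p < 0.05."""
    a = rng.normal(effect, 1.0, (n_studies, n_per_group))
    b = rng.normal(0.0, 1.0, (n_studies, n_per_group))
    p = stats.ttest_ind(a, b, axis=1).pvalue
    return p[p < 0.05]

for label, effect in [("null effect     ", 0.0), ("true effect d=0.5", 0.5)]:
    p = significant_pvalues(effect)
    # Share of significant p-values falling in [0, .01), [.01, .02), ..., [.04, .05).
    bins = np.histogram(p, bins=np.arange(0, 0.051, 0.01))[0] / len(p)
    print(label, "p-curve:", np.round(bins, 2))
```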

  2. Modeling of alpha mass-efficiency curve

    International Nuclear Information System (INIS)

    Semkow, T.M.; Jeter, H.W.; Parsa, B.; Parekh, P.P.; Haines, D.K.; Bari, A.

    2005-01-01

    We present a model for the efficiency of a detector counting gross α radioactivity from both thin and thick samples, corresponding to low and high sample masses in the counting planchette. The model includes self-absorption of α particles in the sample, energy loss in the absorber, range straggling, as well as detector edge effects. The surface roughness of the sample is treated in terms of fractal geometry. The model reveals a linear dependence of the detector efficiency on the sample mass for low masses, as well as a power-law dependence for high masses. It is, therefore, named the linear-power-law (LPL) model. In addition, we consider an empirical power-law (EPL) curve and an exponential (EXP) curve. A comparison is made of the LPL, EPL, and EXP fits to the experimental α mass-efficiency data from gas-proportional detectors for selected radionuclides: 238U, 230Th, 239Pu, 241Am, and 244Cm. Based on this comparison, we recommend working equations for fitting mass-efficiency data. Measurement of α radioactivity from a thick sample can determine the fractal dimension of its surface.

  3. Marginalizing Instrument Systematics in HST WFC3 Transit Light Curves

    Science.gov (United States)

    Wakeford, H. R.; Sing, D. K.; Evans, T.; Deming, D.; Mandell, A.

    2016-03-01

    Hubble Space Telescope (HST) Wide Field Camera 3 (WFC3) infrared observations at 1.1-1.7 μm probe primarily the H2O absorption band at 1.4 μm, and have provided low-resolution transmission spectra for a wide range of exoplanets. We present the application of marginalization based on Gibson to analyze exoplanet transit light curves obtained from HST WFC3, to better determine important transit parameters such as Rp/R*, which is critical for accurate detections of H2O. We approximate the evidence, often referred to as the marginal likelihood, for a grid of systematic models using the Akaike Information Criterion. We then calculate the evidence-based weight assigned to each systematic model and use the information from all tested models to calculate the final marginalized transit parameters for both the band-integrated and spectroscopic light curves to construct the transmission spectrum. We find that a majority of the highest weight models contain a correction for a linear trend in time as well as corrections related to HST orbital phase. We additionally test the dependence on the shift in spectral wavelength position over the course of the observations and find that spectroscopic wavelength shifts δλ(λ) best describe the associated systematic in the spectroscopic light curves for most targets, while fast scan rate observations of bright targets require an additional level of processing to produce a robust transmission spectrum. The use of marginalization allows for transparent interpretation and understanding of the instrument and of the impact of each systematic, evaluated statistically for each data set, expanding the ability to make true and comprehensive comparisons between exoplanet atmospheres.
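    A minimal sketch of the evidence-approximation step described above: each systematic model q receives a weight proportional to exp(-AIC_q/2), and the transit depth is marginalized as a weighted mean with the between-model scatter folded into the uncertainty. The AIC values and depths below are made up.

```python
import numpy as np

# Hypothetical results from fitting the same light curve with different systematic models.
aic   = np.array([152.3, 150.1, 157.8, 149.6, 155.0])             # one AIC per model
depth = np.array([0.01452, 0.01448, 0.01461, 0.01450, 0.01457])   # fitted transit depth
err   = np.array([0.00012, 0.00011, 0.00015, 0.00011, 0.00014])   # per-model uncertainty

# Evidence approximated from the AIC; weights normalized across the model grid.
evidence = np.exp(-0.5 * (aic - aic.min()))
w = evidence / evidence.sum()

depth_marg = np.sum(w * depth)
# Combine the weighted per-model uncertainty with the scatter between models.
var_marg = np.sum(w * err**2) + np.sum(w * (depth - depth_marg)**2)
print(f"marginalized depth = {depth_marg:.5f} +/- {np.sqrt(var_marg):.5f}")
```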

  4. Estimation of Curve Tracing Time in Supercapacitor based PV Characterization

    Science.gov (United States)

    Basu Pal, Sudipta; Das Bhattacharya, Konika; Mukherjee, Dipankar; Paul, Debkalyan

    2017-08-01

    Smooth and noise-free characterisation of photovoltaic (PV) generators has been revisited with renewed interest in view of large-size PV arrays making inroads into the urban sector of major developing countries. Such practice has recently been confronted by the need for a suitable data acquisition system and by the lack of a supporting theoretical analysis to justify the accuracy of curve tracing. However, the use of a selected bank of supercapacitors can mitigate the said problems to a large extent. Assuming a piecewise linear analysis of the V-I characteristics of a PV generator, an accurate analysis of the curve plotting time has been possible. The analysis has been extended to consider the effect of the equivalent series resistance of the supercapacitor, leading to increased accuracy (90-95%) of the curve plotting times.

  5. A simplified early-warning system for imminent landslide prediction based on failure index fragility curves developed through numerical analysis

    Directory of Open Access Journals (Sweden)

    Ugur Ozturk

    2016-07-01

    Early-warning systems (EWSs) are crucial to reduce the risk of landslide, especially where structural measures are not fully capable of preventing the devastating impact of such an event. Furthermore, designing and successfully implementing a complete landslide EWS is a highly complex task. The main technical challenges are linked to the definition of heterogeneous material properties (geotechnical and geomechanical parameters) as well as the variety of triggering factors. In addition, real-time data processing creates significant complexity, since data collection and numerical models for risk assessment are time consuming tasks. Therefore, uncertainties in the physical properties of a landslide, together with the data management, represent the two crucial deficiencies in an efficient landslide EWS. Within this study, the application of the concept of fragility curves to landslides is explored; fragility curves are widely used to simulate system response to natural hazards, i.e. floods or earthquakes. The application of fragility curves to landslide risk assessment is believed to simplify emergency risk assessment, even though it cannot substitute for detailed analysis during peace-time. A simplified risk assessment technique can remove some of the unclear features and decrease data processing time. The method is based on synthetic samples which are used to define the approximate failure thresholds for landslides, taking into account the materials and the piezometric levels. The results are presented in charts. The method presented in this paper, which is called the failure index fragility curve (FIFC), allows assessment of the actual real-time risk in a case study based on the most appropriate FIFC. The application of an FIFC to a real case is presented as an example. This method to assess the landslide risk is another step towards a more integrated dynamic approach to a potential landslide prevention system. Even if it does not define

  6. Correlation between 2D and 3D flow curve modelling of DP steels using a microstructure-based RVE approach

    International Nuclear Information System (INIS)

    Ramazani, A.; Mukherjee, K.; Quade, H.; Prahl, U.; Bleck, W.

    2013-01-01

    A microstructure-based approach by means of representative volume elements (RVEs) is employed to evaluate the flow curve of DP steels using virtual tensile tests. Microstructures with different martensite fractions and morphologies are studied in two- and three-dimensional approaches. Micro sections of DP microstructures with various amounts of martensite have been converted to 2D RVEs, while 3D RVEs were constructed statistically with randomly distributed phases. A dislocation-based model is used to describe the flow curve of the ferrite and martensite phases separately as a function of carbon partitioning and microstructural features. Numerical tensile tests of the RVEs were carried out using the ABAQUS/Standard code to predict the flow behaviour of DP steels. It is observed that 2D plane strain modelling gives an underpredicted flow curve for DP steels, while the 3D modelling gives a quantitatively reasonable description of the flow curve in comparison to the experimental data. In this work, a von Mises stress correlation factor σ3D/σ2D has been identified to compare the predicted flow curves of these two dimensionalities, showing a third-order polynomial relation with respect to martensite fraction and a second-order polynomial relation with respect to equivalent plastic strain, respectively. The quantification of this polynomial correlation factor is performed based on a laboratory-annealed DP600 chemistry with varying martensite content, and it is validated for industrially produced DP qualities with various chemistries, strength levels and martensite fractions.

  7. The estimation of I–V curves of PV panel using manufacturers’ I–V curves and evolutionary strategy

    International Nuclear Information System (INIS)

    Barukčić, M.; Hederić, Ž.; Špoljarić, Ž.

    2014-01-01

    Highlights: • The approximation of an I–V curve by two linear functions and a sigmoid function is proposed. • The sigmoid function is used to estimate the knee of the I–V curve. • The dependence of the sigmoid function parameters on irradiance and temperature is proposed. • The sigmoid function is used to estimate the maximum power point (MPP). - Abstract: A method for the estimation of the I–V curves of a photovoltaic (PV) panel by an analytic expression is presented in the paper. The problem is defined in the form of an optimization problem. The optimization objective is based on data from manufacturers' I–V curves or measured I–V curves. In order to estimate the PV panel parameters, the optimization problem is solved using an evolutionary strategy. The proposed method is tested for different PV panel technologies using data sheets. In this method, the approximation of the I–V curve with two linear functions and a sigmoid function is proposed. A method for estimating the knee of the I–V curve and the maximum power point at any irradiance and temperature is proposed.
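    The paper's exact parameterization is not given in the abstract; the sketch below illustrates the general idea under assumed forms: a current that stays near I_sc at low voltage, drops through a sigmoid knee, and approaches zero at V_oc, from which the maximum power point can be located numerically. All parameter values are illustrative.

```python
import numpy as np

def iv_curve(v, i_sc=8.5, v_oc=37.0, v_knee=30.0, steepness=0.6):
    """Assumed blend of a near-constant branch and a sigmoid knee for a PV I-V curve."""
    knee = 1.0 / (1.0 + np.exp((v - v_knee) / steepness))   # ~1 before the knee, ~0 after
    linear_drop = i_sc * (1.0 - v / v_oc)                    # falling branch toward V_oc
    return np.clip(i_sc * knee + linear_drop * (1.0 - knee), 0.0, None)

v = np.linspace(0.0, 37.0, 500)
p = v * iv_curve(v)
k = np.argmax(p)
print(f"estimated MPP: V = {v[k]:.1f} V, I = {iv_curve(v[k]):.2f} A, P = {p[k]:.1f} W")
```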

  8. NormaCurve: a SuperCurve-based method that simultaneously quantifies and normalizes reverse phase protein array data.

    Directory of Open Access Journals (Sweden)

    Sylvie Troncale

    MOTIVATION: Reverse phase protein array (RPPA) is a powerful dot-blot technology that allows studying protein expression levels as well as post-translational modifications in a large number of samples simultaneously. Yet, correct interpretation of RPPA data has remained a major challenge for its broad-scale application and its translation into clinical research. Satisfactory quantification tools are available to assess a relative protein expression level from a serial dilution curve. However, appropriate tools allowing the normalization of the data for external sources of variation are currently missing. RESULTS: Here we propose a new method, called NormaCurve, that allows simultaneous quantification and normalization of RPPA data. For this, we modified the quantification method SuperCurve in order to include normalization for (i) background fluorescence, (ii) variation in the total amount of spotted protein and (iii) spatial bias on the arrays. Using a spike-in design with a purified protein, we test the capacity of different models to properly estimate normalized relative expression levels. The best performing model, NormaCurve, takes into account a negative control array without primary antibody, an array stained with a total protein stain and spatial covariates. We show that this normalization is reproducible and we discuss the number of serial dilutions and the number of replicates that are required to obtain robust data. We thus provide a ready-to-use method for reliable and reproducible normalization of RPPA data, which should facilitate the interpretation and the development of this promising technology. AVAILABILITY: The raw data, the scripts and the normacurve package are available at the following web site: http://microarrays.curie.fr.

  9. Laser-based additive manufacturing of metals

    CSIR Research Space (South Africa)

    Kumar, S

    2010-11-01

    Full Text Available For making metallic products through Additive Manufacturing (AM) processes, laser-based systems play very significant roles. Laser-based processes such as Selective Laser Melting (SLM) and Laser Engineered Net Shaping (LENS) are dominating processes...

  10. Long-term hydrological simulation based on the Soil Conservation Service curve number

    Science.gov (United States)

    Mishra, Surendra Kumar; Singh, Vijay P.

    2004-05-01

    Presenting a critical review of daily flow simulation models based on the Soil Conservation Service curve number (SCS-CN), this paper introduces a more versatile model based on the modified SCS-CN method, which specializes into seven cases. The proposed model was applied to the Hemavati watershed (area = 600 km2) in India and was found to yield satisfactory results in both calibration and validation. The model conserved monthly and annual runoff volumes satisfactorily. A sensitivity analysis of the model parameters was performed, including the effect of variation in storm duration. Finally, to investigate the model components, all seven variants of the modified version were tested for their suitability.
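
    The SCS-CN rainfall-runoff relation that the reviewed models build on has a compact closed form. Below is a minimal Python sketch of the classical (unmodified) equation, assuming the common initial-abstraction ratio of 0.2; the modified method developed in the paper alters this baseline structure.

```python
def scs_cn_runoff(p_mm, cn, lam=0.2):
    """Direct runoff Q (mm) from rainfall P (mm) with the classical SCS-CN
    relation: S = 25400/CN - 254, Ia = lam*S, Q = (P - Ia)^2 / (P - Ia + S)."""
    s = 25400.0 / cn - 254.0      # potential maximum retention [mm]
    ia = lam * s                  # initial abstraction [mm]
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Example: a 60 mm storm on a watershed with curve number CN = 75.
print(round(scs_cn_runoff(60.0, 75.0), 1), "mm of direct runoff")
```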

  11. Characterizing time series via complexity-entropy curves

    Science.gov (United States)

    Ribeiro, Haroldo V.; Jauregui, Max; Zunino, Luciano; Lenzi, Ervin K.

    2017-06-01

    The search for patterns in time series is a very common task when dealing with complex systems. This is usually accomplished by employing a complexity measure such as entropies and fractal dimensions. However, such measures usually only capture a single aspect of the system dynamics. Here, we propose a family of complexity measures for time series based on a generalization of the complexity-entropy causality plane. By replacing the Shannon entropy by a monoparametric entropy (Tsallis q-entropy) and after considering the proper generalization of the statistical complexity (q-complexity), we build up a parametric curve (the q-complexity-entropy curve) that is used for characterizing and classifying time series. Based on simple exact results and numerical simulations of stochastic processes, we show that these curves can distinguish among different long-range, short-range, and oscillating correlated behaviors. Also, we verify that simulated chaotic and stochastic time series can be distinguished based on whether these curves are open or closed. We further test this technique in experimental scenarios related to chaotic laser intensity, stock price, sunspot, and geomagnetic dynamics, confirming its usefulness. Finally, we prove that these curves enhance the automatic classification of time series with long-range correlations and interbeat intervals of healthy subjects and patients with heart disease.
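
    A Python sketch of the entropy half of this construction is given below: it builds the Bandt-Pompe ordinal-pattern distribution of a series and sweeps the normalized Tsallis q-entropy over q. The companion q-complexity term (a generalized Jensen divergence between the ordinal distribution and the uniform one) is defined in the paper and omitted here for brevity; the signals and the embedding dimension are illustrative.

```python
import itertools
import numpy as np

def ordinal_distribution(x, d=4):
    """Bandt-Pompe probabilities of ordinal patterns of embedding dimension d."""
    patterns = {p: 0 for p in itertools.permutations(range(d))}
    for i in range(len(x) - d + 1):
        patterns[tuple(np.argsort(x[i:i + d]))] += 1
    counts = np.array(list(patterns.values()), dtype=float)
    return counts / counts.sum()

def tsallis_q_entropy(p, q):
    """Normalized Tsallis q-entropy of a distribution p (q != 1)."""
    n = len(p)
    s_q = (1.0 - np.sum(p ** q)) / (q - 1.0)
    s_max = (1.0 - n ** (1.0 - q)) / (q - 1.0)
    return s_q / s_max

# Sweep q to trace the entropy coordinate of a q-complexity-entropy curve
# for white noise versus a noisy sine (illustrative signals only).
rng = np.random.default_rng(0)
noise = rng.normal(size=5000)
tone = np.sin(np.linspace(0, 200 * np.pi, 5000)) + 0.1 * rng.normal(size=5000)
for q in (0.5, 1.5, 3.0):
    print(q, round(tsallis_q_entropy(ordinal_distribution(noise), q), 3),
             round(tsallis_q_entropy(ordinal_distribution(tone), q), 3))
```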

  12. Codes and curves

    CERN Document Server

    Walker, Judy L

    2000-01-01

    When information is transmitted, errors are likely to occur. Coding theory examines efficient ways of packaging data so that these errors can be detected, or even corrected. The traditional tools of coding theory have come from combinatorics and group theory. Lately, however, coding theorists have added techniques from algebraic geometry to their toolboxes. In particular, by re-interpreting the Reed-Solomon codes, one can see how to define new codes based on divisors on algebraic curves. For instance, using modular curves over finite fields, Tsfasman, Vladut, and Zink showed that one can define a sequence of codes with asymptotically better parameters than any previously known codes. This monograph is based on a series of lectures the author gave as part of the IAS/PCMI program on arithmetic algebraic geometry. Here, the reader is introduced to the exciting field of algebraic geometric coding theory. Presenting the material in the same conversational tone of the lectures, the author covers linear codes, inclu...

  13. Modeling Patterns of Activities using Activity Curves.

    Science.gov (United States)

    Dawadi, Prafulla N; Cook, Diane J; Schmitter-Edgecombe, Maureen

    2016-06-01

    Pervasive computing offers an unprecedented opportunity to unobtrusively monitor behavior and use the large amount of collected data to perform analysis of activity-based behavioral patterns. In this paper, we introduce the notion of an activity curve, which represents an abstraction of an individual's normal daily routine based on automatically-recognized activities. We propose methods to detect changes in behavioral routines by comparing activity curves and use these changes to analyze the possibility of changes in cognitive or physical health. We demonstrate our model and evaluate our change detection approach using a longitudinal smart home sensor dataset collected from 18 smart homes with older adult residents. Finally, we demonstrate how big data-based pervasive analytics such as activity curve-based change detection can be used to perform functional health assessment. Our evaluation indicates that correlations do exist between behavior and health changes and that these changes can be automatically detected using smart homes, machine learning, and big data-based pervasive analytics.

  14. Flow over riblet curved surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Loureiro, J B R; Freire, A P Silva, E-mail: atila@mecanica.ufrj.br [Mechanical Engineering Program, Federal University of Rio de Janeiro (COPPE/UFRJ), C.P. 68503, 21.941-972, Rio de Janeiro, RJ (Brazil)

    2011-12-22

    The present work studies the mechanics of turbulent drag reduction over curved surfaces by riblets. The effects of surface modification on flow separation over steep and smooth curved surfaces are investigated. Four types of two-dimensional surfaces are studied based on the morphometric parameters that describe the body of a blue whale. Local measurements of mean velocity and turbulence profiles are obtained through laser Doppler anemometry (LDA) and particle image velocimetry (PIV).

  15. A Method for Formulizing Disaster Evacuation Demand Curves Based on SI Model

    Directory of Open Access Journals (Sweden)

    Yulei Song

    2016-10-01

    Full Text Available The prediction of evacuation demand curves is a crucial step in the disaster evacuation plan making, which directly affects the performance of the disaster evacuation. In this paper, we discuss the factors influencing individual evacuation decision making (whether and when to leave) and summarize them into four kinds: individual characteristics, social influence, geographic location, and warning degree. In the view of social contagion of decision making, a method based on the Susceptible-Infective (SI) model is proposed to formulize the disaster evacuation demand curves to address both social influence and other factors’ effects. The disaster event of the “Tianjin Explosions” is used as a case study to illustrate the modeling results influenced by the four factors and perform the sensitivity analyses of the key parameters of the model. Some interesting phenomena are found and discussed, which is meaningful for authorities to make specific evacuation plans. For example, due to the lower social influence in isolated communities, extra actions might be taken to accelerate evacuation process in those communities.
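
    The calibrated parameter values and the exact coupling of the four factor groups are not reproduced above, so the Python sketch below only shows the SI-type mechanism itself: the evacuation decision spreads by contact between "decided" and "undecided" residents, and the cumulative decided fraction traces the demand curve. The population size, the seed, and the transmission rate beta are illustrative assumptions, not values from the paper.

```python
import numpy as np

def si_evacuation_curve(beta, n, i0, hours, dt=0.1):
    """Discrete-time SI contagion of the evacuation decision:
    dI/dt = beta * S * I / N. Returns times and the cumulative decided fraction."""
    steps = int(hours / dt)
    t = np.arange(steps + 1) * dt
    i = np.empty(steps + 1)
    i[0] = i0
    for k in range(steps):
        s = n - i[k]
        i[k + 1] = i[k] + dt * beta * s * i[k] / n
    return t, i / n

# Hypothetical community of 10,000 residents; beta bundles warning degree,
# social influence and location effects (illustrative value only).
t, frac = si_evacuation_curve(beta=0.6, n=10_000, i0=50, hours=24)
half = t[np.searchsorted(frac, 0.5)]
print(f"half the population has decided to evacuate after ~{half:.1f} h")
```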

  16. Training, Simulation, the Learning Curve, and How to Reduce Complications in Urology.

    Science.gov (United States)

    Brunckhorst, Oliver; Volpe, Alessandro; van der Poel, Henk; Mottrie, Alexander; Ahmed, Kamran

    2016-04-01

    Urology is at the forefront of minimally invasive surgery to a great extent. These procedures produce additional learning challenges and possess a steep initial learning curve. Training and assessment methods in surgical specialties such as urology are known to lack clear structure and often rely on differing operative flow experienced by individuals and institutions. This article aims to assess current urology training modalities, to identify the role of simulation within urology, to define and identify the learning curves for various urologic procedures, and to discuss ways to decrease complications in the context of training. A narrative review of the literature was conducted through December 2015 using the PubMed/Medline, Embase, and Cochrane Library databases. Evidence of the validity of training methods in urology includes observation of a procedure, mentorship and fellowship, e-learning, and simulation-based training. Learning curves for various urologic procedures have been recommended based on the available literature. The importance of structured training pathways is highlighted, with integration of modular training to ensure patient safety. Valid training pathways are available in urology. The aim in urology training should be to combine all of the available evidence to produce procedure-specific curricula that utilise the vast array of training methods available to ensure that we continue to improve patient outcomes and reduce complications. The current evidence for different training methods available in urology, including simulation-based training, was reviewed, and the learning curves for various urologic procedures were critically analysed. Based on the evidence, future pathways for urology curricula have been suggested to ensure that patient safety is improved. Copyright © 2016 European Association of Urology. Published by Elsevier B.V. All rights reserved.

  17. An Elliptic Curve Based Schnorr Cloud Security Model in Distributed Environment

    Directory of Open Access Journals (Sweden)

    Vinothkumar Muthurajan

    2016-01-01

    Full Text Available Cloud computing requires the security upgrade in data transmission approaches. In general, key-based encryption/decryption (symmetric and asymmetric) mechanisms ensure the secure data transfer between the devices. The symmetric key mechanisms (pseudorandom function) provide minimum protection level compared to asymmetric key (RSA, AES, and ECC) schemes. The presence of expired content and the irrelevant resources cause unauthorized data access adversely. This paper investigates how the integrity and secure data transfer are improved based on the Elliptic Curve based Schnorr scheme. This paper proposes a virtual machine based cloud model with Hybrid Cloud Security Algorithm (HCSA) to remove the expired content. The HCSA-based auditing improves the malicious activity prediction during the data transfer. The duplication in the cloud server degrades the performance of EC-Schnorr based encryption schemes. This paper utilizes the blooming filter concept to avoid the cloud server duplication. The combination of EC-Schnorr and blooming filter efficiently improves the security performance. The comparative analysis between proposed HCSA and the existing Distributed Hash Table (DHT) regarding execution time, computational overhead, and auditing time with auditing requests and servers confirms the effectiveness of HCSA in the cloud security model creation.

  18. An Elliptic Curve Based Schnorr Cloud Security Model in Distributed Environment.

    Science.gov (United States)

    Muthurajan, Vinothkumar; Narayanasamy, Balaji

    2016-01-01

    Cloud computing requires the security upgrade in data transmission approaches. In general, key-based encryption/decryption (symmetric and asymmetric) mechanisms ensure the secure data transfer between the devices. The symmetric key mechanisms (pseudorandom function) provide minimum protection level compared to asymmetric key (RSA, AES, and ECC) schemes. The presence of expired content and the irrelevant resources cause unauthorized data access adversely. This paper investigates how the integrity and secure data transfer are improved based on the Elliptic Curve based Schnorr scheme. This paper proposes a virtual machine based cloud model with Hybrid Cloud Security Algorithm (HCSA) to remove the expired content. The HCSA-based auditing improves the malicious activity prediction during the data transfer. The duplication in the cloud server degrades the performance of EC-Schnorr based encryption schemes. This paper utilizes the blooming filter concept to avoid the cloud server duplication. The combination of EC-Schnorr and blooming filter efficiently improves the security performance. The comparative analysis between proposed HCSA and the existing Distributed Hash Table (DHT) regarding execution time, computational overhead, and auditing time with auditing requests and servers confirms the effectiveness of HCSA in the cloud security model creation.
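
    Of the two building blocks, the Bloom ("blooming") filter used to avoid duplicated content on the cloud server is easy to sketch; the EC-Schnorr signing and auditing steps are not reproduced here. The minimal Python filter below uses salted SHA-256 as its hash family, which is a choice of this sketch, not something specified by the paper.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter with k hash functions derived from salted SHA-256."""
    def __init__(self, m_bits=8192, k_hashes=5):
        self.m, self.k = m_bits, k_hashes
        self.bits = bytearray(m_bits // 8)

    def _positions(self, item: bytes):
        for salt in range(self.k):
            digest = hashlib.sha256(bytes([salt]) + item).digest()
            yield int.from_bytes(digest[:8], "big") % self.m

    def add(self, item: bytes):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def probably_contains(self, item: bytes) -> bool:
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

# Reject an upload whose content digest is (probably) already stored.
store = BloomFilter()
store.add(hashlib.sha256(b"encrypted block #1").digest())
print(store.probably_contains(hashlib.sha256(b"encrypted block #1").digest()))  # True
print(store.probably_contains(hashlib.sha256(b"encrypted block #2").digest()))  # False (w.h.p.)
```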

  19. Vulnerability curves vs. vulnerability indicators: application of an indicator-based methodology for debris-flow hazards

    Science.gov (United States)

    Papathoma-Köhle, Maria

    2016-08-01

    The assessment of the physical vulnerability of elements at risk as part of the risk analysis is an essential aspect for the development of strategies and structural measures for risk reduction. Understanding, analysing and, if possible, quantifying physical vulnerability is a prerequisite for designing strategies and adopting tools for its reduction. The most common methods for assessing physical vulnerability are vulnerability matrices, vulnerability curves and vulnerability indicators; however, in most of the cases, these methods are used in a conflicting way rather than in combination. The article focuses on two of these methods: vulnerability curves and vulnerability indicators. Vulnerability curves express physical vulnerability as a function of the intensity of the process and the degree of loss, considering, in individual cases only, some structural characteristics of the affected buildings. However, a considerable amount of studies argue that vulnerability assessment should focus on the identification of these variables that influence the vulnerability of an element at risk (vulnerability indicators). In this study, an indicator-based methodology (IBM) for mountain hazards including debris flow (Kappes et al., 2012) is applied to a case study for debris flows in South Tyrol, where in the past a vulnerability curve has been developed. The relatively "new" indicator-based method is being scrutinised and recommendations for its improvement are outlined. The comparison of the two methodological approaches and their results is challenging since both methodological approaches deal with vulnerability in a different way. However, it is still possible to highlight their weaknesses and strengths, show clearly that both methodologies are necessary for the assessment of physical vulnerability and provide a preliminary "holistic methodological framework" for physical vulnerability assessment showing how the two approaches may be used in combination in the future.

  20. Exploring Algorithms for Stellar Light Curves With TESS

    Science.gov (United States)

    Buzasi, Derek

    2018-01-01

    The Kepler and K2 missions have produced tens of thousands of stellar light curves, which have been used to measure rotation periods, characterize photometric activity levels, and explore phenomena such as differential rotation. The quasi-periodic nature of rotational light curves, combined with the potential presence of additional periodicities not due to rotation, complicates the analysis of these time series and makes characterization of uncertainties difficult. A variety of algorithms have been used for the extraction of rotational signals, including autocorrelation functions, discrete Fourier transforms, Lomb-Scargle periodograms, wavelet transforms, and the Hilbert-Huang transform. In addition, in the case of K2 a number of different pipelines have been used to produce initial detrended light curves from the raw image frames.In the near future, TESS photometry, particularly that deriving from the full-frame images, will dramatically further expand the number of such light curves, but details of the pipeline to be used to produce photometry from the FFIs remain under development. K2 data offers us an opportunity to explore the utility of different reduction and analysis tool combinations applied to these astrophysically important tasks. In this work, we apply a wide range of algorithms to light curves produced by a number of popular K2 pipeline products to better understand the advantages and limitations of each approach and provide guidance for the most reliable and most efficient analysis of TESS stellar data.
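
    Of the algorithms listed, the Lomb-Scargle periodogram is the usual first choice for unevenly sampled light curves. The Python sketch below recovers an injected rotation period from a synthetic, irregularly sampled series using scipy; the cadence, noise level and 3.7-day signal are illustrative stand-ins for pipeline-produced K2/TESS photometry.

```python
import numpy as np
from scipy.signal import lombscargle

# Synthetic, unevenly sampled light curve: 3.7-day rotation signal plus noise.
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 27.0, 800))          # days (roughly one TESS sector)
flux = 1.0 + 0.01 * np.sin(2 * np.pi * t / 3.7) + 0.004 * rng.normal(size=t.size)

periods = np.linspace(0.5, 15.0, 4000)            # trial periods [days]
ang_freqs = 2 * np.pi / periods                   # lombscargle expects angular frequencies
power = lombscargle(t, flux - flux.mean(), ang_freqs)

print(f"best period: {periods[np.argmax(power)]:.2f} days")
```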

  1. Designing an ASIP for cryptographic pairings over Barreto-Naehrig curves

    NARCIS (Netherlands)

    Kammler, D.; Zhang, D.; Schwabe, P.; Scharwaechter, H.; Langenberg, M.; Auras, D.; Ascheid, G.; Mathar, R.; Clavier, C.; Gaj, K.

    2009-01-01

    This paper presents a design-space exploration of an application-specific instruction-set processor (ASIP) for the computation of various cryptographic pairings over Barreto-Naehrig curves (BN curves). Cryptographic pairings are based on elliptic curves over finite fields—in the case of BN curves a

  2. Remote sensing used for power curves

    International Nuclear Information System (INIS)

    Wagner, R; Joergensen, H E; Paulsen, U S; Larsen, T J; Antoniou, I; Thesbjerg, L

    2008-01-01

    Power curve measurement for large wind turbines requires taking into account more parameters than only the wind speed at hub height. Based on results from aerodynamic simulations, an equivalent wind speed taking the wind shear into account was defined and found to reduce the power standard deviation in the power curve significantly. Two LiDARs and a SoDAR are used to measure the wind profile in front of a wind turbine. These profiles are used to calculate the equivalent wind speed. The comparison of the power curves obtained with the three instruments to the traditional power curve, obtained using a cup anemometer measurement, confirms the results obtained from the simulations. Using LiDAR profiles reduces the error in power curve measurement, when these are used as relative instrument together with a cup anemometer. Results from the SoDAR do not show such promising results, probably because of noisy measurements resulting in distorted profiles
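
    The equivalent wind speed is built from the measured wind profile rather than the hub-height speed alone. A common formulation (assumed here; the paper's exact definition may differ in detail) weights the cubed speed at each measurement height by the area of the corresponding horizontal slice of the rotor disc, which preserves the kinetic energy flux. The Python sketch below uses illustrative profile values and slice areas.

```python
import numpy as np

def equivalent_wind_speed(speeds, seg_areas):
    """Kinetic-energy-flux weighted ('equivalent') wind speed over the rotor:
    u_eq = (sum_i u_i^3 * A_i / A_total)^(1/3).
    Assumed formulation; A_i are the horizontal slices of the rotor disc
    associated with each measurement height."""
    speeds = np.asarray(speeds, float)
    seg_areas = np.asarray(seg_areas, float)
    return (np.sum(speeds ** 3 * seg_areas) / seg_areas.sum()) ** (1.0 / 3.0)

# Illustrative LiDAR profile across a rotor (heights low -> high) with shear;
# the segment areas are hypothetical slice areas of the rotor disc [m^2].
profile = [6.8, 7.3, 7.7, 8.0, 8.3]            # m/s
areas = [900.0, 1500.0, 1700.0, 1500.0, 900.0]
print(f"equivalent wind speed: {equivalent_wind_speed(profile, areas):.2f} m/s")
```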

  3. Learning Curve Analysis and Surgical Outcomes of Single-port Laparoscopic Myomectomy.

    Science.gov (United States)

    Lee, Hee Jun; Kim, Ju Yeong; Kim, Seul Ki; Lee, Jung Ryeol; Suh, Chang Suk; Kim, Seok Hyun

    2015-01-01

    To identify learning curves for single-port laparoscopic myomectomy (SPLM) and evaluate surgical outcomes according to the sequence of operation. A retrospective study. A university-based hospital (Canadian Task Force classification II-2). The medical records from 205 patients who had undergone SPLM from October 2009 to May 2013 were reviewed. Because the myomectomy time was significantly affected by the size and number of myomas removed by SPLM, cases in which 2 or more of the myomas removed were >7 cm in diameter were excluded. Furthermore, cases involving additional operations performed simultaneously (e.g., ovarian or hysteroscopic surgery) were also excluded. A total of 161 cases of SPLM were included. None. We assessed the SPLM learning curve via a graph based on operation time versus sequence of cases. Patients were chronologically arranged according to their surgery dates and were then placed into 1 of 4 groups according to their operation sequence. SPLM was completed successfully in 160 of 161 cases (99.4%). One case was converted to multiport surgery. Basal characteristics of the patients between the 4 groups did not differ. The median operation times for the 4 groups were 112.0, 92.8, 83.7, and 90.0 minutes, respectively. Operation time decreased significantly in the second, third, and fourth groups compared with that in the first group. A plateau, where the learning curve became less steep, was evident after about 45 operations. Results from the current study suggested that proficiency for SPLM was achieved after about 45 operations. Additionally, operation time decreased with experience without an increase in complication rate. Copyright © 2015 AAGL. Published by Elsevier Inc. All rights reserved.

  4. Statistical re-evaluation of the ASME KIC and KIR fracture toughness reference curves

    International Nuclear Information System (INIS)

    Wallin, K.; Rintamaa, R.

    1998-01-01

    Historically the ASME reference curves have been treated as representing absolute deterministic lower bound curves of fracture toughness. In reality, this is not the case. They represent only deterministic lower bound curves to a specific set of data, which represent a certain probability range. A recently developed statistical lower bound estimation method called the 'Master curve', has been proposed as a candidate for a new lower bound reference curve concept. From a regulatory point of view, the Master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on application. In order to be able to substitute the old ASME reference curves with lower bound curves based on the master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. In order to estimate the true inherent level of safety, represented by the reference curves, the original data base was re-evaluated with statistical methods and compared to an analysis based on the master curve concept. The analysis reveals that the 5% lower bound Master curve has the same inherent degree of safety as originally intended for the KIC-reference curve. Similarly, the 1% lower bound Master curve corresponds to the KIR-reference curve. (orig.)
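
    The Master curve concept referred to here is the one standardized in ASTM E1921, which ties the full toughness-temperature distribution to a single reference temperature T0. A minimal Python sketch of the percentile curves (assuming the standard 1T-size coefficients; the governing standard should be consulted before any real use) is:

```python
import numpy as np

def master_curve_kjc(temp_c, t0_c, fraction=0.5):
    """Master Curve fracture toughness percentile (1T size, ASTM E1921-type
    formulation): K0 = 31 + 77*exp(0.019*(T - T0)),
    K_Jc(p) = 20 + (K0 - 20) * (-ln(1 - p))**0.25   [MPa*sqrt(m)]."""
    k0 = 31.0 + 77.0 * np.exp(0.019 * (np.asarray(temp_c, float) - t0_c))
    return 20.0 + (k0 - 20.0) * (-np.log(1.0 - fraction)) ** 0.25

# Median and 5% lower-bound curves for an illustrative T0 of -70 C.
temps = np.array([-100.0, -50.0, 0.0])
for p, label in [(0.05, "5% lower bound"), (0.5, "median")]:
    print(label, np.round(master_curve_kjc(temps, t0_c=-70.0, fraction=p), 1))
```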

  5. A study of swing-curve physics in diffraction-based overlay

    Science.gov (United States)

    Bhattacharyya, Kaustuve; den Boef, Arie; Storms, Greet; van Heijst, Joost; Noot, Marc; An, Kevin; Park, Noh-Kyoung; Jeon, Se-Ra; Oh, Nang-Lyeom; McNamara, Elliott; van de Mast, Frank; Oh, SeungHwa; Lee, Seung Yoon; Hwang, Chan; Lee, Kuntack

    2016-03-01

    With the increase of process complexity in advanced nodes, the requirements of process robustness in overlay metrology continues to tighten. Especially with the introduction of newer materials in the film-stack along with typical stack variations (thickness, optical properties, profile asymmetry etc.), the signal formation physics in diffraction-based overlay (DBO) becomes an important aspect to apply in overlay metrology target and recipe selection. In order to address the signal formation physics, an effort is made towards studying the swing-curve phenomena through wavelength and polarizations on production stacks using simulations as well as experimental technique using DBO. The results provide a wealth of information on target and recipe selection for robustness. Details from simulation and measurements will be reported in this technical publication.

  6. Dissolution glow curve in LLD

    International Nuclear Information System (INIS)

    Haverkamp, U.; Wiezorek, C.; Poetter, R.

    1990-01-01

    Lyoluminescence dosimetry is based upon light emission during dissolution of previously irradiated dosimetric materials. The lyoluminescence signal is expressed in the dissolution glow curve. These curves begin, depending on the dissolution system, with a high peak followed by an exponentially decreasing intensity. System parameters that influence the graph of the dissolution glow curve, are, for example, injection speed, temperature and pH value of the solution and the design of the dissolution cell. The initial peak does not significantly correlate with the absorbed dose, it is mainly an effect of the injection. The decay of the curve consists of two exponential components: one fast and one slow. The components depend on the absorbed dose and the dosimetric materials used. In particular, the slow component correlates with the absorbed dose. In contrast to the fast component the argument of the exponential function of the slow component is independent of the dosimetric materials investigated: trehalose, glucose and mannitol. The maximum value, following the peak of the curve, and the integral light output are a measure of the absorbed dose. The reason for the different light outputs of various dosimetric materials after irradiation with the same dose is the differing solubility. The character of the dissolution glow curves is the same following irradiation with photons, electrons or neutrons. (author)
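
    Since the post-peak decay is described as the sum of a fast and a slow exponential component, extracting the dose-correlated slow component is a routine curve fit. The Python sketch below does this on synthetic data with scipy.optimize.curve_fit; the amplitudes, rate constants and noise level are illustrative, not values from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_exp(t, a_fast, k_fast, a_slow, k_slow):
    """Post-peak lyoluminescence decay modelled as a sum of two exponentials."""
    return a_fast * np.exp(-k_fast * t) + a_slow * np.exp(-k_slow * t)

# Synthetic post-peak signal (arbitrary units); in practice this would be the
# recorded light output after the injection peak.
t = np.linspace(0.0, 60.0, 300)                      # seconds
rng = np.random.default_rng(2)
signal = two_exp(t, 50.0, 0.8, 12.0, 0.05) + 0.3 * rng.normal(size=t.size)

popt, _ = curve_fit(two_exp, t, signal, p0=[40.0, 1.0, 10.0, 0.1])
a_fast, k_fast, a_slow, k_slow = popt
print(f"slow component: amplitude {a_slow:.1f}, rate {k_slow:.3f} 1/s")
```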

  7. Memristance controlling approach based on modification of linear M—q curve

    International Nuclear Information System (INIS)

    Liu Hai-Jun; Li Zhi-Wei; Yu Hong-Qi; Sun Zhao-Lin; Nie Hong-Shan

    2014-01-01

    The memristor has broad application prospects in many fields, while in many cases, those fields require accurate impedance control. The nonlinear model is of great importance for realizing memristance control accurately, but the implementing complexity caused by iteration has limited the actual application of this model. Considering the approximate linear characteristics at the middle region of the memristance-charge (M—q) curve of the nonlinear model, this paper proposes a memristance controlling approach, which is achieved by linearizing the middle region of the M—q curve of the nonlinear memristor, and establishes the linear relationship between memristances M and input excitations so that it can realize impedance control precisely by only adjusting input signals briefly. First, it analyzes the feasibility for linearizing the middle part of the M—q curve of the memristor with a nonlinear model from the qualitative perspective. Then, the linearization equations of the middle region of the M—q curve is constructed by using the shift method, and under a sinusoidal excitation case, the analytical relation between the memristance M and the charge time t is derived through the Taylor series expansions. At last, the performance of the proposed approach is demonstrated, including the linearizing capability for the middle part of the M—q curve of the nonlinear model memristor, the controlling ability for memristance M, and the influence of input excitation on linearization errors. (interdisciplinary physics and related areas of science and technology)

  8. LCC: Light Curves Classifier

    Science.gov (United States)

    Vo, Martin

    2017-08-01

    Light Curves Classifier uses data mining and machine learning to obtain and classify desired objects. This task can be accomplished by attributes of light curves or any time series, including shapes, histograms, or variograms, or by other available information about the inspected objects, such as color indices, temperatures, and abundances. After specifying features which describe the objects to be searched, the software trains on a given training sample, and can then be used for unsupervised clustering for visualizing the natural separation of the sample. The package can be also used for automatic tuning parameters of used methods (for example, number of hidden neurons or binning ratio). Trained classifiers can be used for filtering outputs from astronomical databases or data stored locally. The Light Curve Classifier can also be used for simple downloading of light curves and all available information of queried stars. It natively can connect to OgleII, OgleIII, ASAS, CoRoT, Kepler, Catalina and MACHO, and new connectors or descriptors can be implemented. In addition to direct usage of the package and command line UI, the program can be used through a web interface. Users can create jobs for ”training” methods on given objects, querying databases and filtering outputs by trained filters. Preimplemented descriptors, classifier and connectors can be picked by simple clicks and their parameters can be tuned by giving ranges of these values. All combinations are then calculated and the best one is used for creating the filter. Natural separation of the data can be visualized by unsupervised clustering.

  9. Rational quadratic trigonometric Bézier curve based on new basis with exponential functions

    Directory of Open Access Journals (Sweden)

    Wu Beibei

    2017-06-01

    Full Text Available We construct a rational quadratic trigonometric Bézier curve with four shape parameters by introducing two exponential functions into the trigonometric basis functions in this paper. It has the similar properties as the rational quadratic Bézier curve. For given control points, the shape of the curve can be flexibly adjusted by changing the shape parameters and the weight. Some conics can be exactly represented when the control points, the shape parameters and the weight are chosen appropriately. The C0, C1 and C2 continuous conditions for joining two constructed curves are discussed. Some examples are given.

  10. Development of Load Duration Curve System in Data Scarce Watersheds Based on a Distributed Hydrological Model

    Science.gov (United States)

    WANG, J.

    2017-12-01

    In stream water quality control, the total maximum daily load (TMDL) program is very effective. However, the load duration curves (LDC) of TMDL are difficult to establish because sufficient observed flow and pollutant data cannot be provided in data-scarce watersheds in which no hydrological stations or long-term consecutive hydrological records are available. Although point sources and non-point sources of pollutants can be distinguished easily with the aid of LDC, where a pollutant comes from and to where it will be transported in the watershed cannot be traced by LDC. To seek out the best management practices (BMPs) of pollutants in a watershed, and to overcome the limitation of LDC, we proposed to develop LDC based on a distributed hydrological model of SWAT for water quality management in data-scarce river basins. In this study, firstly, the distributed hydrological model of SWAT was established with the scarce hydrological data. Then, long-term daily flows were generated with the established SWAT model and rainfall data from the adjacent weather station. A flow duration curve (FDC) was then developed with the aid of the daily flows generated by the SWAT model. Considering the goal of water quality management, LDC curves of different pollutants can be obtained based on the FDC. With the monitored water quality data and the LDC curves, the water quality problems caused by point or non-point source pollutants in different seasons can be ascertained. Finally, the distributed hydrological model of SWAT was employed again to trace the spatial distribution of the pollutants and to identify the agricultural practices and/or other human activities from which they originate. A case study was conducted in the Jian-jiang river, a tributary of the Yangtze river, in Duyun city, Guizhou province. Results indicate that this method can realize water quality management based on TMDL and identify suitable BMPs for reducing pollutants in a watershed.
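
    The FDC-to-LDC step is mechanical once a long daily flow series exists. The Python sketch below ranks flows into a flow duration curve and converts it into an allowable-load duration curve for an assumed water-quality target concentration; the synthetic flows and the 10 mg/L target are illustrative, standing in for SWAT output and an actual TMDL target.

```python
import numpy as np

def flow_duration_curve(daily_flows_m3s):
    """Return (exceedance probability in %, flows sorted high to low)."""
    q = np.sort(np.asarray(daily_flows_m3s, float))[::-1]     # descending
    ranks = np.arange(1, q.size + 1)
    exceedance = 100.0 * ranks / (q.size + 1)                 # Weibull plotting position
    return exceedance, q

def load_duration_curve(q_m3s, target_mg_per_l):
    """Allowable load (kg/day) = Q [m3/s] * C_target [mg/L] * 86.4."""
    return q_m3s * target_mg_per_l * 86.4

# Synthetic daily flows standing in for the SWAT-generated series (~10 years).
rng = np.random.default_rng(3)
flows = np.exp(rng.normal(1.0, 0.8, size=3650))               # m3/s
ex, q = flow_duration_curve(flows)
allowable = load_duration_curve(q, target_mg_per_l=10.0)      # e.g. a 10 mg/L target
print(f"median flow {np.interp(50.0, ex, q):.2f} m3/s, "
      f"allowable load at median {np.interp(50.0, ex, allowable):.0f} kg/day")
```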

  11. Multivariate analysis of diagnostic parameters derived from whole-kidney and parenchymal time-activity curves

    International Nuclear Information System (INIS)

    Bergmann, H.; Mostbeck, A.; Samal, M.; Nimmon, C.C.; Staudenherz, A.; Dudczak, R.

    2002-01-01

    Aim: In a previous work, we confirmed earlier reports that time-activity curves of the renal cortex provide additional useful diagnostic information. The aim of this experiment was to support the finding quantitatively using multiple regression. Materials and Methods: In a retrospective study, we analyzed MAG3 renal data (90 kidneys in 57 children). Whole-kidney (WK) and parenchymal (PA) time-activity curves were extracted from the 20-min pre-diuretic phase using standard WK and parenchymal fuzzy ROIs. Using multiple regression analysis, peak time, mean transit time, output efficiency, and four additional indices of residual activity in WK and PA ROIs were related to the maximum elimination rate (EM) of urine after the diuretic. The kidneys were divided into four groups according to the WK peak time (WKPT): WKPT longer than 0 (all kidneys), 5, 10, and 15 min. Results: Multiple correlation coefficients between the set of WK, PA, and WK+PA curve parameters (independent variables) and the log EM (dependent variable) for each group are summarized. Conclusions: Using pre-diuretic time-activity curves, it is possible to predict diuretic response. This can be useful when interpreting dubious results. Parenchymal curves predict diuretic response better than the whole-kidney curves. With increasing WKPT the whole-kidney curves become useless, while the parenchymal curves are still useful. Using both WK and PA curves produces the best results. This demonstrates that both WK and PA curves carry independent diagnostic information. The contribution obtained from the parenchymal curves is certainly worth the difficulties and time required to draw additional ROIs. However, substantial effort has to be devoted to the accurate and reproducible definition of parenchymal ROIs.

  12. IGMtransmission: Transmission curve computation

    Science.gov (United States)

    Harrison, Christopher M.; Meiksin, Avery; Stock, David

    2015-04-01

    IGMtransmission is a Java graphical user interface that implements Monte Carlo simulations to compute the corrections to colors of high-redshift galaxies due to intergalactic attenuation based on current models of the Intergalactic Medium. The effects of absorption due to neutral hydrogen are considered, with particular attention to the stochastic effects of Lyman Limit Systems. Attenuation curves are produced, as well as colors for a wide range of filter responses and model galaxy spectra. Photometric filters are included for the Hubble Space Telescope, the Keck telescope, the Mt. Palomar 200-inch, the SUBARU telescope and UKIRT; alternative filter response curves and spectra may be readily uploaded.

  13. Fractal based curves in musical creativity: A critical annotation

    Science.gov (United States)

    Georgaki, Anastasia; Tsolakis, Christos

    In this article we examine fractal curves and synthesis algorithms in musical composition and research. First we trace the evolution of different approaches for the use of fractals in music since the 80's by a literature review. Furthermore, we review representative fractal algorithms and platforms that implement them. Properties such as self-similarity (pink noise), correlation, memory (related to the notion of Brownian motion) or non correlation at multiple levels (white noise), can be used to develop hierarchy of criteria for analyzing different layers of musical structure. L-systems can be applied in the modelling of melody in different musical cultures as well as in the investigation of musical perception principles. Finally, we propose a critical investigation approach for the use of artificial or natural fractal curves in systematic musicology.

  14. Supply-cost curves for geographically distributed renewable-energy resources

    International Nuclear Information System (INIS)

    Izquierdo, Salvador; Dopazo, Cesar; Fueyo, Norberto

    2010-01-01

    The supply-cost curves of renewable-energy sources are an essential tool to synthesize and analyze large-scale energy-policy scenarios, both in the short and long terms. Here, we suggest and test a parametrization of such curves that allows their representation for modeling purposes with a minimal set of information. In essence, an economic potential is defined based on the mode of the marginal supply-cost curves; and, using this definition, a normalized log-normal distribution function is used to model these curves. The feasibility of this proposal is assessed with data from a GIS-based analysis of solar, wind and biomass technologies in Spain. The best agreement is achieved for solar energy.

  15. Material Properties Test to Determine Ultimate Strain and True Stress-True Strain Curves for High Yield Steels

    Energy Technology Data Exchange (ETDEWEB)

    K.R. Arpin; T.F. Trimble

    2003-04-01

    This testing was undertaken to develop material true stress-true strain curves for elastic-plastic material behavior for use in performing transient analysis. Based on the conclusions of this test, the true stress-true strain curves derived herein are valid for use in elastic-plastic finite element analysis for structures fabricated from these materials. In addition, for the materials tested herein, the ultimate strain values are greater than those values cited as the limits for the elastic-plastic strain acceptance criteria for transient analysis.
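
    For reference, the conversion behind such true stress-true strain curves (valid only up to the onset of necking) is the standard pair of relations below; the tabulated points in the Python sketch are illustrative, not the report's measured data.

```python
import numpy as np

def true_stress_strain(eng_strain, eng_stress_mpa):
    """Convert engineering to true stress-strain (valid up to necking):
    eps_true = ln(1 + eps_eng),  sigma_true = sigma_eng * (1 + eps_eng)."""
    eng_strain = np.asarray(eng_strain, float)
    eng_stress = np.asarray(eng_stress_mpa, float)
    return np.log1p(eng_strain), eng_stress * (1.0 + eng_strain)

# Illustrative tensile-test points for a high-yield steel (hypothetical values).
eps_eng = np.array([0.002, 0.02, 0.05, 0.10])
sig_eng = np.array([550.0, 600.0, 640.0, 670.0])      # MPa
eps_true, sig_true = true_stress_strain(eps_eng, sig_eng)
for e, s in zip(eps_true, sig_true):
    print(f"eps_true = {e:.4f}, sigma_true = {s:.1f} MPa")
```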

  16. Model-based Approach for Long-term Creep Curves of Alloy 617 for a High Temperature Gas-cooled Reactor

    International Nuclear Information System (INIS)

    Kim, Woo Gon; Yin, Song Nan; Kim, Yong Wan

    2008-01-01

    Alloy 617 is a principal candidate alloy for high temperature gas-cooled reactor (HTGR) components, because of its high creep rupture strength coupled with its good corrosion behavior in simulated HTGR-helium and its sufficient workability. To describe a creep strain-time curve well, various constitutive equations have been proposed by Kachanov-Rabotnov, Andrade, Garofalo, Evans, Maruyama and others. Among them, the K-R model has been used frequently, because it adequately considers secondary creep, resulting from a balance between softening and hardening of the material, and tertiary creep, resulting from the appearance and acceleration of internal or external damage processes. In the case of nickel-base alloys, it has been reported that tertiary creep may set in at a low strain, and this tertiary stage may govern the total creep deformation. Therefore, a creep curve for nickel-based Alloy 617 will be predicted appropriately by using the K-R model, which can reflect tertiary creep. In this paper, the long-term creep curves for Alloy 617 were predicted by using the nonlinear least square fitting (NLSF) method with the K-R model. The modified K-R model was introduced to fit the full creep curves well. The values for the λ and K parameters in the modified K-R model were obtained with stresses

  17. Smooth time-dependent receiver operating characteristic curve estimators.

    Science.gov (United States)

    Martínez-Camblor, Pablo; Pardo-Fernández, Juan Carlos

    2018-03-01

    The receiver operating characteristic curve is a popular graphical method often used to study the diagnostic capacity of continuous (bio)markers. When the considered outcome is a time-dependent variable, two main extensions have been proposed: the cumulative/dynamic receiver operating characteristic curve and the incident/dynamic receiver operating characteristic curve. In both cases, the main problem for developing appropriate estimators is the estimation of the joint distribution of the variables time-to-event and marker. As usual, different approximations lead to different estimators. In this article, the authors explore the use of a bivariate kernel density estimator which accounts for censored observations in the sample and produces smooth estimators of the time-dependent receiver operating characteristic curves. The performance of the resulting cumulative/dynamic and incident/dynamic receiver operating characteristic curves is studied by means of Monte Carlo simulations. Additionally, the influence of the choice of the required smoothing parameters is explored. Finally, two real-applications are considered. An R package is also provided as a complement to this article.

  18. Anterior Overgrowth in Primary Curves, Compensatory Curves and Junctional Segments in Adolescent Idiopathic Scoliosis.

    Science.gov (United States)

    Schlösser, Tom P C; van Stralen, Marijn; Chu, Winnie C W; Lam, Tsz-Ping; Ng, Bobby K W; Vincken, Koen L; Cheng, Jack C Y; Castelein, René M

    2016-01-01

    Although much attention has been given to the global three-dimensional aspect of adolescent idiopathic scoliosis (AIS), the accurate three-dimensional morphology of the primary and compensatory curves, as well as the intervening junctional segments, in the scoliotic spine has not been described before. A unique series of 77 AIS patients with high-resolution CT scans of the spine, acquired for surgical planning purposes, were included and compared to 22 healthy controls. Non-idiopathic curves were excluded. Endplate segmentation and local longitudinal axis in endplate plane enabled semi-automatic geometric analysis of the complete three-dimensional morphology of the spine, taking inter-vertebral rotation, intra-vertebral torsion and coronal and sagittal tilt into account. Intraclass correlation coefficients for interobserver reliability were 0.98-1.00. Coronal deviation, axial rotation and the exact length discrepancies in the reconstructed sagittal plane, as defined per vertebra and disc, were analyzed for each primary and compensatory curve as well as for the junctional segments in-between. The anterior-posterior difference of spinal length, based on "true" anterior and posterior points on endplates, was +3.8% for thoracic and +9.4% for (thoraco)lumbar curves, while the junctional segments were almost straight. This differed significantly from control group thoracic kyphosis (-4.1%; P<0.001) and lumbar lordosis (+7.8%; P<0.001). For all primary as well as compensatory curves, we observed linear correlations between the coronal Cobb angle, axial rotation and the anterior-posterior length difference (r≥0.729 for thoracic curves; r≥0.485 for (thoraco)lumbar curves). Excess anterior length of the spine in AIS has been described as a generalized growth disturbance, causing relative anterior spinal overgrowth. This study is the first to demonstrate that this anterior overgrowth is not a generalized phenomenon. It is confined to the primary as well as the

  19. An improvement of the base bleed unit on base drag reduction and heat energy addition as well as mass addition

    International Nuclear Information System (INIS)

    Xue, Xiaochun; Yu, Yonggang

    2016-01-01

    Highlights: • A 2D axisymmetric Navier-Stokes equation for a multi-component reactive system is solved. • The coupling of the internal and wake flow field with secondary combustion is calculated. • Detailed data with combined effects of boattailing and post-combustion are obtained. • The mechanism of heat energy addition and the thermodynamic performance is investigated. - Abstract: Numerical simulations are carried out to investigate the base drag and energy characteristics of a base-bleed projectile with a boattailed afterbody in a supersonic flow, with and without the effect of a post-combustion process, and then to analyze the key factors of drag reduction and base pressure variation for the base-bleed projectile. Detailed chemistry models for H_2-CO combustion have been incorporated into a Navier-Stokes computer code and applied to flow field simulation in the base region of a base-bleed projectile. Detailed numerical results for the flow patterns and heat energy addition as well as mass addition for different conditions are presented and compared with existing experimental data. The results show that post-combustion contributes to energy addition and base drag reduction of up to 78%, because the heat energy released by post-combustion, which uses the fuel-rich reaction products as fuel in the wake region, is much higher than in the corresponding cold bleed and hot bleed cases. In addition, the temperature distribution changes under the post-combustion effect, with the peaks appearing in a pair of recirculation regions. The fuel-rich bleed gas flows into the shear layer along the gap between these two recirculation regions and readily burns when mixing with the freestream, thus changing the H_2 and CO concentrations in the base flowfield.

  20. Modelling curves of manufacturing feasibilities and demand

    Directory of Open Access Journals (Sweden)

    Soloninko K.S.

    2017-03-01

    Full Text Available The authors study the functional properties of the curves of manufacturing feasibilities and demand. Statement of the problem and its connection with important scientific and practical tasks. By its nature, the market economy is unstable and in constant movement. Economics has an effective instrument for explaining changes in the economic environment; this tool is the modelling of economic processes. The modelling of economic processes depends first and foremost on building an economic model, which is the base for the formalization of the economic process, that is, for building a mathematical model. An effective means of formalizing an economic process is the creation of a model of a hypothetical or imaginary economy. Building a demand model is significant for the market of goods and services. The problem consists in obtaining (as a result of modelling) definite functional properties of the curves of manufacturing feasibilities and demand, from which their mathematical model can be determined. Another problem lies in obtaining majorant properties of the curves of joint demand on the market of goods and services. Analysis of the latest research and publications. Many domestic and foreign scientists have dedicated their studies to researching and building models of the curves of manufacturing feasibilities and demand. In spite of the considerable work of these scientists, such problems as the functional properties of the curves and their practical use in modelling remain insufficiently studied. The purpose of the article is to describe the functional properties of the curves of manufacturing feasibilities and demand on the market of goods and services on the basis of modelling their construction. Scientific novelty and practical value. The theoretical results (for the functional properties of the curves of manufacturing feasibilities and demand) obtained from the present research, namely convexity, give extra practical possibilities in a microeconomic

  1. Guidelines for using the Delphi Technique to develop habitat suitability index curves

    Science.gov (United States)

    Crance, Johnie H.

    1987-01-01

    Habitat Suitability Index (SI) curves are one method of presenting species habitat suitability criteria. The curves are often used with the Habitat Evaluation Procedures (HEP) and are necessary components of the Instream Flow Incremental Methodology (IFIM) (Armour et al. 1984). Bovee (1986) described three categories of SI curves or habitat suitability criteria based on the procedures and data used to develop the criteria. Category I curves are based on professional judgment, with little or no empirical data. Both Category II (utilization criteria) and Category III (preference criteria) curves have as their source data collected at locations where target species are observed or collected. Having Category II and Category III curves for all species of concern would be ideal. In reality, no SI curves are available for many species, and SI curves that require intensive field sampling often cannot be developed under prevailing constraints on time and costs. One alternative under these circumstances is the development and interim use of SI curves based on expert opinion. The Delphi technique (Pill 1971; Delbecq et al. 1975; Linstone and Turoff 1975) is one method used for combining the knowledge and opinions of a group of experts. The purpose of this report is to describe how the Delphi technique may be used to develop expert-opinion-based SI curves.

  2. Sediment Curve Uncertainty Estimation Using GLUE and Bootstrap Methods

    Directory of Open Access Journals (Sweden)

    aboalhasan fathabadi

    2017-02-01

    3000 times. A sediment rating curve equation was fitted to each sampled set of suspended sediment and discharge data. Using these sediment rating curves and their residuals, suspended sediment concentrations were calculated for the test data. Finally, using the 2.5 and 97.5 percentiles of the B bootstrap realizations, 95% bootstrap prediction intervals were obtained. Results and Discussion: Results showed that the Motorkhane and MiyaneTonelShomare 7 stations were best fitted by a sigmoid function, and the Stor and Glinak stations were best fitted by a second order polynomial and a linear function, respectively. The first 50 of the B bootstrapped curves were plotted for all stations; these plots showed that the bootstrapped curves were more scattered than the observed data. The suspended sediment curve parameters were estimated more accurately where suspended sediment was sampled more densely, as a result of reduced uncertainty in the estimated suspended sediment concentration due to parameter uncertainty. In addition to sampling density, the uncertainty of the bootstrapped curves depends on the curve shape. For the GLUE methodology, to assess the impact of threshold values on the uncertainty results, threshold values were systematically changed from 0.1 to 0.45. Study results showed that the 95% confidence intervals are sensitive to the selected threshold values and that higher threshold values result in a wider 95% confidence interval. However, the highest 95% confidence intervals obtained by the GLUE method (when the threshold value was set to 0.1) were smaller than those obtained by the bootstrap method. Conclusions: The uncertainty of sediment rating curves was addressed in this study by considering two different procedures based on the GLUE and bootstrap methods for four stations in the Sefidrod watershed. Results showed that a nonlinear equation fitted the log-transformed values of sediment concentration and discharge better than a linear equation. The uncertainty results using GLUE depend on the chosen threshold values. As threshold
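
    The bootstrap branch of such an analysis is straightforward to sketch: resample the concentration-discharge pairs with replacement (here 3000 times, matching the count mentioned above), refit the rating curve to each resample, and take the 2.5/97.5 percentiles of the predicted concentration. The Python example below does this for the common power-law rating curve fitted in log space; the study itself also used sigmoid and polynomial forms and additionally propagated residuals to obtain full prediction intervals, and the data here are synthetic.

```python
import numpy as np

def fit_power_rating(q, c):
    """Fit log(C) = a + b*log(Q) and return (a, b)."""
    b, a = np.polyfit(np.log(q), np.log(c), 1)
    return a, b

def bootstrap_interval(q, c, q_new, n_boot=3000, seed=4):
    """95% bootstrap interval for the rating-curve estimate of C at q_new."""
    rng = np.random.default_rng(seed)
    preds = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, q.size, q.size)      # resample pairs with replacement
        a, b = fit_power_rating(q[idx], c[idx])
        preds[i] = np.exp(a + b * np.log(q_new))
    return np.percentile(preds, [2.5, 97.5])

# Synthetic discharge (m3/s) and suspended sediment concentration (mg/L) pairs.
rng = np.random.default_rng(5)
q = np.exp(rng.normal(2.0, 0.6, 120))
c = np.exp(0.5 + 1.3 * np.log(q) + rng.normal(0.0, 0.3, q.size))
low, high = bootstrap_interval(q, c, q_new=15.0)
print(f"95% bootstrap interval for C at Q = 15 m3/s: {low:.0f}-{high:.0f} mg/L")
```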

  3. Designing the Alluvial Riverbeds in Curved Paths

    Science.gov (United States)

    Macura, Viliam; Škrinár, Andrej; Štefunková, Zuzana; Muchová, Zlatica; Majorošová, Martina

    2017-10-01

    The paper presents the method of determining the shape of the riverbed in curves of the watercourse, which is based on the method of Ikeda (1975) developed for a slightly curved path in sandy riverbed. Regulated rivers have essentially slightly and smoothly curved paths; therefore, this methodology provides the appropriate basis for river restoration. Based on the research in the experimental reach of the Holeška Brook and several alluvial mountain streams the methodology was adjusted. The method also takes into account other important characteristics of bottom material - the shape and orientation of the particles, settling velocity and drag coefficients. Thus, the method is mainly meant for the natural sand-gravel material, which is heterogeneous and the particle shape of the bottom material is very different from spherical. The calculation of the river channel in the curved path provides the basis for the design of optimal habitat, but also for the design of foundations of armouring of the bankside of the channel. The input data is adapted to the conditions of design practice.

  4. Cost development of future technologies for power generation-A study based on experience curves and complementary bottom-up assessments

    International Nuclear Information System (INIS)

    Neij, Lena

    2008-01-01

    Technology foresight studies have become an important tool in identifying realistic ways of reducing the impact of modern energy systems on the climate and the environment. Studies on the future cost development of advanced energy technologies are of special interest. One approach widely adopted for the analysis of future cost is the experience curve approach. The question is, however, how robust this approach is, and which experience curves should be used in energy foresight analysis. This paper presents an analytical framework for the analysis of future cost development of new energy technologies for electricity generation; the analytical framework is based on an assessment of available experience curves, complemented with bottom-up analysis of sources of cost reductions and, for some technologies, judgmental expert assessments of long-term development paths. The results of these three methods agree in most cases, i.e. the cost (price) reductions described by the experience curves match the incremental cost reduction described in the bottom-up analysis and the judgmental expert assessments. For some technologies, the bottom-up analysis confirms large uncertainties in future cost development not captured by the experience curves. Experience curves with a learning rate ranging from 0% to 20% are suggested for the analysis of future cost development
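
    The experience-curve relation underlying the analysis states that unit cost falls by a fixed learning rate with every doubling of cumulative production. A minimal Python sketch of that relation (with an illustrative 15% learning rate, not a value endorsed by the study) is:

```python
import math

def experience_curve_cost(c0, x0, x, learning_rate):
    """Unit cost after cumulative production x, given cost c0 at cumulative
    production x0 and a cost reduction of `learning_rate` per doubling:
    C(x) = c0 * (x / x0) ** b,  where 2 ** b = 1 - learning_rate."""
    b = math.log2(1.0 - learning_rate)
    return c0 * (x / x0) ** b

# Example: with a 15% learning rate, an 8-fold growth in cumulative capacity
# cuts unit cost to (0.85)**3, i.e. about 61% of its initial value.
print(round(experience_curve_cost(c0=2000.0, x0=1.0, x=8.0, learning_rate=0.15), 1))
```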

  5. Certificateless short sequential and broadcast multisignature schemes using elliptic curve bilinear pairings

    Directory of Open Access Journals (Sweden)

    SK Hafizul Islam

    2014-01-01

    Full Text Available Several certificateless short signature and multisignature schemes based on traditional public key infrastructure (PKI or identity-based cryptosystem (IBC have been proposed in the literature; however, no certificateless short sequential (or serial multisignature (CL-SSMS or short broadcast (or parallel multisignature (CL-SBMS schemes have been proposed. In this paper, we propose two such new CL-SSMS and CL-SBMS schemes based on elliptic curve bilinear pairing. Like any certificateless public key cryptosystem (CL-PKC, the proposed schemes are free from the public key certificate management burden and the private key escrow problem as found in PKI- and IBC-based cryptosystems, respectively. In addition, the requirements of the expected security level and the fixed length signature with constant verification time have been achieved in our schemes. The schemes are communication efficient as the length of the multisignature is equivalent to a single elliptic curve point and thus become the shortest possible multisignature scheme. The proposed schemes are then suitable for communication systems having resource constrained devices such as PDAs, mobile phones, RFID chips, and sensors where the communication bandwidth, battery life, computing power and storage space are limited.

  6. Higher Genus Abelian Functions Associated with Cyclic Trigonal Curves

    Directory of Open Access Journals (Sweden)

    Matthew England

    2010-03-01

    Full Text Available We develop the theory of Abelian functions associated with cyclic trigonal curves by considering two new cases. We investigate curves of genus six and seven and consider whether it is the trigonal nature or the genus which dictates certain areas of the theory. We present solutions to the Jacobi inversion problem, sets of relations between the Abelian function, links to the Boussinesq equation and a new addition formula.

  7. A Method of Timbre-Shape Synthesis Based On Summation of Spherical Curves

    DEFF Research Database (Denmark)

    Putnam, Lance Jonathan

    2014-01-01

    It is well-known that there is a rich correspondence between sound and visual curves, perhaps most widely explored through direct input of sound into an oscilloscope. However, there have been relatively few proposals on how to translate sound into three-dimensional curves. We present a novel meth...

  8. Flood damage curves for consistent global risk assessments

    Science.gov (United States)

    de Moel, Hans; Huizinga, Jan; Szewczyk, Wojtek

    2016-04-01

    Assessing potential damage of flood events is an important component in flood risk management. Determining direct flood damage is commonly done using depth-damage curves, which denote the flood damage that would occur at specific water depths per asset or land-use class. Many countries around the world have developed flood damage models using such curves which are based on analysis of past flood events and/or on expert judgement. However, such damage curves are not available for all regions, which hampers damage assessments in those regions. Moreover, due to different methodologies employed for various damage models in different countries, damage assessments cannot be directly compared with each other, obstructing also supra-national flood damage assessments. To address these problems, a globally consistent dataset of depth-damage curves has been developed. This dataset contains damage curves depicting percent of damage as a function of water depth as well as maximum damage values for a variety of assets and land use classes (i.e. residential, commercial, agriculture). Based on an extensive literature survey concave damage curves have been developed for each continent, while differentiation in flood damage between countries is established by determining maximum damage values at the country scale. These maximum damage values are based on construction cost surveys from multinational construction companies, which provide a coherent set of detailed building cost data across dozens of countries. A consistent set of maximum flood damage values for all countries was computed using statistical regressions with socio-economic World Development Indicators from the World Bank. Further, based on insights from the literature survey, guidance is also given on how the damage curves and maximum damage values can be adjusted for specific local circumstances, such as urban vs. rural locations, use of specific building material, etc. This dataset can be used for consistent supra
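
    Applying such a curve set is a two-step lookup: interpolate the damage fraction at the flood depth, then scale by the country- and class-specific maximum damage value and the exposed area. The Python sketch below uses placeholder curve points and a hypothetical maximum damage value, not the figures from the actual dataset.

```python
import numpy as np

# Illustrative concave depth-damage curve for a residential class
# (depth in metres -> fraction of maximum damage); placeholder values only.
DEPTHS = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 4.0, 6.0])
FRACTIONS = np.array([0.00, 0.25, 0.40, 0.60, 0.75, 0.85, 1.00])

def flood_damage(depth_m, max_damage_per_m2, footprint_m2):
    """Direct damage = interpolated damage fraction * max damage value * area."""
    frac = np.interp(depth_m, DEPTHS, FRACTIONS)
    return frac * max_damage_per_m2 * footprint_m2

# A 1.5 m flood over a 100 m2 dwelling with a hypothetical max damage of 750 EUR/m2.
print(f"{flood_damage(1.5, 750.0, 100.0):,.0f} EUR")
```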

  9. Statistical re-evaluation of the ASME KIC and KIR fracture toughness reference curves

    International Nuclear Information System (INIS)

    Wallin, K.

    1999-01-01

    Historically the ASME reference curves have been treated as representing absolute deterministic lower bound curves of fracture toughness. In reality, this is not the case. They represent only deterministic lower bound curves to a specific set of data, which represent a certain probability range. A recently developed statistical lower bound estimation method called the 'master curve' has been proposed as a candidate for a new lower bound reference curve concept. From a regulatory point of view, the master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on application. In order to be able to substitute the old ASME reference curves with lower bound curves based on the master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. In order to estimate the true inherent level of safety, represented by the reference curves, the original database was re-evaluated with statistical methods and compared to an analysis based on the master curve concept. The analysis reveals that the 5% lower bound master curve has the same inherent degree of safety as originally intended for the KIC reference curve. Similarly, the 1% lower bound master curve corresponds to the KIR reference curve. (orig.)
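
    For reference, the master curve referred to above is commonly written (e.g. in ASTM E1921) as a temperature dependence of the median fracture toughness together with Weibull-based percentile bounds; the form quoted below is the widely cited one and is given here only as a hedged illustration of how a 5% or 1% lower-bound curve is obtained from the median, not as a restatement of the paper's analysis.

K_{Jc,\mathrm{med}}(T) = 30 + 70\,\exp\left[0.019\,(T - T_0)\right] \ \ [\mathrm{MPa\sqrt{m}}],
\qquad
K_{Jc,p}(T) = 20 + \left[\frac{\ln\big(1/(1-p)\big)}{\ln 2}\right]^{1/4}\big(K_{Jc,\mathrm{med}}(T) - 20\big),

    so that p = 0.05 and p = 0.01 give the 5% and 1% lower-bound curves discussed in the abstract.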

  10. A Pilot Study Verifying How the Curve Information Impacts on the Driver Performance with Cognition Model

    Directory of Open Access Journals (Sweden)

    Xiaohua Zhao

    2013-01-01

    Full Text Available Drivers' misjudgment is a significant issue for curve safety and is considered a more influential risk-inducing factor than other traffic environmental conditions. Previous research suggested that cognition theory could explain the process of drivers' behavior at curves. In this simulator experiment, a principle cognition model was built to examine the rationality of this explanation. The core of this pilot study was to use one of the driving decision strategies, braking at curves, to verify the basic accuracy of the cognition model. The experiment therefore compared three treatments of information-providing modes. The results show that warning information about curves given in advance moves the position of first braking further away from the curve, which is consistent with the model's inference. The study thus indicates that drivers' behavior at curves can be explained by cognition theory and represented by the cognition model. In addition, the model's characteristics and working parameters can be obtained through further research; based on the model, appropriate warning information can then be recommended to help avoid driver error.

  11. Speed Choice and Curve Radius on Rural Roads

    DEFF Research Database (Denmark)

    Rimme, Nicolai; Nielsen, Lea; Kjems, Erik

    2016-01-01

    with informative speed-calming measures such as traffic signs, reflectors or surface painting. However, it has been the hypothesis that people reduce their speed insufficiently and drive too fast in most curved alignments – especially when they drive there frequently. By knowing the speed near...... and in the curved alignments compared to the geometry of the curved alignments, it can be clarified if and which speed-calming measures are required. Using GNSS-based floating car data (FCD) from driving cars, the speed near and in curved alignments is found. Single observations of FCD are connected to trips...

  12. Precision-Recall-Gain Curves: PR Analysis Done Right

    OpenAIRE

    Flach, Peter; Kull, Meelis

    2015-01-01

    Precision-Recall analysis abounds in applications of binary classification where true negatives do not add value and hence should not affect assessment of the classifier's performance. Perhaps inspired by the many advantages of receiver operating characteristic (ROC) curves and the area under such curves for accuracy-based performance assessment, many researchers have taken to report Precision-Recall (PR) curves and associated areas as performance metric. We demonstrate in this paper that thi...
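
    A minimal sketch of the rescaling underlying Precision-Recall-Gain analysis, following the formulas in Flach and Kull (2015): precision and recall are mapped onto a gain scale relative to the always-positive baseline pi (the proportion of positives), so the baseline becomes 0 and the optimum 1; the class proportion and scores below are invented toy numbers.

def gain(value, pi):
    """Map a precision or recall value onto the [0, 1] gain scale relative to baseline pi."""
    return (value - pi) / ((1.0 - pi) * value)

# Toy example: 30% positives, a classifier with precision 0.6 and recall 0.75.
pi = 0.3
precision, recall = 0.6, 0.75
print(gain(precision, pi), gain(recall, pi))  # precision gain and recall gain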

  13. Research on the Integration of Bionic Geometry Modeling and Simulation of Robot Foot Based on Characteristic Curve

    Science.gov (United States)

    He, G.; Zhu, H.; Xu, J.; Gao, K.; Zhu, D.

    2017-09-01

    The bionic research of shape is an important aspect of the research on bionic robot, and its implementation cannot be separated from the shape modeling and numerical simulation of the bionic object, which is tedious and time-consuming. In order to improve the efficiency of shape bionic design, the feet of animals living in soft soil and swamp environment are taken as bionic objects, and characteristic skeleton curve, section curve, joint rotation variable, position and other parameters are used to describe the shape and position information of bionic object’s sole, toes and flipper. The geometry modeling of the bionic object is established by using the parameterization of characteristic curves and variables. Based on this, the integration framework of parametric modeling and finite element modeling, dynamic analysis and post-processing of sinking process in soil is proposed in this paper. The examples of bionic ostrich foot and bionic duck foot are also given. The parametric modeling and integration technique can achieve rapid improved design based on bionic object, and it can also greatly improve the efficiency and quality of robot foot bionic design, and has important practical significance to improve the level of bionic design of robot foot’s shape and structure.

  14. Advanced topics in the arithmetic of elliptic curves

    CERN Document Server

    Silverman, Joseph H

    1994-01-01

    In the introduction to the first volume of The Arithmetic of Elliptic Curves (Springer-Verlag, 1986), I observed that "the theory of elliptic curves is rich, varied, and amazingly vast," and as a consequence, "many important topics had to be omitted." I included a brief introduction to ten additional topics as an appendix to the first volume, with the tacit understanding that eventually there might be a second volume containing the details. You are now holding that second volume. Unfortunately, it turned out that even those ten topics would not fit into a single book, so I was forced to make some choices. The following material is covered in this book: I. Elliptic and modular functions for the full modular group. II. Elliptic curves with complex multiplication. III. Elliptic surfaces and specialization theorems. IV. Neron models, Kodaira-Neron classification of special fibers, Tate's algorithm, and Ogg's conductor-discriminant formula. V. Tate's theory of q-curves over p-adic fields. VI. Neron's theory of can...

  15. On-machine measurement of a slow slide servo diamond-machined 3D microstructure with a curved substrate

    International Nuclear Information System (INIS)

    Zhu, Wu-Le; Yang, Shunyao; Ju, Bing-Feng; Jiang, Jiacheng; Sun, Anyu

    2015-01-01

    A scanning tunneling microscope-based multi-axis measuring system is specially developed for the on-machine measurement of three-dimensional (3D) microstructures, to address the quality control difficulty with the traditional off-line measurement process. A typical 3D microstructure of the curved compound eye was diamond-machined by the slow slide servo technique, and then the whole surface was on-machine scanned three-dimensionally based on the tip-tracking strategy by utilizing a spindle, two linear motion stages, and an additional rotary stage. The machined surface profile and its shape deviation were accurately measured on-machine. The distortion of imaged ommatidia on the curved substrate was distinctively evaluated based on the characterized points extracted from the measured surface. Furthermore, the machining errors were investigated in connection with the on-machine measured surface and its characteristic parameters. Through experiments, the proposed measurement system is demonstrated to feature versatile on-machine measurement of 3D microstructures with a curved substrate, which is highly meaningful for quality control in the fabrication field. (paper)

  16. According to Jim: The Flawed Normal Curve of Intelligence

    Science.gov (United States)

    Gallagher, James J.

    2008-01-01

    In this article, the author talks about the normal curve of intelligence which he thinks is flawed and contends that wrong conclusions have been drawn based on this spurious normal curve. An example is that of racial and ethnic differences wherein some authors maintain that some ethnic and racial groups are clearly superior to others based on…

  17. Genetic algorithm using independent component analysis in x-ray reflectivity curve fitting of periodic layer structures

    International Nuclear Information System (INIS)

    Tiilikainen, J; Bosund, V; Tilli, J-M; Sormunen, J; Mattila, M; Hakkarainen, T; Lipsanen, H

    2007-01-01

    A novel genetic algorithm (GA) utilizing independent component analysis (ICA) was developed for x-ray reflectivity (XRR) curve fitting. EFICA was used to reduce mutual information, or interparameter dependences, during the combinatorial phase. The performance of the new algorithm was studied by fitting trial XRR curves to target curves which were computed using realistic multilayer models. The median convergence properties of conventional GA, GA using principal component analysis and the novel GA were compared. GA using ICA was found to outperform the other methods with problems having 41 parameters or more to be fitted without additional XRR curve calculations. The computational complexity of the conventional methods was linear but the novel method had a quadratic computational complexity due to the applied ICA method which sets a practical limit for the dimensionality of the problem to be solved. However, the novel algorithm had the best capability to extend the fitting analysis based on Parratt's formalism to multiperiodic layer structures
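
    The idea of removing interparameter dependences before recombination can be sketched as follows: the current population is mapped into an independent-component space, uniform crossover is performed there, and the offspring are mapped back to the original parameters. This is a generic illustration using scikit-learn's FastICA, not the EFICA-based implementation of the paper, and the population size and parameter count are arbitrary.

import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
population = rng.normal(size=(60, 8))        # 60 candidate vectors of 8 XRR model parameters

ica = FastICA(n_components=8, random_state=0)
sources = ica.fit_transform(population)      # population expressed in independent components

# Uniform crossover in IC space: each offspring takes every component from one of two parents.
parents = rng.integers(0, len(sources), size=(len(sources), 2))
mask = rng.random(sources.shape) < 0.5
children = np.where(mask, sources[parents[:, 0]], sources[parents[:, 1]])

offspring = ica.inverse_transform(children)  # back to the original parameter space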

  18. The standard centrifuge method accurately measures vulnerability curves of long-vesselled olive stems.

    Science.gov (United States)

    Hacke, Uwe G; Venturas, Martin D; MacKinnon, Evan D; Jacobsen, Anna L; Sperry, John S; Pratt, R Brandon

    2015-01-01

    The standard centrifuge method has been frequently used to measure vulnerability to xylem cavitation. This method has recently been questioned. It was hypothesized that open vessels lead to exponential vulnerability curves, which were thought to be indicative of measurement artifact. We tested this hypothesis in stems of olive (Olea europea) because its long vessels were recently claimed to produce a centrifuge artifact. We evaluated three predictions that followed from the open vessel artifact hypothesis: shorter stems, with more open vessels, would be more vulnerable than longer stems; standard centrifuge-based curves would be more vulnerable than dehydration-based curves; and open vessels would cause an exponential shape of centrifuge-based curves. Experimental evidence did not support these predictions. Centrifuge curves did not vary when the proportion of open vessels was altered. Centrifuge and dehydration curves were similar. At highly negative xylem pressure, centrifuge-based curves slightly overestimated vulnerability compared to the dehydration curve. This divergence was eliminated by centrifuging each stem only once. The standard centrifuge method produced accurate curves of samples containing open vessels, supporting the validity of this technique and confirming its utility in understanding plant hydraulics. Seven recommendations for avoiding artefacts and standardizing vulnerability curve methodology are provided. © 2014 The Authors. New Phytologist © 2014 New Phytologist Trust.

  19. Adsorption of molecular additive onto lead halide perovskite surfaces: A computational study on Lewis base thiophene additive passivation

    Science.gov (United States)

    Zhang, Lei; Yu, Fengxi; Chen, Lihong; Li, Jingfa

    2018-06-01

    Organic additives, such as the Lewis base thiophene, have been successfully applied to passivate halide perovskite surfaces, improving the stability and properties of perovskite devices based on CH3NH3PbI3. Yet, the detailed nanostructure of the perovskite surface passivated by additives and the mechanisms of such passivation are not well understood. This study presents a nanoscopic view on the interfacial structure of an additive/perovskite interface, consisting of a Lewis base thiophene molecular additive and a lead halide perovskite surface substrate, providing insights on the mechanisms that molecular additives can passivate the halide perovskite surfaces and enhance the perovskite-based device performance. Molecular dynamics study on the interactions between water molecules and the perovskite surfaces passivated by the investigated additive reveal the effectiveness of employing the molecular additives to improve the stability of the halide perovskite materials. The additive/perovskite surface system is further probed via molecular engineering the perovskite surfaces. This study reveals the nanoscopic structure-property relationships of the halide perovskite surface passivated by molecular additives, which helps the fundamental understanding of the surface/interface engineering strategies for the development of halide perovskite based devices.

  20. Research on Standard and Automatic Judgment of Press-fit Curve of Locomotive Wheel-set Based on AAR Standard

    Science.gov (United States)

    Lu, Jun; Xiao, Jun; Gao, Dong Jun; Zong, Shu Yu; Li, Zhu

    2018-03-01

    In the production of the Association of American Railroads (AAR) locomotive wheel-set, the press-fit curve is the most important basis for the reliability of wheel-set assembly. In the past, Most of production enterprises mainly use artificial detection methods to determine the quality of assembly. There are cases of miscarriage of justice appear. For this reason, the research on the standard is carried out. And the automatic judgment of press-fit curve is analysed and designed, so as to provide guidance for the locomotive wheel-set production based on AAR standard.

  1. Projection of curves on B-spline surfaces using quadratic reparameterization

    KAUST Repository

    Yang, Yijun

    2010-09-01

    Curves on surfaces play an important role in computer aided geometric design. In this paper, we present a hyperbola approximation method based on the quadratic reparameterization of Bézier surfaces, which generates reasonable low degree curves lying completely on the surfaces by using iso-parameter curves of the reparameterized surfaces. The Hausdorff distance between the projected curve and the original curve is controlled under the user-specified distance tolerance. The projected curve is T-G1 continuous, where T is the user-specified angle tolerance. Examples are given to show the performance of our algorithm. © 2010 Elsevier Inc. All rights reserved.

  2. Application of environmentally-corrected fatigue curves to nuclear power plant components

    International Nuclear Information System (INIS)

    Ware, A.G.; Morton, D.K.; Nitzel, M.E.

    1996-01-01

    Recent test data indicate that the effects of the light water reactor (LWR) environment could significantly reduce the fatigue resistance of materials used in the reactor coolant pressure boundary components of operating nuclear power plants. Argonne National Laboratory has developed interim fatigue curves based on test data simulating LWR conditions, and published them in NUREG/CR-5999. In order to assess the significance of these interim fatigue curves, fatigue evaluations of a sample of the components in the reactor coolant pressure boundary of LWRs were performed. The sample consists of components from facilities designed by each of the four US nuclear steam supply system vendors. For each facility, six locations were studied including two locations on the reactor pressure vessel. In addition, there are older vintage plants where components of the reactor coolant pressure boundary were designed to codes that did not require an explicit fatigue analysis of the components. In order to assess the fatigue resistance of the older vintage plants, an evaluation was also conducted on selected components of three of these plants. This paper discusses the insights gained from the application of the interim fatigue curves to components of seven operating nuclear power plants

  3. A NURBS approximation of experimental stress-strain curves

    International Nuclear Information System (INIS)

    Fedorov, Timofey V.; Morrev, Pavel G.

    2016-01-01

    A compact universal representation of monotonic experimental stress-strain curves of metals and alloys is proposed. It is based on the nonuniform rational Bezier splines (NURBS) of second order and may be used in a computer library of materials. Only six parameters per curve are needed; this is equivalent to a specification of only three points in a stress-strain plane. NURBS-functions of higher order prove to be surplus. Explicit expressions for both yield stress and hardening modulus are given. Two types of curves are considered: at a finite interval of strain and at infinite one. A broad class of metals and alloys of various chemical compositions subjected to various types of preliminary thermo-mechanical working is selected from a comprehensive data base in order to test the methodology proposed. The results demonstrate excellent correspondence to the experimental data. Keywords: work hardening, stress-strain curve, spline approximation, nonuniform rational B-spline, NURBS.
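
    To make the 'six parameters per curve' statement concrete, the sketch below evaluates a second-order (quadratic) rational Bézier segment, the building block of a second-order NURBS representation: three control points in the strain-stress plane plus weights. The numbers are placeholders for illustration, not values from the paper.

import numpy as np

def rational_bezier2(t, P, w):
    """Evaluate a quadratic rational Bezier curve at parameters t in [0, 1].

    P: (3, 2) control points in the (strain, stress) plane; w: (3,) weights.
    """
    t = np.atleast_1d(t)
    B = np.stack([(1 - t) ** 2, 2 * t * (1 - t), t ** 2], axis=1)  # Bernstein basis
    num = (B * w) @ P
    den = (B * w).sum(axis=1, keepdims=True)
    return num / den

# Placeholder control data: start of yielding, an intermediate point and the end point.
P = np.array([[0.002, 250.0], [0.020, 320.0], [0.100, 420.0]])
w = np.array([1.0, 0.7, 1.0])
print(rational_bezier2([0.0, 0.5, 1.0], P, w))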

  4. Comparison of power curve monitoring methods

    Directory of Open Access Journals (Sweden)

    Cambron Philippe

    2017-01-01

    Full Text Available Performance monitoring is an important aspect of operating wind farms. This can be done through the power curve monitoring (PCM) of wind turbines (WT). In the past years, important work has been conducted on PCM. Various methodologies have been proposed, each one with interesting results. However, it is difficult to compare these methods because they have been developed using their respective data sets. The objective of the present work is to compare some of the proposed PCM methods using common data sets. The metric used to compare the PCM methods is the time needed to detect a change in the power curve. Two power curve models are covered to establish the effect the model type has on the monitoring outcomes. Each model was tested with two control charts. Other methodologies and metrics proposed in the literature for power curve monitoring, such as areas under the power curve and the use of statistical copulas, have also been covered. Results demonstrate that model-based PCM methods are more reliable at detecting a performance change than other methodologies and that the effectiveness of the control chart depends on the types of shift observed.
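
    One simple model-based PCM scheme of the kind compared above can be sketched as follows: a reference power curve is learned by binning wind speed over a healthy period, and residuals of new data are tracked with an EWMA control chart. The bin edges, smoothing constant and control limit are arbitrary illustrative choices, not the settings used by the authors.

import numpy as np

def binned_power_curve(wind, power, bin_edges):
    """Mean power per wind-speed bin, i.e. a simple reference power curve model."""
    idx = np.digitize(wind, bin_edges)
    return np.array([power[idx == i].mean() if np.any(idx == i) else np.nan
                     for i in range(1, len(bin_edges))])

def ewma_alarms(residuals, lam=0.2, L=3.0):
    """EWMA of power residuals; return indices where the steady-state control limit is exceeded."""
    sigma = np.nanstd(residuals)
    limit = L * sigma * np.sqrt(lam / (2.0 - lam))
    z, alarms = 0.0, []
    for i, r in enumerate(residuals):
        z = lam * r + (1.0 - lam) * z
        if abs(z) > limit:
            alarms.append(i)
    return alarms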

  5. Laser-Based Additive Manufacturing of Zirconium

    Directory of Open Access Journals (Sweden)

    Himanshu Sahasrabudhe

    2018-03-01

    Full Text Available Additive manufacturing of zirconium is attempted using the commercial Laser Engineered Net Shaping (LENS™) technique. A LENS™-based approach towards processing coatings and bulk parts of zirconium, a reactive metal, aims to minimize the inconvenience of traditional metallurgical practices of handling and processing zirconium-based parts, and is particularly suited to small volumes and one-of-a-kind parts. This is a single-step manufacturing approach for obtaining near-net-shape fabrication of components. In the current research, Zr metal powder was processed in the form of a coating on a Ti6Al4V alloy substrate. Scanning electron microscopy (SEM) and energy dispersive spectroscopy (EDS) as well as phase analysis via X-ray diffraction (XRD) were performed on these coatings. In addition to coatings, bulk parts were also fabricated from Zr metal powders using LENS™, and part accuracy was measured.

  6. Crack resistance curves determination of tube cladding material

    Energy Technology Data Exchange (ETDEWEB)

    Bertsch, J. [Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland)]. E-mail: johannes.bertsch@psi.ch; Hoffelner, W. [Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland)

    2006-06-30

    Zirconium based alloys have been in use as fuel cladding material in light water reactors for many years. As claddings change their mechanical properties during service, it is essential for the assessment of mechanical integrity to provide parameters for potential rupture behaviour. Usually, fracture mechanics parameters like the fracture toughness KIC or, for high plastic strains, the J-integral based elastic-plastic fracture toughness JIC are employed. In claddings with a very small wall thickness, the determination of toughness requires extending the J-concept beyond the limits of the standards. In the paper a new method based on the traditional J approach is presented. Crack resistance curves (J-R curves) were created for unirradiated thin-walled Zircaloy-4 and aluminium cladding tube pieces at room temperature using the single sample method. The procedure of creating sharp fatigue starter cracks was optimized with respect to optical recording. It is shown that the chosen test method is appropriate for the determination of complete J-R curves, including the values J0.2 (J at 0.2 mm crack length), Jm (J corresponding to the maximum load) and the slope of the curve.

  7. Object-Image Correspondence for Algebraic Curves under Projections

    Directory of Open Access Journals (Sweden)

    Joseph M. Burdis

    2013-03-01

    Full Text Available We present a novel algorithm for deciding whether a given planar curve is an image of a given spatial curve, obtained by a central or a parallel projection with unknown parameters. The motivation comes from the problem of establishing a correspondence between an object and an image, taken by a camera with unknown position and parameters. A straightforward approach to this problem consists of setting up a system of conditions on the projection parameters and then checking whether or not this system has a solution. The computational advantage of the algorithm presented here, in comparison to algorithms based on the straightforward approach, lies in a significant reduction of a number of real parameters that need to be eliminated in order to establish existence or non-existence of a projection that maps a given spatial curve to a given planar curve. Our algorithm is based on projection criteria that reduce the projection problem to a certain modification of the equivalence problem of planar curves under affine and projective transformations. To solve the latter problem we make an algebraic adaptation of signature construction that has been used to solve the equivalence problems for smooth curves. We introduce a notion of a classifying set of rational differential invariants and produce explicit formulas for such invariants for the actions of the projective and the affine groups on the plane.

  8. Optimization on Spaces of Curves

    DEFF Research Database (Denmark)

    Møller-Andersen, Jakob

    in Rd, and methods to solve the initial and boundary value problem for geodesics allowing us to compute the Karcher mean and principal components analysis of data of curves. We apply the methods to study shape variation in synthetic data in the Kimia shape database, in HeLa cell nuclei and cycles...... of cardiac deformations. Finally we investigate a new application of Riemannian shape analysis in shape optimization. We setup a simple elliptic model problem, and describe how to apply shape calculus to obtain directional derivatives in the manifold of planar curves. We present an implementation based...

  9. Migration and the Wage Curve:

    DEFF Research Database (Denmark)

    Brücker, Herbert; Jahn, Elke J.

    Based on a wage curve approach we examine the labor market effects of migration in Germany. The wage curve relies on the assumption that wages respond to a change in the unemployment rate, albeit imperfectly. This allows one to derive the wage and employment effects of migration simultaneously in a general equilibrium framework. For the empirical analysis we employ the IABS, a two percent sample of the German labor force. We find that the elasticity of the wage curve is particularly high for young workers and workers with a university degree, while it is low for older workers and workers with a vocational degree. The wage and employment effects of migration are moderate: a 1 percent increase in the German labor force through immigration increases the aggregate unemployment rate by less than 0.1 percentage points and reduces average wages by less than 0.1 percent. While native workers benefit from...

  10. Modelling stochastic changes in curve shape, with an application to cancer diagnostics

    DEFF Research Database (Denmark)

    Hobolth, A; Jensen, Eva B. Vedel

    2000-01-01

    Often, the statistical analysis of the shape of a random planar curve is based on a model for a polygonal approximation to the curve. In the present paper, we instead describe the curve as a continuous stochastic deformation of a template curve. The advantage of this continuous approach is that the parameters in the model do not relate to a particular polygonal approximation. A somewhat similar approach has been used by Kent et al. (1996), who describe the limiting behaviour of a model with a first-order Markov property as the landmarks on the curve become closely spaced; see also Grenander (1993...

  11. Curves and surfaces for CAGD a practical guide

    CERN Document Server

    Farin, Gerald

    2002-01-01

    This fifth edition has been fully updated to cover the many advances made in CAGD and curve and surface theory since 1997, when the fourth edition appeared. Material has been restructured into theory and applications chapters. The theory material has been streamlined using the blossoming approach; the applications material includes least squares techniques in addition to the traditional interpolation methods. In all other respects, it is, thankfully, the same. This means you get the informal, friendly style and unique approach that has made Curves and Surfaces for CAGD: A Practical Gui

  12. Glow curve characteristics of bulb type thermoluminescent dosimeters

    International Nuclear Information System (INIS)

    Deme, S.; Feher, I.; Felszerfalvi, J.

    1993-01-01

    TL dosemeter readers are usually equipped with thermocouples connected to the heater unit. This layout can be applied to stabilize the position of the glow curve as a function of heating-up time. Bulb-type TL dosemeters have no temperature sensor and hence no possibility of stabilization, which can cause an additional readout error in dose determination. For this reason, the time dependence of glow curves for bulb-type TL dosemeters was measured, and a new microprocessor-controlled readout device was developed. (N.T.) 2 refs.; 2 figs

  13. Potentiometric titration curves of aluminium salt solutions and its ...

    African Journals Online (AJOL)

    Potentiometric titration curves of aluminium salt solutions and its species conversion ... of aluminium salt solutions under the moderate slow rate of base injection. ... silicate radical, and organic acid radical on the titration curves and its critical ...

  14. Electromagnetic field limits set by the V-Curve.

    Energy Technology Data Exchange (ETDEWEB)

    Warne, Larry Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jorgenson, Roy Eberhardt [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hudson, Howard Gerald [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-07-01

    When emitters of electromagnetic energy are operated in the vicinity of sensitive components, the electric field at the component location must be kept below a certain level in order to prevent the component from being damaged, or in the case of electro-explosive devices, initiating. The V-Curve is a convenient way to set the electric field limit because it requires minimal information about the problem configuration. In this report we will discuss the basis for the V-Curve. We also consider deviations from the original V-Curve resulting from inductive versus capacitive antennas, increases in directivity gain for long antennas, decreases in input impedance when operating in a bounded region, and mismatches dictated by transmission line losses. In addition, we consider mitigating effects resulting from limited antenna sizes.

  15. On-chip magnetic bead-based DNA melting curve analysis using a magnetoresistive sensor

    DEFF Research Database (Denmark)

    Rizzi, Giovanni; Østerberg, Frederik Westergaard; Henriksen, Anders Dahl

    2014-01-01

    We present real-time measurements of DNA melting curves in a chip-based system that detects the amount of surface-bound magnetic beads using magnetoresistive magnetic field sensors. The sensors detect the difference between the amount of beads bound to the top and bottom sensor branches....... The beads are magnetized by the field arising from the bias current passed through the sensors. We demonstrate the first on-chip measurements of the melting of DNA hybrids upon a ramping of the temperature. This overcomes the limitation of using a single washing condition at constant temperature. Moreover...

  16. Optical fiber sensors for process refractometry and temperature measuring based on curved fibers

    International Nuclear Information System (INIS)

    Willsch, R.; Schwotzer, G.; Haubenreisser, W.; Jahn, J.U.

    1986-01-01

    Based on U-shape curved multimode fibers with defined bending radii intensity-modulated optical sensors for the determination of refractive index changes in liquids and related measurands (solution concentration, mixing ratio and others) in process-refractometry and for temperature measuring under special environmental conditions have been developed. The optoelectronic transmitting and receiving units are performed in modular technique and can be used in multi-purpose applications. The principles, performance and characteristical properties of these sensors are described and their possibilities of application in process measuring and automation are discussed by some selected examples. (orig.) [de

  17. Optical fiber sensors for process refractometry and temperature measuring based on curved fibers

    Energy Technology Data Exchange (ETDEWEB)

    Willsch, R; Schwotzer, G; Haubenreisser, W; Jahn, J U

    1986-01-01

    Based on U-shape curved multimode fibers with defined bending radii intensity-modulated optical sensors for the determination of refractive index changes in liquids and related measurands (solution concentration, mixing ratio and others) in process-refractometry and for temperature measuring under special environmental conditions have been developed. The optoelectronic transmitting and receiving units are performed in modular technique and can be used in multi-purpose applications. The principles, performance and characteristical properties of these sensors are described and their possibilities of application in process measuring and automation are discussed by some selected examples.

  18. Stenting for curved lesions using a novel curved balloon: Preliminary experimental study.

    Science.gov (United States)

    Tomita, Hideshi; Higaki, Takashi; Kobayashi, Toshiki; Fujii, Takanari; Fujimoto, Kazuto

    2015-08-01

    Stenting may be a compelling approach to dilating curved lesions in congenital heart diseases. However, balloon-expandable stents, which are commonly used for congenital heart diseases, are usually deployed in a straight orientation. In this study, we evaluated the effect of stenting with a novel curved balloon considered to provide better conformability to the curved-angled lesion. In vitro experiments: A Palmaz Genesis(®) stent (Johnson & Johnson, Cordis Co, Bridgewater, NJ, USA) mounted on the Goku(®) curve (Tokai Medical Co. Nagoya, Japan) was dilated in vitro to observe directly the behavior of the stent and balloon assembly during expansion. Animal experiment: A short Express(®) Vascular SD (Boston Scientific Co, Marlborough, MA, USA) stent and a long Express(®) Vascular LD stent (Boston Scientific) mounted on the curved balloon were deployed in the curved vessel of a pig to observe the effect of stenting in vivo. In vitro experiments: Although the stent was dilated in a curved fashion, stent and balloon assembly also rotated conjointly during expansion of its curved portion. In the primary stenting of the short stent, the stent was dilated with rotation of the curved portion. The excised stent conformed to the curved vessel. As the long stent could not be negotiated across the mid-portion with the balloon in expansion when it started curving, the mid-portion of the stent failed to expand fully. Furthermore, the balloon, which became entangled with the stent strut, could not be retrieved even after complete deflation. This novel curved balloon catheter might be used for implantation of the short stent in a curved lesion; however, it should not be used for primary stenting of the long stent. Post-dilation to conform the stent to the angled vessel would be safer than primary stenting irrespective of stent length. Copyright © 2014 Japanese College of Cardiology. Published by Elsevier Ltd. All rights reserved.

  19. Curve Boxplot: Generalization of Boxplot for Ensembles of Curves.

    Science.gov (United States)

    Mirzargar, Mahsa; Whitaker, Ross T; Kirby, Robert M

    2014-12-01

    In simulation science, computational scientists often study the behavior of their simulations by repeated solutions with variations in parameters and/or boundary values or initial conditions. Through such simulation ensembles, one can try to understand or quantify the variability or uncertainty in a solution as a function of the various inputs or model assumptions. In response to a growing interest in simulation ensembles, the visualization community has developed a suite of methods for allowing users to observe and understand the properties of these ensembles in an efficient and effective manner. An important aspect of visualizing simulations is the analysis of derived features, often represented as points, surfaces, or curves. In this paper, we present a novel, nonparametric method for summarizing ensembles of 2D and 3D curves. We propose an extension of a method from descriptive statistics, data depth, to curves. We also demonstrate a set of rendering and visualization strategies for showing rank statistics of an ensemble of curves, which is a generalization of traditional whisker plots or boxplots to multidimensional curves. Results are presented for applications in neuroimaging, hurricane forecasting and fluid dynamics.

  20. KEPLER ECLIPSING BINARY STARS. III. CLASSIFICATION OF KEPLER ECLIPSING BINARY LIGHT CURVES WITH LOCALLY LINEAR EMBEDDING

    International Nuclear Information System (INIS)

    Matijevič, Gal; Prša, Andrej; Orosz, Jerome A.; Welsh, William F.; Bloemen, Steven; Barclay, Thomas

    2012-01-01

    We present an automated classification of 2165 Kepler eclipsing binary (EB) light curves that accompanied the second Kepler data release. The light curves are classified using locally linear embedding, a general nonlinear dimensionality reduction tool, into morphology types (detached, semi-detached, overcontact, ellipsoidal). The method, related to a more widely used principal component analysis, produces a lower-dimensional representation of the input data while preserving local geometry and, consequently, the similarity between neighboring data points. We use this property to reduce the dimensionality in a series of steps to a one-dimensional manifold and classify light curves with a single parameter that is a measure of 'detachedness' of the system. This fully automated classification correlates well with the manual determination of morphology from the data release, and also efficiently highlights any misclassified objects. Once a lower-dimensional projection space is defined, the classification of additional light curves runs in a negligible time and the method can therefore be used as a fully automated classifier in pipeline structures. The classifier forms a tier of the Kepler EB pipeline that pre-processes light curves for the artificial intelligence based parameter estimator.
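
    A minimal sketch of the dimensionality-reduction step described above, using the locally linear embedding implementation in scikit-learn to project phased, normalized light curves onto a single morphology coordinate; the light curves below are random placeholders rather than Kepler data, and the neighbor count is an arbitrary choice.

import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

rng = np.random.default_rng(1)
light_curves = rng.normal(size=(200, 100))    # placeholder: 200 light curves, 100 phase points each

lle = LocallyLinearEmbedding(n_neighbors=15, n_components=1, random_state=0)
morphology = lle.fit_transform(light_curves)  # one 'detachedness-like' parameter per light curve
print(morphology.shape)                       # (200, 1)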

  1. Unraveling the photovoltaic technology learning curve by incorporation of input price changes and scale effects

    International Nuclear Information System (INIS)

    Yu, C.F.; van Sark, W.G.J.H.M.; Alsema, E.A.

    2011-01-01

    In a large number of energy models, the use of learning curves for estimating technological improvements has become popular. This is based on the assumption that technological development can be monitored by following cost development as a function of market size. However, recent data show that in some stages of photovoltaic technology (PV) production, the market price of PV modules stabilizes even though the cumulative capacity increases. This implies that no technological improvement takes place in these periods: the cost predicted by the learning curve is lower than the observed market price. We propose that this bias results from ignoring the effects of input prices and scale effects, and that incorporating input prices and scale effects into the learning curve theory is an important issue in making cost predictions more reliable. In this paper, a methodology is described to incorporate the scale and input-price effects as additional variables into the one-factor learning curve, which leads to the definition of the multi-factor learning curve. This multi-factor learning curve is not only derived from economic theories, but also supported by an empirical study. The results clearly show that input prices and scale effects are to be included, and that, although market prices are stabilizing, learning is still taking place. (author)
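
    For orientation, the conventional one-factor learning curve and one possible multi-factor extension of the kind discussed above can be written as follows; the exact functional form and variables used by the authors may differ, so this is only an illustrative sketch (Q cumulative production, PR the progress ratio, p an input price such as silicon, S the production scale).

C = C_0 \left(\frac{Q}{Q_0}\right)^{-b}, \qquad \mathrm{PR} = 2^{-b},
\qquad
C = C_0 \left(\frac{Q}{Q_0}\right)^{-b} \left(\frac{p}{p_0}\right)^{\gamma} \left(\frac{S}{S_0}\right)^{-\delta}.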

  2. JUMPING THE CURVE

    Directory of Open Access Journals (Sweden)

    René Pellissier

    2012-01-01

    Full Text Available This paper explores the notion of jumping the curve, following from Handy's S-curve onto a new curve with new rules, policies and procedures. It claims that the curve does not generally lie in wait but has to be invented by leadership. The focus of this paper is the identification (mathematically and inferentially) of that point in time, known as the cusp in catastrophe theory, when it is time to change - pro-actively, pre-actively or reactively. These three scenarios are addressed separately and discussed in terms of the relevance of each.

  3. Stress energy of elastic globe in curved space and a slip-out force

    International Nuclear Information System (INIS)

    Sokolov, S.N.

    1990-01-01

    The energy of stresses in an elastic globe in the flat space and in curved space is expressed through scalar invariants of the curved space. This energy creates an additional force acting on elastic bodies in a gravitational field. 4 refs

  4. Dynamic Response and Optimal Design of Curved Metallic Sandwich Panels under Blast Loading

    Science.gov (United States)

    Yang, Shu; Han, Shou-Hong; Lu, Zhen-Hua

    2014-01-01

    It is important to understand the effect of curvature on the blast response of curved structures so as to seek the optimal configurations of such structures with improved blast resistance. In this study, the dynamic response and protective performance of a type of curved metallic sandwich panel subjected to air blast loading were examined using LS-DYNA. The numerical methods were validated using experimental data in the literature. The curved panel consisted of an aluminum alloy outer face and a rolled homogeneous armour (RHA) steel inner face in addition to a closed-cell aluminum foam core. The results showed that the configuration of a “soft” outer face and a “hard” inner face worked well for the curved sandwich panel against air blast loading in terms of maximum deflection (MaxD) and energy absorption. The panel curvature was found to have a monotonic effect on the specific energy absorption (SEA) and a nonmonotonic effect on the MaxD of the panel. Based on artificial neural network (ANN) metamodels, multiobjective optimization designs of the panel were carried out. The optimization results revealed the trade-off relationships between the blast-resistant and the lightweight objectives and showed the great use of Pareto front in such design circumstances. PMID:25126606

  5. Dynamic response and optimal design of curved metallic sandwich panels under blast loading.

    Science.gov (United States)

    Qi, Chang; Yang, Shu; Yang, Li-Jun; Han, Shou-Hong; Lu, Zhen-Hua

    2014-01-01

    It is important to understand the effect of curvature on the blast response of curved structures so as to seek the optimal configurations of such structures with improved blast resistance. In this study, the dynamic response and protective performance of a type of curved metallic sandwich panel subjected to air blast loading were examined using LS-DYNA. The numerical methods were validated using experimental data in the literature. The curved panel consisted of an aluminum alloy outer face and a rolled homogeneous armour (RHA) steel inner face in addition to a closed-cell aluminum foam core. The results showed that the configuration of a "soft" outer face and a "hard" inner face worked well for the curved sandwich panel against air blast loading in terms of maximum deflection (MaxD) and energy absorption. The panel curvature was found to have a monotonic effect on the specific energy absorption (SEA) and a nonmonotonic effect on the MaxD of the panel. Based on artificial neural network (ANN) metamodels, multiobjective optimization designs of the panel were carried out. The optimization results revealed the trade-off relationships between the blast-resistant and the lightweight objectives and showed the great use of Pareto front in such design circumstances.

  6. Dynamic Response and Optimal Design of Curved Metallic Sandwich Panels under Blast Loading

    Directory of Open Access Journals (Sweden)

    Chang Qi

    2014-01-01

    Full Text Available It is important to understand the effect of curvature on the blast response of curved structures so as to seek the optimal configurations of such structures with improved blast resistance. In this study, the dynamic response and protective performance of a type of curved metallic sandwich panel subjected to air blast loading were examined using LS-DYNA. The numerical methods were validated using experimental data in the literature. The curved panel consisted of an aluminum alloy outer face and a rolled homogeneous armour (RHA) steel inner face in addition to a closed-cell aluminum foam core. The results showed that the configuration of a “soft” outer face and a “hard” inner face worked well for the curved sandwich panel against air blast loading in terms of maximum deflection (MaxD) and energy absorption. The panel curvature was found to have a monotonic effect on the specific energy absorption (SEA) and a nonmonotonic effect on the MaxD of the panel. Based on artificial neural network (ANN) metamodels, multiobjective optimization designs of the panel were carried out. The optimization results revealed the trade-off relationships between the blast-resistant and the lightweight objectives and showed the great use of Pareto front in such design circumstances.

  7. An extended CFD model to predict the pumping curve in low pressure plasma etch chamber

    Science.gov (United States)

    Zhou, Ning; Wu, Yuanhao; Han, Wenbin; Pan, Shaowu

    2014-12-01

    A continuum-based CFD model is extended with a slip-wall approximation and a rarefaction correction to the viscosity, in an attempt to predict the pumping flow characteristics in low pressure plasma etch chambers. The flow regime inside the chamber ranges from slip flow (Kn ~ 0.01) up to free molecular (Kn = 10). The momentum accommodation coefficient and the parameters of the Kn-modified viscosity are first calibrated against one set of measured pumping curves. The validity of this calibrated CFD model is then demonstrated by comparison with additional pumping curves measured in chambers of different geometry configurations. A more detailed comparison against a DSMC model for flow conductance over slits with contraction and expansion sections is also discussed.

  8. INVESTIGATION OF CURVES SET BY CUBIC DISTRIBUTION OF CURVATURE

    Directory of Open Access Journals (Sweden)

    S. A. Ustenko

    2014-03-01

    Full Text Available Purpose. Further development of the geometric modeling of curvilinear contours of different objects based on a specified cubic curvature distribution and set values of the curvature at the boundary points. Methodology. We investigate a planar section of a curvilinear contour generated under the condition that a cubic curvature distribution is specified. The curve begins and ends at the given points, where the angles of tangent slope and the curvature are also determined. The curvature equation of this curve was obtained as a function of the section length and the coefficient c of the cubic curvature distribution. The obtained equation was analysed, and the conditions under which inflection points of the curvature appear were investigated. One should find such an interval of parameter change (depending on the input data and the section length) that the inflection point of the curvature graph is placed outside the borders of the curve section. The dependence of the angle of tangent slope to the curve at an arbitrary point was determined, and recommendations were given for solving the system of integral equations that allows finding the length of the curve section and the coefficient c of the cubic curvature distribution. Findings. As a result of the investigation of the curves, it was found that the absence of inflection points of the curvature on the observed section can be taken as the criterion for their selection. Analysis of the influence of the parameter c on the graph of the angle of tangent slope to the curve showed that, regardless of its value, the same rate of increase of the tangent slope angle is provided. Originality. The approach to the geometric modeling of curves based on a cubic curvature distribution with given values at the boundary points is improved by eliminating inflection points from the observed section of the curvilinear contours. Practical value. Curves obtained using the proposed method can be used for geometric modeling of curvilinear
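
    The construction described above can be illustrated numerically: with a cubic curvature distribution kappa(s) along the section, the tangent angle is the running integral of kappa and the contour follows by integrating its cosine and sine. The coefficients and section length below are arbitrary illustrative values.

import numpy as np

def curve_from_curvature(coeffs, length, theta0=0.0, n=1000):
    """Reconstruct a planar curve whose curvature is a cubic polynomial of arc length s.

    coeffs: (c3, c2, c1, c0) such that kappa(s) = c3*s**3 + c2*s**2 + c1*s + c0.
    """
    s = np.linspace(0.0, length, n)
    kappa = np.polyval(coeffs, s)
    ds = np.diff(s)
    theta = theta0 + np.concatenate(([0.0], np.cumsum(0.5 * (kappa[1:] + kappa[:-1]) * ds)))
    x = np.concatenate(([0.0], np.cumsum(0.5 * (np.cos(theta[1:]) + np.cos(theta[:-1])) * ds)))
    y = np.concatenate(([0.0], np.cumsum(0.5 * (np.sin(theta[1:]) + np.sin(theta[:-1])) * ds)))
    return x, y, theta

x, y, theta = curve_from_curvature((0.5, -0.3, 0.2, 1.0), length=2.0)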

  9. Formulae for Arithmetic on Genus 2 Hyperelliptic Curves

    DEFF Research Database (Denmark)

    Lange, Tanja

    2005-01-01

    The ideal class group of hyperelliptic curves can be used in cryptosystems based on the discrete logarithm problem. In this article we present explicit formulae to perform the group operations for genus 2 curves. The formulae are completely general but to achieve the lowest number of operations we...... treat odd and even characteristic separately. We present 3 different coordinate systems which are suitable for different environments, e.g. on a smart card we should avoid inversions while in software a limited number is acceptable. The presented formulae render genus two hyperelliptic curves very...

  10. The EB factory project. I. A fast, neural-net-based, general purpose light curve classifier optimized for eclipsing binaries

    International Nuclear Information System (INIS)

    Paegert, Martin; Stassun, Keivan G.; Burger, Dan M.

    2014-01-01

    We describe a new neural-net-based light curve classifier and provide it with documentation as a ready-to-use tool for the community. While optimized for identification and classification of eclipsing binary stars, the classifier is general purpose, and has been developed for speed in the context of upcoming massive surveys such as the Large Synoptic Survey Telescope. A challenge for classifiers in the context of neural-net training and massive data sets is to minimize the number of parameters required to describe each light curve. We show that a simple and fast geometric representation that encodes the overall light curve shape, together with a chi-square parameter to capture higher-order morphology information results in efficient yet robust light curve classification, especially for eclipsing binaries. Testing the classifier on the ASAS light curve database, we achieve a retrieval rate of 98% and a false-positive rate of 2% for eclipsing binaries. We achieve similarly high retrieval rates for most other periodic variable-star classes, including RR Lyrae, Mira, and delta Scuti. However, the classifier currently has difficulty discriminating between different sub-classes of eclipsing binaries, and suffers a relatively low (∼60%) retrieval rate for multi-mode delta Cepheid stars. We find that it is imperative to train the classifier's neural network with exemplars that include the full range of light curve quality to which the classifier will be expected to perform; the classifier performs well on noisy light curves only when trained with noisy exemplars. The classifier source code, ancillary programs, a trained neural net, and a guide for use, are provided.

  11. Dual Smarandache Curves of a Timelike Curve lying on Unit dual Lorentzian Sphere

    OpenAIRE

    Kahraman, Tanju; Hüseyin Ugurlu, Hasan

    2016-01-01

    In this paper, we give the Darboux approximation for dual Smarandache curves of a timelike curve on the unit dual Lorentzian sphere. Firstly, we define the four types of dual Smarandache curves of a timelike curve lying on the dual Lorentzian sphere.

  12. PV experience curves for the Netherlands

    International Nuclear Information System (INIS)

    Gerwig, R.

    2005-01-01

    Experience curves are one of several tools used by policy makers to take a look at market development. Numerous curves have been constructed for PV but none specific to the Netherlands. The objective of this report is to take a look at the price development of grid-connected PV systems in the Netherlands using the experience curve theory. After a literature and internet search and attempts to acquire information from PV companies information on 51% of the totally installed capacity was found. Curves for the period 1991-2001 were constructed based on system price, BOS (balance-of-system) price and inverter price. The progress ratio of the locally learning BOS was similar to the globally learning module market. This indicates that the pace of development of the Dutch PV market is similar to the globally followed pace. Improvement of the detail of the data might help to get a better idea of which BOS components have declined most. The similar progress ratio also shows the importance of investing both in module and system research as is the case in the Netherlands

  13. Additivity of Feature-based and Symmetry-based Grouping Effects in Multiple Object Tracking

    Directory of Open Access Journals (Sweden)

    Chundi Wang

    2016-05-01

    Full Text Available Multiple object tracking (MOT) is an attentional process wherein people track several moving targets among several distractors. Symmetry, an important indicator of regularity, is a general spatial pattern observed in natural and artificial scenes. According to the laws of perceptual organization proposed by Gestalt psychologists, regularity is a principle of perceptual grouping, such as similarity and closure. A great deal of research reported that feature-based similarity grouping (e.g., grouping based on color, size, or shape) among targets in MOT tasks can improve tracking performance. However, no additive feature-based grouping effects have been reported where the tracking objects had two or more features. Additive effect refers to a greater grouping effect produced by grouping based on multiple cues instead of one cue. Can spatial symmetry produce a grouping effect similar to that of feature similarity in MOT tasks? Are the grouping effects based on symmetry and feature similarity additive? This study includes four experiments to address these questions. The results of Experiments 1 and 2 demonstrated the automatic symmetry-based grouping effects. More importantly, an additive grouping effect of symmetry and feature similarity was observed in Experiments 3 and 4. Our findings indicate that symmetry can produce an enhanced grouping effect in MOT and facilitate the grouping effect based on color or shape similarity. The where and what pathways might have played an important role in the additive grouping effect.

  14. Separate base usages of genes located on the leading and lagging strands in Chlamydia muridarum revealed by the Z curve method

    Directory of Open Access Journals (Sweden)

    Yu Xiu-Juan

    2007-10-01

    Full Text Available Abstract Background The nucleotide compositional asymmetry between the leading and lagging strands in bacterial genomes has been the subject of intensive study in the past few years. It is interesting to mention that almost all bacterial genomes exhibit the same kind of base asymmetry. This work aims to investigate the strand biases in the Chlamydia muridarum genome and show the potential of the Z curve method for quantitatively differentiating genes on the leading and lagging strands. Results The occurrence frequencies of bases of protein-coding genes in the C. muridarum genome were analyzed by the Z curve method. It was found that genes located on the two strands of replication have distinct base usages in the C. muridarum genome. According to their positions in the 9-D space spanned by the variables u1 – u9 of the Z curve method, the K-means clustering algorithm can assign about 94% of genes to the correct strands, which is a few percent higher than those correctly classified by K-means based on the RSCU. The base usage and codon usage analyses show that genes on the leading strand have more G than C and more T than A, particularly at the third codon position. For genes on the lagging strand the bias is reversed. The y component of the Z curves for the complete chromosome sequences shows that the excesses of G over C and T over A are more remarkable in the C. muridarum genome than in other bacterial genomes without separating base and/or codon usages. Furthermore, for the genomes of Borrelia burgdorferi, Treponema pallidum, Chlamydia muridarum and Chlamydia trachomatis, in which distinct base and/or codon usages have been observed, a closer phylogenetic distance is found compared with other bacterial genomes. Conclusion The nature of the strand biases of base composition in C. muridarum is similar to that in most other bacterial genomes. However, the base composition asymmetry between the leading and lagging strands in C. muridarum is more significant than that in
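
    The three basic Z curve components used in this kind of analysis are simple cumulative base-count disparities (purine-pyrimidine, amino-keto and weak-strong hydrogen bonding); the nine variables u1 – u9 mentioned above apply the same transform separately to the three codon positions. The sketch below covers the basic three-component case on an invented toy sequence, not C. muridarum data.

import numpy as np

def z_curve(seq):
    """Return the x, y, z components of the Z curve of a DNA sequence."""
    seq = seq.upper()
    counts = {b: np.cumsum([c == b for c in seq]) for b in "ACGT"}
    x = (counts["A"] + counts["G"]) - (counts["C"] + counts["T"])   # purine - pyrimidine
    y = (counts["A"] + counts["C"]) - (counts["G"] + counts["T"])   # amino - keto
    z = (counts["A"] + counts["T"]) - (counts["G"] + counts["C"])   # weak - strong H-bonds
    return x, y, z

x, y, z = z_curve("ATGACCGTTAGCAGGTAA")   # toy sequence for illustration only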

  15. Growth curves in Down syndrome with congenital heart disease

    Directory of Open Access Journals (Sweden)

    Caroline D’Azevedo Sica

    Full Text Available SUMMARY Introduction: To assess dietary habits, nutritional status and food frequency in children and adolescents with Down syndrome (DS) and congenital heart disease (CHD). Additionally, we attempted to compare body mass index (BMI) classifications according to the World Health Organization (WHO) curves and curves developed for individuals with DS. Method: Cross-sectional study including individuals with DS and CHD treated at a referral center for cardiology, aged 2 to 18 years. Weight, height, BMI, total energy and food frequency were measured. Nutritional status was assessed using BMI for age and gender, using curves for evaluation of patients with DS and those set by the WHO. Results: 68 subjects with DS and CHD were evaluated. Atrioventricular septal defect (AVSD) was the most common heart disease (52.9%). There were differences in BMI classification between the curves proposed for patients with DS and those proposed by the WHO. There was an association between consumption of vitamin E and polyunsaturated fatty acids. Conclusion: Results showed that individuals with DS are mostly considered normal weight for age when evaluated using specific curves for DS. Use of specific curves for DS would be the recommended practice for health professionals, so as to avoid hasty diagnosis of overweight and/or obesity in this population.

  16. Seismic Fragility Curves of Industrial Buildings by Using Nonlinear Analysis

    Directory of Open Access Journals (Sweden)

    Mohamed Nazri Fadzli

    2017-01-01

    Full Text Available This study presents the steel fragility curves and performance curves of industrial buildings of different geometries. The fragility curves were obtained for different building geometries, and the performance curves were developed based on the lateral load, which is affected by the geometry of the building. Three records of far-field ground motion were used for incremental dynamic analysis (IDA), and design lateral loads were used for pushover analysis (POA). All designs were based on British Standard (BS) 5950; however, Eurocode 8 was preferred for seismic consideration in the analysis because BS 5950 does not specify any seismic provision. The five levels of performance stated by FEMA-273, namely operational phase, immediate occupancy, damage control, life safety, and collapse prevention (CP), were used as the main guidelines for evaluating structural performance. For POA, Model 2 had the highest base shear, followed by Model 1 and Model 3, even though Model 2 has a smaller structure compared with Model 3. Meanwhile, the fragility curves showed that the probability of reaching or exceeding the CP level of Model 2 is the highest, followed by that of Models 1 and 3.

  17. Mentorship, learning curves, and balance.

    Science.gov (United States)

    Cohen, Meryl S; Jacobs, Jeffrey P; Quintessenza, James A; Chai, Paul J; Lindberg, Harald L; Dickey, Jamie; Ungerleider, Ross M

    2007-09-01

    meet this challenge without a painful learning curve belongs to both the younger professionals, who must progress through the learning curve, and the more mature professionals who must create an appropriate environment for learning. In addition to mentorship, the detailed tracking of outcomes is an essential tool for mastering any learning curve. It is crucial to utilize a detailed database to track outcomes, to learn, and to protect both yourself and your patients. It is our professional responsibility to engage in self-evaluation, in part employing voluntary sharing of data. For cardiac surgical subspecialties, the databases now existing for The European Association for CardioThoracic Surgery and The Society of Thoracic Surgeons represent the ideal tool for monitoring outcomes. Evolving initiatives in the fields of paediatric cardiology, paediatric critical care, and paediatric cardiac anaesthesia will play similar roles. A variety of professional and personal challenges must be met by all those working in health care. The acquisition of learned skills and the use of special tools will facilitate the process of conquering these challenges. Choosing appropriate role models and mentors can help progression through any learning curve in a controlled and protected fashion. Professional and personal satisfaction are both necessities. Finding the satisfactory balance between work and home life is difficult, but possible with the right tools, organization skills, and support system at work and at home. The concepts of mentorship, learning curves and balance cannot be overemphasized.

  18. A variant of the Hubbert curve for world oil production forecasts

    International Nuclear Information System (INIS)

    Maggio, G.; Cacciola, G.

    2009-01-01

    In recent years, the economic and political aspects of energy problems have prompted many researchers and analysts to focus their attention on the Hubbert Peak Theory with the aim of forecasting future trends in world oil production. In this paper, a model that attempts to contribute in this regard is presented; it is based on a variant of the well-known Hubbert curve. In addition, the sum of multiple-Hubbert curves (two cycles) is used to provide a better fit for the historical data on oil production (crude and natural gas liquid (NGL)). Taking into consideration three possible scenarios for oil reserves, this approach allowed us to forecast when peak oil production, referring to crude oil and NGL, should occur. In particular, by assuming a range of 2250-3000 gigabarrels (Gb) for ultimately recoverable conventional oil, our predictions foresee a peak between 2009 and 2021 at 29.3-32.1 Gb/year.
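    As a rough illustration of the multi-cycle approach described above, the sketch below fits a sum of two standard Hubbert curves (the logistic-derivative form) to a production series. All data, parameter values and starting guesses are synthetic stand-ins, not the crude-plus-NGL series or the scenarios used in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def hubbert(t, urr, b, t_peak):
    # Single Hubbert cycle: time derivative of a logistic curve, with ultimate
    # recoverable resource `urr` (Gb), steepness `b` (1/yr) and peak year `t_peak`.
    e = np.exp(-b * (t - t_peak))
    return urr * b * e / (1.0 + e) ** 2

def hubbert_two_cycle(t, urr1, b1, tp1, urr2, b2, tp2):
    # Sum of two Hubbert cycles, used to improve the fit to historical production.
    return hubbert(t, urr1, b1, tp1) + hubbert(t, urr2, b2, tp2)

# Illustrative synthetic "historical" production (Gb/year), not real data.
years = np.arange(1900, 2009)
production = hubbert_two_cycle(years, 1200, 0.06, 1975, 1500, 0.05, 2015)
production += np.random.default_rng(0).normal(0, 0.3, years.size)

# Fit the two-cycle model; p0 is a rough initial guess for the six parameters.
p0 = [1000, 0.05, 1970, 1000, 0.05, 2010]
params, _ = curve_fit(hubbert_two_cycle, years, production, p0=p0, maxfev=20000)

# Locate the forecast peak year over an extended horizon.
t_grid = np.arange(1900, 2101)
peak_year = t_grid[np.argmax(hubbert_two_cycle(t_grid, *params))]
print("fitted parameters:", np.round(params, 3), "forecast peak year:", peak_year)
```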

  19. SU-E-T-488: An Iso-Dose Curve Based Interactive IMRT Optimization System for Physician-Driven Plan Tuning

    International Nuclear Information System (INIS)

    Shi, F; Tian, Z; Jia, X; Jiang, S; Zarepisheh, M; Cervino, L

    2014-01-01

    Purpose: In treatment plan optimization for Intensity Modulated Radiation Therapy (IMRT), after a plan is initially developed by a dosimetrist, the attending physician evaluates its quality and often would like to improve it. As opposed to having the dosimetrist implement the improvements, it is desirable to have the physician directly and efficiently modify the plan for a more streamlined and effective workflow. In this project, we developed an interactive optimization system for physicians to conveniently and efficiently fine-tune iso-dose curves. Methods: An interactive interface is developed under C++/Qt. The physician first examines iso-dose lines. S/he then picks an iso-dose curve to be improved and drags it to a more desired configuration using a computer mouse or touchpad. Once the mouse is released, a voxel-based optimization engine is launched. The weighting factors corresponding to voxels between the iso-dose lines before and after the dragging are modified. The underlying algorithm then takes these factors as input to re-optimize the plan in near real-time on a GPU platform, yielding a new plan best matching the physician's desire. The re-optimized DVHs and iso-dose curves are then updated for the next iteration of modifications. This process is repeated until a plan satisfactory to the physician is achieved. Results: We have tested this system for a series of IMRT plans. Results indicate that our system provides physicians with an intuitive and efficient tool to edit the iso-dose curves according to their preference. The input information is used to guide plan re-optimization, which is achieved in near real-time using our GPU-based optimization engine. Typically, a satisfactory plan can be developed by a physician in a few minutes using this tool. Conclusion: With our system, physicians are able to manipulate iso-dose curves according to their preferences. Preliminary results demonstrate the feasibility and effectiveness of this tool.
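    The authors' GPU engine and exact weighting scheme are not described in enough detail to reproduce here; the following is only a minimal sketch, under assumed conventions, of the general idea of re-weighting voxels between the old and new iso-dose lines and re-solving a voxel-weighted quadratic fluence optimization. The influence matrix, mask and weight update are all hypothetical.

```python
import numpy as np

def reoptimize(A, x0, d_ref, w, n_iter=500):
    """Projected gradient descent on a voxel-weighted quadratic objective:
    minimize sum_i w_i * ((A @ x)_i - d_ref_i)**2  subject to x >= 0."""
    # Step size from a Lipschitz bound on the gradient keeps the iteration stable.
    step = 1.0 / (2.0 * np.max(w) * np.linalg.norm(A, 2) ** 2)
    x = x0.copy()
    for _ in range(n_iter):
        grad = 2.0 * A.T @ (w * (A @ x - d_ref))
        x = np.maximum(x - step * grad, 0.0)      # fluence must stay non-negative
    return x

# Illustrative toy problem: random dose-influence matrix (voxels x beamlets).
rng = np.random.default_rng(1)
A = rng.uniform(0.0, 1.0, (500, 60))
x_current = rng.uniform(0.0, 1.0, 60)
d_ref = A @ x_current                              # reference dose of the current plan
w = np.ones(500)

# After the physician drags an iso-dose line, the voxels lying between the old and
# new lines (here a hypothetical boolean mask) receive larger weighting factors so
# the re-optimized plan pulls the dose toward the dragged configuration.
dragged = rng.random(500) < 0.05
d_ref[dragged] *= 0.9                              # e.g. the physician wants less dose there
w[dragged] *= 10.0
x_new = reoptimize(A, x_current, d_ref, w)
print("weighted objective after re-optimization:",
      float(np.sum(w * (A @ x_new - d_ref) ** 2)))
```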

  20. Next-Generation Intensity-Duration-Frequency Curves for Hydrologic Design in Snow-Dominated Environments

    Science.gov (United States)

    Yan, Hongxiang; Sun, Ning; Wigmosta, Mark; Skaggs, Richard; Hou, Zhangshuan; Leung, Ruby

    2018-02-01

    There is a renewed focus on the design of infrastructure resilient to extreme hydrometeorological events. While precipitation-based intensity-duration-frequency (IDF) curves are commonly used as part of infrastructure design, a large percentage of peak runoff events in snow-dominated regions are caused by snowmelt, particularly during rain-on-snow (ROS) events. In these regions, precipitation-based IDF curves may lead to substantial overestimation/underestimation of design basis events and subsequent overdesign/underdesign of infrastructure. To overcome this deficiency, we proposed next-generation IDF (NG-IDF) curves, which characterize the actual water reaching the land surface. We compared NG-IDF curves to standard precipitation-based IDF curves for estimates of extreme events at 376 Snowpack Telemetry (SNOTEL) stations across the western United States that each had at least 30 years of high-quality records. We found standard precipitation-based IDF curves at 45% of the stations were subject to underdesign, many with significant underestimation of 100 year extreme events, for which the precipitation-based IDF curves can underestimate water potentially available for runoff by as much as 125% due to snowmelt and ROS events. The regions with the greatest potential for underdesign were in the Pacific Northwest, the Sierra Nevada Mountains, and the Middle and Southern Rockies. We also found the potential for overdesign at 20% of the stations, primarily in the Middle Rockies and Arizona mountains. These results demonstrate the need to consider snow processes in the development of IDF curves, and they suggest use of the more robust NG-IDF curves for hydrologic design in snow-dominated environments.
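    The NG-IDF methodology itself is more involved, but the basic frequency-analysis step can be sketched as follows: fit an extreme-value distribution to annual maxima of water reaching the land surface (rain plus snowmelt) and read off design quantiles. The data below are synthetic, and the GEV model is only an assumed choice for illustration.

```python
import numpy as np
from scipy import stats

# Illustrative only: annual maxima (mm/day) of water reaching the land surface
# (rain + snowmelt), e.g. derived from SNOTEL snow-water-equivalent records.
rng = np.random.default_rng(42)
annual_maxima = stats.genextreme.rvs(c=-0.1, loc=40.0, scale=10.0,
                                     size=35, random_state=rng)

# Fit a GEV distribution and evaluate design quantiles for chosen return periods.
shape, loc, scale = stats.genextreme.fit(annual_maxima)
for return_period in (10, 25, 50, 100):
    p_non_exceed = 1.0 - 1.0 / return_period
    design_value = stats.genextreme.ppf(p_non_exceed, shape, loc=loc, scale=scale)
    print(f"{return_period:>4}-year event: {design_value:6.1f} mm/day")
```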

  1. Covariant quantizations in plane and curved spaces

    International Nuclear Information System (INIS)

    Assirati, J.L.M.; Gitman, D.M.

    2017-01-01

    We present covariant quantization rules for nonsingular finite-dimensional classical theories with flat and curved configuration spaces. In the beginning, we construct a family of covariant quantizations in flat spaces and Cartesian coordinates. This family is parametrized by a function ω(θ), θ ∈ (1,0), which describes an ambiguity of the quantization. We generalize this construction by presenting covariant quantizations of theories with flat configuration spaces but with arbitrary curvilinear coordinates. Then we construct a so-called minimal family of covariant quantizations for theories with curved configuration spaces. This family of quantizations is parametrized by the same function ω(θ). Finally, we describe a wider family of covariant quantizations in curved spaces. This family is parametrized by two functions, the previous one ω(θ) and an additional function Θ(x,ξ). The above-mentioned minimal family is the Θ = 1 part of this wider family of quantizations. We study the constructed quantizations in detail, proving their consistency and covariance. As a physical application, we consider a quantization of a non-relativistic particle moving in a curved space, discussing the problem of a quantum potential. Applying the covariant quantizations in flat spaces to the old problem of constructing a quantum Hamiltonian in polar coordinates, we directly obtain the correct result. (orig.)

  2. Covariant quantizations in plane and curved spaces

    Energy Technology Data Exchange (ETDEWEB)

    Assirati, J.L.M. [University of Sao Paulo, Institute of Physics, Sao Paulo (Brazil); Gitman, D.M. [Tomsk State University, Department of Physics, Tomsk (Russian Federation); P.N. Lebedev Physical Institute, Moscow (Russian Federation); University of Sao Paulo, Institute of Physics, Sao Paulo (Brazil)

    2017-07-15

    We present covariant quantization rules for nonsingular finite-dimensional classical theories with flat and curved configuration spaces. In the beginning, we construct a family of covariant quantizations in flat spaces and Cartesian coordinates. This family is parametrized by a function ω(θ), θ ∈ (1,0), which describes an ambiguity of the quantization. We generalize this construction by presenting covariant quantizations of theories with flat configuration spaces but with arbitrary curvilinear coordinates. Then we construct a so-called minimal family of covariant quantizations for theories with curved configuration spaces. This family of quantizations is parametrized by the same function ω(θ). Finally, we describe a wider family of covariant quantizations in curved spaces. This family is parametrized by two functions, the previous one ω(θ) and an additional function Θ(x,ξ). The above-mentioned minimal family is the Θ = 1 part of this wider family of quantizations. We study the constructed quantizations in detail, proving their consistency and covariance. As a physical application, we consider a quantization of a non-relativistic particle moving in a curved space, discussing the problem of a quantum potential. Applying the covariant quantizations in flat spaces to the old problem of constructing a quantum Hamiltonian in polar coordinates, we directly obtain the correct result. (orig.)

  3. [Individual learning curve for radical robot-assisted prostatectomy based on the example of three professionals working in one clinic].

    Science.gov (United States)

    Rasner, P I; Pushkar', D Iu; Kolontarev, K B; Kotenkov, D V

    2014-01-01

    The appearance of a new surgical technique always requires evaluation of its effectiveness and ease of acquisition. A comparative study of the results of the first three series of successive robot-assisted radical prostatectomies (RARP), performed concurrently by three surgeons, was conducted. Each series consisted of 40 procedures and was divided into 4 groups of 10 operations for the analysis. Comparison of the data revealed statistically significant improvement of intra- and postoperative performance within each series as the number of operations performed increased, and in each subsequent series compared with the preceding one. We recommend performing a planned conversion at the first operation. In our study, previous laparoscopic experience did not provide any significant advantages in the acquisition of robot-assisted technology. To characterize the individual learning curve, we recommend using the number of operations that the surgeon has observed live and/or in which he has participated as an assistant before his own surgical activity, as well as the indicator "technical defect". In addition to the term "individual learning curve", we propose to introduce the terms "surgeon's individual training phase" and "clinic's learning curve".

  4. Ionization constants by curve fitting: determination of partition and distribution coefficients of acids and bases and their ions.

    Science.gov (United States)

    Clarke, F H; Cahoon, N M

    1987-08-01

    A convenient procedure has been developed for the determination of partition and distribution coefficients. The method involves the potentiometric titration of the compound, first in water and then in a rapidly stirred mixture of water and octanol. An automatic titrator is used, and the data is collected and analyzed by curve fitting on a microcomputer with 64 K of memory. The method is rapid and accurate for compounds with pKa values between 4 and 10. Partition coefficients can be measured for monoprotic and diprotic acids and bases. The partition coefficients of the neutral compound and its ion(s) can be determined by varying the ratio of octanol to water. Distribution coefficients calculated over a wide range of pH values are presented graphically as "distribution profiles". It is shown that subtraction of the titration curve of solvent alone from that of the compound in the solvent offers advantages for pKa determination by curve fitting for compounds of low aqueous solubility.
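    The "distribution profile" idea can be illustrated with the textbook relation between the distribution coefficient and pH for a monoprotic acid, combining the partition coefficients of the neutral form and its ion. The compound parameters below (pKa, log P values) are hypothetical, chosen only to show the shape of the profile; this is not the titration-based procedure of the paper itself.

```python
import numpy as np

def log_distribution_acid(ph, pka, log_p_neutral, log_p_ion=-float("inf")):
    """Distribution coefficient (log D) of a monoprotic acid versus pH, combining
    the partition coefficients of the neutral form and its anion. With
    log_p_ion = -inf the ion is assumed not to partition into octanol."""
    frac_ion = 10.0 ** (ph - pka)               # [A-]/[HA] ratio in the aqueous phase
    p_neutral = 10.0 ** log_p_neutral
    p_ion = 0.0 if np.isneginf(log_p_ion) else 10.0 ** log_p_ion
    d = (p_neutral + p_ion * frac_ion) / (1.0 + frac_ion)
    return np.log10(d)

# Hypothetical acid: pKa 4.5, neutral log P 2.0, ion log P -1.0 ("distribution profile").
for ph in np.linspace(2, 10, 9):
    print(f"pH {ph:4.1f}  log D = {log_distribution_acid(ph, 4.5, 2.0, -1.0):6.2f}")
```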

  5. Technological change in energy systems. Learning curves, logistic curves and input-output coefficients

    International Nuclear Information System (INIS)

    Pan, Haoran; Koehler, Jonathan

    2007-01-01

    Learning curves have recently been widely adopted in climate-economy models to incorporate endogenous change of energy technologies, replacing the conventional assumption of an autonomous energy efficiency improvement. However, there has been little consideration of the credibility of the learning curve. The current trend that many important energy and climate change policy analyses rely on the learning curve means that it is of great importance to critically examine the basis for learning curves. Here, we analyse the use of learning curves in energy technology, usually implemented as a simple power function. We find that the learning curve cannot separate the effects of price and technological change, cannot reflect continuous and qualitative change of both conventional and emerging energy technologies, cannot help to determine the time paths of technological investment, and misses the central role of R and D activity in driving technological change. We argue that a logistic curve of improving performance modified to include R and D activity as a driving variable can better describe the cost reductions in energy technologies. Furthermore, we demonstrate that the top-down Leontief technology can incorporate the bottom-up technologies that improve along either the learning curve or the logistic curve, through changing input-output coefficients. An application to UK wind power illustrates that the logistic curve fits the observed data better and implies greater potential for cost reduction than the learning curve does. (author)
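    For context, the "simple power function" learning curve criticized above is usually fit as cost proportional to cumulative capacity raised to a negative exponent, from which a learning rate per doubling follows; the paper argues that a logistic improvement curve driven by R&D describes cost reductions better. The sketch below fits only the conventional power-law form to synthetic, wind-power-style data; none of the numbers are from the UK case study.

```python
import numpy as np
from scipy.optimize import curve_fit

def learning_curve(cum_capacity, c0, b):
    # Classic one-factor learning curve: unit cost falls as a power of cumulative capacity.
    return c0 * cum_capacity ** (-b)

# Illustrative data: cumulative installed capacity (MW) and unit cost (arbitrary units).
capacity = np.array([50, 120, 300, 700, 1500, 3200, 6500, 12000], dtype=float)
cost = np.array([1400, 1250, 1100, 980, 900, 850, 820, 800], dtype=float)

(c0, b), _ = curve_fit(learning_curve, capacity, cost, p0=[1500.0, 0.1])
learning_rate = 1.0 - 2.0 ** (-b)     # fractional cost reduction per doubling of capacity
print(f"exponent b = {b:.3f}, implied learning rate = {learning_rate:.1%}")
```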

  6. Deep-learnt classification of light curves

    DEFF Research Database (Denmark)

    Mahabal, Ashish; Gieseke, Fabian; Pai, Akshay Sadananda Uppinakudru

    2017-01-01

    Astronomy light curves are sparse, gappy, and heteroscedastic. As a result standard time series methods regularly used for financial and similar datasets are of little help and astronomers are usually left to their own instruments and techniques to classify light curves. A common approach is to derive statistical features from the time series and to use machine learning methods, generally supervised, to separate objects into a few of the standard classes. In this work, we transform the time series to two-dimensional light curve representations in order to classify them using modern deep learning techniques. In particular, we show that convolutional neural network based classifiers work well for broad characterization and classification. We use labeled datasets of periodic variables from the CRTS survey and show how this opens doors for a quick classification of diverse classes with several
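    The specific two-dimensional representation and network architecture used for the CRTS variables are not reproduced here; the sketch below only illustrates the general pattern of feeding 2D light-curve images (e.g. dm–dt or phase-folded images) to a small convolutional classifier. Image size, channel counts and class count are all assumptions.

```python
import torch
import torch.nn as nn

class LightCurveCNN(nn.Module):
    """Small CNN for classifying 2D light-curve representations into variability classes."""
    def __init__(self, n_classes: int, img_size: int = 32):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * (img_size // 4) ** 2, n_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Illustrative forward/backward pass on random "images" with 5 hypothetical classes.
model = LightCurveCNN(n_classes=5)
images = torch.randn(8, 1, 32, 32)          # a batch of 2D light-curve representations
labels = torch.randint(0, 5, (8,))
loss = nn.CrossEntropyLoss()(model(images), labels)
loss.backward()
print("toy batch loss:", float(loss))
```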

  7. Multicomponent determination of chlorinated hydrocarbons using a reaction-based chemical sensor. 2. Chemical speciation using multivariate curve resolution

    NARCIS (Netherlands)

    Tauler, R.; Smilde, A. K.; Henshaw, J. M.; Burgess, L. W.; Kowalski, B. R.

    1994-01-01

    A new multivariate curve resolution method that can extract analytical information from UV/visible spectroscopic data collected from a reaction-based chemical sensor is proposed. The method is demonstrated with the determination of mixtures of chlorinated hydrocarbons by estimating the kinetic and

  8. Gelfond–Bézier curves

    KAUST Repository

    Ait-Haddou, Rachid; Sakane, Yusuke; Nomura, Taishin

    2013-01-01

    We show that the generalized Bernstein bases in Müntz spaces defined by Hirschman and Widder (1949) and extended by Gelfond (1950) can be obtained as pointwise limits of the Chebyshev–Bernstein bases in Müntz spaces with respect to an interval [a,1] as the positive real number a converges to zero. Such a realization allows for concepts of curve design such as de Casteljau algorithm, blossom, dimension elevation to be transferred from the general theory of Chebyshev blossoms in Müntz spaces to these generalized Bernstein bases that we termed here as Gelfond–Bernstein bases. The advantage of working with Gelfond–Bernstein bases lies in the simplicity of the obtained concepts and algorithms as compared to their Chebyshev–Bernstein bases counterparts.

  9. Gelfond–Bézier curves

    KAUST Repository

    Ait-Haddou, Rachid

    2013-02-01

    We show that the generalized Bernstein bases in Müntz spaces defined by Hirschman and Widder (1949) and extended by Gelfond (1950) can be obtained as pointwise limits of the Chebyshev–Bernstein bases in Müntz spaces with respect to an interval [a,1] as the positive real number a converges to zero. Such a realization allows for concepts of curve design such as de Casteljau algorithm, blossom, dimension elevation to be transferred from the general theory of Chebyshev blossoms in Müntz spaces to these generalized Bernstein bases that we termed here as Gelfond–Bernstein bases. The advantage of working with Gelfond–Bernstein bases lies in the simplicity of the obtained concepts and algorithms as compared to their Chebyshev–Bernstein bases counterparts.

  10. Automatic Curve Fitting Based on Radial Basis Functions and a Hierarchical Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    G. Trejo-Caballero

    2015-01-01

    Full Text Available Curve fitting is a very challenging problem that arises in a wide variety of scientific and engineering applications. Given a set of data points, possibly noisy, the goal is to build a compact representation of the curve that corresponds to the best estimate of the unknown underlying relationship between two variables. Despite the large number of methods available to tackle this problem, it remains challenging and elusive. In this paper, a new method to tackle this problem using strictly a linear combination of radial basis functions (RBFs) is proposed. To be more specific, we divide the parameter search space into linear and nonlinear parameter subspaces. We use a hierarchical genetic algorithm (HGA) to minimize a model selection criterion, which allows us to automatically and simultaneously determine the nonlinear parameters and then to compute the linear parameters by the least-squares method through Singular Value Decomposition. The method is fully automatic and does not require subjective parameters, for example, a smoothing factor or centre locations, to obtain the solution. In order to validate the efficacy of our approach, we perform an experimental study with several tests on benchmark smooth functions. A comparative analysis with two successful methods based on RBF networks has been included.
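    The hierarchical genetic algorithm that searches the nonlinear parameter subspace is not reproduced here; the sketch below only shows the complementary step the abstract describes: for fixed centres and widths, the optimal linear RBF weights follow from a least-squares solve (here via NumPy's SVD-based lstsq). Centres, widths and the benchmark function are assumed for illustration.

```python
import numpy as np

def gaussian_rbf_design(x, centres, widths):
    # Design matrix Phi[i, j] = exp(-(x_i - c_j)^2 / (2 * s_j^2))
    return np.exp(-(x[:, None] - centres[None, :]) ** 2 / (2.0 * widths[None, :] ** 2))

def fit_linear_weights(x, y, centres, widths):
    """For fixed (nonlinear) centres and widths, the optimal linear weights are the
    least-squares solution, computed here through the SVD-based lstsq routine."""
    phi = gaussian_rbf_design(x, centres, widths)
    weights, *_ = np.linalg.lstsq(phi, y, rcond=None)
    return weights, phi

# Noisy samples of a smooth benchmark function.
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200)
y = np.sinc(x) + rng.normal(0, 0.05, x.size)

# In the paper an HGA searches over these nonlinear parameters; here they are fixed.
centres = np.linspace(-3, 3, 12)
widths = np.full(12, 0.6)
w, phi = fit_linear_weights(x, y, centres, widths)
print("RMS residual:", float(np.sqrt(np.mean((phi @ w - y) ** 2))))
```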

  11. Composite Field Multiplier based on Look-Up Table for Elliptic Curve Cryptography Implementation

    Directory of Open Access Journals (Sweden)

    Marisa W. Paryasto

    2012-04-01

    Full Text Available Implementing a secure cryptosystem requires operations involving hundreds of bits. One of the most recommended algorithms is Elliptic Curve Cryptography (ECC). The complexity of elliptic curve algorithms and parameters with hundreds of bits requires a specific design and implementation strategy. The design architecture must be customized according to security requirements, available resources and parameter choices. In this work we propose the use of composite fields to implement finite field multiplication for ECC implementation. We use a 299-bit key length represented in GF((2^13)^23) instead of in GF(2^299). A composite field multiplier can be implemented using different multipliers for the ground field and for the extension field. In this paper, a LUT is used for multiplication in the ground field and a classic multiplier is used for the extension field multiplication. A generic architecture for the multiplier is presented. Implementation is done in VHDL with the Altera DE2 as the target device. The work in this paper uses the simplest algorithms to confirm the idea that decomposing the field into a composite field and using different multipliers for the base and extension fields gives a better time-area trade-off. This work is the beginning of our further research implementing composite fields using Mastrovito Hybrid, KOA and LUT multipliers.

  12. Composite Field Multiplier based on Look-Up Table for Elliptic Curve Cryptography Implementation

    Directory of Open Access Journals (Sweden)

    Marisa W. Paryasto

    2013-09-01

    Full Text Available Implementing a secure cryptosystem requires operations involving hundreds of bits. One of the most recommended algorithms is Elliptic Curve Cryptography (ECC). The complexity of elliptic curve algorithms and parameters with hundreds of bits requires a specific design and implementation strategy. The design architecture must be customized according to security requirements, available resources and parameter choices. In this work we propose the use of composite fields to implement finite field multiplication for ECC implementation. We use a 299-bit key length represented in GF((2^13)^23) instead of in GF(2^299). A composite field multiplier can be implemented using different multipliers for the ground field and for the extension field. In this paper, a LUT is used for multiplication in the ground field and a classic multiplier is used for the extension field multiplication. A generic architecture for the multiplier is presented. Implementation is done in VHDL with the Altera DE2 as the target device. The work in this paper uses the simplest algorithms to confirm the idea that decomposing the field into a composite field and using different multipliers for the base and extension fields gives a better time-area trade-off. This work is the beginning of our further research implementing composite fields using Mastrovito Hybrid, KOA and LUT multipliers.
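    The papers target a VHDL hardware architecture, which is not reproduced here. As a software-level sketch of the ground-field LUT idea only, the snippet below builds exp/log tables for GF(2^13) and multiplies through table look-up. The field polynomial x^13 + x^4 + x^3 + x + 1 is an assumption (taken from standard primitive-polynomial tables, not from the papers), and the composite-field extension step is only indicated in a comment.

```python
# Minimal software sketch of LUT-based multiplication in the ground field GF(2^13).
# Assumption: x^13 + x^4 + x^3 + x + 1 (0x201B) is primitive, so x generates the
# multiplicative group and exp/log tables can be built by repeated multiplication by x.
M = 13
POLY = 0x201B                      # x^13 + x^4 + x^3 + x + 1 (assumed primitive)
ORDER = (1 << M) - 1               # 2^13 - 1 nonzero elements

exp_table = [0] * (2 * ORDER)      # doubled so index sums never need reduction
log_table = [0] * (1 << M)

value = 1
for i in range(ORDER):
    exp_table[i] = value
    exp_table[i + ORDER] = value
    log_table[value] = i
    value <<= 1                    # multiply by x
    if value & (1 << M):           # reduce modulo the field polynomial
        value ^= POLY

def gf_mul(a: int, b: int) -> int:
    """Multiply two GF(2^13) elements using log/antilog look-up tables."""
    if a == 0 or b == 0:
        return 0
    return exp_table[log_table[a] + log_table[b]]

# In the composite field GF((2^13)^23), an element is a degree-22 polynomial with
# GF(2^13) coefficients; multiplying two such elements reduces to many gf_mul calls
# plus a reduction modulo an irreducible degree-23 polynomial over GF(2^13) (not shown).
print(hex(gf_mul(0x1234, 0x0567)))
```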

  13. Evaluation of PCR and high-resolution melt curve analysis for differentiation of Salmonella isolates.

    Science.gov (United States)

    Saeidabadi, Mohammad Sadegh; Nili, Hassan; Dadras, Habibollah; Sharifiyazdi, Hassan; Connolly, Joanne; Valcanis, Mary; Raidal, Shane; Ghorashi, Seyed Ali

    2017-06-01

    Consumption of poultry products contaminated with Salmonella is one of the major causes of foodborne diseases worldwide and therefore detection and differentiation of Salmonella spp. in poultry is important. In this study, oligonucleotide primers were designed from hemD gene and a PCR followed by high-resolution melt (HRM) curve analysis was developed for rapid differentiation of Salmonella isolates. Amplicons of 228 bp were generated from 16 different Salmonella reference strains and from 65 clinical field isolates mainly from poultry farms. HRM curve analysis of the amplicons differentiated Salmonella isolates and analysis of the nucleotide sequence of the amplicons from selected isolates revealed that each melting curve profile was related to a unique DNA sequence. The relationship between reference strains and tested specimens was also evaluated using a mathematical model without visual interpretation of HRM curves. In addition, the potential of the PCR-HRM curve analysis was evaluated for genotyping of additional Salmonella isolates from different avian species. The findings indicate that PCR followed by HRM curve analysis provides a rapid and robust technique for genotyping of Salmonella isolates to determine the serovar/serotype.

  14. Development of structural vulnerability curve associated with high magnitude torrent occurrences in Switzerland

    Science.gov (United States)

    Wing-Yuen Chow, Candace; Bründl, Michael; Keiler, Margreth

    2017-04-01

    In mountain regions, high economic losses have increased significantly in the past decades due to severe hazard processes, in spite of notable investments in hazard management. Assessing the vulnerability of built structures to high magnitude torrent events is a part of consequence analysis, where hazard intensity is related to the degree of loss sustained. While vulnerability curves have been developed for different countries, the presented work contributes new data from Swiss-based case studies that address a known gap associated with the consequences of high magnitude events. Data for this stage of the investigation communicates the degree of loss associated with affected structures and has been provided by local authorities dealing with natural hazards (e.g. Amt für Wald des Kantons Bern (KAWA) and cantonal insurance providers). Information used for the empirical quantification of vulnerability to torrent processes is derived from detailed post-event documentation and the loss database and verified with field visits. Building the initial database supports data sharing and the systematic inclusion of additional case studies as they become available. The collection of this new data is fundamental to the development of a local vulnerability curve based on observed sediment deposition heights, a proxy for describing hazard intensity. The result will then be compared to curves derived from Austrian and Italian datasets.

  15. Projection of curves on B-spline surfaces using quadratic reparameterization

    KAUST Repository

    Yang, Yijun; Zeng, Wei; Zhang, Hui; Yong, Junhai; Paul, Jean Claude

    2010-01-01

    Curves on surfaces play an important role in computer aided geometric design. In this paper, we present a hyperbola approximation method based on the quadratic reparameterization of Bézier surfaces, which generates reasonable low degree curves lying

  16. Next-Generation Intensity-Duration-Frequency Curves for Hydrologic Design in Snow-Dominated Environments

    Energy Technology Data Exchange (ETDEWEB)

    Yan, Hongxiang [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland Washington United States; Sun, Ning [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland Washington United States; Wigmosta, Mark [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland Washington United States; Distinguished Faculty Fellow, Department of Civil and Environmental Engineering, University of Washington, Seattle Washington United States; Skaggs, Richard [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland Washington United States; Hou, Zhangshuan [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland Washington United States; Leung, Ruby [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland Washington United States

    2018-02-01

    There is a renewed focus on the design of infrastructure resilient to extreme hydrometeorological events. While precipitation-based intensity-duration-frequency (IDF) curves are commonly used as part of infrastructure design, a large percentage of peak runoff events in snow-dominated regions are caused by snowmelt, particularly during rain-on-snow (ROS) events. In these regions, precipitation-based IDF curves may lead to substantial over-/under-estimation of design basis events and subsequent over-/under-design of infrastructure. To overcome this deficiency, we proposed next-generation IDF (NG-IDF) curves, which characterize the actual water reaching the land surface. We compared NG-IDF curves to standard precipitation-based IDF curves for estimates of extreme events at 376 Snowpack Telemetry (SNOTEL) stations across the western United States that each had at least 30 years of high-quality records. We found standard precipitation-based IDF curves at 45% of the stations were subject to under-design, many with significant under-estimation of 100-year extreme events, for which the precipitation-based IDF curves can underestimate water potentially available for runoff by as much as 125% due to snowmelt and ROS events. The regions with the greatest potential for under-design were in the Pacific Northwest, the Sierra Nevada Mountains, and the Middle and Southern Rockies. We also found the potential for over-design at 20% of the stations, primarily in the Middle Rockies and Arizona mountains. These results demonstrate the need to consider snow processes in the development of IDF curves, and they suggest use of the more robust NG-IDF curves for hydrologic design in snow-dominated environments.

  17. Investigating the environmental Kuznets curve hypothesis in Vietnam

    International Nuclear Information System (INIS)

    Al-Mulali, Usama; Saboori, Behnaz; Ozturk, Ilhan

    2015-01-01

    This study investigates the existence of the environmental Kuznets curve (EKC) hypothesis in Vietnam during the period 1981–2011. To realize the goals of this study, a pollution model was established applying the Autoregressive Distributed Lag (ARDL) methodology. The results revealed that the pollution haven hypothesis does exist in Vietnam because capital increases pollution. In addition, imports also increase pollution, which indicates that most of Vietnam's imported products are energy intensive and highly polluting. However, exports have no effect on pollution, which indicates that the level of exports is not significant enough to affect pollution. Moreover, fossil fuel energy consumption increases pollution while renewable energy consumption has no significant effect in reducing pollution. Furthermore, the labor force reduces pollution since most of Vietnam's labor force is in the agricultural and services sectors, which are less energy intensive than the industrial sector. Based on the obtained results, the EKC hypothesis does not exist because the relationship between GDP and pollution is positive in both the short and long run. - Highlights: • The environmental Kuznets curve (EKC) hypothesis in Vietnam is investigated. • The Autoregressive Distributed Lag (ARDL) methodology was utilized. • The EKC hypothesis does not exist

  18. Application of a diffractive element-based sensor for detection of latent fingerprints from a curved smooth surface

    International Nuclear Information System (INIS)

    Kuivalainen, Kalle; Peiponen, Kai-Erik; Myller, Kari

    2009-01-01

    An optical measurement device, which is a diffractive element-based sensor, is presented for the detection of latent fingerprints on curved objects such as a ballpoint pen. The device provides image and gloss information on the ridges of a fingerprint. The device is expected to have applications in forensic studies. (technical design note)

  19. Learner Characteristic Based Learning Effort Curve Mode: The Core Mechanism on Developing Personalized Adaptive E-Learning Platform

    Science.gov (United States)

    Hsu, Pi-Shan

    2012-01-01

    This study aims to develop the core mechanism for realizing the development of personalized adaptive e-learning platform, which is based on the previous learning effort curve research and takes into account the learner characteristics of learning style and self-efficacy. 125 university students from Taiwan are classified into 16 groups according…

  20. Semiclassical methods in curved spacetime and black hole thermodynamics

    International Nuclear Information System (INIS)

    Camblong, Horacio E.; Ordonez, Carlos R.

    2005-01-01

    Improved semiclassical techniques are developed and applied to a treatment of a real scalar field in a D-dimensional gravitational background. This analysis, leading to a derivation of the thermodynamics of black holes, is based on the simultaneous use of (i) a near-horizon description of the scalar field in terms of conformal quantum mechanics; (ii) a novel generalized WKB framework; and (iii) curved-spacetime phase-space methods. In addition, this improved semiclassical approach is shown to be asymptotically exact in the presence of hierarchical expansions of a near-horizon type. Most importantly, this analysis further supports the claim that the thermodynamics of black holes is induced by their near-horizon conformal invariance

  1. Compact TXRF system using doubly curved crystal optics

    International Nuclear Information System (INIS)

    Chen, Z.W.

    2000-01-01

    Doubly curved crystal optics can provide a large collection solid angle from a small x-ray source but were difficult to fabricate in the past. Recent innovative doubly curved crystal technology provides an accurate bending figure for thin crystals and produces high-performance doubly curved crystal optics. A high-quality doubly curved crystal can significantly increase the intensity of the primary beam for total reflection x-ray fluorescence applications based on a low-power x-ray source. In this report, toroidal Si(220) crystals are used to focus Cu Kα and Mo Kα x-rays from low-power compact x-ray tubes that have a maximum power setting of 50 kV and 1 mA. With a slit aperture to control the convergent angle, a fan Cu Kα1 beam with 15 degree x 0.2 degree convergent angles is obtained for TXRF excitation. Similarly, a fan Mo Kα1 beam with 6 degree x 0.1 degree convergent angles is used for high energy excitation. Si wafer based TXRF samples will be prepared and measured using this technique, and the experimental data will be presented. (author)

  2. Periodic Solutions, Eigenvalue Curves, and Degeneracy of the Fractional Mathieu Equation

    International Nuclear Information System (INIS)

    Parra-Hinojosa, A; Gutiérrez-Vega, J C

    2016-01-01

    We investigate the eigenvalue curves, the behavior of the periodic solutions, and the orthogonality properties of the Mathieu equation with an additional fractional derivative term using the method of harmonic balance. The addition of the fractional derivative term breaks the hermiticity of the equation in such a way that its eigenvalues need not be real nor its eigenfunctions orthogonal. We show that for a certain choice of parameters the eigenvalue curves reveal the appearance of degenerate eigenvalues. We offer a detailed discussion of the matrix representation of the differential operator corresponding to the fractional Mathieu equation, as well as some numerical examples of its periodic solutions. (paper)
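    The fractional-derivative term studied in the paper is not included below; as background only, the sketch computes eigenvalue (characteristic value) curves of the standard Mathieu equation y'' + (a − 2q cos 2x) y = 0 by harmonic balance in a truncated Fourier basis, which is the same matrix-representation idea the abstract refers to. Truncation size and q values are arbitrary.

```python
import numpy as np

def mathieu_characteristic_values(q: float, n_modes: int = 40):
    """Approximate characteristic values of y'' + (a - 2*q*cos(2x)) y = 0 by harmonic
    balance: in the Fourier basis e^{i m x}, m = -N..N, the operator
    -d^2/dx^2 + 2*q*cos(2x) has diagonal entries m^2 and couples m to m +/- 2 with q."""
    m = np.arange(-n_modes, n_modes + 1)
    A = np.diag(m.astype(float) ** 2)
    for i in range(m.size - 2):
        A[i, i + 2] = q
        A[i + 2, i] = q
    return np.sort(np.linalg.eigvalsh(A))

# Trace the lowest eigenvalue curves a(q) over a few q values.
for q in (0.0, 1.0, 2.0, 5.0):
    lowest = mathieu_characteristic_values(q)[:4]
    print(f"q = {q:3.1f}  lowest characteristic values: {np.round(lowest, 4)}")
```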

  3. Do reading additions improve reading in pre-presbyopes with low vision?

    Science.gov (United States)

    Alabdulkader, Balsam; Leat, Susan

    2012-09-01

    This study compared three different methods of determining a reading addition and the possible improvement on reading performance in children and young adults with low vision. Twenty-eight participants with low vision, aged 8 to 32 years, took part in the study. Reading additions were determined with (a) a modified Nott dynamic retinoscopy, (b) a subjective method, and (c) an age-based formula. Reading performance was assessed with MNREAD-style reading charts at 12.5 cm, with and without each reading addition in random order. Outcome measures were reading speed, critical print size, MNREAD threshold, and the area under the reading speed curve. For the whole group, there was no significant improvement in reading performance with any of the additions. When participants with normal accommodation at 12.5 cm were excluded, the area under the reading speed curve was significantly greater with all reading additions compared with no addition (p = 0.031, 0.028, and 0.028, respectively). Also, the reading acuity threshold was significantly better with all reading additions compared with no addition (p = 0.014, 0.030, and 0.036, respectively). Distance and near visual acuity, age, and contrast sensitivity did not predict improvement with a reading addition. All, but one, of the participants who showed a significant improvement in reading with an addition had reduced accommodation. A reading addition may improve reading performance for young people with low vision and should be considered as part of a low vision assessment, particularly when accommodation is reduced.

  4. Computerised curve deconvolution of TL/OSL curves using a popular spreadsheet program.

    Science.gov (United States)

    Afouxenidis, D; Polymeris, G S; Tsirliganis, N C; Kitis, G

    2012-05-01

    This paper exploits the possibility of using commercial software for thermoluminescence and optically stimulated luminescence curve deconvolution analysis. The widely used software package Microsoft Excel, with the Solver utility, has been used to perform deconvolution analysis on both experimental and reference glow curves resulting from the GLOw Curve ANalysis INtercomparison project. The simple interface of this programme, combined with the powerful Solver utility, allows the analysis of complex stimulated luminescence curves into their components and the evaluation of the associated luminescence parameters.

  5. Computerised curve deconvolution of TL/OSL curves using a popular spreadsheet program

    International Nuclear Information System (INIS)

    Afouxenidis, D.; Polymeris, G. S.; Tsirliganis, N. C.; Kitis, G.

    2012-01-01

    This paper exploits the possibility of using commercial software for thermoluminescence and optically stimulated luminescence curve deconvolution analysis. The widely used software package Microsoft Excel, with the Solver utility, has been used to perform deconvolution analysis on both experimental and reference glow curves resulting from the Glow Curve Analysis Intercomparison project. The simple interface of this programme, combined with the powerful Solver utility, allows the analysis of complex stimulated luminescence curves into their components and the evaluation of the associated luminescence parameters. (authors)
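    The papers above use Excel's Solver; an equivalent deconvolution can be sketched in Python, for example with a first-order single-peak expression (here the Kitis-type form, written from memory and best checked against the literature) fitted by least squares. All peak parameters and the synthetic glow curve are illustrative, not data from the intercomparison project.

```python
import numpy as np
from scipy.optimize import curve_fit

K_B = 8.617e-5  # Boltzmann constant in eV/K

def first_order_peak(T, Im, E, Tm):
    """First-order single glow peak (intensity vs temperature in K), parameterized by
    maximum intensity Im, activation energy E (eV) and peak temperature Tm (K)."""
    d = 2.0 * K_B * T / E
    dm = 2.0 * K_B * Tm / E
    arg = E / (K_B * T) * (T - Tm) / Tm
    return Im * np.exp(1.0 + arg - (T / Tm) ** 2 * np.exp(arg) * (1.0 - d) - dm)

def two_peak_glow_curve(T, Im1, E1, Tm1, Im2, E2, Tm2):
    return first_order_peak(T, Im1, E1, Tm1) + first_order_peak(T, Im2, E2, Tm2)

# Illustrative synthetic glow curve with two overlapping first-order peaks.
T = np.linspace(350, 650, 400)
rng = np.random.default_rng(3)
y = two_peak_glow_curve(T, 1.0, 1.1, 450, 0.6, 1.4, 550) + rng.normal(0, 0.01, T.size)

p0 = [0.9, 1.0, 440, 0.5, 1.3, 560]              # rough starting values, as Solver needs
params, _ = curve_fit(two_peak_glow_curve, T, y, p0=p0, maxfev=20000)
print("fitted (Im, E, Tm) per peak:", np.round(params, 3))
```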

  6. Multi-q pattern classification of polarization curves

    Science.gov (United States)

    Fabbri, Ricardo; Bastos, Ivan N.; Neto, Francisco D. Moura; Lopes, Francisco J. P.; Gonçalves, Wesley N.; Bruno, Odemir M.

    2014-02-01

    Several experimental measurements are expressed in the form of one-dimensional profiles, for which there is a scarcity of methodologies able to classify the pertinence of a given result to a specific group. The polarization curves that evaluate the corrosion kinetics of electrodes in corrosive media are applications where the behavior is chiefly analyzed from profiles. Polarization curves are indeed a classic method to determine the global kinetics of metallic electrodes, but the strongly nonlinear responses of different metals and alloys can overlap, and discrimination becomes a challenging problem. Moreover, even finding a typical curve from replicated tests requires subjective judgment. In this paper, we used the so-called multi-q approach based on the Tsallis statistics in a classification engine to separate the multiple polarization curve profiles of two stainless steels. We collected 48 experimental polarization curves in an aqueous chloride medium of two stainless steel types, with different resistance against localized corrosion. Multi-q pattern analysis was then carried out on a wide potential range, from cathodic up to anodic regions. Excellent classification was obtained, with success rates of 90%, 80%, and 83% for the low (cathodic), high (anodic), and combined potential ranges, respectively, using only 2% of the original profile data. These results show the potential of the proposed approach towards efficient, robust, systematic and automatic classification of highly nonlinear profile curves.

  7. Optimization In Searching Daily Rule Curve At Mosul Regulating Reservoir, North Iraq Using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Thair M. Al-Taiee

    2013-05-01

    Full Text Available To obtain optimal operating rules for storage reservoirs, large numbers of simulation and optimization models have been developed over the past several decades, which vary significantly in their mechanisms and applications. Rule curves are guidelines for long term reservoir operation. An efficient technique is required to find the optimal rule curves that can mitigate water shortage in long term operation. A Genetic Algorithm (GA) technique, an optimization approach based on the mechanics of natural selection derived from the theory of natural evolution, was developed and applied to predict the daily rule curve of the Mosul regulating reservoir in Iraq. Recorded daily inflows, outflows and water levels in the reservoir for 19 years (1986-1990 and 1994-2007) were used in the developed model for assessing the optimal reservoir operation. The objective function is set to minimize the annual sum of squared deviations from the desired downstream release and desired storage volume in the reservoir. The decision variables are releases, storage volume, water level and outlet (demand) from the reservoir. The results of the GA model gave a good agreement during the comparison with the actual rule curve and the designed rating curve of the reservoir. The simulated results show that GA-derived policies are promising and competitive and can be effectively used for daily reservoir operation, in addition to the rational monthly operation, and for predicting the rating curve of reservoirs.
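    The study's GA and reservoir data are not reproduced here; as a rough sketch of the stated objective (minimizing the annual sum of squared deviations from the desired release and desired storage), the snippet below runs a plain mutation-only evolutionary loop over a daily release schedule with a simplified mass balance. All data, bounds and GA settings are illustrative stand-ins, not Mosul reservoir records.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative daily data for one year (not Mosul reservoir records): inflow,
# downstream demand, desired storage, and storage bounds, in consistent units.
n_days = 365
inflow = 80 + 30 * np.sin(2 * np.pi * np.arange(n_days) / 365) + rng.normal(0, 5, n_days)
demand = np.full(n_days, 75.0)
target_storage = np.full(n_days, 500.0)
s_min, s_max, s0 = 200.0, 800.0, 500.0

def objective(rule_releases):
    """Annual sum of squared deviations from the desired release and desired storage,
    evaluated with a simple daily mass balance (spill and evaporation ignored)."""
    storage, cost = s0, 0.0
    for t in range(n_days):
        release = np.clip(rule_releases[t], 0.0, storage + inflow[t] - s_min)
        storage = np.clip(storage + inflow[t] - release, s_min, s_max)
        cost += (release - demand[t]) ** 2 + (storage - target_storage[t]) ** 2
    return cost

# Mutation-only evolutionary loop standing in for the GA of the study
# (truncation selection + Gaussian mutation; crossover omitted for brevity).
population = rng.uniform(50.0, 100.0, (40, n_days))
for generation in range(200):
    fitness = np.array([objective(ind) for ind in population])
    parents = population[np.argsort(fitness)[:10]]
    children = parents[rng.integers(0, 10, size=30)] + rng.normal(0, 1.0, (30, n_days))
    population = np.vstack([parents, children])

print("best objective value:", float(min(objective(ind) for ind in population)))
```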

  8. The Environmental Kuznets Curve. An empirical analysis for OECD countries

    Energy Technology Data Exchange (ETDEWEB)

    Georgiev, E.

    2008-09-15

    This paper tests the Environmental Kuznets Curve hypothesis for four local (SOx, NOx, CO, VOC) and two global (CO2, GHG) air pollutants. Using a new panel data set of thirty OECD countries, the paper finds that the postulated inverted U-shaped relationship between income and pollution does not hold for all gases. A meaningful Environmental Kuznets Curve exists only for CO, VOC and NOx, whereas for CO2 the curve is monotonically increasing. For GHG there is indication of an inverted U-shaped relationship between income and pollution, but still most countries are on the increasing path of the curve and hence the future development of the curve is uncertain. For SOx it was found that emissions follow an U-shaped curve. Based on the empirical results, the paper concludes that the Environmental Kuznets Curve does not hold for all gases; it is an empirical artefact rather than a regularity.

  9. The Environmental Kuznets Curve. An empirical analysis for OECD countries

    International Nuclear Information System (INIS)

    Georgiev, E.

    2008-09-01

    This paper tests the Environmental Kuznets Curve hypothesis for four local (SOx, NOx, CO, VOC) and two global (CO2, GHG) air pollutants. Using a new panel data set of thirty OECD countries, the paper finds that the postulated inverted U-shaped relationship between income and pollution does not hold for all gases. A meaningful Environmental Kuznets Curve exists only for CO, VOC and NOx, whereas for CO2 the curve is monotonically increasing. For GHG there is indication of an inverted U-shaped relationship between income and pollution, but still most countries are on the increasing path of the curve and hence the future development of the curve is uncertain. For SOx it was found that emissions follow an U-shaped curve. Based on the empirical results, the paper concludes that the Environmental Kuznets Curve does not hold for all gases; it is an empirical artefact rather than a regularity.

  10. Reconstruction of thermally quenched glow curves in quartz

    International Nuclear Information System (INIS)

    Subedi, Bhagawan; Polymeris, George S.; Tsirliganis, Nestor C.; Pagonis, Vasilis; Kitis, George

    2012-01-01

    The experimentally measured thermoluminescence (TL) glow curves of quartz samples are influenced by the presence of the thermal quenching effect, which involves a variation of the luminescence efficiency as a function of temperature. The real shape of the thermally unquenched TL glow curves is completely unknown. In the present work an attempt is made to reconstruct these unquenched glow curves from the quenched experimental data, and for two different types of quartz samples. The reconstruction is based on the values of the thermal quenching parameter W (activation energy) and C (a dimensionless constant), which are known from recent experimental work on these two samples. A computerized glow-curve deconvolution (CGCD) analysis was performed twice for both the reconstructed and the experimental TL glow curves. Special attention was paid to check for consistency between the results of these two independent CGCD analyses. The investigation showed that the reconstruction attempt was successful, and it is concluded that the analysis of reconstructed TL glow curves can provide improved values of the kinetic parameters E, s for the glow peaks of quartz. This also leads to a better evaluation of the half-lives of electron trapping levels used for dosimetry and luminescence dating.
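    The reconstruction step can be illustrated directly from the quantities named in the abstract: if the thermal quenching efficiency follows the usual Mott–Seitz form with activation energy W and constant C, the unquenched curve is recovered by dividing the measured intensity by that efficiency. The W and C values and the stand-in glow peak below are illustrative, not the experimentally determined parameters of the two quartz samples.

```python
import numpy as np

K_B = 8.617e-5  # Boltzmann constant in eV/K

def quenching_efficiency(T, W, C):
    """Mott-Seitz luminescence efficiency: eta(T) = 1 / (1 + C * exp(-W / (k*T)))."""
    return 1.0 / (1.0 + C * np.exp(-W / (K_B * T)))

def reconstruct_unquenched(T, measured_intensity, W, C):
    # The thermally unquenched glow curve is recovered by dividing the measured
    # (quenched) intensity by the efficiency at each temperature.
    return measured_intensity / quenching_efficiency(T, W, C)

# Illustrative values; W and C would come from the experimental work on the samples.
T = np.linspace(350, 650, 300)
measured = np.exp(-0.5 * ((T - 500) / 30) ** 2)       # stand-in quenched glow peak
unquenched = reconstruct_unquenched(T, measured, W=0.64, C=2.8e7)
print("quenched/unquenched area ratio:",
      float(np.trapz(measured, T) / np.trapz(unquenched, T)))
```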

  11. From Experiment to Theory: What Can We Learn from Growth Curves?

    Science.gov (United States)

    Kareva, Irina; Karev, Georgy

    2018-01-01

    Finding an appropriate functional form to describe population growth based on key properties of a described system allows making justified predictions about future population development. This information can be of vital importance in all areas of research, ranging from cell growth to global demography. Here, we use this connection between theory and observation to pose the following question: what can we infer about intrinsic properties of a population (i.e., degree of heterogeneity, or dependence on external resources) based on which growth function best fits its growth dynamics? We investigate several nonstandard classes of multi-phase growth curves that capture different stages of population growth; these models include hyperbolic-exponential, exponential-linear, and exponential-linear-saturation growth patterns. The constructed models account explicitly for the process of natural selection within inhomogeneous populations. Based on the underlying hypotheses for each of the models, we identify whether a population that is best fit by a particular curve is more likely to be homogeneous or heterogeneous, whether it grows in a density-dependent or frequency-dependent manner, and whether it depends on external resources during any or all stages of its development. We apply these predictions to cancer cell growth and demographic data obtained from the literature. Our theory, if confirmed, can provide an additional biomarker and a predictive tool to complement experimental research.

  12. Simple algorithm to estimate mean-field effects from minor differential permeability curves based on the Preisach model

    International Nuclear Information System (INIS)

    Perevertov, Oleksiy

    2003-01-01

    The classical Preisach model (PM) of magnetic hysteresis requires that any minor differential permeability curve lies under minor curves with larger field amplitude. Measurements of ferromagnetic materials show that very often this is not true. By applying the classical PM formalism to measured minor curves one can discover that it leads to an oval-shaped region on each half of the Preisach plane where the calculations produce negative values in the Preisach function. Introducing an effective field, which differs from the applied one by a mean-field term proportional to the magnetization, usually solves this problem. Complex techniques exist to estimate the minimum necessary proportionality constant (the moving parameter). In this paper we propose a simpler way to estimate the mean-field effects for use in nondestructive testing, which is based on experience from the measurements of industrial steels. A new parameter (parameter of shift) is introduced, which monitors the mean-field effects. The relation between the shift parameter and the moving one was studied for a number of steels. From preliminary experiments no correlation was found between the shift parameter and the classical magnetic ones such as the coercive field, maximum differential permeability and remanent magnetization

  13. Influence of horizontally curved roadway section characteristics on motorcycle-to-barrier crash frequency.

    Science.gov (United States)

    Gabauer, Douglas J; Li, Xiaolong

    2015-04-01

    The purpose of this study was to investigate motorcycle-to-barrier crash frequency on horizontally curved roadway sections in Washington State using police-reported crash data linked with roadway data and augmented with barrier presence information. Data included 4915 horizontal curved roadway sections with 252 of these sections experiencing 329 motorcycle-to-barrier crashes between 2002 and 2011. Negative binomial regression was used to predict motorcycle-to-barrier crash frequency using horizontal curvature and other roadway characteristics. Based on the model results, the strongest predictor of crash frequency was found to be curve radius. This supports a motorcycle-to-barrier crash countermeasure placement criterion based, at the very least, on horizontal curve radius. With respect to the existing horizontal curve criterion of 820 feet or less, curves meeting this criterion were found to increase motorcycle-to-barrier crash frequency rate by a factor of 10 compared to curves not meeting this criterion. Other statistically significant predictors were curve length, traffic volume and the location of adjacent curves. Assuming curves of identical radius, the model results suggest that longer curves, those with higher traffic volume, and those that have no adjacent curved sections within 300 feet of either curve end would likely be better candidates for a motorcycle-to-barrier crash countermeasure. Copyright © 2015 Elsevier Ltd. All rights reserved.
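    The study's Washington State data cannot be reproduced here, but the modelling step it describes, negative binomial regression of crash counts on curve characteristics, can be sketched with statsmodels on synthetic data. Variable names, coefficients and the dispersion value below are illustrative assumptions, not the fitted model from the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Illustrative synthetic stand-in for the curved-section inventory: curve radius (ft),
# curve length (ft), traffic volume (AADT), and motorcycle-to-barrier crash counts.
rng = np.random.default_rng(11)
n = 2000
radius = rng.uniform(300, 5000, n)
length = rng.uniform(100, 2000, n)
aadt = rng.uniform(500, 40000, n)
expected = np.exp(-4.0 - 0.0006 * radius + 0.0004 * length + 0.00002 * aadt)
crashes = rng.negative_binomial(n=1.2, p=1.2 / (1.2 + expected))

# Negative binomial regression of crash frequency on curve characteristics.
X = sm.add_constant(pd.DataFrame({"radius": radius, "length": length, "aadt": aadt}))
model = sm.NegativeBinomial(crashes, X).fit(disp=False)
print(model.summary().tables[1])
```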

  14. Magneto-electro-elastic buckling analysis of nonlocal curved nanobeams

    Science.gov (United States)

    Ebrahimi, Farzad; Reza Barati, Mohammad

    2016-09-01

    In this work, a size-dependent curved beam model is developed to take into account the effects of nonlocal stresses on the buckling behavior of curved magneto-electro-elastic FG nanobeams for the first time. The governing differential equations are derived based on the principle of virtual work and Euler-Bernoulli beam theory. The power-law function is employed to describe the spatially graded magneto-electro-elastic properties. By extending the radius of the curved nanobeam to infinity, the results of straight nonlocal FG beams can be rendered. The effects of magnetic potential, electric voltage, opening angle, nonlocal parameter, power-law index and slenderness ratio on buckling loads of curved MEE-FG nanobeams are studied.

  15. Beyond Rating Curves: Time Series Models for in-Stream Turbidity Prediction

    Science.gov (United States)

    Wang, L.; Mukundan, R.; Zion, M.; Pierson, D. C.

    2012-12-01

    ARMA(1,2) errors were fit to the observations. Preliminary model validation exercises at a 30-day forecast horizon show that the ARMA error models generally improve the predictive skill of the linear regression rating curves. Skill seems to vary based on the ambient hydrologic conditions at the onset of the forecast. For example, ARMA error model forecasts issued before a high flow/turbidity event do not show significant improvements over the rating curve approach. However, ARMA error model forecasts issued during the "falling limb" of the hydrograph are significantly more accurate than rating curves for both single day and accumulated event predictions. In order to assist in reservoir operations decisions associated with turbidity events and general water supply reliability, DEP has initiated design of an Operations Support Tool (OST). OST integrates a reservoir operations model with 2D hydrodynamic water quality models and a database compiling near-real-time data sources and hydrologic forecasts. Currently, OST uses conventional flow-turbidity rating curves and hydrologic forecasts for predictive turbidity inputs. Given the improvements in predictive skill over traditional rating curves, the ARMA error models are currently being evaluated as an addition to DEP's Operations Support Tool.
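    The DEP datasets and site-specific rating curves are not available here; the sketch below only illustrates the modelling pattern the abstract describes: a log-log flow-turbidity rating curve whose residuals carry ARMA(1,2) structure, fitted as a regression with ARMA errors via statsmodels SARIMAX and then used for a 30-day forecast. All series and coefficients are synthetic assumptions.

```python
import numpy as np
import statsmodels.api as sm

# Illustrative synthetic daily series: streamflow and turbidity related through a
# log-log rating curve plus autocorrelated (ARMA) errors.
rng = np.random.default_rng(5)
n = 730
flow = np.abs(50 + 30 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 8, n)) + 1.0
log_flow = np.log(flow)
errors = sm.tsa.ArmaProcess(ar=[1, -0.7], ma=[1, 0.3, 0.2]).generate_sample(nsample=n, scale=0.2)
log_turbidity = -1.0 + 1.5 * log_flow + errors

# Rating curve with ARMA(1,2) errors: regression with SARIMAX order (1, 0, 2).
model = sm.tsa.statespace.SARIMAX(log_turbidity, exog=log_flow, order=(1, 0, 2), trend="c")
result = model.fit(disp=False)

# 30-day-ahead forecast, assuming a hydrologic forecast supplies the future log flows.
future_log_flow = np.full(30, log_flow[-30:].mean())
forecast = result.get_forecast(steps=30, exog=future_log_flow[:, None])
print(forecast.predicted_mean[:5])
```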

  16. Automated curved planar reformation of 3D spine images

    International Nuclear Information System (INIS)

    Vrtovec, Tomaz; Likar, Bostjan; Pernus, Franjo

    2005-01-01

    Traditional techniques for visualizing anatomical structures are based on planar cross-sections from volume images, such as images obtained by computed tomography (CT) or magnetic resonance imaging (MRI). However, planar cross-sections taken in the coordinate system of the 3D image often do not provide sufficient or qualitative enough diagnostic information, because planar cross-sections cannot follow curved anatomical structures (e.g. arteries, colon, spine, etc). Therefore, not all of the important details can be shown simultaneously in any planar cross-section. To overcome this problem, reformatted images in the coordinate system of the inspected structure must be created. This operation is usually referred to as curved planar reformation (CPR). In this paper we propose an automated method for CPR of 3D spine images, which is based on the image transformation from the standard image-based to a novel spine-based coordinate system. The axes of the proposed spine-based coordinate system are determined on the curve that represents the vertebral column, and the rotation of the vertebrae around the spine curve, both of which are described by polynomial models. The optimal polynomial parameters are obtained in an image analysis based optimization framework. The proposed method was qualitatively and quantitatively evaluated on five CT spine images. The method performed well on both normal and pathological cases and was consistent with manually obtained ground truth data. The proposed spine-based CPR benefits from reduced structural complexity in favour of improved feature perception of the spine. The reformatted images are diagnostically valuable and enable easier navigation, manipulation and orientation in 3D space. Moreover, reformatted images may prove useful for segmentation and other image analysis tasks

  17. Non-sky-averaged sensitivity curves for space-based gravitational-wave observatories

    International Nuclear Information System (INIS)

    Vallisneri, Michele; Galley, Chad R

    2012-01-01

    The signal-to-noise ratio (SNR) is used in gravitational-wave observations as the basic figure of merit for detection confidence and, together with the Fisher matrix, for the amount of physical information that can be extracted from a detected signal. SNRs are usually computed from a sensitivity curve, which describes the gravitational-wave amplitude needed by a monochromatic source of given frequency to achieve a threshold SNR. Although the term 'sensitivity' is used loosely to refer to the detector's noise spectral density, the two quantities are not the same: the sensitivity includes also the frequency- and orientation-dependent response of the detector to gravitational waves and takes into account the duration of observation. For interferometric space-based detectors similar to LISA, which are sensitive to long-lived signals and have constantly changing position and orientation, exact SNRs need to be computed on a source-by-source basis. For convenience, most authors prefer to work with sky-averaged sensitivities, accepting inaccurate SNRs for individual sources and giving up control over the statistical distribution of SNRs for source populations. In this paper, we describe a straightforward end-to-end recipe to compute the non-sky-averaged sensitivity of interferometric space-based detectors of any geometry. This recipe includes the effects of spacecraft motion and of seasonal variations in the partially subtracted confusion foreground from Galactic binaries, and it can be used to generate a sampling distribution of sensitivities for a given source population. In effect, we derive error bars for the sky-averaged sensitivity curve, which provide a stringent statistical interpretation for previously unqualified statements about sky-averaged SNRs. As a worked-out example, we consider isotropic and Galactic-disk populations of monochromatic sources, as observed with the 'classic LISA' configuration. We confirm that the (standard) inverse-rms average sensitivity
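
    For orientation only, the toy sketch below shows the kind of back-of-the-envelope SNR estimate that a sensitivity curve enables for a long-lived monochromatic source; the record's point is precisely that exact, non-sky-averaged SNRs additionally require the detector response and orbital motion, which are not modelled here. The noise curve used is an arbitrary placeholder, not a LISA model.

```python
# Illustrative only: a simplified SNR estimate for a monochromatic source of
# amplitude h0 observed for time T, given a one-sided noise spectral density Sn(f).
# Detector response, spacecraft motion and confusion foreground are ignored.
import numpy as np

def snr_monochromatic(h0, f, T, Sn):
    """Rough SNR ~ h0 * sqrt(T / Sn(f)) for a long-lived sinusoidal signal."""
    return h0 * np.sqrt(T / Sn(f))

Sn = lambda f: 1e-40 * (1.0 + (1e-3 / f) ** 4)   # toy power-law noise curve (placeholder)
print(snr_monochromatic(h0=1e-21, f=3e-3, T=3.15e7, Sn=Sn))
```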

  18. Statistical re-evaluation of the ASME K_IC and K_IR fracture toughness reference curves

    Energy Technology Data Exchange (ETDEWEB)

    Wallin, K.; Rintamaa, R. [Valtion Teknillinen Tutkimuskeskus, Espoo (Finland)

    1998-11-01

    Historically the ASME reference curves have been treated as representing absolute deterministic lower bound curves of fracture toughness. In reality, this is not the case. They represent only deterministic lower bound curves to a specific set of data, which represent a certain probability range. A recently developed statistical lower bound estimation method called the 'Master curve' has been proposed as a candidate for a new lower bound reference curve concept. From a regulatory point of view, the Master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on application. In order to be able to substitute the old ASME reference curves with lower bound curves based on the master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. In order to estimate the true inherent level of safety, represented by the reference curves, the original data base was re-evaluated with statistical methods and compared to an analysis based on the master curve concept. The analysis reveals that the 5% lower bound Master curve has the same inherent degree of safety as originally intended for the K_IC reference curve. Similarly, the 1% lower bound Master curve corresponds to the K_IR reference curve. (orig.)
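
    As background to the comparison above, the master-curve expressions are commonly written in the ASTM E1921 form sketched below (toughness in MPa·m^0.5, temperatures in degrees C). This is a hedged illustration of the concept only, not the paper's re-evaluation procedure, and the reference temperature T0 used is an arbitrary example value.

```python
# Sketch of the master-curve expressions in their common ASTM E1921-style form.
# T0 is the material reference temperature; values below are illustrative.
import numpy as np

def kjc_median(T, T0):
    """Median fracture toughness (1T size), MPa*sqrt(m)."""
    return 30.0 + 70.0 * np.exp(0.019 * (T - T0))

def kjc_lower_bound(T, T0, p=0.05):
    """Lower-bound toughness at cumulative failure probability p (1T size)."""
    return 20.0 + (np.log(1.0 / (1.0 - p))) ** 0.25 * (11.0 + 77.0 * np.exp(0.019 * (T - T0)))

T = np.linspace(-150, 50, 5)
print(kjc_lower_bound(T, T0=-70.0, p=0.05))   # ~5% curve, cf. the K_IC comparison above
print(kjc_lower_bound(T, T0=-70.0, p=0.01))   # ~1% curve, cf. the K_IR comparison above
```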

  19. An Experimental Study on the Impact of Different-frequency Elastic Waves on Water Retention Curve

    Science.gov (United States)

    Deng, J. H.; Dai, J. Y.; Lee, J. W.; Lo, W. C.

    2017-12-01

    Over the past few decades, theoretical and experimental studies on the connection between elastic wave attributes and the physical properties of a fluid-bearing porous medium have attracted the attention of many scholars in the fields of porous medium flow and hydrogeology. It has been previously determined that the transmission of elastic waves in a porous medium containing two immiscible fluids will have an effect on the water retention curve, but it has not been established whether the water retention curve is affected by the frequency of the elastic waves, or whether the effect on the soil is temporary or permanent. This research is based on a sand box test in which the soil is divided into three layers (a lower, middle, and upper layer). In this case, we discuss different impacts on the water retention curve during the drying process under sound waves (elastic waves) at three frequencies (150 Hz, 300 Hz, and 450 Hz). The change in the water retention curve before and after the effect is then discussed. In addition, how sound waves affect the water retention curve at different depths is also observed. According to the experimental results, we discover that sound waves can cause soil either to expand or to contract. When the soil is induced to expand due to sound waves, it can contract naturally and return to the condition it was in before the influence of the sound waves. On the contrary, when the soil is induced to contract, it is unable to return to its initial condition. Due to the results discussed above, it is suggested that sound waves causing soil to expand have a temporary impact while those causing soil to contract have a permanent impact. In addition, our experimental results show how sound waves affect the water retention curve at different depths. The degree of soil expansion and contraction caused by the sound waves will differ at various soil depths. Nevertheless, the expanding or contracting of soil is only subject to the

  20. STR melting curve analysis as a genetic screening tool for crime scene samples.

    Science.gov (United States)

    Nguyen, Quang; McKinney, Jason; Johnson, Donald J; Roberts, Katherine A; Hardy, Winters R

    2012-07-01

    In this proof-of-concept study, high-resolution melt curve (HRMC) analysis was investigated as a postquantification screening tool to discriminate human CSF1PO and THO1 genotypes amplified with mini-STR primers in the presence of SYBR Green or LCGreen Plus dyes. A total of 12 CSF1PO and 11 HUMTHO1 genotypes were analyzed on the LightScanner HR96 and LS-32 systems and were correctly differentiated based upon their respective melt profiles. Short STR amplicon melt curves were affected by repeat number, and single-source and mixed DNA samples were additionally differentiated by the formation of heteroduplexes. Melting curves were shown to be unique and reproducible from DNA quantities ranging from 20 to 0.4 ng and distinguished identical from nonidentical genotypes from DNA derived from different biological fluids and compromised samples. Thus, a method is described which can assess both the quantity and the possible probative value of samples without full genotyping. 2012 American Academy of Forensic Sciences. Published 2012. This article is a U.S. Government work and is in the public domain in the U.S.A.

  1. Approximation by planar elastic curves

    DEFF Research Database (Denmark)

    Brander, David; Gravesen, Jens; Nørbjerg, Toke Bjerge

    2016-01-01

    We give an algorithm for approximating a given plane curve segment by a planar elastic curve. The method depends on an analytic representation of the space of elastic curve segments, together with a geometric method for obtaining a good initial guess for the approximating curve. A gradient-driven optimization is then used to find the approximating elastic curve.

  2. Decline curve based models for predicting natural gas well performance

    Directory of Open Access Journals (Sweden)

    Arash Kamari

    2017-06-01

    Full Text Available The productivity of a gas well declines over its production life, eventually to the point where production can no longer meet economic requirements. To overcome such problems, the production performance of gas wells should be predicted by applying reliable methods to analyse the decline trend. Therefore, reliable models are developed in this study on the basis of powerful artificial intelligence techniques, viz. the artificial neural network (ANN) modelling strategy, least square support vector machine (LSSVM) approach, adaptive neuro-fuzzy inference system (ANFIS), and decision tree (DT) method, for the prediction of cumulative gas production as well as initial decline rate multiplied by time as a function of the Arps decline-curve exponent and the ratio of initial gas flow rate over total gas flow rate. It was concluded that the results obtained from the models developed in the current study are in satisfactory agreement with the actual gas well production data. Furthermore, the results of the comparative study performed demonstrate that the LSSVM strategy is superior to the other models investigated for the prediction of both cumulative gas production and initial decline rate multiplied by time.
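
    The classical baseline that such data-driven decline models are measured against is Arps-type decline-curve analysis; a minimal sketch of fitting the Arps hyperbolic rate equation to synthetic production data is given below. All parameter values are illustrative only.

```python
# Minimal sketch of Arps hyperbolic decline-curve fitting, the classical baseline
# against which data-driven models are usually compared. The data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def arps_rate(t, qi, Di, b):
    """Arps hyperbolic decline: q(t) = qi / (1 + b*Di*t)**(1/b)."""
    return qi / np.power(1.0 + b * Di * t, 1.0 / b)

t = np.linspace(0, 1000, 60)                                   # days
noise = 1 + 0.05 * np.random.default_rng(1).normal(size=t.size)
q_obs = arps_rate(t, 5000.0, 0.004, 0.8) * noise               # synthetic well rates

popt, _ = curve_fit(arps_rate, t, q_obs, p0=[4000.0, 0.01, 0.5],
                    bounds=([0, 0, 0.01], [np.inf, 1.0, 2.0]))
qi, Di, b = popt
Gp = np.trapz(arps_rate(t, qi, Di, b), t)                      # cumulative production
print(f"qi={qi:.0f}, Di={Di:.4f} 1/day, b={b:.2f}, cumulative Gp~{Gp:.0f}")
```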

  3. Multiphoton absorption coefficients in solids: an universal curve

    International Nuclear Information System (INIS)

    Brandi, H.S.; Araujo, C.B. de

    1983-04-01

    A universal curve for the frequency dependence of the multiphoton absorption coefficient is proposed based on a 'non-perturbative' approach. Specific applications have been made to obtain two-, three-, four- and five-photon absorption coefficients in different materials. Proper scaling of the two-photon absorption coefficient and use of the universal curve yield results for the higher order absorption coefficients in good agreement with the experimental data. (Author) [pt

  4. Deriving the suction stress of unsaturated soils from water retention curve, based on wetted surface area in pores

    Science.gov (United States)

    Greco, Roberto; Gargano, Rudy

    2016-04-01

    The evaluation of suction stress in unsaturated soils has important implications in several practical applications. Suction stress affects soil aggregate stability and soil erosion. Furthermore, the equilibrium of shallow unsaturated soil deposits along steep slopes is often possible only thanks to the contribution of suction to soil effective stress. Experimental evidence, as well as theoretical arguments, shows that suction stress is a nonlinear function of matric suction. The relationship expressing the dependence of suction stress on soil matric suction is usually indicated as the Soil Stress Characteristic Curve (SSCC). In this study, a novel equation for the evaluation of the suction stress of an unsaturated soil is proposed, assuming that the exchange of stress between soil water and solid particles occurs only through the part of the surface of the solid particles which is in direct contact with water. The proposed equation, based only upon geometric considerations related to soil pore-size distribution, allows the SSCC to be easily derived from the water retention curve (SWRC) with the assignment of two additional parameters. The first parameter, representing the projection of the external surface area of the soil over a generic plane surface, can be reasonably estimated from the residual water content of the soil. The second parameter, indicated as H0, is the water potential below which adsorption significantly contributes to water retention. For the experimental verification of the proposed approach, this parameter is treated as a fitting parameter. The proposed equation is applied to the interpretation of suction stress experimental data, taken from the literature, spanning a wide range of soil textures. The obtained results show that in all cases the proposed relationship closely reproduces the experimental data, performing better than other currently used expressions. The obtained results also show that the adopted values of the parameter H0
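
    The record's own closed-form equation is not reproduced in the abstract. Purely as background, the sketch below evaluates a widely used suction-stress relation in which suction stress equals the effective saturation times the matric suction, with the effective saturation taken from a van Genuchten water retention curve; parameter values are placeholders.

```python
# Background sketch only (not the new equation proposed in the record):
# sigma_s = -Se * psi, with effective saturation Se from a van Genuchten SWRC.
import numpy as np

def van_genuchten_Se(psi, alpha, n):
    """Effective saturation Se(psi) for matric suction psi > 0 (van Genuchten form)."""
    m = 1.0 - 1.0 / n
    return (1.0 + (alpha * psi) ** n) ** (-m)

def suction_stress(psi, alpha, n):
    """Suction stress characteristic curve, sigma_s = -Se * psi (kPa, compression negative)."""
    return -van_genuchten_Se(psi, alpha, n) * psi

psi = np.logspace(-1, 3, 5)                    # matric suction in kPa
print(suction_stress(psi, alpha=0.05, n=2.0))  # placeholder soil parameters
```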

  5. Bragg Curve Spectroscopy

    International Nuclear Information System (INIS)

    Gruhn, C.R.

    1981-05-01

    An alternative utilization is presented for the gaseous ionization chamber in the detection of energetic heavy ions, which is called Bragg Curve Spectroscopy (BCS). Conceptually, BCS involves using the maximum data available from the Bragg curve of the stopping heavy ion (HI) for purposes of identifying the particle and measuring its energy. A detector has been designed that measures the Bragg curve with high precision. From the Bragg curve the range from the length of the track, the total energy from the integral of the specific ionization over the track, the dE/dx from the specific ionization at the beginning of the track, and the Bragg peak from the maximum of the specific ionization of the HI are determined. This last signal measures the atomic number, Z, of the HI unambiguously

  6. Energy and GHG abatement cost curves

    Energy Technology Data Exchange (ETDEWEB)

    Alvarenga, Rafael [BHP Billiton Base Metals (Australia)

    2010-07-01

    Global warming, driven by various factors but especially by the emission of greenhouse gases (GHGs), has become a cause for serious concern. This paper discusses the steps taken by BHP Billiton to reduce energy consumption and GHG emissions using cost curves. According to forecasts, global warming is expected to impact Chile badly, and the rise in temperature could range from 1 to more than 5 degrees Celsius. Mining in Chile consumes a lot of energy, particularly electricity; in 2007 it accounted for 13% of total energy consumption and 36% of electricity consumption. BHP Billiton Base Metals developed a set of abatement cost curves for energy and GHG in Chile, and these are shown in figures. The methodology for the curves consisted of consultant visits to each mine operation. The study also includes mass and energy balances and feasibility maps. The paper concludes that it is important to evaluate the potential for reducing emissions and energy use and their associated costs.

  7. Learning Curve? Which One?

    Directory of Open Access Journals (Sweden)

    Paulo Prochno

    2004-07-01

    Full Text Available Learning curves have been studied for a long time. These studies provided strong support to the hypothesis that, as organizations produce more of a product, unit costs of production decrease at a decreasing rate (see Argote, 1999, for a comprehensive review of learning curve studies). But the organizational mechanisms that lead to these results are still underexplored. We know some drivers of learning curves (ADLER; CLARK, 1991; LAPRE et al., 2000), but we still lack a more detailed view of the organizational processes behind those curves. Through an ethnographic study, I bring a comprehensive account of the first year of operations of a new automotive plant, describing what was taking place in the assembly area during the most relevant shifts of the learning curve. The emphasis is then on how learning occurs in that setting. My analysis suggests that the overall learning curve is in fact the result of an integration process that puts together several individual ongoing learning curves in different areas throughout the organization. In the end, I propose a model to understand the evolution of these learning processes and their supporting organizational mechanisms.

  8. Optimal algorithm for automatic detection of microaneurysms based on receiver operating characteristic curve

    Science.gov (United States)

    Xu, Lili; Luo, Shuqian

    2010-11-01

    Microaneurysms (MAs) are the first manifestations of diabetic retinopathy (DR) as well as an indicator of its progression. Their automatic detection plays a key role in both mass screening and monitoring and is therefore at the core of any system for computer-assisted diagnosis of DR. The algorithm basically comprises the following stages: candidate detection, aiming at extracting the patterns possibly corresponding to MAs based on a mathematical morphological black top-hat transform; feature extraction, to characterize these candidates; and classification based on a support vector machine (SVM), to validate MAs. The selection of the feature vector and of the SVM kernel function is very important to the algorithm. We use the receiver operating characteristic (ROC) curve to evaluate the discriminating performance of different feature vectors and different SVM kernel functions. The ROC analysis indicates that the quadratic polynomial SVM with a combination of features as the input shows the best discriminating performance.
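
    A minimal sketch of this ROC-based comparison step is shown below: candidate features (synthetic here, not retinal-image features) are classified with SVMs using different kernels, and the ROC curves and AUCs are compared to select the kernel, mirroring the selection strategy described above.

```python
# Sketch: compare SVM kernels by ROC/AUC on synthetic candidate features.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import roc_curve, auc

X, y = make_classification(n_samples=600, n_features=10, n_informative=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for kernel in ("linear", "poly", "rbf"):
    # degree=2 gives the "quadratic polynomial" kernel; it is ignored by the others
    clf = SVC(kernel=kernel, degree=2, probability=True, random_state=0).fit(X_tr, y_tr)
    scores = clf.predict_proba(X_te)[:, 1]
    fpr, tpr, _ = roc_curve(y_te, scores)
    print(kernel, "AUC =", round(auc(fpr, tpr), 3))
```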

  9. Contractibility of curves

    Directory of Open Access Journals (Sweden)

    Janusz Charatonik

    1991-11-01

    Full Text Available Results concerning contractibility of curves (equivalently: of dendroids) are collected and discussed in the paper. Interrelations between various conditions which are either sufficient or necessary for a curve to be contractible are studied.

  10. Roc curves for continuous data

    CERN Document Server

    Krzanowski, Wojtek J

    2009-01-01

    Since ROC curves have become ubiquitous in many application areas, the various advances have been scattered across disparate articles and texts. ROC Curves for Continuous Data is the first book solely devoted to the subject, bringing together all the relevant material to provide a clear understanding of how to analyze ROC curves. The fundamental theory of ROC curves: the book first discusses the relationship between the ROC curve and numerous performance measures and then extends the theory into practice by describing how ROC curves are estimated. Further building on the theory, the authors prese

  11. A void ratio dependent water retention curve model including hydraulic hysteresis

    Directory of Open Access Journals (Sweden)

    Pasha Amin Y.

    2016-01-01

    Full Text Available Past experimental evidence has shown that the Water Retention Curve (WRC) evolves with mechanical stress and structural changes in the soil matrix. Models currently available in the literature for capturing the volume change dependency of WRC are mainly empirical in nature, requiring an extensive experimental programme for parameter identification, which renders them unsuitable for practical applications. In this paper, an analytical model for the evaluation of the void ratio dependency of WRC in deformable porous media is presented. The approach proposed enables quantification of the dependency of WRC on void ratio solely based on the form of WRC at the reference void ratio and requires no additional parameters. The effect of hydraulic hysteresis on the evolution process is also incorporated in the model, an aspect rarely addressed in the literature. Expressions are presented for the evolution of main and scanning curves due to loading and change in the hydraulic path from scanning to main wetting/drying and vice versa, as well as the WRC parameters such as air entry value, air expulsion value, pore size distribution index and slope of the scanning curve. The model is validated using experimental data on compacted and reconstituted soils subjected to various hydro-mechanical paths. Good agreement is obtained between model predictions and experimental data in all the cases considered.

  12. Spiral blood flows in an idealized 180-degree curved artery model

    Science.gov (United States)

    Bulusu, Kartik V.; Kulkarni, Varun; Plesniak, Michael W.

    2017-11-01

    Understanding of cardiovascular flows has been greatly advanced by the Magnetic Resonance Velocimetry (MRV) technique and its potential for three-dimensional velocity encoding in regions of anatomic interest. The MRV experiments were performed on a 180-degree curved artery model using a Newtonian blood analog fluid at the Richard M. Lucas Center at Stanford University employing a 3 Tesla General Electric (Discovery 750 MRI system) whole body scanner with an eight-channel cardiac coil. Analysis in two regions of the model-artery was performed for flow with Womersley number=4.2. In the entrance region (or straight-inlet pipe) the unsteady pressure drop per unit length, in-plane vorticity and wall shear stress for the pulsatile, carotid artery-based flow rate waveform were calculated. Along the 180-degree curved pipe (curvature ratio =1/7) the near-wall vorticity and the stretching of the particle paths in the vorticity field are visualized. The resultant flow behavior in the idealized curved artery model is associated with parameters such as Dean number and Womersley number. Additionally, using length scales corresponding to the axial and secondary flow we attempt to understand the mechanisms leading to the formation of various structures observed during the pulsatile flow cycle. Supported by GW Center for Biomimetics and Bioinspired Engineering (COBRE), MRV measurements in collaboration with Prof. John K. Eaton and, Dr. Chris Elkins at Stanford University.

  13. W-curve alignments for HIV-1 genomic comparisons.

    Directory of Open Access Journals (Sweden)

    Douglas J Cork

    2010-06-01

    Full Text Available The W-curve was originally developed as a graphical visualization technique for viewing DNA and RNA sequences. Its ability to render features of DNA also makes it suitable for computational studies. Its main advantage in this area is utilizing a single-pass algorithm for comparing the sequences. Avoiding recursion during sequence alignments offers advantages for speed and in-process resources. The graphical technique also allows for multiple models of comparison to be used depending on the nucleotide patterns embedded in similar whole genomic sequences. The W-curve approach allows us to compare large numbers of samples quickly. We are currently tuning the algorithm to accommodate quirks specific to HIV-1 genomic sequences so that it can be used to aid in diagnostic and vaccine efforts. Tracking the molecular evolution of the virus has been greatly hampered by gap-associated problems predominantly embedded within the envelope gene of the virus. Gaps and hypermutation of the virus slow conventional string-based alignments of the whole genome. This paper describes the W-curve algorithm itself, and how we have adapted it for comparison of similar HIV-1 genomes. A treebuilding method is developed with the W-curve that utilizes a novel Cylindrical Coordinate distance method and gap analysis method. HIV-1 C2-V5 env sequence regions from a Mother/Infant cohort study are used in the comparison. The output distance matrix and neighbor results produced by the W-curve are functionally equivalent to those from Clustal for C2-V5 sequences in the mother/infant pairs infected with CRF01_AE. Significant potential exists for utilizing this method in place of conventional string-based alignment of HIV-1 genomes, such as Clustal X. With W-curve heuristic alignment, it may be possible to obtain clinically useful results in a short time, short enough to affect clinical choices for acute treatment. A description of the W-curve generation process, including a comparison

  14. W-curve alignments for HIV-1 genomic comparisons.

    Science.gov (United States)

    Cork, Douglas J; Lembark, Steven; Tovanabutra, Sodsai; Robb, Merlin L; Kim, Jerome H

    2010-06-01

    The W-curve was originally developed as a graphical visualization technique for viewing DNA and RNA sequences. Its ability to render features of DNA also makes it suitable for computational studies. Its main advantage in this area is utilizing a single-pass algorithm for comparing the sequences. Avoiding recursion during sequence alignments offers advantages for speed and in-process resources. The graphical technique also allows for multiple models of comparison to be used depending on the nucleotide patterns embedded in similar whole genomic sequences. The W-curve approach allows us to compare large numbers of samples quickly. We are currently tuning the algorithm to accommodate quirks specific to HIV-1 genomic sequences so that it can be used to aid in diagnostic and vaccine efforts. Tracking the molecular evolution of the virus has been greatly hampered by gap-associated problems predominantly embedded within the envelope gene of the virus. Gaps and hypermutation of the virus slow conventional string-based alignments of the whole genome. This paper describes the W-curve algorithm itself, and how we have adapted it for comparison of similar HIV-1 genomes. A treebuilding method is developed with the W-curve that utilizes a novel Cylindrical Coordinate distance method and gap analysis method. HIV-1 C2-V5 env sequence regions from a Mother/Infant cohort study are used in the comparison. The output distance matrix and neighbor results produced by the W-curve are functionally equivalent to those from Clustal for C2-V5 sequences in the mother/infant pairs infected with CRF01_AE. Significant potential exists for utilizing this method in place of conventional string-based alignment of HIV-1 genomes, such as Clustal X. With W-curve heuristic alignment, it may be possible to obtain clinically useful results in a short time, short enough to affect clinical choices for acute treatment. A description of the W-curve generation process, including a comparison technique of

  15. Trends in scale and shape of survival curves.

    Science.gov (United States)

    Weon, Byung Mook; Je, Jung Ho

    2012-01-01

    The ageing of the population is an issue in wealthy countries worldwide because of increasing costs for health care and welfare. Survival curves taken from demographic life tables may help shed light on the hypotheses that humans are living longer and that human populations are growing older. We describe a methodology that enables us to obtain separate measurements of scale and shape variances in survival curves. Specifically, 'living longer' is associated with the scale variance of survival curves, whereas 'growing older' is associated with the shape variance. We show how the scale and shape of survival curves have changed over time during recent decades, based on period and cohort female life tables for selected wealthy countries. Our methodology will be useful for performing better tracking of ageing statistics and it is possible that this methodology can help identify the causes of current trends in human ageing.

  16. Newer developments on self-modeling curve resolution implementing equality and unimodality constraints.

    Science.gov (United States)

    Beyramysoltan, Samira; Abdollahi, Hamid; Rajkó, Róbert

    2014-05-27

    Analytical self-modeling curve resolution (SMCR) methods resolve data sets to a range of feasible solutions using only non-negative constraints. The Lawton-Sylvestre method was the first direct method to analyze a two-component system. It was generalized as a Borgen plot for determining the feasible regions in three-component systems. It seems that a geometrical view is required for considering curve resolution methods, because the complicated (purely algebraic) concepts halted the general study of Borgen's work for 20 years. Rajkó and István revised and elucidated the principles of existing theory in SMCR methods and subsequently introduced computational geometry tools for developing an algorithm to draw Borgen plots in three-component systems. These developments are theoretical inventions and the formulations cannot always be given in closed form or in a regularized formalism, especially for geometric descriptions, which is why several algorithms had to be developed and provided even for the theoretical deductions and determinations. In this study, analytical SMCR methods are revised and described using simple concepts. The details of a drawing algorithm for a developmental type of Borgen plot are given. Additionally, for the first time in the literature, equality and unimodality constraints are successfully implemented in the Lawton-Sylvestre method. To this end, a new state-of-the-art procedure is proposed to impose the equality constraint in Borgen plots. Two- and three-component HPLC-DAD data sets were simulated and analyzed by the new analytical curve resolution methods with and without additional constraints. Detailed descriptions and explanations are given based on the obtained abstract spaces. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Atlas of stress-strain curves

    CERN Document Server

    2002-01-01

    The Atlas of Stress-Strain Curves, Second Edition is substantially bigger in page dimensions, number of pages, and total number of curves than the previous edition. It contains over 1,400 curves, almost three times as many as in the 1987 edition. The curves are normalized in appearance to aid making comparisons among materials. All diagrams include metric (SI) units, and many also include U.S. customary units. All curves are captioned in a consistent format with valuable information including (as available) standard designation, the primary source of the curve, mechanical properties (including hardening exponent and strength coefficient), condition of sample, strain rate, test temperature, and alloy composition. Curve types include monotonic and cyclic stress-strain, isochronous stress-strain, and tangent modulus. Curves are logically arranged and indexed for fast retrieval of information. The book also includes an introduction that provides background information on methods of stress-strain determination, on...

  18. Tornado-Shaped Curves

    Science.gov (United States)

    Martínez, Sol Sáez; de la Rosa, Félix Martínez; Rojas, Sergio

    2017-01-01

    In Advanced Calculus, our students wonder if it is possible to graphically represent a tornado by means of a three-dimensional curve. In this paper, we show it is possible by providing the parametric equations of such tornado-shaped curves.
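
    One simple way to obtain such a curve, sketched below, is a conical spiral whose radius grows with height; the particular parametrization is illustrative only and not necessarily the one derived in the paper.

```python
# Illustrative "tornado-shaped" space curve: a conical spiral widening with height.
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0, 12 * np.pi, 2000)
r = 0.1 + 0.08 * t                       # radius increasing with height
x, y, z = r * np.cos(t), r * np.sin(t), t

ax = plt.figure().add_subplot(projection="3d")
ax.plot(x, y, z)
plt.show()
```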

  19. Generation of large-scale PV scenarios using aggregated power curves

    DEFF Research Database (Denmark)

    Nuño Martinez, Edgar; Cutululis, Nicolaos Antonio

    2017-01-01

    The contribution of solar photovoltaic (PV) power to generation is becoming more relevant in modern power systems. Therefore, there is a need to model the variability of large-scale PV generation accurately. This paper presents a novel methodology to generate regional PV scenarios based on aggregated power curves rather than traditional physical PV conversion models. Our approach is based on hourly mesoscale reanalysis irradiation data and power measurements and does not require additional variables such as ambient temperature or wind speed. It was used to simulate the PV generation on the German system between 2012 and 2015, showing high levels of correlation with actual measurements (93.02–97.60%) and small deviations from the expected capacity factors (0.02–1.80%). Therefore, we are confident about the ability of the proposed model to accurately generate realistic large-scale PV scenarios.

  20. Additive Manufacturing Materials Study for Gaseous Radiation Detection

    Energy Technology Data Exchange (ETDEWEB)

    Steer, C.A.; Durose, A.; Boakes, J. [AWE Aldermaston, Reading, Berkshire, RG7 4PR (United Kingdom)

    2015-07-01

    Additive manufacturing (AM) techniques may lead to improvements in many areas of radiation detector construction; notably the rapid manufacturing time allows for a reduced time between prototype iterations. The additive nature of the technique results in a granular microstructure which may be permeable to ingress by atmospheric gases and make it unsuitable for gaseous radiation detector development. In this study we consider the application of AM to the construction of enclosures and frames for wire-based gaseous radiation tracking detectors. We have focussed on oxygen impurity ingress as a measure of the permeability of the enclosure, and the gas charging and discharging curves of several simplistic enclosure shapes are reported. A prototype wire-frame is also presented to examine structural strength and positional accuracy of an AM produced frame. We lastly discuss the implications of this study for AM based radiation detection technology as a diagnostic tool for incident response scenarios, such as the interrogation of a suspect radiation-emitting package. (authors)

  1. Additive Manufacturing Materials Study for Gaseous Radiation Detection

    International Nuclear Information System (INIS)

    Steer, C.A.; Durose, A.; Boakes, J.

    2015-01-01

    Additive manufacturing (AM) techniques may lead to improvements in many areas of radiation detector construction; notably the rapid manufacturing time allows for a reduced time between prototype iterations. The additive nature of the technique results in a granular microstructure which may be permeable to ingress by atmospheric gases and make it unsuitable for gaseous radiation detector development. In this study we consider the application of AM to the construction of enclosures and frames for wire-based gaseous radiation tracking detectors. We have focussed on oxygen impurity ingress as a measure of the permeability of the enclosure, and the gas charging and discharging curves of several simplistic enclosure shapes are reported. A prototype wire-frame is also presented to examine structural strength and positional accuracy of an AM produced frame. We lastly discuss the implications of this study for AM based radiation detection technology as a diagnostic tool for incident response scenarios, such as the interrogation of a suspect radiation-emitting package. (authors)

  2. tgcd: An R package for analyzing thermoluminescence glow curves

    Directory of Open Access Journals (Sweden)

    Jun Peng

    2016-01-01

    Full Text Available Thermoluminescence (TL) glow curves are widely used in dosimetric studies. Many commercial and freely distributed programs are used to deconvolute TL glow curves. This study introduces an open-source R package, tgcd, to conduct TL glow curve analysis, such as kinetic parameter estimation, glow peak simulation, and peak shape analysis. TL glow curves can be deconvoluted according to the general-order empirical expression or the semi-analytical expression derived from the one trap-one recombination center (OTOR) model based on the Lambert W function, using a modified Levenberg–Marquardt algorithm in which any of the parameters can be constrained or fixed. The package provides an interactive environment to initialize parameters and offers an automated "trial-and-error" protocol to obtain optimal fit results. First-order, second-order, and general-order glow peaks (curves) are simulated according to a number of simple kinetic models. The package was developed using a combination of Fortran and R programming languages to improve efficiency and flexibility.

  3. Curve fitting for RHB Islamic Bank annual net profit

    Science.gov (United States)

    Nadarajan, Dineswary; Noor, Noor Fadiya Mohd

    2015-05-01

    The RHB Islamic Bank net profit data are obtained for 2004 to 2012. Curve fitting is done by treating the data either as exact values or as experimental values requiring smoothing. Higher-order Lagrange polynomials and cubic splines are constructed with a curve fitting procedure using Maple software. A normality test is performed to check data adequacy. Regression analysis with curve estimation is conducted in the SPSS environment. All eleven models are found to be acceptable at the 10% significance level of ANOVA. The residual error and absolute relative true error are calculated and compared. The optimal model, based on the minimum average error, is proposed.
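
    A minimal sketch of the spline portion of such a workflow is given below, using SciPy's cubic spline in place of Maple; the yearly figures are placeholders, not RHB data, and one year is held out simply to illustrate the absolute relative true error named in the record.

```python
# Sketch of cubic-spline curve fitting for a yearly series (placeholder values).
import numpy as np
from scipy.interpolate import CubicSpline

years = np.arange(2004, 2013)
profit = np.array([35.0, 42.0, 50.0, 61.0, 55.0, 58.0, 70.0, 83.0, 96.0])  # placeholder values

# leave 2008 out, fit a cubic spline to the rest, then check the error at 2008
mask = years != 2008
cs = CubicSpline(years[mask], profit[mask])
pred = float(cs(2008))
true = profit[years == 2008][0]
abs_rel_true_error = abs(pred - true) / true
print(f"predicted {pred:.1f} vs actual {true:.1f}; abs. relative true error = {abs_rel_true_error:.3f}")
```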

  4. Shape optimization of self-avoiding curves

    Science.gov (United States)

    Walker, Shawn W.

    2016-04-01

    This paper presents a softened notion of proximity (or self-avoidance) for curves. We then derive a sensitivity result, based on shape differential calculus, for the proximity. This is combined with a gradient-based optimization approach to compute three-dimensional, parameterized curves that minimize the sum of an elastic (bending) energy and a proximity energy that maintains self-avoidance by a penalization technique. Minimizers are computed by a sequential-quadratic-programming (SQP) method where the bending energy and proximity energy are approximated by a finite element method. We then apply this method to two problems. First, we simulate adsorbed polymer strands that are constrained to be bound to a surface and be (locally) inextensible. This is a basic model of semi-flexible polymers adsorbed onto a surface (a current topic in material science). Several examples of minimizing curve shapes on a variety of surfaces are shown. An advantage of the method is that it can be much faster than using molecular dynamics for simulating polymer strands on surfaces. Second, we apply our proximity penalization to the computation of ideal knots. We present a heuristic scheme, utilizing the SQP method above, for minimizing rope-length and apply it in the case of the trefoil knot. Applications of this method could be for generating good initial guesses to a more accurate (but expensive) knot-tightening algorithm.

  5. An extended L-curve method for choosing a regularization parameter in electrical resistance tomography

    International Nuclear Information System (INIS)

    Xu, Yanbin; Pei, Yang; Dong, Feng

    2016-01-01

    The L-curve method is a popular regularization parameter choice method for the ill-posed inverse problem of electrical resistance tomography (ERT). However, the method cannot always determine a proper parameter for all situations. An investigation into situations where the L-curve method fails shows that a new corner point appears on the L-curve, and that the parameter corresponding to the new corner point yields a satisfactory reconstructed solution. Thus an extended L-curve method, which determines the regularization parameter associated with either the global corner or the new corner, is proposed. Furthermore, two strategies are provided to determine the new corner: one is based on the second-order differential of the L-curve, and the other on the curvature of the L-curve. The proposed method is examined by both numerical simulations and experimental tests, and the results indicate that the extended method can handle the parameter choice problem even in cases where the typical L-curve method fails. Finally, in order to reduce the running time of the method, the extended method is combined with a projection method based on the Krylov subspace, which was able to boost the extended L-curve method. The results verify that the speed of the extended L-curve method is distinctly improved. The proposed method extends the application of the L-curve in the field of choosing the regularization parameter with an acceptable running time and can also be used in other kinds of tomography. (paper)
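
    The generic curvature-based corner selection that the second strategy builds on can be sketched as follows for a small Tikhonov problem; the ERT forward model and the search for the additional corner described in the record are not reproduced here.

```python
# Generic L-curve corner selection by maximum curvature for a Tikhonov problem.
import numpy as np

rng = np.random.default_rng(0)
n = 60
A = np.vander(np.linspace(0, 1, n), n, increasing=True)    # ill-conditioned matrix
x_true = np.sin(np.linspace(0, 3, n))
b = A @ x_true + 1e-4 * rng.normal(size=n)

lams = np.logspace(-12, 0, 60)
rho, eta = [], []                                           # log residual and solution norms
for lam in lams:
    x = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
    rho.append(np.log(np.linalg.norm(A @ x - b)))
    eta.append(np.log(np.linalg.norm(x)))
rho, eta = np.array(rho), np.array(eta)

# curvature of the (log-log) L-curve via finite differences over log(lambda)
t = np.log(lams)
dr, de = np.gradient(rho, t), np.gradient(eta, t)
d2r, d2e = np.gradient(dr, t), np.gradient(de, t)
kappa = (dr * d2e - de * d2r) / (dr**2 + de**2) ** 1.5
print("lambda at maximum curvature:", lams[np.argmax(kappa)])
```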

  6. Curvature Entropy for Curved Profile Generation

    Directory of Open Access Journals (Sweden)

    Koichiro Sato

    2012-03-01

    Full Text Available In a curved surface design, the overall shape features that emerge from combinations of shape elements are important. However, controlling the features of the overall shape in curved profiles is difficult using conventional microscopic shape information such as dimension. Herein two types of macroscopic shape information, curvature entropy and quadrature curvature entropy, quantitatively represent the features of the overall shape. The curvature entropy is calculated by the curvature distribution, and represents the complexity of a shape (one of the overall shape features. The quadrature curvature entropy is an improvement of the curvature entropy by introducing a Markov process to evaluate the continuity of a curvature and to approximate human cognition of the shape. Additionally, a shape generation method using a genetic algorithm as a calculator and the entropy as a shape generation index is presented. Finally, the applicability of the proposed method is demonstrated using the side view of an automobile as a design example.
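
    A hedged sketch of the plain curvature entropy is given below (the quadrature, Markov-chain variant is not reproduced): estimate the curvature along a sampled profile, histogram it, and take the Shannon entropy of the resulting distribution. The profiles and bin count are illustrative choices, not the paper's settings.

```python
# Sketch: Shannon entropy of the curvature distribution of a sampled planar profile.
import numpy as np

def curvature(x, y):
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    return (dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5

def curvature_entropy(x, y, bins=32):
    k = curvature(x, y)
    counts, _ = np.histogram(k, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

t = np.linspace(0, 2 * np.pi, 500)
simple = (np.cos(t), np.sin(t))                          # constant curvature -> low entropy
wavy = (t, 0.3 * np.sin(5 * t) + 0.1 * np.sin(13 * t))   # varied curvature -> higher entropy
print(curvature_entropy(*simple), curvature_entropy(*wavy))
```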

  7. The Predictive Value of Ultrasound Learning Curves Across Simulated and Clinical Settings

    DEFF Research Database (Denmark)

    Madsen, Mette E; Nørgaard, Lone N; Tabor, Ann

    2017-01-01

    OBJECTIVES: The aim of the study was to explore whether learning curves on a virtual-reality (VR) sonographic simulator can be used to predict subsequent learning curves on a physical mannequin and learning curves during clinical training. METHODS: Twenty midwives completed a simulation-based training program in transvaginal sonography. The training was conducted on a VR simulator as well as on a physical mannequin. A subgroup of 6 participants underwent subsequent clinical training. During each of the 3 steps, the participants' performance was assessed using instruments with established validity in the respective settings. RESULTS: A good correlation was found between the time needed to achieve predefined performance levels on the VR simulator and on the physical mannequin (Pearson correlation coefficient .78). Performance on the VR simulator correlated well with the clinical performance scores (Pearson

  8. Developing Turbulent Flow in Strongly Curved Passages of Square and Circular Cross-Section

    Science.gov (United States)

    1984-03-01

    laser-velocimetry study known to us for developing turbulent flow in curved pipes, Enayet et al. [11] investigated the motion in a 90° bend with Rc... flows are very similar, the Dean number De = Re (D/Rc)^(1/2) being 6.8 x 10^4 in Rowe's bend and 2.6 x 10^4 in the bend of Enayet et al.; the difference in the maximum... a curved duct of square cross section. In addition to the data taken at three longitudinal stations in the curved pipe (θ = 30°, 60° and 90°), Enayet

  9. In-Vehicle Dynamic Curve-Speed Warnings at High-Risk Rural Curves

    Science.gov (United States)

    2018-03-01

    Lane-departure crashes at horizontal curves represent a significant portion of fatal crashes on rural Minnesota roads. Because of this, solutions are needed to aid drivers in identifying upcoming curves and inform them of a safe speed at which they s...

  10. Buckling Capacity Curves for Steel Spherical Shells Loaded by the External Pressure

    Science.gov (United States)

    Błażejewski, Paweł; Marcinowski, Jakub

    2015-03-01

    Assessment of the buckling resistance of a pressurised spherical cap is not an easy task. There exist two different approaches which allow this goal to be achieved. The first approach involves performing advanced numerical analyses in which material and geometrical nonlinearities are taken into account, as well as considering the worst imperfections of a defined amplitude. This kind of analysis is customarily called GMNIA and is carried out by means of computer software based on FEM. The other, comparatively easier approach relies on the utilisation of previously prepared procedures which enable determination of the critical resistance pRcr, the plastic resistance pRpl and the buckling parameters α, β, η, λ0 needed for the definition of the standard buckling resistance curve. The determination of the buckling capacity curve for the particular class of spherical caps is the principal goal of this work. The methods of determination of the critical pressure and the plastic resistance were described by the authors in [1], whereas the worst imperfection mode for the considered class of spherical shells was found in [2]. The determination of the buckling parameters defining the buckling capacity curve for the whole class of shells is a more complicated task. For this reason the authors focused their attention on spherical steel caps with a radius-to-thickness ratio of R/t = 500, a semi-angle φ = 30° and the boundary condition BC2 (the clamped supporting edge). Taking into account all imperfection forms considered in [2] and different amplitudes expressed as multiples of the shell thickness, sets of buckling parameters defining the capacity curve were determined. These parameters were determined by the methods proposed by Rotter in [3] and [4], where the method of determination of the exponent η by means of an additional parameter k was presented. As a result of the performed analyses, the standard capacity curves for all considered imperfection modes and amplitudes 0.5t, 1.0t, 1.5t

  11. Learning curve for laparoendoscopic single-site surgery for an experienced laparoscopic surgeon

    OpenAIRE

    Pao-Ling Torng; Kuan-Hung Lin; Jing-Shiang Hwang; Hui-Shan Liu; I-Hui Chen; Chi-Ling Chen; Su-Cheng Huang

    2013-01-01

    Objectives: To assess the learning curve and safety of laparoendoscopic single-site (LESS) surgery in gynecological operations. Materials and methods: Sixty-three women who underwent LESS surgery by a single experienced laparoscopic surgeon from February 2011 to August 2011 were included. Commercialized single-incision laparoscopic surgery homemade ports were used, along with conventional straight instruments. The learning curve has been defined as the additional surgical time with respect ...

  12. A semiparametric separation curve approach for comparing correlated ROC data from multiple markers

    Science.gov (United States)

    Tang, Liansheng Larry; Zhou, Xiao-Hua

    2012-01-01

    In this article we propose a separation curve method to identify the range of false positive rates for which two ROC curves differ or one ROC curve is superior to the other. Our method is based on a general multivariate ROC curve model, including interaction terms between discrete covariates and false positive rates. It is applicable with most existing ROC curve models. Furthermore, we introduce a semiparametric least squares ROC estimator and apply the estimator to the separation curve method. We derive a sandwich estimator for the covariance matrix of the semiparametric estimator. We illustrate the application of our separation curve method through two real life examples. PMID:23074360

  13. Remote sensing used for power curves

    DEFF Research Database (Denmark)

    Wagner, Rozenn; Ejsing Jørgensen, Hans; Schmidt Paulsen, Uwe

    2008-01-01

    Power curve measurement for large wind turbines requires taking into account more parameters than only the wind speed at hub height. Based on results from aerodynamic simulations, an equivalent wind speed taking the wind shear into account was defined and found to reduce the power standard deviation.
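
    One common way to define such a shear-corrected equivalent wind speed, sketched below as an assumption rather than the exact definition used in the record, is a cube-weighted average of the speeds measured at several heights, weighted by the rotor-area segment each height represents.

```python
# Hedged sketch of a shear-corrected equivalent wind speed:
# cube-weighted average over rotor-area segments (not the record's exact definition).
import numpy as np

def equivalent_wind_speed(u, A):
    """u: wind speeds at measurement heights; A: swept-area segment for each height."""
    w = np.asarray(A) / np.sum(A)
    return (np.sum(w * np.asarray(u) ** 3)) ** (1.0 / 3.0)

print(equivalent_wind_speed(u=[7.2, 8.0, 8.6], A=[1200.0, 2600.0, 1200.0]))
```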

  14. Application of NUREG/CR-5999 interim fatigue curves to selected nuclear power plant components

    International Nuclear Information System (INIS)

    Ware, A.G.; Morton, D.K.; Nitzel, M.E.

    1995-03-01

    Recent test data indicate that the effects of the light water reactor (LWR) environment could significantly reduce the fatigue resistance of materials used in the reactor coolant pressure boundary components of operating nuclear power plants. Argonne National Laboratory has developed interim fatigue curves based on test data simulating LWR conditions, and published them in NUREG/CR-5999. In order to assess the significance of these interim fatigue curves, fatigue evaluations of a sample of the components in the reactor coolant pressure boundary of LWRs were performed. The sample consists of components from facilities designed by each of the four U.S. nuclear steam supply system vendors. For each facility, six locations were studied, including two locations on the reactor pressure vessel. In addition, there are older vintage plants where components of the reactor coolant pressure boundary were designed to codes that did not require an explicit fatigue analysis of the components. In order to assess the fatigue resistance of the older vintage plants, an evaluation was also conducted on selected components of three of these plants. This report discusses the insights gained from the application of the interim fatigue curves to components of seven operating nuclear power plants

  15. Trend analyses with river sediment rating curves

    Science.gov (United States)

    Warrick, Jonathan A.

    2015-01-01

    Sediment rating curves, which are fitted relationships between river discharge (Q) and suspended-sediment concentration (C), are commonly used to assess patterns and trends in river water quality. In many of these studies it is assumed that rating curves have a power-law form (i.e., C = aQ^b, where a and b are fitted parameters). Two fundamental questions about the utility of these techniques are assessed in this paper: (i) How well do the parameters, a and b, characterize trends in the data? (ii) Are trends in rating curves diagnostic of changes to river water or sediment discharge? As noted in previous research, the offset parameter, a, is not an independent variable for most rivers, but rather strongly dependent on b and Q. Here it is shown that a is a poor metric for trends in the vertical offset of a rating curve, and a new parameter, â, as determined by the discharge-normalized power function [C = â(Q/Q_GM)^b], where Q_GM is the geometric mean of the Q values sampled, provides a better characterization of trends. However, these techniques must be applied carefully, because curvature in the relationship between log(Q) and log(C), which exists for many rivers, can produce false trends in â and b. Also, it is shown that trends in â and b are not uniquely diagnostic of river water or sediment supply conditions. For example, an increase in â can be caused by an increase in sediment supply, a decrease in water supply, or a combination of these conditions. Large changes in water and sediment supplies can occur without any change in the parameters, â and b. Thus, trend analyses using sediment rating curves must include additional assessments of the time-dependent rates and trends of river water, sediment concentrations, and sediment discharge.
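
    A minimal sketch of fitting the discharge-normalized rating curve C = â(Q/Q_GM)^b described above, by ordinary linear regression in log space on synthetic data:

```python
# Fit C = a_hat * (Q/Q_GM)**b by linear regression in log space (synthetic data).
import numpy as np

rng = np.random.default_rng(2)
Q = np.exp(rng.normal(3.0, 1.0, 300))                  # synthetic discharge
C = 0.5 * Q ** 1.4 * np.exp(rng.normal(0, 0.3, 300))   # synthetic concentration

Q_gm = np.exp(np.mean(np.log(Q)))                      # geometric mean of the sampled Q
b, log_a_hat = np.polyfit(np.log(Q / Q_gm), np.log(C), 1)
a_hat = np.exp(log_a_hat)
print(f"b = {b:.2f}, a_hat = {a_hat:.2f} (concentration at the geometric-mean discharge)")
```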

  16. Graphical evaluation of complexometric titration curves.

    Science.gov (United States)

    Guinon, J L

    1985-04-01

    A graphical method, based on logarithmic concentration diagrams, for construction, without any calculations, of complexometric titration curves is examined. The titration curves obtained for different kinds of unidentate, bidentate and quadridentate ligands clearly show why only chelating ligands are usually used in titrimetric analysis. The method has also been applied to two practical cases where unidentate ligands are used: (a) the complexometric determination of mercury(II) with halides and (b) the determination of cyanide with silver, which involves both a complexation and a precipitation system; for this purpose construction of the diagrams for the HgCl2/HgCl+/Hg2+ and Ag(CN)2-/AgCN/CN- systems is considered in detail.

  17. Broadband Silicon-On-Insulator directional couplers using a combination of straight and curved waveguide sections.

    Science.gov (United States)

    Chen, George F R; Ong, Jun Rong; Ang, Thomas Y L; Lim, Soon Thor; Png, Ching Eng; Tan, Dawn T H

    2017-08-03

    Broadband Silicon-On-Insulator (SOI) directional couplers are designed based on a combination of curved and straight coupled waveguide sections. A design methodology based on the transfer matrix method (TMM) is used to determine the required coupler section lengths, radii, and waveguide cross-sections. A 50/50 power splitter with a measured bandwidth of 88 nm is designed and fabricated, with a device footprint of 20 μm × 3 μm. In addition, a balanced Mach-Zehnder interferometer is fabricated showing an extinction ratio of >16 dB over 100 nm of bandwidth.

  18. Dual kinetic curves in reversible electrochemical systems.

    Directory of Open Access Journals (Sweden)

    Michael J Hankins

    Full Text Available We introduce dual kinetic chronoamperometry, in which reciprocal relations are established between the kinetic curves of electrochemical reactions that start from symmetrical initial conditions. We have performed numerical and experimental studies in which the kinetic curves of the electron-transfer processes are analyzed for a reversible first-order reaction. Experimental tests were done with the ferrocyanide/ferricyanide system, in which the concentrations of each component could be measured separately using the platinum disk/gold ring electrode. It is shown that the proper ratio of the transient kinetic curves obtained from the cathodic and anodic mass-transfer-limited regions gives thermodynamic time invariances related to the reaction quotient of the bulk concentrations. Therefore, thermodynamic time invariances can be observed at any time using the dual kinetic curves for reversible reactions. The technique provides a unique possibility to extract the non-steady state trajectory starting from one initial condition based only on the equilibrium constant and the trajectory which starts from the symmetrical initial condition. The results could impact battery technology by predicting the concentrations and currents of the underlying non-steady state processes in a wide domain from thermodynamic principles and limited kinetic information.

  19. Determination of performance degradation of a marine diesel engine by using curve based approach

    International Nuclear Information System (INIS)

    Kökkülünk, Görkem; Parlak, Adnan; Erdem, Hasan Hüseyin

    2016-01-01

    Highlights: • A mathematical model was developed for a marine diesel engine. • Measurements were taken from the main engine of M/V Ince Inebolu. • The model was validated for the marine diesel engine. • A curve based method was applied to evaluate the performance. • Degradation values of a marine diesel engine were found for power and SFC. - Abstract: Nowadays, energy efficiency measures on ships are a top priority for the maritime sector. A key requirement for improving energy efficiency is a useful tool with which to assess it. There are two steps to improving energy efficiency on ships: measurement and evaluation of the performance of the main fuel consumers. Performance evaluation is the method that evaluates how much the performance changes owing to engine component degradation, which reduces performance through wear, fouling, mechanical problems, etc. In this study, a zero-dimensional, two-zone combustion model is developed and validated for a two-stroke marine diesel engine (MITSUI MAN B&W 6S50MC). The measurements were taken from a real ship, M/V Ince Inebolu, by the research team during normal operation of the main engine in the region of the Marmara Sea. To evaluate the performance, the "curve based method" is used to calculate the total performance degradation. This total degradation is then attributed to the parameters of compression pressure, injection timing, injection pressure, scavenge air temperature and scavenge air pressure by means of the developed mathematical model. In conclusion, the total degradation of the studied ship is found to be 620 kW in power and 26.74 g/kW h in specific fuel consumption.

  20. Wheelset curving guidance using H∞ control

    Science.gov (United States)

    Qazizadeh, Alireza; Stichel, Sebastian; Feyzmahdavian, Hamid Reza

    2018-03-01

    This study shows how to design an active suspension system for guidance of a rail vehicle wheelset in curves. The main focus of the study is on designing the controller and afterwards studying its effect on the wheel wear behaviour. The controller is designed based on the closed-loop transfer function shaping method and the H∞ control strategy. The study discusses the design of the controller for both nominal and uncertain plants and considers both stability and performance. The controllers designed in Simulink are then applied to the vehicle model in Simpack to study the wheel wear behaviour in curves. The vehicle type selected for this study is a two-axle rail vehicle. This is because this type of vehicle is known to have very poor curving performance and high wheel wear. On the other hand, the relatively simple structure of this type of vehicle compared to bogie vehicles makes it a more economic choice. Hence, equipping this type of vehicle with active wheelset steering is believed to offer a high enough benefit-to-cost ratio to remain attractive to rail vehicle manufacturers and operators.

  1. Seismic fragility curves of bridge piers accounting for ground motions in Korea

    Science.gov (United States)

    Nguyen, Duy-Duan; Lee, Tae-Hyung

    2018-04-01

    Korea is located in a slight-to-moderate seismic zone. Nevertheless, several studies have pointed out that the peak earthquake magnitude in the region can reach approximately 6.5. Accordingly, a seismic vulnerability evaluation of the existing structures accounting for ground motions in Korea is of great importance. The purpose of this paper is to develop seismic fragility curves for bridge piers of a steel box girder bridge equipped with and without base isolators based on a set of ground motions recorded in Korea. A finite element simulation platform, OpenSees, is utilized to perform nonlinear time history analyses of the bridges. A series of damage states is defined based on a damage index which is expressed in terms of the column displacement ductility ratio. The fragility curves based on Korean motions were thereafter compared with the fragility curves generated using worldwide earthquakes to assess the effect of the two ground motion groups on the seismic fragility curves of the bridge piers. The results reveal that both non- and base-isolated bridge piers are less vulnerable during the Korean ground motions than under worldwide earthquakes.
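
    Fragility curves of this kind are usually expressed as lognormal functions of the intensity measure; the sketch below fits such a curve to synthetic exceedance fractions. The OpenSees time-history analyses and the ductility-based damage index of the record are not reproduced, and all numbers are placeholders.

```python
# Sketch: lognormal fragility curve P(DS exceeded | IM = x) = Phi(ln(x/theta)/beta),
# fitted to synthetic exceedance fractions (not the record's analysis results).
import numpy as np
from scipy.stats import norm
from scipy.optimize import curve_fit

def fragility(im, theta, beta):
    return norm.cdf(np.log(im / theta) / beta)

im_levels = np.linspace(0.05, 1.0, 20)                          # e.g. PGA in g
p_obs = fragility(im_levels, 0.35, 0.5) \
        + np.random.default_rng(3).normal(0, 0.03, im_levels.size)
p_obs = np.clip(p_obs, 0, 1)                                    # synthetic exceedance fractions

(theta, beta), _ = curve_fit(fragility, im_levels, p_obs, p0=[0.3, 0.6])
print(f"median capacity theta = {theta:.2f} g, dispersion beta = {beta:.2f}")
```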

  2. The curve shortening problem

    CERN Document Server

    Chou, Kai-Seng

    2001-01-01

    Although research on the curve shortening flow has been very active for nearly 20 years, the results of those efforts have remained scattered throughout the literature. For the first time, The Curve Shortening Problem collects and illuminates those results in a comprehensive, rigorous, and self-contained account of the fundamental results. The authors present a complete treatment of the Gage-Hamilton theorem, a clear, detailed exposition of Grayson's convexity theorem, a systematic discussion of invariant solutions, applications to the existence of simple closed geodesics on a surface, and a new, almost convexity theorem for the generalized curve shortening problem. Many questions regarding curve shortening remain outstanding. With its careful exposition and complete guide to the literature, The Curve Shortening Problem provides not only an outstanding starting point for graduate students and new investigators, but also a superb reference that presents intriguing new results for those already active in the field.

  3. Flow of viscous fluid along an exponentially stretching curved surface

    Directory of Open Access Journals (Sweden)

    N.F. Okechi

    Full Text Available In this paper, we present the boundary layer analysis of flow induced by a rapidly stretching curved surface with exponential velocity. The governing boundary value problem is reduced into self-similar form using a new similarity transformation. The resulting equations are solved numerically using shooting and Runge-Kutta methods. The numerical results show that the fluid velocity as well as the skin friction coefficient increases with the surface curvature; a similar trend is also observed for the pressure. The dimensionless wall shear stress defined for this problem is greater than that of a linearly stretching curved surface, but becomes comparably less for a surface stretching with a power-law velocity. In addition, the result for the plane surface is a special case of this study when the radius of curvature of the surface is sufficiently large. The numerical investigations presented in terms of the graphs are interpreted with the help of the underlying physics of the fluid flow and the consequences arising from the curved geometry. Keywords: Boundary layer flow, Curved surface, Exponential stretching, Curvature

  4. Four-dimensional hilbert curves for R-trees

    DEFF Research Database (Denmark)

    Haverkort, Herman; Walderveen, Freek van

    2011-01-01

    Two-dimensional R-trees are a class of spatial index structures in which objects are arranged to enable fast window queries: report all objects that intersect a given query window. One of the most successful methods of arranging the objects in the index structure is based on sorting the objects...... according to the positions of their centers along a two-dimensional Hilbert space-filling curve. Alternatively, one may use the coordinates of the objects' bounding boxes to represent each object by a four-dimensional point, and sort these points along a four-dimensional Hilbert-type curve. In experiments...

  5. The antiproton depth–dose curve in water

    CERN Document Server

    Bassler, N; Jäkel, O; Knudsen, H V; Kovacevic, S

    2008-01-01

    We have measured the depth–dose curve of 126 MeV antiprotons in a water phantom using ionization chambers. Since the antiproton beam provided by CERN has a pulsed structure and possibly carries a high-LET component from the antiproton annihilation, it is necessary to correct the acquired charge for ion recombination effects. The results are compared with Monte Carlo calculations and were found to be in good agreement. Based on this agreement we calculate the depth–dose curve for antiprotons and compare it with that for protons, finding a doubling of the physical dose in the peak region for antiprotons.

  6. Folding of non-Euclidean curved shells

    Science.gov (United States)

    Bende, Nakul; Evans, Arthur; Innes-Gold, Sarah; Marin, Luis; Cohen, Itai; Santangelo, Christian; Hayward, Ryan

    2015-03-01

    Origami-based folding of 2D sheets has been of recent interest for a variety of applications ranging from deployable structures to self-folding robots. Though folding of planar sheets follows well-established principles, folding of curved shells involves an added level of complexity due to the inherent influence of curvature on mechanics. In this study, we use principles from differential geometry and thin shell mechanics to establish fundamental rules that govern folding of prototypical creased shells. In particular, we show how the normal curvature of a crease line controls whether the deformation is smooth or discontinuous, and investigate the influence of shell thickness and boundary conditions. We show that snap-folding of shells provides a route to rapid actuation on time-scales dictated by the speed of sound. The simple geometric design principles developed can be applied at any length-scale, offering potential for bio-inspired soft actuators for tunable optics, microfluidics, and robotics. This work was funded by the National Science Foundation through EFRI ODISSEI-1240441 with additional support to S.I.-G. through the UMass MRSEC DMR-0820506 REU program.

  7. Stereoscopic visualization in curved spacetime: seeing deep inside a black hole

    International Nuclear Information System (INIS)

    Hamilton, Andrew J S; Polhemus, Gavin

    2010-01-01

    Stereoscopic visualization adds an additional dimension to the viewer's experience, giving them a sense of distance. In a general relativistic visualization, distance can be measured in a variety of ways. We argue that the affine distance, which matches the usual notion of distance in flat spacetime, is a natural distance to use in curved spacetime. As an example, we apply affine distance to the visualization of the interior of a black hole. Affine distance is not the distance perceived with normal binocular vision in curved spacetime. However, the failure of binocular vision is simply a limitation of animals that have evolved in flat spacetime, not a fundamental obstacle to depth perception in curved spacetime. Trinocular vision would provide superior depth perception.

  8. Master curve based correlation between static initiation toughness KIC and crack arrest toughness KIa

    International Nuclear Information System (INIS)

    Wallin, K.; Rintamaa, R.

    1999-01-01

    Historically, the ASME reference curve concept assumes a constant relation between static initiation fracture toughness and crack arrest toughness. In reality, this is not the case. Experimental results show that the difference between K IC and K Ia is material specific. For some materials there is a big difference, while for others they nearly coincide. So far, however, no systematic study regarding a possible correlation between the two parameters has been performed. The recent Master curve method, developed for estimating brittle fracture initiation, has enabled a consistent analysis of fracture initiation toughness data. The Master curve method has been modified to also describe crack arrest toughness. Here, this modified 'crack arrest master curve' is further validated and used to develop a simple, yet (for safety assessment purposes) adequately accurate correlation between the two fracture toughness parameters. The correlation enables the estimation of crack arrest toughness from small Charpy-sized static fracture toughness tests. The correlation is valid for low-nickel steels (≤ 1.2% Ni). If a more accurate description of the crack arrest toughness is required, it can either be measured experimentally or estimated from instrumented Charpy-V crack arrest load information. (orig.)
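
    For orientation, the Master curve referred to above describes the median fracture toughness of ferritic steels as a fixed-shape function of temperature relative to a material-specific reference temperature T0. The sketch below evaluates the standard ASTM E1921 form of that curve; it is only illustrative context, not the paper's crack-arrest modification or the KIC-KIa correlation, and the T0 value is a made-up example.

```python
import numpy as np

def master_curve_median(T, T0):
    """Median fracture toughness (MPa*sqrt(m)) of the standard Master curve
    (ASTM E1921 form) at temperature T for reference temperature T0, both in deg C."""
    return 30.0 + 70.0 * np.exp(0.019 * (T - T0))

# Hypothetical steel with T0 = -60 deg C, evaluated over a temperature sweep
T = np.linspace(-150.0, 0.0, 7)
print(np.round(master_curve_median(T, T0=-60.0), 1))
```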

  9. Additives for cement compositions based on modified peat

    Energy Technology Data Exchange (ETDEWEB)

    Kopanitsa, Natalya, E-mail: kopanitsa@mail.ru; Sarkisov, Yurij, E-mail: sarkisov@tsuab.ru; Gorshkova, Aleksandra, E-mail: kasatkina.alexandra@gmail.com; Demyanenko, Olga, E-mail: angel-n@sibmail.com [Tomsk State University of Architecture and Building, 2, Solyanaya sq., Tomsk, 634003 (Russian Federation)

    2016-01-15

    High quality, competitive dry building mixes require modifying additives for various purposes to be included in their composition. There is an insufficient amount of quality additives with stable properties for controlling the properties of cement compositions produced in Russia. The use of foreign modifying additives significantly increases the final cost of the product. The cost of imported modifiers in dry building mixes can reach up to 90% of the material cost, depending on the complexity of the composition. Thus, the problem of import substitution has become relevant, especially in recent years, due to the difficult economic situation. The article discusses the possibility of using local raw materials as a basis for obtaining dry building mixture components. The properties of organo-mineral additives for cement compositions based on thermally modified peat raw materials are studied. Studies of the structure and composition of the additives are carried out by physicochemical research methods: electron microscopy and X-ray analysis. The results of experimental research showed that the peat additives contribute to improving the strength and hydrophysical properties of cement-sand mortar.

  10. Real-time interferometric monitoring and measuring of photopolymerization based stereolithographic additive manufacturing process: sensor model and algorithm

    International Nuclear Information System (INIS)

    Zhao, X; Rosen, D W

    2017-01-01

    As additive manufacturing is poised for growth and innovation, it faces the barrier of a lack of in-process metrology and control, which keeps it from advancing into wider industry applications. The exposure controlled projection lithography (ECPL) process is a layerless mask-projection stereolithographic additive manufacturing process in which parts are fabricated from photopolymers on a stationary transparent substrate. To improve the process accuracy with closed-loop control for ECPL, this paper develops an interferometric curing monitoring and measuring (ICM and M) method which addresses the sensor modeling and algorithm issues. A physical sensor model for ICM and M is derived based on interference optics utilizing the concept of instantaneous frequency. The associated calibration procedure is outlined to ensure ICM and M measurement accuracy. To solve the sensor model, particularly in real time, an online evolutionary parameter estimation algorithm is developed adopting moving horizon exponentially weighted Fourier curve fitting and numerical integration. As a preliminary validation, simulated real-time measurement by offline analysis of a video of interferograms acquired in the ECPL process is presented. The agreement between the cured height estimated by ICM and M and that measured by microscope indicates that the measurement principle is promising as real-time metrology for global measurement and control of the ECPL process. (paper)
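
    As a rough illustration of the kind of estimation involved (this is not the paper's ICM and M algorithm), the sketch below fits a sinusoidal interference model to intensity samples with exponential forgetting, so that the most recent samples dominate, and recovers the fringe phase; the model form, the forgetting time constant and all numerical values are assumptions made only for the example.

```python
import numpy as np

def weighted_phase_fit(t, intensity, omega, tau):
    """Least-squares fit of I(t) ~ a0 + A*cos(omega*t + phi), weighting recent
    samples more heavily (exponential forgetting with time constant tau).
    Returns the estimated phase phi."""
    w = np.exp((t - t[-1]) / tau)                  # newest samples get weight ~1
    X = np.column_stack([np.ones_like(t), np.cos(omega * t), np.sin(omega * t)])
    a, *_ = np.linalg.lstsq(X * w[:, None], intensity * w, rcond=None)
    # a1*cos(wt) + a2*sin(wt) = A*cos(wt + phi) with phi = atan2(-a2, a1)
    return np.arctan2(-a[2], a[1])

# Synthetic interferogram intensity with noise, true phase 0.8 rad
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
I = 1.0 + 0.5 * np.cos(2 * np.pi * 5 * t + 0.8) + 0.02 * rng.standard_normal(t.size)
print(round(weighted_phase_fit(t, I, omega=2 * np.pi * 5, tau=0.3), 3))  # ~0.8
```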

  11. Beyond the SCS curve number: A new stochastic spatial runoff approach

    Science.gov (United States)

    Bartlett, M. S., Jr.; Parolari, A.; McDonnell, J.; Porporato, A. M.

    2015-12-01

    The Soil Conservation Service curve number (SCS-CN) method is the standard approach in practice for predicting a storm event runoff response. It is popular because of its low parametric complexity and ease of use. However, the SCS-CN method does not describe the spatial variability of runoff and is restricted to certain geographic regions and land use types. Here we present a general theory for extending the SCS-CN method. Our new theory accommodates different event-based models derived from alternative rainfall-runoff mechanisms or distributions of watershed variables, which are the basis of different semi-distributed models such as VIC, PDM, and TOPMODEL. We introduce a parsimonious but flexible description where runoff is initiated by a pure threshold, i.e., saturation excess, that is complemented by fill-and-spill runoff behavior from areas of partial saturation. To facilitate event-based runoff prediction, we derive simple equations for the fraction of runoff source areas, the probability density function (PDF) describing runoff variability, and the corresponding average runoff value (a runoff curve analogous to the SCS-CN). The benefit of the theory is that it unites the SCS-CN method, VIC, PDM, and TOPMODEL as the same model type but with different assumptions for the spatial distribution of variables and the runoff mechanism. The new multiple-runoff-mechanism description for the SCS-CN enables runoff prediction in geographic regions and site runoff types previously misrepresented by the traditional SCS-CN method. In addition, we show that the VIC, PDM, and TOPMODEL runoff curves may be more suitable than the SCS-CN for different conditions. Lastly, we explore predictions of sediment and nutrient transport by applying the PDF describing runoff variability within our new framework.
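
    For reference, the traditional SCS-CN event runoff relation that the paper generalizes can be written in a few lines. The sketch below uses the customary U.S. convention (depths in inches) and the common initial-abstraction ratio Ia = 0.2S; the rainfall depth and curve number are example values.

```python
def scs_cn_runoff(P, CN, ia_ratio=0.2):
    """Traditional SCS curve number event runoff (depths in inches).

    P  : storm rainfall depth
    CN : curve number (0 < CN <= 100)
    """
    S = 1000.0 / CN - 10.0          # potential maximum retention
    Ia = ia_ratio * S               # initial abstraction
    if P <= Ia:
        return 0.0
    return (P - Ia) ** 2 / (P - Ia + S)

# Example: 3 inches of rain on a watershed with CN = 80
print(round(scs_cn_runoff(3.0, 80.0), 2))   # about 1.25 in of runoff
```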

  12. Modular forms and special cycles on Shimura curves (AM-161)

    CERN Document Server

    Kudla, Stephen S; Yang, Tonghai

    2006-01-01

    Modular Forms and Special Cycles on Shimura Curves is a thorough study of the generating functions constructed from special cycles, both divisors and zero-cycles, on the arithmetic surface "M" attached to a Shimura curve "M" over the field of rational numbers. These generating functions are shown to be the q-expansions of modular forms and Siegel modular forms of genus two respectively, valued in the Gillet-Soulé arithmetic Chow groups of "M". The two types of generating functions are related via an arithmetic inner product formula. In addition, an analogue of the classical Siegel-Weil

  13. Study of the Vapor-Liquid Coexistence Curve and the Critical Curve for Nonazeotropic Refrigerant Mixture R152a + R114 System

    Science.gov (United States)

    Kabata, Yasuo; Higashi, Yukihiro; Uematsu, Masahiko; Watanabe, Koichi

    Measurements of the vapor-liquid coexistence curve in the critical region for the refrigerant mixture of R152a (CH3CHF2: 1,1-difluoroethane) + R114 (CClF2CClF2: 1,2-dichloro-1,1,2,2-tetrafluoroethane) were made by visual observation of the disappearance of the meniscus at the vapor-liquid interface within an optical cell. Forty-eight saturated densities along the vapor-liquid coexistence curve between 204 and 861 kg·m-3 for five different compositions of 10, 20, 50, 80 and 90 wt% R152a were obtained in the temperature range 370 to 409 K. The experimental errors of temperature, density, and mass fraction were estimated to be within ±10 mK, ±0.5% and ±0.05%, respectively. On the basis of these measurements, the critical parameters of five different compositions for the R152a + R114 system were determined in consideration of the meniscus disappearance level as well as the intensity of the critical opalescence. In accordance with the previous results for three other refrigerant mixtures, i.e., the R12 + R22, R22 + R114 and R13B1 + R114 systems, the coexistence curve and critical curve on the temperature-density diagram for binary refrigerant mixtures were discussed. In addition, correlations for the composition dependence of this system were proposed.

  14. The Component And System Reliability Analysis Of Multipurpose Reactor G.A. Siwabessy Based On The Failure Rate Curve

    International Nuclear Information System (INIS)

    Sriyono; Ismu Wahyono, Puradwi; Mulyanto, Dwijo; Kusmono, Siamet

    2001-01-01

    The main components of the Multipurpose Reactor G.A. Siwabessy have been analyzed by means of their failure rate curves. The main components analyzed are the pumps of the ''Fuel Storage Pool Purification System'' (AK-AP), ''Primary Cooling System'' (JE01-AP), ''Primary Pool Purification System'' (KBE01-AP), ''Warm Layer System'' (KBE02-AP), ''Cooling Tower'' (PA/D-AH), ''Secondary Cooling System'', and the Diesel (BRV). The failure rate curve is constructed from a component database taken from the operation log book of RSG-GAS. The total operation time covered by the curve is 2500 hours. From the curve it is concluded that the failure rates of the components follow a bathtub curve. The maintenance process causes the curve anomaly.

  15. Bayesian Inference of Nonstationary Precipitation Intensity-Duration-Frequency Curves for Infrastructure Design

    Science.gov (United States)

    2016-03-01

    Local precipitation Intensity-Duration-Frequency (IDF) curves are developed from historical rainfall time series data using a probability-based (Bayesian) approach; rainfall associated with each IDF curve is subsequently used to force a calibrated and validated precipitation-runoff model in support of probability-based, risk-informed hydrologic design.

  16. Signature Curves Statistics of DNA Supercoils

    OpenAIRE

    Shakiban, Cheri; Lloyd, Peter

    2004-01-01

    In this paper we describe the Euclidean signature curves for two dimensional closed curves in the plane and their generalization to closed space curves. The focus will be on discrete numerical methods for approximating such curves. Further we will apply these numerical methods to plot the signature curves related to three-dimensional simulated DNA supercoils. Our primary focus will be on statistical analysis of the data generated for the signature curves of the supercoils. We will try to esta...

  17. Development of p-y curves of laterally loaded piles in cohesionless soil.

    Science.gov (United States)

    Khari, Mahdy; Kassim, Khairul Anuar; Adnan, Azlan

    2014-01-01

    The research on damage to structures supported by deep foundations has been quite intensive in the past decade. Kinematic interaction in soil-pile interaction is evaluated based on the p-y curve approach. Existing p-y curves have considered the effects of relative density on soil-pile interaction in sandy soil. The influence of the roughness of the pile wall surface on p-y curves has not been emphasized sufficiently. The present study was performed to develop a series of p-y curves for single piles through comprehensive experimental investigations. Modification factors were studied, namely the effects of relative density and of the roughness of the pile wall surface. The model tests were subjected to lateral load in Johor Bahru sand. The new p-y curves were evaluated based on the experimental data and were compared to existing p-y curves. The soil-pile reaction for various relative densities (from 30% to 75%) increased in the range of 40-95% for a smooth pile at a small displacement and 90% at a large displacement. For a rough pile, the ratio of dense to loose relative density soil-pile reaction was from 2.0 to 3.0 at small to large displacements. Direct comparison of the developed p-y curves shows significant differences in magnitude and shape with respect to the existing load-transfer curves. Good comparison with the experimental and design studies demonstrates the multidisciplinary applications of the present method.
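
    The curves developed in the paper are not reproduced in this record. For context, a widely used p-y formulation for sand (the API RP 2A hyperbolic-tangent form) is sketched below; the ultimate resistance pu, subgrade modulus k, depth z and factor A are illustrative assumptions, not values from the study.

```python
import numpy as np

def py_curve_sand_api(y, z, pu, k, A=0.9):
    """API RP 2A-style p-y curve for sand: soil reaction p (force per unit length)
    at lateral pile displacement y, depth z, ultimate resistance pu and
    initial modulus of subgrade reaction k."""
    y = np.asarray(y, dtype=float)
    return A * pu * np.tanh(k * z * y / (A * pu))

# Illustrative values only (kN/m, kN/m^3, m); not from the paper
y = np.linspace(0.0, 0.05, 6)
p = py_curve_sand_api(y, z=3.0, pu=150.0, k=15000.0)
print(np.round(p, 1))
```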

  20. Potentiometric titration curves of aluminium salt solutions and its species conversion in the hydrolysis-polymerization course

    Directory of Open Access Journals (Sweden)

    Chenyi Wang

    2008-12-01

    Full Text Available A new concept of critical point is expounded by analysing the potentiometric titration curves of aluminium salt solutions under a moderately slow rate of base injection. The critical point is defined as the characteristic spot on the potentiometric titration curve of an Al3+ salt solution, which is related to the experimental conditions. In addition, changes of the critical points reflect the influence of the experimental conditions on the course of the hydrolysis-polymerization and the conversion of hydroxyl polynuclear aluminium species. According to the OH/Al mole ratio, the Al species can be divided quantitatively into four regions by three characteristic points on the titration curves: Part I, the Al3+/Ala region, consisting chiefly of Al3+ and mononuclear Al; Part II, the small/middle polynuclear Al region, including Al2-Al12; Part III, the large-size polynuclear aluminium region, consisting predominantly of Al13-Al54 and a little sol/gel Al(OH)3; Part IV, the dissolving region of sol/gel Alc, with only Al(OH)3 (aq or am) and Al(OH)4- species. This sets up a basis for studying the hydrolysis-polymerization of Al3+. At the same time, significant effects of total aluminium concentration, temperature, halide ion, silicate radical, and organic acid radical on the titration curves and their critical points were observed. Given the three critical points demarcating the aluminium forms, we carry out a thorough investigation into the fundamental regularities of these factors' influence, and offer a fresh train of thought for studying the hydrolysis-polymerization of Al3+ in soil solutions.

  1. Photoelectric BV Light Curves of Algol and the Interpretations of the Light Curves

    Directory of Open Access Journals (Sweden)

    Ho-Il Kim

    1985-06-01

    Full Text Available Standardized B and V photoelectric light curves of Algol are made from observations obtained during 1982-84 with the 40-cm and the 61-cm reflectors of Yonsei University Observatory. These light curves show asymmetry between the ascending and descending shoulders. The ascending shoulder is 0.02 mag brighter than the descending shoulder in the V light curve and 0.03 mag in the B light curve. These asymmetric light curves are interpreted as the result of an inhomogeneous energy distribution on the surface of one star of the eclipsing pair rather than the result of a gaseous stream flowing from the K0IV star to the B8V star. The 180-year periodicity, the so-called great inequality, is most likely explained by the mechanism proposed by Kim et al. (1983), namely that abrupt and discrete mass losses of the cooler component may be the cause of this orbital change. The amount of mass loss deduced from these discrete period changes turned out to be of the order of 10^(-6) - 10^(-5) solar masses.

  2. A Journey Between Two Curves

    Directory of Open Access Journals (Sweden)

    Sergey A. Cherkis

    2007-03-01

    Full Text Available A typical solution of an integrable system is described in terms of a holomorphic curve and a line bundle over it. The curve provides the action variables while the time evolution is a linear flow on the curve's Jacobian. Even though the system of Nahm equations is closely related to the Hitchin system, the curves appearing in these two cases have very different nature. The former can be described in terms of some classical scattering problem while the latter provides a solution to some Seiberg-Witten gauge theory. This note identifies the setup in which one can formulate the question of relating the two curves.

  3. Quantifying and Reducing Curve-Fitting Uncertainty in Isc

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-06-14

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
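
    A minimal version of the localized straight-line fit described above, using ordinary least squares rather than the objective Bayesian regression of the paper, might look as follows; the I-V points and voltage window are invented for illustration.

```python
import numpy as np

def isc_from_linear_fit(v, i):
    """Fit i = a + b*v to I-V points near V = 0 and return the short-circuit
    current estimate (the intercept a) together with its standard uncertainty."""
    X = np.column_stack([np.ones_like(v), v])
    coef, *_ = np.linalg.lstsq(X, i, rcond=None)
    resid = i - X @ coef
    s2 = resid @ resid / (len(v) - 2)          # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)          # parameter covariance matrix
    return coef[0], np.sqrt(cov[0, 0])

# Hypothetical I-V points measured close to short circuit
v = np.array([-0.02, -0.01, 0.0, 0.01, 0.02, 0.03])
i = np.array([8.215, 8.211, 8.208, 8.204, 8.201, 8.197])
isc, u_isc = isc_from_linear_fit(v, i)
print(f"Isc = {isc:.3f} A +/- {u_isc:.4f} A")
```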

  4. Diagnosis of the σ-, π- and (σ+π)-Aromaticity by the Shape of the NICSzz-Scan Curves and Symmetry-Based Selection Rules

    Directory of Open Access Journals (Sweden)

    Constantinos A. Tsipis

    2010-03-01

    Full Text Available The NICSzz-scan curves of aromatic organic, inorganic and "all-metal" molecules in conjunction with symmetry-based selection rules provide efficient diagnostic tools of the σ-, π- and/or double (σ + π)-aromaticity. The NICSzz-scan curves of σ-aromatic molecules are symmetric around the z-axis, having half-band widths approximately less than 3 Å, with the induced diatropic ring current arising from Tx,y-allowed transitions involving exclusively σ-type molecular orbitals. Broad NICSzz-scan curves (half-band width approximately higher than 3 Å) characterize double (σ + π)-aromaticity, the chief contribution to the induced diatropic ring current arising from Tx,y-allowed transitions involving both σ- and π-type molecular orbitals. NICSzz-scan curves exhibiting two maxima at a certain distance above and below the molecular plane are typical for (σ + π)-aromatics where the π-diatropic ring current overwhelms the σ-type one. In the absence of any contribution from the σ-diatropic ring current, the NICSzz(0) value is close to zero and the molecule exhibits pure π-aromaticity.

  5. Assessment of the additional lower arch perimeter needed for leveling the curve of Spee

    Directory of Open Access Journals (Sweden)

    Marcio José da Silva Campos

    2009-08-01

    Full Text Available AIM: To determine the additional arch perimeter needed for leveling the curve of Spee by means of a laboratory technique using dental casts. METHODS: Seventy lower dental casts were used for measuring the depth of the curve of Spee and assessing the arch perimeter from second molar to second molar. In the same casts, after simulation of leveling of the curve of Spee, the arch perimeter was reevaluated while maintaining its form and length. RESULTS: The correlation between the depth of the curve of Spee and the additional arch perimeter was confirmed, as expressed by the formula [Paa = 0.21 CSmax - 0.04]. CONCLUSION: The proposed technique allowed, through the assessment of the arch perimeter with the curve of Spee leveled, the determination of the space available for tooth alignment.

  6. Optimization of curved drift tubes for ultraviolet-ion mobility spectrometry

    Science.gov (United States)

    Ni, Kai; Ou, Guangli; Zhang, Xiaoguo; Yu, Zhou; Yu, Quan; Qian, Xiang; Wang, Xiaohao

    2015-08-01

    Ion mobility spectrometry (IMS) is a key trace detection technique for toxic pollutants and explosives in the atmosphere. An ultraviolet radiation photoionization source is widely used as an ionization source for IMS due to its advantages of high selectivity and non-radioactivity. However, in UV-IMS the UV rays launched into the drift tube can cause secondary ionization and a photoelectric effect at the Faraday disk. Air is therefore often used as the working gas to reduce the effective distance of the UV rays, but this limits the application areas of UV-IMS. In this paper, we propose a new curved drift tube structure, which avoids abnormally incident UV rays. Furthermore, using a curved drift tube may increase the length of the drift tube and thereby improve the resolution of UV-IMS according to previous research. We studied the homogeneity of the electric field in the curved drift tube, which determines the performance of the UV-IMS. Numerical simulation of the electric field in the curved drift tube was conducted with SIMION in our study. In addition, a modeling method and a homogeneity standard for the electric field are also presented. The influences of key parameters, including the radius of gyration, the gap between electrodes and the inner diameter of the curved drift tube, on the homogeneity of the electric field were investigated and some useful laws were summarized. Finally, an optimized curved drift tube was designed to achieve a homogeneous drift electric field. In more than 98.75% of the region inside the curved drift tube, the fluctuation of the electric field strength along the radial direction is less than 0.2% of that along the axial direction.

  7. Bond yield curve construction

    Directory of Open Access Journals (Sweden)

    Kožul Nataša

    2014-01-01

    Full Text Available In the broadest sense, the yield curve indicates the market's view of the evolution of interest rates over time. However, given that the cost of borrowing is closely linked to creditworthiness (ability to repay), different yield curves will apply to different currencies, market sectors, or even individual issuers. As government borrowing is indicative of the interest rate levels available to other market players in a particular country, and considering that bond issuance still remains the dominant form of sovereign debt, this paper describes yield curve construction using bonds. The relationship between zero-coupon yield, par yield and yield to maturity is given and their usage in determining curve discount factors is described. Their usage in deriving forward rates and pricing related derivative instruments is also discussed.
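
    As one concrete instance of the construction described above, discount factors can be bootstrapped from annual-pay par yields: a bond priced at par satisfies 1 = c_n * (D_1 + ... + D_n) + D_n, so each D_n follows from the factors already solved. The par yields in the sketch below are invented for illustration.

```python
def bootstrap_discount_factors(par_yields):
    """Bootstrap discount factors from annual-pay par yields.

    par_yields[n-1] is the par coupon rate (decimal) of the n-year bond.
    A par bond satisfies 1 = c_n * sum(D_1..D_n) + D_n.
    """
    factors = []
    for c in par_yields:
        annuity = sum(factors)                  # sum of already-solved D_i
        factors.append((1.0 - c * annuity) / (1.0 + c))
    return factors

# Hypothetical 1-4 year par yields of 2%, 2.5%, 3%, 3.2%
dfs = bootstrap_discount_factors([0.020, 0.025, 0.030, 0.032])
zero_rates = [d ** (-1.0 / (i + 1)) - 1.0 for i, d in enumerate(dfs)]
print([round(d, 4) for d in dfs])
print([round(z, 4) for z in zero_rates])
```

    The zero-coupon rates implied by the discount factors follow directly, and forward rates can then be obtained from ratios of consecutive discount factors.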

  8. Curve Digitizer – A software for multiple curves digitizing

    Directory of Open Access Journals (Sweden)

    Florentin ŞPERLEA

    2010-06-01

    Full Text Available The Curve Digitizer is software that extracts data from an image file representing a graphic and returns them as pairs of numbers which can then be used for further analysis and applications. Numbers can be read on a computer screen, stored in files or copied on paper. The final result is a data set that can be used with other tools such as MSEXCEL. Curve Digitizer provides a useful tool for any researcher or engineer interested in quantifying the data displayed graphically. The image file can be obtained by scanning a document

  9. Strong laws for generalized absolute Lorenz curves when data are stationary and ergodic sequences

    NARCIS (Netherlands)

    R. Helmers (Roelof); R. Zitikis

    2004-01-01

    textabstractWe consider generalized absolute Lorenz curves that include, as special cases, classical and generalized L - statistics as well as absolute or, in other words, generalized Lorenz curves. The curves are based on strictly stationary and ergodic sequences of random variables. Most of the

  10. Consistency assessment of rating curve data in various locations using Bidirectional Reach (BReach)

    Science.gov (United States)

    Van Eerdenbrugh, Katrien; Van Hoey, Stijn; Coxon, Gemma; Freer, Jim; Verhoest, Niko E. C.

    2017-10-01

    When estimating discharges through rating curves, temporal data consistency is a critical issue. In this research, consistency in stage-discharge data is investigated using a methodology called Bidirectional Reach (BReach), which departs from a (in operational hydrology) commonly used definition of consistency. A period is considered to be consistent if no consecutive and systematic deviations from a current situation occur that exceed observational uncertainty. Therefore, the capability of a rating curve model to describe a subset of the (chronologically sorted) data is assessed in each observation by indicating the outermost data points for which the rating curve model behaves satisfactorily. These points are called the maximum left or right reach, depending on the direction of the investigation. This temporal reach should not be confused with a spatial reach (indicating a part of a river). Changes in these reaches throughout the data series indicate possible changes in data consistency and if not resolved could introduce additional errors and biases. In this research, various measurement stations in the UK, New Zealand and Belgium are selected based on their significant historical ratings information and their specific characteristics related to data consistency. For each country, regional information is maximally used to estimate observational uncertainty. Based on this uncertainty, a BReach analysis is performed and, subsequently, results are validated against available knowledge about the history and behavior of the site. For all investigated cases, the methodology provides results that appear to be consistent with this knowledge of historical changes and thus facilitates a reliable assessment of (in)consistent periods in stage-discharge measurements. This assessment is not only useful for the analysis and determination of discharge time series, but also to enhance applications based on these data (e.g., by informing hydrological and hydraulic model

  11. Aspects of Pairing Based Cryptography on Jacobians of Genus Two Curves

    DEFF Research Database (Denmark)

    Ravnshøj, Christian Robenhagen

    The thesis concerns properties of Jacobians of genus two curves defined over a finite field. Such Jacobians have a wide range of applications in data security; e.g. netbanking and digital signature. New properties of the Jacobians are proved; here, a description of the embedding of -torsion point...

  12. IDF-curves for precipitation In Belgium

    International Nuclear Information System (INIS)

    Mohymont, Bernard; Demarde, Gaston R.

    2004-01-01

    The Intensity-Duration-Frequency (IDF) curves for precipitation constitute a relationship between the intensity, the duration and the frequency of rainfall amounts. The intensity of precipitation is expressed in mm/h, the duration or aggregation time is the length of the interval considered, while the frequency stands for the probability of occurrence of the event. IDF curves constitute a classical and useful tool that is primarily used to dimension hydraulic structures, e.g. sewer systems, and they are consequently used to assess the risk of inundation. In this presentation, the IDF relation for precipitation is studied for different locations in Belgium. These locations correspond to two long-term, high-quality precipitation networks of the RMIB: (a) the daily precipitation depths of the climatological network (more than 200 stations, 1951-2001 baseline period); (b) the high-frequency 10-minute precipitation depths of the hydrometeorological network (more than 30 stations, 15 to 33 years baseline period). For the station of Uccle, an uninterrupted time series of more than one hundred years of 10-minute rainfall data is available. The proposed technique for assessing the curves is based on maximum annual values of precipitation. A new analytical formula for the IDF curves was developed such that these curves stay valid for aggregation times ranging from 10 minutes to 30 days (when fitted with appropriate data). Moreover, all parameters of this formula have physical dimensions. Finally, adequate spatial interpolation techniques are used to provide nationwide extreme precipitation depths for short- to long-term durations with a given return period. These values are estimated on the grid points of the Belgian ALADIN domain used in the operational weather forecasts at the RMIB. (Author)
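
    The analytical IDF formula developed in the paper is not reproduced in this record. A commonly used generic form, i = a/(t + b)^c with intensity in mm/h and duration t in minutes, is sketched below purely to show how such a relation is evaluated once its parameters have been fitted; the coefficients are invented.

```python
def idf_intensity(duration_min, a, b, c):
    """Generic IDF relation i = a / (t + b)**c, intensity in mm/h for a
    duration t in minutes. Parameters depend on location and return period."""
    return a / (duration_min + b) ** c

# Invented parameters standing in for, say, a 10-year return period
for t in (10, 30, 60, 360, 1440):
    print(t, "min ->", round(idf_intensity(t, a=900.0, b=10.0, c=0.75), 1), "mm/h")
```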

  13. Search procedure for models based on the evolution of experimental curves

    International Nuclear Information System (INIS)

    Delforge, J.

    1975-01-01

    The possibilities offered by numerical analysis regarding the identification of parameters for the model are outlined. The use of a large number of experimental measurements is made possible by the flexibility of the proposed method. It is shown that the errors of numerical identification over all parameters are proportional to experimental errors, and to a proportionality factor called conditioning of the identification problem which is easily computed. Moreover, it is possible to define and calculate, for each parameter, a factor of sensitivity to experimental errors. The numerical values of conditioning and sensitivity factor depend on all experimental conditions, that is, on the one hand, the specific definition of the experiments, and on the other hand, the number and quality of the undertaken measurements. The identification procedure proposed includes several phases. The preliminary phase consists in a first definition of experimental conditions, in agreement with the experimenter. From the data thus obtained, it is generally possible to evaluate the minimum number of equivalence classes required for an interpretation compatible with the morphology of experimental curves. Possibly, from this point, some additional measurements may prove useful or required. The numerical phase comes afterwards to determine a first approximate model by means of the methods previously described. Next phases again require a close collaboration between experimenters and theoreticians. They consist mainly in refining the first model [fr

  14. Interpolating Spline Curve-Based Perceptual Encryption for 3D Printing Models

    Directory of Open Access Journals (Sweden)

    Giao N. Pham

    2018-02-01

    Full Text Available With the development of 3D printing technology, 3D printing has recently been applied to many areas of life including healthcare and the automotive industry. Because of the value of 3D printing, 3D printing models are often attacked by hackers and distributed without agreement from the original providers. Furthermore, certain special models and anti-weapon models in 3D printing must be protected against unauthorized users. Therefore, in order to prevent attacks and illegal copying and to ensure that all access is authorized, 3D printing models should be encrypted before being transmitted and stored. A novel perceptual encryption algorithm for 3D printing models for secure storage and transmission is presented in this paper. A facet of the 3D printing model is extracted to interpolate a spline curve of degree 2 in three-dimensional space that is determined by three control points, the curvature coefficients of degree 2, and an interpolating vector. The three control points, the curvature coefficients, and the interpolating vector of the degree-2 spline curve are encrypted by a secret key. The encrypted features of the spline curve are then used to obtain the encrypted 3D printing model by inverse interpolation and geometric distortion. The results of experiments and evaluations prove that the entire 3D triangle model is altered and deformed after the perceptual encryption process. The proposed algorithm is responsive to the various formats of 3D printing models. The results of the perceptual encryption process are superior to those of previous methods. The proposed algorithm also provides a better method and more security than previous methods.
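
    The paper's exact interpolation and encryption steps are not given in this record. As background, a degree-2 curve determined by three control points can be evaluated as a quadratic Bezier curve, sketched below; the control points are arbitrary examples, not data from the paper.

```python
def quadratic_bezier(p0, p1, p2, t):
    """Point on the degree-2 Bezier curve defined by 3D control points
    p0, p1, p2 at parameter t in [0, 1]."""
    return tuple(
        (1 - t) ** 2 * a + 2 * (1 - t) * t * b + t ** 2 * c
        for a, b, c in zip(p0, p1, p2)
    )

# Arbitrary control points, e.g. taken from one triangular facet
p0, p1, p2 = (0.0, 0.0, 0.0), (1.0, 2.0, 0.5), (2.0, 0.0, 1.0)
for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(t, quadratic_bezier(p0, p1, p2, t))
```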

  15. Optimization of ISOL targets based on Monte-Carlo simulations of ion release curves

    International Nuclear Information System (INIS)

    Mustapha, B.; Nolen, J.A.

    2003-01-01

    A detailed model for simulating release curves from ISOL targets has been developed. The full 3D geometry is implemented using Geant-4. Produced particles are followed individually from production to release. The delay time is computed event by event. All of the processes involved (diffusion, effusion and decay) are included to obtain the overall release curve. By fitting to the experimental data, important parameters of the release process (diffusion coefficient, sticking time, ...) are extracted. They can be used to improve the efficiency of existing targets and to design new ones better suited to producing beams of rare isotopes.
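
    The model's equations are not included in this record. A toy Monte Carlo of the kind of delay-time bookkeeping described (sampling a diffusion delay and an effusion delay per ion and keeping only ions that survive radioactive decay) might look like the sketch below; the exponential delay distributions and all time constants are simplifying assumptions made only for illustration.

```python
import numpy as np

def simulate_release(n_ions, tau_diff, tau_eff, half_life, rng=None):
    """Toy release-curve Monte Carlo: each ion gets a diffusion delay and an
    effusion delay (both sampled exponentially here for simplicity); it is
    counted as released only if it survives radioactive decay."""
    rng = rng or np.random.default_rng()
    lam = np.log(2.0) / half_life
    delay = rng.exponential(tau_diff, n_ions) + rng.exponential(tau_eff, n_ions)
    survived = rng.random(n_ions) < np.exp(-lam * delay)
    return delay[survived]

delays = simulate_release(100_000, tau_diff=2.0, tau_eff=0.5, half_life=1.0)
print("release efficiency:", round(delays.size / 100_000, 3))
print("mean delay of released ions:", round(delays.mean(), 2), "s")
```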

  17. The holographic RG flow in a field theory on a curved background

    International Nuclear Information System (INIS)

    Cardoso, Gabriel Lopes; Luest, Dieter

    2002-01-01

    As shown by Freedman, Gubser, Pilch and Warner, the RG flow in N=4 super-Yang-Mills theory broken to an N=1 theory by the addition of a mass term can be described in terms of a supersymmetric domain wall solution in five-dimensional N=8 gauged supergravity. The FGPW flow is an example of a holographic RG flow in a field theory on a flat background. Here we put the field theory studied by Freedman, Gubser, Pilch and Warner on a curved AdS 4 background, and we construct the supersymmetric domain wall solution which describes the RG flow in this field theory. This solution is a curved (non-Ricci flat) domain wall solution. This example demonstrates that holographic RG flows in supersymmetric field theories on a curved AdS 4 background can be described in terms of curved supersymmetric domain wall solutions. (author)

  18. Biocompatibility of hydroxyapatite scaffolds processed by lithography-based additive manufacturing.

    Science.gov (United States)

    Tesavibul, Passakorn; Chantaweroad, Surapol; Laohaprapanon, Apinya; Channasanon, Somruethai; Uppanan, Paweena; Tanodekaew, Siriporn; Chalermkarnnon, Prasert; Sitthiseripratip, Kriskrai

    2015-01-01

    The fabrication of hydroxyapatite scaffolds for bone tissue engineering applications using lithography-based additive manufacturing techniques has been introduced because of the ability to control porous structures at suitable resolution. In this research, the use of hydroxyapatite cellular structures processed by a lithography-based additive manufacturing machine as bone tissue engineering scaffolds was investigated. A digital light processing system in a laboratory-scale additive manufacturing machine was used to fabricate the hydroxyapatite scaffolds, whose biocompatibility was then evaluated by direct contact and cell-culturing tests. In addition, the density and compressive strength of the scaffolds were characterized. The results show that a hydroxyapatite scaffold with 77% porosity, 91% of theoretical density and a compressive strength of 0.36 MPa can be processed. In comparison with conventionally sintered hydroxyapatite, the scaffold did not present any cytotoxic signs, and a cell viability of 95.1% was reported. After 14 days of cell-culturing tests, pre-osteoblasts (MC3T3-E1) were able to attach to the scaffold, leading to cell proliferation and differentiation. The hydroxyapatite scaffold for bone tissue engineering was thus able to be processed by the lithography-based additive manufacturing machine, and its biocompatibility was confirmed.

  19. Curved Josephson junction

    International Nuclear Information System (INIS)

    Dobrowolski, Tomasz

    2012-01-01

    The constant-curvature one- and quasi-one-dimensional Josephson junction is considered. On the basis of the Maxwell equations, a sine–Gordon equation that describes the influence of curvature on kink motion was obtained. It is shown that the method of geometrical reduction of the sine–Gordon model from a three- to a lower-dimensional manifold leads to an identical form of the sine–Gordon equation. - Highlights: ► The research on dynamics of the phase in a curved Josephson junction is performed. ► The geometrical reduction is applied to the sine–Gordon model. ► The results of geometrical reduction and the fundamental research are compared.

  20. Estimating reaction rate constants: comparison between traditional curve fitting and curve resolution

    NARCIS (Netherlands)

    Bijlsma, S.; Boelens, H. F. M.; Hoefsloot, H. C. J.; Smilde, A. K.

    2000-01-01

    A traditional curve fitting (TCF) algorithm is compared with a classical curve resolution (CCR) approach for estimating reaction rate constants from spectral data obtained over time during a chemical reaction. In the TCF algorithm, reaction rate constants are estimated from the absorbance versus time data
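
    As a minimal example of the curve-fitting idea (not the specific TCF algorithm compared in the paper), a first-order rate constant can be estimated from an absorbance-versus-time trace by nonlinear least squares; the data below are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def first_order(t, a_inf, a0, k):
    """Absorbance of a first-order reaction: A(t) = A_inf + (A_0 - A_inf)*exp(-k*t)."""
    return a_inf + (a0 - a_inf) * np.exp(-k * t)

# Synthetic absorbance trace with noise, true k = 0.30 per minute
rng = np.random.default_rng(1)
t = np.linspace(0.0, 20.0, 40)
A = first_order(t, 0.10, 1.00, 0.30) + 0.01 * rng.standard_normal(t.size)

popt, pcov = curve_fit(first_order, t, A, p0=(0.0, 1.0, 0.1))
k, k_err = popt[2], np.sqrt(pcov[2, 2])
print(f"k = {k:.3f} +/- {k_err:.3f} per minute")
```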

  1. A catalog of special plane curves

    CERN Document Server

    Lawrence, J Dennis

    2014-01-01

    Among the largest, finest collections available-illustrated not only once for each curve, but also for various values of any parameters present. Covers general properties of curves and types of derived curves. Curves illustrated by a CalComp digital incremental plotter. 12 illustrations.

  2. Intersection numbers of spectral curves

    CERN Document Server

    Eynard, B.

    2011-01-01

    We compute the symplectic invariants of an arbitrary spectral curve with only one branchpoint in terms of integrals of characteristic classes in the moduli space of curves. Our formula associates to any spectral curve a characteristic class, which is determined by the Laplace transform of the spectral curve. This is a hint to the key role of the Laplace transform in mirror symmetry. When the spectral curve is y=\sqrt{x}, the formula gives the Kontsevich-Witten intersection numbers; when the spectral curve is chosen to be the Lambert function \exp{x}=y\exp{-y}, the formula gives the ELSV formula for Hurwitz numbers; and when one chooses the mirror of C^3 with framing f, i.e. \exp{-x}=\exp{-yf}(1-\exp{-y}), the formula gives the Marino-Vafa formula, i.e. the generating function of Gromov-Witten invariants of C^3. In some sense this formula generalizes the ELSV, Marino-Vafa, and Mumford formulas.

  3. The phase curve survey of the irregular saturnian satellites: A possible method of physical classification

    Science.gov (United States)

    Bauer, James M.; Grav, Tommy; Buratti, Bonnie J.; Hicks, Michael D.

    2006-09-01

    During its 2005 January opposition, the saturnian system could be viewed at an unusually low phase angle. We surveyed a subset of Saturn's irregular satellites to obtain their true opposition magnitudes, or nearly so, down to phase angle values of 0.01°. Combining our data taken at the Palomar 200-inch and Cerro Tololo Inter-American Observatory's 4-m Blanco telescope with those in the literature, we present the first phase curves for nearly half the irregular satellites originally reported by Gladman et al. [2001. Nature 412, 163-166], including Paaliaq (SXX), Siarnaq (SXXIX), Tarvos (SXXI), Ijiraq (SXXII), Albiorix (SXVI), and additionally Phoebe's narrowest angle brightness measured to date. We find centaur-like steepness in the phase curves or opposition surges in most cases with the notable exception of three, Albiorix and Tarvos, which are suspected to be of similar origin based on dynamical arguments, and Siarnaq.

  4. Placement Design of Changeable Message Signs on Curved Roadways

    Directory of Open Access Journals (Sweden)

    Zhongren Wang, Ph.D. P.E. T.E.

    2015-01-01

    Full Text Available This paper presents a fundamental framework for Changeable Message Sign (CMS) placement design along roadways with horizontal curves. The analytical framework determines the distance available for motorists to read and react to CMS messages based on CMS character height, the driver's cone of vision, the CMS pixel's cone of legibility, the roadway horizontal curve radius, and the CMS lateral and vertical placement. Sample design charts were developed to illustrate how the analytical framework may facilitate CMS placement design.
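
    The paper's full framework is not reproduced here. One simplified geometric ingredient of such an analysis is the arc length over which a point offset from the travelled path stays on an unobstructed sight line on a horizontal curve, which follows from the circular-curve middle-ordinate relation m = R(1 - cos(s/2R)). The sketch below inverts that relation; the radius and offset values are arbitrary examples, and the relation ignores the character-height and cone-of-vision factors that the paper accounts for.

```python
import math

def available_sight_arc(radius_m, lateral_offset_m):
    """Arc length s (m) along a circular curve of radius R for which the line
    of sight clears a point offset m from the travelled path, using
    m = R * (1 - cos(s / (2R)))  =>  s = 2R * acos(1 - m/R)."""
    return 2.0 * radius_m * math.acos(1.0 - lateral_offset_m / radius_m)

# Example: 400 m curve radius, sign (or obstruction) offset 6 m from the path
print(round(available_sight_arc(400.0, 6.0), 1), "m")
```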

  5. Elliptic curves for applications (Tutorial)

    NARCIS (Netherlands)

    Lange, T.; Bernstein, D.J.; Chatterjee, S.

    2011-01-01

    More than 25 years ago, elliptic curves over finite fields were suggested as a group in which the Discrete Logarithm Problem (DLP) can be hard. Since then many researchers have scrutinized the security of the DLP on elliptic curves with the result that for suitably chosen curves only exponential
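
    As a toy illustration of the elliptic-curve group structure mentioned above (not taken from the tutorial itself), the sketch below implements the affine group law on a short-Weierstrass curve y^2 = x^3 + ax + b over a small prime field; real cryptographic use requires standardized curves, much larger fields and constant-time implementations.

```python
def ec_add(P, Q, a, p):
    """Add points P, Q (affine tuples, or None for the identity) on
    y^2 = x^3 + a*x + b over GF(p)."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                   # P + (-P) = identity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def ec_mul(k, P, a, p):
    """Scalar multiplication k*P by double-and-add."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P, a, p)
        P = ec_add(P, P, a, p)
        k >>= 1
    return R

# Toy curve y^2 = x^3 + 2x + 3 over GF(97), base point (3, 6)
print(ec_mul(20, (3, 6), a=2, p=97))
```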

  6. Differential geometry and topology of curves

    CERN Document Server

    Animov, Yu

    2001-01-01

    Differential geometry is an actively developing area of modern mathematics. This volume presents a classical approach to the general topics of the geometry of curves, including the theory of curves in n-dimensional Euclidean space. The author investigates problems for special classes of curves and gives the working method used to obtain the conditions for closed polygonal curves. The proof of the Bakel-Werner theorem in conditions of boundedness for curves with periodic curvature and torsion is also presented. This volume also highlights the contributions made by great geometers. past and present, to differential geometry and the topology of curves.

  7. Models of genus one curves

    OpenAIRE

    Sadek, Mohammad

    2010-01-01

    In this thesis we give insight into the minimisation problem of genus one curves defined by equations other than Weierstrass equations. We are interested in genus one curves given as double covers of P1, plane cubics, or complete intersections of two quadrics in P3. By minimising such a curve we mean making the invariants associated to its defining equations as small as possible using a suitable change of coordinates. We study the non-uniqueness of minimisations of the genus one curves des...

  8. Treatment selection in a randomized clinical trial via covariate-specific treatment effect curves.

    Science.gov (United States)

    Ma, Yunbei; Zhou, Xiao-Hua

    2017-02-01

    For time-to-event data in a randomized clinical trial, we proposed two new methods for selecting an optimal treatment for a patient based on the covariate-specific treatment effect curve, which is used to represent the clinical utility of a predictive biomarker. To select an optimal treatment for a patient with a specific biomarker value, we proposed pointwise confidence intervals for each covariate-specific treatment effect curve and the difference between covariate-specific treatment effect curves of two treatments. Furthermore, to select an optimal treatment for a future biomarker-defined subpopulation of patients, we proposed confidence bands for each covariate-specific treatment effect curve and the difference between each pair of covariate-specific treatment effect curve over a fixed interval of biomarker values. We constructed the confidence bands based on a resampling technique. We also conducted simulation studies to evaluate finite-sample properties of the proposed estimation methods. Finally, we illustrated the application of the proposed method in a real-world data set.

  9. Representative Stress-Strain Curve by Spherical Indentation on Elastic-Plastic Materials

    Directory of Open Access Journals (Sweden)

    Chao Chang

    2018-01-01

    Full Text Available The tensile stress-strain curve of metallic materials can be determined from the representative stress-strain curve obtained by spherical indentation. Tabor empirically determined the stress constraint factor (stress CF, ψ) and the strain constraint factor (strain CF, β), but the choice of values for ψ and β is still under discussion. In this study, a new insight into the relationship between the constraint factors of stress and strain is analytically described based on the formulation of Tabor's equation. Experimental tests were performed to evaluate these constraint factors. From the results, representative stress-strain curves using the proposed strain constraint factor fit the nominal stress-strain curve better than those using Tabor's constraint factors.
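
    The paper's own constraint-factor relationship is not given in this record. Tabor's classical conversion, in which the representative stress is the mean contact pressure divided by ψ and the representative strain is β times the ratio of contact radius to indenter radius, is sketched below; the commonly quoted values ψ ≈ 3 and β ≈ 0.2 and the sample indentation state are assumptions.

```python
def representative_point(mean_pressure, contact_radius, indenter_radius,
                         psi=3.0, beta=0.2):
    """Tabor-style conversion of one spherical-indentation state to a
    representative (stress, strain) pair: sigma_r = p_m / psi, eps_r = beta * a / R."""
    return mean_pressure / psi, beta * contact_radius / indenter_radius

# Assumed indentation state: p_m = 2400 MPa, a = 0.1 mm, R = 0.5 mm
sigma_r, eps_r = representative_point(2400.0, 0.1, 0.5)
print(round(sigma_r, 1), "MPa at representative strain", eps_r)
```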

  10. LINS Curve in Romanian Economy

    Directory of Open Access Journals (Sweden)

    Emilian Dobrescu

    2016-02-01

    Full Text Available The paper presents theoretical considerations and empirical evidence to test the validity of the Laffer in Narrower Sense (LINS) curve as a parabola with a maximum. Attention is focused on the so-called legal-effective tax gap (letg). The econometric application is based on statistical data (1990-2013) for Romania as an emerging European economy. Three cointegrating regressions (fully modified least squares, canonical cointegrating regression and dynamic least squares) and three algorithms based on instrumental variables (two-stage least squares, generalized method of moments, and limited information maximum likelihood) are involved.

  11. The crime kuznets curve

    OpenAIRE

    Buonanno, Paolo; Fergusson, Leopoldo; Vargas, Juan Fernando

    2014-01-01

    We document the existence of a Crime Kuznets Curve in US states since the 1970s. As income levels have risen, crime has followed an inverted U-shaped pattern, first increasing and then dropping. The Crime Kuznets Curve is not explained by income inequality. In fact, we show that during the sample period inequality has risen monotonically with income, ruling out the traditional Kuznets Curve. Our finding is robust to adding a large set of controls that are used in the literature to explain the...

  12. PLOTTAB, Curve and Point Plotting with Error Bars

    International Nuclear Information System (INIS)

    1999-01-01

    1 - Description of program or function: PLOTTAB is designed to plot any combination of continuous curves and/or discrete points (with associated error bars) using user-supplied titles and X and Y axis labels and units. If curves are plotted, the first curve may be used as a standard; the data and the ratio of the data to the standard will be plotted. 2 - Method of solution: PLOTTAB: The program has no idea of what data is being plotted, and yet by supplying titles, X and Y axis labels and units the user can produce any number of plots, with each plot containing almost any combination of curves and points and each plot properly identified. In order to define a continuous curve between tabulated points, this program must know how to interpolate between points. By input the user may specify either the default option of linear x versus linear y interpolation or, alternatively, log x and/or log y interpolation. In all cases, regardless of the interpolation specified, the program will always interpolate the data to the plane of the plot (linear or log x and y plane) in order to present the true variation of the data between tabulated points, based on the user-specified interpolation law. Tabulated points should be tabulated at a sufficient number of x values to ensure that the difference between the specified interpolation and the 'true' variation of a curve between tabulated values is relatively small. 3 - Restrictions on the complexity of the problem: A combination of up to 30 curves and sets of discrete points may appear on each plot. If the user wishes to use this program to compare different sets of data, all of the data must be in the same units
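
    The interpolation-law behaviour is easy to mirror: transform x and/or y to log space, interpolate linearly there, and transform back. The function below is an illustrative re-implementation of that idea, not PLOTTAB's own code; the sample points are invented.

```python
import numpy as np

def interpolate(x, y, xi, log_x=False, log_y=False):
    """Interpolate tabulated points under a chosen interpolation law
    (lin-lin, log-lin, lin-log or log-log), as PLOTTAB lets the user specify."""
    fx = np.log(x) if log_x else np.asarray(x, float)
    fy = np.log(y) if log_y else np.asarray(y, float)
    fxi = np.log(xi) if log_x else np.asarray(xi, float)
    yi = np.interp(fxi, fx, fy)
    return np.exp(yi) if log_y else yi

# A sparsely tabulated curve densified under log-log interpolation, so that
# its "true" variation appears between the tabulated points on a log-log plot.
x = np.array([1.0, 10.0, 100.0])
y = np.array([2.0, 0.2, 0.02])          # follows y ~ 2/x
xi = np.logspace(0, 2, 50)
print(interpolate(x, y, xi, log_x=True, log_y=True)[:5])
```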

  13. Thermal effect on water retention curve of bentonite: experiment and thermodynamic modeling

    International Nuclear Information System (INIS)

    Qin Bing; Chen Zhenghai; Sun Faxin; Liu Yuemiao; Wang Ju

    2012-01-01

    The thermal effects on water retention curve of GMZ bentonite were investigated experimentally and theoretically. Water retention tests were conducted on GMZ bentonite at five temperatures ranging from 20℃ to 100℃. Test results showed that the water retention capacity and the hysteresis of the water retention curve decreased with increasing temperature, and that the water retention curves at different temperatures were almost parallel to each other. Based on the thermodynamics of sorption, a model was established to describe the temperature influence on the water retention curve. The model was validated by comparing the model predictions and the test results. (authors)

  14. Numerical analysis of unsteady conjugate heat transfer for initial evolution of thermal stratification in a curved pipe

    International Nuclear Information System (INIS)

    Jo, Jong Chull; Kim, Wee Kyung; Kim, Yun Il; Cho, Sang Jin; Choi, Seok Ki

    2000-01-01

    A detailed numerical analysis of the initial evolution of thermal stratification in a curved pipe with a finite wall thickness is performed. A primary emphasis of the present study is placed on the investigation of the effect of the pipe wall thickness on the evolution of thermal stratification. A simple and convenient numerical method of treating the unsteady conjugate heat transfer in Cartesian as well as non-orthogonal coordinate systems is presented. The proposed unsteady conjugate heat transfer analysis method is implemented in a finite volume thermal-hydraulic computer code based on a cell-centered, non-staggered grid arrangement, the SIMPLEC algorithm and a higher-order bounded convection scheme. Calculations are performed for the initial evolution of thermal stratification with a high Richardson number in a curved pipe. The predicted results show that the thermally stratified flow and transient conjugate heat transfer in a curved pipe with a specified wall thickness can be satisfactorily analyzed by using the numerical method presented in this paper. As a result, the present analysis method is considered to be effective for the determination of transient temperature distributions in the wall of a curved piping system subjected to internal thermal stratification. In addition, the method can be extended to be applicable to the simulation of turbulent flow of thermally stratified fluid

  15. Extending World Health Organization weight-for-age reference curves to older children.

    Science.gov (United States)

    Rodd, Celia; Metzger, Daniel L; Sharma, Atul

    2014-02-03

    For ages 5-19 years, the World Health Organization (WHO) publishes reference charts based on 'core data' from the US National Center for Health Statistics (NCHS), collected from 1963-75 on 22,917 US children. To promote the use of body mass index in older children, weight-for-age was omitted after age 10. Health providers have subsequently expressed concerns about this omission and the selection of centiles. We therefore sought to extend weight-for-age reference curves from 10 to 19 years by applying WHO exclusion criteria and curve fitting methods to the core NCHS data and to revise the choice of displayed centiles. WHO analysts first excluded ~ 3% of their reference population in order to achieve a "non-obese sample with equal height". Based on these exclusion criteria, 314 girls and 304 boys were first omitted for 'unhealthy' weights-for-height. By applying WHO global deviance and information criteria, optimal Box-Cox power exponential models were used to fit smoothed weight-for-age centiles. Bootstrap resampling was used to assess the precision of centile estimates. For all charts, additional centiles were included in the healthy range (3 to 97%), and the more extreme WHO centiles 0.1 and 99.9% were dropped. In addition to weight-for-age beyond 10 years, our charts provide more granularity in the centiles in the healthy range -2 to +2 SD (3-97%). For both weight and BMI, the bootstrap confidence intervals for the 99.9th centile were at least an order of magnitude wider than the corresponding 50th centile values. These charts complement existing WHO charts by allowing weight-for-age to be plotted concurrently with height in older children. All modifications followed strict WHO methodology and utilized the same core data from the US NCHS. The additional centiles permit a more precise assessment of normal growth and earlier detection of aberrant growth as it crosses centiles. Elimination of extreme centiles reduces the risk of misclassification. A complete set of
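
    The centiles on such charts come from smoothed age-specific distribution parameters; Cole's LMS formulation (the simplest member of the Box-Cox power exponential family used by WHO) converts a z-score into a measurement as M(1 + L·S·z)^(1/L). The sketch below uses invented L, M, S values for a single age to show the mechanics, not the fitted NCHS parameters.

```python
import numpy as np
from scipy import stats

def lms_value(z, L, M, S):
    """Measurement at z-score z from LMS parameters (Cole's LMS method)."""
    L = np.asarray(L, float)
    return np.where(np.abs(L) < 1e-8,
                    M * np.exp(S * z),
                    M * (1.0 + L * S * z) ** (1.0 / L))

# Hypothetical smoothed L, M, S values at one age; the real curves come from
# refitting the NCHS core data as described in the abstract.
L, M, S = -1.2, 32.0, 0.14          # weight-for-age parameters (illustrative)
centiles = [3, 10, 25, 50, 75, 90, 97]
z = stats.norm.ppf(np.array(centiles) / 100.0)
for c, w in zip(centiles, lms_value(z, L, M, S)):
    print(f"P{c:>2}: {w:5.1f} kg")
```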

  16. Study through potentiodynamic techniques of the corrosion resistance of different aluminium base MMC's with boron additions

    International Nuclear Information System (INIS)

    Abenojar, J.; Bautista, A.; Guzman, S.; Velasco, F.; Martinez, M.A.

    2009-01-01

    This paper compares a wrought aluminium with a PM aluminium and PM aluminium alloys with boron-base additions, containing boron carbide and Fe/B (obtained by mechanical alloying for 36 hours from a Fe-B 50% mixture by weight). The effect of sintering temperature for the Fe/B containing material and the effect of mechanical alloying for the boron carbide containing aluminium alloy on the corrosion resistance of those materials have been studied. Their behaviour is followed through cyclic anodic polarization curves in chloride media. In the Al+20%Fe/B composite, low sintering temperatures (650-950 deg C) exert a negative effect. However, when the material was sintered at high temperatures (1000-1100 deg C), its behaviour was very similar to that of the PM pure aluminium. The effect of mechanical alloying studied in aluminium with boron carbide was also important in corrosion resistance, with a lower corrosion rate found in the mechanically alloyed material. (author)

  17. Numerical Integration Techniques for Curved-Element Discretizations of Molecule–Solvent Interfaces

    Science.gov (United States)

    Bardhan, Jaydeep P.; Altman, Michael D.; Willis, David J.; Lippow, Shaun M.; Tidor, Bruce; White, Jacob K.

    2012-01-01

    Surface formulations of biophysical modeling problems offer attractive theoretical and computational properties. Numerical simulations based on these formulations usually begin with discretization of the surface under consideration; often, the surface is curved, possessing complicated structure and possibly singularities. Numerical simulations commonly are based on approximate, rather than exact, discretizations of these surfaces. To assess the strength of the dependence of simulation accuracy on the fidelity of surface representation, we have developed methods to model several important surface formulations using exact surface discretizations. Following and refining Zauhar’s work (J. Comp.-Aid. Mol. Des. 9:149-159, 1995), we define two classes of curved elements that can exactly discretize the van der Waals, solvent-accessible, and solvent-excluded (molecular) surfaces. We then present numerical integration techniques that can accurately evaluate nonsingular and singular integrals over these curved surfaces. After validating the exactness of the surface discretizations and demonstrating the correctness of the presented integration methods, we present a set of calculations that compare the accuracy of approximate, planar-triangle-based discretizations and exact, curved-element-based simulations of surface-generalized-Born (sGB), surface-continuum van der Waals (scvdW), and boundary-element method (BEM) electrostatics problems. Results demonstrate that continuum electrostatic calculations with BEM using curved elements, piecewise-constant basis functions, and centroid collocation are nearly ten times more accurate than planar-triangle BEM for basis sets of comparable size. The sGB and scvdW calculations give exceptional accuracy even for the coarsest obtainable discretized surfaces. The extra accuracy is attributed to the exact representation of the solute–solvent interface; in contrast, commonly used planar-triangle discretizations can only offer improved

  18. An analysis on the environmental Kuznets curve of Chengdu

    Science.gov (United States)

    Gao, Zijian; Peng, Yue; Zhao, Yue

    2017-12-01

    In this paper, based on the environmental and economic data of Chengdu from 2005 to 2014, measurement models were established to analyze 3 kinds of environmental flow indicators and 4 kinds of environmental stock indicators and to obtain their EKC evolution trajectories and characteristics. The results show that the relationship curve between the discharge of SO2 from industry and GDP per capita has a positive U shape, as does the curve between the discharge of COD from industry and GDP per capita. The relationship curve between the dust discharge from industry and GDP per capita has an inverted N shape. In the central urban area, the relationship curve between the concentration of SO2 in the air and GDP per capita has a positive U shape. The relationship curves between the concentration of NO2 in the air and GDP per capita, between the concentration of particulate matter and GDP per capita, and between the concentration of fallen dust and GDP per capita are fluctuating. Thus the EKC curves of the 7 kinds of environmental indicators do not accord with the inverted-U shape. As the city develops, its environmental problems cannot be resolved by economic growth alone. The discharge of industrial pollutants should be controlled to improve the atmospheric environmental quality and reduce the environmental risks.

  19. Provincial carbon intensity abatement potential estimation in China: A PSO–GA-optimized multi-factor environmental learning curve method

    International Nuclear Information System (INIS)

    Yu, Shiwei; Zhang, Junjie; Zheng, Shuhong; Sun, Han

    2015-01-01

    This study aims to estimate carbon intensity abatement potential in China at the regional level by proposing a particle swarm optimization–genetic algorithm (PSO–GA) multivariate environmental learning curve estimation method. The model uses two independent variables, namely, per capita gross domestic product (GDP) and the proportion of the tertiary industry in GDP, to construct carbon intensity learning curves (CILCs), i.e., CO2 emissions per unit of GDP, of 30 provinces in China. Instead of the traditional ordinary least squares (OLS) method, a PSO–GA intelligent optimization algorithm is used to optimize the coefficients of a learning curve. The carbon intensity abatement potentials of the 30 Chinese provinces are estimated via PSO–GA under the business-as-usual scenario. The estimation reveals the following results. (1) For most provinces, the abatement potentials from improving a unit of the proportion of the tertiary industry in GDP are higher than the potentials from raising a unit of per capita GDP. (2) The average potential of the 30 provinces in 2020 will be 37.6% based on the emission level of 2005. The potentials of Jiangsu, Tianjin, Shandong, Beijing, and Heilongjiang are over 60%. Ningxia is the only province without intensity abatement potential. (3) The total carbon intensity in China weighted by the GDP shares of the 30 provinces will decline by 39.4% in 2020 compared with that in 2005. This intensity cannot achieve the 40%–45% carbon intensity reduction target set by the Chinese government. Additional mitigation policies should be developed to uncover the potentials of Ningxia and Inner Mongolia. In addition, the simulation accuracy of the CILCs optimized by PSO–GA is higher than that of the CILCs optimized by the traditional OLS method. - Highlights: • A PSO–GA-optimized multi-factor environmental learning curve method is proposed. • The carbon intensity abatement potentials of the 30 Chinese provinces are estimated by

  20. Effects of gypsum and bulk density on neutron probe calibration curves

    International Nuclear Information System (INIS)

    Arslan, Awadis; Razzouk, A.K.

    1993-10-01

    The effects of gypsum and bulk density on the neutron probe calibration curve were studied in the laboratory and in the field. The effect of bulk density was negligible for the soil studied in the laboratory, while it was significant for the field calibration. An increase in the slope of moisture content on a volume basis vs. count ratio with increasing gypsum content of the soil was observed in the laboratory calibration. A simple method for correction of the calibration curve for gypsum content was adopted to obtain a specific curve for each layer. The adopted method requires the gypsum fraction to be estimated for each layer and then incorporated in the calibration curve to improve the coefficient of determination. A field calibration showed an improvement of the determination coefficient by introducing bulk density and gypsum fraction, in addition to count ratio, using moisture content on a volume basis as the dependent variable in multiple linear regression analysis. The same procedure was successful with variable gravel fractions. (author). 18 refs., 3 figs., 2 tabs

  1. An Interoperability Consideration in Selecting Domain Parameters for Elliptic Curve Cryptography

    Science.gov (United States)

    Ivancic, Will (Technical Monitor); Eddy, Wesley M.

    2005-01-01

    Elliptic curve cryptography (ECC) will be an important technology for electronic privacy and authentication in the near future. There are many published specifications for elliptic curve cryptosystems, most of which contain detailed descriptions of the process for the selection of domain parameters. Selecting strong domain parameters ensures that the cryptosystem is robust to attacks. Due to a limitation in several published algorithms for doubling points on elliptic curves, some ECC implementations may produce incorrect, inconsistent, and incompatible results if domain parameters are not carefully chosen under a criterion that we describe. Few documents specify the addition or doubling of points in such a manner as to avoid this problematic situation. The safety criterion we present is not listed in any ECC specification we are aware of, although several other guidelines for domain selection are discussed in the literature. We provide a simple example of how a set of domain parameters not meeting this criterion can produce catastrophic results, and outline a simple means of testing curve parameters for interoperable safety over doubling.
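
    The corner cases the authors highlight are easy to see in affine arithmetic: doubling a point whose y-coordinate is zero, or adding P to -P, must yield the point at infinity, and formulas that skip these checks can silently disagree. The sketch below handles them explicitly on a toy short-Weierstrass curve over a small prime field (illustrative parameters only, not a standardized curve).

```python
# Affine point arithmetic on y^2 = x^3 + a*x + b over GF(p), with the special
# cases (identity, P + (-P), doubling a point with y = 0) handled explicitly,
# i.e. the situations the abstract warns can make doubling formulas disagree.
# Toy parameters for illustration only. Requires Python 3.8+ for pow(x, -1, p).

p, a, b = 97, 2, 3
O = None  # point at infinity

def add(P, Q):
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return O                      # P + (-P) = O, includes doubling when y = 0
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return (x3, y3)

P = (3, 6)                            # on the curve: 6^2 = 36 = 3^3 + 2*3 + 3 (mod 97)
print(add(P, P), add(P, (P[0], (-P[1]) % p)))
```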

  2. ROBUST DECLINE CURVE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Sutawanir Darwis

    2012-05-01

    Full Text Available Empirical decline curve analysis of oil production data gives reasonable answers in hyperbolic-type curve situations; however, the methodology has limitations in fitting real historical production data in the presence of unusual observations caused by treatments applied to the well in order to increase production capacity. The development of robust least squares offers new possibilities for better fitting production data in decline curve analysis by down-weighting the unusual observations. This paper proposes a robust least squares fitting (lmRobMM) approach to estimate the decline rate of daily production data and compares the results with reservoir simulation results. As a case study, we use the oil production data of the TBA Field, West Java. The results demonstrate that the approach is suitable for decline curve fitting and offers new insight into decline curve analysis in the presence of unusual observations.
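
    In the same spirit, a robust loss can be used to down-weight treatment-induced spikes when fitting an Arps hyperbolic decline. The sketch below uses SciPy's soft-L1 loss on synthetic data as a stand-in for the lmRobMM estimator used in the paper; all rates, times and starting values are invented.

```python
import numpy as np
from scipy.optimize import least_squares

def arps(t, qi, Di, b):
    """Arps hyperbolic decline rate q(t) = qi / (1 + b*Di*t)^(1/b)."""
    return qi / (1.0 + b * Di * t) ** (1.0 / b)

rng = np.random.default_rng(0)
t = np.linspace(0, 1000, 200)                     # days
q = arps(t, qi=500.0, Di=4e-3, b=0.6) * rng.normal(1.0, 0.03, t.size)
q[60:70] *= 1.6                                   # burst after a well treatment

def residuals(p):
    qi, Di, b = p
    return arps(t, qi, Di, b) - q

# soft_l1 down-weights the unusual observations, in the same spirit as a
# robust MM-type fit (this is not the lmRobMM estimator itself).
fit = least_squares(residuals, x0=[400.0, 1e-3, 0.5], loss="soft_l1",
                    f_scale=10.0, bounds=([1, 1e-5, 0.01], [1e4, 1.0, 2.0]))
print("qi, Di, b =", fit.x)
```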

  3. Assessment of two theoretical methods to estimate potentiometric titration curves of peptides: comparison with experiment.

    Science.gov (United States)

    Makowska, Joanna; Bagiñska, Katarzyna; Makowski, Mariusz; Jagielska, Anna; Liwo, Adam; Kasprzykowski, Franciszek; Chmurzyñski, Lech; Scheraga, Harold A

    2006-03-09

    We compared the ability of two theoretical methods of pH-dependent conformational calculations to reproduce experimental potentiometric titration curves of two model peptides: Ac-K5-NHMe in a 95% methanol (MeOH)/5% water mixture and Ac-XX(A)7OO-NH2 (XAO) (where X is diaminobutyric acid, A is alanine, and O is ornithine) in water, methanol (MeOH), and dimethyl sulfoxide (DMSO), respectively. The titration curve of the former was taken from the literature, and the curve of the latter was determined in this work. The first theoretical method involves a conformational search using the electrostatically driven Monte Carlo (EDMC) method with a low-cost energy function (ECEPP/3 plus the SRFOPT surface-solvation model, assuming that all titratable groups are uncharged) and subsequent reevaluation of the free energy at a given pH with the Poisson-Boltzmann equation, considering variable protonation states. In the second procedure, molecular dynamics (MD) simulations are run with the AMBER force field and the generalized Born model of electrostatic solvation, and the protonation states are sampled during constant-pH MD runs. In all three solvents, the first pKa of XAO is strongly downshifted compared to the value for the reference compounds (ethylamine and propylamine, respectively); the water and methanol curves have one, and the DMSO curve has two jumps characteristic of remarkable differences in the dissociation constants of acidic groups. The predicted titration curves of Ac-K5-NHMe are in good agreement with the experimental ones; better agreement is achieved with the MD-based method. The titration curves of XAO in methanol and DMSO, calculated using the MD-based approach, trace the shape of the experimental curves, reproducing the pH jump, while those calculated with the EDMC-based approach and the titration curve in water calculated using the MD-based approach have smooth shapes characteristic of the titration of weak multifunctional acids with small differences

  4. Using GAMM to examine inter-individual heterogeneity in thermal performance curves for Natrix natrix indicates bet hedging strategy by mothers.

    Science.gov (United States)

    Vickers, Mathew J; Aubret, Fabien; Coulon, Aurélie

    2017-01-01

    The thermal performance curve (TPC) illustrates the dependence on body (and therefore environmental) temperature of many fitness-related aspects of ectotherm ecology and biology including foraging, growth, predator avoidance, and reproduction. The typical thermal performance curve model is linear in its parameters despite the well-known, strong non-linearity of the response of performance to temperature. In addition, it is usual to consider a single model based on a few individuals as descriptive of a species-level response to temperature. To overcome these issues, we used generalized additive mixed modeling (GAMM) to estimate thermal performance curves for 73 individual hatchling Natrix natrix grass snakes from seven clutches, taking advantage of the structure of GAMM to demonstrate that almost 16% of the deviance in thermal performance curves is attributed to inter-individual variation, while only 1.3% is attributable to variation amongst clutches. GAMM allows precise estimation of curve characteristics, which we used to test hypotheses on trade-offs thought to constrain the thermal performance curve: hotter is better, the specialist-generalist trade-off, and resource allocation/acquisition. We observed a negative relationship between maximum performance and performance breadth, indicating a specialist-generalist trade-off, and a positive relationship between thermal optimum and maximum performance, suggesting "hotter is better". There was a significant difference among matrilines in the relationship between Area Under the Curve and maximum performance, a relationship that is an indicator of evenness in acquisition or allocation of resources. As we used unfed hatchlings, the observed matriline effect indicates divergent breeding strategies among mothers, with some mothers provisioning eggs unequally, resulting in some offspring being better than others, while other mothers provisioned the eggs more evenly, resulting in even performance throughout the clutch. This

  5. On a framework for generating PoD curves assisted by numerical simulations

    Energy Technology Data Exchange (ETDEWEB)

    Subair, S. Mohamed, E-mail: prajagopal@iitm.ac.in; Agrawal, Shweta, E-mail: prajagopal@iitm.ac.in; Balasubramaniam, Krishnan, E-mail: prajagopal@iitm.ac.in; Rajagopal, Prabhu, E-mail: prajagopal@iitm.ac.in [Indian Institute of Technology Madras, Department of Mechanical Engineering, Chennai, T.N. (India); Kumar, Anish; Rao, Purnachandra B.; Tamanna, Jayakumar [Indira Gandhi Centre for Atomic Research, Metallurgy and Materials Group, Kalpakkam, T.N. (India)

    2015-03-31

    The Probability of Detection (PoD) curve method has emerged as an important tool for the assessment of the performance of NDE techniques, a topic of particular interest to the nuclear industry where inspection qualification is very important. The conventional experimental means of generating PoD curves, though, can be expensive, requiring large data sets (covering defects and test conditions) and equipment and operator time. Several methods of achieving faster estimates for PoD curves using physics-based modelling have been developed to address this problem. Numerical modelling techniques are also attractive, especially given the ever-increasing computational power available to scientists today. Here we develop procedures for obtaining PoD curves, assisted by numerical simulation and based on Bayesian statistics. Numerical simulations are performed using Finite Element analysis for factors that are assumed to be independent, random and normally distributed. PoD curves so generated are compared with experiments on austenitic stainless steel (SS) plates with artificially created notches. We examine issues affecting the PoD curve generation process including codes, standards, distribution of defect parameters and the choice of the noise threshold. We also study the assumption of normal distribution for signal response parameters and consider strategies for dealing with data that may be too complex or sparse to justify this. These topics are addressed and illustrated through the example case of generation of PoD curves for pulse-echo ultrasonic inspection of vertical surface-breaking cracks in SS plates.
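
    For hit/miss inspection data, a standard baseline is the log-normal PoD model, PoD(a) = Φ((ln a - μ)/σ), fitted by maximum likelihood; simulation-assisted and Bayesian procedures build on the same curve shape. The sketch below fits that baseline to synthetic data; crack sizes, detection outcomes and the resulting a90 are all invented and are not results from the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Hit/miss PoD model PoD(a) = Phi((ln a - mu)/sigma), maximum-likelihood fit.
rng = np.random.default_rng(1)
a = rng.uniform(0.2, 5.0, 300)                    # crack depths, mm (assumed)
true_pod = norm.cdf((np.log(a) - np.log(1.0)) / 0.4)
hit = rng.random(300) < true_pod                  # detected or not

def neg_log_like(params):
    mu, log_sigma = params
    p = norm.cdf((np.log(a) - mu) / np.exp(log_sigma))
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(hit * np.log(p) + (~hit) * np.log(1 - p))

res = minimize(neg_log_like, x0=[0.0, 0.0], method="Nelder-Mead")
mu, sigma = res.x[0], np.exp(res.x[1])
a90 = np.exp(mu + sigma * norm.ppf(0.90))         # size detected 90% of the time
print(f"mu={mu:.3f}, sigma={sigma:.3f}, a90={a90:.2f} mm")
```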

  6. NEW CONCEPTS AND TEST METHODS OF CURVE PROFILE AREA DENSITY IN SURFACE: ESTIMATION OF AREAL DENSITY ON CURVED SPATIAL SURFACE

    OpenAIRE

    Hong Shen

    2011-01-01

    The concepts of curve profile, curve intercept, curve intercept density, curve profile area density, intersection density in containing intersection (or intersection density relied on intersection reference), curve profile intersection density in surface (or curve intercept intersection density relied on intersection of containing curve), and curve profile area density in surface (AS) were defined. AS expressed the amount of curve profile area of Y phase in the unit containing surface area, S...

  7. Addition compounds of lanthanide (III) and yttrium (III) hexafluorophosphates and N,N-dimethylformamide

    International Nuclear Information System (INIS)

    Braga, L.S.P.

    1983-01-01

    Addition compounds of lanthanide (III) and yttrium (III) hexafluorophosphates and N,N-dimethylformamide are described. To characterize the complexes, elemental analysis, melting ranges, molar conductance measurements, X-ray powder patterns, infrared and Raman spectra, and TG and DTA curves are studied. Information concerning the decomposition of the adducts is obtained from the thermogravimetric curves and the differential thermal analysis curves. (M.J.C.) [pt

  8. Evaluation of viewing experiences induced by a curved three-dimensional display

    Science.gov (United States)

    Mun, Sungchul; Park, Min-Chul; Yano, Sumio

    2015-10-01

    Despite an increased need for three-dimensional (3-D) functionality in curved displays, comparisons pertinent to human factors between curved and flat panel 3-D displays have rarely been tested. This study compared stereoscopic 3-D viewing experiences induced by a curved display with those of a flat panel display by evaluating subjective and objective measures. Twenty-four participants took part in the experiments and viewed 3-D content with two different displays (flat and curved 3-D display) within a counterbalanced and within-subject design. For the 30-min viewing condition, a paired t-test showed significantly reduced P300 amplitudes, which were caused by engagement rather than cognitive fatigue, in the curved 3-D viewing condition compared to the flat 3-D viewing condition at P3 and P4. No significant differences in P300 amplitudes were observed for 60-min viewing. Subjective ratings of realness and engagement were also significantly higher in the curved 3-D viewing condition than in the flat 3-D viewing condition for 30-min viewing. Our findings support that curved 3-D displays can be effective for enhancing engagement among viewers based on specific viewing times and environments.

  9. M-curves and symmetric products

    Indian Academy of Sciences (India)

    Indranil Biswas

    2017-08-03

    Aug 3, 2017 ... is bounded above by g + 1, where g is the genus of X [11]. Curves which have exactly the maximum number (i.e., genus +1) of components of the real part are called M-curves. Classifying real algebraic curves up to homeomorphism is straightforward, however, classifying even planar non-singular real ...

  10. IS THE J-CURVE EFFECT OBSERVABLE IN TURKISH AGRICULTURAL SECTOR?

    Directory of Open Access Journals (Sweden)

    Mehmet YAZICI

    2006-12-01

    Full Text Available This paper investigates whether or not the J-curve hypothesis holds in the Turkish agricultural sector. The analysis is conducted using the model most commonly employed in the J-curve literature. Based on data covering the period from 1986:I to 1998:III, our results indicate that, following devaluation, the agricultural trade balance initially improves, then worsens, and then improves again. This pattern shows that the J-curve effect does not exist in the Turkish agricultural sector. Another important finding is that devaluation worsens the trade balance of the sector in the long run, a result contradicting the earlier findings for the Turkish economy as a whole.

  11. A New Model of Stopping Sight Distance of Curve Braking Based on Vehicle Dynamics

    Directory of Open Access Journals (Sweden)

    Rong-xia Xia

    2016-01-01

    Full Text Available Compared with straight-line braking, cornering braking has a longer braking distance and poorer stability. Therefore, drivers are more prone to making mistakes. The braking process and the dynamics of vehicles in emergency situations on curves were analyzed. A biaxial four-wheel vehicle was simplified to a single model. Considering the braking process, dynamics, force distribution, and stability, a calculation model for the stopping sight distance under curve braking was built. Then a driver-vehicle-road simulation platform was built using multibody dynamics software. The brake-in-turn vehicle test was realized on this platform. The comparison of experimental and calculated values verified the reliability of the computational model. Eventually, the experimental values and calculated values were compared with the stopping sight distance recommended by the Highway Route Design Specification (JTGD20-2006); the current specification of stopping sight distance does not apply to cornering brake sight distance requirements. In this paper, the general values and limits of the curve stopping sight distance are presented.
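
    A back-of-the-envelope version of the problem uses the friction circle: part of the available tyre friction is consumed by the lateral demand v^2/(gR) - e, leaving less for braking, so the stopping sight distance grows in a curve. The sketch below implements that simplified estimate with assumed friction, superelevation and reaction-time values; it is not the vehicle-dynamics model of the paper or the JTGD20-2006 specification values.

```python
import numpy as np

def curve_ssd(v_kmh, R, e=0.06, f_max=0.35, t_pr=2.5, g=9.81):
    """Simplified stopping sight distance for braking in a curve.

    Friction-circle argument: the lateral friction demand on a curve of
    radius R (with superelevation e) reduces the friction left for braking.
    Illustrative only; all parameter values are assumed."""
    v = v_kmh / 3.6
    f_lat = v * v / (g * R) - e                        # lateral friction demand
    f_long = np.sqrt(np.maximum(f_max**2 - f_lat**2, 1e-6))  # friction left to brake
    return v * t_pr + v * v / (2.0 * g * f_long)

for v in (60, 80, 100):
    print(v, "km/h:",
          round(curve_ssd(v, R=250.0), 1), "m in a 250 m curve vs",
          round(curve_ssd(v, R=1e9), 1), "m on a straight")
```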

  12. Antigen-antibody biorecognition events as discriminated by noise analysis of force spectroscopy curves.

    Science.gov (United States)

    Bizzarri, Anna Rita; Cannistraro, Salvatore

    2014-08-22

    Atomic force spectroscopy is able to extract kinetic and thermodynamic parameters of biomolecular complexes provided that the registered unbinding force curves can be reliably attributed to the rupture of the specific complex interactions. To this aim, a commonly used strategy is based on the analysis of the stretching features of polymeric linkers which are suitably introduced in the biomolecule-substrate immobilization procedure. Alternatively, we present a method to select force curves corresponding to specific biorecognition events, which relies on a careful analysis of the force fluctuations of the biomolecule-functionalized cantilever tip during its approach to the partner molecules immobilized on a substrate. In the low-frequency region, a characteristic 1/f^α noise with α equal to one (flicker noise) is found to replace white noise in the cantilever fluctuation power spectrum when, and only when, a specific biorecognition process between the partners occurs. The method, which has been validated on a well-characterized antigen-antibody complex, represents a fast, yet reliable alternative to the use of linkers, which may involve additional surface chemistry and reproducibility concerns.
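
    The discrimination criterion is a spectral one: estimate the power spectral density of the cantilever force fluctuations and check whether the low-frequency part follows 1/f^α with α near one. The sketch below builds a synthetic flicker-like signal and recovers α with Welch's method and a log-log slope fit; the sampling rate, band limits and signal itself are arbitrary stand-ins for measured data.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(2)
fs, n = 10_000.0, 2**17
white = rng.normal(size=n)
# Crude 1/f ("flicker") surrogate: shape white noise in the frequency domain.
spec = np.fft.rfft(white)
freq = np.fft.rfftfreq(n, 1 / fs)
spec[1:] /= np.sqrt(freq[1:])                  # amplitude ~ f^-1/2 -> PSD ~ 1/f
signal = np.fft.irfft(spec, n)

f, pxx = welch(signal, fs=fs, nperseg=8192)
band = (f > 1) & (f < 100)                     # low-frequency region of interest
slope, _ = np.polyfit(np.log(f[band]), np.log(pxx[band]), 1)
print(f"fitted alpha = {-slope:.2f} (flicker noise corresponds to alpha ~ 1)")
```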

  13. Study of plutonium-addition systems

    International Nuclear Information System (INIS)

    Kuchar, L.; Wozniakova, B.

    1976-11-01

    Steady state phase diagrams and calculated values of the concentrations on the solidus and liquidus curves, the steady state distribution coefficient and thermodynamic control are presented for temperatures ranging from the eutectic reaction temperatures to the Pu melting point for the binary plutonium-addition systems (Mg, Al, Si, Ti, Mn, Fe, Co, Ni, Cu, Zn, Ga, Zr, Ru, Os, Th, U, Np). (J.P.)

  14. Experimental Tracking of Limit-Point Bifurcations and Backbone Curves Using Control-Based Continuation

    Science.gov (United States)

    Renson, Ludovic; Barton, David A. W.; Neild, Simon A.

    Control-based continuation (CBC) is a means of applying numerical continuation directly to a physical experiment for bifurcation analysis without the use of a mathematical model. CBC enables the detection and tracking of bifurcations directly, without the need for a post-processing stage as is often the case for more traditional experimental approaches. In this paper, we use CBC to directly locate limit-point bifurcations of a periodically forced oscillator and track them as forcing parameters are varied. Backbone curves, which capture the overall frequency-amplitude dependence of the system’s forced response, are also traced out directly. The proposed method is demonstrated on a single-degree-of-freedom mechanical system with a nonlinear stiffness characteristic. Results are presented for two configurations of the nonlinearity — one where it exhibits a hardening stiffness characteristic and one where it exhibits softening-hardening.

  15. Designing learning curves for carbon capture based on chemical absorption according to the minimum work of separation

    International Nuclear Information System (INIS)

    Rochedo, Pedro R.R.; Szklo, Alexandre

    2013-01-01

    Highlights: • This work defines the minimum work of separation (MWS) for a capture process. • Findings of the analysis indicated an MWS of 0.158 GJ/t for post-combustion. • A review of commercially available processes based on chemical absorption was made. • A review of learning models was conducted, with the addition of a novel model. • A learning curve for post-combustion carbon capture was successfully designed. - Abstract: Carbon capture is one of the most important alternatives for mitigating greenhouse gas emissions in energy facilities. The post-combustion route based on chemical absorption with amine solvents is the most feasible alternative for the short term. However, this route implies huge energy penalties, mainly related to the solvent regeneration. By defining the minimum work of separation (MWS), this study estimated the minimum energy required to capture the CO2 emitted by coal-fired thermal power plants. Then, by evaluating solvents and processes and comparing them to the MWS, it proposes the learning model with the best fit for the post-combustion chemical absorption of CO2. Learning models are based on gains from experience, which can include the intensity of research and development. In this study, three models are tested: Wright, DeJong and D and L. Findings of the thermochemical analysis indicated an MWS of 0.158 GJ/t for post-combustion. Conventional solvents currently present an energy penalty eight times the MWS. By using the MWS as a constraint, this study found that the D and L model provided the best fit to the available data of chemical solvents and absorption plants. The learning rate determined through this model is very similar to the ones found in the literature
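
    The single-factor Wright model is the simplest of the learning models mentioned: specific cost (here, capture energy) falls by a fixed fraction with every doubling of cumulative capacity. The sketch below fits a Wright-type curve floored at the 0.158 GJ/t minimum work of separation quoted in the abstract; the cumulative-capacity and energy figures are invented, and the paper's preferred D and L model is not reproduced.

```python
import numpy as np
from scipy.optimize import curve_fit

MWS = 0.158                                        # GJ per tonne CO2 (from the abstract)

def wright_floor(x, E1, b):
    """E(x) = MWS + (E1 - MWS) * x**(-b), x = cumulative installed capacity."""
    return MWS + (E1 - MWS) * x ** (-b)

x = np.array([1, 2, 4, 8, 16, 32], float)          # capacity doublings (assumed)
E = np.array([1.30, 1.15, 1.02, 0.92, 0.84, 0.78]) # GJ/t, assumed observations

popt, _ = curve_fit(wright_floor, x, E, p0=[1.3, 0.2])
E1, b = popt
print(f"first-of-a-kind energy {E1:.2f} GJ/t, learning rate {1 - 2**-b:.1%}")
```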

  16. UBVRIz LIGHT CURVES OF 51 TYPE II SUPERNOVAE

    International Nuclear Information System (INIS)

    Galbany, Lluis; Hamuy, Mario; Jaeger, Thomas de; Moraga, Tania; González-Gaitán, Santiago; Gutiérrez, Claudia P.; Phillips, Mark M.; Morrell, Nidia I.; Thomas-Osip, Joanna; Suntzeff, Nicholas B.; Maza, José; González, Luis; Antezana, Roberto; Wishnjewski, Marina; Krisciunas, Kevin; Krzeminski, Wojtek; McCarthy, Patrick; Anderson, Joseph P.; Stritzinger, Maximilian; Folatelli, Gastón

    2016-01-01

    We present a compilation of UBVRIz light curves of 51 type II supernovae discovered during the course of four different surveys during 1986–2003: the Cerro Tololo Supernova Survey, the Calán/Tololo Supernova Program (C and T), the Supernova Optical and Infrared Survey (SOIRS), and the Carnegie Type II Supernova Survey (CATS). The photometry is based on template-subtracted images to eliminate any potential host galaxy light contamination, and calibrated from foreground stars. This work presents these photometric data, studies the color evolution using different bands, and explores the relation between the magnitude at maximum brightness and the brightness decline parameter (s) from maximum light through the end of the recombination phase. This parameter is found to be shallower for redder bands and appears to have the best correlation in the B band. In addition, it also correlates with the plateau duration, being shorter (longer) for larger (smaller) s values

  17. UBVRIz LIGHT CURVES OF 51 TYPE II SUPERNOVAE

    Energy Technology Data Exchange (ETDEWEB)

    Galbany, Lluis; Hamuy, Mario; Jaeger, Thomas de; Moraga, Tania; González-Gaitán, Santiago; Gutiérrez, Claudia P. [Millennium Institute of Astrophysics, Universidad de Chile (Chile); Phillips, Mark M.; Morrell, Nidia I.; Thomas-Osip, Joanna [Carnegie Observatories, Las Campanas Observatory, Casilla 60, La Serena (Chile); Suntzeff, Nicholas B. [Department of Physics and Astronomy, Texas A and M University, College Station, TX 77843 (United States); Maza, José; González, Luis; Antezana, Roberto; Wishnjewski, Marina [Departamento de Astronomía, Universidad de Chile, Camino El Observatorio 1515, Las Condes, Santiago (Chile); Krisciunas, Kevin [George P. and Cynthia Woods Mitchell Institute for Fundamental Physics and Astronomy, Texas A. and M. University, Department of Physics and Astronomy, 4242 TAMU, College Station, TX 77843 (United States); Krzeminski, Wojtek [N. Copernicus Astronomical Center, ul. Bartycka 18, 00-716 Warszawa (Poland); McCarthy, Patrick [The Observatories of the Carnegie Institution for Science, 813 Santa Barbara Street, Pasadena, CA 91101 (United States); Anderson, Joseph P. [European Southern Observatory, Alonso de Cordova 3107, Vitacura, Casilla 19001, Santiago (Chile); Stritzinger, Maximilian [Department of Physics and Astronomy, Aarhus University (Denmark); Folatelli, Gastón, E-mail: lgalbany@das.uchile.cl [Instituto de Astrofísica de La Plata (IALP, CONICET) (Argentina); and others

    2016-02-15

    We present a compilation of UBVRIz light curves of 51 type II supernovae discovered during the course of four different surveys during 1986–2003: the Cerro Tololo Supernova Survey, the Calán/Tololo Supernova Program (C and T), the Supernova Optical and Infrared Survey (SOIRS), and the Carnegie Type II Supernova Survey (CATS). The photometry is based on template-subtracted images to eliminate any potential host galaxy light contamination, and calibrated from foreground stars. This work presents these photometric data, studies the color evolution using different bands, and explores the relation between the magnitude at maximum brightness and the brightness decline parameter (s) from maximum light through the end of the recombination phase. This parameter is found to be shallower for redder bands and appears to have the best correlation in the B band. In addition, it also correlates with the plateau duration, being shorter (longer) for larger (smaller) s values.

  18. SCEW: a Microsoft Excel add-in for easy creation of survival curves.

    Science.gov (United States)

    Khan, Haseeb Ahmad

    2006-07-01

    Survival curves are frequently used for reporting survival or mortality outcomes of experimental pharmacological/toxicological studies and of clinical trials. Microsoft Excel is a simple and widely used tool for creating numerous types of graphic presentations; however, it is difficult to create step-wise survival curves in Excel. Considering the familiarity of clinicians and biomedical scientists with Excel, an algorithm, survival curves in Excel worksheets (SCEW), has been developed for easy creation of survival curves directly in Excel worksheets. The algorithm has been integrated in the form of an Excel add-in for easy installation and usage. The program is based on modification of frequency data for binary break-up using the spreadsheet formula functions, whereas a macro subroutine automates the creation of survival curves. The advantages of this program are simple data input, minimal procedural steps and the creation of survival curves in the familiar confines of Excel.
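
    The step-wise curve itself is the Kaplan-Meier product-limit estimate; the sketch below computes the same steps in plain Python from hypothetical time/censoring data, corresponding to what SCEW assembles with spreadsheet formulas and a macro.

```python
import numpy as np

def kaplan_meier(time, event):
    """Product-limit survival estimate from event times and censoring flags."""
    order = np.argsort(time)
    time, event = np.asarray(time)[order], np.asarray(event)[order]
    s, steps, surv = 1.0, [], []
    n_at_risk = len(time)
    for t in np.unique(time):
        d = np.sum((time == t) & (event == 1))     # deaths at t
        if d:
            s *= 1.0 - d / n_at_risk
            steps.append(t)
            surv.append(s)
        n_at_risk -= np.sum(time == t)             # deaths and censored leave the risk set
    return np.array(steps), np.array(surv)

# Hypothetical survival data: times in days, event = 1 death, event = 0 censored.
t = [5, 8, 8, 12, 15, 20, 22, 30, 30, 34]
e = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
for ti, si in zip(*kaplan_meier(t, e)):
    print(f"day {ti:>2}: S(t) = {si:.2f}")
```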

  19. Optimization of equivalent uniform dose using the L-curve criterion

    International Nuclear Information System (INIS)

    Chvetsov, Alexei V; Dempsey, James F; Palta, Jatinder R

    2007-01-01

    Optimization of equivalent uniform dose (EUD) in inverse planning for intensity-modulated radiation therapy (IMRT) prevents variation in radiobiological effect between different radiotherapy treatment plans, which is due to variation in the pattern of dose nonuniformity. For instance, the survival fraction of clonogens would be consistent with the prescription when the optimized EUD is equal to the prescribed EUD. One of the problems in the practical implementation of this approach is that the spatial dose distribution in EUD-based inverse planning would be underdetermined because an unlimited number of nonuniform dose distributions can be computed for a prescribed value of EUD. Together with ill-posedness of the underlying integral equation, this may significantly increase the dose nonuniformity. To optimize EUD and keep dose nonuniformity within reasonable limits, we implemented into an EUD-based objective function an additional criterion which ensures the smoothness of beam intensity functions. This approach is similar to the variational regularization technique which was previously studied for the dose-based least-squares optimization. We show that the variational regularization together with the L-curve criterion for the regularization parameter can significantly reduce dose nonuniformity in EUD-based inverse planning

  20. Optimization of equivalent uniform dose using the L-curve criterion

    Energy Technology Data Exchange (ETDEWEB)

    Chvetsov, Alexei V; Dempsey, James F; Palta, Jatinder R [Department of Radiation Oncology, University of Florida, Gainesville, FL 32610-0385 (United States)

    2007-09-21

    Optimization of equivalent uniform dose (EUD) in inverse planning for intensity-modulated radiation therapy (IMRT) prevents variation in radiobiological effect between different radiotherapy treatment plans, which is due to variation in the pattern of dose nonuniformity. For instance, the survival fraction of clonogens would be consistent with the prescription when the optimized EUD is equal to the prescribed EUD. One of the problems in the practical implementation of this approach is that the spatial dose distribution in EUD-based inverse planning would be underdetermined because an unlimited number of nonuniform dose distributions can be computed for a prescribed value of EUD. Together with ill-posedness of the underlying integral equation, this may significantly increase the dose nonuniformity. To optimize EUD and keep dose nonuniformity within reasonable limits, we implemented into an EUD-based objective function an additional criterion which ensures the smoothness of beam intensity functions. This approach is similar to the variational regularization technique which was previously studied for the dose-based least-squares optimization. We show that the variational regularization together with the L-curve criterion for the regularization parameter can significantly reduce dose nonuniformity in EUD-based inverse planning.

  1. Optimization of equivalent uniform dose using the L-curve criterion.

    Science.gov (United States)

    Chvetsov, Alexei V; Dempsey, James F; Palta, Jatinder R

    2007-10-07

    Optimization of equivalent uniform dose (EUD) in inverse planning for intensity-modulated radiation therapy (IMRT) prevents variation in radiobiological effect between different radiotherapy treatment plans, which is due to variation in the pattern of dose nonuniformity. For instance, the survival fraction of clonogens would be consistent with the prescription when the optimized EUD is equal to the prescribed EUD. One of the problems in the practical implementation of this approach is that the spatial dose distribution in EUD-based inverse planning would be underdetermined because an unlimited number of nonuniform dose distributions can be computed for a prescribed value of EUD. Together with ill-posedness of the underlying integral equation, this may significantly increase the dose nonuniformity. To optimize EUD and keep dose nonuniformity within reasonable limits, we implemented into an EUD-based objective function an additional criterion which ensures the smoothness of beam intensity functions. This approach is similar to the variational regularization technique which was previously studied for the dose-based least-squares optimization. We show that the variational regularization together with the L-curve criterion for the regularization parameter can significantly reduce dose nonuniformity in EUD-based inverse planning.
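
    The quantity optimized in the preceding records is the generalized equivalent uniform dose, EUD = (Σ v_i d_i^a)^(1/a). The sketch below evaluates it for two hypothetical target dose distributions with the same mean dose, which illustrates why an unlimited family of nonuniform distributions can share one EUD value and why an extra smoothness (regularization) term is needed to keep dose nonuniformity in check.

```python
import numpy as np

def eud(dose, a, volume=None):
    """Generalized EUD = (sum v_i * d_i**a)**(1/a).
    a is negative for targets (penalises cold spots), a >> 1 for serial organs."""
    dose = np.asarray(dose, float)
    v = np.full(dose.shape, 1.0 / dose.size) if volume is None else volume
    return np.sum(v * dose ** a) ** (1.0 / a)

# Two hypothetical target dose distributions with the same mean dose but
# different nonuniformity (doses in Gy, made up for illustration).
uniform = np.full(1000, 60.0)
nonuniform = np.concatenate([np.full(900, 62.0), np.full(100, 42.0)])
for name, d in [("uniform", uniform), ("nonuniform", nonuniform)]:
    print(name, "mean", round(d.mean(), 1), "Gy, EUD(a=-10)",
          round(eud(d, a=-10), 1), "Gy")
```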

  2. Retrograde curves of solidus and solubility

    International Nuclear Information System (INIS)

    Vasil'ev, M.V.

    1979-01-01

    The investigation was concerned with the constitutional diagrams of the eutectic type with "retrograde solidus" and "retrograde solubility curve" which must be considered as diagrams with degenerate monotectic transformation. The solidus and the solubility curves form a retrograde curve with a common retrograde point representing the solubility maximum. The two branches of the retrograde curve can be described with the aid of two similar equations. Presented are corresponding equations for the Cd-Zn system and shown is the possibility of predicting the run of the solubility curve

  3. [Customized and non-customized French intrauterine growth curves. II - Comparison with existing curves and benefits of customization].

    Science.gov (United States)

    Ego, A; Prunet, C; Blondel, B; Kaminski, M; Goffinet, F; Zeitlin, J

    2016-02-01

    Our aim is to compare the new French EPOPé intrauterine growth curves, developed to address the 2013 guidelines of the French College of Obstetricians and Gynecologists, with reference curves currently used in France, and to evaluate the consequences of their adjustment for fetal sex and maternal characteristics. Eight intrauterine and birthweight curves used in France were compared to the EPOPé curves using data from the French Perinatal Survey 2010. The influence of adjustment on the rate of SGA births and the characteristics of these births was analysed. Due to their birthweight values and distribution, the selected intrauterine curves are less suitable for births in France than the new curves. Birthweight curves led to low rates of SGA births from 4.3 to 8.5% compared to 10.0% with the EPOPé curves. The adjustment for maternal and fetal characteristics avoids the over-representation of girls among SGA births, and reclassifies 4% of births. Among births reclassified as SGA, the frequency of medical and obstetrical risk factors for growth restriction, smoking (≥10 cigarettes/day), and neonatal transfer is higher than among non-SGA births (P<0.01). The EPOPé curves are more suitable for French births than currently used curves, and their adjustment improves the identification of mothers and babies at risk of growth restriction and poor perinatal outcomes. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  4. Extended analysis of cooling curves

    International Nuclear Information System (INIS)

    Djurdjevic, M.B.; Kierkus, W.T.; Liliac, R.E.; Sokolowski, J.H.

    2002-01-01

    Thermal Analysis (TA) is the measurement of changes in a physical property of a material that is heated through a phase transformation temperature range. The temperature changes in the material are recorded as a function of the heating or cooling time in such a manner that allows for the detection of phase transformations. In order to increase accuracy, characteristic points on the cooling curve have been identified using the first derivative curve plotted versus time. In this paper, an alternative approach to the analysis of the cooling curve has been proposed. The first derivative curve has been plotted versus temperature and all characteristic points have been identified with the same accuracy achieved using the traditional method. The new cooling curve analysis also enables the Dendrite Coherency Point (DCP) to be detected using only one thermocouple. (author)
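
    The proposed analysis amounts to computing dT/dt numerically and reading it against temperature rather than time, so that arrests and other characteristic points show up as departures from the smooth baseline. The sketch below does this for a synthetic cooling record with a short thermal arrest; real castings and the dendrite coherency point criterion are more involved than this illustration.

```python
import numpy as np

# Synthetic cooling record: linear cooling interrupted by a short thermal
# arrest (latent heat release). Only the mechanics of the derivative-versus-
# temperature reading are shown; all numbers are invented.
t = np.linspace(0.0, 300.0, 3001)              # s
T = 700.0 - 0.5 * t                            # deg C, baseline cooling
arrest = (T > 607.0) & (T < 613.0)
T = np.where(arrest, 610.0 + 0.02 * (T - 610.0), T)

dTdt = np.gradient(T, t)                       # first-derivative curve
# first point where the cooling rate departs clearly from the baseline value
onset = np.argmax(np.abs(dTdt) < 0.25 * np.abs(dTdt[0]))
print(f"arrest onset near T = {T[onset]:.1f} deg C "
      f"(dT/dt = {dTdt[onset]:.3f} K/s vs baseline {dTdt[0]:.2f} K/s)")
```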

  5. A wide-range embrittlement trend curve for western RPV steels

    International Nuclear Information System (INIS)

    Kirk, M.T.

    2011-01-01

    Embrittlement trend curves (ETCs) are used to estimate neutron irradiation embrittlement as a function of both exposure (fluence, flux, temperature, ...) and composition variables. ETCs provide information needed to assess the structural integrity of operating nuclear reactors, and to determine their suitability for continued safe operation. Past efforts on ETC development in the United States have used data drawn from domestic licensees. While this approach has addressed past needs well, future needs such as power up-rates, license extensions to 60 years and beyond, and the use of low copper materials in new reactors produce future operating conditions for the US reactor fleet that may differ from past experience, suggesting that data from sources other than licensee surveillance programs may be needed. In this paper we draw together embrittlement data expressed in terms of ΔT41J and ΔYS from a wide variety of data sources as a first step in examining future embrittlement trends. We develop a 'wide range' ETC based on a collection of over 2500 data. We assess how well this ETC models the whole database, as well as significant data subsets. Comparisons presented herein indicate that a single algebraic model, denoted WR-C(5), represents reasonably well both the trends evident in the data overall as well as trends exhibited by four special data subsets. The WR-C(5) model indicates the existence of trends in high fluence data (Φ > 2-3×10^19 n/cm^2, E > 1 MeV) that are not as apparent in the US surveillance data due to the limited quantity of ΔT30 data measured at high fluence in this dataset. Additionally, WR-C(5) models well the trends in both test and power reactor data despite the fact that it has no term to account for flux. It is suggested that one appropriate use of the WR-C(5) trend curve may include the design of irradiation studies to validate or refute the findings presented herein. Additionally, WR-C(5) could be used, along with other information (e.g., other

  6. Numerical analysis of thermoluminescence glow curves

    International Nuclear Information System (INIS)

    Gomez Ros, J. M.; Delgado, A.

    1989-01-01

    This report presents a method for the numerical analysis of complex thermoluminescence glow curves resolving the individual glow peak components. The method employs first-order kinetics analytical expressions and is based on a Marquardt-Levenberg minimization procedure. A simplified version of this method for thermoluminescence dosimetry (TLD) is also described and specifically developed to operate with lithium fluoride TLD-100. (Author). 36 refs
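
    For first-order kinetics, each glow peak follows the Randall-Wilkins expression I(T) = n0·s·exp(-E/kT)·exp(-(s/β)·∫exp(-E/kT')dT'), and a deconvolution fits a sum of such peaks to the measured curve. The sketch below evaluates the expression numerically for two overlapping peaks with illustrative parameters; these are not the TLD-100 values used in the report.

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

k = 8.617e-5                   # Boltzmann constant, eV/K

def glow_peak(T, n0, E, s, beta):
    """First-order (Randall-Wilkins) glow peak at linear heating rate beta (K/s)."""
    boltz = np.exp(-E / (k * T))
    integral = cumulative_trapezoid(boltz, T, initial=0.0)
    return n0 * s * boltz * np.exp(-(s / beta) * integral)

T = np.linspace(300.0, 600.0, 1500)                # K
I = (glow_peak(T, n0=1.0, E=1.25, s=1e12, beta=1.0)    # two overlapping peaks,
     + glow_peak(T, n0=0.6, E=1.45, s=1e12, beta=1.0)) # parameters assumed
print("global maximum near T =", round(T[np.argmax(I)], 1), "K")
# A deconvolution would pass a sum of glow_peak components to
# scipy.optimize.curve_fit to recover (n0, E, s) for each peak.
```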

  7. Rational Degenerations of M-Curves, Totally Positive Grassmannians and KP2-Solitons

    Science.gov (United States)

    Abenda, Simonetta; Grinevich, Petr G.

    2018-03-01

    We establish a new connection between the theory of totally positive Grassmannians and the theory of M-curves using the finite-gap theory for solitons of the KP equation. Here and in the following, KP equation denotes the Kadomtsev-Petviashvili 2 equation [see (1)], which is the first flow from the KP hierarchy. We also assume that all KP times are real. We associate to any point of the real totally positive Grassmannian Gr^tp(N, M) a reducible curve which is a rational degeneration of an M-curve of minimal genus g = N(M-N), and we reconstruct the real algebraic-geometric data à la Krichever for the underlying real bounded multiline KP soliton solutions. From this construction, it follows that these multiline solitons can be explicitly obtained by degenerating regular real finite-gap solutions corresponding to smooth M-curves. In our approach, we rule the addition of each new rational component to the spectral curve via an elementary Darboux transformation which corresponds to a section of a specific projection Gr^tp(r+1, M-N+r+1) → Gr^tp(r, M-N+r).

  8. An assessment of mode-coupling and falling-friction mechanisms in railway curve squeal through a simplified approach

    Science.gov (United States)

    Ding, Bo; Squicciarini, Giacomo; Thompson, David; Corradi, Roberto

    2018-06-01

    Curve squeal is one of the most annoying types of noise caused by the railway system. It usually occurs when a train or tram is running around tight curves. Although this phenomenon has been studied for many years, the generation mechanism is still the subject of controversy and not fully understood. A negative slope in the friction curve under full sliding has long been considered the main cause of curve squeal, but more recently mode coupling has been demonstrated to be another possible explanation. Mode coupling relies on the inclusion of both the lateral and vertical dynamics at the contact, and an exchange of energy occurs between the normal and the axial directions. The purpose of this paper is to assess the role of the mode-coupling and falling-friction mechanisms in curve squeal through the use of a simple approach based on practical parameter values representative of an actual situation. A tramway wheel is adopted to study the effect of the adhesion coefficient, the lateral contact position, the contact angle and the damping ratio. Cases corresponding to both inner and outer wheels in the curve are considered, and it is shown that there are situations in which both wheels can squeal due to mode coupling. Additionally, a negative slope is introduced in the friction curve while keeping the vertical dynamics active in order to analyse both mechanisms together. It is shown that, in the presence of mode coupling, the squealing frequency can differ from the natural frequency of either of the coupled wheel modes. Moreover, a phase difference between wheel vibration in the vertical and lateral directions is observed as a characteristic of mode coupling. For both of these features, a qualitative comparison is made with field measurements, which show the same behaviour.
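
    The essence of the mode-coupling mechanism can be shown with a minimal two-mode model in which friction makes the stiffness matrix non-symmetric (circulatory); above a critical friction coefficient a pair of eigenvalues crosses into the right half-plane and the response grows, i.e. squeal. The sketch below uses arbitrary modal parameters and is not the tramway-wheel model of the paper.

```python
import numpy as np

def state_eigvals(mu, k_c=1.5e6):
    """Eigenvalues of a 2-DOF system whose stiffness coupling scales with mu."""
    M = np.eye(2)                               # modal masses
    C = np.diag([2.0, 2.0])                     # light modal damping
    K = np.array([[4.0e6,     mu * k_c],
                  [-mu * k_c, 4.4e6]])          # circulatory (non-symmetric) coupling
    A = np.block([[np.zeros((2, 2)), np.eye(2)],
                  [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
    return np.linalg.eigvals(A)

for mu in (0.1, 0.3, 0.5):
    growth = np.max(state_eigvals(mu).real)
    print(f"mu = {mu}: max Re(eigenvalue) = {growth:8.2f}",
          "-> unstable (squeal)" if growth > 0 else "-> stable")
```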

  9. Rectification of light refraction in curved waveguide arrays.

    Science.gov (United States)

    Longhi, Stefano

    2009-02-15

    An "optical ratchet" for discretized light in photonic lattices, which enables observing rectification of light refraction at any input beam conditions, is theoretically presented, and a possible experimental implementation based on periodically curved zigzag waveguide arrays is proposed.

  10. Elliptic Curve Cryptography-Based Authentication with Identity Protection for Smart Grids.

    Directory of Open Access Journals (Sweden)

    Liping Zhang

    Full Text Available In a smart grid, the power service provider enables the expected power generation amount to be measured according to current power consumption, thus stabilizing the power system. However, the data transmitted over smart grids are not protected, and thus suffer from several types of security threats and attacks. Thus, a robust and efficient authentication protocol should be provided to strengthen the security of smart grid networks. As the Supervisory Control and Data Acquisition system provides the security protection between the control center and substations in most smart grid environments, we focus on how to secure the communications between the substations and smart appliances. Existing security approaches fail to address the performance-security balance. In this study, we suggest a mitigation authentication protocol based on Elliptic Curve Cryptography with privacy protection by using a tamper-resistant device at the smart appliance side to achieve a delicate balance between performance and security of smart grids. The proposed protocol provides some attractive features such as identity protection, mutual authentication and key agreement. Finally, we demonstrate the completeness of the proposed protocol using the Gong-Needham-Yahalom logic.

  11. Elliptic Curve Cryptography-Based Authentication with Identity Protection for Smart Grids.

    Science.gov (United States)

    Zhang, Liping; Tang, Shanyu; Luo, He

    2016-01-01

    In a smart grid, the power service provider enables the expected power generation amount to be measured according to current power consumption, thus stabilizing the power system. However, the data transmitted over smart grids are not protected and therefore suffer from several types of security threats and attacks. Thus, a robust and efficient authentication protocol should be provided to strengthen the security of smart grid networks. As the Supervisory Control and Data Acquisition system provides the security protection between the control center and substations in most smart grid environments, we focus on how to secure the communications between the substations and smart appliances. Existing security approaches fail to address the performance-security balance. In this study, we suggest a mitigation authentication protocol based on Elliptic Curve Cryptography with privacy protection by using a tamper-resistant device at the smart appliance side to achieve a delicate balance between performance and security of smart grids. The proposed protocol provides some attractive features such as identity protection, mutual authentication and key agreement. Finally, we demonstrate the completeness of the proposed protocol using the Gong-Needham-Yahalom logic.
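
    The record above describes an ECC-based key agreement and mutual authentication scheme but gives no protocol details. As a hedged illustration only, the sketch below shows the elliptic-curve Diffie-Hellman style key agreement that such protocols build on, using a deliberately tiny toy curve; the curve parameters, base point and secret scalars are hypothetical and far too small for real use, and the paper's actual authentication and identity-protection steps are not reproduced.

```python
# Toy elliptic-curve Diffie-Hellman sketch over y^2 = x^3 + a*x + b (mod p).
# All parameters are illustrative only; real deployments use standardized
# curves and vetted libraries, not hand-rolled arithmetic.

p, a, b = 97, 2, 3           # hypothetical tiny curve parameters
G = (3, 6)                   # base point: 3**3 + 2*3 + 3 = 36 = 6**2 (mod 97)

def point_add(P, Q):
    """Add two points on the curve; None represents the point at infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def scalar_mult(k, P):
    """Compute k*P by double-and-add."""
    R = None
    while k:
        if k & 1:
            R = point_add(R, P)
        P = point_add(P, P)
        k >>= 1
    return R

# Each side keeps a secret scalar and publishes its public point; both then
# derive the same shared point, which can seed a session key.
d_appliance, d_substation = 13, 21
Q_appliance = scalar_mult(d_appliance, G)
Q_substation = scalar_mult(d_substation, G)
assert scalar_mult(d_appliance, Q_substation) == scalar_mult(d_substation, Q_appliance)
```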

  12. Enhanced performance of ultracapacitors using redox additive-based electrolytes

    Science.gov (United States)

    Jain, Dharmendra; Kanungo, Jitendra; Tripathi, S. K.

    2018-05-01

    Different concentrations of potassium iodide (KI) as a redox additive were added to a 1 M sulfuric acid (H2SO4) electrolyte with the aim of enhancing the capacitance and energy density of ultracapacitors via redox reactions at the electrode-electrolyte interfaces. Ultracapacitors were fabricated using chemically treated activated carbon as the electrode material with H2SO4 and H2SO4-KI as electrolytes. The electrochemical performance of the fabricated supercapacitors was investigated by impedance spectroscopy, cyclic voltammetry and charge-discharge techniques. The maximum capacitance C was observed with the redox additive-based electrolyte system comprising 1 M H2SO4-0.3 M KI (1072 F g⁻¹), which is much higher than that of conventional 1 M H2SO4 (61.3 F g⁻¹) aqueous electrolyte-based ultracapacitors. It corresponds to an energy density of 20.49 Wh kg⁻¹ at 2.1 A g⁻¹ for the redox additive-based electrolyte, which is six times higher than that of the pristine electrolyte (1 M H2SO4) with an energy density of only 3.36 Wh kg⁻¹. The temperature dependence of the fabricated cell was also analyzed, showing an increasing trend in capacitance over the temperature range of 5-70 °C. Under cyclic stability testing, the redox electrolyte-based system shows almost 100% capacitance retention over more than 5000 cycles. For comparison, ultracapacitors based on the polymer gel electrolyte polyvinyl alcohol (PVA) (10 wt%)-{H2SO4 (1 M)-KI (0.3 M)} (90 wt%) were fabricated and characterized with the same electrode materials.

  13. Fast parallel molecular algorithms for DNA-based computation: solving the elliptic curve discrete logarithm problem over GF(2^n).

    Science.gov (United States)

    Li, Kenli; Zou, Shuting; Xv, Jin

    2008-01-01

    Elliptic curve cryptographic algorithms convert input data to unrecognizable encryption and the unrecognizable data back again into its original decrypted form. The security of this form of encryption hinges on the enormous difficulty of solving the elliptic curve discrete logarithm problem (ECDLP), especially over GF(2^n), n in Z+. This paper describes an effective method to find solutions to the ECDLP by means of a molecular computer. We propose that this research accomplishment would represent a breakthrough for applied biological computation, and this paper demonstrates that in principle this is possible. Three DNA-based algorithms are described: a parallel adder, a parallel multiplier, and a parallel inverse over GF(2^n). The biological operation time of all of these algorithms is polynomial with respect to n. Considering this analysis, cryptography using a public key might be less secure. In this respect, a principal contribution of this paper is to provide enhanced evidence of the potential of molecular computing to tackle such ambitious computations.

  14. Growing Up and Cleaning Up: The Environmental Kuznets Curve Redux.

    Science.gov (United States)

    Franklin, Rachel S; Ruth, Matthias

    2012-01-01

    Borrowing from the Kuznets curve literature, researchers have coined the term "environmental Kuznets curve" or EKC to characterize the relationship between pollution levels and income: pollution levels will increase with income, but some threshold of income will eventually be reached beyond which pollution levels decrease. The link between the original Kuznets curve, which posited a similar relationship between income and inequality, and its pollution-concerned offspring lies primarily in the shape of both curves (an inverted U) and the central role played by income change. Although the EKC literature has burgeoned over the past several years, few concrete conclusions have been drawn, the main themes of the literature have remained constant, and no consensus has been reached regarding the existence of an environmental Kuznets curve. EKC research has used a variety of types of data and a range of geographical units to examine the effects of income levels on pollution. Changes in pollution levels might also be at least partly explained by countries' position in the demographic transition and their general population structure; however, little research has included this important aspect in the analysis. In addition, few analyses confine themselves to evaluating, for one country, the long-term relationship between income and pollution. Using United States CO2 emissions as well as demographic, employment, trade and energy price data, this paper seeks to highlight the potential impact of population and economic structure in explaining the relationship between income and pollution levels.
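
    The reduced-form relationship behind the EKC is commonly tested by regressing log emissions on log income and its square, with an inverted U implied by a negative quadratic coefficient. The sketch below illustrates that step on synthetic data; the numbers are placeholders, not the paper's U.S. CO2 series or results.

```python
# Reduced-form EKC sketch on synthetic data: fit log(emissions) as a quadratic
# in log(income); an inverted U corresponds to a negative quadratic term, with
# the turning point at income = exp(-b1 / (2 * b2)).
import numpy as np

rng = np.random.default_rng(0)
log_income = np.linspace(8.0, 11.0, 60)                          # hypothetical log per-capita income
log_emissions = -40.0 + 9.0 * log_income - 0.47 * log_income**2  # built-in inverted U
log_emissions += rng.normal(0.0, 0.05, log_income.size)          # noise

b2, b1, b0 = np.polyfit(log_income, log_emissions, deg=2)        # highest power first
turning_point = np.exp(-b1 / (2.0 * b2))
print(f"quadratic coefficient: {b2:.3f} (inverted U if negative)")
print(f"estimated turning-point income: {turning_point:,.0f} (arbitrary units)")
```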

  15. Residual stress behaviors induced by laser peening along the edge of curved models

    International Nuclear Information System (INIS)

    Im, Jong Bin; Grandhi, Ramana V.; Ro, Young Hee

    2012-01-01

    Laser peening (LP) induces high magnitude compressive residual stresses in a small region of a component. The compressive residual stresses cause plastic deformation that is resistant to fatigue fracture. Fatigue cracks are generally nucleated at critical areas, and LP is applied for those regions so as to delay the crack initiation. Many critical regions are located on the edge of the curved portion of structures because of stress concentration effects. Several investigations that are available for straight components may not give meaningful guidelines for peening curved components. Therefore, in this paper, we investigate residual stress behaviors induced by LP along the edge of curved models. Three curved models that have different curvatures are investigated for peening performance. Two types of peening configurations, which are simultaneous corner shot and sequential corner shots, are considered in order to obtain compressive residual stresses along an edge. LP simulations of multiple shots are performed to identify overlapping effects on the edge portion of a curved model. In addition, the uncertainty calculation of residual stress induced by LP considering laser pulse duration is performed

  16. A mathematical function for the description of nutrient-response curve.

    Directory of Open Access Journals (Sweden)

    Hamed Ahmadi

    Full Text Available Several mathematical equations have been proposed for modeling the nutrient-response curve in animals and humans, justified by goodness of fit and/or by the underlying biological mechanism. In this paper, a functional form of a generalized quantitative model based on the Rayleigh distribution principle is derived for the description of nutrient-response phenomena. Of the three parameters governing the curve, a has a biological interpretation, b may be used to calculate reliable estimates of nutrient-response relationships, and c provides the basis for deriving relationships between nutrient and physiological responses. The new function was successfully applied to fit nutritional data obtained from 6 experiments covering a wide range of nutrients and responses. An evaluation and comparison based on simulated data sets were also carried out to check the suitability of the new model and of a four-parameter logistic model for describing nutrient responses. This study indicates the usefulness and wide applicability of the newly introduced, simple and flexible model when applied as a quantitative approach to characterizing the nutrient-response curve. This new mathematical way to describe nutrient-response data, with some useful biological interpretations, has the potential to be used as an alternative approach in modeling nutrient-response curves to estimate nutrient efficiency and requirements.
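
    The record does not give the paper's exact functional form, so the sketch below fits an assumed Rayleigh-type saturating response, y = c + a(1 - exp(-(x/b)^2)), to synthetic nutrient-response data purely to illustrate the three-parameter fitting step; both the model form and the data are illustrative assumptions, not the published model.

```python
# Hedged sketch: fit an assumed Rayleigh-type response curve to synthetic data
# with nonlinear least squares. The published model may use a different form.
import numpy as np
from scipy.optimize import curve_fit

def rayleigh_type(x, a, b, c):
    return c + a * (1.0 - np.exp(-(x / b) ** 2))

rng = np.random.default_rng(1)
nutrient = np.linspace(0.0, 2.0, 25)                  # hypothetical intake levels
response = rayleigh_type(nutrient, 1.8, 0.7, 0.4)     # "true" underlying curve
response += rng.normal(0.0, 0.03, nutrient.size)      # measurement noise

(a_hat, b_hat, c_hat), _ = curve_fit(rayleigh_type, nutrient, response, p0=[1.0, 1.0, 0.0])
print(f"a = {a_hat:.2f}, b = {b_hat:.2f}, c = {c_hat:.2f}")
# One possible "requirement" estimate: the intake giving 95% of the plateau rise.
print("intake at 95% of plateau:", round(b_hat * np.sqrt(-np.log(0.05)), 3))
```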

  17. Tempo curves considered harmful

    NARCIS (Netherlands)

    Desain, P.; Honing, H.

    1993-01-01

    In the literature of musicology, computer music research and the psychology of music, timing or tempo measurements are mostly presented in the form of continuous curves. The notion of these tempo curves is dangerous, despite its widespread use, because it lulls its users into the false impression

  18. Rectification of light refraction in curved waveguide arrays

    OpenAIRE

    Longhi, S.

    2010-01-01

    An 'optical ratchet' for discretized light in photonic lattices, which enables rectification of light refraction to be observed at any input beam conditions, is theoretically presented, and a possible experimental implementation based on periodically curved zigzag waveguide arrays is proposed.

  19. Climbing the health learning curve together

    International Development Research Centre (IDRC) Digital Library (Canada)

    2011-01-25

    Jan 25, 2011 ... Climbing the health learning curve together ... Many of the projects are creating master's programs at their host universities ... Formerly based in the high Arctic, Atlantis is described by Dr Martin Forde of St George's University ...

  20. Testing the validity of stock-recruitment curve fits

    International Nuclear Information System (INIS)

    Christensen, S.W.; Goodyear, C.P.

    1988-01-01

    The utilities relied heavily on the Ricker stock-recruitment model as the basis for quantifying biological compensation in the Hudson River power case. They presented many fits of the Ricker model to data derived from striped bass catch and effort records compiled by the National Marine Fisheries Service. Based on this curve-fitting exercise, a value of 4 was chosen for the parameter alpha in the Ricker model, and this value was used to derive the utilities' estimates of the long-term impact of power plants on striped bass populations. A technique was developed and applied to address a single fundamental question: if the Ricker model were applicable to the Hudson River striped bass population, could the estimates of alpha from the curve-fitting exercise be considered reliable? The technique involved constructing a simulation model that incorporated the essential biological features of the population and simulated the characteristics of the available actual catch-per-unit-effort data through time. The ability or failure to retrieve the known parameter values underlying the simulation model via the curve-fitting exercise was a direct test of the reliability of the results of fitting stock-recruitment curves to the real data. The results demonstrated that estimates of alpha from the curve-fitting exercise were not reliable. The simulation-modeling technique provides an effective way to identify whether or not particular data are appropriate for use in fitting such models. 39 refs., 2 figs., 3 tabs
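
    The Ricker model referred to in this record has the standard form R = alpha * S * exp(-beta * S), which is usually fitted by linearizing ln(R/S) = ln(alpha) - beta * S. The sketch below illustrates that curve-fitting step on synthetic data with lognormal noise; it uses made-up numbers, not the Hudson River catch-per-unit-effort records, and does not reproduce the paper's simulation test of reliability.

```python
# Ricker stock-recruitment fit by linearization on synthetic data:
#   R = alpha * S * exp(-beta * S)  =>  ln(R/S) = ln(alpha) - beta * S
import numpy as np

rng = np.random.default_rng(42)
alpha_true, beta_true = 4.0, 1e-3
stock = rng.uniform(200.0, 2000.0, 40)                             # spawning stock (arbitrary units)
recruits = alpha_true * stock * np.exp(-beta_true * stock)
recruits *= rng.lognormal(mean=0.0, sigma=0.4, size=stock.size)    # recruitment variability

slope, intercept = np.polyfit(stock, np.log(recruits / stock), deg=1)
alpha_hat, beta_hat = np.exp(intercept), -slope
print(f"alpha_hat = {alpha_hat:.2f} (true {alpha_true}), beta_hat = {beta_hat:.2e} (true {beta_true:.2e})")
```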

  1. Free Vibration Analysis of Functionally Graded Porous Doubly-Curved Shells Based on the First-Order Shear Deformation Theory

    Directory of Open Access Journals (Sweden)

    Farajollah Zare Jouneghani

    2017-12-01

    Full Text Available Due to some technical issues that can appear during the manufacturing process of Functionally Graded Materials (FGMs), it can be extremely difficult to produce perfect materials. Indeed, one of the biggest problems is the presence of porosities. For this purpose, the vibrational behavior of doubly-curved shells made of FGM including porosities is investigated in this paper. With respect to previous research, the porosity has been added to the mechanical model that characterizes the through-the-thickness distribution of the graded constituents and applied to doubly-curved shell structures. Few papers have been published on this topic; in fact, it is easier to find works related to one-dimensional structures and beam models that take into account the effect of porosities. The First-order Shear Deformation Theory (FSDT) is considered as the theoretical framework. In addition, the mechanical properties of the constituents vary along the thickness direction. For this purpose, two power-law distributions are employed to characterize their volume fraction. Strain components are established in an orthogonal curvilinear coordinate system and the governing equations are derived according to Hamilton's principle. Finally, Navier's solution method is used and the numerical results concerning three different types of shell structures are presented.

  2. An Efficient Method for Detection of Outliers in Tracer Curves Derived from Dynamic Contrast-Enhanced Imaging

    Directory of Open Access Journals (Sweden)

    Linning Ye

    2018-01-01

    Full Text Available Presence of outliers in tracer concentration-time curves derived from dynamic contrast-enhanced imaging can adversely affect the analysis of the tracer curves by model-fitting. A computationally efficient method for detecting outliers in tracer concentration-time curves is presented in this study. The proposed method is based on a piecewise linear model and implemented using a robust clustering algorithm. The method is noniterative and all the parameters are automatically estimated. To compare the proposed method with existing Gaussian model based and robust regression-based methods, simulation studies were performed by simulating tracer concentration-time curves using the generalized Tofts model and kinetic parameters derived from different tissue types. Results show that the proposed method and the robust regression-based method achieve better detection performance than the Gaussian model based method. Compared with the robust regression-based method, the proposed method can achieve similar detection performance with much faster computation speed.

  3. Optimal Joint Detection and Estimation That Maximizes ROC-Type Curves.

    Science.gov (United States)

    Wunderlich, Adam; Goossens, Bart; Abbey, Craig K

    2016-09-01

    Combined detection-estimation tasks are frequently encountered in medical imaging. Optimal methods for joint detection and estimation are of interest because they provide upper bounds on observer performance, and can potentially be utilized for imaging system optimization, evaluation of observer efficiency, and development of image formation algorithms. We present a unified Bayesian framework for decision rules that maximize receiver operating characteristic (ROC)-type summary curves, including ROC, localization ROC (LROC), estimation ROC (EROC), free-response ROC (FROC), alternative free-response ROC (AFROC), and exponentially-transformed FROC (EFROC) curves, succinctly summarizing previous results. The approach relies on an interpretation of ROC-type summary curves as plots of an expected utility versus an expected disutility (or penalty) for signal-present decisions. We propose a general utility structure that is flexible enough to encompass many ROC variants and yet sufficiently constrained to allow derivation of a linear expected utility equation that is similar to that for simple binary detection. We illustrate our theory with an example comparing decision strategies for joint detection-estimation of a known signal with unknown amplitude. In addition, building on insights from our utility framework, we propose new ROC-type summary curves and associated optimal decision rules for joint detection-estimation tasks with an unknown, potentially-multiple, number of signals in each observation.

  4. Application of Learning Curves for Didactic Model Evaluation: Case Studies

    Directory of Open Access Journals (Sweden)

    Felix Mödritscher

    2013-01-01

    Full Text Available The success of (online) courses depends, among other factors, on the underlying didactical models, which have always been evaluated with qualitative and quantitative research methods. Several new evaluation techniques have been developed and established in recent years. One of them is ‘learning curves’, which aim at measuring error rates of users when they interact with adaptive educational systems, thereby enabling the underlying models to be evaluated and improved. In this paper, we report how we have applied this new method to two case studies to show that learning curves are useful for evaluating didactical models and their implementation in educational platforms. Results show that the error rates follow a power-law distribution with each additional attempt if the didactical model of an instructional unit is valid. Furthermore, the initial error rate, the slope of the curve and the goodness of fit of the curve are valid indicators for the difficulty level of a course and the quality of its didactical model. As a conclusion, the idea of applying learning curves for evaluating didactical models on the basis of usage data is considered to be valuable for supporting teachers and learning content providers in improving their online courses.
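
    The record's indicators (initial error rate, slope of the curve, goodness of fit) come from fitting a power law to error rates over successive attempts. A minimal sketch of that fit on made-up attempt/error data is shown below; the numbers are placeholders, not results from the case studies.

```python
# Power-law learning-curve fit: error_rate(attempt) = a * attempt**(-b),
# estimated by linear regression in log-log space.
import numpy as np

attempts = np.arange(1, 11)
error_rate = np.array([0.52, 0.34, 0.27, 0.22, 0.20, 0.17, 0.16, 0.14, 0.13, 0.12])

slope, intercept = np.polyfit(np.log(attempts), np.log(error_rate), deg=1)
a_hat, b_hat = np.exp(intercept), -slope
fitted = a_hat * attempts ** (-b_hat)
r2 = 1.0 - np.sum((error_rate - fitted) ** 2) / np.sum((error_rate - error_rate.mean()) ** 2)
print(f"initial error a = {a_hat:.2f}, slope b = {b_hat:.2f}, goodness of fit R^2 = {r2:.3f}")
```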

  5. Quantifying and Reducing Curve-Fitting Uncertainty in Isc: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-09-28

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
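
    The localized fit referred to in this record is, in its simplest form, an ordinary least-squares straight line through the I-V points near V = 0, with Isc taken as the intercept. The sketch below shows that baseline fit and its intercept uncertainty on hypothetical data; the evidence-based Bayesian window selection investigated in the preprint is not reproduced.

```python
# Straight-line fit near short circuit: Isc is the current-axis intercept and
# its standard uncertainty comes from the fit covariance. Data are hypothetical.
import numpy as np

V = np.array([0.00, 0.02, 0.04, 0.06, 0.08, 0.10])        # volts
I = np.array([5.210, 5.205, 5.199, 5.194, 5.188, 5.181])  # amps

coeffs, cov = np.polyfit(V, I, deg=1, cov=True)
slope, isc = coeffs                                        # polyfit returns highest power first
u_isc = np.sqrt(cov[1, 1])                                 # standard uncertainty of the intercept
print(f"Isc = {isc:.4f} A +/- {u_isc:.4f} A (k=1), slope = {slope:.3f} A/V")
```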

  6. Image scaling curve generation

    NARCIS (Netherlands)

    2012-01-01

    The present invention relates to a method of generating an image scaling curve, where local saliency is detected in a received image. The detected local saliency is then accumulated in the first direction. A final scaling curve is derived from the detected local saliency and the image is then

  7. Image scaling curve generation.

    NARCIS (Netherlands)

    2011-01-01

    The present invention relates to a method of generating an image scaling curve, where local saliency is detected in a received image. The detected local saliency is then accumulated in the first direction. A final scaling curve is derived from the detected local saliency and the image is then

  8. Definition and measurement of statistical gloss parameters from curved objects

    Energy Technology Data Exchange (ETDEWEB)

    Kuivalainen, Kalle; Oksman, Antti; Peiponen, Kai-Erik

    2010-09-20

    Gloss standards are commonly defined for gloss measurement from flat surfaces, and, accordingly, glossmeters are typically developed for flat objects. However, gloss inspection of convex, concave, and small products is also important. In this paper, we define statistical gloss parameters for curved objects and measure gloss data on convex and concave surfaces using two different diffractive-optical-element-based glossmeters. Examples of measurements with the two diffractive-optical-element-based glossmeters are given for convex and concave aluminum pipe samples with and without paint. The defined gloss parameters for curved objects are useful in the characterization of the surface quality of metal pipes and other objects.

  9. Definition and measurement of statistical gloss parameters from curved objects

    International Nuclear Information System (INIS)

    Kuivalainen, Kalle; Oksman, Antti; Peiponen, Kai-Erik

    2010-01-01

    Gloss standards are commonly defined for gloss measurement from flat surfaces, and, accordingly, glossmeters are typically developed for flat objects. However, gloss inspection of convex, concave, and small products is also important. In this paper, we define statistical gloss parameters for curved objects and measure gloss data on convex and concave surfaces using two different diffractive-optical-element-based glossmeters. Examples of measurements with the two diffractive-optical-element-based glossmeters are given for convex and concave aluminum pipe samples with and without paint. The defined gloss parameters for curved objects are useful in the characterization of the surface quality of metal pipes and other objects.

  10. Dose-effect Curve for X-radiation in Lymphocytes in Goats

    International Nuclear Information System (INIS)

    Hasanbasic, D.; Saracevic, L.; Sacirbegovic, A.

    1998-01-01

    A dose-effect curve for X-radiation was constructed based on the analysis of chromosome aberrations in lymphocytes of goats. Blood samples from seven goats were irradiated using the Moorhead method, slightly modified and adapted to our conditions. A linear-quadratic model was used, and the dose-effect curves were fitted by the least-squares method. The collective dose-effect curve for goats is given by the expression y(D) = 8.6639×10⁻³·D + 2.9748×10⁻²·D² + 2.9475×10⁻³. Comparison with some domestic animals such as sheep and pigs showed differences not only with respect to the linear-quadratic model, but to other mathematical presentations as well. (author)
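
    For reference, the fitted collective dose-effect expression quoted above can be evaluated directly; the short sketch below does so for a few doses. The dose unit is not stated in the record and is assumed to be Gy here.

```python
# Evaluate the quoted linear-quadratic dose-effect curve for goat lymphocytes.
# Dose units assumed to be Gy (not stated explicitly in the record).
def aberration_yield(dose):
    """Expected chromosome aberration yield per cell at a given X-ray dose."""
    return 8.6639e-3 * dose + 2.9748e-2 * dose**2 + 2.9475e-3

for d in (0.5, 1.0, 2.0, 4.0):
    print(f"dose {d:4.1f}: yield {aberration_yield(d):.4f}")
```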

  11. On-chip magnetic bead-based DNA melting curve analysis using a magnetoresistive sensor

    International Nuclear Information System (INIS)

    Rizzi, Giovanni; Østerberg, Frederik W.; Henriksen, Anders D.; Dufva, Martin; Hansen, Mikkel F.

    2015-01-01

    We present real-time measurements of DNA melting curves in a chip-based system that detects the amount of surface-bound magnetic beads using magnetoresistive magnetic field sensors. The sensors detect the difference between the amount of beads bound to the top and bottom sensor branches of the differential sensor geometry. The sensor surfaces are functionalized with wild type (WT) and mutant type (MT) capture probes, differing by a single base insertion (a single nucleotide polymorphism, SNP). Complementary biotinylated targets in suspension couple streptavidin magnetic beads to the sensor surface. The beads are magnetized by the field arising from the bias current passed through the sensors. We demonstrate the first on-chip measurements of the melting of DNA hybrids upon a ramping of the temperature. This overcomes the limitation of using a single washing condition at constant temperature. Moreover, we demonstrate that a single sensor bridge can be used to genotype a SNP. - Highlights: • We apply magnetoresistive sensors to study solid-surface hybridization kinetics of DNA. • We measure DNA melting profiles for perfectly matching DNA duplexes and for a single base mismatch. • We present a procedure to correct for temperature dependencies of the sensor output. • We reliably extract melting temperatures for the DNA hybrids. • We demonstrate direct measurement of differential binding signal for two probes on a single sensor

  12. Image Features Based on Characteristic Curves and Local Binary Patterns for Automated HER2 Scoring

    Directory of Open Access Journals (Sweden)

    Ramakrishnan Mukundan

    2018-02-01

    Full Text Available This paper presents novel feature descriptors and classification algorithms for the automated scoring of HER2 in Whole Slide Images (WSI) of breast cancer histology slides. Since a large amount of processing is involved in analyzing WSI images, the primary design goal has been to keep the computational complexity to the minimum possible level and to use simple, yet robust feature descriptors that can provide accurate classification of the slides. We propose two types of feature descriptors that encode important information about staining patterns and the percentage of staining present in ImmunoHistoChemistry (IHC)-stained slides. The first descriptor is called a characteristic curve, which is a smooth non-increasing curve that represents the variation of the percentage of staining with saturation levels. The second new descriptor introduced in this paper is a local binary pattern (LBP) feature curve, which is also a non-increasing smooth curve that represents the local texture of the staining patterns. Both descriptors show excellent interclass variance and intraclass correlation and are suitable for the design of automatic HER2 classification algorithms. This paper gives the detailed theoretical aspects of the feature descriptors and also provides experimental results and a comparative analysis.
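
    As described above, a characteristic curve records how the percentage of stained pixels varies with saturation level and is non-increasing by construction. The sketch below computes such a curve from synthetic per-pixel saturation values; it only illustrates the idea, not the paper's exact descriptor or its LBP counterpart.

```python
# Characteristic-curve sketch: for each saturation threshold s, the percentage
# of pixels with saturation >= s. Saturation values are synthetic placeholders
# for an IHC-stained image tile.
import numpy as np

rng = np.random.default_rng(11)
saturation = rng.beta(2.0, 5.0, size=100_000)      # hypothetical per-pixel saturation in [0, 1]

levels = np.linspace(0.0, 1.0, 21)
characteristic_curve = np.array([(saturation >= s).mean() * 100.0 for s in levels])

for s, pct in zip(levels[::5], characteristic_curve[::5]):
    print(f"saturation >= {s:.2f}: {pct:5.1f}% of pixels")
```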

  13. Spectral optimization simulation of white light based on the photopic eye-sensitivity curve

    International Nuclear Information System (INIS)

    Dai, Qi; Hao, Luoxi; Lin, Yi; Cui, Zhe

    2016-01-01

    Spectral optimization simulation of white light is studied to boost maximum attainable luminous efficacy of radiation at high color-rendering index (CRI) and various color temperatures. The photopic eye-sensitivity curve V(λ) is utilized as the dominant portion of white light spectra. Emission spectra of a blue InGaN light-emitting diode (LED) and a red AlInGaP LED are added to the spectrum of V(λ) to match white color coordinates. It is demonstrated that at the condition of color temperature from 2500 K to 6500 K and CRI above 90, such white sources can achieve spectral efficacy of 330–390 lm/W, which is higher than the previously reported theoretical maximum values. We show that this eye-sensitivity-based approach also has advantages on component energy conversion efficiency compared with previously reported optimization solutions

  14. Spectral optimization simulation of white light based on the photopic eye-sensitivity curve

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Qi, E-mail: qidai@tongji.edu.cn [College of Architecture and Urban Planning, Tongji University, 1239 Siping Road, Shanghai 200092 (China); Institute for Advanced Study, Tongji University, 1239 Siping Road, Shanghai 200092 (China); Key Laboratory of Ecology and Energy-saving Study of Dense Habitat (Tongji University), Ministry of Education, 1239 Siping Road, Shanghai 200092 (China); Hao, Luoxi; Lin, Yi; Cui, Zhe [College of Architecture and Urban Planning, Tongji University, 1239 Siping Road, Shanghai 200092 (China); Key Laboratory of Ecology and Energy-saving Study of Dense Habitat (Tongji University), Ministry of Education, 1239 Siping Road, Shanghai 200092 (China)

    2016-02-07

    Spectral optimization simulation of white light is studied to boost maximum attainable luminous efficacy of radiation at high color-rendering index (CRI) and various color temperatures. The photopic eye-sensitivity curve V(λ) is utilized as the dominant portion of white light spectra. Emission spectra of a blue InGaN light-emitting diode (LED) and a red AlInGaP LED are added to the spectrum of V(λ) to match white color coordinates. It is demonstrated that at the condition of color temperature from 2500 K to 6500 K and CRI above 90, such white sources can achieve spectral efficacy of 330–390 lm/W, which is higher than the previously reported theoretical maximum values. We show that this eye-sensitivity-based approach also has advantages on component energy conversion efficiency compared with previously reported optimization solutions.

  15. The learning curve for narrow-band imaging in the diagnosis of precancerous gastric lesions by using Web-based video.

    Science.gov (United States)

    Dias-Silva, Diogo; Pimentel-Nunes, Pedro; Magalhães, Joana; Magalhães, Ricardo; Veloso, Nuno; Ferreira, Carlos; Figueiredo, Pedro; Moutinho, Pedro; Dinis-Ribeiro, Mário

    2014-06-01

    A simplified narrow-band imaging (NBI) endoscopy classification of gastric precancerous and cancerous lesions was derived and validated in a multicenter study. This classification comes with the need for dissemination through adequate training. To address the learning curve of this classification by endoscopists with differing expertise and to assess the feasibility of a YouTube-based learning program to disseminate it. Prospective study. Five centers. Six gastroenterologists (3 trainees, 3 fully trained endoscopists [FTs]). Twenty tests provided through a Web-based program containing 10 randomly ordered NBI videos of gastric mucosa were taken. Feedback was sent 7 days after every test submission. Measures of accuracy of the NBI classification over time. From the first to the last 50 videos, a learning curve was observed with a 10% increase in global accuracy, for both trainees (from 64% to 74%) and FTs (from 56% to 65%). After 200 videos, sensitivity and specificity of 80% and higher for intestinal metaplasia were observed in half the participants, and a specificity for dysplasia greater than 95%, along with a relevant likelihood ratio for a positive result of 7 to 28 and likelihood ratio for a negative result of 0.21 to 0.82, were achieved by all of the participants. No constant learning curve was observed for the identification of Helicobacter pylori gastritis or for sensitivity to dysplasia. The trainees had better results in all of the parameters, except specificity for dysplasia, compared with the FTs. Globally, participants agreed that the program's structure was adequate, except for the feedback, which should have consisted of a more detailed explanation of each answer. No formal sample size estimate. A Web-based learning program could be used to teach and disseminate classifications in the endoscopy field. In this study, an NBI classification for gastric mucosal features seems to be easily learned for the identification of gastric preneoplastic lesions.

  16. Defining the learning curve of laparoendoscopic single-site Heller myotomy.

    Science.gov (United States)

    Ross, Sharona B; Luberice, Kenneth; Kurian, Tony J; Paul, Harold; Rosemurgy, Alexander S

    2013-08-01

    Initial outcomes suggest laparoendoscopic single-site (LESS) Heller myotomy with anterior fundoplication provides safe, efficacious, and cosmetically superior outcomes relative to conventional laparoscopy. This study was undertaken to define the learning curve of LESS Heller myotomy with anterior fundoplication. One hundred patients underwent LESS Heller myotomy with anterior fundoplication. Symptom frequency and severity were scored using a Likert scale (0 = never/not bothersome to 10 = always/very bothersome). Symptom resolution, additional trocars, and complications were compared among patient quartiles. Median data are presented. Preoperative frequency/severity scores were: dysphagia = 10/8 and regurgitation = 8/7. Additional trocars were placed in 12 patients (10%), all of whom were in the first two quartiles. Esophagotomy/gastrotomy occurred in three patients. Postoperative complications occurred in 9 per cent. No conversions to "open" operations occurred. Length of stay was 1 day. Postoperative frequency/severity scores were: dysphagia = 2/0 and regurgitation = 0/0; scores were significantly lower than before myotomy. LESS Heller myotomy with anterior fundoplication well palliates symptoms of achalasia with no apparent scar. Placement of additional trocars only occurred early in the experience. For surgeons proficient with the conventional laparoscopic approach, the learning curve of LESS Heller myotomy with anterior fundoplication is short and safe, because proficiency is quickly attained.

  17. Planning of the energetic operation based on storage guide-curves; Planejamento da operacao energetica baseado em curvas-guias de armazenamento

    Energy Technology Data Exchange (ETDEWEB)

    Zambelli, Monica de S.; Cicogna, Marcelo A.; Soares, Secundino [Universidade Estadual de Campinas, SP (Brazil). Faculdade de Engenharia Eletrica

    2006-07-01

    This work presents a long-term hydrothermal scheduling operating policy based on the concept of storage guide-curves. According to this policy, at each stage of the planning period the amount of water to be discharged by each hydrothermal unit must be such as to keep its reservoir at levels pre-determined by curves obtained through an optimization method. The performance of this operating policy is analysed by simulation with historical inflow data, considering both a system with a single hydro plant and a composite system with hydro plants in cascade, adopting the minimization of the expected operating cost as the performance criterion. The results demonstrate that, although simple and clear, this operating policy achieves competitive performance in long-term hydrothermal scheduling. (author)

  18. Effusion plate using additive manufacturing methods

    Science.gov (United States)

    Johnson, Thomas Edward; Keener, Christopher Paul; Ostebee, Heath Michael; Wegerif, Daniel Gerritt

    2016-04-12

    Additive manufacturing techniques may be utilized to construct effusion plates. Such additive manufacturing techniques may include defining a configuration for an effusion plate having one or more internal cooling channels. The manufacturing techniques may further include depositing a powder into a chamber, applying an energy source to the deposited powder, and consolidating the powder into a cross-sectional shape corresponding to the defined configuration. Such methods may be implemented to construct an effusion plate having one or more channels with a curved cross-sectional geometry.

  19. A note on families of fragility curves

    International Nuclear Information System (INIS)

    Kaplan, S.; Bier, V.M.; Bley, D.C.

    1989-01-01

    In the quantitative assessment of seismic risk, uncertainty in the fragility of a structural component is usually expressed by putting forth a family of fragility curves, with probability serving as the parameter of the family. Commonly, a lognormal shape is used both for the individual curves and for the expression of uncertainty over the family. A so-called composite single curve can also be drawn and used for purposes of approximation. This composite curve is often regarded as equivalent to the mean curve of the family. The equality seems intuitively reasonable, but according to the authors has never been proven. The paper presented proves this equivalence hypothesis mathematically. Moreover, the authors show that this equivalence hypothesis between fragility curves is itself equivalent to an identity property of the standard normal probability curve. Thus, in the course of proving the fragility curve hypothesis, the authors have also proved a rather obscure, but interesting and perhaps previously unrecognized, property of the standard normal curve
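
    Under the usual lognormal parameterization of seismic fragility (randomness beta_R in the curve, uncertainty beta_U in the median capacity), the equivalence discussed above can be checked numerically: the mean of the family should coincide with the composite curve built from the combined log-standard deviation. The sketch below performs that Monte Carlo check with illustrative parameter values.

```python
# Monte Carlo check that the mean of a lognormal fragility family equals the
# composite curve. Each member curve is Phi(ln(a/A_m)/beta_R) with median
# capacity A_m lognormally distributed about A_50 with log-std beta_U.
import numpy as np
from scipy.stats import norm

A_50, beta_R, beta_U = 0.9, 0.3, 0.4            # illustrative values (median capacity in g)
accel = np.linspace(0.05, 3.0, 60)              # peak ground acceleration grid

rng = np.random.default_rng(7)
medians = A_50 * np.exp(rng.normal(0.0, beta_U, size=50_000))
mean_curve = norm.cdf(np.log(accel[:, None] / medians) / beta_R).mean(axis=1)
composite = norm.cdf(np.log(accel / A_50) / np.hypot(beta_R, beta_U))

print("max |mean - composite| =", np.abs(mean_curve - composite).max())  # small (Monte Carlo error only)
```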

  20. Power forward curves: a managerial perspective

    International Nuclear Information System (INIS)

    Nagarajan, Shankar

    1999-01-01

    This chapter concentrates on managerial application of power forward curves, and examines the determinants of electricity prices such as transmission constraints, its inability to be stored in a conventional way, its seasonality and weather dependence, the generation stack, and the swing risk. The electricity forward curve, classical arbitrage, constructing a forward curve, volatilities, and electricity forward curve models such as the jump-diffusion model, the mean-reverting heteroscedastic volatility model, and an econometric model of forward prices are examined. A managerial perspective of the applications of the forward curve is presented covering plant valuation, capital budgeting, performance measurement, product pricing and structuring, asset optimisation, valuation of transmission options, and risk management

  1. Base Stabilization Guidance and Additive Selection for Pavement Design and Rehabilitation

    Science.gov (United States)

    2017-12-01

    Significant improvements have been made in base stabilization practice that include design specifications and methodology, experience with the selection of stabilizing additives, and equipment for distribution and uniform blending of additives. For t...

  2. A novel and compact spectral imaging system based on two curved prisms

    Science.gov (United States)

    Nie, Yunfeng; Bin, Xiangli; Zhou, Jinsong; Li, Yang

    2013-09-01

    As a detection approach that simultaneously acquires a two-dimensional visual picture and one-dimensional spectral information, spectral imaging offers promising applications in biomedical imaging, conservation and identification of artworks, surveillance of food safety, and so forth. A novel moderate-resolution spectral imaging system consisting of merely two optical elements is illustrated in this paper. It can realize the function of a relay imaging system as well as that of a spectrometer with 10 nm spectral resolution. Compared to conventional prismatic imaging spectrometers, this design is compact and concise, using only two special curved prisms and two reflective surfaces. In contrast to spectral imagers based on diffraction gratings, the compound prism offers higher energy utilization and a wider free spectral range. The Seidel aberration theory and the dispersive principle of this special prism are analyzed first. According to the results, the optical system of this design is simulated, and a performance evaluation including the spot diagram, MTF and distortion is presented. Finally, considering the difficulty and particularity of manufacture and alignment, a feasible method for fabrication and measurement is proposed.

  3. Experimental analysis of waveform effects on satellite and ligament behavior via in situ measurement of the drop-on-demand drop formation curve and the instantaneous jetting speed curve

    International Nuclear Information System (INIS)

    Kwon, Kye-Si

    2010-01-01

    In situ techniques to measure the drop-on-demand (DOD) drop formation curve and the instantaneous jetting speed curve are developed such that ligament behavior and satellite behavior of inkjet droplets can be analyzed effectively. It is known that the droplet jetting behavior differs by ink properties and the driving waveform voltage. In this study, to reduce possible droplet placement errors due to satellite drops or long ligaments during printing, waveform effects on drop formation are investigated based on the measured DOD drop formation curve and the instantaneous jetting speed curve. Experimental results show that a dwell time greater than the so-called efficient dwell time was effective in reducing placement errors due to satellite drops during the printing process

  4. Simulation of a G-tolerance curve using the pulsatile cardiovascular model

    Science.gov (United States)

    Solomon, M.; Srinivasan, R.

    1985-01-01

    A computer simulation study, performed to assess the ability of the cardiovascular model to reproduce the G-tolerance curve (G level versus tolerance time), is reported. A composite strength-duration curve derived from experimental data obtained in human centrifugation studies was used for comparison. The effects of abolishing autonomic control and of blood volume loss on G tolerance were also simulated. The results provide additional validation of the model. The need for the presence of autonomic reflexes even at low levels of G is pointed out. The low margin of safety with a loss of blood volume indicated by the simulation results underscores the necessity for protective measures during Shuttle reentry.

  5. Stochastic geometry of critical curves, Schramm-Loewner evolutions and conformal field theory

    International Nuclear Information System (INIS)

    Gruzberg, Ilya A

    2006-01-01

    Conformally invariant curves that appear at critical points in two-dimensional statistical mechanics systems and their fractal geometry have received a lot of attention in recent years. On the one hand, Schramm (2000 Israel J. Math. 118 221 (Preprint math.PR/9904022)) has invented a new rigorous as well as practical calculational approach to critical curves, based on a beautiful unification of conformal maps and stochastic processes, and by now known as Schramm-Loewner evolution (SLE). On the other hand, Duplantier (2000 Phys. Rev. Lett. 84 1363; Fractal Geometry and Applications: A Jubilee of Benoît Mandelbrot: Part 2 (Proc. Symp. Pure Math. vol 72) (Providence, RI: American Mathematical Society) p 365 (Preprint math-ph/0303034)) has applied boundary quantum gravity methods to calculate exact multifractal exponents associated with critical curves. In the first part of this paper, I provide a pedagogical introduction to SLE. I present mathematical facts from the theory of conformal maps and stochastic processes related to SLE. Then I review basic properties of SLE and provide practical derivation of various interesting quantities related to critical curves, including fractal dimensions and crossing probabilities. The second part of the paper is devoted to a way of describing critical curves using boundary conformal field theory (CFT) in the so-called Coulomb gas formalism. This description provides an alternative (to quantum gravity) way of obtaining the multifractal spectrum of critical curves using only traditional methods of CFT based on free bosonic fields

  6. QUEST1 Variability Survey. III. Light Curve Catalog Update

    Science.gov (United States)

    Rengstorf, A. W.; Thompson, D. L.; Mufson, S. L.; Andrews, P.; Honeycutt, R. K.; Vivas, A. K.; Abad, C.; Adams, B.; Bailyn, C.; Baltay, C.; Bongiovanni, A.; Briceño, C.; Bruzual, G.; Coppi, P.; Della Prugna, F.; Emmet, W.; Ferrín, I.; Fuenmayor, F.; Gebhard, M.; Hernández, J.; Magris, G.; Musser, J.; Naranjo, O.; Oemler, A.; Rosenzweig, P.; Sabbey, C. N.; Sánchez, Ge.; Sánchez, Gu.; Schaefer, B.; Schenner, H.; Sinnott, J.; Snyder, J. A.; Sofia, S.; Stock, J.; van Altena, W.

    2009-03-01

    This paper reports an update to the QUEST1 (QUasar Equatorial Survey Team, Phase 1) Variability Survey (QVS) light curve catalog, which links QVS instrumental magnitude light curves to Sloan Digital Sky Survey (SDSS) objects and photometry. In the time since the original QVS catalog release, the overlap between publicly available SDSS data and QVS data has increased by 8% in sky coverage and 16,728 in number of matched objects. The astrometric matching and the treatment of SDSS masks have been refined for the updated catalog. We report on these improvements and present multiple bandpass light curves, global variability information, and matched SDSS photometry for 214,941 QUEST1 objects. Based on observations obtained at the Llano del Hato National Astronomical Observatory, operated by the Centro de Investigaciones de Astronomía for the Ministerio de Ciencia y Tecnologia of Venezuela.

  7. Dose-response curves from incomplete data

    International Nuclear Information System (INIS)

    Groer, P.G.

    1978-01-01

    Frequently many different responses occur in populations (animal or human) exposed to ionizing radiation. To obtain a dose-response curve, the exposed population is first divided into sub-groups whose members received the same radiation dose. To estimate the response, the fraction of subjects in each sub-group that showed the particular response of interest is determined. These fractions are plotted against dose to give the dose-response curve. This procedure of plotting the fractions versus the radiation dose is not the correct way to estimate the time distribution for a particular response at the different dose levels. Other observed responses competed for the individuals in the exposed population and therefore prevented manifestation of the complete information on the response-time distribution for one specific response. Such data are called incomplete in the statistical literature. A procedure is described which uses the by now classical Kaplan-Meier estimator, to establish dose-response curves from incomplete data under the assumption that the different observed responses are statistically independent. It is demonstrated that there is insufficient information in the observed survival functions to estimate the time distribution for one particular response if the assumption of independence is dropped. In addition, it is not possible to determine from the data (i.e. type of response and when it occurred) whether or not the different response-time distributions are independent. However, it is possible to give sharp bounds between which the response has to lie. This implies that for incomplete data, only a 'dose-response band' can be established if independence of the competing responses cannot be assumed. Examples are given using actual data to illustrate the estimation procedures
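
    A minimal sketch of the Kaplan-Meier estimator described above is given below for one dose sub-group, treating the times at which competing responses occurred as censoring times; this is valid only under the independence assumption discussed in the record, and the times used are hypothetical.

```python
# Minimal Kaplan-Meier estimator for one response of interest. observed = 1
# marks the response of interest; observed = 0 marks removal by a competing
# response (treated as censoring under the independence assumption).
import numpy as np

def kaplan_meier(times, observed):
    """Return event times and the Kaplan-Meier survival estimate at those times."""
    times = np.asarray(times, dtype=float)
    observed = np.asarray(observed, dtype=int)
    event_times, survival, s = [], [], 1.0
    for t in np.unique(times[observed == 1]):
        at_risk = np.sum(times >= t)
        events = np.sum((times == t) & (observed == 1))
        s *= 1.0 - events / at_risk
        event_times.append(t)
        survival.append(s)
    return np.array(event_times), np.array(survival)

# Hypothetical follow-up times (arbitrary units) for one dose level.
t = [3, 5, 5, 6, 8, 9, 12, 14, 14, 17]
d = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
times, surv = kaplan_meier(t, d)
print(np.column_stack([times, surv]))
# The cumulative response by the end of follow-up is estimated as 1 - surv[-1].
```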

  8. Parameter Deduction and Accuracy Analysis of Track Beam Curves in Straddle-type Monorail Systems

    Directory of Open Access Journals (Sweden)

    Xiaobo Zhao

    2015-12-01

    Full Text Available The accuracy of the bottom curve of a PC track beam is strongly related to the production quality of the entire beam. Many factors may affect the parameters of the bottom curve, such as the superelevation of the curve and the deformation of a PC track beam. At present, no effective method has been developed to determine the bottom curve of a PC track beam; therefore, a new technique is presented in this paper to deduce the parameters of such a curve and to control the accuracy of the computation results. First, the domain of the bottom curve of a PC track beam is assumed to be a spindle plane. Then, the corresponding supposed top curve domain is determined based on a geometrical relationship that is the opposite of that identified by the conventional method. Second, several optimal points are selected from the supposed top curve domain according to the dichotomy algorithm; the supposed top curve is thus generated by connecting these points. Finally, one rigorous criterion is established in the fractal dimension to assess the accuracy of the assumed top curve deduced in the previous step. If this supposed curve coincides completely with the known top curve, then the assumed bottom curve corresponding to the assumed top curve is considered to be the real bottom curve. This technique of determining the bottom curve of a PC track beam is thus proven to be efficient and accurate.

  9. Use of regionalisation approach to develop fire frequency curves for Victoria, Australia

    Science.gov (United States)

    Khastagir, Anirban; Jayasuriya, Niranjali; Bhuyian, Muhammed A.

    2017-11-01

    It is important to perform fire frequency analysis to obtain fire frequency curves (FFCs) based on fire intensity in different parts of Victoria. In this paper, FFCs were derived based on the Forest Fire Danger Index (FFDI). FFDI is a measure related to fire initiation, spreading speed and containment difficulty. The mean temperature (T), relative humidity (RH) and areal extent of open water (LC2) during summer months (Dec-Feb) were identified as the most important parameters for assessing the risk of bushfire occurrence. Based on these parameters, Andrews' curve equation was applied to 40 selected meteorological stations to identify homogeneous stations forming unique clusters. A methodology using peak FFDI from cluster-averaged FFDIs was developed by applying the Log Pearson Type III (LPIII) distribution to generate FFCs. A total of nine homogeneous clusters across Victoria were identified, and subsequently their FFCs were developed in order to estimate regionalised fire occurrence characteristics.
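
    One step of the methodology above, fitting a Log Pearson Type III distribution to annual maximum FFDI values and reading off quantiles for chosen return periods, is sketched below on synthetic maxima standing in for one homogeneous cluster; the Andrews-curve clustering itself is not reproduced.

```python
# Log Pearson Type III fire-frequency sketch: fit a Pearson III distribution to
# log10 of annual maximum FFDI and convert quantiles back to FFDI. Data are
# synthetic placeholders for one homogeneous cluster.
import numpy as np
from scipy.stats import pearson3

rng = np.random.default_rng(3)
annual_max_ffdi = rng.gamma(shape=8.0, scale=8.0, size=40)   # hypothetical annual maxima

log_ffdi = np.log10(annual_max_ffdi)
skew, loc, scale = pearson3.fit(log_ffdi)

return_periods = np.array([2, 5, 10, 20, 50, 100])           # years
non_exceedance = 1.0 - 1.0 / return_periods
ffdi_quantiles = 10.0 ** pearson3.ppf(non_exceedance, skew, loc=loc, scale=scale)
for rp, q in zip(return_periods, ffdi_quantiles):
    print(f"{rp:3d}-year FFDI: {q:6.1f}")
```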

  10. Applicability of the fracture toughness master curve to irradiated reactor pressure vessel steels

    International Nuclear Information System (INIS)

    Sokolov, M.A.; McCabe, D.E.; Alexander, D.J.; Nanstad, R.K.

    1997-01-01

    The current methodology for determination of fracture toughness of irradiated reactor pressure vessel (RPV) steels is based on the upward temperature shift of the American Society of Mechanical Engineers (ASME) K_Ic curve from either measurement of Charpy impact surveillance specimens or predictive calculations based on a database of Charpy impact tests from RPV surveillance programs. Currently, the provisions for determination of the upward temperature shift of the curve due to irradiation are based on the Charpy V-notch (CVN) 41-J shift, and the shape of the fracture toughness curve is assumed to not change as a consequence of irradiation. The ASME curve is a function of test temperature (T) normalized to a reference nil-ductility transition temperature, RT_NDT, namely, T − RT_NDT. That curve was constructed as the lower boundary to the available K_Ic database and, therefore, does not consider probability matters. Moreover, to achieve valid fracture toughness data in the temperature range where fracture toughness increases rapidly with temperature, very large test specimens were needed to maintain plane-strain, linear-elastic conditions. Such large specimens are impractical for fracture toughness testing of each RPV steel, but the evolution of elastic-plastic fracture mechanics has led to the use of relatively small test specimens to achieve acceptable cleavage fracture toughness measurements, K_Jc, in the transition temperature range. Accompanying this evolution is the employment of the Weibull distribution function to model the scatter of fracture toughness values in the transition range. Thus, a probability-based bound for a given data population can be established. Further, it has been demonstrated by Wallin that the probability-based estimates of median fracture toughness of ferritic steels tend to form transition curves of the same shape, the so-called "master curve", normalized to one common specimen size, namely the 1T [i.e., 1.0-in.-thick] specimen size.
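
    The master curve referred to above is commonly written, for 1T-size specimens, as K_Jc(med) = 30 + 70·exp[0.019(T − T0)] in MPa·m^0.5, so irradiation embrittlement appears simply as an upward shift of the reference temperature T0. The sketch below evaluates that shape for illustrative T0 values; these are not measured data, and the full testing standard (size adjustment, censoring, T0 estimation) is not reproduced.

```python
# Master-curve shape sketch: median cleavage fracture toughness for 1T-size
# specimens, K_Jc_med(T) = 30 + 70 * exp(0.019 * (T - T0)) in MPa*sqrt(m),
# with T and T0 in degrees Celsius. T0 values below are illustrative only.
import numpy as np

def kjc_median(T, T0):
    return 30.0 + 70.0 * np.exp(0.019 * (T - T0))

T = np.linspace(-150.0, 50.0, 5)
for label, T0 in (("unirradiated, T0 = -70 C", -70.0), ("irradiated, T0 shifted to -10 C", -10.0)):
    print(label, np.round(kjc_median(T, T0), 1))
```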

  11. Monitoring and Fault Detection in Photovoltaic Systems Based On Inverter Measured String I-V Curves

    DEFF Research Database (Denmark)

    Spataru, Sergiu; Sera, Dezso; Kerekes, Tamas

    2015-01-01

    Most photovoltaic (PV) string inverters have the hardware capability to measure at least part of the current-voltage (I-V) characteristic curve of the PV strings connected at the input. However, this intrinsic capability of the inverters is not used, since I-V curve measurement and monitoring functions are not implemented in the inverter control software. In this paper, we aim to show how such a functionality can be useful for PV system monitoring purposes, to detect the presence and cause of power-loss in the PV strings, be it due to shading, degradation of the PV modules or balance-of-system components through increased series resistance losses, or shunting of the PV modules. To achieve this, we propose and experimentally demonstrate three complementary PV system monitoring methods that make use of the I-V curve measurement capability of a commercial string inverter. The first method is suitable

  12. Utilization of multimode Love wave dispersion curve inversion for geotechnical site investigation

    International Nuclear Information System (INIS)

    Hamimu, La; Nawawi, Mohd; Safani, Jamhir

    2011-01-01

    Inversion codes based on a modified genetic algorithm (GA) have been developed to invert multimode Love wave dispersion curves. The multimode Love wave dispersion curves were synthesized from a profile representing a shear-wave velocity reversal using the full SH (shear horizontal) waveform. In this study, we used a frequency–slowness transform to extract the dispersion curve from the full SH waveform. Dispersion curves overlaid on the dispersion images were picked manually. These curves were then inverted using the modified GA. To assess the accuracy of the inversion results, differences between the true and inverted shear-wave velocity profiles were quantified in terms of shear-wave velocity and thickness errors, E_S and E_H. Our numerical modeling showed that the inversion of multimode dispersion curves can provide a significantly better assessment of the shear-wave velocity structure, especially for velocity-reversal profiles typical of geotechnical site investigations. This approach has been applied to field data acquired at a site in Niigata prefecture, Japan. For these field data, our inversion results show good agreement between the calculated and experimental dispersion curves and accurately detect low-velocity layer targets.

  13. Determination of Sight Distance on a Combined Crest and Circular Curve in a Three Dimensional Space

    Directory of Open Access Journals (Sweden)

    Chiu Liu, PhD, PE, PTOE

    2012-06-01

    Full Text Available The sight distance (SD) on a two-dimensional (2-d) curve, namely a vertical curve or a horizontal curve, has been well understood and documented for roadway geometric design in the literature. In reality, three-dimensional (3-d) curves can be found along ramps, connectors, and often mountain roads. The sight distance on these 3-d curves, which may vary with the driver's location, has not been tackled in the literature in an exact analytic setting. By integrating human-vehicle-roadway interaction, the formulas for computing the SD on a 3-d curve are derived for the first time in an analytic framework. The crest-curve SD that has been used in the literature can be deduced from these derived formulas as special limiting cases. Practitioners can easily apply these user-friendly formulas or equations in a Microsoft Excel spreadsheet to calculate the 3-d SD on a roadway with sufficient roadside clearance. In addition, this framework can easily be extended to cope with various scenarios in which obstacles partially blocking the driver's sight are present in the roadway environment.

  14. Histogram Curve Matching Approaches for Object-based Image Classification of Land Cover and Land Use

    Science.gov (United States)

    Toure, Sory I.; Stow, Douglas A.; Weeks, John R.; Kumar, Sunil

    2013-01-01

    The classification of image-objects is usually done using parametric statistical measures of central tendency and/or dispersion (e.g., mean or standard deviation). The objectives of this study were to analyze digital number histograms of image objects and evaluate classification measures exploiting characteristic signatures of such histograms. Two histogram-matching classifiers were evaluated and compared to the standard nearest-neighbor-to-mean classifier. An ADS40 airborne multispectral image of San Diego, California was used for assessing the utility of curve-matching classifiers in a geographic object-based image analysis (GEOBIA) approach. The classifications were performed with data sets having 0.5 m, 2.5 m, and 5 m spatial resolutions. Results show that histograms are reliable features for characterizing classes. Also, both histogram-matching classifiers consistently performed better than the one based on the standard nearest-neighbor-to-mean rule. The highest classification accuracies were produced with images having 2.5 m spatial resolution. PMID:24403648
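
    As a hedged illustration of the comparison above, the sketch below classifies a toy image object both by a histogram-matching rule (histogram intersection) and by the standard nearest-neighbor-to-mean rule; the class names, digital-number distributions and the specific matching measure are assumptions, not the paper's data or exact measures.

```python
# Toy comparison of a histogram-matching rule (histogram intersection) with
# the nearest-neighbor-to-mean rule for object-based classification.
import numpy as np

def normalized_hist(values, bins=16, value_range=(0, 255)):
    h, _ = np.histogram(values, bins=bins, range=value_range)
    return h / h.sum()

def classify_by_histogram(obj_hist, class_hists):
    # Histogram intersection: larger overlap means a better match.
    scores = {c: np.minimum(obj_hist, h).sum() for c, h in class_hists.items()}
    return max(scores, key=scores.get)

def classify_by_mean(obj_values, class_means):
    # Nearest neighbor to the class mean digital number.
    return min(class_means, key=lambda c: abs(float(np.mean(obj_values)) - class_means[c]))

rng = np.random.default_rng(5)
training = {
    "vegetation": rng.normal(60.0, 10.0, 4000).clip(0, 255),   # hypothetical DN samples
    "built-up": rng.normal(150.0, 40.0, 4000).clip(0, 255),
}
class_hists = {c: normalized_hist(v) for c, v in training.items()}
class_means = {c: float(np.mean(v)) for c, v in training.items()}

obj = rng.normal(65.0, 12.0, 600).clip(0, 255)                 # a new image object
print("histogram match :", classify_by_histogram(normalized_hist(obj), class_hists))
print("nearest-to-mean :", classify_by_mean(obj, class_means))
```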

  15. Dual Smarandache Curves and Smarandache Ruled Surfaces

    OpenAIRE

    Tanju KAHRAMAN; Mehmet ÖNDER; H. Hüseyin UGURLU

    2013-01-01

    In this paper, by considering dual geodesic trihedron (dual Darboux frame) we define dual Smarandache curves lying fully on dual unit sphere S^2 and corresponding to ruled surfaces. We obtain the relationships between the elements of curvature of dual spherical curve (ruled surface) x(s) and its dual Smarandache curve (Smarandache ruled surface) x1(s) and we give an example for dual Smarandache curves of a dual spherical curve.

  16. Fast Bilinear Maps from the Tate-Lichtenbaum Pairing on Hyperelliptic Curves

    DEFF Research Database (Denmark)

    Frey, Gerhard; Lange, Tanja

    2006-01-01

    on hyperelliptic curves of genus g. We give mathematically sound arguments why it is possible to use particular representatives of the involved residue classes in the second argument that allow the pairing to be computed much faster, where the speed-up grows with the size of g. Since the curve arithmetic takes about...... the same time for small g and constant group size, this implies that g>1 offers advantages for implementations. We give two examples of how to apply the modified setting in pairing-based protocols such that all parties profit from the idea. We stress that our results also apply to non-supersingular curves...

  17. F(α) curves: Experimental results

    International Nuclear Information System (INIS)

    Glazier, J.A.; Gunaratne, G.; Libchaber, A.

    1988-01-01

    We study the transition to chaos at the golden and silver means for forced Rayleigh-Benard (RB) convection in mercury. We present f(α) curves below, at, and above the transition, and provide comparisons to the curves calculated for the one-dimensional circle map. We find good agreement at both the golden and silver means. This confirms our earlier observation that for low amplitude forcing, forced RB convection is well described by the one-dimensional circle map and indicates that the f(α) curve is a good measure of the approach to criticality. For selected subcritical experimental data sets we calculate the degree of subcriticality. We also present both experimental and calculated results for f(α) in the presence of a third frequency. Again we obtain agreement: The presence of random noise or a third frequency narrows the right-hand (negative q) side of the f(α) curve. Subcriticality results in symmetrically narrowed curves. We can also distinguish these cases by examining the power spectra and Poincare sections of the time series

  18. Investigation of learning and experience curves

    Energy Technology Data Exchange (ETDEWEB)

    Krawiec, F.; Thornton, J.; Edesess, M.

    1980-04-01

    The applicability of learning and experience curves for predicting future costs of solar technologies is assessed, and the major test case is the production economics of heliostats. Alternative methods for estimating cost reductions in systems manufacture are discussed, and procedures for using learning and experience curves to predict costs are outlined. Because adequate production data often do not exist, production histories of analogous products/processes are analyzed and learning and aggregated cost curves for these surrogates estimated. If the surrogate learning curves apply, they can be used to estimate solar technology costs. The steps involved in generating these cost estimates are given. Second-generation glass-steel and inflated-bubble heliostat design concepts, developed by MDAC and GE, respectively, are described; a costing scenario for 25,000 units/yr is detailed; surrogates for cost analysis are chosen; learning and aggregate cost curves are estimated; and aggregate cost curves for the GE and MDAC designs are estimated. However, an approach that combines a neoclassical production function with a learning-by-doing hypothesis is needed to yield a cost relation compatible with the historical learning curve and the traditional cost function of economic theory.

  19. Impact of Perceptual Speed Calming Curve Countermeasures on Drivers’ Anticipation and Mitigation Ability : A Driving Simulator Study

    Science.gov (United States)

    2018-02-01

    Horizontal curves are unavoidable in rural roads and are a serious crash risk to vehicle occupants. This study investigates the impact and effectiveness of three curve-based perceptual speed-calming countermeasures (advance curve warning signs, chevr...

  20. Mathematical modeling improves EC50 estimations from classical dose-response curves.

    Science.gov (United States)

    Nyman, Elin; Lindgren, Isa; Lövfors, William; Lundengård, Karin; Cervin, Ida; Sjöström, Theresia Arbring; Altimiras, Jordi; Cedersund, Gunnar

    2015-03-01

    The β-adrenergic response is impaired in failing hearts. When studying β-adrenergic function in vitro, the half-maximal effective concentration (EC50) is an important measure of ligand response. We previously measured the in vitro contraction force response of chicken heart tissue to increasing concentrations of adrenaline, and observed a decreasing response at high concentrations. The classical interpretation of such data is to assume a maximal response before the decrease, and to fit a sigmoid curve to the remaining data to determine EC50. Instead, we have applied a mathematical modeling approach to interpret the full dose-response curve in a new way. The developed model predicts a non-steady state caused by a short resting time between increased concentrations of agonist, which affects the dose-response characterization. Therefore, an improved estimate of EC50 may be calculated using steady-state simulations of the model. The model-based estimation of EC50 is further refined using additional time-resolved data to decrease the uncertainty of the prediction. The resulting model-based EC50 (180-525 nM) is higher than the classically interpreted EC50 (46-191 nM). Mathematical modeling thus makes it possible to re-interpret previously obtained datasets, and to make accurate estimates of EC50 even when steady-state measurements are not experimentally feasible. The mathematical models described here have been submitted to the JWS Online Cellular Systems Modelling Database, and may be accessed at http://jjj.bio.vu.nl/database/nyman. © 2015 FEBS.
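    For context, here is a minimal sketch of the classical sigmoid interpretation mentioned above, assuming hypothetical concentration and force values and a standard four-parameter Hill model (not the paper's mechanistic model); it only illustrates how an EC50 is read off a fitted dose-response curve.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ec50, n):
    """Four-parameter sigmoid (Hill) dose-response model."""
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** n)

# Hypothetical contraction-force responses at increasing adrenaline concentrations (nM),
# keeping only the rising part of the curve, as in the classical interpretation.
conc = np.array([1.0, 3.0, 10.0, 30.0, 100.0, 300.0, 1000.0])
force = np.array([0.05, 0.12, 0.30, 0.55, 0.80, 0.95, 1.00])

popt, _ = curve_fit(hill, conc, force, p0=[0.0, 1.0, 50.0, 1.0], maxfev=10000)
print(f"classical EC50 estimate: {popt[2]:.0f} nM")
```

    The model-based approach in the abstract would instead simulate a kinetic model to steady state for each concentration before estimating EC50.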

  1. A semi-analytical three-dimensional free vibration analysis of functionally graded curved panels

    Energy Technology Data Exchange (ETDEWEB)

    Zahedinejad, P. [Department of Mechanical Engineering, Islamic Azad University, Branch of Shiraz, Shiraz (Iran, Islamic Republic of); Malekzadeh, P., E-mail: malekzadeh@pgu.ac.i [Department of Mechanical Engineering, Persian Gulf University, Persian Gulf University Boulevard, Bushehr 75168 (Iran, Islamic Republic of); Center of Excellence for Computational Mechanics, Shiraz University, Shiraz (Iran, Islamic Republic of); Farid, M. [Department of Mechanical Engineering, Islamic Azad University, Branch of Shiraz, Shiraz (Iran, Islamic Republic of); Karami, G. [Department of Mechanical Engineering and Applied Mechanics, North Dakota State University, Fargo, ND 58105-5285 (United States)

    2010-08-15

    Based on the three-dimensional elasticity theory, free vibration analysis of functionally graded (FG) curved thick panels under various boundary conditions is studied. Panels with two opposite edges simply supported and arbitrary boundary conditions at the other edges are considered. Two different models of material property variation are considered: a power law distribution in terms of the volume fractions of the constituents, and an exponential distribution of the material properties through the thickness. The differential quadrature method, in conjunction with trigonometric functions, is used to discretize the governing equations. With the assumption of continuously varying material properties through the thickness of the curved panel, the differential quadrature method is efficiently used to discretize the governing equations and to implement the related boundary conditions at the top and bottom surfaces of the curved panel in strong form. The convergence of the method is demonstrated, and to validate the results, comparisons are made with the solutions for isotropic and FG curved panels. By examining the results of thick FG curved panels for various geometrical and material parameters and subjected to different boundary conditions, the influence of these parameters, in particular those due to the functionally graded material parameters, is studied.

  2. A volume-based method for denoising on curved surfaces

    KAUST Repository

    Biddle, Harry

    2013-09-01

    We demonstrate a method for removing noise from images or other data on curved surfaces. Our approach relies on in-surface diffusion: we formulate both the Gaussian diffusion and Perona-Malik edge-preserving diffusion equations in a surface-intrinsic way. Using the Closest Point Method, a recent technique for solving partial differential equations (PDEs) on general surfaces, we obtain a very simple algorithm where we merely alternate a time step of the usual Gaussian diffusion (and similarly Perona-Malik) in a small 3D volume containing the surface with an interpolation step. The method uses a closest point function to represent the underlying surface and can treat very general surfaces. Experimental results include image filtering on smooth surfaces, open surfaces, and general triangulated surfaces. © 2013 IEEE.
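    As a rough illustration of the diffusion step that the method alternates with interpolation, below is a minimal 2D Perona-Malik update in Python; the parameters dt and kappa and the toy image are assumptions, and the paper's surface-intrinsic version applies such a step in a small 3D volume around the surface before re-interpolating values at closest points.

```python
import numpy as np

def perona_malik_step(u, dt=0.1, kappa=0.1):
    """One explicit time step of Perona-Malik edge-preserving diffusion on a 2D grid."""
    # Differences to the four neighbours (periodic boundaries for brevity).
    dn = np.roll(u, -1, axis=0) - u
    ds = np.roll(u, 1, axis=0) - u
    de = np.roll(u, -1, axis=1) - u
    dw = np.roll(u, 1, axis=1) - u
    # Edge-stopping function g(s) = exp(-(s/kappa)^2) damps diffusion across strong edges.
    cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
    ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
    return u + dt * (cn * dn + cs * ds + ce * de + cw * dw)

# Toy usage on a noisy step image.
img = np.zeros((64, 64)); img[:, 32:] = 1.0
img += 0.05 * np.random.randn(*img.shape)
for _ in range(20):
    img = perona_malik_step(img)
```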

  3. A volume-based method for denoising on curved surfaces

    KAUST Repository

    Biddle, Harry; von Glehn, Ingrid; Macdonald, Colin B.; Marz, Thomas

    2013-01-01

    We demonstrate a method for removing noise from images or other data on curved surfaces. Our approach relies on in-surface diffusion: we formulate both the Gaussian diffusion and Perona-Malik edge-preserving diffusion equations in a surface-intrinsic way. Using the Closest Point Method, a recent technique for solving partial differential equations (PDEs) on general surfaces, we obtain a very simple algorithm where we merely alternate a time step of the usual Gaussian diffusion (and similarly Perona-Malik) in a small 3D volume containing the surface with an interpolation step. The method uses a closest point function to represent the underlying surface and can treat very general surfaces. Experimental results include image filtering on smooth surfaces, open surfaces, and general triangulated surfaces. © 2013 IEEE.

  4. Application of the risk-informed methodology for APR1400 P-T limits curve

    Energy Technology Data Exchange (ETDEWEB)

    Kim, K.; Namgung, I. [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2014-07-01

    A reactor pressure vessel (RPV) in a nuclear power plant has operational limits of pressure and temperature to prevent a potential drastic propagation of cracks due to brittle fracture. We call it a pressure-temperature limits curve (P-T limits curve). Appendix G of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code, Section XI, provides deterministic procedures to develop the P-T limits curve for the reactor pressure vessel. Recently, an alternative risk-informed methodology has been added in the ASME Code. Risk-informed means that we can consider insights from a probabilistic risk assessment by using this methodology. This alternative methodology provides a simple procedure to develop risk-informed P-T limits for heat up, cool down, and hydrostatic test events. The risk-informed P-T limits curve is known to provide more operational flexibility, particularly for reactor pressure vessels with relatively high irradiation levels and radiation sensitive materials. In this paper, we developed both the deterministic and a risk-informed P-T limits curve for an APR1400 reactor using Appendix G of the ASME Code, Section XI and compare the results in terms of additional operational margin. (author)

  5. Additively manufactured metallic porous biomaterials based on minimal surfaces

    DEFF Research Database (Denmark)

    Bobbert, F. S. L.; Lietaert, K.; Eftekhari, Ali Akbar

    2017-01-01

    Porous biomaterials that simultaneously mimic the topological, mechanical, and mass transport properties of bone are in great demand but are rarely found in the literature. In this study, we rationally designed and additively manufactured (AM) porous metallic biomaterials based on four different...... of bone properties is feasible, biomaterials that could simultaneously mimic all or most of the relevant bone properties are rare. We used rational design and additive manufacturing to develop porous metallic biomaterials that exhibit an interesting combination of topological, mechanical, and mass...

  6. Genome association study through nonlinear mixed models revealed new candidate genes for pig growth curves

    Directory of Open Access Journals (Sweden)

    Fabyano Fonseca e Silva

    Full Text Available ABSTRACT: Genome association analyses have been successful in identifying quantitative trait loci (QTLs) for pig body weights measured at a single age. However, when considering the whole weight trajectories over time in the context of genome association analyses, it is important to look at the markers that affect growth curve parameters. The easiest way to consider them is via the two-step method, in which the growth curve parameters and marker effects are estimated separately, thereby resulting in a reduction of the statistical power and the precision of estimates. One efficient solution is to adopt nonlinear mixed models (NMM), which enable joint modeling of the individual growth curves and marker effects. Our aim was to propose a genome association analysis for growth curves in pigs based on NMM as well as to compare it with the traditional two-step method. In addition, we also aimed to identify the nearest candidate genes related to significant SNP (single nucleotide polymorphism) markers. The NMM presented a higher number of significant SNPs for adult weight (A) and maturity rate (K), and provided a direct way to test SNP significance simultaneously for both the A and K parameters. Furthermore, all significant SNPs from the two-step method were also reported in the NMM analysis. The ontology of the three candidate genes (SH3BGRL2, MAPK14, and MYL9) derived from significant SNPs (simultaneously affecting A and K) allows us to make inferences with regard to their contribution to the pig growth process in the population studied.
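    To make the A and K parameters concrete, here is a minimal per-animal sketch, assuming hypothetical ages and weights and a Gompertz growth curve; the abstract does not state which growth function was used, and the paper's NMM fits all animals and marker effects jointly rather than one curve at a time.

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, A, b, K):
    """Gompertz growth curve: A = adult weight (kg), K = maturity rate (1/day)."""
    return A * np.exp(-b * np.exp(-K * t))

# Hypothetical body weights (kg) of a single pig at ages in days.
age = np.array([1.0, 30.0, 60.0, 90.0, 120.0, 150.0, 180.0])
weight = np.array([1.5, 8.0, 22.0, 45.0, 70.0, 92.0, 108.0])

(A, b, K), _ = curve_fit(gompertz, age, weight, p0=[120.0, 4.0, 0.02])
print(f"adult weight A = {A:.1f} kg, maturity rate K = {K:.4f} per day")
```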

  7. Interior Temperature Measurement Using Curved Mercury Capillary Sensor Based on X-ray Radiography

    Science.gov (United States)

    Chen, Shuyue; Jiang, Xing; Lu, Guirong

    2017-07-01

    A method was presented for measuring the interior temperature of objects using a curved mercury capillary sensor based on X-ray radiography. The sensor is composed of a mercury bubble, a capillary, and a fixed support. X-ray digital radiography was employed to capture images of the mercury column in the capillary, and a temperature control system was designed for the sensor calibration. We adopted livewire algorithms and mathematical morphology to calculate the mercury length. A measurement model relating mercury length to temperature was established, and the measurement uncertainty associated with the mercury column length and the linear model fitted by the least-squares method were analyzed. To verify the system, the interior temperature of an autoclave, which is totally closed, was measured from 29.53°C to 67.34°C. The experimental results show that the response of the system is approximately linear, with a maximum uncertainty of 0.79°C. This technique provides a new approach to measuring the interior temperature of objects.

  8. Multilayer Strip Dipole Antenna Using Stacking Technique and Its Application for Curved Surface

    Directory of Open Access Journals (Sweden)

    Charinsak Saetiaw

    2013-01-01

    Full Text Available This paper presents the design of a multilayer strip dipole antenna made by stacking flexible copper-clad laminate, intended for curved surfaces on cylindrical objects. The designed antenna reduces the effects of curving because the relative lengths change in each stacked flexible copper-clad laminate layer. Since the curvature differs for each layer of the antenna, the resonance frequency of the resulting extended antenna is more stable than that of a conventional antenna when it is curved or attached to cylindrical objects. The multilayer antenna is designed at 920 MHz for UHF RFID applications.

  9. A novel knot selection method for the error-bounded B-spline curve fitting of sampling points in the measuring process

    International Nuclear Information System (INIS)

    Liang, Fusheng; Zhao, Ji; Ji, Shijun; Zhang, Bing; Fan, Cheng

    2017-01-01

    The B-spline curve has been widely used in the reconstruction of measurement data. Error-bounded reconstruction of sampling points can be achieved by B-spline curve fitting based on the knot addition method (KAM). In KAM, the selection pattern of the initial knot vector has been associated with the number of knots ultimately necessary. This paper provides a novel initial knot selection method to condense the knot vector required for error-bounded B-spline curve fitting. The initial knots are determined by the distribution of features, namely the chord length (arc length) and bending degree (curvature), contained in the discrete sampling points. Firstly, the sampling points are fitted into an approximate B-spline curve Gs with a dense uniform knot vector to substitute for the description of the features of the sampling points. The feature integral of Gs is built as a monotonically increasing function in analytic form. Then, the initial knots are selected according to constant increments of the feature integral. After that, an iterative knot insertion (IKI) process starting from the initial knots is introduced to improve the fitting precision, and the ultimate knot vector for the error-bounded B-spline curve fitting is obtained. Lastly, two simulations and a measurement experiment are provided, and the results indicate that the proposed knot selection method can reduce the number of knots ultimately required. (paper)
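    For orientation, the sketch below shows a generic error-bounded B-spline fit with iterative knot insertion, assuming synthetic data and SciPy's LSQUnivariateSpline; knots are inserted at the point of largest residual, not by the feature-integral rule proposed in the paper.

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

np.random.seed(0)
x = np.linspace(0.0, 1.0, 400)                       # parameter values of the sampling points
y = np.sin(8 * np.pi * x) * np.exp(-2 * x) + 0.002 * np.random.randn(x.size)

tol = 0.01                                           # error bound on the fit
knots = list(np.linspace(0.1, 0.9, 5))               # coarse initial interior knots

for _ in range(50):                                  # cap the number of knot insertions
    spline = LSQUnivariateSpline(x, y, knots, k=3)   # least-squares cubic B-spline fit
    err = np.abs(spline(x) - y)
    if err.max() <= tol:
        break
    # Insert a knot where the residual is largest (a simple iterative knot insertion step).
    new_knot = float(np.clip(x[err.argmax()], x[2], x[-3]))
    if new_knot in knots:
        break
    knots = sorted(knots + [new_knot])

print(f"{len(knots)} interior knots, max error {err.max():.4f}")
```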

  10. Development of the curve of Spee.

    Science.gov (United States)

    Marshall, Steven D; Caspersen, Matthew; Hardinger, Rachel R; Franciscus, Robert G; Aquilino, Steven A; Southard, Thomas E

    2008-09-01

    Ferdinand Graf von Spee is credited with characterizing human occlusal curvature viewed in the sagittal plane. This naturally occurring phenomenon has clinical importance in orthodontics and restorative dentistry, yet we have little understanding of when, how, or why it develops. The purpose of this study was to expand our understanding by examining the development of the curve of Spee longitudinally in a sample of untreated subjects with normal occlusion from the deciduous dentition to adulthood. Records of 16 male and 17 female subjects from the Iowa Facial Growth Study were selected and examined. The depth of the curve of Spee was measured on their study models at 7 time points from ages 4 (deciduous dentition) to 26 (adult dentition) years. The Wilcoxon signed rank test was used to compare changes in the curve of Spee depth between time points. For each subject, the relative eruption of the mandibular teeth was measured from corresponding cephalometric radiographs, and its contribution to the developing curve of Spee was ascertained. In the deciduous dentition, the curve of Spee is minimal. At mean ages of 4.05 and 5.27 years, the average curve of Spee depths are 0.24 and 0.25 mm, respectively. With change to the transitional dentition, corresponding to the eruption of the mandibular permanent first molars and central incisors (mean age, 6.91 years), the curve of Spee depth increases significantly (P < 0.0001) to a mean maximum depth of 1.32 mm. The curve of Spee then remains essentially unchanged until eruption of the second molars (mean age, 12.38 years), when the depth increases (P < 0.0001) to a mean maximum depth of 2.17 mm. In the adolescent dentition (mean age, 16.21 years), the depth decreases slightly (P = 0.0009) to a mean maximum depth of 1.98 mm, and, in the adult dentition (mean age 26.98 years), the curve remains unchanged (P = 0.66), with a mean maximum depth of 2.02 mm. No significant differences in curve of Spee development were found between

  11. Improved detection of genetic markers of antimicrobial resistance by hybridization probe-based melting curve analysis using primers to mask proximal mutations: examples include the influenza H275Y substitution.

    Science.gov (United States)

    Whiley, David M; Jacob, Kevin; Nakos, Jennifer; Bletchly, Cheryl; Nimmo, Graeme R; Nissen, Michael D; Sloots, Theo P

    2012-06-01

    Numerous real-time PCR assays have been described for detection of the influenza A H275Y alteration. However, the performance of these methods can be undermined by sequence variation in the regions flanking the codon of interest. This is a problem encountered more broadly in microbial diagnostics. In this study, we developed a modification of hybridization probe-based melting curve analysis, whereby primers are used to mask proximal mutations in the sequence targets of hybridization probes, so as to limit the potential for sequence variation to interfere with typing. The approach was applied to the H275Y alteration of the influenza A (H1N1) 2009 strain, as well as a Neisseria gonorrhoeae mutation associated with antimicrobial resistance. Assay performances were assessed using influenza A and N. gonorrhoeae strains characterized by DNA sequencing. The modified hybridization probe-based approach proved successful in limiting the effects of proximal mutations, with the results of melting curve analyses being 100% consistent with the results of DNA sequencing for all influenza A and N. gonorrhoeae strains tested. Notably, these included influenza A and N. gonorrhoeae strains exhibiting additional mutations in hybridization probe targets. Of particular interest was that the H275Y assay correctly typed influenza A strains harbouring a T822C nucleotide substitution, previously shown to interfere with H275Y typing methods. Overall our modified hybridization probe-based approach provides a simple means of circumventing problems caused by sequence variation, and offers improved detection of the influenza A H275Y alteration and potentially other resistance mechanisms.

  12. GLOBAL AND STRICT CURVE FITTING METHOD

    NARCIS (Netherlands)

    Nakajima, Y.; Mori, S.

    2004-01-01

    To find a global and smooth curve fitting, cubic B-Spline method and gathering-line methods are investigated. When segmenting and recognizing a contour curve of character shape, some global method is required. If we want to connect contour curves around a singular point like crossing points,

  13. Determination of isodose curves in Radiotherapy using an Alanine/ESR dosemeter

    International Nuclear Information System (INIS)

    Chen, F.; Baffa, O.; Graeff, C.F.O.

    1998-01-01

    The possible use of an alanine/ESR dosemeter for mapping isodose curves in normal radiotherapy treatments was studied. A batch of 150 dosemeters was manufactured from a mixture of DL-alanine powder (80%) and paraffin (20%). Each dosemeter is 4.7 mm in diameter and 12 mm in length. A group of 100 dosemeters from the batch was arranged inside 50 holes of slice 25 of the Rando Man phantom. The phantom was irradiated in two opposed projections (AP and PA) on a Co-60 unit. A group of 15 dosemeters from the same batch was used to obtain the calibration curve in the 1-20 Gy range. After irradiation, the signal of each dosemeter was measured in an ESR spectrometer operating in the X-band (∼9.5 GHz), and the width of the central line of the alanine ESR spectrum was correlated with the radiation dose. The width-dose calibration curve was linear, with a correlation coefficient of 0.9996. The isodose curves obtained show profiles quite similar to the theoretical curves. (Author)

  14. Simulating Supernova Light Curves

    Energy Technology Data Exchange (ETDEWEB)

    Even, Wesley Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dolence, Joshua C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-05

    This report discusses supernova light simulations. A brief review of supernovae, basics of supernova light curves, simulation tools used at LANL, and supernova results are included. Further, it happens that many of the same methods used to generate simulated supernova light curves can also be used to model the emission from fireballs generated by explosions in the earth’s atmosphere.

  15. Simulating Supernova Light Curves

    International Nuclear Information System (INIS)

    Even, Wesley Paul; Dolence, Joshua C.

    2016-01-01

    This report discusses supernova light simulations. A brief review of supernovae, basics of supernova light curves, simulation tools used at LANL, and supernova results are included. Further, it happens that many of the same methods used to generate simulated supernova light curves can also be used to model the emission from fireballs generated by explosions in the earth's atmosphere.

  16. A direct method to solve optimal knots of B-spline curves: An application for non-uniform B-spline curves fitting.

    Directory of Open Access Journals (Sweden)

    Van Than Dung

    Full Text Available B-spline functions are widely used in many industrial applications such as computer graphic representations, computer aided design, computer aided manufacturing, computer numerical control, etc. Recently, there have been demands, e.g. in the reverse engineering (RE) area, to employ B-spline curves for non-trivial cases that include curves with discontinuous points, cusps, or turning points in the sampled data. The most challenging task in these cases is identifying the number of knots and their respective locations in non-uniform space at the lowest computational cost. This paper presents a new strategy for fitting any form of curve by B-spline functions via a local algorithm. A new two-step method for fast knot calculation is proposed. In the first step, the data is split using a bisecting method with a predetermined allowable error to obtain coarse knots. Secondly, the knots are optimized, for both locations and continuity levels, by employing a non-linear least squares technique. The B-spline function is, therefore, obtained by solving the ordinary least squares problem. The performance of the proposed method is validated using various numerical experimental data, with and without simulated noise, generated by a B-spline function and deterministic parametric functions. This paper also discusses the benchmarking of the proposed method against existing methods in the literature. The proposed method is shown to be able to reconstruct B-spline functions from sampled data within acceptable tolerance. It is also shown that the proposed method can be applied to fit any type of curve, ranging from smooth curves to discontinuous ones. In addition, the method does not require excessive computational cost, which allows it to be used in automatic reverse engineering applications.

  17. The Kepler Light Curves of AGN: A Detailed Analysis

    Science.gov (United States)

    Smith, Krista Lynne; Mushotzky, Richard F.; Boyd, Patricia T.; Malkan, Matt; Howell, Steve B.; Gelino, Dawn M.

    2018-04-01

    We present a comprehensive analysis of 21 light curves of Type 1 active galactic nuclei (AGN) from the Kepler spacecraft. First, we describe the necessity and development of a customized pipeline for treating Kepler data of stochastically variable sources like AGN. We then present the light curves, power spectral density functions (PSDs), and flux histograms. The light curves display an astonishing variety of behaviors, many of which would not be detected in ground-based studies, including switching between distinct flux levels. Six objects exhibit PSD flattening at characteristic timescales that roughly correlate with black hole mass. These timescales are consistent with orbital timescales or free-fall accretion timescales. We check for correlations of variability and high-frequency PSD slope with accretion rate, black hole mass, redshift, and luminosity. We find that bolometric luminosity is anticorrelated with both variability and steepness of the PSD slope. We do not find evidence of the linear rms–flux relationships or lognormal flux distributions found in X-ray AGN light curves, indicating that reprocessing is not a significant contributor to optical variability at the 0.1%–10% level.
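    As a pointer to the kind of analysis described above, here is a minimal sketch of computing a power spectral density, assuming a toy, evenly sampled light curve and SciPy's periodogram; real Kepler data require the customized pipeline the authors describe, and PSD slopes and breaks are then fitted to the resulting spectrum.

```python
import numpy as np
from scipy.signal import periodogram

# Toy, evenly sampled light curve: red-noise-like flux at ~30-minute cadence over 90 days.
dt = 0.0204                                   # sampling interval in days
t = np.arange(0.0, 90.0, dt)
flux = 1.0 + 0.01 * np.cumsum(np.random.randn(t.size)) * np.sqrt(dt)

# Power spectral density of the mean-subtracted light curve (frequencies in cycles/day).
freq, psd = periodogram(flux - flux.mean(), fs=1.0 / dt)
# A broken power law fitted to (freq, psd) would then reveal any characteristic break timescale.
```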

  18. Application of Geodetic VLBI Data to Obtaining Long-Term Light Curves for Astrophysics

    Science.gov (United States)

    Kijima, Masachika

    2010-01-01

    The long-term light curve is important for research on binary black holes and disk instability in AGNs. Light curves have been drawn mainly using single-dish data provided by the University of Michigan Radio Observatory and the Metsahovi Radio Observatory. Hence, thus far, research has been limited to those sources. I attempt to draw light curves using VLBI data for sources that have not been monitored by any observatory with a single dish. I developed software, analyzed all geodetic VLBI data available at the IVS Data Centers, and drew the light curves at 8 GHz. In this report, I show tentative results for two AGNs. I compared two light curves of 4C39.25, which were drawn based on single-dish data and on VLBI data, and confirmed that the two light curves were consistent. Furthermore, I succeeded in drawing the light curve of 0454-234 with VLBI data, a source that has not been monitored by any observatory with a single dish. In this report, I suggest that the geodetic VLBI archive data are useful for obtaining long-term light curves at radio bands for astrophysics.

  19. Fermions in curved spacetimes

    Energy Technology Data Exchange (ETDEWEB)

    Lippoldt, Stefan

    2016-01-21

    In this thesis we study a formulation of Dirac fermions in curved spacetime that respects general coordinate invariance as well as invariance under local spin base transformations. We emphasize the advantages of the spin base invariant formalism both from a conceptual as well as from a practical viewpoint. This suggests that local spin base invariance should be added to the list of (effective) properties of (quantum) gravity theories. We find support for this viewpoint by the explicit construction of a global realization of the Clifford algebra on a 2-sphere which is impossible in the spin-base non-invariant vielbein formalism. The natural variables for this formulation are spacetime-dependent Dirac matrices subject to the Clifford-algebra constraint. In particular, a coframe, i.e. vielbein field is not required. We disclose the hidden spin base invariance of the vielbein formalism. Explicit formulas for the spin connection as a function of the Dirac matrices are found. This connection consists of a canonical part that is completely fixed in terms of the Dirac matrices and a free part that can be interpreted as spin torsion. The common Lorentz symmetric gauge for the vielbein is constructed for the Dirac matrices, even for metrics which are not linearly connected. Under certain criteria, it constitutes the simplest possible gauge, demonstrating why this gauge is so useful. Using the spin base formulation for building a field theory of quantized gravity and matter fields, we show that it suffices to quantize the metric and the matter fields. This observation is of particular relevance for field theory approaches to quantum gravity, as it can serve for a purely metric-based quantization scheme for gravity even in the presence of fermions. Hence, in the second part of this thesis we critically examine the gauge, and the field-parametrization dependence of renormalization group flows in the vicinity of non-Gaussian fixed points in quantum gravity. While physical

  20. An appraisal of the learning curve in robotic general surgery.

    Science.gov (United States)

    Pernar, Luise I M; Robertson, Faith C; Tavakkoli, Ali; Sheu, Eric G; Brooks, David C; Smink, Douglas S

    2017-11-01

    Robotic-assisted surgery is used with increasing frequency in general surgery for a variety of applications. In spite of this increase in usage, the learning curve is not yet defined. This study reviews the literature on the learning curve in robotic general surgery to inform adopters of the technology. PubMed and EMBASE searches yielded 3690 abstracts published between July 1986 and March 2016. The abstracts were evaluated based on the following inclusion criteria: written in English, reporting original work, focus on general surgery operations, and with explicit statistical methods. Twenty-six full-length articles were included in final analysis. The articles described the learning curves in colorectal (9 articles, 35%), foregut/bariatric (8, 31%), biliary (5, 19%), and solid organ (4, 15%) surgery. Eighteen of 26 (69%) articles report single-surgeon experiences. Time was used as a measure of the learning curve in all studies (100%); outcomes were examined in 10 (38%). In 12 studies (46%), the authors identified three phases of the learning curve. Numbers of cases needed to achieve plateau performance were wide-ranging but overlapping for different kinds of operations: 19-128 cases for colorectal, 8-95 for foregut/bariatric, 20-48 for biliary, and 10-80 for solid organ surgery. Although robotic surgery is increasingly utilized in general surgery, the literature provides few guidelines on the learning curve for adoption. In this heterogeneous sample of reviewed articles, the number of cases needed to achieve plateau performance varies by case type and the learning curve may have multiple phases as surgeons add more complex cases to their case mix with growing experience. Time is the most common determinant for the learning curve. The literature lacks a uniform assessment of outcomes and complications, which would arguably reflect expertise in a more meaningful way than time to perform the operation alone.

  1. Lactation curves and economic results of Saanen goats fed increasing dietary energy levels obtained by the addition of calcium salts of fatty acids

    Directory of Open Access Journals (Sweden)

    Rodrigo de Souza

    2014-02-01

    Full Text Available The objective of this study was to evaluate the use of calcium salts of fatty acids (CSFA) to increase the dietary energy levels for Saanen goats and their effects on the lactation curve, dry matter intake, body weight, and economic results of the goats. Twenty multiparous goats, weighing an average of 63.5±10.3 kg, were randomly assigned to one of four treatment groups, each receiving one of the following dietary energy levels: a control diet consisting of 2.6 Mcal of metabolizable energy per kg of dry matter (Mcal ME/kg DM) or a test diet supplemented with CSFA (Lactoplus®) to obtain 2.7, 2.8, or 2.9 Mcal ME/kg DM. Goats were housed in individual stalls and were fed and milked twice daily. The animals were evaluated until 180 days in milk by measuring dry matter intake and milk yield. These measurements were used to calculate feed efficiencies and the cost-benefit ratio of the diet, and lactation curves were fitted using Wood's nonlinear model. Increasing dietary energy levels showed no effect on body weight. Supplementation with CSFA did not limit dry matter intake; however, it changed the shape of the lactation curve by promoting a late peak lactation with a longer duration. Milk yields at 180 days in milk increased quadratically, with a maximum at the energy level of 2.85 Mcal ME/kg DM. Increasing the dietary energy level for Saanen goats using CSFA changes their lactation curves, with the best milk production achieved with a 2.85 Mcal ME/kg DM diet; however, the greatest economic results were obtained with a 2.7 Mcal ME/kg DM diet.
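    Since the abstract relies on Wood's nonlinear model, here is a minimal sketch of fitting it to a single lactation, assuming hypothetical days-in-milk and yield values; the study itself fits curves per treatment group and compares them economically.

```python
import numpy as np
from scipy.optimize import curve_fit

def wood(t, a, b, c):
    """Wood's lactation curve: daily yield as a function of days in milk."""
    return a * t ** b * np.exp(-c * t)

# Hypothetical daily milk yields (kg) of one goat.
dim = np.array([10.0, 30.0, 60.0, 90.0, 120.0, 150.0, 180.0])
milk = np.array([2.8, 3.4, 3.6, 3.3, 3.0, 2.7, 2.4])

(a, b, c), _ = curve_fit(wood, dim, milk, p0=[2.0, 0.2, 0.005])
peak_day = b / c                     # day of peak yield implied by the fitted curve
print(f"a = {a:.2f}, b = {b:.3f}, c = {c:.4f}, peak at day {peak_day:.0f}")
```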

  2. A dose-response curve for biodosimetry from a 6 MV electron linear accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Lemos-Pinto, M.M.P.; Cadena, M.; Santos, N.; Fernandes, T.S.; Borges, E.; Amaral, A., E-mail: marcelazoo@yahoo.com.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Departamento de Energia Nuclear

    2015-10-15

    Biological dosimetry (biodosimetry) is based on the investigation of radiation-induced biological effects (biomarkers), mainly dicentric chromosomes, in order to correlate them with radiation dose. To interpret the dicentric score in terms of absorbed dose, a calibration curve is needed. Each curve should be constructed with respect to basic physical parameters, such as the type of ionizing radiation characterized by low or high linear energy transfer (LET) and dose rate. This study was designed to obtain dose calibration curves by scoring of dicentric chromosomes in peripheral blood lymphocytes irradiated in vitro with a 6 MV electron linear accelerator (Mevatron M, Siemens, USA). Two software programs, CABAS (Chromosomal Aberration Calculation Software) and Dose Estimate, were used to generate the curve. The two software programs are discussed; the results obtained were compared with each other and with other published low LET radiation curves. Both software programs resulted in identical linear and quadratic terms for the curve presented here, which was in good agreement with published curves for similar radiation quality and dose rates. (author)
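    For illustration, a minimal sketch of fitting the linear-quadratic calibration curve described above, assuming hypothetical dose points and dicentric yields; dedicated tools such as CABAS or Dose Estimate additionally handle Poisson weighting and uncertainty propagation.

```python
import numpy as np
from scipy.optimize import curve_fit

def linear_quadratic(dose, c, alpha, beta):
    """Dicentric yield per cell versus absorbed dose for low-LET radiation."""
    return c + alpha * dose + beta * dose ** 2

# Hypothetical calibration points: dose (Gy) and observed dicentrics per cell.
dose = np.array([0.0, 0.25, 0.5, 1.0, 2.0, 3.0, 4.0])
dics = np.array([0.001, 0.010, 0.030, 0.090, 0.280, 0.550, 0.920])

(c, alpha, beta), _ = curve_fit(linear_quadratic, dose, dics, p0=[0.001, 0.02, 0.05])
print(f"Y = {c:.4f} + {alpha:.4f} D + {beta:.4f} D^2")
```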

  3. A dose-response curve for biodosimetry from a 6 MV electron linear accelerator.

    Science.gov (United States)

    Lemos-Pinto, M M P; Cadena, M; Santos, N; Fernandes, T S; Borges, E; Amaral, A

    2015-10-01

    Biological dosimetry (biodosimetry) is based on the investigation of radiation-induced biological effects (biomarkers), mainly dicentric chromosomes, in order to correlate them with radiation dose. To interpret the dicentric score in terms of absorbed dose, a calibration curve is needed. Each curve should be constructed with respect to basic physical parameters, such as the type of ionizing radiation characterized by low or high linear energy transfer (LET) and dose rate. This study was designed to obtain dose calibration curves by scoring of dicentric chromosomes in peripheral blood lymphocytes irradiated in vitro with a 6 MV electron linear accelerator (Mevatron M, Siemens, USA). Two software programs, CABAS (Chromosomal Aberration Calculation Software) and Dose Estimate, were used to generate the curve. The two software programs are discussed; the results obtained were compared with each other and with other published low LET radiation curves. Both software programs resulted in identical linear and quadratic terms for the curve presented here, which was in good agreement with published curves for similar radiation quality and dose rates.

  4. Evaluation of R-curves in ceramic materials based on bridging interactions

    International Nuclear Information System (INIS)

    Fett, T.; Munz, D.

    1991-10-01

    In coarse-grained alumina the crack growth resistance increases with increasing crack extension due to crack-border interactions. The crack shielding stress intensity factor can be calculated from the relation between the bridging stresses and the crack opening displacement. The parameters of this relation can be obtained from experimental results on stable or subcritical crack extension. Finally, the effect of the R-curve on the behaviour of components with small cracks is discussed. (orig.) [de

  5. Shaping the learning curve: epigenetic dynamics in neural plasticity

    Directory of Open Access Journals (Sweden)

    Zohar Ziv Bronfman

    2014-07-01

    Full Text Available A key characteristic of learning and neural plasticity is state-dependent acquisition dynamics reflected by the non-linear learning curve that links increase in learning with practice. Here we propose that the manner by which epigenetic states of individual cells change during learning contributes to the shape of the neural and behavioral learning curve. We base our suggestion on recent studies showing that epigenetic mechanisms such as DNA methylation, histone acetylation and RNA-mediated gene regulation are intimately involved in the establishment and maintenance of long-term neural plasticity, reflecting specific learning-histories and influencing future learning. Our model, which is the first to suggest a dynamic molecular account of the shape of the learning curve, leads to several testable predictions regarding the link between epigenetic dynamics at the promoter, gene-network and neural-network levels. This perspective opens up new avenues for therapeutic interventions in neurological pathologies.

  6. Measuring the actual I-131 thyroid uptake curve with a collar detector system: a feasibility study

    Energy Technology Data Exchange (ETDEWEB)

    Brinks, Peter; Van Gils, Koen; Dickerscheid, Dennis B.M.; Habraken, Jan B.A. [Department of Medical Physics, St. Antonius Hospital, Nieuwegein (Netherlands); Kranenborg, Ellen; Lavalaye, Jules [Department of Nuclear Medicine, St. Antonius Hospital, Nieuwegein (Netherlands)

    2017-06-15

    Radionuclide therapy using I-131 is commonly used for the treatment of benign thyroid diseases. The therapeutic dose to be administered is calculated based on the type of disease, the volume of the thyroid, and the measured uptake percentage. This methodology assumes a similar biological half-life of iodine, whereas in reality a large variation in biological half-life is observed. More knowledge about the actual biological half-life of iodine for individual patients will improve the quantification of the delivered radiation dose during radioiodine therapy and could aid the evaluation of the success of the therapy. In this feasibility study we used a novel measurement device [Collar Therapy Indicator (CoTI)] to measure the uptake curve of patients undergoing I-131 radioiodine therapy. The CoTI device is a light-weight wearable device that contains two independent gamma radiation detectors that are placed in a collar. By comparing results of thyroid uptake measurements with results obtained with a gamma camera, the precision of the system is demonstrated. Additionally, for three patients the uptake curve is measured during 48 h of admission in the hospital. The presented results demonstrate the feasibility of the new measurement device to measure the uptake curve during radioiodine therapy. (orig.)
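    To illustrate how such an uptake curve feeds into dosimetry, below is a minimal sketch, assuming hypothetical detector counts over the 48 h admission and a simple mono-exponential washout; a real analysis would also model the uptake phase and convert activity to absorbed dose.

```python
import numpy as np
from scipy.optimize import curve_fit

def washout(t_hours, a0, t_eff):
    """Mono-exponential washout with effective half-life t_eff (hours)."""
    return a0 * np.exp(-np.log(2.0) * t_hours / t_eff)

# Hypothetical detector counts during the 48 h admission (washout phase only).
t = np.array([2.0, 6.0, 12.0, 24.0, 36.0, 48.0])
counts = np.array([980.0, 940.0, 890.0, 800.0, 720.0, 650.0])

(a0, t_eff), _ = curve_fit(washout, t, counts, p0=[1000.0, 100.0])
t_phys = 8.02 * 24.0                              # physical half-life of I-131 in hours
t_bio = 1.0 / (1.0 / t_eff - 1.0 / t_phys)        # 1/T_eff = 1/T_phys + 1/T_bio
print(f"effective half-life {t_eff:.0f} h, biological half-life {t_bio:.0f} h")
```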

  7. The effect of Tricresyl-Phosphate (TCP) as an additive on wear of Iron (Fe)

    Science.gov (United States)

    Ghose, Hiren M.; Ferrante, John; Honecy, Frank C.

    1987-01-01

    The effect of tricresyl phosphate (TCP) as an antiwear additive in the lubricant trimethylol propane triheptanoate (TMPTH) was investigated. The objective was to examine step-loading wear by use of surface analysis, wetting, and chemical bonding changes in the lubricant. The investigation consisted of step-loading wear studies with a pin-on-disk tribometer, studies of the effects on wear related to wetting by contact angle and surface tension measurements of various liquid systems, analysis of the chemical bonding changes between the lubricant and TCP by chromatographic analysis, and determination of the reaction between TCP and metal surfaces through wear scar analysis by Auger emission spectroscopy (AES). The step-loading curve for the base fluid alone shows a rapid increase of wear rate with load. The step-loading curve for the base fluid in the presence of 4.25 percent by volume TCP under dry air purge shows a great reduction of wear rate at all loads studied. It was also found that the addition of 4.25 percent by volume TCP plus 0.33 percent by volume water to the base lubricant under N2 purge also greatly reduces the wear rate at all loads studied. AES surface analysis reveals a phosphate-type wear-resistant film, which greatly increases load-bearing capacity, formed on the iron disk. Preliminary chromatographic studies suggest that this film forms either because of ester oxidation or TCP degradation. Wetting studies show a direct correlation between the spreading coefficient and the wear rate.

  8. Trigonometric Characterization of Some Plane Curves

    Indian Academy of Sciences (India)

    IAS Admin

    (Figure 1). A relation between tan θ and tan ψ gives the trigonometric equation of the family of curves. In this article, trigonometric equations of some known plane curves are deduced and it is shown that these equations reveal some geometric characteristics of the families of the curves under consideration. In Section 2,

  9. Colloidal-based additive manufacturing of bio-inspired composites

    Science.gov (United States)

    Studart, Andre R.

    Composite materials in nature exhibit heterogeneous architectures that are tuned to fulfill the functional demands of the surrounding environment. Examples range from the cellulose-based organic structure of plants to highly mineralized collagen-based skeletal parts like bone and teeth. Because they are often utilized to combine opposing properties such as strength and low-density or stiffness and wear resistance, the heterogeneous architecture of natural materials can potentially address several of the technical limitations of artificial homogeneous composites. However, current man-made manufacturing technologies do not allow for the level of composition and fiber orientation control found in natural heterogeneous systems. In this talk, I will present two additive manufacturing technologies recently developed in our group to build composites with exquisite architectures only rivaled by structures made by living organisms in nature. Since the proposed techniques utilize colloidal suspensions as feedstock, understanding the physics underlying the stability, assembly and rheology of the printing inks is key to predict and control the architecture of manufactured parts. Our results will show that additive manufacturing routes offer a new exciting pathway for the fabrication of biologically-inspired composite materials with unprecedented architectures and functionalities.

  10. Conservatism of ASME KIR-reference curve with respect to crack arrest

    International Nuclear Information System (INIS)

    Wallin, K.; Rintamaa, R.; Nagel, G.

    1999-01-01

    The conservatism of the RT_NDT temperature indexing parameter and the ASME K_IR-reference curve with respect to crack arrest toughness has been evaluated. Based on an analysis of the original ASME K_Ia data, it was established that inherently, the ASME K_IR-reference curve corresponds to an overall 5% lower bound curve with respect to crack arrest. It was shown that the scatter of crack arrest toughness is essentially material independent and has a standard deviation of 18%, and the temperature dependence of K_Ia has the same form as predicted by the master curve for crack initiation toughness. The 'built in' offset between the mean 100 MPa√m crack arrest temperature, TK_Ia, and RT_NDT is 38 °C (TK_Ia = RT_NDT + 38 °C), and the experimental relation between TK_Ia and NDT is TK_Ia = NDT + 28 °C. The K_IR-reference curve using NDT as the reference temperature will be conservative with respect to the general 5% lower bound K_Ia(5%) curve, with a 75% confidence. The use of RT_NDT, instead of NDT, will generally increase the degree of conservatism, both for non-irradiated as well as irradiated materials, close to a 95% confidence level. This trend is pronounced for materials with Charpy-V upper shelf energies below 100 J. It is shown that the K_IR-curve effectively constitutes a deterministic lower bound curve for crack arrest. The findings are valid both for nuclear pressure vessel plates, forgings and welds. (orig.)

  11. Conservatism of ASME KIR-reference curve with respect to crack arrest

    International Nuclear Information System (INIS)

    Wallin, K.; Rintamaa, R.; Nagel, G.

    2001-01-01

    The conservatism of the RT_NDT temperature indexing parameter and the ASME K_IR-reference curve with respect to crack arrest toughness has been evaluated. Based on an analysis of the original ASME K_Ia data, it was established that inherently, the ASME K_IR-reference curve corresponds to an overall 5% lower bound curve with respect to crack arrest. It was shown that the scatter of crack arrest toughness is essentially material independent and has a standard deviation (S.D.) of 18%, and the temperature dependence of K_Ia has the same form as predicted by the master curve for crack initiation toughness. The 'built in' offset between the mean 100 MPa√m crack arrest temperature, TK_Ia, and RT_NDT is 38 °C (TK_Ia = RT_NDT + 38 °C), and the experimental relation between TK_Ia and NDT is TK_Ia = NDT + 28 °C. The K_IR-reference curve using NDT as the reference temperature will be conservative with respect to the general 5% lower bound K_Ia(5%) curve, with a 75% confidence. The use of RT_NDT, instead of NDT, will generally increase the degree of conservatism, both for non-irradiated as well as irradiated materials, close to a 95% confidence level. This trend is pronounced for materials with Charpy-V upper shelf energies below 100 J. It is shown that the K_IR-curve effectively constitutes a deterministic lower bound curve for crack arrest. The findings are valid both for nuclear pressure vessel plates, forgings and welds.

  12. The learning curve: implications of a quantitative analysis.

    Science.gov (United States)

    Gallistel, Charles R; Fairhurst, Stephen; Balsam, Peter

    2004-09-07

    The negatively accelerated, gradually increasing learning curve is an artifact of group averaging in several commonly used basic learning paradigms (pigeon autoshaping, delay- and trace-eye-blink conditioning in the rabbit and rat, autoshaped hopper entry in the rat, plus maze performance in the rat, and water maze performance in the mouse). The learning curves for individual subjects show an abrupt, often step-like increase from the untrained level of responding to the level seen in the well trained subject. The rise is at least as abrupt as that commonly seen in psychometric functions in stimulus detection experiments. It may indicate that the appearance of conditioned behavior is mediated by an evidence-based decision process, as in stimulus detection experiments. If the appearance of conditioned behavior is taken instead to reflect the increase in an underlying associative strength, then a negligible portion of the function relating associative strength to amount of experience is behaviorally visible. Consequently, rate of learning cannot be estimated from the group-average curve; the best measure is latency to the onset of responding, determined for each subject individually.

  13. Can CCM law properly represent all extinction curves?

    International Nuclear Information System (INIS)

    Geminale, Anna; Popowski, Piotr

    2005-01-01

    We present the analysis of a large sample of lines of sight with extinction curves covering wavelength range from near-infrared (NIR) to ultraviolet (UV). We derive total-to-selective extinction ratios based on the Cardelli, Clayton and Mathis (1989, CCM) law, which is typically used to fit the extinction data both for diffuse and dense interstellar medium. We conclude that the CCM law is able to fit most of the extinction curves in our sample. We divide the remaining lines of sight with peculiar extinction into two groups according to two main behaviors: a) the optical/IR or/and UV wavelength region cannot be reproduced by the CCM formula; b) the optical/NIR and UV extinction data are best fit by the CCM law with different values of R_V. We present examples of such curves. The study of both types of peculiar cases can help us to learn about the physical processes that affect dust in the interstellar medium, e.g., formation of mantles on the surface of grains, evaporation, growing or shattering

  14. Considerations for reference pump curves

    International Nuclear Information System (INIS)

    Stockton, N.B.

    1992-01-01

    This paper examines problems associated with inservice testing (IST) of pumps to assess their hydraulic performance using reference pump curves to establish acceptance criteria. Safety-related pumps at nuclear power plants are tested under the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code (the Code), Section XI. The Code requires testing pumps at specific reference points of differential pressure or flow rate that can be readily duplicated during subsequent tests. There are many cases where test conditions cannot be duplicated. For some pumps, such as service water or component cooling pumps, the flow rate at any time depends on plant conditions and the arrangement of multiple independent and constantly changing loads. System conditions cannot be controlled to duplicate a specific reference value. In these cases, utilities frequently request to use pump curves for comparison of test data for acceptance. There is no prescribed method for developing a pump reference curve. The methods vary and may yield substantially different results. Some results are conservative when compared to the Code requirements; some are not. The errors associated with different curve testing techniques should be understood and controlled within reasonable bounds. Manufacturers' pump curves, in general, are not sufficiently accurate to use as reference pump curves for IST. Testing using reference curves generated with polynomial least squares fits over limited ranges of pump operation, cubic spline interpolation, or cubic spline least squares fits can provide a measure of pump hydraulic performance that is at least as accurate as the Code-required method. Regardless of the test method, error can be reduced by using more accurate instruments, by correcting for systematic errors, by increasing the number of data points, and by taking repetitive measurements at each data point
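    As a small illustration of one of the curve-fitting techniques mentioned above, here is a sketch of a least-squares polynomial reference curve over a limited operating range; the flow and differential-pressure values and the acceptance check are hypothetical, not taken from the paper or from any Code requirement.

```python
import numpy as np

# Hypothetical baseline test data: flow rate (gpm) versus differential pressure (psid).
flow = np.array([500.0, 750.0, 1000.0, 1250.0, 1500.0, 1750.0, 2000.0])
dp = np.array([118.0, 113.0, 106.0, 97.0, 86.0, 72.0, 56.0])

# Least-squares polynomial reference curve, valid only over the tested flow range.
reference_curve = np.poly1d(np.polyfit(flow, dp, deg=3))

# A later inservice test point is compared against the curve at its measured flow.
measured_flow, measured_dp = 1100.0, 101.0
expected_dp = reference_curve(measured_flow)
deviation = (measured_dp - expected_dp) / expected_dp
print(f"expected {expected_dp:.1f} psid, deviation {deviation:+.1%}")
```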

  15. Graphical interpretation of confidence curves in rankit plots

    DEFF Research Database (Denmark)

    Hyltoft Petersen, Per; Blaabjerg, Ole; Andersen, Marianne

    2004-01-01

    A well-known transformation from the bell-shaped Gaussian (normal) curve to a straight line in the rankit plot is investigated, and a tool for evaluation of the distribution of reference groups is presented. It is based on the confidence intervals for percentiles of the calculated Gaussian distri...

  16. Management of the learning curve

    DEFF Research Database (Denmark)

    Pedersen, Peter-Christian; Slepniov, Dmitrij

    2016-01-01

    Purpose – This paper focuses on the management of the learning curve in overseas capacity expansions. The purpose of this paper is to unravel the direct as well as indirect influences on the learning curve and to advance the understanding of how these affect its management. Design...... the dimensions of the learning process involved in a capacity expansion project and identified the direct and indirect labour influences on the production learning curve. On this basis, the study proposes solutions to managing learning curves in overseas capacity expansions. Furthermore, the paper concludes...... with measures that have the potential to significantly reduce the non-value-added time when establishing new capacities overseas. Originality/value – The paper uses a longitudinal in-depth case study of a Danish wind turbine manufacturer and goes beyond a simplistic treatment of the lead time and learning...

  17. Computing segmentations directly from x-ray projection data via parametric deformable curves

    DEFF Research Database (Denmark)

    Dahl, Vedrana Andersen; Dahl, Anders Bjorholm; Hansen, Per Christian

    2018-01-01

    We describe an efficient algorithm that computes a segmented reconstruction directly from x-ray projection data. Our algorithm uses a parametric curve to define the segmentation. Unlike similar approaches which are based on level-sets, our method avoids a pixel or voxel grid; hence the number...... of unknowns is reduced to the set of points that define the curve, and attenuation coefficients of the segments. Our current implementation uses a simple closed curve and is capable of separating one object from the background. However, our basic algorithm can be applied to an arbitrary topology and multiple...

  18. Description of saturation curves and boiling process of dry air

    Directory of Open Access Journals (Sweden)

    Vestfálová Magda

    2018-01-01

    Air is a mixture of gases forming the gaseous envelope of the Earth. It consists of dry air, moisture and other pollutants. Dry air is a substance whose thermodynamic properties in the gaseous state, as well as those of its main constituents in the gaseous state, are generally known and described in detail in the literature. Liquid air is a bluish liquid and is used industrially to produce oxygen, nitrogen, argon and helium by distillation. The transition between the gaseous and liquid states (the condensation process or, in the reverse direction, the boiling process) is usually displayed in the basic thermodynamic diagrams using saturation curves. The saturation curves of all pure substances have a similar shape. However, since dry air is a mixture, the shapes of its saturation curves are modified relative to those of pure substances. This paper describes the saturation curves of dry air as a mixture, i.e., the phase-change (boiling) process of dry air. The dry air saturation curves are constructed in the basic thermodynamic charts from values obtained from the literature. On the basis of these diagrams, data appearing in various publications are interpreted and put into context with the boiling process of dry air.

  19. Computational aspects of algebraic curves

    CERN Document Server

    Shaska, Tanush

    2005-01-01

    The development of new computational techniques and better computing power has made it possible to attack some classical problems of algebraic geometry. The main goal of this book is to highlight such computational techniques related to algebraic curves. The area of research in algebraic curves is receiving more interest not only from the mathematics community, but also from engineers and computer scientists, because of the importance of algebraic curves in applications including cryptography, coding theory, error-correcting codes, digital imaging, computer vision, and many more.This book cove

  20. Study on elastic-plastic deformation analysis using a cyclic stress-strain curve

    International Nuclear Information System (INIS)

    Igari, Toshihide; Setoguchi, Katsuya; Yamauchi, Masafumi

    1983-01-01

    This paper presents the results of elastic-plastic deformation analysis using a cyclic stress-strain curve, with the intention of applying this method to predicting low-cycle fatigue life. Uniaxial plastic cycling tests were performed on 2 1/4Cr-1Mo steel to investigate the correspondence between the cyclic stress-strain curve and the hysteresis loop, and also to determine which mathematical model should be used for the analysis of deformation at stress reversal. Furthermore, a cyclic in-plane bending test was performed on a flat plate to verify the validity of the theoretical analysis based on the cyclic stress-strain curve. The results obtained are as follows: (1) The cyclic stress-strain curve corresponds closely to the ascending branch of the hysteresis loop scaled by a factor of 1/2 in both stress and strain; for simplicity, the cyclic stress-strain curve can therefore be determined from the shape of the hysteresis loop. (2) Elastic-plastic deformation analysis using the cyclic stress-strain curve is both practical and effective for predicting the cyclic elastic-plastic deformation of structures at advanced cycles, and the Masing model is a suitable mathematical model for such an analysis. (author)
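    A minimal sketch of the Masing-type construction described in results (1) and (2) is given below, assuming a Ramberg-Osgood form for the cyclic stress-strain curve with hypothetical constants; the paper itself works with measured curves for 2 1/4Cr-1Mo steel, so this is only an illustration of the factor-of-two scaling rule.

    ```python
    import numpy as np

    # Assumed Ramberg-Osgood form of the cyclic stress-strain curve (MPa units):
    #   strain = stress/E + (stress/K')**(1/n')
    E, K_prime, n_prime = 200e3, 1200.0, 0.15   # hypothetical material constants

    def cyclic_strain(stress):
        return stress / E + (stress / K_prime) ** (1.0 / n_prime)

    # Masing's rule: a hysteresis branch after a load reversal is the cyclic curve
    # scaled by a factor of 2 in both stress and strain, consistent with the
    # observation that the cyclic curve is the ascending loop branch scaled by 1/2.
    def hysteresis_branch(delta_stress):
        return 2.0 * cyclic_strain(delta_stress / 2.0)

    stress_amp = 400.0                               # hypothetical stress amplitude, MPa
    for ds in np.linspace(0.0, 2 * stress_amp, 5):   # stress range along the branch
        print(f"Δσ = {ds:6.1f} MPa -> Δε = {hysteresis_branch(ds):.5f}")
    ```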

  1. The experience curve, option value, and the energy paradox

    International Nuclear Information System (INIS)

    Ansar, Jasmin; Sparks, Roger

    2009-01-01

    This paper develops a model to explain the 'energy paradox,' the inclination of households and firms to require very high internal rates of return in order to make energy-saving investments. The model abstracts from many features of such investments to focus on their irreversibility, the uncertainty of their future payoff streams, and the investor's anticipation of future technological advance. In this setting, the decision to invest in energy-saving technology can be delayed, providing option value. In addition, delay allows the potential investor to cash in on future experience-curve effects: With the passage of time, firms gain practical knowledge in producing and installing the energy-saving technology, enabling them to reduce the technology's up-front cost per unit of energy saved. We incorporate these fundamentals into a stochastic model where the investment's discounted benefits follow geometric Brownian motion. To demonstrate the model's capabilities, we generate simulation results for photovoltaic systems that highlight the experience-curve effect as a fundamental reason why households and firms delay making energy-saving investments until internal rates of return exceed values of 50% and higher, consistent with observations in the economics literature. We also explore altruistic motivations for energy conservation and the model's implications for both 'additionality' and the design of energy-conservation policy
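    The interplay of option value and experience-curve effects can be sketched with a small Monte Carlo experiment: the discounted benefits follow geometric Brownian motion while the up-front cost declines with time as a stand-in for experience effects, so deferring the investment can raise its expected value. All parameter values, the fixed-deferral decision rule and the cost-decline rate below are assumptions for illustration and do not reproduce the authors' model.

    ```python
    import numpy as np

    # Hypothetical parameters for an energy-saving investment
    B0, mu, sigma = 100.0, 0.02, 0.25     # initial PV of benefits and its GBM drift/volatility
    C0, learn_rate = 120.0, 0.08          # up-front cost today; assumed ~8 %/yr experience-curve decline
    r, horizon, n_paths = 0.05, 10, 20000

    rng = np.random.default_rng(0)

    def value_of_investing_in_year(t):
        """Expected discounted payoff if the investment is deferred to year t."""
        z = rng.standard_normal(n_paths)
        benefits_t = B0 * np.exp((mu - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
        cost_t = C0 * (1.0 - learn_rate) ** t        # experience-curve cost decline
        payoff = np.maximum(benefits_t - cost_t, 0)  # invest at t only if worthwhile then
        return np.exp(-r * t) * payoff.mean()

    for t in range(horizon + 1):
        print(f"defer {t:2d} yr: expected value {value_of_investing_in_year(t):6.2f}")
    ```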

  2. Spectral curve for open strings attached to the Y=0 brane

    International Nuclear Information System (INIS)

    Bajnok, Zoltán; Kim, Minkyoo; Palla, László

    2014-01-01

    The concept of spectral curve is generalized to open strings in AdS/CFT with integrability preserving boundary conditions. Our definition is based on the logarithms of the eigenvalues of the open monodromy matrix and makes possible to determine all the analytic, symmetry and asymptotic properties of the quasimomenta. We work out the details of the whole construction for the Y=0 brane boundary condition. The quasimomenta of open circular strings are explicitly calculated. We use the asymptotic solutions of the Y-system and the boundary Bethe Ansatz equations to recover the spectral curve in the strong coupling scaling limit. Using the curve the quasiclassical fluctuations of some open string solutions are also studied

  3. Optimal Sample Size for Probability of Detection Curves

    International Nuclear Information System (INIS)

    Annis, Charles; Gandossi, Luca; Martin, Oliver

    2012-01-01

    The use of Probability of Detection (POD) curves to quantify NDT reliability is common in the aeronautical industry, but relatively less so in the nuclear industry. The European Network for Inspection Qualification's (ENIQ) Inspection Qualification Methodology is based on the concept of Technical Justification, a document assembling all the evidence to assure that the NDT system in focus is indeed capable of finding the flaws for which it was designed. This methodology has become widely used in many countries, but the assurance it provides is usually of qualitative nature. The need to quantify the output of inspection qualification has become more important, especially as structural reliability modelling and quantitative risk-informed in-service inspection methodologies become more widely used. To credit the inspections in structural reliability evaluations, a measure of the NDT reliability is necessary. A POD curve provides such metric. In 2010 ENIQ developed a technical report on POD curves, reviewing the statistical models used to quantify inspection reliability. Further work was subsequently carried out to investigate the issue of optimal sample size for deriving a POD curve, so that adequate guidance could be given to the practitioners of inspection reliability. Manufacturing of test pieces with cracks that are representative of real defects found in nuclear power plants (NPP) can be very expensive. Thus there is a tendency to reduce sample sizes and in turn reduce the conservatism associated with the POD curve derived. Not much guidance on the correct sample size can be found in the published literature, where often qualitative statements are given with no further justification. The aim of this paper is to summarise the findings of such work. (author)
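    A common way to derive a POD curve from hit/miss data is a log-logistic model fitted by maximum likelihood; the sketch below shows this with invented inspection data. It is offered only as an illustration of the kind of statistical model such reports review, not as the ENIQ methodology, and the a90 point estimate shown omits the confidence bounds needed for an a90/95 value.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical hit/miss inspection data: flaw size (mm) and detection outcome (1 = hit)
    size = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0, 6.0, 7.0, 8.0])
    hit  = np.array([  0,   0,   0,   1,   0,   1,   1,   1,   1,   1,   1,   1])

    # Log-logistic POD model POD(a) = 1 / (1 + exp(-(ln a - mu)/s)), fitted by maximum likelihood
    def neg_log_lik(params):
        mu, s = params
        p = 1.0 / (1.0 + np.exp(-(np.log(size) - mu) / s))
        p = np.clip(p, 1e-9, 1 - 1e-9)
        return -np.sum(hit * np.log(p) + (1 - hit) * np.log(1 - p))

    res = minimize(neg_log_lik, x0=[np.log(3.0), 0.5], method="Nelder-Mead")
    mu_hat, s_hat = res.x
    a90 = np.exp(mu_hat + s_hat * np.log(0.9 / 0.1))   # flaw size with 90 % POD
    print(f"a90 ≈ {a90:.2f} mm (point estimate; a90/95 requires confidence bounds)")
    ```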

  4. DNA-based catalytic enantioselective intermolecular oxa-Michael addition reactions

    NARCIS (Netherlands)

    Megens, Rik P.; Roelfes, Gerard

    2012-01-01

    Using the DNA-based catalysis concept, a novel Cu(II) catalyzed enantioselective oxa-Michael addition of alcohols to enones is reported. Enantioselectivities of up to 86% were obtained. The presence of water is important for the reactivity, possibly by reverting unwanted side reactions such as

  5. Toward precise potential energy curves for diatomic molecules, derived from experimental line positions

    International Nuclear Information System (INIS)

    Helm, H.

    1984-01-01

    An inverted, first-order perturbation approach is used to derive potential energy curves for diatomic molecules from experimental line positions of molecular bands. The concept adopted here is based on the inverted perturbation analysis (IPA) proposed by Kosman and Hinze, but uses radial eigenfunctions of the trial potential energy curves as basis sets for the perturbation correction. By using molecular line positions rather than molecular energy levels, we circumvent the need to define molecular constants for the molecule prior to the derivation of the potential energy curves. (Author)

  6. Derivation of Path Independent Coupled Mixed Mode Cohesive Laws from Fracture Resistance Curves

    DEFF Research Database (Denmark)

    Goutianos, Stergios

    2016-01-01

    A generalised approach is presented to derive coupled mixed mode cohesive laws described with physical parameters such as peak traction, critical opening, fracture energy and cohesive shape. The approach is based on deriving mixed mode fracture resistance curves from an effective mixed mode cohesive...... law at different mode mixities. From the fracture resistance curves, the normal and shear stresses of the cohesive laws can be obtained by differentiation. Since the mixed mode cohesive laws are obtained from a fracture resistance curve (potential function), path independence is automatically......
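    The differentiation step can be sketched numerically: given a fracture resistance curve J_R as a function of the end-opening, the cohesive traction follows as its derivative. The J-R curve shape, values and units below are assumed for illustration and are not taken from the paper.

    ```python
    import numpy as np

    # Hypothetical mode I fracture resistance curve: J_R (kJ/m^2) vs end-opening delta (mm)
    delta = np.linspace(0.0, 0.5, 51)
    J_R = 1.2 * (1.0 - np.exp(-delta / 0.1))        # assumed R-curve shape

    # The cohesive law follows by differentiating the resistance curve:
    #   sigma(delta) = dJ_R / d(delta);  kJ/m^2 per mm equals MPa
    sigma = np.gradient(J_R, delta)

    print(f"peak traction ≈ {sigma.max():.2f} MPa at delta ≈ {delta[sigma.argmax()]:.3f} mm")
    print(f"fracture energy ≈ {J_R[-1]:.2f} kJ/m^2 (plateau of the R-curve)")
    ```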

  7. Parametric study of guided waves dispersion curves for composite plates

    Science.gov (United States)

    Predoi, Mihai Valentin; Petre, Cristian Cǎtǎlin; Kettani, Mounsif Ech Cherif El; Leduc, Damien

    2018-02-01

    Nondestructive testing of composite panels benefits from the relatively long-range propagation of guided waves in sandwich structures. The guided waves are sensitive to delamination, air bubble inclusions and cracks, and can thus provide information about hidden defects in the composite panel. The preliminary data for all such inspections are the dispersion curves, which represent the dependence of the phase/group velocity on frequency for the propagating modes. In fact, all modes are more or less attenuated, so it is all the more important to compute dispersion curves that also provide the modal attenuation as a function of frequency. Another important aspect is the sensitivity of the dispersion curves to each of the elastic constants of the composite, which is orthotropic in most cases. All these aspects are investigated in the present work, based on our specially developed finite element numerical model implemented in Comsol, which has several advantages over existing methods. The dispersion curves and modal displacements are computed for an example composite plate. Comparison with literature data validates the accuracy of our results.
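    One small, generic piece of dispersion-curve work is recovering the group velocity from a phase-velocity curve via c_g = c_p^2 / (c_p - f dc_p/df). The sketch below applies this relation to an invented single-mode data set; it is unrelated to the finite element model described in the abstract, and the numbers are purely illustrative.

    ```python
    import numpy as np

    # Hypothetical phase-velocity dispersion data for one guided mode: c_p (m/s) vs frequency (Hz)
    freq = np.linspace(0.1e6, 1.0e6, 10)
    c_p = 5400.0 - 1.2e-3 * freq          # assumed, slowly decreasing phase velocity

    # Group velocity from the phase-velocity curve: c_g = c_p**2 / (c_p - f * dc_p/df)
    dcp_df = np.gradient(c_p, freq)
    c_g = c_p**2 / (c_p - freq * dcp_df)

    for f, cp, cg in zip(freq[::3], c_p[::3], c_g[::3]):
        print(f"f = {f/1e6:.2f} MHz: c_p = {cp:6.0f} m/s, c_g = {cg:6.0f} m/s")
    ```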

  8. String Sigma Models on Curved Supermanifolds

    Directory of Open Access Journals (Sweden)

    Roberto Catenacci

    2018-04-01

    We use the techniques of integral forms to analyze the easiest example of two-dimensional sigma models on a supermanifold. We write the action as an integral of a top integral form over a D = 2 supermanifold, and we show how to interpolate between different superspace actions. Then, we consider curved supermanifolds, and we show that the definitions used for flat supermanifolds can also be used for curved supermanifolds. We prove it by first considering the case of a curved rigid supermanifold and then the case of a generic curved supermanifold described by a single superfield E.

  9. Prevalence of Phosphorus-Based Additives in the Australian Food Supply: A Challenge for Dietary Education?

    Science.gov (United States)

    McCutcheon, Jemma; Campbell, Katrina; Ferguson, Maree; Day, Sarah; Rossi, Megan

    2015-09-01

    Phosphorus-based food additives may pose a significant risk in chronic kidney disease given the link between hyperphosphatemia and cardiovascular disease. The objective of the study was to determine the prevalence of phosphorus-based food additives in best-selling processed grocery products and to establish how they were reported on food labels. A data set of 3000 best-selling grocery items in Australia across 15 food and beverage categories was obtained for the 12 months ending December 2013 produced by the Nielsen Company's Homescan database. The nutrition labels of the products were reviewed in store for phosphorus additives. The type of additive, total number of additives, and method of reporting (written out in words or as an E number) were recorded. Presence of phosphorus-based food additives, number of phosphorus-based food additives per product, and the reporting method of additives on product ingredient lists. Phosphorus-based additives were identified in 44% of food and beverages reviewed. Additives were particularly common in the categories of small goods (96%), bakery goods (93%), frozen meals (75%), prepared foods (70%), and biscuits (65%). A total of 19 different phosphorus additives were identified across the reviewed products. From the items containing phosphorus additives, there was a median (minimum-maximum) of 2 (1-7) additives per product. Additives by E number (81%) was the most common method of reporting. Phosphorus-based food additives are common in the Australian food supply. This suggests that prioritizing phosphorus additive education may be an important strategy in the dietary management of hyperphosphatemia. Further research to establish a database of food items containing phosphorus-based additives is warranted. Copyright © 2015 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.

  10. Analysis of variation in calibration curves for Kodak XV radiographic film using model-based parameters.

    Science.gov (United States)

    Hsu, Shu-Hui; Kulasekere, Ravi; Roberson, Peter L

    2010-08-05

    Film calibration is time-consuming work when dose accuracy is essential while working in a range of photon scatter environments. This study uses the single-target single-hit model of film response to fit the calibration curves as a function of calibration method, processor condition, field size and depth. Kodak XV film was irradiated perpendicular to the beam axis in a solid water phantom. Standard calibration films (one dose point per film) were irradiated at 90 cm source-to-surface distance (SSD) for various doses (16-128 cGy), depths (0.2, 0.5, 1.5, 5, 10 cm) and field sizes (5 × 5, 10 × 10 and 20 × 20 cm²). The 8-field calibration method (eight dose points per film) was used as a reference for each experiment, taken at 95 cm SSD and 5 cm depth. The delivered doses were measured using an Attix parallel plate chamber for improved accuracy of dose estimation in the buildup region. Three fitting methods with one to three dose points per calibration curve were investigated for the field sizes of 5 × 5, 10 × 10 and 20 × 20 cm². The inter-day variation of model parameters (background, saturation and slope) were 1.8%, 5.7%, and 7.7% (1 σ) using the 8-field method. The saturation parameter ratio of standard to 8-field curves was 1.083 ± 0.005. The slope parameter ratio of standard to 8-field curves ranged from 0.99 to 1.05, depending on field size and depth. The slope parameter ratio decreases with increasing depth below 0.5 cm for the three field sizes. It increases with increasing depths above 0.5 cm. A calibration curve with one to three dose points fitted with the model is possible with 2% accuracy in film dosimetry for various irradiation conditions. The proposed fitting methods may reduce workload while providing energy dependence correction in radiographic film dosimetry. This study is limited to radiographic XV film with a Lumisys scanner.
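    A hedged sketch of fitting a single-target single-hit response curve is shown below, assuming the parameterization OD = background + saturation·(1 − exp(−slope·dose)). The calibration points are invented and the exact parameterization is an assumption consistent with the three parameters named in the abstract, not necessarily the authors' formulation.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Assumed single-target single-hit model of film response
    def single_hit(dose, background, saturation, slope):
        return background + saturation * (1.0 - np.exp(-slope * dose))

    # Hypothetical calibration points (dose in cGy, measured optical density)
    dose = np.array([16., 32., 64., 96., 128.])
    od   = np.array([0.28, 0.42, 0.68, 0.89, 1.05])

    params, cov = curve_fit(single_hit, dose, od, p0=[0.2, 2.0, 0.005])
    background, saturation, slope = params
    print(f"background={background:.3f}, saturation={saturation:.3f}, slope={slope:.5f} /cGy")

    # Invert the fitted curve to estimate dose from a measured optical density
    od_meas = 0.75
    dose_est = -np.log(1.0 - (od_meas - background) / saturation) / slope
    print(f"OD {od_meas} -> dose ≈ {dose_est:.1f} cGy")
    ```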

  11. Characterization of Type Ia Supernova Light Curves Using Principal Component Analysis of Sparse Functional Data

    Science.gov (United States)

    He, Shiyuan; Wang, Lifan; Huang, Jianhua Z.

    2018-04-01

    With growing data from ongoing and future supernova surveys, it is possible to empirically quantify the shapes of SNIa light curves in more detail, and to quantitatively relate the shape parameters with the intrinsic properties of SNIa. Building such relationships is critical in controlling systematic errors associated with supernova cosmology. Based on a collection of well-observed SNIa samples accumulated in the past years, we construct an empirical SNIa light curve model using a statistical method called the functional principal component analysis (FPCA) for sparse and irregularly sampled functional data. Using this method, the entire light curve of an SNIa is represented by a linear combination of principal component functions, and the SNIa is represented by a few numbers called “principal component scores.” These scores are used to establish relations between light curve shapes and physical quantities such as intrinsic color, interstellar dust reddening, spectral line strength, and spectral classes. These relations allow for descriptions of some critical physical quantities based purely on light curve shape parameters. Our study shows that some important spectral feature information is being encoded in the broad band light curves; for instance, we find that the light curve shapes are correlated with the velocity and velocity gradient of the Si II λ6355 line. This is important for supernova surveys (e.g., LSST and WFIRST). Moreover, the FPCA light curve model is used to construct the entire light curve shape, which in turn is used in a functional linear form to adjust intrinsic luminosity when fitting distance models.
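    The representation used in the paper, light curve ≈ mean function plus a few principal component functions weighted by per-object scores, can be illustrated on a dense common grid with ordinary SVD-based PCA. The actual FPCA method is designed for sparse, irregular sampling, so the sketch below, with simulated curves and invented shapes, is only a simplified analogue.

    ```python
    import numpy as np

    # Each light curve is modeled as  m_i(t) ≈ mean(t) + sum_k score_ik * phi_k(t)
    rng = np.random.default_rng(2)
    t = np.linspace(-10, 40, 60)                       # days relative to peak (hypothetical grid)
    mean_lc = 1.0 + 0.002 * (t - 5)**2                 # invented mean magnitude curve
    curves = mean_lc \
        + 0.3 * rng.standard_normal((100, 1)) * np.exp(-t / 30) \
        + 0.05 * rng.standard_normal((100, t.size))    # 100 simulated light curves

    centered = curves - curves.mean(axis=0)
    U, S, Vt = np.linalg.svd(centered, full_matrices=False)
    phi = Vt[:2]                                       # first two principal component functions
    scores = centered @ phi.T                          # per-supernova PC scores

    recon = curves.mean(axis=0) + scores @ phi         # low-dimensional reconstruction
    print("variance explained by 2 PCs:", (S[:2]**2).sum() / (S**2).sum())
    ```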

  12. Power Curve Measurements FGW

    DEFF Research Database (Denmark)

    Georgieva Yankova, Ginka; Federici, Paolo

    This report describes power curve measurements carried out on a given turbine in a chosen period. The measurements are carried out in accordance with IEC 61400-12-1 Ed. 1 and FGW Teil 2.

  13. Stage-discharge rating curves based on satellite altimetry and modeled discharge in the Amazon basin

    Science.gov (United States)

    Paris, Adrien; Dias de Paiva, Rodrigo; Santos da Silva, Joecila; Medeiros Moreira, Daniel; Calmant, Stephane; Garambois, Pierre-André; Collischonn, Walter; Bonnet, Marie-Paule; Seyler, Frederique

    2016-05-01

    In this study, rating curves (RCs) were determined by applying satellite altimetry to a poorly gauged basin. This study demonstrates the synergistic application of remote sensing and watershed modeling to capture the dynamics and quantity of flow in the Amazon River Basin, respectively. Three major advancements for estimating basin-scale patterns in river discharge are described. The first advancement is the preservation of the hydrological meanings of the parameters expressed by Manning's equation to obtain a data set containing the elevations of the river beds throughout the basin. The second advancement is the provision of parameter uncertainties and, therefore, the uncertainties in the rated discharge. The third advancement concerns estimating the discharge while considering backwater effects. We analyzed the Amazon Basin using nearly one thousand series that were obtained from ENVISAT and Jason-2 altimetry for more than 100 tributaries. Discharge values and related uncertainties were obtained from the rain-discharge MGB-IPH model. We used a global optimization algorithm based on the Monte Carlo Markov Chain and Bayesian framework to determine the rating curves. The data were randomly allocated into 80% calibration and 20% validation subsets. A comparison with the validation samples produced a Nash-Sutcliffe efficiency (Ens) of 0.68. When the MGB discharge uncertainties were less than 5%, the Ens value increased to 0.81 (mean). A comparison with the in situ discharge resulted in an Ens value of 0.71 for the validation samples (and 0.77 for calibration). The Ens values at the mouths of the rivers that experienced backwater effects significantly improved when the mean monthly slope was included in the RC. Our RCs were not mission-dependent, and the Ens value was preserved when applying ENVISAT rating curves to Jason-2 altimetry at crossovers. The cease-to-flow parameter of our RCs provided a good proxy for determining river bed elevation. This proxy was validated
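    The rating-curve form underlying such studies, Q = a·(h − h0)^b with h0 acting as an effective river-bed elevation (the cease-to-flow level), can be sketched with a simple nonlinear least-squares fit. The paper instead uses Monte Carlo Markov Chain/Bayesian estimation with explicit uncertainties, and the stage/discharge pairs below are invented.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Manning-style rating curve Q = a * (h - h0)**b, h0 ≈ effective river-bed elevation
    def rating(h, a, h0, b):
        return a * (h - h0) ** b

    # Hypothetical altimetric stages (m) paired with model-simulated discharges (m^3/s)
    h = np.array([12.1, 12.8, 13.5, 14.2, 15.0, 15.9, 16.7, 17.4])
    Q = np.array([2100., 3050., 4200., 5600., 7400., 9800., 12200., 14600.])

    # Bounds keep h0 below the lowest observed stage so (h - h0) stays positive
    params, _ = curve_fit(rating, h, Q, p0=[500.0, 10.0, 1.7],
                          bounds=([1.0, 0.0, 1.0], [1e4, 12.0, 3.0]))
    a, h0, b = params
    print(f"a = {a:.1f}, bed elevation h0 = {h0:.2f} m, exponent b = {b:.2f}")
    ```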

  14. [New population curves in Spanish extremely preterm neonates].

    Science.gov (United States)

    García-Muñoz Rodrigo, F; García-Alix Pérez, A; Figueras Aloy, J; Saavedra Santana, P

    2014-08-01

    Most anthropometric reference data for extremely preterm infants used in Spain are outdated and based on non-Spanish populations, or are derived from small hospital-based samples that failed to include neonates of borderline viability. The objective was to develop gender-specific, population-based curves for birth weight, length, and head circumference in extremely preterm Caucasian infants, using a large contemporary sample of Spanish singletons. Anthropometric data from neonates ≤ 28 weeks of gestational age were collected between January 2002 and December 2010 using the Spanish database SEN1500. Gestational age was estimated according to obstetric data (early pregnancy ultrasound). The data were analyzed with the SPSS 20 package, and centile tables were created for males and females using the Cole and Green LMS method. This study presents the first population-based growth curves for extremely preterm infants, including those of borderline viability, in Spain. A sexual dimorphism is evident for all of the studied parameters, starting at early gestation. These new gender-specific and population-based data could be useful for the improvement of growth assessments of extremely preterm infants in our country, for the development of epidemiological studies, for the evaluation of temporal trends, and for clinical or public health interventions seeking to optimize fetal growth. Copyright © 2013 Asociación Española de Pediatría. Published by Elsevier España. All rights reserved.
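    For readers unfamiliar with the Cole and Green LMS method mentioned above, the sketch below computes centiles and z-scores from a single (L, M, S) triplet at one gestational age; the triplet and the measurement value are hypothetical and are not taken from the SEN1500 curves.

    ```python
    import numpy as np
    from scipy.stats import norm

    # LMS (Cole and Green) centile computation at one gestational age.
    # L = skewness (Box-Cox power), M = median, S = coefficient of variation; values illustrative only.
    L, M, S = -0.3, 760.0, 0.16          # e.g., birth weight (g) at a given week

    def centile(p):
        z = norm.ppf(p)
        return M * (1.0 + L * S * z) ** (1.0 / L)

    def z_score(y):
        return ((y / M) ** L - 1.0) / (L * S)

    for p in (0.03, 0.10, 0.50, 0.90, 0.97):
        print(f"P{int(p*100):02d} ≈ {centile(p):6.0f} g")
    print(f"z-score of a 600 g neonate: {z_score(600.0):.2f}")
    ```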

  15. Use of supernovae light curves for testing the expansion hypothesis and other cosmological relations

    International Nuclear Information System (INIS)

    Rust, B.W.

    1974-01-01

    This thesis is primarily concerned with a test of the expansion hypothesis based on the relation Δt_obs = (1 + V_r/c)Δt_int, where Δt_int is the time lapse characterizing some phenomenon in a distant galaxy, Δt_obs is the observed time lapse and V_r is the symbolic velocity of recession. If the red shift is a Doppler effect, the observed time lapse should be lengthened by the same factor as the wavelength of the light. Many authors have suggested type I supernovae for such a test because of their great luminosity and the uniformity of their light curves, but apparently the test has heretofore never actually been performed. Thirty-six light curves were gathered from the literature and one (SN1971i) was measured. All of the light curves were reduced to a common (m_pg) photometric system. The comparison time lapse, Δt_c, was taken to be the time required for the brightness to fall from 0.5 mag below peak to 2.5 mag below peak. The straight-line regression of Δt_c on V_r gives a correlation coefficient significant at the 93 percent level, and the simple static Euclidean hypothesis is rejected at that level. The regression line also deviates from the prediction of the classical expansion hypothesis. Better agreement was obtained using the chronogeometric theory of I. E. Segal (1972 Astron. and Astrophys. 18, 143), but the scatter in the present data makes it impossible to distinguish between these alternative hypotheses at the 95 percent confidence level. The question of how many additional light curves would be needed to give definite tests is addressed. It is shown that at the present rate of supernova discoveries, only a few more years would be required to obtain the necessary data if light curves are systematically measured for the more distant supernovae. (Diss. Abstr. Int., B)
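    The core of the test is a straight-line regression of the comparison time lapse on V_r/c: the expansion hypothesis predicts Δt_obs = Δt_int(1 + V_r/c), i.e. slope ≈ intercept ≈ Δt_int, while a static model predicts no trend. The sketch below uses invented data purely to show the regression, not the thesis' light curves.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical decline times (days from 0.5 mag to 2.5 mag below peak) for type I
    # supernovae at different symbolic recession velocities v_r (km/s).
    v_r  = np.array([1200., 3500., 6000., 9500., 14000., 21000., 30000.])
    dt_c = np.array([24.6, 24.8, 25.0, 25.3, 25.7, 26.3, 27.0])
    c = 299792.458  # km/s

    # Expansion hypothesis: dt_obs = dt_int * (1 + v_r/c); static model: zero slope
    slope, intercept, r, p, se = stats.linregress(v_r / c, dt_c)
    print(f"dt_int ≈ {intercept:.1f} d, slope ≈ {slope:.1f} d (expansion predicts ≈ dt_int)")
    print(f"correlation r = {r:.3f}, p = {p:.4f}")
    ```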

  16. LEARNING CURVE IN SINGLE-LEVEL MINIMALLY INVASIVE TLIF: EXPERIENCE OF A NEUROSURGEON

    Directory of Open Access Journals (Sweden)

    Samuel Romano-Feinholz

    Objective: To describe the learning curve that shows the progress of a single neurosurgeon when performing single-level MI-TLIF. Methods: We included 99 consecutive patients who underwent single-level MI-TLIF by the same neurosurgeon (JASS). Patients' demographic characteristics were analyzed. In addition, surgical time, intraoperative blood loss and hospital stay were evaluated. The learning curves were calculated with a piecewise regression model. Results: The mean age was 54.6 years. The learning curves showed an inverse relationship between surgical experience and the variable analyzed, reaching an inflection point for surgical time at case 43 and for blood loss at case 48. The mean surgical time was 203.3 minutes (interquartile range [IQR] 150-240 minutes), intraoperative bleeding was 97.4 ml (IQR 40-100 ml) and hospital stay was four days (IQR 3-5 days). Conclusions: MI-TLIF is a frequently performed procedure owing to its effectiveness and safety, with results similar to those of the open procedure. According to this study, the learning curve required is slightly longer than for open procedures, and is reached after about 45 cases.
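    The piecewise regression used to locate the inflection point can be sketched by scanning candidate breakpoints and fitting two straight lines on either side; the simulated case series and the exhaustive breakpoint search below are illustrative assumptions, not the authors' data or statistical software.

    ```python
    import numpy as np

    # Hypothetical series: surgical time (min) for consecutive single-level MI-TLIF cases,
    # improving up to roughly case 45 and flat thereafter, plus noise.
    rng = np.random.default_rng(3)
    case = np.arange(1, 100)
    time = np.where(case < 45, 260 - 1.8 * case, 260 - 1.8 * 45) + rng.normal(0, 12, case.size)

    # Piecewise (segmented) regression: scan candidate breakpoints, fit two lines, keep lowest SSE
    def sse_for_breakpoint(k):
        left = np.polyfit(case[case <= k], time[case <= k], 1)
        right = np.polyfit(case[case > k], time[case > k], 1)
        resid = np.concatenate([
            time[case <= k] - np.polyval(left, case[case <= k]),
            time[case > k] - np.polyval(right, case[case > k]),
        ])
        return np.sum(resid**2)

    best = min(range(10, 90), key=sse_for_breakpoint)
    print(f"estimated inflection point: case {best}")
    ```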

  17. Construction of calibration curve for accountancy tank

    International Nuclear Information System (INIS)

    Kato, Takayuki; Goto, Yoshiki; Nidaira, Kazuo

    2009-01-01

    A reprocessing plant is equipped with tanks for accounting for solutions of nuclear material. Careful measurement of the volume in these tanks is very important for implementing rigorous accounting of nuclear material. A calibration curve relating the volume and the level of solution needs to be constructed, where the level is determined from the differential pressure of dip tubes. Several calibration curve segments are usually employed, but it is not explicitly decided how many segments to use, where to place the segment boundaries, or what degree of polynomial to fit. These parameters, i.e., the segmentation and the polynomial degree, are mutually interrelated and together determine the performance of the calibration curve. Here we present a construction technique that gives optimum calibration curves, and describe their characteristics. (author)

  18. Incorporating Experience Curves in Appliance Standards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Garbesi, Karina; Chan, Peter; Greenblatt, Jeffery; Kantner, Colleen; Lekov, Alex; Meyers, Stephen; Rosenquist, Gregory; Buskirk, Robert Van; Yang, Hung-Chia; Desroches, Louis-Benoit

    2011-10-31

    The technical analyses in support of U.S. energy conservation standards for residential appliances and commercial equipment have typically assumed that manufacturing costs and retail prices remain constant during the projected 30-year analysis period. There is, however, considerable evidence that this assumption does not reflect real market prices. Costs and prices generally fall in relation to cumulative production, a phenomenon known as experience and modeled by a fairly robust empirical experience curve. Using price data from the Bureau of Labor Statistics, and shipment data obtained as part of the standards analysis process, we present U.S. experience curves for room air conditioners, clothes dryers, central air conditioners, furnaces, and refrigerators and freezers. These allow us to develop more representative appliance price projections than the assumption-based approach of constant prices. These experience curves were incorporated into recent energy conservation standards for these products. The impact on the national modeling can be significant, often increasing the net present value of potential standard levels in the analysis. In some cases a previously cost-negative potential standard level demonstrates a benefit when incorporating experience. These results imply that past energy conservation standards analyses may have undervalued the economic benefits of potential standard levels.
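    A minimal sketch of an experience-curve price projection is shown below, using the standard form P = P0·(Q/Q0)^(−b), where b is derived from a learning rate per doubling of cumulative shipments. The product, prices, shipment forecast and learning rate are all hypothetical and are not the values used in the standards analyses.

    ```python
    import numpy as np

    # Experience-curve price model: P = P0 * (Q/Q0)**(-b), Q = cumulative production.
    # A learning rate LR (price drop per doubling of cumulative shipments) maps to b via
    #   b = -log2(1 - LR)
    P0, Q0 = 900.0, 50e6          # hypothetical current price ($) and cumulative shipments
    learning_rate = 0.15          # assumed 15 % price decline per doubling
    b = -np.log2(1.0 - learning_rate)

    years = np.arange(0, 31, 5)
    annual_shipments = 8e6                     # assumed constant shipment forecast
    Q = Q0 + annual_shipments * years
    price = P0 * (Q / Q0) ** (-b)
    for y, p in zip(years, price):
        print(f"year {y:2d}: projected price ${p:6.0f}")
    ```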

  19. Magnetization curves of sintered heavy tungsten alloys for applications in MRI-guided radiotherapy

    International Nuclear Information System (INIS)

    Kolling, Stefan; Oborn, Bradley M.; Keall, Paul J.; Horvat, Joseph

    2014-01-01

    Purpose: Due to the current interest in MRI-guided radiotherapy, the magnetic properties of the materials commonly used in radiotherapy are becoming increasingly important. In this paper, measurement results for the magnetization (BH) curves of a range of sintered heavy tungsten alloys used in radiation shielding and collimation are presented. Methods: Sintered heavy tungsten alloys typically contain >90% tungsten, with binder metals such as nickel and iron making up the remainder; the magnetization M of each alloy sample was measured and the BH curve derived. Results: The iron content of the alloys was found to play a dominant role, directly influencing the magnetization M and thus the nonlinearity of the BH curve. Generally, the saturation magnetization increased with increasing iron content of the alloy. Furthermore, no measurable magnetization was found for any of the alloys without iron content, despite their containing up to 6% nickel. For two samples from different manufacturers but with identical quoted nominal elemental composition (95% W, 3.5% Ni, 1.5% Fe), a relative difference in the magnetization of 11%–16% was measured. Conclusions: The measured curves show that the magnetic properties of sintered heavy tungsten alloys strongly depend on the iron content, whereas the addition of nickel in the absence of iron led to no measurable effect. Since a difference in the BH curves was observed for two samples with identical quoted nominal composition from different manufacturers, measuring the BH curve for each individual batch of heavy tungsten alloys is advisable whenever accurate knowledge of the magnetic properties is crucial. The obtained BH curves can be used in FEM simulations to predict the magnetic impact of sintered heavy tungsten alloys.

  20. 51Cr - erythrocyte survival curves

    International Nuclear Information System (INIS)

    Paiva Costa, J. de.

    1982-07-01

    Sixteen subjects were studied: fifteen patients in a hemolytic state and one normal individual as a control. The aim was to obtain better techniques for the analysis of erythrocyte survival curves, according to the recommendations of the International Committee of Hematology. The radiochromium (51Cr) method was used as the tracer. A review of the international literature on aspects relevant to this work was first carried out, making it possible to establish comparisons and clarify phenomena observed in our investigation. Several parameters affecting both the exponential and the linear curves were considered in this study. The analysis of the erythrocyte survival curves in the studied group revealed that the elution factor did not give a quantitatively homogeneous response in all cases, although the results of the analysis of these curves were established using programs run on an electronic calculator. (Author) [pt]