WorldWideScience

Sample records for normalized cut method

  1. Color Restoration Method Based on Spectral Information Using Normalized Cut

    Institute of Scientific and Technical Information of China (English)

    Tetsuro Morimoto; Tohru Mihashi; Katsushi Ikeuchi

    2008-01-01

    This paper proposes a novel method for color restoration that can effectively apply accurate, spectrally based color to an image segmented with the normalized cut technique. Using the proposed method, we can obtain a digital still camera image and spectral information in different environments. Unlike other methods, it is not necessary to estimate reflectance spectra using a spectral database. The synthesized images are accurate and of high resolution, and the proposed method works effectively for producing digital archive content. Some experimental results are demonstrated in this paper.

  2. Wedge cutting of mild steel by CO2 laser and cut-quality assessment in relation to normal cutting

    Science.gov (United States)

    Yilbas, B. S.; Karatas, C.; Uslan, I.; Keles, O.; Usta, Y.; Yilbas, Z.; Ahsan, M.

    2008-10-01

    In some applications, laser cutting of wedge surfaces cannot be avoided in sheet metal processing, and the quality of the end product defines the applicability of the laser-cutting process in such situations. In the present study, CO2 laser cutting of wedge surfaces as well as normal surfaces (normal to the laser beam axis) is considered and the end product quality is assessed using the international standards for thermal cutting. The cut surfaces are examined by optical microscopy, and geometric features of the cut edges such as out-of-flatness and dross height are measured from the micrographs. A neural network is introduced to classify the striation patterns of the cut surfaces. It is found that the dross height and out-of-flatness are influenced significantly by the laser output power, particularly for the wedge-cutting situation. Moreover, the cut quality improves at a certain value of the laser power intensity.

  3. Spectral segmentation of polygonized images with normalized cuts

    Energy Technology Data Exchange (ETDEWEB)

    Matsekh, Anna [Los Alamos National Laboratory]; Skurikhin, Alexei [Los Alamos National Laboratory]; Rosten, Edward [University of Cambridge]

    2009-01-01

    We analyze numerical behavior of the eigenvectors corresponding to the lowest eigenvalues of the generalized graph Laplacians arising in the Normalized Cuts formulations of the image segmentation problem on coarse polygonal grids.
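
    As background for the eigenproblem referred to above, the following minimal Python sketch (not the authors' code) sets up the standard Normalized Cuts relaxation: an affinity matrix W is built over the image elements, the degree matrix D is formed, the generalized eigenproblem (D - W) y = lambda D y is solved, and a bipartition is read off the second-smallest eigenvector. The Gaussian affinity and the toy one-dimensional "image" are illustrative assumptions.

        import numpy as np
        from scipy.linalg import eigh

        # Toy 1-D "image": two flat regions separated by an intensity jump.
        intensity = np.array([0.10, 0.12, 0.11, 0.90, 0.88, 0.91])

        # Affinity matrix W from a Gaussian kernel on intensity differences
        # (spatial terms omitted for brevity).
        sigma = 0.2
        diff = intensity[:, None] - intensity[None, :]
        W = np.exp(-(diff ** 2) / (2.0 * sigma ** 2))
        np.fill_diagonal(W, 0.0)

        # Degree matrix D and graph Laplacian L = D - W.
        D = np.diag(W.sum(axis=1))
        L = D - W

        # Generalized eigenproblem (D - W) y = lambda * D * y; eigh returns
        # eigenvalues in ascending order, so column 1 holds the second-smallest eigenvector.
        vals, vecs = eigh(L, D)
        partition = vecs[:, 1] > 0.0   # bipartition by sign (or by a searched threshold)
        print(vals[:3], partition)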

  4. NEURONAL WHITE MATTER PARCELLATION USING SPATIALLY COHERENT NORMALIZED CUTS.

    Science.gov (United States)

    Bloy, Luke; Ingalhalikar, Madhura; Verma, Ragini

    2011-01-01

    This work presents an automated method for partitioning neuronal white matter (WM) into regions of interest with uniform WM architecture. These regions can then be used to replace atlas-derived regions in any subsequent statistical analysis. The fiber orientation distribution function is used as a model of WM architecture, resulting in a voxel similarity function sensitive to both fiber orientations and configurations. The method applies the normalized cuts algorithm to partition WM voxels based on this similarity function, along with a connected component labeling algorithm to ensure spatial compactness. We illustrate the algorithm's ability to discern regions based on both orientation and complexity through its application to a simulated fiber crossing and an in vivo dataset.

  5. Correlation methods in cutting arcs

    Energy Technology Data Exchange (ETDEWEB)

    Prevosto, L; Kelly, H, E-mail: prevosto@waycom.com.ar [Grupo de Descargas Electricas, Departamento Ing. Electromecanica, Universidad Tecnologica Nacional, Regional Venado Tuerto, Laprida 651, Venado Tuerto (2600), Santa Fe (Argentina)

    2011-05-01

    The present work applies similarity theory to the plasma emanating from transferred-arc, gas-vortex stabilized plasma cutting torches to analyze the existing correlation between the arc temperature and the physical parameters of such torches. It has been found that the enthalpy number significantly influences the temperature of the electric arc. The obtained correlation shows an average deviation of 3% from the temperature data points. Such a correlation can be used, for instance, to predict changes in the peak value of the arc temperature at the nozzle exit of a geometrically similar cutting torch due to changes in its operating parameters.

  6. Normalized cut group clustering of resting-state FMRI data.

    Directory of Open Access Journals (Sweden)

    Martijn van den Heuvel

    Full Text Available BACKGROUND: Functional brain imaging studies have indicated that distinct anatomical brain regions can show coherent spontaneous neuronal activity during rest. Regions that show such correlated behavior are said to form resting-state networks (RSNs). RSNs have been investigated using seed-dependent functional connectivity maps and by using a number of model-free methods. However, examining RSNs across a group of subjects is still a complex task and often involves human input in selecting meaningful networks. METHODOLOGY/PRINCIPAL FINDINGS: We report on a voxel-based, model-free normalized cut graph clustering approach with whole-brain coverage for group analysis of resting-state data, in which the number of RSNs is computed as an optimal clustering fit of the data. Inter-voxel correlations of time-series are grouped at the individual level, and the consistency of the resulting networks across subjects is clustered at the group level, defining the group RSNs. We scanned a group of 26 subjects at rest with a fast BOLD-sensitive fMRI scanning protocol on a 3 Tesla MR scanner. CONCLUSIONS/SIGNIFICANCE: An optimal group clustering fit revealed 7 RSNs. The 7 RSNs included motor/visual, auditory and attention networks and the frequently reported default mode network. The identified RSNs showed large overlap with recently reported resting-state results and support the idea of the formation of spatially distinct RSNs during rest in the human brain.

  7. Statokinesigram normalization method.

    Science.gov (United States)

    de Oliveira, José Magalhães

    2017-02-01

    Stabilometry is a technique that aims to study the body sway of human subjects, employing a force platform. The signal obtained from this technique refers to the position of the foot base ground-reaction vector, known as the center of pressure (CoP). The parameters calculated from the signal are used to quantify the displacement of the CoP over time; there is a large variability, both between and within subjects, which prevents the definition of normative values. The intersubject variability is related to differences between subjects in terms of their anthropometry, in conjunction with their muscle activation patterns (biomechanics); and the intrasubject variability can be caused by a learning effect or fatigue. Age and foot placement on the platform are also known to influence variability. Normalization is the main method used to decrease this variability and to bring distributions of adjusted values into alignment. In 1996, O'Malley proposed three normalization techniques to eliminate the effect of age and anthropometric factors from temporal-distance parameters of gait. These techniques were adopted to normalize the stabilometric signal by some authors. This paper proposes a new method of normalization of stabilometric signals to be applied in balance studies. The method was applied to a data set collected in a previous study, and the results of normalized and nonnormalized signals were compared. The results showed that the new method, if used in a well-designed experiment, can eliminate undesirable correlations between the analyzed parameters and the subjects' characteristics and show only the experimental conditions' effects.
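
    As an illustration of what such a normalization can look like in practice, the following Python sketch applies a generic detrending scheme (not necessarily the formula proposed in the paper, and with simulated data): each stabilometric parameter is divided by the value predicted from a regression on a subject characteristic, so the normalized parameter is largely decorrelated from that characteristic.

        import numpy as np

        rng = np.random.default_rng(1)

        # Simulated data: CoP sway area (cm^2) that partly depends on subject height (m).
        height = rng.uniform(1.55, 1.95, size=40)
        sway_area = 2.0 + 3.0 * height + rng.normal(0.0, 0.4, size=40)

        # Fit the dependence on the covariate and divide it out, so the normalized
        # parameter becomes dimensionless and nearly uncorrelated with height.
        slope, intercept = np.polyfit(height, sway_area, 1)
        predicted = slope * height + intercept
        normalized = sway_area / predicted

        print("corr before:", round(np.corrcoef(height, sway_area)[0, 1], 2))
        print("corr after: ", round(np.corrcoef(height, normalized)[0, 1], 2))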

  8. On the Methods for Calculating Annual Allowable Cut

    Directory of Open Access Journals (Sweden)

    V. А. Sokolov

    2014-10-01

    Full Text Available The crisis in supplying regions and the country with available forest resources, and the low profitability of the forest sector as a whole, are indicators of the failure of the existing model of forest management and forest use organization in Russia at the present time. Many Russian regions, which are traditionally considered forest industrial territories, face the challenge of a lack of economically accessible forests. The forests are decreasing against a background of under-exploitation of the annual allowable cut. This situation occurs in Siberia as well. In many cases, using the calculated allowable cut will result in unsustainable harvest levels and a future decrease of accessible forest resources. Thus, the statement that «the volume of wood resource utilization is determined by the allowable cut, which represents the scientifically grounded norm of sustainable forest use» is no more than a declarative proposition. Modeling the normal forest, and using a formula of allowable cut calculation estimated for some decades based on the modeling, is totally unreliable and unrealistic. A long-term forecast should use analog methods, but it will hardly be sufficiently accurate and adequate to set norms. In order to estimate the ecological and economic accessibility of forest resources, an algorithm was devised, and a method and model were developed. This model is based on a GIS database and makes it possible to estimate the accessibility of forest resources and to map it as well. Following the procedures for calculating the annual allowable cut, the conclusion was drawn that the annual allowable cut should be determined in two varieties. The first variety is silvicultural (according to the currently used methods), and the other is the economically accessible allowable cut, which could provide economically effective use of tradable mature wood, taking into account the ecological and economic accessibility of forest resources.

  9. Normal Limits of Electrocardiogram and Cut-Off Values for Left ...

    African Journals Online (AJOL)

    olayemitoyin

    Summary: This study assessed healthy young adults to determine the normal limits for electrocardiographic ... The normal limits for heart rate, P wave duration, amplitude and axis in lead II ... Gender difference exists in some cut-off values for.

  10. Replacing spectral techniques for expander ratio, normalized cut and conductance by combinatorial flow algorithms

    CERN Document Server

    Hochbaum, Dorit S

    2010-01-01

    Several challenging problems in clustering, partitioning and imaging have traditionally been solved using the "spectral technique". These problems include the normalized cut problem, the graph expander ratio problem, the Cheeger constant problem and the conductance problem. These problems share several common features: all seek a bipartition of a set of elements; the problems are formulated as a form of ratio cut; the formulation as discrete optimization is shown here to be equivalent to a quadratic ratio, sometimes referred to as the Rayleigh ratio, on discrete variables with a single sum constraint which we call the balance or orthogonality constraint; when the discrete nature of the variables is disregarded, the continuous relaxation is solved by the spectral method. Indeed, the spectral relaxation technique is a dominant method providing an approximate solution to these problems. We propose an algorithm for these problems which involves a relaxation of the orthogonality constraint only. This relaxation is sho...
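
    For reference, the ratio-cut formulation alluded to in this record is, in the standard Shi-Malik notation (a summary of well-known definitions, not text from the record),

        \mathrm{Ncut}(A,B) = \frac{\mathrm{cut}(A,B)}{\mathrm{assoc}(A,V)} + \frac{\mathrm{cut}(A,B)}{\mathrm{assoc}(B,V)},
        \qquad
        \min_{y}\ \frac{y^{T}(D-W)\,y}{y^{T}D\,y}
        \ \ \text{subject to}\ \ y^{T}D\mathbf{1} = 0,

    where $\mathrm{cut}(A,B)=\sum_{i\in A,\,j\in B} w_{ij}$, $\mathrm{assoc}(A,V)=\sum_{i\in A,\,j\in V} w_{ij}$, $W=(w_{ij})$ is the affinity matrix and $D$ its diagonal degree matrix. Dropping the discreteness of $y$ gives the spectral relaxation, whereas the work above relaxes only the balance (orthogonality) constraint $y^{T}D\mathbf{1}=0$.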

  11. Selection of Near Optimal Laser Cutting Parameters in CO2 Laser Cutting by the Taguchi Method

    Directory of Open Access Journals (Sweden)

    Miloš MADIĆ

    2013-12-01

    Full Text Available Identification of laser cutting conditions that are insensitive to parameter variations and noise is of great importance. This paper demonstrates the application of the Taguchi method for optimization of surface roughness in CO2 laser cutting of stainless steel. The laser cutting experiment was planned and conducted according to Taguchi's experimental design using the L27 orthogonal array. Four laser cutting parameters, namely laser power, cutting speed, assist gas pressure, and focus position, were considered in the experiment. Using the analysis of means and analysis of variance, the significant laser cutting parameters were identified, and subsequently the optimal combination of laser cutting parameter levels was determined. The results showed that the cutting speed is the most significant parameter affecting the surface roughness, whereas the influence of the assist gas pressure can be neglected. It was observed, however, that interaction effects have a predominant influence over the main effects on the surface roughness.
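
    A minimal Python sketch of the kind of Taguchi analysis described above (illustrative only: an L9 array is shown instead of the paper's L27, and the coded levels and roughness values are made up): the smaller-the-better signal-to-noise ratio S/N = -10*log10(mean(y^2)) is computed per run, and main effects are read off the mean S/N at each factor level.

        import numpy as np

        # Standard L9 orthogonal array (levels coded 0-2) for four factors:
        # laser power, cutting speed, assist gas pressure, focus position.
        runs = np.array([
            [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
            [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
            [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
        ])
        roughness = np.array([1.9, 2.4, 3.1, 2.2, 2.8, 3.5, 2.0, 2.6, 3.3])  # hypothetical Ra values

        # Smaller-the-better signal-to-noise ratio for each run.
        sn = -10.0 * np.log10(roughness ** 2)

        # Main effect of each factor: mean S/N at each of its three levels.
        for j, name in enumerate(["power", "speed", "gas pressure", "focus"]):
            level_means = [sn[runs[:, j] == lvl].mean() for lvl in range(3)]
            best = int(np.argmax(level_means))  # highest S/N corresponds to lowest roughness
            print(f"{name}: level means {np.round(level_means, 2)}, best level {best}")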

  12. Cutting tool form compensation system and method

    Science.gov (United States)

    Barkman, W.E.; Babelay, E.F. Jr.; Klages, E.J.

    1993-10-19

    A compensation system for a computer-controlled machining apparatus having a controller and including a cutting tool and a workpiece holder which are movable relative to one another along a preprogrammed path during a machining operation utilizes a camera and a vision computer for gathering information at a preselected stage of a machining operation relating to the actual shape and size of the cutting edge of the cutting tool and for altering the preprogrammed path in accordance with detected variations between the actual size and shape of the cutting edge and an assumed size and shape of the cutting edge. The camera obtains an image of the cutting tool against a background so that the cutting tool and background possess contrasting light intensities, and the vision computer utilizes the contrasting light intensities of the image to locate points therein which correspond to points along the actual cutting edge. Following a series of computations involving the determining of a tool center from the points identified along the tool edge, the results of the computations are fed to the controller where the preprogrammed path is altered as aforedescribed. 9 figures.

  13. Cutting tool form compensation system and method

    Science.gov (United States)

    Barkman, William E.; Babelay, Jr., Edwin F.; Klages, Edward J.

    1993-01-01

    A compensation system for a computer-controlled machining apparatus having a controller and including a cutting tool and a workpiece holder which are movable relative to one another along a preprogrammed path during a machining operation utilizes a camera and a vision computer for gathering information at a preselected stage of a machining operation relating to the actual shape and size of the cutting edge of the cutting tool and for altering the preprogrammed path in accordance with detected variations between the actual size and shape of the cutting edge and an assumed size and shape of the cutting edge. The camera obtains an image of the cutting tool against a background so that the cutting tool and background possess contrasting light intensities, and the vision computer utilizes the contrasting light intensities of the image to locate points therein which correspond to points along the actual cutting edge. Following a series of computations involving the determining of a tool center from the points identified along the tool edge, the results of the computations are fed to the controller where the preprogrammed path is altered as aforedescribed.

  14. GPU-based normalized cuts for road extraction using satellite imagery

    Indian Academy of Sciences (India)

    J Senthilnath; S Sindhu; S N Omkar

    2014-12-01

    This paper presents a GPU implementation of normalized cuts for the road extraction problem using panchromatic satellite imagery. The roads are extracted in three stages, namely pre-processing, image segmentation and post-processing. Initially, the image is pre-processed to improve tolerance by reducing the clutter (which mostly represents buildings, vegetation and fallow regions). The road regions are then extracted using the normalized cuts algorithm, a graph-based partitioning approach whose focus lies in extracting the global impression (perceptual grouping) of an image rather than local features. For the segmented image, post-processing is carried out using the morphological operations of erosion and dilation. Finally, the extracted road image is overlaid on the original image. Here, a GPGPU (General Purpose Graphical Processing Unit) approach has been adopted to implement the algorithm on the GPU for fast processing. A performance comparison of the proposed GPU implementation of normalized cuts with the earlier CPU implementation is presented. From the results, we conclude that the computational advantage of the proposed GPU implementation grows with the image size. A qualitative and quantitative assessment of the segmentation results is also presented.

  15. Water jet: a promising method for cutting optical glass

    Science.gov (United States)

    Salinas-Luna, Javier; Machorro, Roberto; Camacho, Javier; Luna, Esteban; Nunez, Juan

    2006-05-01

    We present an alternative method for cutting optical glass. It works with a high-pressure fluid carrying abrasive powder. This technique offers some advantages over conventional methods that use diamond-abrasive-covered wires or disks. We make a critical comparison between the two techniques, characterizing cuts with interferometric, polarimetric, and Ronchi testing. The main feature of the water-jet technique is that it allows surfaces of any shape, even those already polished, to be cut safely.

  16. The $\mathcal{Q}$-cut Representation of One-loop Integrands and Unitarity Cut Method

    CERN Document Server

    Huang, Rijun; Rao, Junjie; Zhou, Kang; Feng, Bo

    2015-01-01

    Recently, a new construction for complete loop integrands of massless field theories has been proposed, with on-shell tree-level amplitudes delicately incorporated into its algorithm. This new approach reinterprets integrands in a novel form, namely the $\mathcal{Q}$-cut representation. In this paper, by deriving one-loop integrands as examples, we elaborate in detail the technique of this new representation, e.g., the summation over all possible $\mathcal{Q}$-cuts as well as helicity states for the non-scalar internal particle in the loop. Moreover, we show that the integrand in the $\mathcal{Q}$-cut representation naturally reduces to the integrand in the traditional unitarity cut method for each given cut channel, providing a cross-check for the new approach.

  17. A new approach of high speed cutting modelling: SPH method

    OpenAIRE

    LIMIDO, Jérôme; Espinosa, Christine; Salaün, Michel; Lacome, Jean-Luc

    2006-01-01

    The purpose of this study is to introduce a new approach to high speed cutting numerical modelling. A Lagrangian Smoothed Particle Hydrodynamics (SPH) based model is carried out using the Ls-Dyna software. SPH is a meshless method, thus large material distortions that occur in the cutting problem are easily managed, and SPH contact control permits a “natural” workpiece/chip separation. Estimated chip morphology and cutting forces are compared to machining dedicated code results and experimenta...

  18. SPH method applied to high speed cutting modelling

    OpenAIRE

    LIMIDO, Jérôme; Espinosa, Christine; Salaün, Michel; Lacome, Jean-Luc

    2007-01-01

    The purpose of this study is to introduce a new approach to high speed cutting numerical modelling. A Lagrangian smoothed particle hydrodynamics (SPH)-based model is carried out using the Ls-Dyna software. SPH is a meshless method, thus large material distortions that occur in the cutting problem are easily managed, and SPH contact control permits a "natural" workpiece/chip separation. The developed approach is compared to machining dedicated code results and experimental data. The SPH cutting...

  19. The contour method cutting assumption: error minimization and correction

    Energy Technology Data Exchange (ETDEWEB)

    Prime, Michael B [Los Alamos National Laboratory]; Kastengren, Alan L [ANL]

    2010-01-01

    The recently developed contour method can measure a 2-D, cross-sectional residual-stress map. A part is cut in two using a precise and low-stress cutting technique such as electric discharge machining. The contours of the new surfaces created by the cut, which will not be flat if residual stresses are relaxed by the cutting, are then measured and used to calculate the original residual stresses. The precise nature of the assumption about the cut is presented theoretically and is evaluated experimentally. Simply assuming a flat cut is overly restrictive and misleading. The critical assumption is that the width of the cut, when measured in the original, undeformed configuration of the body, is constant. Stresses at the cut tip during cutting cause the material to deform, which causes errors. The effect of such cutting errors on the measured stresses is presented and the important parameters are quantified. Experimental procedures for minimizing these errors are presented, along with an iterative finite element procedure to correct for the errors. The correction procedure is demonstrated on experimental data from a steel beam that was plastically bent to introduce a known profile of residual stresses.

  20. Methods for Optimisation of the Laser Cutting Process

    DEFF Research Database (Denmark)

    Dragsted, Birgitte

    This thesis deals with the adaptation and implementation of various optimisation methods, in the field of experimental design, for the laser cutting process. The problem of optimising the laser cutting process has been defined, and a structure for a Decision Support System (DSS) for the optimisation of the laser cutting process has been suggested. The DSS consists of a database with the currently used and old parameter settings. One of the optimisation methods has also been implemented in the DSS in order to facilitate the optimisation procedure for the laser operator. The Simplex Method has been adapted in two versions: a qualitative one, which optimises the process by comparing the laser-cut items, and a quantitative one, which uses a weighted quality response in order to achieve a satisfactory quality and thereafter maximises the cutting speed, thus increasing the productivity of the process...

  1. Cutting

    Science.gov (United States)

    ... a traumatic experience, such as living through abuse, violence, or a disaster. Self-injury may feel like ... embarrassed." Sometimes self-injury affects a person's body image. Jen says, "I actually liked how the cuts ...

  2. INFLUENCE OF CUTTING ZONE COOLING METHOD ON CHIP FORMING CONDITIONS

    Directory of Open Access Journals (Sweden)

    E. E. Feldshtein

    2014-01-01

    Full Text Available The paper considers the influence of the cutting zone cooling method on the chip shape and the chip thickening ratio while turning R35 steel with a hardness of HB 1250 MPa. Cutting with various types of cooling (dry, compressed air and emulsion mist) has been investigated. OPORTET RG-2 emulsol with an emulsion concentration of 4% has been used as the active substance. The cutting tool is a turning cutter with an indexable square insert SNUN120408 made of P25 hard alloy with a multilayer wear-resistant coating whose upper layer is titanium nitride; the rake face of the insert is flat. The ranges investigated are cutting speeds of 80-450 m/min, feeds of 0.1-0.5 mm/rev, emulsion flows of 1.5-3.5 g/min, compressed air flows of 4.5-7.0 m3/h, and a cutting depth of 1.0 mm. In order to reduce the number of individual tests, plans based on LPτ sequences can be used. It has been shown that the cutting zone cooling method exerts a significant influence on the conditions of chip formation. A regression equation describing the influence of the machining conditions on the chip thickening ratio Ka has been obtained. The range of favourable cutting modes is extended when emulsion mist is used for cooling; under these modes the chip is formed as short spiral fragments or elements. A favourable chip form is ensured at an emulsion flow rate of not more than 2 g/min. The investigations have made it possible to determine the cooling emulsion mist conditions under which minimum values of the chip thickening ratio, and a chip shape that ensures easy removal from the cutting zone, are observed. In dry turning, the value of Ka is at least 15% higher than with the other cutting zone cooling methods.

  3. Examining Classification Criteria: A Comparison of Three Cut Score Methods

    Science.gov (United States)

    DiStefano, Christine; Morgan, Grant

    2011-01-01

    This study compared three different methods of creating cut scores for a screening instrument (T scores, receiver operating characteristic curve (ROC) analysis, and the Rasch rating scale method (RSM)) for use with the Behavioral and Emotional Screening System (BESS) Teacher Rating Scale for Children and Adolescents (Kamphaus & Reynolds, 2007).…

  4. Vision-based method for tracking meat cuts in slaughterhouses

    DEFF Research Database (Denmark)

    Larsen, Anders Boesen Lindbo; Hviid, Marchen Sonja; Engbo Jørgensen, Mikkel

    2014-01-01

    Meat traceability is important for linking process and quality parameters from the individual meat cuts back to the production data from the farmer that produced the animal. Current tracking systems rely on physical tagging, which is too intrusive for individual meat cuts in a slaughterhouse environment. In this article, we demonstrate a computer vision system for recognizing meat cuts at different points along a slaughterhouse production line. More specifically, we show that 211 pig loins can be identified correctly between two photo sessions. The pig loins undergo various perturbation scenarios (hanging, rough treatment and incorrect trimming) and our method is able to handle these perturbations gracefully. This study shows that the suggested vision-based approach to tracking is a promising alternative to the more intrusive methods currently available.

  5. Method and apparatus for jet-assisted drilling or cutting

    Science.gov (United States)

    Summers, David Archibold; Woelk, Klaus Hubert; Oglesby, Kenneth Doyle; Galecki, Grzegorz

    2013-07-02

    An abrasive cutting or drilling system, apparatus and method, which includes an upstream supercritical fluid and/or liquid carrier fluid, abrasive particles, a nozzle and a gaseous or low-density supercritical fluid exhaust abrasive stream. The nozzle includes a throat section and, optionally, a converging inlet section, a divergent discharge section, and a feed section.

  6. Two-Dimensional Rectangular Stock Cutting Problem and Solution Methods

    Institute of Scientific and Technical Information of China (English)

    Zhao Hui; Yu Liang; Ning Tao; Xi Ping

    2001-01-01

    Optimal layout of rectangular stock cutting is still in great demand from industry for diversified applications. This paper introduces four basic solution methods to the problem: linear programming, dynamic programming, tree search and heuristic approach. A prototype of application software is developed to verify the pros and cons of various approaches.

  7. High-throughput biomarker segmentation on ovarian cancer tissue microarrays via hierarchical normalized cuts.

    Science.gov (United States)

    Janowczyk, Andrew; Chandran, Sharat; Singh, Rajendra; Sasaroli, Dimitra; Coukos, George; Feldman, Michael D; Madabhushi, Anant

    2012-05-01

    We present a system for accurately quantifying the presence and extent of stain on account of a vascular biomarker on tissue microarrays. We demonstrate our flexible, robust, accurate, and high-throughput minimally supervised segmentation algorithm, termed hierarchical normalized cuts (HNCut), for the specific problem of quantifying the extent of vascular staining on ovarian cancer tissue microarrays. The high-throughput aspect of HNCut is driven by the use of a hierarchically represented data structure that allows us to merge two powerful image segmentation algorithms: a frequency-weighted mean shift and the normalized cuts algorithm. HNCut rapidly traverses a hierarchical pyramid, generated from the input image at various color resolutions, enabling the rapid analysis of large images (e.g., a 1500 × 1500 image in under 6 s on a standard 2.8-GHz desktop PC). HNCut is easily generalizable to other problem domains and only requires specification of a few representative pixels (a swatch) from the object of interest in order to segment the target class. Across ten runs, the HNCut algorithm was found to have average true positive, false positive, and false negative rates (on a per-pixel basis) of 82%, 34%, and 18%, in terms of overlap, when evaluated with respect to a pathologist-annotated ground truth of the target region of interest. By comparison, a popular supervised classifier (probabilistic boosting trees) was only able to marginally improve on the true positive and false negative rates (84% and 14%) at the expense of a higher false positive rate (73%), with an additional computation time of 62% compared to HNCut. We also compared our scheme against a k-means clustering approach, which both the HNCut and PBT schemes were able to outperform. Our success in accurately quantifying the extent of vascular stain on ovarian cancer TMAs suggests that HNCut could be a very powerful tool in digital pathology and bioinformatics applications where it could be used to

  8. The method of minimal normal forms

    Energy Technology Data Exchange (ETDEWEB)

    Mane, S.R.; Weng, W.T.

    1992-12-31

    Normal form methods for solving nonlinear differential equations are reviewed and the comparative merits of three methods are evaluated. The concept of the minimal normal form is explained and is shown to be superior to other choices. The method is then extended to apply to the evaluation of discrete maps of an accelerator or storage ring. Such an extension, as suggested in this paper, is more suited for accelerator-based applications than a formulation utilizing continuous differential equations. A computer code has been generated to systematically implement various normal form formulations for maps in two-dimensional phase space. Specific examples of quadratic and cubic nonlinear fields were used and solved by the method developed. The minimal normal form method shown here gives good results using relatively low order expansions.

  9. The method of minimal normal forms

    Energy Technology Data Exchange (ETDEWEB)

    Mane, S.R.; Weng, W.T.

    1992-01-01

    Normal form methods for solving nonlinear differential equations are reviewed and the comparative merits of three methods are evaluated. The concept of the minimal normal form is explained and is shown to be superior to other choices. The method is then extended to apply to the evaluation of discrete maps of an accelerator or storage ring. Such an extension, as suggested in this paper, is more suited for accelerator-based applications than a formulation utilizing continuous differential equations. A computer code has been generated to systematically implement various normal form formulations for maps in two-dimensional phase space. Specific examples of quadratic and cubic nonlinear fields were used and solved by the method developed. The minimal normal form method shown here gives good results using relatively low order expansions.

  10. Evaluation of three working methods for cutting windshields seams in car repair

    NARCIS (Netherlands)

    Grinten, M.P. van der; Bronkhorst, R.E.

    2006-01-01

    A company for car glass repair requested a physical load evaluation study of three cutting methods for windshield removal. The aim was to obtain indications of the cutting forces in relation to the observed postures. A cutting knife, a traditional cutting wire and a newly developed wire winder were tested.

  11. The analysis and selection of methods and facilities for cutting of naturally-deficit materials

    Science.gov (United States)

    Akhmetov, I. D.; Zakirova, A. R.; Sadykov, Z. B.

    2016-06-01

    A comparison of prospective methods, such as laser, plasma and combined electro-diamond cutting of hard-to-machine materials, is given in the article, together with a review and analysis of facilities for cutting naturally-deficit materials. A new electrode-tool for the combined cutting of naturally-deficit materials is suggested. This electrode-tool eliminates electrical contact between the cutting electrode-tool and the side surfaces of the kerf in the workpiece being cut, which makes it possible to obtain coplanar cut channels.

  12. ANN modeling of kerf taper in CO2 laser cutting and optimization of cutting parameters using the Monte Carlo method

    Directory of Open Access Journals (Sweden)

    Miloš Madić

    2015-01-01

    Full Text Available In this paper, an attempt has been made to develop a mathematical model in order to study the relationship between laser cutting parameters, such as laser power, cutting speed, assist gas pressure and focus position, and the kerf taper angle obtained in CO2 laser cutting of AISI 304 stainless steel. To this aim, a single hidden layer artificial neural network (ANN) trained with the gradient descent with momentum algorithm was used. To obtain an experimental database for the ANN training, the laser cutting experiment was planned as per Taguchi's L27 orthogonal array with three levels for each of the cutting parameters. Statistically assessed as adequate, the ANN model was then used to investigate the effect of the laser cutting parameters on the kerf taper angle by generating 2D and 3D plots. It was observed that the kerf taper angle was highly sensitive to the selected laser cutting parameters, as well as their interactions. In addition to modeling, by applying the Monte Carlo method to the developed kerf taper angle ANN model, the near-optimal laser cutting parameter settings, which minimize the kerf taper angle, were determined.
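
    The Monte Carlo step described above can be sketched as follows in Python: sample the cutting parameters uniformly within their experimental ranges, evaluate the trained surrogate for each sample, and keep the settings with the smallest predicted kerf taper angle. Here predict_kerf_taper is only a stand-in for the paper's trained ANN, and the parameter ranges are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        def predict_kerf_taper(power, speed, pressure, focus):
            """Stand-in for the trained ANN surrogate (illustrative formula only)."""
            return (0.002 * power - 0.01 * speed + 0.05 * pressure
                    + 0.08 * focus ** 2 + 0.5)

        # Illustrative ranges: power [W], speed [m/min], pressure [bar], focus [mm].
        lo = np.array([1600.0, 2.0, 9.0, -2.5])
        hi = np.array([2000.0, 3.0, 12.0, 0.0])

        samples = rng.uniform(lo, hi, size=(100_000, 4))   # uniform Monte Carlo sampling
        taper = predict_kerf_taper(*samples.T)

        best = samples[np.argmin(taper)]
        print("near-optimal settings:", np.round(best, 3),
              "predicted kerf taper:", round(float(taper.min()), 4))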

  13. Color image segmentation based on normalized cut and fish swarm optimization algorithm

    Institute of Scientific and Technical Information of China (English)

    ZHOU Xun; GUO Min; MA Miao

    2013-01-01

    Traditional spectral clustering algorithms for minimizing the normalized cut criterion give inaccurate results and have high algorithmic complexity in color image segmentation. In order to overcome these disadvantages, this paper proposes a color image segmentation method based on normalized cut and a fish swarm optimization algorithm. The color image is first pre-processed with fuzzy C-means clustering; a fish swarm optimization algorithm is then employed instead of spectral clustering to minimize the normalized cut, and the segmentation result is finally obtained from the optimal individual fish. Experimental results show that the method consumes less time and achieves a precise segmentation result.

  14. A Proposed Arabic Handwritten Text Normalization Method

    Directory of Open Access Journals (Sweden)

    Tarik Abu-Ain

    2014-11-01

    Full Text Available Text normalization is an important technique in document image analysis and recognition. It consists of many preprocessing stages, which include slope correction, text padding, skew correction, and straightening of the writing line. In this regard, text normalization plays an important role in many procedures such as text segmentation, feature extraction and character recognition. In the present article, a new method for text baseline detection, straightening, and slant correction for Arabic handwritten texts is proposed. The method comprises a set of sequential steps: first, component segmentation is done, followed by component thinning; then, the direction features of the skeletons are extracted, and the candidate baseline regions are determined. After that, selection of the correct baseline region is done, and finally, the baselines of all components are aligned with the writing line. The experiments are conducted on the IFN/ENIT benchmark Arabic dataset. The results show that the proposed method has a promising and encouraging performance.

  15. Clinical Comparison on Perineum Wound of Side-cutting and Cross-cutting in Normal Labor

    Institute of Scientific and Technical Information of China (English)

    LI Yanchao; WU Xiaowei

    2016-01-01

    Objective: To analyze the effects of, and differences between, side-cutting and cross-cutting episiotomy in normal labor. Methods: The data of 104 women who underwent vaginal delivery with episiotomy, hospitalized in our hospital from February 2014 to February 2015, were analyzed. The recovery of the perineal incision and maternal satisfaction were compared between the two methods. Results: The adverse effects of side-cutting on the mother were lower than those of cross-cutting, and the degree of wound recovery and maternal pain were less than with cross-cutting. However, the postoperative pain, wound recovery and probability of wound rupture with cross-cutting were superior to those with side-cutting. Conclusions: Choosing the incision method best suited to the individual woman, according to her situation, is most important. The correct episiotomy method can guarantee the safety of the mother, prevent increased pain, and improve maternal satisfaction, which is of great significance for clinical research.

  16. DEVELOPMENT OF IMAGE SELECTION METHOD USING GRAPH CUTS

    Directory of Open Access Journals (Sweden)

    T. Fuse

    2016-06-01

    Full Text Available 3D models have come into wide use with the spread of freely available software. Additionally, enormous numbers of images can be easily acquired and are now commonly utilized for creating 3D models. The creation of 3D models from a huge number of images, however, takes a lot of time and effort, so efficiency in 3D measurement is required; in an efficient strategy, the accuracy of the measurement is also required. This paper develops an image selection method based on network design, in the sense of surveying network construction. The proposed method uses an image connectivity graph consisting of nodes and edges: the nodes correspond to the images to be used, and the edges between nodes represent image relationships, with costs given by the accuracies of the orientation elements. For efficiency, the image connectivity graph should be constructed with a small number of edges. Once the image connectivity graph is built, the image selection problem is regarded as a combinatorial optimization problem and the graph cuts technique can be applied. In the process of 3D reconstruction, low-quality images and similar images are also extracted and removed. Through the experiments, the significance of the proposed method is confirmed, implying its potential for efficient and accurate 3D measurement.

  17. Development of Image Selection Method Using Graph Cuts

    Science.gov (United States)

    Fuse, T.; Harada, R.

    2016-06-01

    3D models have come into wide use with the spread of freely available software. Additionally, enormous numbers of images can be easily acquired and are now commonly utilized for creating 3D models. The creation of 3D models from a huge number of images, however, takes a lot of time and effort, so efficiency in 3D measurement is required; in an efficient strategy, the accuracy of the measurement is also required. This paper develops an image selection method based on network design, in the sense of surveying network construction. The proposed method uses an image connectivity graph consisting of nodes and edges: the nodes correspond to the images to be used, and the edges between nodes represent image relationships, with costs given by the accuracies of the orientation elements. For efficiency, the image connectivity graph should be constructed with a small number of edges. Once the image connectivity graph is built, the image selection problem is regarded as a combinatorial optimization problem and the graph cuts technique can be applied. In the process of 3D reconstruction, low-quality images and similar images are also extracted and removed. Through the experiments, the significance of the proposed method is confirmed, implying its potential for efficient and accurate 3D measurement.
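
    One concrete way to apply a graph-cut primitive to such an image connectivity graph is shown in the Python sketch below (an illustration only, not the authors' exact formulation): images are nodes, edge weights encode how reliably a pair can be relatively oriented, and a global minimum cut (Stoer-Wagner) reveals where the network is most weakly tied together. The example graph and its weights are assumptions for demonstration.

        import networkx as nx

        # Small illustrative image connectivity graph: nodes are images, edge weights
        # encode the strength/accuracy of the pairwise orientation (higher = stronger tie).
        G = nx.Graph()
        G.add_weighted_edges_from([
            ("img0", "img1", 3.0), ("img1", "img2", 0.4), ("img0", "img2", 0.5),
            ("img2", "img3", 2.5), ("img1", "img3", 0.6),
        ])

        # Global minimum cut: the weakest separation of the image network,
        # i.e. where additional or better-connected images would help most.
        cut_value, (part_a, part_b) = nx.stoer_wagner(G)
        print(cut_value, sorted(part_a), sorted(part_b))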

  18. Selection Of Cutting Inserts For Aluminum Alloys Machining By Using MCDM Method

    Science.gov (United States)

    Madić, Miloš; Radovanović, Miroslav; Petković, Dušan; Nedić, Bogdan

    2015-07-01

    Machining of aluminum and its alloys requires the use of cutting tools with special geometry and material. Since there exist a number of cutting tools for aluminum machining, each with unique characteristics, selection of the most appropriate cutting tool for a given application is a very complex task, which can be viewed as a multi-criteria decision making (MCDM) problem. This paper is focused on a multi-criteria analysis of VCGT cutting inserts for the turning of aluminum alloys by applying a recently developed MCDM method, the weighted aggregated sum product assessment (WASPAS) method. The MCDM model was defined using the available catalogue data from cutting tool manufacturers.

  19. Cutting heat dissipation in high-speed machining of carbon steel based on the calorimetric method

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The cutting heat dissipation in chips, workpiece, tool and surroundings during the high-speed machining of carbon steel is quantitatively investigated based on the calorimetric method. Water is used as the medium to absorb the cutting heat; a self-designed container suitable for the high-speed lathe is used to collect the chips, and two other containers are adopted to absorb the cutting heat dissipated in the workpiece and tool, respectively. The temperature variations of the water, chips, workpiece, tool and surroundings during the closed high-speed machining are then measured. Thus, the cutting heat dissipated in each component of the cutting system, the total cutting heat and the heat flux are calculated. Moreover, the power resulting from the main cutting force is obtained according to the measured cutting force and predetermined cutting speed. The accuracy of cutting heat measurement by the calorimetric method is finally evaluated by comparing the total cutting heat flux with the power resulting from the main cutting force.
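
    The heat balance described above can be sketched in Python as follows (a minimal illustration with made-up numbers, not the paper's measurements): each share of the cutting heat follows Q = m*c*dT for the water that absorbed it, and the total is compared against the mechanical energy of the main cutting force, E = Fc*vc*t.

        # Calorimetric bookkeeping sketch; all numerical values are illustrative.
        C_WATER = 4186.0  # specific heat of water, J/(kg*K)

        def absorbed_heat(mass_kg, delta_t_k, c=C_WATER):
            """Heat absorbed by a water calorimeter: Q = m * c * dT."""
            return mass_kg * c * delta_t_k

        # Heat collected in the chip, workpiece and tool calorimeters (mass, temperature rise).
        q_chips = absorbed_heat(2.0, 3.1)
        q_workpiece = absorbed_heat(1.5, 0.9)
        q_tool = absorbed_heat(0.8, 0.6)
        q_surroundings = 400.0                      # estimated loss to the surroundings, J
        q_total = q_chips + q_workpiece + q_tool + q_surroundings

        # Mechanical energy from the main cutting force over the cutting time: E = Fc * vc * t.
        f_c = 900.0          # main cutting force, N
        v_c = 300.0 / 60.0   # cutting speed, m/s (300 m/min)
        t_cut = 8.0          # cutting time, s
        e_mech = f_c * v_c * t_cut

        print(f"heat shares (chips/workpiece/tool): {q_chips:.0f} / {q_workpiece:.0f} / {q_tool:.0f} J")
        print(f"total heat {q_total:.0f} J vs mechanical energy {e_mech:.0f} J "
              f"({100.0 * q_total / e_mech:.1f}% recovered)")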

  20. The New Normal: Senior Student Affairs Officers Speak out about Budget Cutting

    Science.gov (United States)

    Romano, C. Renee; Hanish, Jan; Phillips, Calvin; Waggoner, Michael D.

    2010-01-01

    To understand the experiences of leaders in student affairs in higher education and to document the strategies they used to cut budgets and the results of these actions, the authors conducted a qualitative research study using public institutions as case studies. Data were gathered in 2005 through phone interviews with senior student affairs…

  2. THE EFFECTS OF CUTTING METHODS ON SURFACE ROUGHNESS OF ALUMINUM POROUS MATERIAL PRODUCED VIA VACUUM METHOD

    Directory of Open Access Journals (Sweden)

    Lütfiye DAHIL

    2015-04-01

    Full Text Available In this study, the surface roughness values of three aluminum porous materials, which were produced via the vacuum method and have different porous structures, were assessed comparatively as a function of the cutting method applied after processing. Three different cutting methods were applied to each of the samples: water jet, wire erosion, and band saw. With the speed set to 20 m/min, the methods were compared under the same conditions. The roughness measurement was carried out by taking the mean of three measurements parallel to the surface and three measurements perpendicular to the surface. By comparing the obtained results, it was determined that the most advantageous method is wire erosion.

  3. A Surface Damage Investigation on Uniaxial Tensile Test Specimens Prepared by Common Cutting Methods

    Science.gov (United States)

    1981-02-01

    REPORI’ A SURFACE DAMAGE INVESTIGATION ON UNIAXIAL TENSILE TEST SPECIMENSPREPARED BY COMMON CUTTING METHODS JUN 2 1931 THOMAS J. C. CHEW DALE A...Sýrfa-i-mage Investigation on Uniaxial Tensile Test Specimens Prepared by Common Cutting Method I Spi-.i t’ ,, ., • T7. AUTHORý#) . _" ’ /t’ .• r...Saw 10 2.1.3 Cutting by Milling Machine 11 2.1.4 Cutting by Die Cutter 11 2.2 Uniaxial Tensile Test 12 2.3 Electron Microscope Surface Examination 13 3

  4. Experimental and Statistical Evaluation of Cutting Methods in Relation to Specific Energy and Rock Properties

    Science.gov (United States)

    Engin, Irfan Celal; Bayram, Fatih; Yasitli, Nazmi Erhan

    2013-07-01

    In a processing plant, natural stone can be cut by methods such as circular sawing (CS), frame sawing (FS), water jet cutting (WJC) and abrasive water jet cutting (AWJC). The efficiency of cutting systems can be compared using various parameters. In this study, the specific energy values were determined and compared to evaluate the efficiency of rock-cutting methods. Rock-cutting experiments were performed on 12 different types of rock samples using a circular sawing machine and an AWJC machine. The experimental results showed that the specific energy values in AWJC were generally higher than in CS. In addition, the relationships between specific energy values and rock properties were explained in this study. The Shore hardness and abrasion resistance were found to be strongly related to the specific energy values, and according to these parameters prediction charts of specific energy values were created.
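
    As a point of reference for the comparison above, specific energy is commonly evaluated as the net cutting power divided by the volumetric removal rate; the Python sketch below illustrates the calculation with made-up numbers (not data from the study).

        def specific_energy(power_w, depth_m, kerf_width_m, feed_rate_m_s):
            """Specific energy SE = P / Q, with removal rate Q = depth * kerf width * feed rate.

            Returns J/m^3; divide by 1e9 for J/mm^3.
            """
            removal_rate = depth_m * kerf_width_m * feed_rate_m_s  # m^3/s
            return power_w / removal_rate

        # Circular sawing example (illustrative numbers): 5 kW net cutting power,
        # 30 mm depth of cut, 5 mm kerf width, 1 m/min feed rate.
        se = specific_energy(5000.0, 0.030, 0.005, 1.0 / 60.0)
        print(f"specific energy: {se / 1e9:.2f} J/mm^3")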

  5. Effects of casting method and normalizing treatment on microstructure and cutting properties of Fe-Cu alloy

    Institute of Scientific and Technical Information of China (English)

    CHEN Jun; ZHANG Qiang

    2015-01-01

    The effects of the casting method on the hardness and microstructure of an Fe-Cu alloy were investigated by means of hardness testing, microstructure observation and energy spectrum analysis. A comparative analysis of the cutting properties of the as-cast and normalized Fe-Cu alloy was carried out, and suitable cutting parameters were discussed. The results show that the Fe-Cu alloy prepared by permanent mold casting has the highest hardness, while that cast in a refractory brick mold has the lowest hardness. The cooling rate of the casting is positively related to its hardness: the faster the cooling rate, the higher the hardness of the alloy. In the Fe-Cu alloy prepared by permanent mold casting, the reticular structure is mainly distributed at the grain boundaries, while the intermittent linear structure is mainly distributed within the grains. In the alloy prepared by resin sand casting, the amount of intermittent linear structure decreases markedly and the white structure at the grain boundaries changes from continuous to intermittent, while the thickness of the reticular walls increases. In the Fe-Cu alloy prepared by refractory brick casting, the white structure at the grain boundaries changes to a columnar or blocky structure. With the gradual decrease of the cooling rate, the Fe content in the matrix of the Fe-Cu alloy increases gradually while the Cu content decreases gradually. Under the same cutting parameters, the normalized Fe-Cu alloy requires a larger cutting force than the as-cast Fe-Cu alloy.

  6. Optimization methods of cutting depth in mining Co-rich crusts

    Institute of Scientific and Technical Information of China (English)

    QIN Xuan-yun; GUAN Ji-hong; REN Bo; BU Ying-yong

    2007-01-01

    To optimize the cutting depth of a spiral drum cutting head, the relations among the collecting ratio, the interfusing ratio of mullock (waste rock) and the cutting depth in mining oceanic cobalt-rich crusts were discussed. Furthermore, the multi-extremum problem of the cutting depth was analyzed for mining at a given interfusing ratio of mullock. By introducing a genetic algorithm (GA), the cutting depth control problem of maximizing the collecting ratio while controlling the interfusing ratio of mullock was solved with global optimization search algorithms. The optimization theory for the cutting depth in mining cobalt-rich crusts by GA was then given, and a computer program was written to realize the algorithm. Computation results on actual data prove the validity of this method.

  7. The Study of Cutting Conditions Effects on the Damping Process Using the Experimental Taguchi Method

    Directory of Open Access Journals (Sweden)

    Haikel Mejri

    2016-01-01

    Full Text Available This article focuses on determining the effects of cutting conditions and their interactions on the damping of the cutting process in the case of curvilinear milling. The tests were performed using a numerical model simulation that allows the prediction of cutting forces and damping. The effects and interactions are determined using the Taguchi experimental method. Analysis of variance (ANOVA) was performed to determine the level of importance of the machining parameters for process damping. The results revealed that the depth of cut Ap ("C") and the cutting speed Vc ("B") have the most significant influence on the Cxx and Cxy process damping. The variations of the tool diameter D ("A") and the clearance angle α also have notable effects on the process damping Cxx. The "BC" interaction has the greatest effect on the process damping Cxx, while the "AC" interaction has the greatest effect on the process damping Cxy.

  8. On the Branch and Cut Method for Multidimensional Mixed Integer Knapsack Problem

    OpenAIRE

    Mostafa Khorramizadeh; Zahra Rakhshandehroo

    2014-01-01

    In this paper, we examine the effect of the feasibility pump (FP) method on the branch and cut method for solving the multidimensional mixed integer knapsack problem. The feasibility pump is a heuristic method that tries to compute a feasible solution for mixed integer programming problems. Moreover, we consider two efficient strategies for using the feasibility pump in a branch and cut method and present some tables of numerical results concerning the application and comparison of using thes...

  9. ADAPTIVE LAYERED CARTESIAN CUT CELL METHOD FOR THE UNSTRUCTURED HEXAHEDRAL GRIDS GENERATION

    Institute of Scientific and Technical Information of China (English)

    WU Peining; TAN Jianrong; LIU Zhenyu

    2007-01-01

    An adaptive layered Cartesian cut cell method is presented to overcome the difficulty of generating unstructured hexahedral anisotropic Cartesian grids from complex CAD models. A vertex merging algorithm based on a relaxed AVL tree is investigated to construct a topological structure for stereolithography (STL) files, and a topology-based self-adaptive layered slicing algorithm with a special-feature control strategy is put forward. With the help of the convex hull, a new point-in-polygon method is employed to improve the Cartesian cut cell method. By integrating the self-adaptive layered slicing algorithm and the improved Cartesian cut cell method, the adaptive layered Cartesian cut cell method obtains the volume data of the complex CAD model from the STL file and generates the unstructured hexahedral anisotropic Cartesian grids.

  10. Cutting inoperable bodies: particularizing rural sociality to normalize hysterectomies in Balochistan, Pakistan.

    Science.gov (United States)

    Towghi, Fouzieyha

    2012-01-01

    Drawing on 15 months of ethnographic research in Balochistan, Pakistan (2005-2006), I explore Panjguri midwives' (dïnabogs, kawwās, or balloks) narrative links between routine injections of prostaglandins around childbirth and the increasing number of hysterectomies. These techno-medical interventions reflect the postcolonial biomedicalization of women's bodies and reproductive health care, and are reinforced by shifts in Pakistan's public health policy against maternal mortality in a context where about 90 percent of births occur outside of hospitals. Transnational campaigns against maternal mortality further biomedicalize women's lives. Interviews with doctors, midwives, and women, and analysis of women's experiences, illustrate the practical considerations that were used to normalize radical hysterectomies over less invasive procedures.

  11. A combination method of the theory and experiment in determination of cutting force coefficients in ball-end mill processes

    Directory of Open Access Journals (Sweden)

    Yung-Chou Kao

    2015-10-01

    Full Text Available In this paper, the cutting force calculation for ball-end mill processing was modeled mathematically. All derivations of the cutting forces were based directly on the tangential, radial, and axial cutting force components. In the developed mathematical model of cutting forces, the relationship between the average cutting force and the feed per flute was characterized as a linear function. The cutting force coefficient model was formulated as a function of the average cutting force and other parameters such as cutter geometry and cutting conditions. An experimental method based on stable milling conditions was proposed to estimate the cutting force coefficients for ball-end mills; this method can be applied to each pair of tool and workpiece. The developed cutting force model has been successfully verified experimentally with very promising results.
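
    The linear relationship between average cutting force and feed per flute, which is the core of the identification described above, can be sketched in Python as a least-squares fit; the slope and intercept are then mapped to shearing and edge (ploughing) coefficients. The measured values and the unit geometric scale factor below are illustrative of the general mechanistic approach, not the paper's data or exact formulas.

        import numpy as np

        # Hypothetical average feed-direction force (N) measured at several feeds per
        # flute (mm/tooth), at fixed axial/radial depths and spindle speed.
        feed_per_flute = np.array([0.02, 0.04, 0.06, 0.08, 0.10])
        avg_force = np.array([21.0, 38.5, 55.2, 73.1, 90.4])

        # Linear model F_avg = slope * f_t + intercept.
        slope, intercept = np.polyfit(feed_per_flute, avg_force, 1)

        # In mechanistic force models the slope is proportional to the shearing ("cutting")
        # coefficient and the intercept to the edge coefficient; the geometric scale factor
        # depends on the tool engagement and is set to 1.0 here purely for illustration.
        geometry_scale = 1.0
        k_cutting = slope / geometry_scale
        k_edge = intercept / geometry_scale
        print(f"slope = {slope:.1f} N/mm, intercept = {intercept:.2f} N, "
              f"K_cutting = {k_cutting:.1f}, K_edge = {k_edge:.2f}")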

  12. When I cut, you choose method implies intransitivity

    Science.gov (United States)

    Makowski, Marcin; Piotrowski, Edward W.

    2014-12-01

    There is a common belief that humans and many animals follow transitive inference (choosing A over C on the basis of knowing that A is better than B and B is better than C). Transitivity seems to be the essence of rational choice. We present a theoretical model of a repeated game in which the players make a choice between three goods (e.g. food). The rules of the game refer to the simple procedure of fair division between two players, known as the “I cut, you choose” mechanism, which has been widely discussed in the literature. In this game one of the players has to make intransitive choices in order to achieve the optimal result (for him/her and his/her co-player). The point is that an intransitive choice can be rational. Previously, an increase in the significance of intransitive strategies was achieved by referring to models of quantum games. We show that relevant intransitive strategies also appear in the classic description of decision algorithms.

  13. Concept of automatic programming of NC machine for metal plate cutting by genetic algorithm method

    Directory of Open Access Journals (Sweden)

    B. Vaupotic

    2005-12-01

    Full Text Available Purpose: In this paper the concept of automatic programming of NC machines for metal plate cutting by the genetic algorithm method is presented. Design/methodology/approach: The paper is limited to the automatic creation of NC programs for two-dimensional cutting of material by means of adaptive heuristic search algorithms. Findings: Automatic creation of NC programs in laser cutting of materials combines CAD concepts, the recognition of features, and the creation and optimization of NC programs. The proposed intelligent system is capable of automatically recognizing the nesting of products in the layout and determining the incisions and the sequences of cuts forming the laid-out products. The positions of the incisions are determined at the relevant places on the cut. The system is capable of finding the shortest path between individual cuts and recording the NC program. Research limitations/implications: It would be appropriate to orient future research towards an improved system for three-dimensional cutting with optional determination of the positions of incisions, with the capability to sense collisions, and with optimization of the speed and acceleration during cutting. Practical implications: The proposed system assures automatic preparation of the NC program without an NC programmer. Originality/value: The proposed concept shows a high degree of universality, efficiency and reliability, and it can be simply adapted to other NC machines.
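
    The "shortest path between individual cuts" step amounts to ordering the cut start points so that rapid traverses are short. The paper's genetic algorithm is not reproduced here; the Python sketch below uses a simple nearest-neighbour ordering as a stand-in, with made-up pierce-point coordinates, just to illustrate the quantity being minimised.

        import math

        # Hypothetical pierce points (x, y) in mm for the contours to be cut.
        points = [(0, 0), (120, 15), (30, 80), (90, 70), (10, 40)]

        def nearest_neighbour_order(pts, start=0):
            """Greedy ordering of cut start points to shorten rapid traverses."""
            remaining = set(range(len(pts)))
            order = [start]
            remaining.remove(start)
            while remaining:
                last = pts[order[-1]]
                nxt = min(remaining, key=lambda i: math.dist(last, pts[i]))
                order.append(nxt)
                remaining.remove(nxt)
            return order

        order = nearest_neighbour_order(points)
        length = sum(math.dist(points[a], points[b]) for a, b in zip(order, order[1:]))
        print("cut sequence:", order, f"rapid traverse length: {length:.1f} mm")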

  14. A Novel Approach for Shearer Memory Cutting Based on Fuzzy Optimization Method

    Directory of Open Access Journals (Sweden)

    Xin Zhou

    2013-01-01

    Full Text Available In order to improve the implementation precision of shearer memory cutting, a novel approach based on fuzzy optimization theory is proposed, with the coal floor height variation taken as a significant factor. The problem of shearer memory cutting is analyzed and a mathematical model is established. Moreover, key technologies such as the fuzzy control model, quantitative factors, and fuzzy control rules are elaborated, and the flowchart of the shearer memory cutting method based on fuzzy optimization theory is designed. Finally, a simulation example is carried out and the proposed approach is shown to be feasible and efficient.

  15. Progress Towards a Cartesian Cut-Cell Method for Viscous Compressible Flow

    Science.gov (United States)

    Berger, Marsha; Aftosmis, Michael J.

    2011-01-01

    The proposed paper reports advances in developing a method for high Reynolds number compressible viscous flow simulations using a Cartesian cut-cell method with embedded boundaries. This preliminary work focuses on accuracy of the discretization near solid wall boundaries. A model problem is used to investigate the accuracy of various difference stencils for second derivatives and to guide development of the discretization of the viscous terms in the Navier-Stokes equations. Near walls, quadratic reconstruction in the wall-normal direction is used to mitigate mesh irregularity and yields smooth skin friction distributions along the body. Multigrid performance is demonstrated using second-order coarse grid operators combined with second-order restriction and prolongation operators. Preliminary verification and validation for the method is demonstrated using flat-plate and airfoil examples at compressible Mach numbers. Simulations of flow on laminar and turbulent flat plates show skin friction and velocity profiles compared with those from boundary-layer theory. Airfoil simulations are performed at laminar and turbulent Reynolds numbers with results compared to both other simulations and experimental data

  16. THEORETICAL METHOD FOR PREDICTION OF THE CUTTING EDGE RECESSION DURING MILLING WOOD AND SECONDARY WOOD PRODUCTS

    Directory of Open Access Journals (Sweden)

    Bolesław Porankiewicz

    2008-11-01

    Full Text Available A theoretical method for predicting cutting edge recession during the milling of wood and wood-based products, due to the presence of hard mineral contamination, High Temperature Tribochemical Reactions (HTTR), and frictional wear, is presented. The method is based on a 3D random distribution of contaminant particles and is positively verified against three experiments from the literature, showing good correlation between the predicted and observed cutting edge recession.

  17. The implementation of the high technology methods of cutting on St 50 alloyed steel and the examination of the effects of cutting operation at the surface of material

    Directory of Open Access Journals (Sweden)

    L. Dahil

    2014-01-01

    Full Text Available As competition in manufacturing has intensified, companies that do not fully master new or high technologies lose their competitive position, so proper process selection is important for competing under market conditions. The aim of this study is to identify the most advantageous high-technology cutting method for obtaining quality products and thus a better competitive position. In the tests, St 50 alloyed steel was used as the sample material. The samples were cut by laser, erosion and abrasive water jet (AWJ) cutting, all high-technology cutting operations. Micrographs of the cut surfaces were taken and the effects of the different cutting methods on the metallurgical structure of the material were compared. In addition, changes were examined by measuring rigidity from the cut edge towards the core. The most advantageous cutting method is identified on the basis of these examinations.

  18. Reinforcement mechanism of slope stability method with no cutting trees

    OpenAIRE

    Yuki, Chikata; Harushige, KUSUMI; 楠見, 晴重; Katsumi, TERAOKA

    2008-01-01

    This paper studies slope stability. Although many slopes are prone to collapse, countermeasures against slope failures have not progressed sufficiently in Japan. In the 1960s, most slope protection methods consisted of covering the slope with shotcrete. However, shotcrete-covered slopes have been deteriorating, so slope failures frequently occur due to natural disasters such as heavy rainfall and earthquakes. It is therefore important to develop an effective slope stabilization method. Moreover, ...

  19. A Method and Device for 3D Recognition of Cutting Edge Micro Geometry

    Directory of Open Access Journals (Sweden)

    Bartosz Palubicki

    2014-04-01

    Full Text Available A very useful method was successfully applied in the investigation of tools for machining wood and wood-based composites. It allows scanning of the cutting edge micro geometry in three dimensions and reproducing it in a virtual space as a 3D surface. The application of the method opens new possibilities of studying tool wear by scanning, including the calculation of volume loss and other analyses of tool wedge geometry along and perpendicular to the cutting edge. The effectiveness of the method and scanner was successfully verified by a reference ESEM (Environmental Scanning Electron Microscopy) method.

  20. Rotation method for the measurement of thickness of Z-cut uniaxial crystals

    Science.gov (United States)

    Paranin, V. D.

    2015-12-01

    An original polarization method for the measurement of thickness of Z-cut uniaxial crystals employs the transmittance measurement of the polarizer-crystal-analyzer system at different rotation angles of the crystal. The mathematical analysis of the method is based on the optics of uniaxial crystals and Jones matrices. A measurement error of no greater than ±0.6 μm is estimated using the formula of a vector sum. Z-cut crystals of congruent lithium niobate with rated thicknesses of 514 and 554 μm are used to experimentally test the method and propose practical recommendations for applications.
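
    The underlying transmittance calculation is ordinary Jones calculus for a polarizer-retarder-analyzer chain. The Python sketch below computes the crossed-polarizer transmittance for a generic linear retarder; how the crystal rotation angle and the unknown thickness map to the effective retardance is the part specific to the paper and is not reproduced here.

        import numpy as np

        def rot(t):
            # 2x2 rotation of the Jones reference frame by angle t.
            return np.array([[np.cos(t), np.sin(t)], [-np.sin(t), np.cos(t)]])

        def retarder(delta, theta):
            # Jones matrix of a linear retarder with retardance delta at azimuth theta.
            J = np.diag([np.exp(-1j * delta / 2), np.exp(1j * delta / 2)])
            return rot(-theta) @ J @ rot(theta)

        def crossed_polarizer_transmittance(delta, theta):
            # Input light polarized along x, analyzer along y (crossed polarizers).
            e_out = retarder(delta, theta) @ np.array([1.0, 0.0])
            return abs(e_out[1]) ** 2  # equals sin^2(2*theta) * sin^2(delta/2)

        # Transmittance versus the effective retardance accumulated in the crystal
        # (which, in the rotation method, depends on the rotation angle and on the
        # thickness being measured).
        for delta in np.linspace(0.0, np.pi, 5):
            print(round(crossed_polarizer_transmittance(delta, np.pi / 4), 3))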

  1. Plasma arc cutting optimization parameters for aluminum alloy with two thickness by using Taguchi method

    Science.gov (United States)

    Abdulnasser, B.; Bhuvenesh, R.

    2016-07-01

    Manufacturing companies judge the quality of thermal removal processes from the dimensions and physical appearance of the cut surface. In this study, the surface roughness of the cut area and the material removal rate during manual plasma arc cutting were the main considerations. A plasma arc cutter, machine model PS-100, was used to cut specimens made from aluminium alloy 1100 manually, based on the selected parameter settings. Two specimen thicknesses, 3 mm and 6 mm, were used. The material removal rate (MRR) was measured as the difference between the weight of the specimens before and after the cutting process. The surface roughness was measured using a MITUTOYO CS-3100 machine and analyzed to determine the average roughness (Ra) value. The Taguchi method was utilized as the experimental layout to obtain the MRR and Ra values. The results indicate that the current and cutting speed are the most significant parameters, followed by the arc gap, for both the material removal rate and the surface roughness.
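
    The two responses can be written down directly: the MRR follows from the weight difference, and the Taguchi analysis uses the usual signal-to-noise ratios. A small Python sketch with invented numbers (not the study's measurements):

        import math

        def material_removal_rate(weight_before_g, weight_after_g, cutting_time_s):
            # MRR from the weight difference, as described in the abstract (here in g/s).
            return (weight_before_g - weight_after_g) / cutting_time_s

        def sn_larger_is_better(values):
            # Taguchi S/N ratio for responses to be maximized (e.g. MRR).
            return -10 * math.log10(sum(1 / v ** 2 for v in values) / len(values))

        def sn_smaller_is_better(values):
            # Taguchi S/N ratio for responses to be minimized (e.g. surface roughness Ra).
            return -10 * math.log10(sum(v ** 2 for v in values) / len(values))

        print(material_removal_rate(152.4, 150.9, 30.0))            # 0.05 g/s
        print(round(sn_larger_is_better([0.050, 0.048, 0.052]), 2))
        print(round(sn_smaller_is_better([3.2, 3.5, 3.1]), 2))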

  2. A Simple Method to Derive Minimal Cut Sets for a Non-coherent Fault Tree

    Institute of Scientific and Technical Information of China (English)

    Takehisa Kohda

    2006-01-01

    Minimal cut sets (or prime implicants: minimal combinations of basic event conditions leading to system failure) are important information for reliability/safety analysis and design. To obtain minimal cut sets for general non-coherent fault trees, including negative basic events or multi-valued basic events, a special procedure such as the consensus rule must be applied to the results obtained by logical operations for coherent fault trees, which will require more steps and time. This paper proposes a simple method for a non-coherent fault tree, whose top event is represented as an AND combination of monotonic sub-trees. A "monotonic" sub-tree means that it does not have both positive and negative representations for each basic event. It is proven that minimal cut sets can be obtained by a conventional method for coherent fault trees. An illustrative example of a simple event tree analysis shows the detail and characteristics of the proposed method.
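
    For AND-combined monotonic sub-trees, the coherent-tree procedure amounts to forming unions of the sub-trees' cut sets and discarding non-minimal ones (absorption), with no consensus step. A toy Python sketch of that step, using hypothetical sub-trees and literals:

        from itertools import product

        def minimize(cut_sets):
            # Drop any cut set that strictly contains another one (absorption law).
            cs = {frozenset(c) for c in cut_sets}
            return [set(c) for c in cs if not any(other < c for other in cs)]

        def and_combine(cut_sets_a, cut_sets_b):
            # Cut sets of (A AND B) are unions of one cut set from each side.
            return minimize([set(a) | set(b) for a, b in product(cut_sets_a, cut_sets_b)])

        # Hypothetical monotonic sub-trees, each already given by its own cut sets;
        # negated events are simply written as distinct literals (e.g. "not_x4").
        sub_tree_1 = [{"x1"}, {"x2", "x3"}]
        sub_tree_2 = [{"not_x4"}, {"x1", "x5"}]

        print(and_combine(sub_tree_1, sub_tree_2))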

  3. STUDY ON MODULAR FAULT TREE ANALYSIS TECHNIQUE WITH CUT SETS MATRIX METHOD

    Institute of Scientific and Technical Information of China (English)

    1998-01-01

    A new fault tree analysis (FTA) computation method is put forward that uses a modularization technique in FTA together with a cut sets matrix, and it can effectively reduce the NP (nondeterministic polynomial) difficulty. The software runs on IBM-PC under DOS 3.0 and above. The method provides a theoretical basis and a computation tool for applying the FTA technique to common engineering systems.

  4. Crack-cutting method to solve the torsion problem of polygonal cylinders

    Institute of Scientific and Technical Information of China (English)

    汤任基; 李玉兰; 胡深洋

    1995-01-01

    Using the singular integral equation and the crack-cutting technique, a new method to solve the torsion problem of polygonal cylinders is proposed under exact boundary conditions and the corner-point conditions. Several calculating examples are presented and the method is verified.

  5. Full gradient stabilized cut finite element methods for surface partial differential equations

    Science.gov (United States)

    Burman, Erik; Hansbo, Peter; Larson, Mats G.; Massing, André; Zahedi, Sara

    2016-10-01

    We propose and analyze a new stabilized cut finite element method for the Laplace-Beltrami operator on a closed surface. The new stabilization term provides control of the full $\mathbb{R}^3$ gradient on the active mesh consisting of the elements that intersect the surface. Compared to face stabilization, based on controlling the jumps in the normal gradient across faces between elements in the active mesh, the full gradient stabilization is easier to implement and does not significantly increase the number of nonzero elements in the mass and stiffness matrices. The full gradient stabilization term may be combined with a variational formulation of the Laplace-Beltrami operator based on tangential or full gradients and we present a simple and unified analysis that covers both cases. The full gradient stabilization term gives rise to a consistency error which, however, is of optimal order for piecewise linear elements, and we obtain optimal order a priori error estimates in the energy and $L^2$ norms as well as an optimal bound of the condition number. Finally, we present detailed numerical examples where we in particular study the sensitivity of the condition number and error on the stabilization parameter.

  6. Determination of laser cutting process conditions using the preference selection index method

    Science.gov (United States)

    Madić, Miloš; Antucheviciene, Jurgita; Radovanović, Miroslav; Petković, Dušan

    2017-03-01

    Determination of adequate parameter settings for the improvement of multiple quality and productivity characteristics at the same time is of great practical importance in laser cutting. This paper discusses the application of the preference selection index (PSI) method for discrete optimization of the CO2 laser cutting of stainless steel. The main motivation for applying the PSI method is that it represents an almost unexplored multi-criteria decision making (MCDM) method; moreover, this method does not require assessment of the relative significance of the considered criteria. After reviewing and comparing the existing approaches for determining laser cutting parameter settings, the application of the PSI method is explained in detail. The experiment was realized using Taguchi's L27 orthogonal array. Roughness of the cut surface, heat affected zone (HAZ), kerf width and material removal rate (MRR) were considered as optimization criteria. The proposed methodology is found to be very useful in a real manufacturing environment since it involves simple calculations which are easy to understand and implement. However, while applying the PSI method it was observed that it may not be useful in situations where there is a large number of alternatives whose attribute values (performances) are very close to the preferred ones.
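
    As a rough illustration of the calculation involved, the Python sketch below implements the PSI steps as they are commonly described in the MCDM literature (normalization, preference variation, implicit criteria weights, final index); the numbers are invented and the formulation used in the paper may differ in detail.

        import numpy as np

        def psi_rank(matrix, beneficial):
            # matrix: alternatives x criteria; beneficial[j] is True if larger is better.
            X = np.asarray(matrix, dtype=float)
            R = np.where(beneficial, X / X.max(axis=0), X.min(axis=0) / X)  # normalize
            pv = ((R - R.mean(axis=0)) ** 2).sum(axis=0)   # preference variation per criterion
            phi = 1.0 - pv                                 # deviation in preference value
            weights = phi / phi.sum()                      # overall preference (implicit weights)
            scores = (R * weights).sum(axis=1)             # preference selection index
            return scores, scores.argsort()[::-1]          # higher score = better alternative

        # Illustrative parameter settings: [Ra (min), HAZ (min), kerf width (min), MRR (max)]
        settings = [[2.1, 0.30, 0.35, 12.0],
                    [1.8, 0.28, 0.40, 10.5],
                    [2.5, 0.35, 0.32, 14.0]]
        scores, ranking = psi_rank(settings, beneficial=[False, False, False, True])
        print(scores.round(3), ranking)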

  7. Examination of Bacteriophage as a Biocontrol Method for Salmonella on Fresh-Cut Fruit: A Model Study

    National Research Council Canada - National Science Library

    Leverentz B; Conway W.S; Alavidze Z; Janisiewicz W.J; Fuchs Y; Camp M.J; Chighladze E; Sulakvelidze A

    2001-01-01

    .... However, fresh-cut fruits and vegetables may represent an increased food safety concern because of the absence or damage of peel and rind, which normally help reduce colonization of uncut produce...

  8. An effective method to predict oil recovery in high water cut stage

    Institute of Scientific and Technical Information of China (English)

    刘志斌; 刘浩翰

    2015-01-01

    The water flooding characteristic curve method based on the traditional regression equation between the oil and water phase permeability ratio and the water saturation is inappropriate for predicting the oil recovery in the high water cut stage. Hence, a new water flooding characteristic curve equation adapted to the high water cut stage is proposed to predict the oil recovery. Water drive phase permeability experiments show that the curve of the oil and water phase permeability ratio vs. the water saturation, in semi-logarithmic coordinates, bends significantly lower after entering the high water cut stage, so the traditional regression-based water flooding characteristic curve method is inappropriate for predicting the oil recovery in this stage; therefore, a new water flooding characteristic curve equation, based on a better relationship between ln(kro/krw) and Sw, urgently needs to be established to effectively and reliably predict the oil recovery of a water drive reservoir in the high water cut stage. In this paper, by carrying out water drive phase permeability experiments, a new mathematical model between the oil and water phase permeability ratio and the water saturation is established; with the regression analysis method and an integration of the established model, the water flooding characteristic curve equation adapted to the high water cut stage is obtained. Using the new water flooding characteristic curve to predict the oil recovery of the GD3-block of the SL oilfield and the J09-block of the DG oilfield in China, results with high prediction accuracy are obtained.
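
    The baseline that the paper improves on is a straight-line regression of ln(kro/krw) against Sw, from which the fractional water flow, and hence recovery versus water cut, is extrapolated. A minimal Python sketch of that baseline fit, with invented data and an assumed viscosity ratio (the paper's new high-water-cut relationship is not reproduced here):

        import numpy as np

        # Illustrative relative-permeability data (not from the paper): water
        # saturation Sw and the measured oil/water permeability ratio kro/krw.
        Sw = np.array([0.45, 0.50, 0.55, 0.60, 0.65, 0.70])
        ratio = np.array([8.0, 4.2, 2.1, 1.0, 0.55, 0.33])

        # Traditional water-flooding characteristic curve: ln(kro/krw) regressed as a
        # straight line in Sw; the paper's point is that this line bends at high
        # water cut, so a different ln(kro/krw)-Sw relationship should be fitted.
        slope, intercept = np.polyfit(Sw, np.log(ratio), 1)
        print(f"ln(kro/krw) ~ {intercept:.2f} + {slope:.2f} * Sw")

        # The fitted relationship feeds the fractional-flow equation
        #   fw = 1 / (1 + (kro/krw) * (mu_w / mu_o)),
        # which links water cut to saturation and hence to recovery.
        mu_w_over_mu_o = 0.4  # assumed viscosity ratio
        fw = 1.0 / (1.0 + np.exp(intercept + slope * Sw) * mu_w_over_mu_o)
        print(fw.round(3))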

  9. Application of CBR method for adding the process of cutting tools and parameters selection

    Science.gov (United States)

    Ociepka, P.; Herbus, K.

    2015-11-01

    Modern enterprises must face dynamically changing market demand, which influences the design process. This is possible by linking computer tools with information gathered by teams of experienced designers. The article describes a method, based on engineering knowledge and experience, for aiding the selection of tools and the determination of cutting parameters for a turning operation. The method proposed by the authors is based on CBR (Case Based Reasoning). CBR is a method of problem solving that involves searching for an analogy (similarity) between the current task to be solved and earlier cases that are properly described and stored in computer memory. This article presents an algorithm and a formalized description of the developed method. The range of its utilization is discussed, and its functioning is illustrated with an example of tool and cutting parameter selection for the turning process.

  10. Global optimization of discrete truss topology design problems using a parallel cut-and-branch method

    DEFF Research Database (Denmark)

    Rasmussen, Marie-Louise Højlund; Stolpe, Mathias

    2008-01-01

    … to a mixed-integer linear program, which is solved with a parallel implementation of branch-and-bound. Additional valid inequalities and cuts are introduced to give a stronger representation of the problem, which improves convergence and speed-up of the parallel method. The valid inequalities represent the physics, and the cuts (Combinatorial Benders’ and projected Chvátal–Gomory) come from an understanding of the particular mathematical structure of the reformulation. The impact of a stronger representation is investigated on several truss topology optimization problems in two and three dimensions.

  11. Finite Element Method Based Modeling for Prediction of Cutting Forces in Micro-end Milling

    Science.gov (United States)

    Pratap, Tej; Patra, Karali

    2017-02-01

    Micro-end milling is one of the widely used processes for producing micro features/components in micro-fluidic systems, biomedical applications, aerospace applications, electronics and many more fields. However, in these applications, the forces generated in the micro-end milling process can cause tool vibration, process instability and even tool breakage if not minimized. Therefore, an accurate prediction of cutting forces in micro-end milling is essential. In this work, a finite element method based model is developed using ABAQUS/Explicit 6.12 software for prediction of cutting forces in micro-end milling, with due consideration of the tool edge radius effect, thermo-mechanical properties and failure parameters of the workpiece material, including friction behaviour at the tool-chip interface. Experiments have been performed to manufacture microchannels on a copper plate using a 500 µm diameter tungsten carbide micro-end mill, and cutting forces were acquired through a dynamometer. Predicted cutting forces in the feed and cross-feed directions are compared with experimental results and are found to be in good agreement. Results also show that FEM-based simulations can be applied to analyze size effects of specific cutting forces in the micro-end milling process.

  12. Natural loblolly and shortleaf pine productivity through 53 years of management under four reproduction cutting methods

    Science.gov (United States)

    Michael D. Cain; Michael G. Shelton

    2001-01-01

    A study was initiated in 1943 to evaluate the long-term productivity of loblolly (Pinus taeda L.) and shortleaf pines (P. echinata Mill.) when managed under four reproduction cutting methods—clearcut, heavy seedtree, diameter-limit, and selection—on the Upper Coastal Plain of southeastern Arkansas. Early volume production...

  13. Application of the ROV method for the selection of cutting fluids

    Directory of Open Access Journals (Sweden)

    Miloš Madić

    2016-06-01

    Full Text Available Production engineers are frequently faced with multi-criteria selection problems in the manufacturing environment. Over the years, many multi-criteria decision making (MCDM) methods have been proposed to help decision makers in solving different complex selection problems. This paper introduces the use of an almost unexplored MCDM method, the range of value (ROV) method, for solving cutting fluid selection problems. The main motivation for using the ROV method is that it offers a very simple computational procedure compared to other MCDM methods. The applicability and effectiveness of the ROV method have been demonstrated by solving four case studies dealing with selection of the most suitable cutting fluid for a given machining application. In each case study the obtained complete rankings were compared with those derived by past researchers using different MCDM methods. The results obtained using the ROV method have excellent correlation with those derived by past researchers, which validates the usefulness and effectiveness of this simple MCDM method for solving cutting fluid selection problems.
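
    For orientation, the ROV calculation as it is usually presented (range normalization, separate utilities from the beneficial and non-beneficial criteria, and their average) can be sketched in a few lines of Python; the cutting-fluid data and weights below are invented, and the paper's case studies may use a slightly different formulation.

        import numpy as np

        def rov_rank(matrix, weights, beneficial):
            # matrix: alternatives x criteria; beneficial[j] is True if larger is better.
            X = np.asarray(matrix, dtype=float)
            w = np.asarray(weights, dtype=float)
            rng = X.max(axis=0) - X.min(axis=0)
            norm = np.where(beneficial,
                            (X - X.min(axis=0)) / rng,   # larger is better
                            (X.max(axis=0) - X) / rng)   # smaller is better
            benef = np.asarray(beneficial)
            u_best = (norm[:, benef] * w[benef]).sum(axis=1)      # beneficial criteria
            u_worst = (norm[:, ~benef] * w[~benef]).sum(axis=1)   # non-beneficial criteria
            u = (u_best + u_worst) / 2.0
            return u, u.argsort()[::-1]

        # Illustrative data: [tool wear (min), Ra (min), cost (min), cooling ability (max)]
        fluids = [[0.12, 1.8, 40, 7.5],
                  [0.10, 2.0, 55, 8.0],
                  [0.15, 1.6, 35, 6.5]]
        u, ranking = rov_rank(fluids, weights=[0.3, 0.3, 0.2, 0.2],
                              beneficial=[False, False, False, True])
        print(u.round(3), ranking)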

  14. Fast beam cut-off method in RF-knockout extraction for spot-scanning

    CERN Document Server

    Furukawa, T

    2002-01-01

    An irradiation method with magnetic scanning has been developed in order to provide accurate irradiation even for an irregular target shape. The scanning method has strongly required a lower ripple of the beam spill and a faster response to beam-on/off in slow extraction from a synchrotron ring. At HIMAC, RF-knockout extraction has utilized a bunched beam to reduce the beam-spill ripple. Therefore, particles near the resonance can be spilled out from the separatrices by synchrotron oscillation as well as by a transverse RF field. From this point of view, a fast beam cut-off method has been proposed and verified by both simulations and experiments. The maximum delay from the beam cut-off signal to beam-off has been improved to around 60 μs from 700 μs by a usual method. Unwanted dose has been considerably reduced by around a factor of 10 compared with that by the usual method.

  15. Measuring multiple residual-stress components using the contour method and multiple cuts

    Energy Technology Data Exchange (ETDEWEB)

    Prime, Michael B [Los Alamos National Laboratory; Swenson, Hunter [Los Alamos National Laboratory; Pagliaro, Pierluigi [U. PALERMO; Zuccarello, Bernardo [U. PALERMO

    2009-01-01

    The conventional contour method determines one component of stress over the cross section of a part. The part is cut into two, the contour of the exposed surface is measured, and Bueckner's superposition principle is analytically applied to calculate stresses. In this paper, the contour method is extended to the measurement of multiple stress components by making multiple cuts with subsequent applications of superposition. The theory and limitations are described. The theory is experimentally tested on a 316L stainless steel disk with residual stresses induced by plastically indenting the central portion of the disk. The stress results are validated against independent measurements using neutron diffraction. The theory has implications beyond just multiple cuts. The contour method measurements and calculations for the first cut reveal how the residual stresses have changed throughout the part. Subsequent measurements of partially relaxed stresses by other techniques, such as laboratory x-rays, hole drilling, or neutron or synchrotron diffraction, can be superimposed back to the original state of the body.

  16. Proposed SPAR Modeling Method for Quantifying Time Dependent Station Blackout Cut Sets

    Energy Technology Data Exchange (ETDEWEB)

    John A. Schroeder

    2010-06-01

    Abstract: The U.S. Nuclear Regulatory Commission’s (USNRC’s) Standardized Plant Analysis Risk (SPAR) models and industry risk models take similar approaches to analyzing the risk associated with loss of offsite power and station blackout (LOOP/SBO) events at nuclear reactor plants. In both SPAR models and industry models, core damage risk resulting from a LOOP/SBO event is analyzed using a combination of event trees and fault trees that produce cut sets that are, in turn, quantified to obtain a numerical estimate of the resulting core damage risk. A proposed SPAR method for quantifying the time-dependent cut sets is sometimes referred to as a convolution method. The SPAR method reflects assumptions about the timing of emergency diesel failures, the timing of subsequent attempts at emergency diesel repair, and the timing of core damage that may be different than those often used in industry models. This paper describes the proposed SPAR method.

  17. CFD Method for Predicting Annular Pressure Losses and Cuttings Concentration in Eccentric Horizontal Wells

    Directory of Open Access Journals (Sweden)

    Titus N. Ofei

    2014-01-01

    Full Text Available In oil and gas drilling operations, predictions of pressure losses and cuttings concentration in the annulus are very complex due to the combination of interacting drilling parameters. Past studies have proposed many empirical correlations to estimate pressure losses and cuttings concentration. However, these correlations are limited to their experimental data range and setup, and hence they are not applicable to all cases. CFD methods have the advantage of handling complex multiphase flow problems as well as an unlimited number of physical and operational conditions. The present study employs the inhomogeneous (Eulerian-Eulerian) model to simulate a two-phase solid-fluid flow and predict pressure losses and cuttings concentration in eccentric horizontal annuli as a function of varying drilling parameters: fluid velocity, diameter ratio (ratio of inner pipe diameter to outer pipe diameter), inner pipe rotation speed, and fluid type. Experimental data for pressure losses and cuttings concentration from previous literature compared very well with the simulation data, confirming the validity of the current model. The study shows how reliable CFD methods can replicate the actual, yet complex, oil and gas drilling operations.

  18. Method and systems for power control of internal combustion engines using individual cycle cut-off

    Energy Technology Data Exchange (ETDEWEB)

    Fedorenko, Y.; Korzhov, M.; Filippov, A.; Atamanenko, N.

    1996-09-01

    A new method of controlling power has been developed for improving efficiency and emissions performance of internal combustion engines at partial load. The method involves cutting-off some of the work cycles, as the load decreases, to obtain required power. Theoretical and experimental material is presented to illustrate the underlying principle, the implementation means and the results for the 4- and 8-cylinder piston engine and a twin rotor Wankel engine applications.

  19. Implementation Methods of Computer Aided Design-Drawing and Drawing Management for Plate Cutting-Machine

    Institute of Scientific and Technical Information of China (English)

    DONG Yu-de; ZHAO Han; TAN Jian-rong

    2002-01-01

    The implementation methods of computer-aided design, drawing and drawing management for a plate cutting machine are discussed. The system structure for plate cutting-machine design is put forward first; then some key technologies and their implementation methods are introduced, including the structure management of graphics, the unification of graphics and design calculation, information sharing among the part, assembly and drawing management systems, and the movement simulation of key components.

  20. 3D Multiphase Piecewise Constant Level Set Method Based on Graph Cut Minimization

    Institute of Scientific and Technical Information of China (English)

    Tiril P.Gurholt; Xuecheng Tai

    2009-01-01

    Segmentation of complicated three-dimensional (3D) structures is of great importance for many real applications. In this work we combine the graph cut minimization method with a variant of the level set idea for 3D segmentation based on the Mumford-Shah model. Compared with the traditional approach of solving the Euler-Lagrange equation, we do not need to solve any partial differential equations. Instead, the minimum cut on a specially designed graph needs to be computed. The method is tested on data with complicated structures. It is rather stable with respect to the initial value and the algorithm is nearly parameter-free. Experiments show that it can solve large problems much faster than traditional approaches.

  1. The Influence of Mechanical Resonance & Compensation Method in CNC Heavy Cutting

    Institute of Scientific and Technical Information of China (English)

    WU Yuguo; HUANG Yunlin; SONG Chongzhi

    2006-01-01

    In a CNC heavy-cutting servo system, the mechanical driving system has a torque feedback on the electrical speed-adjustment system, so mechanical resonance can be generated. The mechanical resonance makes the feature parameters of the mechanical driving system influence the dynamic performance of the electrical speed-adjustment system. This paper studies the resistance ratio ζmech and the q parameters, and puts forward a compensation method to decrease the influence of mechanical resonance.

  2. Method of Monitoring Wearing and Breakage States of Cutting Tools Based on Mahalanobis Distance Features

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The Mahalanobis distance, proposed by P. C. Mahalanobis, an Indian statistician, can be used as a feature in an automatic on-line cutting tool condition monitoring process based on digital image processing. In this paper, a new method of obtaining Mahalanobis distance features from a tool image is proposed. The key to calculating the Mahalanobis distance is to appropriately divide the object into several component sets. Firstly, a technique is proposed that can automatically divide the component groups for calculati...
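
    The distance itself is standard: it measures how far a feature vector lies from a reference population while accounting for feature correlations. The Python sketch below uses hypothetical image features of sharp tools as the reference; the paper's specific way of dividing the tool image into component sets is not reproduced.

        import numpy as np

        def mahalanobis(x, reference):
            # Mahalanobis distance of feature vector x from a reference sample
            # (e.g. image features measured on sharp, unworn tools).
            reference = np.asarray(reference, dtype=float)
            mean = reference.mean(axis=0)
            cov_inv = np.linalg.pinv(np.cov(reference, rowvar=False))
            d = np.asarray(x, dtype=float) - mean
            return float(np.sqrt(d @ cov_inv @ d))

        # Hypothetical 3-feature vectors from tool images (area, perimeter and mean
        # gray level of the wear region) for a set of sharp tools ...
        sharp_tools = [[102, 55, 0.81], [98, 53, 0.79], [105, 57, 0.83],
                       [101, 54, 0.80], [99, 56, 0.82]]
        # ... and two tools under inspection; a large distance flags wear or breakage.
        print(round(mahalanobis([103, 55, 0.80], sharp_tools), 2))   # close to reference
        print(round(mahalanobis([140, 80, 0.55], sharp_tools), 2))   # far away: worn/broken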

  3. Modelling of tunnelling processes and rock cutting tool wear with the particle finite element method

    Science.gov (United States)

    Carbonell, Josep Maria; Oñate, Eugenio; Suárez, Benjamín

    2013-09-01

    Underground construction involves all sorts of challenges in the analysis, design, project and execution phases. The dimension of tunnels and their structural requirements are growing, and so do safety and security demands. New engineering tools are needed to perform safer planning and design. This work presents the advances in the particle finite element method (PFEM) for the modelling and the analysis of tunnelling processes including the wear of the cutting tools. The PFEM has its foundation on the Lagrangian description of the motion of a continuum built from a set of particles with known physical properties. The method uses a remeshing process combined with the alpha-shape technique to detect the contacting surfaces and a finite element method for the mechanical computations. A contact procedure has been developed for the PFEM which is combined with a constitutive model for predicting the excavation front and the wear of cutting tools. The material parameters govern the coupling of frictional contact and wear between the interacting domains at the excavation front. The PFEM allows predicting several parameters which are relevant for estimating the performance of a tunnel boring machine such as wear in the cutting tools, the pressure distribution on the face of the boring machine and the vibrations produced in the machinery and the adjacent soil/rock. The final aim is to help in the design of the excavating tools and in the planning of the tunnelling operations. The applications presented show that the PFEM is a promising technique for the analysis of tunnelling problems.

  4. The Reliability of Electromyographic Normalization Methods for Cycling Analyses.

    Science.gov (United States)

    Sinclair, Jonathan; Taylor, Paul John; Hebron, Jack; Brooks, Darrell; Hurst, Howard Thomas; Atkins, Stephen

    2015-06-27

    Electromyography (EMG) is normalized in relation to a reference maximum voluntary contraction (MVC) value. Different normalization techniques are available but the most reliable method for cycling movements is unknown. This study investigated the reliability of different normalization techniques for cycling analyses. Twenty-five male cyclists (age 24.13 ± 2.79 years, body height 176.22 ± 4.87 cm and body mass 67.23 ± 4.19 kg, BMI = 21.70 ± 2.60 kg·m-2) performed different normalization procedures on two occasions, within the same testing session. The rectus femoris, biceps femoris, gastrocnemius and tibialis anterior muscles were examined. Participants performed isometric normalizations (IMVC) using an isokinetic dynamometer. Five minutes of submaximal cycling (180 W) were also undertaken, allowing the mean (DMA) and peak (PDA) activation from each muscle to serve as reference values. Finally, a 10 s cycling sprint (MxDA) trial was undertaken and the highest activation from each muscle was used as the reference value. Differences between reference EMG amplitude, as a function of normalization technique and time, were examined using repeated measures ANOVAs. The test-retest reliability of each technique was also examined using linear regression, intraclass correlations and Cronbach's alpha. The results showed that EMG amplitude differed significantly between normalization techniques for all muscles, with the IMVC and MxDA methods demonstrating the highest amplitudes. The highest levels of reliability were observed for the PDA technique for all muscles; therefore, our results support the utilization of this method for cycling analyses.

  5. The Reliability of Electromyographic Normalization Methods for Cycling Analyses

    Directory of Open Access Journals (Sweden)

    Sinclair Jonathan

    2015-06-01

    Full Text Available Electromyography (EMG) is normalized in relation to a reference maximum voluntary contraction (MVC) value. Different normalization techniques are available but the most reliable method for cycling movements is unknown. This study investigated the reliability of different normalization techniques for cycling analyses. Twenty-five male cyclists (age 24.13 ± 2.79 years, body height 176.22 ± 4.87 cm and body mass 67.23 ± 4.19 kg, BMI = 21.70 ± 2.60 kg·m−2) performed different normalization procedures on two occasions, within the same testing session. The rectus femoris, biceps femoris, gastrocnemius and tibialis anterior muscles were examined. Participants performed isometric normalizations (IMVC) using an isokinetic dynamometer. Five minutes of submaximal cycling (180 W) were also undertaken, allowing the mean (DMA) and peak (PDA) activation from each muscle to serve as reference values. Finally, a 10 s cycling sprint (MxDA) trial was undertaken and the highest activation from each muscle was used as the reference value. Differences between reference EMG amplitude, as a function of normalization technique and time, were examined using repeated measures ANOVAs. The test-retest reliability of each technique was also examined using linear regression, intraclass correlations and Cronbach’s alpha. The results showed that EMG amplitude differed significantly between normalization techniques for all muscles, with the IMVC and MxDA methods demonstrating the highest amplitudes. The highest levels of reliability were observed for the PDA technique for all muscles; therefore, our results support the utilization of this method for cycling analyses.

  6. Method for construction of normalized cDNA libraries

    Science.gov (United States)

    Soares, Marcelo B.; Efstratiadis, Argiris

    1998-01-01

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to appropriate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library. This invention also provides normalized cDNA libraries generated by the above-described method and uses of the generated libraries.

  7. Systems and Methods for Determining Water-Cut of a Fluid Mixture

    KAUST Repository

    Karimi, Muhammad Akram

    2017-03-02

    Provided in some embodiments are systems and methods for measuring the water content (or water-cut) of a fluid mixture. Provided in some embodiments is a water-cut sensor system that includes a T-resonator, a ground conductor, and a separator. The T-resonator including a feed line, and an open shunt stub conductively coupled to the feed line. The ground conductor including a bottom ground plane opposite the T-resonator and a ground ring conductively coupled to the bottom ground plane, with the feed line overlapping at least a portion of the ground ring. The separator including a dielectric material disposed between the feed line and the portion of the ground ring overlapped by the feed line, and the separator being adapted to electrically isolate the T-resonator from the ground conductor.

  8. Combining Illumination Normalization Methods for Better Face Recognition

    NARCIS (Netherlands)

    Boom, B.J.; Tao, Q.; Spreeuwers, L.J.; Veldhuis, R.N.J.

    2009-01-01

    Face recognition under uncontrolled illumination conditions is partly an unsolved problem. There are two categories of illumination normalization methods. The first category performs a local preprocessing, where a pixel value is corrected based on its local neighborhood in the image. The second category ...

  9. SPC without Control Limits and Normality Assumption: A New Method

    Science.gov (United States)

    Vazquez-Lopez, J. A.; Lopez-Juarez, I.

    Control charts (CC) are important Statistical Process Control (SPC) tools developed in the 1920s to control and improve the quality of industrial production. The use of CC requires visual inspection and human judgement to diagnose the process quality properly. CC assume a normal distribution of the observed variables to establish the control limits. However, this requirement is difficult to meet in practice, since skewed distributions are commonly observed. In this research, a novel method that requires neither control limits nor data normality is presented. The core of the method is based on the FuzzyARTMAP (FAM) Artificial Neural Network (ANN), which learns special and non-special patterns of variation and whose internal parameters are determined through experimental design to increase its efficiency. The proposed method was implemented successfully in a manufacturing process, determining the statistical control state, which validates our method.

  10. Minimization and Mitigation of Wire EDM Cutting Errors in the Application of the Contour Method of Residual Stress Measurement

    Science.gov (United States)

    Ahmad, Bilal; Fitzpatrick, Michael E.

    2016-01-01

    The contour method of residual stress measurement relies on the careful application of wire electro-discharge machining (WEDM) for the cutting stage. Changes in material removal rates during the cut lead to errors in the final calculated values of residual stress. In this study, WEDM cutting parameters have been explored to identify the optimum conditions for contour method residual stress measurements. The influence of machine parameters on the surface roughness and cutting artifacts in the contour cut is discussed. It has been identified that the critical parameter in improving the surface finish is the spark pulse duration. A typical cutting artifact and its impact on measured stress values have been identified and demonstrated for a contour cut in a welded marine steel. A procedure is presented to correct contour displacement data for the influence of WEDM cutting artifacts, and is demonstrated on the correction of a measured weld residual stress. The correction changed the measured residual stress magnitude by up to 150 MPa. The corrected contour method results were validated by X-ray diffraction, incremental center hole drilling, and neutron diffraction.

  11. The Constant Intensity Cut Method applied to the KASCADE-Grande muon data

    Energy Technology Data Exchange (ETDEWEB)

    Arteaga-Velazquez, J.C., E-mail: arteaga@ifm.umich.m [Institut fuer Experimentelle Kernphysik, Universitaet Karlsruhe, D-76021 Karlsruhe (Germany); Apel, W.D.; Badea, F.; Bekk, K. [Institut fuer Kernphysik, Forschungszentrum Karlsruhe, D-76021 Karlsruhe (Germany); Bertaina, M. [Dipartimento di Fisica Generale dell' Universita, 10125 Torino (Italy); Bluemer, J. [Institut fuer Kernphysik, Forschungszentrum Karlsruhe, D-76021 Karlsruhe (Germany); Institut fuer Experimentelle Kernphysik, Universitaet Karlsruhe, D-76021 Karlsruhe (Germany); Bozdog, H. [Institut fuer Kernphysik, Forschungszentrum Karlsruhe, D-76021 Karlsruhe (Germany); Brancus, I.M. [National Institute of Physics and Nuclear Engineering, P.O. Box Mg-6, RO-7690 Bucharest (Romania); Brueggemann, M.; Buchholz, P. [Fachbereich Physik, Universitaet Siegen, 57068 Siegen (Germany); Cantoni, E. [Dipartimento di Fisica Generale dell' Universita, 10125 Torino (Italy); Istituto di Fisica dello Spazio Interplanetario, INAF, 10133 Torino (Italy); Chiavassa, A. [Dipartimento di Fisica Generale dell' Universita, 10125 Torino (Italy); Cossavella, F. [Institut fuer Experimentelle Kernphysik, Universitaet Karlsruhe, D-76021 Karlsruhe (Germany); Daumiller, K. [Institut fuer Kernphysik, Forschungszentrum Karlsruhe, D-76021 Karlsruhe (Germany); Souza, V. de [Institut fuer Experimentelle Kernphysik, Universitaet Karlsruhe, D-76021 Karlsruhe (Germany); Di Pierro, F. [Dipartimento di Fisica Generale dell' Universita, 10125 Torino (Italy); Doll, P.; Engel, R.; Engler, J.; Finger, M. [Institut fuer Kernphysik, Forschungszentrum Karlsruhe, D-76021 Karlsruhe (Germany)

    2009-12-15

    The constant intensity cut method is a very useful tool to reconstruct the cosmic ray energy spectrum in order to combine or compare extensive air shower data measured for different attenuation depths independently of the MC model. In this contribution the method is used to explore the muon data of the KASCADE-Grande experiment. In particular, with this technique, the measured muon number spectra for different zenith angle ranges are compared and summed up to obtain a single muon spectrum for the measured showers. Preliminary results are presented, along with estimations of the systematic uncertainties associated with the analysis technique.

  12. Phase- and size-adjusted CT cut-off for differentiating neoplastic lesions from normal colon in contrast-enhanced CT colonography

    Energy Technology Data Exchange (ETDEWEB)

    Luboldt, W. [University Hospital Frankfurt, Department of Radiology, Frankfurt (Germany); Multiorgan Screening Foundation, Frankfurt (Germany); University Hospital Essen, Clinic of Angiology, Essen (Germany); Kroll, M.; Wetter, A.; Vogl, T.J. [University Hospital Frankfurt, Department of Radiology, Frankfurt (Germany); Toussaint, T.L. [Multiorgan Screening Foundation, Frankfurt (Germany); Hoepffner, N. [University Hospital Frankfurt, Department of Internal Medicine, Frankfurt (Germany); Holzer, K. [University Hospital Frankfurt, Department of Visceral and Vascular Surgery, Frankfurt (Germany); Kluge, A. [Kerckhoff Heart Center, Department of Radiology, Bad Nauheim (Germany)

    2004-12-01

    A computed tomography (CT) cut-off for differentiating neoplastic lesions (polyps/carcinoma) from normal colon in contrast-enhanced CT colonography (CTC) relating to the contrast phase and lesion size is determined. CT values of 64 colonic lesions (27 polyps <10 mm, 13 polyps ≥10 mm, 24 carcinomas) were determined by region-of-interest (ROI) measurements in 38 patients who underwent contrast-enhanced CTC. In addition, the height (H) of the colonic lesions was measured in CT. CT values were also measured in the aorta (A), superior mesenteric vein (V) and colonic wall. The contrast phase was defined by xA + (1 - x)V using x as a weighting factor for describing the different contrast phases ranging from the pure arterial phase (x=1) over the intermediate phases (x=0.9-0.1) to the pure venous phase (x=0). The CT values of the lesions were correlated with their height (H), the different phases (xA + (1 - x)V) and the ratio [xA + (1 - x)V]/H. The CT cut-off was linearly adjusted to the imaged contrast phase and height of the lesion by the line y = m[xA + (1 - x)V]/H + y0. The slope m was determined by linear regression in the correlation (lesion ∝ [xA + (1 - x)V]/H) and the Y-intercept y0 by the minimal shift of the line needed to maximize the accuracy of separating the colonic wall from the lesions. The CT value of the lesions correlated best with the intermediate phase: 0.4A + 0.6V (r=0.8 for polyps ≥10 mm, r=0.6 for carcinomas, r=0.4 for polyps <10 mm). The accuracy in the differentiation between lesions and normal colonic wall increased with the height implemented as divisor, reached 91% and was obtained by the dynamic cut-off described by the formula: cut-off(A,V,H) = 1.1[0.4A + 0.6V]/H + 69.8. The CT value of colonic polyps or carcinomas can be increased extrinsically by scanning in the phase in which 0.4A + 0.6V reaches its maximum. Differentiating lesions from normal colon based on CT values is possible in contrast-enhanced CTC and
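
    The reported phase- and size-adjusted threshold can be evaluated directly from the formula quoted above; a minimal Python sketch with illustrative (non-patient) input values:

        def dynamic_ct_cutoff(aorta_hu, vein_hu, lesion_height_mm):
            # Cut-off from the abstract: cut-off(A, V, H) = 1.1 * [0.4*A + 0.6*V] / H + 69.8
            # A, V: CT values (HU) of the aorta and superior mesenteric vein; H: lesion height (mm).
            return 1.1 * (0.4 * aorta_hu + 0.6 * vein_hu) / lesion_height_mm + 69.8

        # Example: an 8 mm lesion scanned at an intermediate contrast phase.
        print(round(dynamic_ct_cutoff(aorta_hu=250.0, vein_hu=180.0, lesion_height_mm=8.0), 1))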

  13. An efficient conservative cut-cell method for rigid bodies interacting with viscous compressible flows

    Science.gov (United States)

    Schneiders, Lennart; Günther, Claudia; Meinke, Matthias; Schröder, Wolfgang

    2016-04-01

    A Cartesian cut-cell method for viscous flows interacting with freely moving boundaries is presented. The method enables a sharp resolution of the embedded boundaries and strictly conserves mass, momentum, and energy. A new explicit Runge-Kutta scheme (PC-RK) is introduced by which the overall computational time is reduced by a factor of up to 2.5. The new scheme is a predictor-corrector type reformulation of a popular class of Runge-Kutta methods which substantially reduces the computational effort for tracking the moving boundaries and subsequently reinitializing the solver impairing neither stability nor accuracy. The structural motion is computed by an implicit scheme with good stability properties due to a strong-coupling strategy and the conservative discretization of the flow solver at the material interfaces. A new formulation for the treatment of small cut cells is proposed with high accuracy and robustness for arbitrary geometries based on a weighted Taylor-series approach solved via singular-value decomposition. The efficiency and the accuracy of the new method are demonstrated for several three-dimensional cases of laminar and turbulent particulate flow. It is shown that the new method remains fully conservative even for large displacements of the boundaries leading to a fast convergence of the fluid-solid coupling while spurious force oscillations inherent to this class of methods are effectively suppressed. The results substantiate the good stability and accuracy properties of the scheme even on relatively coarse meshes.

  14. Laser cutting of thick sheet metals: Residual stress analysis

    Science.gov (United States)

    Arif, A. F. M.; Yilbas, B. S.; Aleem, B. J. Abdul

    2009-04-01

    Laser cutting of tailored blanks from a thick mild steel sheet is considered. Temperature and stress field in the cutting sections are modeled using the finite element method. The residual stress developed in the cutting section is determined using the X-ray diffraction (XRD) technique and is compared with the predictions. The structural and morphological changes in the cut section are examined using the optical microscopy and scanning electron microscopy (SEM). It is found that temperature and von Mises stress increase sharply in the cutting section, particularly in the direction normal to the cutting direction. The residual stress remains high in the region close to the cutting section.

  15. An Integrated Method Based on PSO and EDA for the Max-Cut Problem

    Directory of Open Access Journals (Sweden)

    Geng Lin

    2016-01-01

    Full Text Available The max-cut problem is an NP-hard combinatorial optimization problem with many real world applications. In this paper, we propose an integrated method based on particle swarm optimization and estimation of distribution algorithm (PSO-EDA) for solving the max-cut problem. The integrated algorithm overcomes the shortcomings of particle swarm optimization and estimation of distribution algorithm. To enhance the performance of the PSO-EDA, a fast local search procedure is applied. In addition, a path relinking procedure is developed to intensify the search. To evaluate the performance of PSO-EDA, extensive experiments were carried out on two sets of benchmark instances with 800 to 20000 vertices from the literature. Computational results and comparisons show that PSO-EDA significantly outperforms the existing PSO-based and EDA-based algorithms for the max-cut problem. Compared with other best performing algorithms, PSO-EDA is able to find very competitive results in terms of solution quality.

  16. An Integrated Method Based on PSO and EDA for the Max-Cut Problem.

    Science.gov (United States)

    Lin, Geng; Guan, Jian

    2016-01-01

    The max-cut problem is an NP-hard combinatorial optimization problem with many real world applications. In this paper, we propose an integrated method based on particle swarm optimization and estimation of distribution algorithm (PSO-EDA) for solving the max-cut problem. The integrated algorithm overcomes the shortcomings of particle swarm optimization and estimation of distribution algorithm. To enhance the performance of the PSO-EDA, a fast local search procedure is applied. In addition, a path relinking procedure is developed to intensify the search. To evaluate the performance of PSO-EDA, extensive experiments were carried out on two sets of benchmark instances with 800 to 20,000 vertices from the literature. Computational results and comparisons show that PSO-EDA significantly outperforms the existing PSO-based and EDA-based algorithms for the max-cut problem. Compared with other best performing algorithms, PSO-EDA is able to find very competitive results in terms of solution quality.
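
    The full PSO-EDA hybrid is beyond a short sketch, but its "fast local search" ingredient is easy to illustrate: repeatedly flip the vertex with the largest cut-value gain until no flip helps. The Python toy below does this on a small weighted graph; it is not the authors' algorithm.

        import random

        def cut_value(edges, side):
            # Total weight of edges whose endpoints lie on different sides of the cut.
            return sum(w for u, v, w in edges if side[u] != side[v])

        def flip_gain(edges, side, v):
            # Change in cut value if vertex v is moved to the other side.
            gain = 0
            for a, b, w in edges:
                if a == v or b == v:
                    gain += -w if side[a] != side[b] else w
            return gain

        def local_search_maxcut(edges, n, restarts=20):
            best_side, best_val = None, float("-inf")
            for _ in range(restarts):
                side = [random.random() < 0.5 for _ in range(n)]
                while True:
                    gain, v = max((flip_gain(edges, side, v), v) for v in range(n))
                    if gain <= 0:
                        break
                    side[v] = not side[v]
                val = cut_value(edges, side)
                if val > best_val:
                    best_side, best_val = side[:], val
            return best_side, best_val

        # Tiny weighted graph given as (u, v, weight) triples.
        edges = [(0, 1, 3), (0, 2, 1), (1, 2, 2), (1, 3, 4), (2, 3, 1), (0, 3, 2)]
        print(local_search_maxcut(edges, n=4))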

  17. NOLB : Non-linear rigid block normal mode analysis method.

    Science.gov (United States)

    Hoffmann, Alexandre; Grudinin, Sergei

    2017-04-05

    We present a new conceptually simple and computationally efficient method for non-linear normal mode analysis called NOLB. It relies on the rotations-translations of blocks (RTB) theoretical basis developed by Y.-H. Sanejouand and colleagues. We demonstrate how to physically interpret the eigenvalues computed in the RTB basis in terms of angular and linear velocities applied to the rigid blocks and how to construct a non-linear extrapolation of motion out of these velocities. The key observation of our method is that the angular velocity of a rigid block can be interpreted as the result of an implicit force, such that the motion of the rigid block can be considered as a pure rotation about a certain center. We demonstrate the motions produced with the NOLB method on three different molecular systems and show that some of the lowest frequency normal modes correspond to the biologically relevant motions. For example, NOLB detects the spiral sliding motion of the TALE protein, which is capable of rapid diffusion along its target DNA. Overall, our method produces better structures compared to the standard approach, especially at large deformation amplitudes, as we demonstrate by visual inspection, energy and topology analyses, and also by the MolProbity service validation. Finally, our method is scalable and can be applied to very large molecular systems, such as ribosomes. Standalone executables of the NOLB normal mode analysis method are available at https://team.inria.fr/nano-d/software/nolb-normal-modes. A graphical user interface created for the SAMSON software platform will be made available at https://www.samson-connect.net.

  18. Fast beam cut-off method in RF-knockout extraction for spot-scanning

    Science.gov (United States)

    Furukawa, Takuji; Noda, Koji

    2002-08-01

    An irradiation method with magnetic scanning has been developed in order to provide accurate irradiation even for an irregular target shape. The scanning method has strongly required a lower ripple of the beam spill and a faster response to beam-on/off in slow extraction from a synchrotron ring. At HIMAC, RF-knockout extraction has utilized a bunched beam to reduce the beam-spill ripple. Therefore, particles near the resonance can be spilled out from the separatrices by synchrotron oscillation as well as by a transverse RF field. From this point of view, a fast beam cut-off method has been proposed and verified by both simulations and experiments. The maximum delay from the beam cut-off signal to beam-off has been improved to around 60 μs from 700 μs by a usual method. Unwanted dose has been considerably reduced by around a factor of 10 compared with that by the usual method.

  19. Experimental study on variations in Charpy impact energies of low carbon steel, depending on welding and specimen cutting method

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Zhaorui; Kang, Hansaem; Lee, Young Seog [Chung-Ang University, Seoul (Korea, Republic of)

    2016-05-15

    This paper presents an experimental study that examines variations of Charpy impact energy of a welded steel plate, depending upon the welding method and the method for obtaining the Charpy specimens. Flux cored arc welding (FCAW) and Gas tungsten arc welding (GTAW) were employed to weld an SA516 Gr. 70 steel plate. The methods of wire cutting and water-jet cutting were adopted to take samples from the welded plate. The samples were machined according to the recommendations of ASTM SEC. II SA370, in order to fit the specimen dimension that the Charpy impact test requires. An X-ray diffraction (XRD) method was used to measure the as-weld residual stress and its redistribution after the samples were cut. The Charpy impact energy of specimens was considerably dependent on the cutting methods and locations in the welded plate where the specimens were taken. The specimens that were cut by water jet followed by FCAW have the greatest resistance-to-fracture (Charpy impact energy). Regardless of which welding method was used, redistributed transverse residual stress becomes compressive when the specimens are prepared using water-jet cutting. Meanwhile, redistributed transverse residual stress becomes tensile when the specimens are prepared using wire cutting.

  20. NOLB: Nonlinear Rigid Block Normal Mode Analysis Method

    OpenAIRE

    Hoffmann, Alexandre; Grudinin, Sergei

    2017-01-01

    International audience; We present a new conceptually simple and computationally efficient method for non-linear normal mode analysis called NOLB. It relies on the rotations-translations of blocks (RTB) theoretical basis developed by Y.-H. Sanejouand and colleagues. We demonstrate how to physically interpret the eigenvalues computed in the RTB basis in terms of angular and linear velocities applied to the rigid blocks and how to construct a non-linear extrapolation of motion out of these velo...

  1. Analysis of methods for determining cutting resistance of tools on the shearer drum of a shearer loader

    Energy Technology Data Exchange (ETDEWEB)

    Krauze, K. (Akademia Gorniczo-Hutnicza, Cracow (Poland). Instytut Maszyn Gorniczych, Przerobczych i Automatyki)

    1989-01-01

    Comparatively evaluates methods for determining resistance of coal cut by a single radial or tangential tool on the shearer of a shearer loader. Methods developed by Bieron, Ewans, Nishimatsu and Merchants as well as empirical methods used in Poland are analyzed. Evaluations show that none of the methods is a universal method. The method of empirical formulae should be used for optimization of shearer loader drive systems. Ewans' method should be used for determining cutting resistance of brittle rocks when tangential tools are used. Bieron's and Nishimatsu's methods are used for radial tools and firm coal with plastic properties. Merchants' and Bieron's methods for determining coal resistance to cutting by tangential tools produce inaccurate results. 14 refs.

  2. Laser capture microdissection: Arcturus(XT) infrared capture and UV cutting methods.

    Science.gov (United States)

    Gallagher, Rosa I; Blakely, Steven R; Liotta, Lance A; Espina, Virginia

    2012-01-01

    Laser capture microdissection (LCM) is a technique that allows the precise procurement of enriched cell populations from a heterogeneous tissue under direct microscopic visualization. LCM can be used to harvest the cells of interest directly or can be used to isolate specific cells by ablating the unwanted cells, resulting in histologically enriched cell populations. The fundamental components of laser microdissection technology are (a) visualization of the cells of interest via microscopy, (b) transfer of laser energy to a thermolabile polymer with either the formation of a polymer-cell composite (capture method) or transfer of laser energy via an ultraviolet laser to photovolatize a region of tissue (cutting method), and (c) removal of cells of interest from the heterogeneous tissue section. Laser energy supplied by LCM instruments can be infrared (810 nm) or ultraviolet (355 nm). Infrared lasers melt thermolabile polymers for cell capture, whereas ultraviolet lasers ablate cells for either removal of unwanted cells or excision of a defined area of cells. LCM technology is applicable to an array of applications including mass spectrometry, DNA genotyping and loss-of-heterozygosity analysis, RNA transcript profiling, cDNA library generation, proteomics discovery, and signal kinase pathway profiling. This chapter describes the unique features of the Arcturus(XT) laser capture microdissection instrument, which incorporates both infrared capture and ultraviolet cutting technology in one instrument, using a proteomic downstream assay as a model.

  3. Tapping torque test for cutting fluid evaluation: test method and procedure

    DEFF Research Database (Denmark)

    Belluco, Walter; De Chiffre, Leonardo

    Tapping torque is a parameter closely connected to the lubricating effect of a cutting fluid. Tapping involves many small cutting edges in continuous contact with the work throughout the cut. The design of the tools and the nature of this operation shield the edges of the tool from the flow of th...

  4. Penalized versions of the Newman-Girvan modularity and their relation to normalized cuts and k-means clustering.

    Science.gov (United States)

    Bolla, Marianna

    2011-07-01

    Two penalized versions of the Newman-Girvan modularity, a balanced and a normalized one, are introduced and estimated by the non-negative eigenvalues of the modularity and normalized modularity matrix, respectively. In this way, the partition of the vertices that maximizes the modularity can be obtained by applying the k-means algorithm to the representatives of the vertices based on the eigenvectors belonging to the largest positive eigenvalues of the modularity or normalized modularity matrix. The proper dimension depends on the number of structural eigenvalues of positive sign, while dominating negative eigenvalues indicate an anticommunity structure; the balance between the negative and the positive eigenvalues determines whether the underlying graph has a community, anticommunity, or random-like structure.
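
    As a hedged illustration of the spectral recipe summarized above (eigenvectors belonging to the largest positive eigenvalues of the modularity matrix, followed by k-means on the vertex representatives), the following minimal Python sketch uses NumPy and scikit-learn. It implements only the plain Newman-Girvan modularity matrix, not the penalized balanced/normalized variants introduced in the paper, and the example graph and parameter choices are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def modularity_spectral_clusters(A, k):
    """Cluster graph vertices via eigenvectors of the modularity matrix.

    A : symmetric adjacency matrix (NumPy array), k : number of clusters.
    A generic sketch of the spectral recipe, not the penalized variants
    introduced in the paper.
    """
    deg = A.sum(axis=1)
    m2 = deg.sum()                       # 2 * number of edges
    B = A - np.outer(deg, deg) / m2      # Newman-Girvan modularity matrix
    evals, evecs = np.linalg.eigh(B)
    # keep eigenvectors belonging to the largest positive eigenvalues
    idx = np.argsort(evals)[::-1][:k - 1]
    idx = [i for i in idx if evals[i] > 0] or [int(np.argmax(evals))]
    X = evecs[:, idx]                    # vertex representatives
    return KMeans(n_clusters=k, n_init=10).fit_predict(X)

# Example: two 5-cliques joined by a single edge
A = np.zeros((10, 10))
A[:5, :5] = 1
A[5:, 5:] = 1
np.fill_diagonal(A, 0)
A[4, 5] = A[5, 4] = 1
print(modularity_spectral_clusters(A, 2))
```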

  5. Physicochemical, microbial and sensory quality of fresh-cut red beetroots in relation to sanitization method and storage duration

    Directory of Open Access Journals (Sweden)

    Dulal Chandra

    2015-06-01

    Effects of sanitization and storage on fresh-cut beetroots (Beta vulgaris L.) were evaluated following sanitation-peeling-cutting (SPC), peeling-sanitation-cutting (PSC) and peeling-cutting-sanitation (PCS) methods, with (Cl) or without (TW) 100 ppm chlorine solution, then packaged in polyethylene bags and stored at 5°C for up to 14 days. Chroma values of fresh-cut beetroots significantly declined whereas whiteness index and titratable acidity values increased; however, texture and total soluble solid contents showed no significant variation. Betalain contents decreased gradually and total phenol content showed an inconsistent trend. PCS-Cl treated samples accounted for a higher betalain decline and received lower visual quality scores despite their lower total aerobic bacterial count. The minimum microbial population was observed in the PSC-Cl method along with higher levels of betalain contents. Considering pigment retention, microbial and visual qualities, beetroots sanitized with chlorine water following the PSC method gave the best processing route for fresh-cut beetroots and therefore the PSC-Cl treatment could commercially be used for processing of fresh-cut beetroots.

  6. A Novel Method for 3D Face Detection and Normalization

    Directory of Open Access Journals (Sweden)

    Robert Niese

    2007-09-01

    When automatically analyzing images of human faces, either for recognition in biometry applications or facial expression analysis in human-machine interaction, one has to cope with challenges caused by differences in head pose, illumination and expression. In this article we propose a new stereo-based method for effectively solving the pose problem through 3D face detection and normalization. The proposed method applies model-based matching and is especially intended for the study of facial features and the description of their dynamic changes in image sequences under the assumption of non-cooperative persons. In our work, we are currently implementing a new application to observe and analyze single faces of post-operative patients. In the proposed method, face detection is based on color-driven clustering of 3D points derived from stereo. A mesh model is matched with the post-processed face cluster using a variant of the Iterative Closest Point algorithm (ICP). Pose is derived from correspondence. Then, pose and model information are used for the synthesis of the face normalization. Results show that stereo and color are powerful cues for finding the face and its pose under a wide range of poses, illuminations and expressions (PIE). Head orientation may vary in out-of-plane rotations up to ±45°.

  7. Transfer of Pathogens from Cantaloupe Rind to Preparation Surfaces and Edible Tissue as a Function of Cutting Method.

    Science.gov (United States)

    Shearer, Adrienne E H; LeStrange, Kyle; Castañeda Saldaña, Rafael; Kniel, Kalmia E

    2016-05-01

    Whole and cut cantaloupes have been implicated as vehicles in foodborne illness outbreaks of norovirus, salmonellosis, and listeriosis. Preparation methods that minimize pathogen transfer from external surfaces to the edible tissue are needed. Two preparation methods were compared for the transfer of Listeria monocytogenes, Salmonella enterica serovar Typhimurium LT2, murine norovirus, and Tulane virus from inoculated cantaloupe rinds to edible tissue and preparation surfaces. For the first method, cantaloupes were cut into eighths, and edible tissue was separated from the rind and cubed with the same knife used to open the cantaloupes. For the second method, cantaloupes were scored with a knife around the circumference sufficient to allow manual separation of the cantaloupes into halves. Edible tissue was scooped with a spoon and did not contact the preparation surface touched by the rind. Bacteria and virus were recovered from the rinds, preparation surfaces, and edible tissue and enumerated by culture methods and reverse transcription, quantitative PCR, respectively. Standard plate counts were determined throughout refrigerated storage of cantaloupe tissue. Cut method 2 yielded approximately 1 log lower recovery of L. monocytogenes and Salmonella Typhimurium from edible tissue, depending on the medium in which the bacteria were inoculated. A slight reduction was observed in murine norovirus recovered from edible tissue by cut method 2. The Tulane virus was detected in approximately half of the sampled cantaloupe tissue and only at very low levels. Aerobic mesophilic colony counts were lower through day 6 of storage for buffered peptone water-inoculated cantaloupes prepared by cut method 2. No differences were observed in environmental contamination as a function of cutting method. Although small reductions in contamination of edible tissue were observed for cut method 2, the extent of microbial transfer underscores the importance of preventing contamination of

  8. Solution of N-S equations based on the quadtree cut cell method

    Institute of Scientific and Technical Information of China (English)

    KASE; Kiwamu

    2009-01-01

    Exploiting the quadtree data structure, a new mesh generation method is proposed that decomposes a background domain into square meshes and uses a cut-cell approach to represent arbitrary boundaries, so that the generated grids easily retain good orthogonality. The finite volume solution of the N-S equations on this kind of unstructured mesh is derived. The mesh generator and N-S solver are implemented to study two benchmark cases, i.e. a lid-driven flow within an inclined square and a natural convection heat transfer flow in a square duct with an inner hot circular face. The simulation results are in agreement with the benchmark values, verifying that the present methodology is valid and will be a strong tool for two-dimensional flow and heat transfer simulations, especially in the case of complex boundaries.
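
    To make the quadtree/cut-cell idea concrete, here is a minimal Python sketch that recursively refines square cells that are "cut" by an assumed circular boundary. It shows only the mesh-refinement step; the finite-volume discretization of the N-S equations described in the record is not reproduced, and all names and parameter values are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Cell:
    x: float      # lower-left corner x
    y: float      # lower-left corner y
    size: float   # edge length of the square cell

def is_cut(cell, inside):
    """A cell is 'cut' when the boundary passes through it, i.e. its corners
    are not all inside or all outside the body."""
    corners = [(cell.x + dx, cell.y + dy)
               for dx in (0.0, cell.size) for dy in (0.0, cell.size)]
    flags = [inside(px, py) for px, py in corners]
    return any(flags) and not all(flags)

def quadtree(cell, inside, depth, max_depth):
    """Refine square cells that are cut by the boundary (quadtree refinement)."""
    if depth == max_depth or not is_cut(cell, inside):
        return [cell]
    h = cell.size / 2.0
    children = [Cell(cell.x, cell.y, h), Cell(cell.x + h, cell.y, h),
                Cell(cell.x, cell.y + h, h), Cell(cell.x + h, cell.y + h, h)]
    return [leaf for c in children
            for leaf in quadtree(c, inside, depth + 1, max_depth)]

# Example: refine around a circular cylinder of radius 0.25 centred in a unit box
inside_circle = lambda px, py: (px - 0.5) ** 2 + (py - 0.5) ** 2 < 0.25 ** 2
leaves = quadtree(Cell(0.0, 0.0, 1.0), inside_circle, 0, 5)
print(len(leaves), "leaf cells")
```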

  9. Fabrication of thin diamond membranes by using hot implantation and ion-cut methods

    Science.gov (United States)

    Suk, Jaekwon; Kim, Hyeongkwon; Lim, Weon Cheol; Yune, Jiwon; Moon, Sung; Eliades, John A.; Kim, Joonkon; Lee, Jaeyong; Song, Jonghan

    2017-03-01

    A thin (2 μm) and relatively large area (3 × 3 mm2) diamond membrane was fabricated by cleaving a surface from a single crystal chemical vapor deposition (CVD) diamond wafer (3 × 3 mm2× 300 μm) using a hot implantation and ion-cut method. First, while maintaining the CVD diamond at 400 °C, a damage zone was created at a depth of 2.3 μm underneath the surface by implanting 4 MeV carbon ions into the diamond in order to promote membrane cleavage (hot implantation). According to TEM data, hot implantation reduces the thickness of the implantation damage zone by about a factor of 10 when compared to implanting carbon ions with the CVD diamond at room temperature (RT). In order to recover crystallinity, the implanted sample was then annealed at 850 °C. Next, 380 keV hydrogen ions were implanted into the sample to a depth of 2.3 μm below the surface with the CVD diamond at RT. After annealing at 850 °C, the CVD diamond surface layer was cleaved at the damage-zone due to internal pressure from H2 gas arising from the implanted hydrogen (ion-cut). A thin layer of graphite (˜300 nm) on the cleavage surface, arising from the implanted carbon, was removed by O2 annealing. This technique can potentially be used to produce much larger area membranes of variable thickness.

  10. New Graphical Methods and Test Statistics for Testing Composite Normality

    Directory of Open Access Journals (Sweden)

    Marc S. Paolella

    2015-07-01

    Full Text Available Several graphical methods for testing univariate composite normality from an i.i.d. sample are presented. They are endowed with correct simultaneous error bounds and yield size-correct tests. As all are based on the empirical CDF, they are also consistent for all alternatives. For one test, called the modified stabilized probability test, or MSP, a highly simplified computational method is derived, which delivers the test statistic and also a highly accurate p-value approximation, essentially instantaneously. The MSP test is demonstrated to have higher power against asymmetric alternatives than the well-known and powerful Jarque-Bera test. A further size-correct test, based on combining two test statistics, is shown to have yet higher power. The methodology employed is fully general and can be applied to any i.i.d. univariate continuous distribution setting.
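
    The MSP test itself is not specified in this record, so the sketch below shows only a generic ECDF-based check of composite normality in the same spirit: a Kolmogorov-Smirnov-type distance to a normal CDF with estimated parameters, with a Lilliefors-style Monte Carlo p-value. It is a hedged illustration using NumPy/SciPy, not the tests proposed in the paper.

```python
import numpy as np
from scipy import stats

def ecdf_normality_stat(x):
    """KS-type distance between the empirical CDF and a normal CDF with
    estimated mean/std (composite null)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    z = stats.norm.cdf(x, loc=x.mean(), scale=x.std(ddof=1))
    d_plus = np.max(np.arange(1, n + 1) / n - z)
    d_minus = np.max(z - np.arange(0, n) / n)
    return max(d_plus, d_minus)

def lilliefors_pvalue(x, n_sim=2000, rng=np.random.default_rng(0)):
    """Monte Carlo p-value under the composite normal null (location-scale
    invariance lets us simulate standard normals)."""
    d_obs = ecdf_normality_stat(x)
    n = len(x)
    sims = np.array([ecdf_normality_stat(rng.standard_normal(n))
                     for _ in range(n_sim)])
    return float(np.mean(sims >= d_obs))

sample = np.random.default_rng(1).exponential(size=100)  # clearly non-normal
print(lilliefors_pvalue(sample))                          # small p-value expected
```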

  11. Methods and Research for Multi-Component Cutting Force Sensing Devices and Approaches in Machining.

    Science.gov (United States)

    Liang, Qiaokang; Zhang, Dan; Wu, Wanneng; Zou, Kunlin

    2016-11-16

    Multi-component cutting force sensing systems in manufacturing processes applied to cutting tools are gradually becoming the most significant monitoring indicator. Their signals have been extensively applied to evaluate the machinability of workpiece materials, predict cutter breakage, estimate cutting tool wear, control machine tool chatter, determine stable machining parameters, and improve surface finish. Robust and effective sensing systems with capability of monitoring the cutting force in machine operations in real time are crucial for realizing the full potential of cutting capabilities of computer numerically controlled (CNC) tools. The main objective of this paper is to present a brief review of the existing achievements in the field of multi-component cutting force sensing systems in modern manufacturing.

  12. Methods and Research for Multi-Component Cutting Force Sensing Devices and Approaches in Machining

    Directory of Open Access Journals (Sweden)

    Qiaokang Liang

    2016-11-01

    Multi-component cutting force sensing systems in manufacturing processes applied to cutting tools are gradually becoming the most significant monitoring indicator. Their signals have been extensively applied to evaluate the machinability of workpiece materials, predict cutter breakage, estimate cutting tool wear, control machine tool chatter, determine stable machining parameters, and improve surface finish. Robust and effective sensing systems with capability of monitoring the cutting force in machine operations in real time are crucial for realizing the full potential of cutting capabilities of computer numerically controlled (CNC) tools. The main objective of this paper is to present a brief review of the existing achievements in the field of multi-component cutting force sensing systems in modern manufacturing.

  13. A novel method for the analysis of slash cuts to clothing

    OpenAIRE

    Bostock, Esta; Parkes, Gareth M. B.; Williams, Graham

    2013-01-01

    Slash attacks form a major element of physical assaults involving a sharp implement such as a knife. If the slash attack is inflicted onto a surface covered with fabric, such as clothing, then that fabric may receive a slash cut. Investigation of the slash cut can provide further information as to the nature of the implement and the action of the attack. This study aims to identify a quantifiable correlation between the nature of the slash cut and the implement causing said slash cut....

  14. Simulation of metal cutting using the particle finite-element method and a physically based plasticity model

    Science.gov (United States)

    Rodríguez, J. M.; Jonsén, P.; Svoboda, A.

    2017-01-01

    Metal cutting is one of the most common metal-shaping processes. In this process, specified geometrical and surface properties are obtained through the break-up of material and removal by a cutting edge into a chip. The chip formation is associated with large strains, high strain rates and locally high temperatures due to adiabatic heating. These phenomena together with numerical complications make modeling of metal cutting difficult. Material models, which are crucial in metal-cutting simulations, are usually calibrated based on data from material testing. Nevertheless, the magnitudes of strains and strain rates involved in metal cutting are several orders of magnitude higher than those generated from conventional material testing. Therefore, a highly desirable feature is a material model that can be extrapolated outside the calibration range. In this study, a physically based plasticity model based on dislocation density and vacancy concentration is used to simulate orthogonal metal cutting of AISI 316L. The material model is implemented into an in-house particle finite-element method software. Numerical simulations are in agreement with experimental results, but also with previous results obtained with the finite-element method.
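
    To give a flavour of the "physically based plasticity model" mentioned above, the following Python sketch integrates a generic Kocks-Mecking dislocation-density evolution and converts it to flow stress with the Taylor equation. This is only an assumed, simplified analogue: the paper's model additionally tracks vacancy concentration and rate/temperature effects, and all parameter values below are illustrative, not calibrated for AISI 316L.

```python
import numpy as np

def flow_stress_curve(eps_max=1.0, n_steps=200, rho0=1e12,
                      k1=1e8, k2=10.0, alpha=0.3, M=3.06,
                      G=79e9, b=2.58e-10, sigma0=150e6):
    """Integrate a Kocks-Mecking dislocation-density evolution
    d(rho)/d(eps) = k1*sqrt(rho) - k2*rho and return the flow stress
    sigma = sigma0 + alpha*M*G*b*sqrt(rho) (Taylor equation)."""
    eps = np.linspace(0.0, eps_max, n_steps)
    d_eps = eps[1] - eps[0]
    rho = np.empty(n_steps)
    rho[0] = rho0
    for i in range(1, n_steps):
        # storage term (k1*sqrt(rho)) minus dynamic recovery (k2*rho)
        rho[i] = rho[i - 1] + (k1 * np.sqrt(rho[i - 1]) - k2 * rho[i - 1]) * d_eps
    sigma = sigma0 + alpha * M * G * b * np.sqrt(rho)
    return eps, sigma

eps, sigma = flow_stress_curve()
print(f"flow stress at eps = 1.0: {sigma[-1] / 1e6:.0f} MPa")
```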

  15. A Symplectic Method to Generate Multivariate Normal Distributions

    CERN Document Server

    Baumgarten, Christian

    2012-01-01

    The AMAS group at the Paul Scherrer Institute developed an object oriented library for high performance simulation of high intensity ion beam transport with space charge. Such particle-in-cell (PIC) simulations require a method to generate multivariate particle distributions as starting conditions. In a preceeding publications it has been shown that the generators of symplectic transformations in two dimensions are a subset of the real Dirac matrices (RDMs) and that few symplectic transformations are required to transform a quadratic Hamiltonian into diagonal form. Here we argue that the use of RDMs is well suited for the generation of multivariate normal distributions with arbitrary covariances. A direct and simple argument supporting this claim is that this is the "natural" way how such distributions are formed. The transport of charged particle beams may serve as an example: An uncorrelated gaussian distribution of particles starting at some initial position of the accelerator is subject to linear deformat...

  16. Multivariate meta-analysis of prognostic factor studies with multiple cut-points and/or methods of measurement.

    Science.gov (United States)

    Riley, Richard D; Elia, Eleni G; Malin, Gemma; Hemming, Karla; Price, Malcolm P

    2015-07-30

    A prognostic factor is any measure that is associated with the risk of future health outcomes in those with existing disease. Often, the prognostic ability of a factor is evaluated in multiple studies. However, meta-analysis is difficult because primary studies often use different methods of measurement and/or different cut-points to dichotomise continuous factors into 'high' and 'low' groups; selective reporting is also common. We illustrate how multivariate random effects meta-analysis models can accommodate multiple prognostic effect estimates from the same study, relating to multiple cut-points and/or methods of measurement. The models account for within-study and between-study correlations, which utilises more information and reduces the impact of unreported cut-points and/or measurement methods in some studies. The applicability of the approach is improved with individual participant data and by assuming a functional relationship between prognostic effect and cut-point to reduce the number of unknown parameters. The models provide important inferential results for each cut-point and method of measurement, including the summary prognostic effect, the between-study variance and a 95% prediction interval for the prognostic effect in new populations. Two applications are presented. The first reveals that, in a multivariate meta-analysis using published results, the Apgar score is prognostic of neonatal mortality but effect sizes are smaller at most cut-points than previously thought. In the second, a multivariate meta-analysis of two methods of measurement provides weak evidence that microvessel density is prognostic of mortality in lung cancer, even when individual participant data are available so that a continuous prognostic trend is examined (rather than cut-points). © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
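
    As context for the multivariate models described above, the sketch below implements only the familiar univariate building block: a DerSimonian-Laird random-effects meta-analysis of one prognostic effect (e.g. one cut-point). It is a simplified assumption-laden illustration; the paper's multivariate models, which handle correlated estimates from multiple cut-points and measurement methods within a study, are not reproduced, and the input numbers are invented.

```python
import numpy as np

def dersimonian_laird(y, v):
    """Univariate random-effects meta-analysis (DerSimonian-Laird).

    y : per-study prognostic effect estimates (e.g. log hazard ratios)
    v : their within-study variances
    Returns the summary effect, its standard error and the between-study
    variance tau^2."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)                 # heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (len(y) - 1)) / c)            # between-study variance
    w_star = 1.0 / (v + tau2)
    mu = np.sum(w_star * y) / np.sum(w_star)           # summary effect
    se = np.sqrt(1.0 / np.sum(w_star))
    return mu, se, tau2

mu, se, tau2 = dersimonian_laird([0.42, 0.30, 0.55, 0.12],
                                 [0.02, 0.05, 0.03, 0.04])
print(f"summary log-HR = {mu:.3f} +/- {1.96 * se:.3f}, tau^2 = {tau2:.3f}")
```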

  17. A new data normalization method for unsupervised anomaly intrusion detection

    Institute of Scientific and Technical Information of China (English)

    Long-zheng CAI; Jian CHEN; Yun KE; Tao CHEN; Zhi-gang LI

    2010-01-01

    Unsupervised anomaly detection can detect attacks without the need for clean or labeled training data. This paper studies the application of clustering to unsupervised anomaly detection (ACUAD). Data records are mapped to a feature space. Anomalies are detected by determining which points lie in the sparse regions of the feature space. A critical element for this method to be effective is the definition of the distance function between data records. We propose a unified normalization distance framework for records with mixed numeric and nominal features. A heuristic method that computes the distance for nominal features is proposed, taking advantage of an important characteristic of nominal features: their probability distribution. Then, robust methods are proposed for mapping numeric features and computing their distance, these being able to tolerate differences in scale and diversification among features, and outliers introduced by intrusions. Empirical experiments with the KDD 1999 dataset showed that ACUAD can detect intrusions with relatively low false alarm rates compared with other approaches.
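
    The sketch below conveys the general idea of such a mixed numeric/nominal distance: numeric features are scaled robustly (median/MAD) and nominal mismatches are weighted by how frequent the observed values are. It is an assumed, simplified stand-in written in Python, not the exact ACUAD distance, and the example records and weights are invented.

```python
import numpy as np
from collections import Counter

def mixed_distance(a, b, numeric_idx, nominal_idx, mad, freqs):
    """Distance between two records with numeric and nominal features.

    Numeric features are scaled by the median absolute deviation (robust to
    outliers); nominal features contribute a frequency-weighted mismatch,
    so disagreement on a rare value counts less than on a common one."""
    d = 0.0
    for j in numeric_idx:
        d += abs(float(a[j]) - float(b[j])) / (mad[j] + 1e-12)
    for j in nominal_idx:
        if a[j] != b[j]:
            d += 0.5 * (freqs[j][a[j]] + freqs[j][b[j]])
    return d

# Example with two numeric features and one nominal feature
records = [(1.0, 200.0, "tcp"), (1.2, 9000.0, "udp"), (0.9, 210.0, "tcp")]
numeric_idx, nominal_idx = [0, 1], [2]
cols = list(zip(*records))
med = {j: float(np.median(cols[j])) for j in numeric_idx}
mad = {j: float(np.median(np.abs(np.asarray(cols[j]) - med[j]))) for j in numeric_idx}
freqs = {j: {k: c / len(records) for k, c in Counter(cols[j]).items()}
         for j in nominal_idx}
print(mixed_distance(records[0], records[1], numeric_idx, nominal_idx, mad, freqs))
```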

  18. New approach to the normal mode method in underwater acoustics

    Institute of Scientific and Technical Information of China (English)

    王宁; 刘进忠

    2002-01-01

    A new approach to the numerical solution of normal mode problems in underwater acoustics is presented, in which the corresponding normal mode problem is transformed into the problem of solving a dynamic system. Three applications are considered: (1) the broad-band normal mode problem; (2) the range-dependent problem with perturbation proportional to the range parameter; and (3) the evolution of the normal mode with environmental parameters. A numerical simulation for a broad-band problem is performed, and the calculated eigenvalues are in good agreement with those obtained by the standard normal mode code KRAKEN.

  19. Comparison of heat effects associated with metal cutting method on ST 37 alloy steel

    Directory of Open Access Journals (Sweden)

    L. Dahil

    2014-04-01

    In this study, the effects of heat on the cut surfaces of St 37 alloy steel produced by plasma, laser and wire erosion cutting were examined, and it was determined that minimum cutting damage occurs with the wire erosion process.

  20. LOCAL ORTHOGONAL CUTTING METHOD FOR COMPUTING MEDIAL CURVES AND ITS BIOMEDICAL APPLICATIONS.

    Science.gov (United States)

    Jiao, Xiangmin; Einstein, Daniel R; Dyedov, Vladimir

    2010-03-01

    Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method called local orthogonal cutting (LOC) for computing medial curves. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques and result in an algorithm that is efficient and noise resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods.

  1. Temporary roads : nature-friendly construction method cuts road width by half and saves money

    Energy Technology Data Exchange (ETDEWEB)

    Godfrey, B.

    2010-10-15

    This article discussed a new method for building low-disturbance temporary roads that entails mulching trees and using the resulting wood-fibre material for the road bed. The size of the mulch is adjusted for conditions. A specialized whole-tree mulcher is used after the access area is logged and the felled trees cleared away. The mulcher mulches a whole tree to specs and then moves on to the next tree. Merchantable wood is removed and sold. The process is suited to flat or gently sloping areas at higher elevations, and it reduces the width of a conventional clay-based right-of-way by 50 percent. Clay-built roads require extensive ditch drainage, increasing the width needed, but moisture drains directly through mulch-based roads, which also leave the underlying root zone undisturbed, so no drainage ditches are needed. The new method reduces disturbance resulting from road width, makes use of materials that would otherwise go unused, eliminates the need to import large quantities of fill, and eliminates the risk of introducing new species to the roadway area during the reclamation process. The mulch road bed has a depth of 8 to 20 inches. Leaving the root bed undisturbed adds to the load-bearing integrity of the road. The mulch method also cuts overall access costs. The construction cost is slightly more than for clay-based roads, but the severely reduced reclamation cost results in big savings. 1 ref., 3 figs.

  2. Assessment of pulmonary dynamics in normal newborns: a pneumotachographic method.

    Science.gov (United States)

    Estol, P; Píriz, H; Pintos, L; Nieto, F; Simini, F

    1988-01-01

    A pneumotachographic method for assessment of pulmonary dynamics in critically ill newborns in an intensive care setting was developed in our laboratory. Before the results obtained with this method could be applied, the normal range of values was determined in 48 normal term and preterm newborns. Their body weights ranged between 1200 and 4100 g, and postnatal ages between 24 hours and 21 days. In three infants, two determinations were performed after an interval of 7 days. The studies were performed with a pneumotachograph applied to the upper airway by means of an inflatable face mask or latex nasal prongs. The air flow signal was electronically integrated with respect to time to produce a volume signal. Airway pressure was determined proximal to the pneumotachograph. Esophageal pressure was determined with a water-filled catheter placed in the lower third of the esophagus. Tidal volume (VT), minute ventilation (V), dynamic compliance (Cdyn), total pulmonary resistance (R), total pulmonary work (Wt), elastic work (We), and flow-resistive work (Wv) were determined. A significant linear correlation was found between Cdyn and body weight (r = 0.50, p less than 0.01), whereas no significant correlation was found between body weight and VT, V or R. Values for VT, V and Cdyn were corrected for body weight, and the means (X), standard deviations (SD) and 10th and 90th percentiles are shown in Table III, as are X, SD and percentiles for R. Wt, We and Wv were corrected for V, with X, SD and percentiles shown in Table III. Values of VT/kg, Cdyn/kg and R are similar to those found by other authors with pneumotachography and plethysmography. The V/kg values obtained by us were higher than those reported by other authors, which, together with the lack of correlation of VT and V with body weight, questions the reliability of the V values in our study. This could be explained by: 1) excessive increase in dead space in cases in which a face mask was used; 2) nociceptive stimulus
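
    As a highly simplified illustration of the pneumotachographic quantities defined above, the Python sketch below integrates a synthetic flow signal to obtain tidal volume and derives a dynamic compliance from the esophageal pressure swing between the zero-flow points. The signals, sampling rate and parameter values are invented, and real analyses (including the one in this record) require signal conditioning that is not shown.

```python
import numpy as np

def tidal_volume_and_compliance(flow, p_es, fs):
    """Very simplified single-breath calculations.

    flow : airflow in L/s (positive = inspiration)
    p_es : esophageal pressure in cmH2O
    fs   : sampling rate in Hz
    Tidal volume is the trapezoidal integral of flow over inspiration;
    dynamic compliance uses the pressure change between zero-flow points."""
    volume = np.concatenate(([0.0], np.cumsum((flow[1:] + flow[:-1]) / 2) / fs))
    vt = volume.max()                                # tidal volume, L
    i_start, i_end = 0, int(np.argmax(volume))       # zero-flow points
    cdyn = vt / abs(p_es[i_end] - p_es[i_start])     # L / cmH2O
    return vt, cdyn

fs = 100.0
t = np.arange(0.0, 2.0, 1 / fs)                      # one 2-second breath
flow = 0.05 * np.sin(np.pi * t)                      # inspiration, then expiration
volume_proxy = np.concatenate(([0.0], np.cumsum((flow[1:] + flow[:-1]) / 2) / fs))
p_es = -100.0 * volume_proxy                         # synthetic esophageal pressure
vt, cdyn = tidal_volume_and_compliance(flow, p_es, fs)
print(f"VT = {vt * 1000:.1f} mL, Cdyn = {cdyn * 1000:.1f} mL/cmH2O")
```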

  3. Design of Batch Distillation Columns Using Short-Cut Method at Constant Reflux

    Directory of Open Access Journals (Sweden)

    Asteria Narvaez-Garcia

    2013-01-01

    A short-cut method for batch distillation columns working at constant reflux was applied to solve a problem of four components that needed to be separated and purified to a mole fraction of 0.97 or better. Distillation columns with 10, 20, 30, 40, and 50 theoretical stages were used; the reflux ratio was varied between 2 and 20. Three quality indexes were used and compared: Luyben's capacity factor, total annual cost, and annual profit. The best combinations of theoretical stages and reflux ratio were obtained for each method. It was found that the best combinations always required reflux ratios close to the minimum. Overall, annual profit was the best quality index, while the best combination was a distillation column with 30 stages, and reflux ratios of 2.0 for the separation of benzene (i), 5.0 for the separation of toluene (ii), and 20 for the separation of ethylbenzene (iii) and purification of o-xylene (iv).

  4. Examination of bacteriophage as a biocontrol method for salmonella on fresh-cut fruit: a model study.

    Science.gov (United States)

    Leverentz, B; Conway, W S; Alavidze, Z; Janisiewicz, W J; Fuchs, Y; Camp, M J; Chighladze, E; Sulakvelidze, A

    2001-08-01

    The preparation and distribution of fresh-cut produce is a rapidly developing industry that provides the consumer with convenient and nutritious food. However, fresh-cut fruits and vegetables may represent an increased food safety concern because of the absence or damage of peel and rind, which normally help reduce colonization of uncut produce with pathogenic bacteria. In this study, we found that Salmonella Enteritidis populations can (i) survive on fresh-cut melons and apples stored at 5 degrees C, (ii) increase up to 2 log units on fresh-cut fruits stored at 10 degrees C, and (iii) increase up to 5 log units at 20 degrees C during a storage period of 168 h. In addition, we examined the effect of lytic, Salmonella-specific phages on reducing Salmonella numbers in experimentally contaminated fresh-cut melons and apples stored at various temperatures. We found that the phage mixture reduced Salmonella populations by approximately 3.5 logs on honeydew melon slices stored at 5 and 10 degrees C and by approximately 2.5 logs on slices stored at 20 degrees C, which is greater than the maximal amount achieved using chemical sanitizers. However, the phages did not significantly reduce Salmonella populations on the apple slices at any of the three temperatures. The titer of the phage preparation remained relatively stable on melon slices, whereas on apple slices the titer decreased to nondetectable levels in 48 h at all temperatures tested. Inactivation of phages, possibly by the acidic pH of apple slices (pH 4.2 versus pH 5.8 for melon slices), may have contributed to their inability to reduce Salmonella contamination in the apple slices. Higher phage concentrations and/or the use of low-pH-tolerant phage mutants may be required to increase the efficacy of the phage treatment in reducing Salmonella contamination of fresh-cut produce with a low pH.

  5. Risk identification and risk mitigation during metro station construction by enlarging shield tunnel combined with cut-and-cover method

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Constructing a metro station by enlarging shield tunnels combined with a mining/cut-and-cover method provides a new way to resolve the conflict between the construction schedules of shield tunnels and stations. As a new-style construction method, it involves several specific risks in the construction process. Based on the test section of Sanyuanqiao station on Beijing metro line 10, and combined with the existing methods of risk identification at present, including a review of world-wide operational ...

  6. A high-order adaptive Cartesian cut-cell method for simulation of compressible viscous flow over immersed bodies

    Science.gov (United States)

    Muralidharan, Balaji; Menon, Suresh

    2016-09-01

    A new adaptive finite volume conservative cut-cell method that is third-order accurate for simulation of compressible viscous flows is presented. A high-order reconstruction approach using cell centered piecewise polynomial approximation of flow quantities, developed in the past for body-fitted grids, is now extended to the Cartesian based cut-cell method. It is shown that the presence of cut-cells of very low volume results in numerical oscillations in the flow solution near the embedded boundaries when standard small cell treatment techniques are employed. A novel cell clustering approach for polynomial reconstruction in the vicinity of the small cells is proposed and is shown to achieve smooth representation of flow field quantities and their derivatives on immersed interfaces. It is further shown through numerical examples that the proposed clustering method achieves the design order of accuracy and is fairly insensitive to the cluster size. Results are presented for canonical flow past a single cylinder and a sphere at different flow Reynolds numbers to verify the accuracy of the scheme. Investigations are then performed for flow over two staggered cylinders and the results are compared with prior data for the same configuration. All the simulations are carried out with both quadratic and cubic reconstruction, and the results indicate a clear improvement with the cubic reconstruction. The new cut-cell approach with cell clustering is able to predict accurate results even at relatively low resolutions. The ability of the high-order cut-cell method in handling sharp geometrical corners and narrow gaps is also demonstrated using various examples. Finally, three-dimensional flow interactions between a pair of spheres in cross flow is investigated using the proposed cut-cell scheme. The results are shown to be in excellent agreement with past studies, which employed body-fitted grids for studying this complex case.

  7. A measurement method of cutting tool position for relay fabrication of microstructured surface

    Science.gov (United States)

    Chen, Yuan-Liu; Gao, Wei; Ju, Bing-Feng; Shimizu, Yuki; Ito, So

    2014-06-01

    By using the secondary function of a force sensor integrated fast tool servo (FTS) for surface profile measurement, the three-dimensional tip position of a micro-cutting tool in the FTS with respect to the fabricated microstructures was measured without using any additional instrument for realizing the concept of relay fabrication of microstructured surface. It was verified from the experiments for testing the basic performances of tool tip position measurement that the delay of the force feedback control loop of the FTS was a big factor influencing the position measurement accuracy. A bidirectional scanning strategy was then employed to reduce the position measurement error due to the delay of the feedback control loop. Tool tip position measurement experiments by using micro-tools with a nose radius of 100 µm for relay fabrications with sub-micrometer accuracies, including stitching fabrication of a micro-groove line array and filling fabrication of a microlens lattice pattern, were carried out to demonstrate the feasibility of the tool position measurement method.

  8. Association between Pseudonocardia symbionts and Atta leaf-cutting ants suggested by improved isolation methods.

    Science.gov (United States)

    Marsh, Sarah E; Poulsen, Michael; Gorosito, Norma B; Pinto-Tomás, Adrián; Masiulionis, Virginia E; Currie, Cameron R

    2013-03-01

    Fungus-growing ants associate with multiple symbiotic microbes, including Actinobacteria for production of antibiotics. The best studied of these bacteria are within the genus Pseudonocardia, which in most fungus-growing ants are conspicuously visible on the external cuticle of workers. However, given that fungus-growing ants in the genus Atta do not carry visible Actinobacteria on their cuticle, it is unclear if this genus engages in the symbiosis with Pseudonocardia. Here we explore whether improving culturing techniques can allow for successful isolation of Pseudonocardia from Atta cephalotes leaf-cutting ants. We obtained Pseudonocardia from 9 of 11 isolation method/colony component combinations from all 5 colonies intensively sampled. The most efficient technique was bead-beating workers in phosphate buffer solution, then plating the suspension on carboxymethylcellulose medium. Placing these strains in a fungus-growing ant-associated Pseudonocardia phylogeny revealed that while some strains grouped with clades of Pseudonocardia associated with other genera of fungus-growing ants, a large portion of the isolates fell into two novel phylogenetic clades previously not identified from this ant-microbe symbiosis. Our findings suggest that Pseudonocardia may be associated with Atta fungus-growing ants, potentially internalized, and that localizing the symbiont and exploring its role is necessary to shed further light on the association.

  9. Study on Platinum Coating Depth in Focused Ion Beam Diamond Cutting Tool Milling and Methods for Removing Platinum Layer

    Directory of Open Access Journals (Sweden)

    Woong Kirl Choi

    2015-09-01

    In recent years, nanomachining has attracted increasing attention in advanced manufacturing science and technologies as a value-added process to control material structures, components, devices, and nanoscale systems. To make sub-micro patterns on these products, micro/nanoscale single-crystal diamond cutting tools are essential. Popular non-contact methods for the macro/micro processing of diamond composites are pulsed laser ablation (PLA) and electric discharge machining (EDM). However, for manufacturing nanoscale diamond tools, these machining methods are not appropriate. Despite diamond's extreme physical properties, diamond can be micro/nano machined relatively easily using a focused ion beam (FIB) technique. In the FIB milling process, the surface properties of the diamond cutting tool are affected by the amorphous damage layer caused by gallium ion collision and implantation, which influences the cutting edge sharpness and increases the number of processing steps. To protect the diamond substrate, a protective platinum (Pt) coating layer is essential in diamond FIB milling. In this study, the depth of the Pt coating layer that could decrease process-induced damage during FIB fabrication is investigated, along with methods for removing the Pt coating layer from diamond tools. The optimum Pt coating depth has been confirmed, which is very important for maintaining cutting tool edge sharpness and decreasing processing procedures. The ultra-precision grinding method and etching with aqua regia have been investigated for removing the Pt coating layer. Experimental results show that when the diamond cutting tool width is bigger than 500 nm, the ultra-precision grinding method is appropriate for removing the Pt coating layer on the diamond tool. However, the ultra-precision grinding method is not recommended for removing the Pt coating layer when the cutting tool width is smaller than 500 nm, because the possibility that the diamond

  10. Analysis micro-mechanism of burrs formation in the case of nanometric cutting process using numerical simulation method

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Burrs (exit failure), being one of the important factors influencing the final precision of a workpiece, have been widely studied. Today, with the development of manufacturing technology, the depth of cut falls into the range of nanometers or sub-nanometers, and different disciplines may dominate the burr generation process. The molecular dynamics (MD) method, which differs from continuum mechanics, plays an important role in describing the microscopic world. Taking single-crystal copper as an example, this paper carries out an MD analysis of the micro-mechanism of the burr generation process. The results show that the burr geometry depends on the type of workpiece (ductile or brittle). The depth of cut decreases during the positive burr generation process, while it increases during the negative burr generation process.

  11. Modified Cheeger and ratio cut methods using the Ginzburg-Landau functional for classification of high-dimensional data

    Science.gov (United States)

    Merkurjev, Ekaterina; Bertozzi, Andrea; Yan, Xiaoran; Lerman, Kristina

    2017-07-01

    Recent advances in clustering have included continuous relaxations of the Cheeger cut problem and those which address its linear approximation using the graph Laplacian. In this paper, we show how to use the graph Laplacian to solve the fully nonlinear Cheeger cut problem, as well as the ratio cut optimization task. Both problems are connected to total variation minimization, and the related Ginzburg-Landau functional is used in the derivation of the methods. The graph framework discussed in this paper is undirected. The resulting algorithms are efficient ways to cluster the data into two classes, and they can be easily extended to the case of multiple classes, or used on a multiclass data set via recursive bipartitioning. In addition to showing results on benchmark data sets, we also show an application of the algorithm to hyperspectral video data.
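
    For orientation, the Python sketch below shows only the *linear* spectral approximation that the paper starts from: bipartitioning a graph by thresholding the Fiedler vector of the unnormalized graph Laplacian. The paper's actual contribution, minimizing a Ginzburg-Landau relaxation of the fully nonlinear Cheeger/ratio cut, is not reproduced here, and the similarity graph construction below is an illustrative assumption.

```python
import numpy as np

def ratio_cut_bipartition(W):
    """Two-class clustering by thresholding the Fiedler vector of the
    unnormalized graph Laplacian L = D - W (linear ratio-cut relaxation)."""
    D = np.diag(W.sum(axis=1))
    L = D - W
    evals, evecs = np.linalg.eigh(L)
    fiedler = evecs[:, 1]            # eigenvector of the 2nd smallest eigenvalue
    return (fiedler > np.median(fiedler)).astype(int)   # median split

# Two noisy clusters with a Gaussian similarity graph
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (10, 2)), rng.normal(3, 0.3, (10, 2))])
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-d2 / 0.5)
np.fill_diagonal(W, 0.0)
print(ratio_cut_bipartition(W))
```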

  12. Electromyography normalization methods for high-velocity muscle actions: review and recommendations.

    Science.gov (United States)

    Ball, Nick; Scurr, Joanna

    2013-10-01

    Electromyograms used to assess neuromuscular demand during high-velocity tasks require normalization to aid interpretation. This paper posits that, to date, methodological approaches to normalization have been ineffective and have limited the application of electromyography (EMG). There has been minimal investigation of alternative normalization methods, which must be corrected to improve the application of EMG in sports. It is recognized that differing normalization methods prevent cross-study comparisons. Users of EMG should aim to identify normalization methods that provide good reliability and a representative measure of muscle activation. The shortcomings of current normalization methods in the assessment of high-velocity muscle actions are evident. Advances in assessing alternative normalization methods have been made in cycling and sprinting. It is advised that when normalizing high-intensity muscle actions, isometric methods are used with caution and a dynamic alternative, in which the muscle action is similar to that of the task, is preferred. It is recognized that optimal normalization methods may be muscle and task dependent.
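
    The contrast between isometric and dynamic reference values discussed above can be shown with a tiny Python sketch: the same EMG envelope is expressed either as %MVC (isometric reference) or as a percentage of a dynamic, task-specific peak. The signals below are synthetic and assumed to be rectified and low-pass filtered already; this is an illustration of the normalization step only, not the review's recommended protocol.

```python
import numpy as np

def normalize_emg(envelope, reference_envelope):
    """Express an EMG linear envelope as a percentage of the peak of a
    reference trial.  An isometric MVC reference gives %MVC; a dynamic,
    task-specific reference gives the dynamic normalization discussed in
    the review for high-velocity actions."""
    return 100.0 * np.asarray(envelope) / np.max(reference_envelope)

# Synthetic example: a sprint-like burst normalized two different ways
t = np.linspace(0.0, 1.0, 500)
task = np.abs(np.sin(8 * np.pi * t)) * 1.4       # task envelope, mV
mvc = np.full_like(t, 1.0)                       # isometric MVC envelope, mV
print("peak as %MVC          :", normalize_emg(task, mvc).max())    # 140 %
print("peak as % dynamic peak:", normalize_emg(task, task).max())   # 100 %
```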

  13. Design and Analysis of Bionic Cutting Blades Using Finite Element Method.

    Science.gov (United States)

    Li, Mo; Yang, Yuwang; Guo, Li; Chen, Donghui; Sun, Hongliang; Tong, Jin

    2015-01-01

    The praying mantis is one of the most efficient predators in the insect world, possessing a pair of powerful tools: two sharp and strong forelegs. Its femur and tibia are both armed with a double row of strong spines along their posterior edges, which can firmly grasp the prey when the femur and tibia fold onto each other during capture. These spines are so sharp that they can easily and quickly cut into the prey. The geometrical characteristics of the praying mantis's foreleg, especially its tibia, have important reference value for the design of agricultural soil-cutting tools. Learning from the profile and arrangement of these spines, cutting blades with a tooth profile were designed in this work. Two different sizes of tooth structure and arrangement were utilized in the design of the cutting edge. A conventional smooth-edge blade was used for comparison with the bionic serrate-edge blades. To compare the working efficiency of the conventional blade and the bionic blades, 3D finite element simulation analysis and experimental measurements were carried out in the present work. Both the simulation and experimental results indicated that the bionic serrate-edge blades showed better performance in cutting efficiency.

  14. Design and Analysis of Bionic Cutting Blades Using Finite Element Method

    Directory of Open Access Journals (Sweden)

    Mo Li

    2015-01-01

    The praying mantis is one of the most efficient predators in the insect world, possessing a pair of powerful tools: two sharp and strong forelegs. Its femur and tibia are both armed with a double row of strong spines along their posterior edges, which can firmly grasp the prey when the femur and tibia fold onto each other during capture. These spines are so sharp that they can easily and quickly cut into the prey. The geometrical characteristics of the praying mantis's foreleg, especially its tibia, have important reference value for the design of agricultural soil-cutting tools. Learning from the profile and arrangement of these spines, cutting blades with a tooth profile were designed in this work. Two different sizes of tooth structure and arrangement were utilized in the design of the cutting edge. A conventional smooth-edge blade was used for comparison with the bionic serrate-edge blades. To compare the working efficiency of the conventional blade and the bionic blades, 3D finite element simulation analysis and experimental measurements were carried out in the present work. Both the simulation and experimental results indicated that the bionic serrate-edge blades showed better performance in cutting efficiency.

  15. Evaluation of three vacuum packaging methods for retail beef loin cuts.

    Science.gov (United States)

    Strydom, Phillip E; Hope-Jones, Michelle

    2014-12-01

    Meat from beef T-bone cuts was packaged as follows: (1) Sub-primal cuts vacuum packaged (VP) in shrink bags, aged for 14 days, portioned, VP again and aged for a further 7 days (VPR), (2) individual T-bone steaks VP in shrink bags aged for 21 days (VPP), and (3) individual T-bone steaks aged in vacuum-skin packaging (VSP) for 21 days. VSP recorded less purge and showed higher oxymyoglobin values after 2 days and higher chroma after 3 days of aerobic display (P < 0.001) than VPR and VPP. Similar differences in colour stability were recorded for VPP compared to VPR.

  16. Manufacturing Methods for Cutting, Machining and Drilling Composites. Volume 1. Composites Machining Handbook

    Science.gov (United States)

    1978-08-01

    [Fragmentary OCR excerpt from the handbook, including: nozzle diameters of 0.003 to 0.014 inch; a note that a hand-held cutting head has recently been marketed; Section 4.5 on a reciprocating mechanical cutter (Producto Machine Co. Model 4F, 1/4 inch diameter chuck, 0-350 strokes/minute, 16,000 rpm with router motor, hand feed); and Figure 9-3, "Cutting Tool Cost Summary", comparing manual and N/C cutting tool costs.]

  17. A Simulation Method of Soft Tissue Cutting In Virtual Environment with Haptics

    Directory of Open Access Journals (Sweden)

    Prasad V. Suryawanshi

    2015-07-01

    Virtual simulation currently has an increasing role in the medical field, and virtual surgery simulation has been widely explored as a complement to traditional surgical training. Modeling the behavior of soft tissue during cutting is quite complex; hence, virtual environments are used to develop realistic surgical instruments that provide exact force feedback to the surgeon during surgical operations and to simulate soft tissue processes. The scalpel is a basic instrument required for soft tissue simulation. We therefore design a virtual organ that can be cut with a scalpel in a haptic environment.

  18. A systematic evaluation of normalization methods in quantitative label-free proteomics.

    Science.gov (United States)

    Välikangas, Tommi; Suomi, Tomi; Elo, Laura L

    2016-10-02

    To date, mass spectrometry (MS) data remain inherently biased as a result of reasons ranging from sample handling to differences caused by the instrumentation. Normalization is the process that aims to account for the bias and make samples more comparable. The selection of a proper normalization method is a pivotal task for the reliability of the downstream analysis and results. Many normalization methods commonly used in proteomics have been adapted from the DNA microarray techniques. Previous studies comparing normalization methods in proteomics have focused mainly on intragroup variation. In this study, several popular and widely used normalization methods representing different strategies in normalization are evaluated using three spike-in and one experimental mouse label-free proteomic data sets. The normalization methods are evaluated in terms of their ability to reduce variation between technical replicates, their effect on differential expression analysis and their effect on the estimation of logarithmic fold changes. Additionally, we examined whether normalizing the whole data globally or in segments for the differential expression analysis has an effect on the performance of the normalization methods. We found that variance stabilization normalization (Vsn) reduced variation the most between technical replicates in all examined data sets. Vsn also performed consistently well in the differential expression analysis. Linear regression normalization and local regression normalization performed also systematically well. Finally, we discuss the choice of a normalization method and some qualities of a suitable normalization method in the light of the results of our evaluation.
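
    As a concrete example of one of the simpler global strategies compared above, the Python sketch below performs median normalization of a log2-intensity matrix (shift each sample so its median matches the overall median). Variance stabilization normalization (Vsn) and the regression-based methods need dedicated implementations and are not shown; the data below are synthetic.

```python
import numpy as np

def median_normalize(log_intensities):
    """Global median normalization of a peptides x samples matrix of log2
    intensities: shift every sample so its median matches the overall
    median of the sample medians."""
    X = np.asarray(log_intensities, dtype=float)
    sample_medians = np.nanmedian(X, axis=0)
    target = np.nanmedian(sample_medians)
    return X - sample_medians + target

rng = np.random.default_rng(0)
# 1000 peptides x 6 samples, with a different additive bias per sample
X = rng.normal(20, 2, size=(1000, 6)) + rng.normal(0, 1, size=6)
Xn = median_normalize(X)
print(np.round(np.median(X, axis=0), 2))    # biased sample medians
print(np.round(np.median(Xn, axis=0), 2))   # medians now (almost) equal
```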

  19. Application of Taguchi method for cutting force optimization in rock sawing by circular diamond sawblades

    Indian Academy of Sciences (India)

    Izzet Karakurt

    2014-10-01

    In this paper, an optimization study was carried out for the cutting force (Fc) acting on circular diamond sawblades in rock sawing. The peripheral speed, traverse speed, cut depth and flow rate of cooling fluid were considered as operating variables and optimized by using Taguchi approach for the Fc. An L16(4^4) orthogonal array was chosen for the experimental trials based on the operating variables and their levels. The signal-to-noise (S/N) ratios and the analysis of variance (ANOVA) were employed to determine optimum cutting conditions and significant operating variables, respectively. The Fc was also modelled based on the operating variables using regression analysis. The developed model was then verified using various statistical approaches. The results revealed that the operating variables of circular diamond sawblades can be precisely optimized by Taguchi approach for the Fc in rock sawing. The cut depth and peripheral speed were statistically determined as the significant operating variables affecting Fc. Furthermore, the modelling results showed that the proposed model can be effectively used for the prediction of Fc in practical applications.
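
    The core Taguchi calculation referred to above can be sketched in a few lines of Python: compute the "smaller-the-better" S/N ratio (assuming the cutting force is to be minimized) and average it per factor level to obtain main effects. The run data below are invented and the function names are illustrative; the ANOVA and regression steps of the paper are not reproduced.

```python
import numpy as np

def sn_smaller_the_better(y):
    """Taguchi signal-to-noise ratio for a 'smaller-the-better' response
    such as the cutting force Fc: S/N = -10*log10(mean(y^2))."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

def main_effects(levels, responses):
    """Mean S/N ratio per level of one factor (higher is better)."""
    levels = np.asarray(levels)
    responses = np.asarray(responses, dtype=float)
    return {int(lv): sn_smaller_the_better(responses[levels == lv])
            for lv in np.unique(levels)}

# Hypothetical 8-run excerpt: cut depth levels and measured cutting forces (N)
cut_depth = np.array([1, 1, 2, 2, 3, 3, 4, 4])
fc = np.array([120.0, 130.0, 180.0, 170.0, 240.0, 255.0, 310.0, 300.0])
print(main_effects(cut_depth, fc))   # pick the level with the highest S/N
```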

  20. Calculation Method for Normal Induced Longitudinal Voltage on Pilot Cable

    Directory of Open Access Journals (Sweden)

    Abdelaziz B.M. Kamel,

    2014-09-01

    In this paper, a full study and detailed calculation of the induced voltage in pilot cables is carried out. An introduction first shows the importance of the induced voltage and its effect on pilot cables. The first calculation method addresses the flat formation and the second the trefoil formation. The results obtained with the two methods are then compared, and conclusions are drawn.

  1. Angoff Method of Setting Cut Scores for High-Stakes Testing: Foley Catheter Checkoff as an Exemplar.

    Science.gov (United States)

    Kardong-Edgren, Suzan; Mulcock, Pamela M

    2016-01-01

    The Angoff method is a commonly used and legally defensible method for setting passing or cut scores for high-stakes examinations. It also can be used for setting passing scores on clinical skill checklists. Two variations of the Angoff method were compared with a traditional and arbitrary 75% passing score, using a Foley catheter insertion checklist as an exemplar. Both Angoff methods produced slightly lower scores than our traditional scoring; because of "must pass" steps on our checklist, 12 of 13 students still failed the evaluation. The project uncovered multiple variations of checklists within different courses and variations in teaching practices for this skill.
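
    The basic Angoff computation referred to above is simple enough to show directly: each judge estimates, per checklist item, the probability that a minimally competent candidate performs it correctly, and the cut score is the mean of those consensus estimates. The Python sketch below is a generic version with invented ratings, not the two specific Angoff variations compared in the article; "must pass" steps are only flagged, mirroring the checklist behaviour described.

```python
import numpy as np

def angoff_cut_score(ratings, must_pass=None):
    """Classic Angoff cut score from judges' item ratings.

    ratings   : judges x items array of judged probabilities that a minimally
                competent candidate performs each item correctly
    must_pass : optional indices of critical items scored pass/fail
                regardless of the numeric cut score
    Returns the cut score as a percentage of items plus the critical items."""
    R = np.asarray(ratings, dtype=float)
    item_difficulty = R.mean(axis=0)          # consensus per checklist step
    cut = 100.0 * item_difficulty.mean()      # expected % of steps performed
    return cut, (list(must_pass) if must_pass else [])

# Three judges rating a 5-step checklist (hypothetical values)
ratings = [[0.90, 0.70, 0.80, 0.60, 0.95],
           [0.85, 0.75, 0.70, 0.65, 0.90],
           [0.95, 0.80, 0.75, 0.70, 1.00]]
cut, critical = angoff_cut_score(ratings, must_pass=[4])
print(f"cut score = {cut:.1f}% of checklist steps; must-pass steps: {critical}")
```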

  2. Full-sun mist-spray method for twig cuttings of Rosa Davurica Pall (全光雾喷法在刺玫果嫩枝扦插上的应用)

    Institute of Scientific and Technical Information of China (English)

    周军方; 段进潮; 徐岩; 任跃英; 王远亮; 肖兴利; 王秀萍

    2011-01-01

    Full-sun mist-spray cutting propagation is a rapid seedling-raising technique that has seen wide application in recent years; it is characterized by fast rooting and a high propagation rate, and is an important route for the rapid propagation of superior cultivars. Summer softwood (twig) cutting trials of Rosa davurica were carried out using both conventional cutting and the full-sun mist-spray method. The results show that cuttings soaked for 16 hours in IBA at a concentration of 2×10^-5 g/mL and propagated with the full-sun mist-spray method achieved a rooting rate of up to 96%.

  3. Fabricating 40 µm-thin silicon solar cells with different orientations by using SLiM-cut method

    Science.gov (United States)

    Wang, Teng-Yu; Chen, Chien-Hsun; Shiao, Jui-Chung; Chen, Sung-Yu; Du, Chen-Hsun

    2017-10-01

    Thin silicon foils with different crystal orientations were fabricated using the stress-induced lift-off (SLiM-cut) method. The thickness of the silicon foils was approximately 40 µm. The foil of one orientation had a smoother surface than that of the other. With surface passivation, the minority carrier lifetimes of the two silicon foils were 1.0 µs and 1.6 µs, respectively. In this study, 4 cm2 thin silicon solar cells with heterojunction structures were fabricated. The energy conversion efficiencies were determined to be 10.74% and 14.74% for the two orientations, respectively. The surface quality of the silicon foils was determined to affect the solar cell characteristics. This study demonstrated that fabricating solar cells by using silicon foil obtained from the SLiM-cut method is feasible.

  4. Methods of Maintaining Steady Oil Production In Late High- Water-Cut Stage in Daqing Oilfields

    Institute of Scientific and Technical Information of China (English)

    Liu Heng; Wang Jiaying

    1994-01-01

    By the end of 1990, Daqing Oilfields had been producing for 30 years with water flooding. The composite water cut was as high as 79%. According to the general pattern of development of large-scale sandstone reservoirs, the oilfields had entered their late stage of production, characterized by a rapid decline in oil output. It is very difficult to maintain steady oil production without a breakthrough in oilfield development technologies.

  5. Assessment of underground mining of the Nussir copper deposit: with special emphasis on cut-off and mining method

    OpenAIRE

    Sletten, Audun Mortveit

    2012-01-01

    The narrow, steeply dipping, strata-bound, sediment-hosted copper deposit Nussir, located in Kvalsund municipality, Finnmark, Norway, has been the subject of an assessment of underground mining. Resources classified as Indicated within a 0.9% Cu cut-off limit have been identified as suitable for the sublevel open stoping mining method, accessed by a 590 m long tunnel from the fjord, a 2000 m haulage tunnel in the footwall progressing westward along strike, and two individual ramps separated by 1280 m along strike. Stop...

  6. Evaluation of Combined Disinfection Methods for Reducing Escherichia coli O157:H7 Population on Fresh-Cut Vegetables

    Directory of Open Access Journals (Sweden)

    Eva Petri

    2015-07-01

    Most current disinfection strategies for the fresh-cut industry are focused on the use of different chemical agents; however, very little has been reported on the effectiveness of the hurdle technology. The effect of combined decontamination methods based on the use of different sanitizers (peroxyacetic acid and chlorine dioxide) and the application of pressure (vacuum/positive pressure) on the inactivation of the foodborne pathogen E. coli O157:H7 on fresh-cut lettuce (Lactuca sativa) and carrots (Daucus carota) was studied. Fresh produce, inoculated with E. coli O157:H7, was immersed (4 °C, 2 min) in tap water (W), chlorine water (CW), chlorine dioxide (ClO2: 2 mg/L) and peroxyacetic acid (PAA: 100 mg/L) in combination with (a) vacuum (V: 10 mbar) or (b) positive pressure application (P: 3 bar). The product quality and antimicrobial effects of the treatment on bacterial counts were determined both in the process washing water and on the fresh-cut produce. Evidence obtained in this study suggests that the use of combined methods (P/V + sanitizers) results in a reduction of the microorganism population on produce similar to that found at atmospheric pressure. Moreover, the application of physical methods led to a significant detrimental effect on the visual quality of lettuce regardless of the solution used. Concerning the process water, PAA proved to be an effective alternative to chlorine for the avoidance of cross-contamination.

  7. A Cutting Pattern Recognition Method for Shearers Based on Improved Ensemble Empirical Mode Decomposition and a Probabilistic Neural Network

    Directory of Open Access Journals (Sweden)

    Jing Xu

    2015-10-01

    In order to guarantee the stable operation of shearers and promote the construction of an automatic coal mining working face, an online cutting pattern recognition method with high accuracy and speed based on Improved Ensemble Empirical Mode Decomposition (IEEMD) and a Probabilistic Neural Network (PNN) is proposed. An industrial microphone is installed on the shearer and the cutting sound is collected as the recognition criterion to overcome the disadvantages of giant size, contact measurement and low identification rate of traditional detectors. To avoid end-point effects and get rid of undesirable intrinsic mode function (IMF) components in the initial signal, IEEMD is conducted on the sound. End-point continuation based on the practical storage data is performed first to overcome the end-point effect. Next, the average correlation coefficient, which is calculated from the correlation of the first IMF with the others, is introduced to select essential IMFs. Then the energy and standard deviation of the remaining IMFs are extracted as features and PNN is applied to classify the cutting patterns. Finally, a simulation example, with an accuracy of 92.67%, and an industrial application prove the efficiency and correctness of the proposed method.
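
    The IMF-selection and feature-extraction steps described above can be sketched compactly in Python, assuming the IEEMD decomposition (end-point continuation, noise ensemble) has already produced an array of IMFs. The selection rule (keep IMFs whose correlation with the first IMF exceeds the average correlation) follows the general recipe in the record; the synthetic "IMFs" and thresholds below are illustrative assumptions, and the PNN classifier is not shown.

```python
import numpy as np

def select_and_featurize_imfs(imfs):
    """Select essential IMFs and extract simple features.

    imfs : (n_imfs x n_samples) array of IMFs from a prior decomposition.
    Keep IMFs whose absolute correlation with the first IMF exceeds the
    average correlation, then return their energies and standard deviations
    as a feature vector for a downstream classifier."""
    imfs = np.asarray(imfs, dtype=float)
    corr = np.array([abs(np.corrcoef(imfs[0], imf)[0, 1]) for imf in imfs])
    keep = corr >= corr.mean()                      # essential IMFs
    selected = imfs[keep]
    energy = np.sum(selected ** 2, axis=1)
    std = np.std(selected, axis=1)
    return np.concatenate([energy, std])            # feature vector

# Toy example with three synthetic "IMFs"
t = np.linspace(0, 1, 1000)
imfs = np.vstack([
    np.sin(2 * np.pi * 50 * t),
    0.5 * np.sin(2 * np.pi * 50 * t) + 0.1 * np.sin(2 * np.pi * 5 * t),
    0.05 * np.random.default_rng(0).standard_normal(t.size),
])
print(select_and_featurize_imfs(imfs).round(3))
```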

  8. Axisymmetric planar cracks in finite hollow cylinders of transversely isotropic material: Part II—cutting method for finite cylinders

    Science.gov (United States)

    Pourseifi, M.; Faal, R. T.; Asadi, E.

    2017-06-01

    This paper is the outcome of a companion Part I paper devoted to finite hollow cylinders of transversely isotropic material. The paper provides the solution for the crack tip stress intensity factors of a system of coaxial axisymmetric planar cracks in a transversely isotropic finite hollow cylinder. The lateral surfaces of the hollow cylinder are under two inner and outer self-equilibrating distributed shear loadings. First, the stress fields due to these loadings are given for both infinite and finite cylinders. In the next step, the state of stress in an infinite hollow cylinder of transversely isotropic material containing axisymmetric prismatic and radial dislocations is taken from the Part I paper. Next, using the distributed dislocation technique, the mixed mode crack problem in the finite cylinder is reduced to Cauchy-type singular integral equations for the dislocation densities on the surfaces of the cracks. The problem of a cracked finite hollow cylinder is treated by the cutting method; i.e., the infinite cylinder is cut to a finite one by slicing it with two annular axisymmetric cracks at its ends. The cutting method is validated by comparing the state of stress of a sliced intact infinite cylinder with that of an intact finite cylinder. The paper is furnished with several examples to study the effect of crack type and location in finite cylinders on the ensuing stress intensity factors of the cracks and the interaction between the cracks.

  9. Análisis por el Método de los Elementos Finitos de las tensiones en la zona de contacto herramienta-viruta. // Stress analysis by Finite Element Method in tool-cutting zone.

    Directory of Open Access Journals (Sweden)

    M. Rodríguez Madrigal

    2002-01-01

    Full Text Available The finite element method has been used to model the stresses in the tool-chip contact zone in an orthogonal cutting process. An updated Lagrangian formulation has been used to handle the nonlinearity of the phenomenon, and the elasto-plastic behavior of the material has been formulated by means of the Prandtl-Reuss equations and the strain-hardening theory of deformation in order to solve the elasto-plastic constitutive equation of the orthogonal cutting process. The normal and shear stresses in the tool-chip contact zone have been obtained, defining the adherence (sticking) and sliding zones. Key words: orthogonal cutting, tool-chip contact, finite element method, metal cutting process.

  10. Broadband Quantum Cutting in ZnO/Yb(Er)F3 Oxy-Fluoride Nanocomposite Prepared by Thermal Oxidation Method

    Science.gov (United States)

    Zhang, Wentao; Xiao, Siguo; Yang, Xiaoliang; Jin, Xiangliang

    2013-02-01

    Yb(Er)F3 nanoparticles combined with ZnO sheets were prepared via a two-step co-precipitation method followed by thermal oxidation. In the ZnO/Yb(Er)F3 composite phosphor, ZnO can efficiently absorb ultraviolet photons of 250-380 nm and transfer the absorbed photon energy to Er3+ ions in the fluoride particles. A subsequent quantum cutting between Er3+-Yb3+ couples in the fluoride takes place, down-converting each absorbed ultraviolet photon into two photons at 650 nm and 980 nm. The composite phosphor combines the wide absorption wavelength range and high absorption cross-section of ZnO with the high quantum cutting efficiency of the Er3+-Yb3+ co-doped fluoride, showing potential application in the enhancement of Si solar cell efficiency.

  11. CPU-GPU mixed implementation of virtual node method for real-time interactive cutting of deformable objects using OpenCL.

    Science.gov (United States)

    Jia, Shiyu; Zhang, Weizhong; Yu, Xiaokang; Pan, Zhenkuan

    2015-09-01

    Surgical simulators need to simulate interactive cutting of deformable objects in real time. The goal of this work was to design an interactive cutting algorithm that eliminates traditional cutting state classification and can work simultaneously with real-time GPU-accelerated deformation without affecting its numerical stability. A modified virtual node method for cutting is proposed. The deformable object is modeled as a real tetrahedral mesh embedded in a virtual tetrahedral mesh; the former is used for graphics rendering and collision, while the latter is used for deformation. The cutting algorithm first subdivides real tetrahedrons to eliminate all face and edge intersections, then splits faces, edges and vertices along the cutting tool trajectory to form cut surfaces. Next, virtual tetrahedrons containing more than one connected real tetrahedral fragment are duplicated, and the connectivity between virtual tetrahedrons is updated. Finally, the embedding relationship between the real and virtual tetrahedral meshes is updated. The co-rotational linear finite element method is used for deformation. Cutting and collision are processed by the CPU, while deformation is carried out by the GPU using OpenCL. The efficiency of the GPU-accelerated deformation algorithm was tested using block models with varying numbers of tetrahedrons. The effectiveness of the cutting algorithm under multiple cuts and self-intersecting cuts was tested using a block model and a cylinder model. Cutting of a more complex liver model was performed, and detailed performance characteristics of cutting, deformation and collision were measured and analyzed. The cutting algorithm can produce continuous cut surfaces when the traditional minimal element creation algorithm fails. The GPU-accelerated deformation algorithm remains stable with a constant time step under multiple arbitrary cuts and works on both NVIDIA and AMD GPUs. The GPU-CPU speed ratio can be as high as 10 for models with 80,000 tetrahedrons. Forty to sixty percent real

  12. 切割方式对直接淬火钢板的影响%Effect of cutting method on direct quenched steel plate

    Institute of Scientific and Technical Information of China (English)

    姜洪生; 张所全; 韩剑宏

    2014-01-01

    The microstructure and hardness near the cut region of a 20 mm thick, 600 MPa grade direct quenched steel plate were studied after cutting by different methods. The results show that flame cutting produces an obvious softened region near the cut face, whereas plasma cutting and laser cutting do not. Plasma and laser cutting re-quench the area close to the cut face, and its hardness is much higher than that of the matrix away from the cut face. The heat-affected area from flame cutting is larger than those from plasma and laser cutting. Considering the applicable thickness ranges of plasma and laser cutting and the size of the heat-affected area, plasma cutting is recommended for the investigated direct quenched steel plate.

  13. Generalized Normal Derivatives and Their Applications in DDMs with Nonmatching Grids and DG Methods

    Institute of Scientific and Technical Information of China (English)

    Qiya Hu

    2008-01-01

    A class of normal-like derivatives for functions with low regularity defined on Lipschitz domains is introduced and studied. It is shown that the new normal-like derivatives, which are called the generalized normal derivatives, preserve the major properties of the existing standard normal derivatives. The generalized normal derivatives are then applied to analyze the convergence of domain decomposition methods (DDMs) with nonmatching grids and discontinuous Galerkin (DG) methods for second-order elliptic problems. The approximate solutions generated by these methods still possess the optimal energy-norm error estimates, even if the exact solutions to the underlying elliptic problems admit very low regularities.

  14. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A.; van t Veld, Aart A.

    2012-01-01

    PURPOSE: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator

  15. Asymptotic Analysis of Vertical Branch-Cut Integral of Shear Waves in a Fluid-Filled Borehole Utilizing the Steepest-Descent Method

    Institute of Scientific and Technical Information of China (English)

    YAO Gui-Jin; SONG Ruo-Long; WANG Ke-Xie

    2008-01-01

    We obtain an asymptotic solution to the vertical branch-cut integral of shear waves excited by an impulsive pressure point source in a fluid-filled borehole, by taking into account the effect of the infinite singularity of the Hankel functions related to shear waves in the integrand at the shear branch point and using the method of steepest descent to expand the vertical branch-cut integral of shear waves. It is theoretically proven that the saddle point of the integrand is located at ks - i/z, where ks and z are the shear branch point and the offset, respectively. The continuous and smooth amplitude spectra and the resonant peaks of shear waves are numerically calculated from the asymptotic solution. These asymptotic results are generally in agreement with the numerical integral results. It is also found by the comparison and analysis of the two results that the resonant factor and the effect of the normal and leaking mode poles around the shear branch point lead to the two-peak characteristics of the amplitude spectra of shear waves in the resonant peak zones from the numerical integral calculations.

  16. 结合粒子群算法优化归一割的图像阈值分割方法%Image threshold segmentation approach of normalized cut and particle swarm optimization algorithm

    Institute of Scientific and Technical Information of China (English)

    任爱红

    2012-01-01

    In order to obtain the optimal threshold for image segmentation quickly, and based on graph theory, a gray-level similarity matrix takes the place of the pixel-level weight matrix and the normalized cut criterion is taken as the optimization function. A particle swarm optimization algorithm, rather than exhaustive search, is used to find the best threshold in the gray-level space. Experiments show that the method not only has a lower computational cost but also yields satisfactory segmentation results. The thresholds are more stable, the running time is greatly reduced, and the method better satisfies the real-time requirements of image segmentation.
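
    The record above describes the core idea: treat the 256 gray levels (rather than individual pixels) as graph nodes, score a candidate threshold with the normalized cut criterion, and let a particle swarm search the gray-level space instead of an exhaustive scan. The Python sketch below follows that outline under explicit assumptions: the histogram-weighted Gaussian similarity, the PSO coefficients and all function names are illustrative choices, not the formulation used in the paper.

```python
import numpy as np

def graylevel_weights(hist, sigma=10.0):
    """Hypothetical gray-level similarity: the weight between levels i and j
    combines their histogram counts with a Gaussian of the intensity difference."""
    levels = np.arange(hist.size)
    diff = levels[:, None] - levels[None, :]
    return np.outer(hist, hist) * np.exp(-(diff ** 2) / (2.0 * sigma ** 2))

def ncut_value(t, W):
    """Normalized cut of the gray-level graph split at threshold t."""
    A = np.arange(W.shape[0]) <= t
    B = ~A
    cut = W[np.ix_(A, B)].sum()
    assoc_A = W[A, :].sum()
    assoc_B = W[B, :].sum()
    if assoc_A == 0 or assoc_B == 0:
        return np.inf
    return cut / assoc_A + cut / assoc_B

def pso_threshold(W, n_particles=10, n_iter=30, seed=0):
    """Minimal 1-D particle swarm search for the threshold minimizing Ncut."""
    rng = np.random.default_rng(seed)
    lo, hi = 1, W.shape[0] - 2
    x = rng.uniform(lo, hi, n_particles)
    v = np.zeros(n_particles)
    pbest = x.copy()
    pbest_val = np.array([ncut_value(int(round(xi)), W) for xi in x])
    gbest = pbest[pbest_val.argmin()]
    for _ in range(n_iter):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([ncut_value(int(round(xi)), W) for xi in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()]
    return int(round(gbest))

# Usage with a real image: hist = np.bincount(image.ravel(), minlength=256)
rng = np.random.default_rng(1)
gray = np.clip(rng.normal(100, 30, 10000).astype(int), 0, 255)
hist = np.bincount(gray, minlength=256)
best_t = pso_threshold(graylevel_weights(hist))
```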

  17. Linear one-dimensional cutting-packing problems: numerical experiments with the sequential value correction method (SVC) and a modified branch-and-bound method (MBB)

    Directory of Open Access Journals (Sweden)

    Mukhacheva E.A.

    2000-01-01

    Full Text Available Two algorithms for the one-dimensional cutting problem, namely a modified branch-and-bound method (an exact method) and a heuristic sequential value correction method, are suggested. In order to obtain a reliable assessment of the efficiency of the algorithms, hard instances of the problem were considered, and the computational experiment suggests that the efficiency of the heuristic method is superior to that of the exact one once the computing time of the latter is taken into account. A detailed description of the two methods is given along with suggestions for their improvement.
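
    For readers unfamiliar with the underlying problem, the short Python sketch below shows a baseline first-fit-decreasing heuristic for one-dimensional cutting stock; it is only a point of reference for what the SVC and MBB methods in the record improve upon, and the stock length and piece lengths in the example are made up.

```python
def first_fit_decreasing(piece_lengths, stock_length):
    """Baseline first-fit-decreasing heuristic for one-dimensional cutting stock:
    place each piece (longest first) into the first stock bar with enough
    remaining length, opening a new bar when none fits."""
    free = []      # remaining free length per opened stock bar
    patterns = []  # pieces assigned to each bar
    for piece in sorted(piece_lengths, reverse=True):
        for i, remaining in enumerate(free):
            if piece <= remaining:
                free[i] -= piece
                patterns[i].append(piece)
                break
        else:
            free.append(stock_length - piece)
            patterns.append([piece])
    return patterns

# Example: cut pieces of the given lengths (mm) from 6000 mm stock bars
demo = first_fit_decreasing([2500, 2200, 2100, 1800, 1500, 1200, 900, 600], 6000)
```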

  18. Application of Taguchi Method for Analyzing Factors Affecting the Performance of Coated Carbide Tool When Turning FCD700 in Dry Cutting Condition

    Science.gov (United States)

    Ghani, Jaharah A.; Mohd Rodzi, Mohd Nor Azmi; Zaki Nuawi, Mohd; Othman, Kamal; Rahman, Mohd. Nizam Ab.; Haron, Che Hassan Che; Deros, Baba Md

    2011-01-01

    Machining is one of the most important manufacturing processes in modern industry, especially for finishing an automotive component after primary manufacturing processes such as casting and forging. In this study, the turning parameters of dry cutting environments (without air, normal air and chilled air), various cutting speeds, and feed rates are evaluated using a Taguchi optimization methodology. An L27 (3^13) orthogonal array, the signal-to-noise (S/N) ratio and analysis of variance (ANOVA) are employed to analyze the effect of these turning parameters on the performance of a coated carbide tool. The results show that the tool life is affected by the cutting speed, feed rate and cutting environment, with contributions of 38%, 32% and 27%, respectively. For the surface roughness, the feed rate significantly controls the machined surface produced, with a contribution of 77%, followed by the cutting environment at 19%. The cutting speed is found to be insignificant in controlling the machined surface produced. The study shows that the dry cutting environment factor should be considered in order to obtain longer tool life as well as a good machined surface.
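
    The S/N ratios referred to above are the standard Taguchi forms; a minimal sketch of the two that fit this study (larger-is-better for tool life, smaller-is-better for surface roughness) is given below, with the replicate values purely hypothetical.

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi signal-to-noise ratio for responses to maximize (e.g. tool life)."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

def sn_smaller_is_better(y):
    """Taguchi signal-to-noise ratio for responses to minimize (e.g. surface roughness)."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# Hypothetical replicate measurements for one row of an L27 orthogonal array
tool_life_min = [18.2, 17.6, 19.1]
roughness_um = [0.82, 0.78, 0.85]
print(sn_larger_is_better(tool_life_min), sn_smaller_is_better(roughness_um))
```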

  19. PROCESS CAPABILITY ESTIMATION FOR NON-NORMALLY DISTRIBUTED DATA USING ROBUST METHODS - A COMPARATIVE STUDY

    Directory of Open Access Journals (Sweden)

    Yerriswamy Wooluru

    2016-06-01

    Full Text Available Process capability indices are very important process quality assessment tools in the automotive industry. The common process capability indices (PCIs) Cp, Cpk and Cpm are widely used in practice. The use of these PCIs is based on the assumption that the process is in control and its output is normally distributed. In practice, normality is not always fulfilled. Indices developed under the normality assumption are very sensitive to non-normal processes. When the distribution of a product quality characteristic is non-normal, Cp and Cpk indices calculated using conventional methods often lead to erroneous interpretation of process capability. In the literature, various methods have been proposed for surrogate process capability indices under non-normality, but few literature sources offer a comprehensive evaluation and comparison of their ability to capture true capability in non-normal situations. In this paper, five methods are reviewed and a capability evaluation is carried out for data pertaining to the resistivity of silicon wafers. The final results revealed that the Burr-based percentile method is better than the Clements method. Modelling of non-normal data and the Box-Cox transformation method using statistical software (Minitab 14) provide reasonably good results, as they are very promising methods for non-normal and moderately skewed data (skewness <= 1.5).
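
    The percentile-based indices discussed in the record replace the 6-sigma spread of the normal case with the distance between extreme percentiles of the (possibly skewed) distribution. The sketch below computes Cp and Cpk in that spirit from empirical percentiles; using empirical rather than fitted (Clements or Burr) percentiles, and the simulated data and specification limits, are assumptions made for illustration.

```python
import numpy as np

def percentile_capability(data, lsl, usl):
    """Percentile-based process capability in the spirit of the Clements approach,
    using empirical percentiles as a simple stand-in for the fitted distribution's
    0.135th, 50th and 99.865th percentiles."""
    p_low, median, p_high = np.percentile(data, [0.135, 50.0, 99.865])
    cp = (usl - lsl) / (p_high - p_low)
    cpu = (usl - median) / (p_high - median)
    cpl = (median - lsl) / (median - p_low)
    return cp, min(cpu, cpl)

# Hypothetical skewed resistivity-like data and specification limits
rng = np.random.default_rng(2)
data = rng.lognormal(mean=2.0, sigma=0.15, size=500)
cp, cpk = percentile_capability(data, lsl=5.0, usl=12.0)
```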

  20. A non-linear branch and cut method for solving discrete minimum compliance problems to global optimality

    DEFF Research Database (Denmark)

    Stolpe, Mathias; Bendsøe, Martin P.

    2007-01-01

    This paper presents some initial results pertaining to a search for globally optimal solutions to a challenging benchmark example proposed by Zhou and Rozvany. This means that we are dealing with global optimization of the classical single load minimum compliance topology design problem with a fixed finite element discretization and with discrete design variables. Global optimality is achieved by the implementation of some specially constructed convergent nonlinear branch and cut methods, based on the use of natural relaxations and by applying strengthening constraints (linear valid inequalities...

  1. 机械化盘区通风方法%Ventilation method of mechanized panel cutting

    Institute of Scientific and Technical Information of China (English)

    王海宁; 程哲

    2012-01-01

    Based on the actual situation of underground mechanized panel ventilation in a large metal mine, and building on the successful substitution of a mine air curtain for auxiliary fans, a field application test of cavern-type fan stations was carried out. The comparative results show that the cavern-type fan station can effectively replace the traditional fan station and can be installed and operated in tunnels where a fan station with a wind wall is hard to set up, realizing a reasonable distribution of airflow in the intake airways of each panel. The cavern-type fan station can effectively induce the airflow, control short-circuiting of the flow, and increase the intake air volume of the middle level by up to 27.3 m3/s, strengthening the smoke and dust exhaust effect of the level ventilation network, which is beneficial for protecting workers' health, improving the effective air volume rate of the mine and promoting the orderly flow of mine air. A cavern-type fan station can be composed of a single fan or multiple fans and installed in a cavern in the tunnel side wall, enhancing the reliability and adaptability of the multi-stage fan station ventilation method.

  2. THE NEW METHOD SOLVING THE CUTTING-STOCK PROBLEM%求解板材排料问题的新方法

    Institute of Scientific and Technical Information of China (English)

    陈锦昌; 韩锷春

    2001-01-01

    Starting from the substance of the cutting-stock problem, a new method for that problem is developed: converting the two-dimensional layout into a one-dimensional layout and converting the overall optimal solution into local optimal solutions to obtain an approximate overall optimal solution. The three problems in the method, namely the construction of blocks, the layout of the blocks on standard boards, and the checks on straight cutting of the blocks together with the automatic marking of the cutting sizes of the blocks, are discussed in this paper.

  3. The BASIC program to analyse the polymodal frequency distribution into normal distributions with Marqualdt's method

    National Research Council Canada - National Science Library

    Akamine, T

    1984-01-01

    The method of resolving a polymodal frequency distribution into normal distributions enables, by plotting the frequencies at the mid-values of the classes, a resolution to be performed by the regression curve method...

  4. Application of the segment weight dynamic movement method to the normalization of gait EMG amplitude.

    Science.gov (United States)

    Nishijima, Y; Kato, T; Yoshizawa, M; Miyashita, M; Iida, H

    2010-06-01

    This study aims at determining the applicability of a segment weight dynamic movement (SWDM) method as an alternative to the conventional isometric maximal voluntary contraction (MVC) method for normalizing gait EMGs. The SWDM method employs reference exercises, each being a dynamic, repetitive movement of a joint under the load of the segment weight (i.e., the total weight of all segments distal to the joint). EMG amplitudes of 28 healthy male subjects walking at 120 steps/min were normalized by the two methods. CV and VR were used to assess the inter-individual variability of the normalized gait EMGs for 8 muscles. The CV and VR values attained with the two methods were close to each other, as well as to those obtained by other researchers using the isometric MVC method. These results suggest that the SWDM method has a level of applicability to gait EMG normalization comparable to that of the isometric MVC method.
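
    The inter-individual variability measures mentioned above can be computed in several ways; a minimal sketch of one common CV definition (time-averaged ratio of the across-subject standard deviation to the across-subject mean of the normalized envelopes) is shown below, with synthetic envelopes standing in for real data, and it may not match the exact formula used in the study.

```python
import numpy as np

def cv_across_subjects(envelopes):
    """Inter-individual coefficient of variation for normalized EMG envelopes.
    `envelopes` has shape (n_subjects, n_time_points); the value returned is the
    time-averaged ratio of the across-subject SD to the across-subject mean."""
    envelopes = np.asarray(envelopes, dtype=float)
    mean_t = envelopes.mean(axis=0)
    sd_t = envelopes.std(axis=0, ddof=1)
    return float(np.mean(sd_t / mean_t))

# Hypothetical normalized envelopes for 28 subjects over one gait cycle (101 points)
rng = np.random.default_rng(3)
base = np.abs(np.sin(np.linspace(0, np.pi, 101)))
envelopes = base * (1.0 + 0.2 * rng.standard_normal((28, 101))) + 0.05
cv = cv_across_subjects(envelopes)
```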

  5. High hydrostatic pressure as a method to preserve fresh-cut Hachiya persimmons: A structural approach.

    Science.gov (United States)

    Vázquez-Gutiérrez, José Luis; Quiles, Amparo; Vonasek, Erica; Jernstedt, Judith A; Hernando, Isabel; Nitin, Nitin; Barrett, Diane M

    2016-12-01

    The "Hachiya" persimmon is the most common astringent cultivar grown in California and it is rich in tannins and carotenoids. Changes in the microstructure and some physicochemical properties during high hydrostatic pressure processing (200-400 MPa, 3 min, 25 ℃) and subsequent refrigerated storage were analyzed in this study in order to evaluate the suitability of this non-thermal technology for preservation of fresh-cut Hachiya persimmons. The effects of high-hydrostatic pressure treatment on the integrity and location of carotenoids and tannins during storage were also analyzed. Significant changes, in particular diffusion of soluble compounds which were released as a result of cell wall and membrane damage, were followed using confocal microscopy. The high-hydrostatic pressure process also induced changes in physicochemical properties, e.g. electrolyte leakage, texture, total soluble solids, pH and color, which were a function of the amount of applied hydrostatic pressure and may affect the consumer acceptance of the product. Nevertheless, the results indicate that the application of 200 MPa could be a suitable preservation treatment for Hachiya persimmon. This treatment seems to improve carotenoid extractability and tannin polymerization, which could improve functionality and remove astringency of the fruit, respectively. © The Author(s) 2016.

  6. Strength on cut edge and ground edge glass beams with the failure analysis method

    Directory of Open Access Journals (Sweden)

    Stefano Agnetti

    2013-10-01

    Full Text Available The aim of this work is to study the effect of the finishing of the edge of glass when it has a structural function. Experimental investigations carried out on glass specimens are presented. Various series of annealed glass beams were tested, with cut edges and with ground edges. The glass specimens were tested in four-point bending, and flaw detection was performed on the tested specimens after failure in order to determine the glass strength. As a result, bending strength values were obtained for each specimen. By determining physical parameters such as the depth of the flaw and the mirror radius of the fracture after the failure of a glass element, it is possible to calculate its failure strength. The experimental results were analyzed with LEFM theory and the glass strength was analyzed statistically using a two-parameter Weibull distribution, which fitted the failure stress data quite well. The results obtained constitute a validation of the theoretical models and show the influence of the edge processing on the failure strength of the glass. Furthermore, series with different sizes were tested in order to evaluate the size effect.
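
    A two-parameter Weibull fit of failure stresses of the kind described above can be sketched in a few lines with SciPy by fixing the location parameter at zero; the stress values below are invented for illustration, and the shape parameter returned corresponds to the Weibull modulus.

```python
import numpy as np
from scipy import stats

# Hypothetical bending-failure stresses (MPa) for one series of glass beams
stresses = np.array([38.1, 41.5, 44.2, 45.0, 47.3, 49.8, 51.2, 53.6, 56.0, 59.4])

# Two-parameter Weibull fit: fix the location at zero so only the shape
# (Weibull modulus) and scale (characteristic strength) are estimated
shape, loc, scale = stats.weibull_min.fit(stresses, floc=0)

# Probability of failure at a given stress level under the fitted distribution
p_fail_45 = stats.weibull_min.cdf(45.0, shape, loc=loc, scale=scale)
```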

  7. A method to dynamic stochastic multicriteria decision making with log-normally distributed random variables.

    Science.gov (United States)

    Wang, Xin-Fan; Wang, Jian-Qiang; Deng, Sheng-Yue

    2013-01-01

    We investigate dynamic stochastic multicriteria decision making (SMCDM) problems in which the criterion values take the form of log-normally distributed random variables and the argument information is collected from different periods. We propose two new geometric aggregation operators, namely the log-normal distribution weighted geometric (LNDWG) operator and the dynamic log-normal distribution weighted geometric (DLNDWG) operator, and develop a method for dynamic SMCDM with log-normally distributed random variables. This method uses the DLNDWG operator and the LNDWG operator to aggregate the log-normally distributed criterion values, utilizes the entropy model of Shannon to generate the time weight vector, and utilizes the expectation values and variances of the log-normal distributions to rank the alternatives and select the best one. Finally, an example is given to illustrate the feasibility and effectiveness of the developed method.
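
    One way to see why a weighted geometric operator is a natural fit for log-normally distributed arguments is sketched below; the independence assumption is ours, made for the sake of the closed form, and is not stated in the abstract.

```latex
\text{If } \ln X_i \sim N(\mu_i,\sigma_i^2) \text{ (independent) and } w_i \ge 0,\ \sum_{i=1}^{n} w_i = 1, \text{ then}
\qquad G=\prod_{i=1}^{n} X_i^{\,w_i}=\exp\!\Big(\sum_{i=1}^{n} w_i \ln X_i\Big),
\qquad \ln G \sim N\!\Big(\sum_{i=1}^{n} w_i\mu_i,\ \sum_{i=1}^{n} w_i^2\sigma_i^2\Big),
\qquad E[G]=\exp\!\Big(\sum_{i=1}^{n} w_i\mu_i+\tfrac{1}{2}\sum_{i=1}^{n} w_i^2\sigma_i^2\Big),
```

    so the aggregate of log-normal criterion values stays log-normal, and its expectation and variance (used in the record's ranking step) follow from the usual log-normal formulas.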

  8. A Method to Dynamic Stochastic Multicriteria Decision Making with Log-Normally Distributed Random Variables

    Directory of Open Access Journals (Sweden)

    Xin-Fan Wang

    2013-01-01

    Full Text Available We investigate the dynamic stochastic multicriteria decision making (SMCDM) problems, in which the criterion values take the form of log-normally distributed random variables, and the argument information is collected from different periods. We propose two new geometric aggregation operators, namely the log-normal distribution weighted geometric (LNDWG) operator and the dynamic log-normal distribution weighted geometric (DLNDWG) operator, and develop a method for dynamic SMCDM with log-normally distributed random variables. This method uses the DLNDWG operator and the LNDWG operator to aggregate the log-normally distributed criterion values, utilizes the entropy model of Shannon to generate the time weight vector, and utilizes the expectation values and variances of log-normal distributions to rank the alternatives and select the best one. Finally, an example is given to illustrate the feasibility and effectiveness of this developed method.

  9. Highly stretchable and shape-controllable three-dimensional antenna fabricated by “Cut-Transfer-Release” method

    Science.gov (United States)

    Yan, Zhuocheng; Pan, Taisong; Yao, Guang; Liao, Feiyi; Huang, Zhenlong; Zhang, Hulin; Gao, Min; Zhang, Yin; Lin, Yuan

    2017-02-01

    Recent progress on Kirigami-inspired methods provides a new idea for assembling three-dimensional (3D) functional structures from conventional materials by releasing prestrained elastomeric substrates. In this paper, a highly stretchable serpentine-like antenna is fabricated by a simple and quick “Cut-Transfer-Release” method for assembling stretchable 3D functional structures on an elastomeric substrate with a controlled shape. The mechanical reliability of the serpentine-like 3D stretchable antenna is evaluated by the finite element method and by experiments. The antenna shows consistent radio frequency performance, with a center frequency at 5.6 GHz, during stretching up to 200%. The 3D structure is also able to eliminate the hand effect commonly observed in conventional antennas. This work is expected to spur the applications of novel 3D structures in stretchable electronics.

  10. Highly stretchable and shape-controllable three-dimensional antenna fabricated by “Cut-Transfer-Release” method

    Science.gov (United States)

    Yan, Zhuocheng; Pan, Taisong; Yao, Guang; Liao, Feiyi; Huang, Zhenlong; Zhang, Hulin; Gao, Min; Zhang, Yin; Lin, Yuan

    2017-01-01

    Recent progress on Kirigami-inspired methods provides a new idea for assembling three-dimensional (3D) functional structures from conventional materials by releasing prestrained elastomeric substrates. In this paper, a highly stretchable serpentine-like antenna is fabricated by a simple and quick “Cut-Transfer-Release” method for assembling stretchable 3D functional structures on an elastomeric substrate with a controlled shape. The mechanical reliability of the serpentine-like 3D stretchable antenna is evaluated by the finite element method and by experiments. The antenna shows consistent radio frequency performance, with a center frequency at 5.6 GHz, during stretching up to 200%. The 3D structure is also able to eliminate the hand effect commonly observed in conventional antennas. This work is expected to spur the applications of novel 3D structures in stretchable electronics. PMID:28198812

  11. OPTIMAL CONTROL OF CNC CUTTING PROCESS

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    An intelligent optimization method for cutting parameters and a method for searching stable cutting regions are set up. The cutting parameters of each cutting pass can be optimized automatically; cutting chatter is predicted by building a dynamic cutting force AR(2) model on-line, and the spindle rotation speed is adjusted according to the prediction results so as to ensure that the cutting system works in a stable region.
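
    The chatter-prediction step mentioned above relies on fitting an autoregressive model of order two to the sampled cutting force. A minimal sketch of such a fit, and of using the magnitude of the AR(2) characteristic roots as a simple stability indicator, is given below; the signal, the indicator choice and any threshold applied to it are illustrative assumptions rather than the paper's actual scheme.

```python
import numpy as np

def fit_ar2(x):
    """Least-squares fit of an AR(2) model x[t] = a1*x[t-1] + a2*x[t-2] + e[t]."""
    x = np.asarray(x, dtype=float)
    X = np.column_stack([x[1:-1], x[:-2]])
    y = x[2:]
    (a1, a2), *_ = np.linalg.lstsq(X, y, rcond=None)
    return a1, a2

def ar2_root_magnitude(a1, a2):
    """Largest magnitude of the roots of z^2 - a1*z - a2 = 0; roots approaching 1
    indicate a lightly damped (chatter-prone) force signal."""
    roots = np.roots([1.0, -a1, -a2])
    return float(np.max(np.abs(roots)))

# Hypothetical sampled cutting-force signal
rng = np.random.default_rng(4)
t = np.arange(2000)
force = np.sin(0.3 * t) * np.exp(0.0005 * t) + 0.1 * rng.standard_normal(t.size)
a1, a2 = fit_ar2(force)
chatter_indicator = ar2_root_magnitude(a1, a2)
```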

  12. Alternative normalization methods demonstrate widespread cortical hypometabolism in untreated de novo Parkinson's disease

    DEFF Research Database (Denmark)

    Berti, Valentina; Polito, C; Borghammer, Per

    2012-01-01

    ... recent studies suggested that conventional data normalization procedures may not always be valid, and demonstrated that alternative normalization strategies better allow detection of low-magnitude changes. We hypothesized that these alternative normalization procedures would disclose more widespread metabolic alterations in de novo PD. METHODS: [18F]FDG PET scans of 26 untreated de novo PD patients (Hoehn & Yahr stage I-II) and 21 age-matched controls were compared using voxel-based analysis. Normalization was performed using gray matter (GM) and white matter (WM) reference regions and Yakushev normalization. RESULTS: Compared to GM normalization, the WM and Yakushev normalization procedures disclosed much larger cortical regions of relative hypometabolism in the PD group, with extensive involvement of frontal and parieto-temporal-occipital cortices and several subcortical structures. Furthermore...

  13. [Water jet cutting for bones and bone cement--parameter study of possibilities and limits of a new method].

    Science.gov (United States)

    Honl, M; Rentzsch, R; Lampe, F; Müller, V; Dierk, O; Hille, E; Louis, H; Morlock, M

    2000-09-01

    Water jet techniques have been used in industrial cutting, drilling and cleaning applications for more than 30 years. Plain water is typically used for the cutting of non-metallic materials; the addition of abrasive substances to the stream allows almost any material to be cut. The first medical applications were reported in the early 1980s, when the water jet was used to cut organs. The present study investigates the use of water jet cutting technology for endoprosthesis revision surgery. Bone and PMMA (polymethylmethacrylate) samples were cut at different pressures using an industrial water jet cutting device. Using plain water at 400 bar, PMMA was cut selectively without damaging the bone; above 400 bar, bone was also cut, but the cutting depths in PMMA were significantly greater. Adding a water-soluble abrasive disaccharide to the water resulted in a significantly higher removal rate for both materials, the difference in cutting depth between the two materials remained significant, and with the abrasive the quality of the cut was better for both materials. The water jet technology--in particular the abrasive technique--can be used to cut biomaterials such as bone and bone cement. The diameter of the jet is a great advantage when working in the confined area at the prosthesis interface. The cutting process is essentially cold, thus eliminating a thermal effect, and the jet reaction forces are relatively low. Accurate manipulation of the hydro jet nozzle is possible both manually and by robot. The results obtained show that it is possible to remove prostheses with this cutting technique, rapidly and with little damage to the surrounding tissue. Problem areas are the development of sterile pumps and the "depth control" of the jet.

  14. Analysis of boutique arrays: a universal method for the selection of the optimal data normalization procedure.

    Science.gov (United States)

    Uszczyńska, Barbara; Zyprych-Walczak, Joanna; Handschuh, Luiza; Szabelska, Alicja; Kaźmierczak, Maciej; Woronowicz, Wiesława; Kozłowski, Piotr; Sikorski, Michał M; Komarnicki, Mieczysław; Siatkowski, Idzi; Figlerowicz, Marek

    2013-09-01

    DNA microarrays, which are among the most popular genomic tools, are widely applied in biology and medicine. Boutique arrays, which are small, spotted, dedicated microarrays, constitute an inexpensive alternative to whole-genome screening methods. The data extracted from each microarray-based experiment must be transformed and processed prior to further analysis to eliminate any technical bias. The normalization of the data is the most crucial step of microarray data pre-processing and this process must be carefully considered as it has a profound effect on the results of the analysis. Several normalization algorithms have been developed and implemented in data analysis software packages. However, most of these methods were designed for whole-genome analysis. In this study, we tested 13 normalization strategies (ten for double-channel data and three for single-channel data) available on R Bioconductor and compared their effectiveness in the normalization of four boutique array datasets. The results revealed that boutique arrays can be successfully normalized using standard methods, but not every method is suitable for each dataset. We also suggest a universal seven-step workflow that can be applied for the selection of the optimal normalization procedure for any boutique array dataset. The described workflow enables the evaluation of the investigated normalization methods based on the bias and variance values for the control probes, a differential expression analysis and a receiver operating characteristic curve analysis. The analysis of each component results in a separate ranking of the normalization methods. A combination of the ranks obtained from all the normalization procedures facilitates the selection of the most appropriate normalization method for the studied dataset and determines which methods can be used interchangeably.

  15. Longwall mining “cutting cantilever beam theory” and 110 mining method in China—The third mining science innovation

    Institute of Scientific and Technical Information of China (English)

    Manchao He; Guolong Zhu; Zhibiao Guo

    2015-01-01

    With the third innovation in science and technology worldwide, China has also experienced this marvelous progress. Concerning the longwall mining in China, the “masonry beam theory” (MBT) was first proposed in the 1960s, illustrating that the transmission and equilibrium method of overburden pressure using reserved coal pillar in mined-out areas can be realized. This forms the so-called “121 mining method”, which lays a solid foundation for development of mining science and technology in China. The “transfer rock beam theory” (TRBT) proposed in the 1980s gives a further understanding for the transmission path of stope overburden pressure and pressure distribution in high-stress areas. In this regard, the advanced 121 mining method was proposed with smaller coal pillar for excavation design, making significant contributions to improvement of the coal recovery rate in that era. In the 21st century, the traditional mining technologies faced great challenges and, under the theoretical developments pioneered by Profs. Minggao Qian and Zhenqi Song, the “cutting cantilever beam theory” (CCBT) was proposed in 2008. After that, the 110 mining method was formulated, namely one stope face, after the first mining cycle, needs one advanced gateway excavation, while the other one is automatically formed during the last mining cycle without coal pillars left in the mining area. This method can be implemented using the CCBT by incorporating the key technologies, including the directional pre-splitting roof cutting, constant resistance and large deformation (CRLD) bolt/anchor supporting system with negative Poisson’s ratio (NPR) effect material, and remote real-time monitoring technology. The CCBT and 110 mining method will provide the theoretical and technical basis for the development of the mining industry in China.

  16. Longwall mining “cutting cantilever beam theory” and 110 mining method in China—The third mining science innovation

    Directory of Open Access Journals (Sweden)

    Manchao He

    2015-10-01

    Full Text Available With the third innovation in science and technology worldwide, China has also experienced this marvelous progress. Concerning the longwall mining in China, the “masonry beam theory” (MBT) was first proposed in the 1960s, illustrating that the transmission and equilibrium method of overburden pressure using reserved coal pillar in mined-out areas can be realized. This forms the so-called “121 mining method”, which lays a solid foundation for development of mining science and technology in China. The “transfer rock beam theory” (TRBT) proposed in the 1980s gives a further understanding for the transmission path of stope overburden pressure and pressure distribution in high-stress areas. In this regard, the advanced 121 mining method was proposed with smaller coal pillar for excavation design, making significant contributions to improvement of the coal recovery rate in that era. In the 21st century, the traditional mining technologies faced great challenges and, under the theoretical developments pioneered by Profs. Minggao Qian and Zhenqi Song, the “cutting cantilever beam theory” (CCBT) was proposed in 2008. After that the 110 mining method is formulated subsequently, namely one stope face, after the first mining cycle, needs one advanced gateway excavation, while the other one is automatically formed during the last mining cycle without coal pillars left in the mining area. This method can be implemented using the CCBT by incorporating the key technologies, including the directional pre-splitting roof cutting, constant resistance and large deformation (CRLD) bolt/anchor supporting system with negative Poisson's ratio (NPR) effect material, and remote real-time monitoring technology. The CCBT and 110 mining method will provide the theoretical and technical basis for the development of mining industry in China.

  17. Effect of Various Management Methods of Apical Flower Bud on Cut Flower Quality in Three Cultivars of Greenhouse Roses

    Directory of Open Access Journals (Sweden)

    mansour matloobi

    2017-02-01

    Full Text Available Introduction: In greenhouse roses, canopy management has been highly noted and emphasized during the past decades. It was recognized that improving canopy shape by implementing techniques such as stem bending and flower bud removal can highly affect the marketable quality of cut roses. For most growers, the best method of flower bud treatment has not yet been described and determined physiologically. This experiment was designed to answer some questions related to this problem. Materials and Methods: A plastic commercial cut rose greenhouse was selected to carry out the trial. Three greenhouse rose cultivars, namely Eros, Cherry Brandy and Dancing Queen, were selected as the first factor, and three methods of flower bud treatment along with bending types were chosen as the second factor. Cuttings were taken from mother plants and rooted under mist conditions. The first shoot emerging from the cutting was treated at the pea bud stage by one of the following methods: shoot bending at the stem base with the bud intact, immediate shoot bending at the stem base after removing the flower bud, and shoot bending at the stem base two weeks after flower bud removal. Some marketable stem properties including stem length, diameter and weight, and characteristics related to bud growth potential were measured, and then the data were subjected to statistical analysis. Results and Discussion: Analysis of variance showed that the cultivars differ in their marketable features. Cherry Brandy produced longer cut flowers with larger stem diameter compared to the two other cultivars. This cultivar was also good in the stem weight trait; however, its difference from Eros was not significant. Dancing Queen did not perform well in producing high quality stems on the whole. Regarding the number of days until bud release and growth, Cherry Brandy’s buds needed the fewest days to start growing. In many studies, the effect of cultivar on rose shoot growth quality has been documented and explained. For instance

  18. Application of the CBR method for adding the process of cutting tools and parameters selection

    Science.gov (United States)

    Ociepka, P.; Herbuś, K.

    2016-08-01

    The paper presents a method, based on engineering knowledge and experience, designed to aid the selection of tools and machining parameters for turning processes. In this method, an informatics system is built on the basis of the Case-Based Reasoning (CBR) approach. This is a method of problem solving based on experience. It consists in finding analogies between the task currently being solved and earlier tasks stored in the database of the CBR system. The article presents the structure of the developed software, as well as the functioning of the CBR method. It also presents the possibility of integrating the developed method with the CAM module of the SIEMENS PLM NX program.
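
    To make the retrieval idea concrete, the Python sketch below stores a few hypothetical turning cases and returns the one closest to a new task by a weighted distance over relative feature differences; the features, weights and stored solutions are all invented for illustration and are not taken from the CBR system described in the record.

```python
import numpy as np

# Hypothetical case base: each past turning case stores its problem features
# (workpiece hardness HB, depth of cut in mm, required roughness Ra in um)
# and the solution that worked (tool grade, cutting speed m/min, feed mm/rev).
CASES = [
    {"features": [180, 2.0, 1.6], "solution": ("P25 insert", 250, 0.25)},
    {"features": [300, 1.0, 0.8], "solution": ("P10 insert", 180, 0.12)},
    {"features": [220, 3.5, 3.2], "solution": ("P35 insert", 200, 0.35)},
]

def retrieve(query, cases, weights=(1.0, 1.0, 1.0)):
    """Return the stored case most similar to the query, using a weighted
    distance over relative feature differences (weights are illustrative)."""
    q = np.asarray(query, dtype=float)
    w = np.asarray(weights, dtype=float)

    def dist(case):
        f = np.asarray(case["features"], dtype=float)
        return float(np.sqrt(np.sum(w * ((f - q) / q) ** 2)))

    return min(cases, key=dist)

# New turning task: 240 HB workpiece, 2.5 mm depth of cut, Ra 1.6 um required
tool, speed, feed = retrieve([240, 2.5, 1.6], CASES)["solution"]
```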

  19. A New Speaker Verification Method with GlobalSpeaker Model and Likelihood Score Normalization

    Institute of Scientific and Technical Information of China (English)

    张怡颖; 朱小燕; 张钹

    2000-01-01

    In this paper a new text-independent speaker verification method, GSMSV, is proposed based on likelihood score normalization. In this novel method a global speaker model is established to represent the universal features of speech and to normalize the likelihood score. Statistical analysis demonstrates that this normalization method can remove common factors of speech and bring the differences between speakers into prominence. As a result, the equal error rate is decreased significantly, the verification procedure is accelerated and the system's adaptability to speaking speed is improved.

  20. The analysis of carbohydrates in milk powder by a new "heart-cutting" two-dimensional liquid chromatography method.

    Science.gov (United States)

    Ma, Jing; Hou, Xiaofang; Zhang, Bing; Wang, Yunan; He, Langchong

    2014-03-01

    In this study, a new "heart-cutting" two-dimensional liquid chromatography method for the simultaneous determination of carbohydrate contents in milk powder is presented. In this two-dimensional liquid chromatography system, a Venusil XBP-C4 analysis column was used in the first dimension ((1)D) as a pre-separation column, and a ZORBAX carbohydrate analysis column was used in the second dimension ((2)D) as the final-analysis column. The whole process was completed in less than 35 min without a particular sample preparation procedure. The capability of the new two-dimensional HPLC method was demonstrated in the determination of carbohydrates in various brands of milk powder samples. A conventional one-dimensional chromatography method was also proposed. The two proposed methods were both validated in terms of linearity, limits of detection, accuracy and precision. The comparison between the results obtained with the two methods showed that the new and completely automated two-dimensional liquid chromatography method is more suitable for milk powder samples because of the online cleanup effect involved.

  1. Selecting between-sample RNA-Seq normalization methods from the perspective of their assumptions.

    Science.gov (United States)

    Evans, Ciaran; Hardin, Johanna; Stoebel, Daniel M

    2017-02-27

    RNA-Seq is a widely used method for studying the behavior of genes under different biological conditions. An essential step in an RNA-Seq study is normalization, in which raw data are adjusted to account for factors that prevent direct comparison of expression measures. Errors in normalization can have a significant impact on downstream analysis, such as inflated false positives in differential expression analysis. An underemphasized feature of normalization is the assumptions on which the methods rely and how the validity of these assumptions can have a substantial impact on the performance of the methods. In this article, we explain how assumptions provide the link between raw RNA-Seq read counts and meaningful measures of gene expression. We examine normalization methods from the perspective of their assumptions, as an understanding of methodological assumptions is necessary for choosing methods appropriate for the data at hand. Furthermore, we discuss why normalization methods perform poorly when their assumptions are violated and how this causes problems in subsequent analysis. To analyze a biological experiment, researchers must select a normalization method with assumptions that are met and that produces a meaningful measure of expression for the given experiment.

  2. A single method for recovery and concentration of enteric viruses and bacteria from fresh-cut vegetables.

    Science.gov (United States)

    Sánchez, G; Elizaquível, P; Aznar, R

    2012-01-03

    Fresh-cut vegetables are prone to be contaminated with foodborne pathogens during growth, harvest, transport and further processing and handling. As most of these products are generally eaten raw or mildly treated, there is an increase in the number of outbreaks caused by viruses and bacteria associated with fresh vegetables. Foodborne pathogens are usually present at very low levels and have to be concentrated (i.e. viruses) or enriched (i.e. bacteria) to enhance their detection. With this aim, a rapid concentration method has been developed for the simultaneous recovery of hepatitis A virus (HAV), norovirus (NV), murine norovirus (MNV) as a surrogate for NV, Escherichia coli O157:H7, Listeria monocytogenes and Salmonella enterica. Initial experiments focused on evaluating the elution conditions suitable for virus release from vegetables. Finally, elution with buffered peptone water (BPW), using a Pulsifier, and concentration by polyethylene glycol (PEG) precipitation were the methods selected for the elution and concentration of both, enteric viruses and bacteria, from three different types of fresh-cut vegetables by quantitative PCR (qPCR) using specific primers. The average recoveries from inoculated parsley, spinach and salad, were ca. 9.2%, 43.5%, and 20.7% for NV, MNV, and HAV, respectively. Detection limits were 132 RT-PCR units (PCRU), 1.5 50% tissue culture infectious dose (TCID₅₀), and 6.6 TCID₅₀ for NV, MNV, and HAV, respectively. This protocol resulted in average recoveries of 57.4%, 64.5% and 64.6% in three vegetables for E. coli O157:H7, L. monocytogenes and Salmonella with corresponding detection limits of 10³, 10² and 10³ CFU/g, respectively. Based on these results, it can be concluded that the procedure herein is suitable to recover, detect and quantify enteric viruses and foodborne pathogenic bacteria within 5 h and can be applied for the simultaneous detection of both types of foodborne pathogens in fresh-cut vegetables.

  3. Improvement of short cut numerical method for determination of periods of free oscillations for basins with irregular geometry and bathymetry

    Science.gov (United States)

    Chernov, Anton; Kurkin, Andrey; Pelinovsky, Efim; Yalciner, Ahmet; Zaytsev, Andrey

    2010-05-01

    A short cut numerical method for evaluation of the modes of free oscillations of basins which have irregular geometry and bathymetry was presented in the paper (Yalciner A.C., Pelinovsky E., 2007). In the method, a single wave is input to the basin as an initial impulse. The respective agitation in the basin is computed by a numerical method solving the nonlinear form of the long wave equations. The time histories of water surface fluctuations at different locations due to propagation of the waves in relation to the initial impulse are stored and analyzed by the fast Fourier transform technique (FFT), and energy spectrum curves for each location are obtained. The frequencies of each mode of free oscillations are determined from the peaks of the spectrum curves. Some main features were added to this method and are discussed here: 1. Instead of a small number of gauges installed manually in the studied area, the information from the numerical simulation is now recorded on a regular net of «simulation» gauges placed everywhere on the sea surface where the depth is below the "coast" level, with a fixed preset distance between gauges. The spectral analysis of the wave records is performed by the Welch periodogram method instead of a simple FFT, so it is possible to obtain a spectral power estimate for the wave process and to determine confidence intervals for the spectral peaks. 2. After the power spectral estimation procedure, the common peak of the studied seiche can be found, the mean spectral amplitudes for this peak are calculated numerically by a Simpson integration method for all gauges in the basin, and a map of the spatial distribution of the mean spectral amplitudes can be plotted. The spatial distribution helps to study the structure of the seiche and to determine the affected dangerous areas. 3. A nested grid module was developed in NAMI-DANCE, the nonlinear shallow water equations calculation software package. This is a very important feature for complicated, different-scale (ocean
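
    The Welch estimate mentioned in point 1 can be reproduced with standard tools; the sketch below applies scipy.signal.welch to a synthetic gauge record containing a 20-minute seiche, where the sampling interval, segment length and signal itself are illustrative choices rather than values from the study.

```python
import numpy as np
from scipy.signal import welch

# Hypothetical free-surface record at one "simulation" gauge: a 20-minute seiche
# superimposed on noise, sampled every 10 s (all values are illustrative)
dt = 10.0                        # sampling interval, s
t = np.arange(0, 24 * 3600, dt)  # one day of record
eta = 0.05 * np.sin(2 * np.pi * t / 1200.0) \
      + 0.01 * np.random.default_rng(5).standard_normal(t.size)

# Welch power spectral density estimate; averaging over segments gives a
# smoother spectrum than a single FFT and allows confidence bands to be formed
f, pxx = welch(eta, fs=1.0 / dt, nperseg=2048)

peak_period_s = 1.0 / f[np.argmax(pxx[1:]) + 1]  # skip the zero-frequency bin
```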

  4. COMPLEX INNER PRODUCT AVERAGING METHOD FOR CALCULATING NORMAL FORM OF ODE

    Institute of Scientific and Technical Information of China (English)

    陈予恕; 孙洪军

    2001-01-01

    This paper puts forward a complex inner product averaging method for calculating the normal form of ODEs. Compared with the conventional averaging method, the theoretical analytical process has such a simple form that it can easily be implemented in a computer program. The results can be applied to both autonomous and non-autonomous systems. Finally, an example is solved to verify the method.

  5. 切分西红柿微波短时处理常温贮藏保鲜试验研究%Cut-Tomato Preservation at Normal Atmospheric Temperature With Short-Time Microwave Treatment

    Institute of Scientific and Technical Information of China (English)

    殷涌光; 刘静波; 房新平

    2002-01-01

    Molding, rotting and loss of freshness often occur during the preservation and transportation of fruits and vegetables; in the worst case the produce loses its commercial value completely, causing huge economic losses, and such fruits and vegetables cannot meet consumers' needs. Solving the problem of fruit and vegetable preservation, especially during transport, can not only satisfy market demand but also regulate the supply of fruits and vegetables and yield economic profit. In this paper, orthogonal tests were conducted to study cut-tomato preservation with short-time microwave treatment. It was found that vacuum-packaged cut tomato only reached the low temperature of pasteurization after short-time microwave treatment, yet could be preserved for a long time at normal atmospheric temperature. The optimal treatment parameters were determined: a solution containing 0.12% vitamin C and a microwave treatment time of 54 s. Image processing of the samples was carried out to obtain color histograms and the corresponding color means, and the effects of the method were quantified. The effect of packaging on the tomato was also analyzed; it was concluded that cut tomato should be packaged in lightproof packaging.

  6. Empirical comparison of color normalization methods for epithelial-stromal classification in H and E images

    Science.gov (United States)

    Sethi, Amit; Sha, Lingdao; Vahadane, Abhishek Ramnath; Deaton, Ryan J.; Kumar, Neeraj; Macias, Virgilia; Gann, Peter H.

    2016-01-01

    Context: Color normalization techniques for histology have not been empirically tested for their utility for computational pathology pipelines. Aims: We compared two contemporary techniques for achieving a common intermediate goal – epithelial-stromal classification. Settings and Design: Expert-annotated regions of epithelium and stroma were treated as ground truth for comparing classifiers on original and color-normalized images. Materials and Methods: Epithelial and stromal regions were annotated on thirty diverse-appearing H and E stained prostate cancer tissue microarray cores. Corresponding sets of thirty images each were generated using the two color normalization techniques. Color metrics were compared for original and color-normalized images. Separate epithelial-stromal classifiers were trained and compared on test images. Main analyses were conducted using a multiresolution segmentation (MRS) approach; comparative analyses using two other classification approaches (convolutional neural network [CNN], Wndchrm) were also performed. Statistical Analysis: For the main MRS method, which relied on classification of super-pixels, the number of variables used was reduced using backward elimination without compromising accuracy, and test - area under the curves (AUCs) were compared for original and normalized images. For CNN and Wndchrm, pixel classification test-AUCs were compared. Results: Khan method reduced color saturation while Vahadane reduced hue variance. Super-pixel-level test-AUC for MRS was 0.010–0.025 (95% confidence interval limits ± 0.004) higher for the two normalized image sets compared to the original in the 10–80 variable range. Improvement in pixel classification accuracy was also observed for CNN and Wndchrm for color-normalized images. Conclusions: Color normalization can give a small incremental benefit when a super-pixel-based classification method is used with features that perform implicit color normalization while the gain is

  7. Empirical comparison of color normalization methods for epithelial-stromal classification in H and E images

    Directory of Open Access Journals (Sweden)

    Amit Sethi

    2016-01-01

    Full Text Available Context: Color normalization techniques for histology have not been empirically tested for their utility for computational pathology pipelines. Aims: We compared two contemporary techniques for achieving a common intermediate goal - epithelial-stromal classification. Settings and Design: Expert-annotated regions of epithelium and stroma were treated as ground truth for comparing classifiers on original and color-normalized images. Materials and Methods: Epithelial and stromal regions were annotated on thirty diverse-appearing H and E stained prostate cancer tissue microarray cores. Corresponding sets of thirty images each were generated using the two color normalization techniques. Color metrics were compared for original and color-normalized images. Separate epithelial-stromal classifiers were trained and compared on test images. Main analyses were conducted using a multiresolution segmentation (MRS) approach; comparative analyses using two other classification approaches (convolutional neural network [CNN], Wndchrm) were also performed. Statistical Analysis: For the main MRS method, which relied on classification of super-pixels, the number of variables used was reduced using backward elimination without compromising accuracy, and test - area under the curves (AUCs) were compared for original and normalized images. For CNN and Wndchrm, pixel classification test-AUCs were compared. Results: Khan method reduced color saturation while Vahadane reduced hue variance. Super-pixel-level test-AUC for MRS was 0.010-0.025 (95% confidence interval limits ± 0.004) higher for the two normalized image sets compared to the original in the 10-80 variable range. Improvement in pixel classification accuracy was also observed for CNN and Wndchrm for color-normalized images. Conclusions: Color normalization can give a small incremental benefit when a super-pixel-based classification method is used with features that perform implicit color normalization while the

  8. Normalization method for metabolomics data using optimal selection of multiple internal standards

    Directory of Open Access Journals (Sweden)

    Yetukuri Laxman

    2007-03-01

    Full Text Available Background: Success of metabolomics as the phenotyping platform largely depends on its ability to detect various sources of biological variability. Removal of platform-specific sources of variability such as systematic error is therefore one of the foremost priorities in data preprocessing. However, the chemical diversity of molecular species included in typical metabolic profiling experiments leads to different responses to variations in experimental conditions, making normalization a very demanding task. Results: With the aim to remove unwanted systematic variation, we present an approach that utilizes variability information from multiple internal standard compounds to find an optimal normalization factor for each individual molecular species detected by the metabolomics approach (NOMIS). We demonstrate the method on mouse liver lipidomic profiles using Ultra Performance Liquid Chromatography coupled to high resolution mass spectrometry, and compare its performance to two commonly utilized normalization methods: normalization by l2 norm and by retention time region specific standard compound profiles. The NOMIS method proved superior in its ability to reduce the effect of systematic error across the full spectrum of metabolite peaks. We also demonstrate that the method can be used to select the best combinations of standard compounds for normalization. Conclusion: Depending on experiment design and biological matrix, the NOMIS method is applicable either as a one-step normalization method or as a two-step method where the normalization parameters, influenced by the variabilities of the internal standard compounds and their correlation to metabolites, are first calculated from a study conducted in repeatability conditions. The method can also be used in the analytical development of metabolomics methods by helping to select the best combinations of standard compounds for a particular biological matrix and analytical platform.

  9. Modelling of orthogonal cutting by incremental elastoplastic analysis and meshless method

    Science.gov (United States)

    Boudaia, Elhassan; Bousshine, Lahbib; Fihri, Hicham Fassi; De Saxce, Gery

    2009-11-01

    This Note introduces an application of the meshless method to the case of machining simulation in small deformations, which is still subject to numerical limitations. The treatment of the contact problem at the tool/chip interface is presented, highlighting the interest of coupling the contact law with friction. Validation results are detailed through a typical example. To cite this article: E. Boudaia et al., C. R. Mecanique 337 (2009).

  10. A high-precision calculation method for interface normal and curvature on an unstructured grid

    Science.gov (United States)

    Ito, Kei; Kunugi, Tomoaki; Ohno, Shuji; Kamide, Hideki; Ohshima, Hiroyuki

    2014-09-01

    In the volume-of-fluid algorithm, the calculations of the interface normal and curvature are crucially important for accurately simulating interfacial flows. However, few methods have been proposed for the high-precision interface calculation on an unstructured grid. In this paper, the authors develop a height function method that works appropriately on an unstructured grid. In the process, the definition of the height function is discussed, and the high-precision calculation method of the interface normal is developed to meet the necessary condition for a second-order method. This new method has highly reduced computational cost compared with a conventional high-precision method because the interface normal calculation is completed by solving relatively simple algebraic equations. The curvature calculation method is also discussed and the approximated quadric curve of an interface is employed to calculate the curvature. Following a basic verification, the developed height function method is shown to successfully provide superior calculation accuracy and highly reduced computational cost compared with conventional calculation methods in terms of the interface normal and curvature. In addition, the height function method succeeds in calculating accurately the slotted-disk revolution problem and the oscillating drop on unstructured grids. Therefore, the developed height function method is confirmed to be an efficient technique for the high-precision numerical simulation of interfacial flows on an unstructured grid.

  11. Cutting assembly

    Science.gov (United States)

    Racki, Daniel J.; Swenson, Clark E.; Bencloski, William A.; Wineman, Arthur L.

    1984-01-01

    A cutting apparatus includes a support table mounted for movement toward and away from a workpiece and carrying a mirror which directs a cutting laser beam onto the workpiece. A carrier is rotatably and pivotally mounted on the support table between the mirror and workpiece and supports a conduit discharging gas toward the point of impingement of the laser beam on the workpiece. Means are provided for rotating the carrier relative to the support table to place the gas discharging conduit in the proper positions for cuts made in different directions on the workpiece.

  12. Bone cutting.

    Science.gov (United States)

    Giraud, J Y; Villemin, S; Darmana, R; Cahuzac, J P; Autefage, A; Morucci, J P

    1991-02-01

    Bone cutting has always been a problem for surgeons because bone is a hard living material, and many osteotomes are still very crude tools. Technical improvement of these surgical tools has first been their motorization. Studies of the bone cutting process have indicated better features for conventional tools. Several non-conventional osteotomes, particularly ultrasonic osteotomes are described. Some studies on the possible use of lasers for bone cutting are also reported. Use of a pressurised water jet is also briefly examined. Despite their advantages, non-conventional tools still require improvement if they are to be used by surgeons.

  13. Comparison of normalization methods for Illumina BeadChip HumanHT-12 v3

    Directory of Open Access Journals (Sweden)

    Eils Roland

    2010-06-01

    Full Text Available Abstract Background Normalization of microarrays is a standard practice to account for and minimize effects which are not due to the controlled factors in an experiment. There is an overwhelming number of different methods that can be applied, none of which is ideally suited for all experimental designs. Thus, it is important to identify a normalization method appropriate for the experimental setup under consideration that is neither too negligent nor too stringent. The major aim is to derive optimal results from the underlying experiment. Comparisons of different normalization methods have already been conducted, none of which, to our knowledge, compared more than a handful of methods. Results In the present study, 25 different ways of pre-processing Illumina Sentrix BeadChip array data are compared. Among others, methods provided by the BeadStudio software are taken into account. Looking at different statistical measures, we point out the ideal versus the actual observations. Additionally, we compare qRT-PCR measurements of transcripts from different ranges of expression intensities to the respective normalized values of the microarray data. Taking all the different kinds of measures together, the ideal method for our dataset is identified. Conclusions Pre-processing of microarray gene expression experiments has been shown to influence further downstream analysis to a great extent and thus has to be carefully chosen based on the design of the experiment. This study provides a recommendation for deciding which normalization method is best suited for a particular experimental setup.

  14. Normal contour error measurement on-machine and compensation method for polishing complex surface by MRF

    Science.gov (United States)

    Chen, Hua; Chen, Jihong; Wang, Baorui; Zheng, Yongcheng

    2016-10-01

    The magnetorheological finishing (MRF) process, based on the dwell time method with constant normal spacing for flexible polishing, introduces a normal contour error when fine-polishing complex surfaces such as aspheric surfaces. The normal contour error changes the ribbon's shape and the consistency of the removal characteristics for MRF. Based on continuously scanning the normal spacing between the workpiece and the finder with a laser range finder, a novel method is put forward to measure the normal contour errors on the machining track while polishing a complex surface. The normal contour errors were measured dynamically, providing verification and safety checking of the workpiece's clamping precision, the multi-axis machining NC program and the dynamic performance of the MRF machine for the MRF process. A unit for measuring the normal contour errors of complex surfaces on-machine was designed. Using the measurement unit's results as feedback to adjust the parameters of the feed-forward control and the multi-axis machining, an optimized servo control method is presented to compensate the normal contour errors. An experiment polishing a 180 mm × 180 mm aspherical workpiece of fused silica by MRF was set up to validate the method. The results show that the normal contour error was controlled to less than 10 μm, and that the PV value of the polished surface accuracy was improved from 0.95λ to 0.09λ under the same process parameters. The technology in this paper has been applied in the PKC600-Q1 MRF machine developed by the China Academy of Engineering Physics for engineering applications since 2014, and is being used in national large-scale optical engineering projects for processing ultra-precision optical parts.

  15. Short-cut math

    CERN Document Server

    Kelly, Gerard W

    1984-01-01

    Clear, concise compendium of about 150 time-saving math short-cuts features faster, easier ways to add, subtract, multiply, and divide. Each problem includes an explanation of the method. No special math ability needed.

  16. Methods to Predict Stresses in Cutting Inserts Brazed Using Iron-Carbon Brazing Alloy

    Science.gov (United States)

    Konovodov, V. V.; Valentov, A. V.; Retuynskiy, O. Yu; Esekuev, Sh B.

    2016-04-01

    This work describes a method for predicting residual and operating stresses in a flat-form tool insert made of tungsten-free carbides brazed using an iron-carbon alloy. Based on the study results, it is concluded that the recommendations limiting the melting point of tool brazing alloys (950-1100°C according to different sources) are connected with a negative impact on the tool as a composite made of dissimilar materials rather than on the hard alloy as a tool material. Due to the cooling process, stresses inevitably occur in the brazed joint of dissimilar materials, and these stresses increase with higher solidification temperature of the brazing alloy.

  17. Discrimination methods of biological contamination on fresh-cut lettuce based on VNIR and NIR hyperspectral imaging

    Science.gov (United States)

    Multispectral imaging algorithms were developed using visible-near-infrared (VNIR) and near-infrared (NIR) hyperspectral imaging (HSI) techniques to detect worms on fresh-cut lettuce. The optimal wavebands that detect worms on fresh-cut lettuce for each type of HSI were investigated using the one-way...

  18. Method for cutting steam heat losses during cyclic steam injection of wells. Second quarterly report

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-01

    The Midway-Sunset Field (CA) is the largest Heavy Oil field in California and steam injection methods have been successfully used for more than 30 years to produce the Heavy Oil from many of its unconsolidated sand reservoirs. In partnership with another DOE/ERIP grantee, our Company has acquired an 80 ac. lease in the SE part of this field, in order to demonstrate our respective technologies in the Monarch sand, of Miocene Age, which is one of the reservoirs targeted by the DOE Class 3 Oil Program. This reservoir contains a 13 API oil, which has a much higher market value, as a Refinery Feedstock, than the 5 to 8 API Vaca Tar, used only as road paving material. This makes it easier to justify the required investment in a vertical well equipped with two horizontal drainholes. The economic viability of such a project is likely to be enhanced if Congress approves the export to Japan of a portion of the 27 API (1% Sulfur) AK North Slope oil, which currently is landed in California in preference to lighter and sweeter Far East imported crudes. This is a major cause of the depressed prices for California Heavy Oil in local refineries, which have reduced the economic viability of all EOR methods, including steam injection, in California. Two proposals, for a Near-Term (3 y.) and for a Mid-Term (6 y.) project respectively, were jointly submitted to the DOE for Field Demonstration of the Partners' new technologies under the DOE Class 3 Oil Program. The previous design of a special casing joint for the Oxnard field well was reviewed and adapted to the use of existing Downhole Hardware components from three suppliers, instead of one. The cost of drilling and completion of a well equipped with two horizontal drainholes was re-evaluated for the conditions prevailing in the Midway Sunset field, which are more favorable than in the Oxnard field, leading to considerable reductions in drilling rig time and cost.

  19. An overview of the Normal Ogive Harmonic Analysis Robust Method (NOHARM) approach to item response theory

    Directory of Open Access Journals (Sweden)

    Lee, J. J.

    2016-01-01

    Full Text Available Here we provide a description of the IRT estimation method known as Normal Ogive Harmonic Analysis Robust Method (NOHARM). Although in some ways this method has been superseded by new computer programs that also adopt a specifically factor-analytic approach, its fundamental principles remain useful in certain applications, which include calculating the residual covariance matrix and rescaling the distribution of the common factor (latent trait). These principles can be applied to parameter estimates obtained by any method.

  20. Making the cut: Innovative methods for optimizing perfusion-based migration assays.

    Science.gov (United States)

    Holt, Andrew W; Howard, William E; Ables, Elizabeth T; George, Stephanie M; Kukoly, Cindy A; Rabidou, Jake E; Francisco, Jake T; Chukwu, Angel N; Tulis, David A

    2017-03-01

    Application of fluid shear stress to adherent cells dramatically influences their cytoskeletal makeup and differentially regulates their migratory phenotype. Because cytoskeletal rearrangements are necessary for cell motility and migration, preserving these adaptations under in vitro conditions and in the presence of fluid flow are physiologically essential. With this in mind, parallel plate flow chambers and microchannels are often used to conduct in vitro perfusion experiments. However, both of these systems currently lack capacity to accurately study cell migration in the same location where cells were perfused. The most common perfusion/migration assays involve cell perfusion followed by trypsinization which can compromise adaptive cytoskeletal geometry and lead to misleading phenotypic conclusions. The purpose of this study was to quantitatively highlight some limitations commonly found with currently used cell migration approaches and to introduce two new advances which use additive manufacturing (3D printing) or laser capture microdissection (LCM) technology. The residue-free 3D printed insert allows accurate cell seeding within defined areas, increases cell yield for downstream analyses, and more closely resembles the reported levels of fluid shear stress calculated with computational fluid dynamics as compared to other residue-free cell seeding techniques. The LCM approach uses an ultraviolet laser for "touchless technology" to rapidly and accurately introduce a custom-sized wound area in otherwise inaccessible perfusion microchannels. The wound area introduced by LCM elicits comparable migration characteristics compared to traditional pipette tip-induced injuries. When used in perfusion experiments, both of these newly characterized tools were effective in yielding similar results yet without the limitations of the traditional modalities. These innovative methods provide valuable tools for exploring mechanisms of clinically important aspects of cell

  1. Image Retrieval and Classification Method Based on Euclidian Distance Between Normalized Features Including Wavelet Descriptor

    Directory of Open Access Journals (Sweden)

    Kohei Arai

    2013-10-01

    Full Text Available An image retrieval method based on Euclidean distance between features normalized by their mean and variance in feature space is proposed. The effectiveness of the normalization is evaluated together with a validation of the proposed image retrieval method. The proposed method is applied to discriminating and identifying dangerous red tide species using wavelet-based classification methods together with texture and color features. Through experiments, it is found that the proposed wavelet-derived shape information extracted from the microscopic view of the phytoplankton is more effective for identifying dangerous red tide species among other red tide species than conventional texture and color information. Moreover, it is also found that the proposed normalization of features is effective in improving identification performance.
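
    The core retrieval step described above - mean/variance normalization of features followed by Euclidean ranking - can be sketched in a few lines of Python; the arrays below are placeholders, not the authors' wavelet, texture or color descriptors.

        import numpy as np

        def retrieve(query_feat, db_feats, top_k=5):
            """Rank database images by Euclidean distance in normalized feature space."""
            mu, sigma = db_feats.mean(axis=0), db_feats.std(axis=0) + 1e-12
            dbn = (db_feats - mu) / sigma          # normalize by mean and variance
            qn = (query_feat - mu) / sigma
            d = np.linalg.norm(dbn - qn, axis=1)   # Euclidean distance to every image
            return np.argsort(d)[:top_k]           # indices of the closest matches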

  2. Tangential and sagittal curvature from the normals computed by the null screen method in corneal topography

    Science.gov (United States)

    Estrada-Molina, Amilcar; Díaz-Uribe, Rufino

    2011-08-01

    A new method for computing the tangential and sagittal curvatures from the normals to a cornea is proposed. The normals are obtained through a null screen method from the coordinates of the drop-shaped spots on the null screen, the coordinates on a reference approximating surface and the centroids on the image plane. This method assumes that the cornea has rotational symmetry and our derivations are carried out in the meridional plane that contains the symmetry axis. Experimental results are shown for a calibration spherical surface, using cylindrical null screens with radial point arrays.

  3. Rapid assessment method for avoiding episiotomy (perineal side cut)

    Institute of Scientific and Technical Information of China (English)

    Qiu Hong (邱虹)

    2015-01-01

    In 1996, the World Health Organization proposed the "mother-friendly childbirth" initiative, recommending that the episiotomy rate be kept at about 20%, and ideally be reduced to 5‰. By avoiding routine episiotomy and using hands-off perineal techniques, episiotomy can be strictly controlled and natural delivery promoted, which supports postpartum recovery, safeguards the physical and mental health of women and children, and improves postpartum quality of life. From November 2013 to February 2015, our hospital performed episiotomy assessments on 813 women with normal deliveries, reducing the episiotomy rate to 30%-40%. Effective assessment of the perineum also supported the professional training of newly appointed midwives, giving them a sound basis for a favourable assessment of every woman in normal labour and increasing confidence in reducing episiotomy.

  4. New Fuzzy-based Retinex Method for the Illumination Normalization of Face Recognition

    Directory of Open Access Journals (Sweden)

    Gi Pyo Nam

    2012-10-01

    Full Text Available We propose a new illumination normalization for face recognition which is robust to the illumination variations found on mobile devices. This research is novel in the following five ways when compared to previous works: (i) a new fuzzy-based Retinex method is proposed for illumination normalization; (ii) the performance of face recognition is enhanced by determining the optimal parameter of Retinex filtering based on fuzzy logic; (iii) the output of the fuzzy membership function is adaptively determined based on the mean and standard deviation of the grey values of the detected face region; (iv) through the comparison of various defuzzification methods in terms of the accuracy of face recognition, one optimal method was selected; (v) we proved the validity of the proposed method by testing it with various face recognition methods. Experimental results showed that the accuracy of face recognition with the proposed method was enhanced compared to previous ones.
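
    The paper's fuzzy parameter selection is not reproduced here, but the underlying Retinex normalization can be sketched as follows; the adaptation rule that sets the Gaussian width from the face region's grey-level statistics is an assumed stand-in for the fuzzy logic described above.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def retinex_normalize(face, base_sigma=8.0):
            """Single-scale Retinex: reflectance = log(image) - log(illumination estimate)."""
            face = face.astype(np.float64) + 1.0
            spread = face.std() / (face.mean() + 1e-12)       # crude contrast measure
            sigma = base_sigma * (1.0 + spread)               # assumed adaptation rule
            illum = gaussian_filter(face, sigma)              # smooth illumination estimate
            r = np.log(face) - np.log(illum + 1e-12)
            return (r - r.min()) / (r.max() - r.min() + 1e-12)  # rescale to [0, 1]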

  5. Comparison of the Cut-and-Paste and Full Moment Tensor Methods for Estimating Earthquake Source Parameters

    Science.gov (United States)

    Templeton, D.; Rodgers, A.; Helmberger, D.; Dreger, D.

    2008-12-01

    Earthquake source parameters (seismic moment, focal mechanism and depth) are now routinely reported by various institutions and network operators. These parameters are important for seismotectonic and earthquake ground motion studies as well as calibration of moment magnitude scales and model-based earthquake-explosion discrimination. Source parameters are often estimated from long-period three-component waveforms at regional distances using waveform modeling techniques with Green's functions computed for an average plane-layered model. One widely used method is waveform inversion for the full moment tensor (Dreger and Helmberger, 1993). This method (TDMT) solves for the moment tensor elements by performing a linearized inversion in the time-domain that minimizes the difference between the observed and synthetic waveforms. Errors in the seismic velocity structure inevitably arise due to either differences in the true average plane-layered structure or laterally varying structure. The TDMT method can account for errors in the velocity model by applying a single time shift at each station to the observed waveforms to best match the synthetics. Another method for estimating source parameters is the Cut-and-Paste (CAP) method. This method breaks the three-component regional waveforms into five windows: vertical and radial component Pnl; vertical and radial component Rayleigh wave; and transverse component Love waves. The CAP method performs a grid search over double-couple mechanisms and allows the synthetic waveforms for each phase (Pnl, Rayleigh and Love) to shift in time to account for errors in the Green's functions. Different filtering and weighting of the Pnl segment relative to surface wave segments enhances sensitivity to source parameters, however, some bias may be introduced. This study will compare the TDMT and CAP methods in two different regions in order to better understand the advantages and limitations of each method. Firstly, we will consider the

  6. Modified Cheeger and Ratio Cut Methods Using the Ginzburg-Landau Functional for Classification of High-Dimensional Data

    Science.gov (United States)

    2016-02-01

    vertices it is connecting are similar and a small weight otherwise. One popular choice for the weight function is the Gaussian w(x, y) = exp(-M(x, y)²) ... undirected graph with the set of vertices V and set of edges E, and consider a target set X of size n embedded in a graph G. A weight function is defined on ... containing the weight function values. The minimum cut problem is to find the set S ⊂ V such that the following value is minimized: cut(S, S̄) = Σ_{x∈S, y∈S̄} w(x, y).
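
    For concreteness, the cut objectives behind the title's methods (cut value, ratio cut, Cheeger cut) can be evaluated from a weight matrix as in the small sketch below. This is not the Ginzburg-Landau minimization of the report, only the objective values it targets; W, S and sigma are placeholders.

        import numpy as np

        def cut_value(W, S):
            """Sum of edge weights crossing the partition (S, complement of S)."""
            S = np.asarray(S, dtype=bool)
            return W[np.ix_(S, ~S)].sum()

        def ratio_cut(W, S):
            S = np.asarray(S, dtype=bool)
            c = cut_value(W, S)
            return c / S.sum() + c / (~S).sum()

        def cheeger_cut(W, S):
            S = np.asarray(S, dtype=bool)
            return cut_value(W, S) / min(S.sum(), (~S).sum())

        # example weight matrix: Gaussian weights from pairwise distances M
        # W = np.exp(-(M ** 2) / sigma ** 2); np.fill_diagonal(W, 0.0)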

  7. A method for normalizing pathology images to improve feature extraction for quantitative pathology

    Energy Technology Data Exchange (ETDEWEB)

    Tam, Allison [Stanford Institutes of Medical Research Program, Stanford University School of Medicine, Stanford, California 94305 (United States); Barker, Jocelyn [Department of Radiology, Stanford University School of Medicine, Stanford, California 94305 (United States); Rubin, Daniel [Department of Radiology, Stanford University School of Medicine, Stanford, California 94305 and Department of Medicine (Biomedical Informatics Research), Stanford University School of Medicine, Stanford, California 94305 (United States)

    2016-01-15

    Purpose: With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. Methods: To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. Their method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. Results: The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as computer-aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. Conclusions: ICHE may be a useful preprocessing step in a digital pathology image processing pipeline.
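
    A rough sketch of the two steps named above - shifting each image's intensity-histogram centroid to a common point, then applying contrast-limited adaptive histogram equalization - might look like the following. It assumes greyscale arrays in [0, 255] and uses scikit-image's CLAHE; the actual ICHE implementation differs in detail.

        import numpy as np
        from skimage import exposure

        def iche_like_normalize(img, target_centroid=128.0, clip_limit=0.01):
            """Intensity centering followed by CLAHE (sketch of the ICHE idea)."""
            img = img.astype(np.float64)
            shift = target_centroid - img.mean()              # move histogram centroid
            centered = np.clip(img + shift, 0, 255) / 255.0   # rescale to [0, 1] for CLAHE
            return exposure.equalize_adapthist(centered, clip_limit=clip_limit)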

  8. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A; van't Veld, Aart A

    2012-03-15

    To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended. Copyright © 2012 Elsevier Inc. All rights reserved.
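
    As an illustration of the recommended approach (not the authors' exact pipeline), an L1-penalized logistic NTCP model evaluated with repeated cross-validation can be set up with scikit-learn roughly as follows; X (candidate dose/clinical predictors), y (complication outcome) and the penalty strength C are placeholders.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

        def lasso_ntcp_auc(X, y, C=0.5, repeats=20):
            """Cross-validated AUC of an L1-penalized (LASSO-like) logistic NTCP model."""
            model = LogisticRegression(penalty="l1", solver="liblinear", C=C)
            cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=repeats, random_state=0)
            scores = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
            return scores.mean(), scores.std()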

  9. Parallel Preconditioned Conjugate Gradient Square Method Based on Normalized Approximate Inverses

    Directory of Open Access Journals (Sweden)

    George A. Gravvanis

    2005-01-01

    Full Text Available A new class of normalized explicit approximate inverse matrix techniques, based on normalized approximate factorization procedures, for solving sparse linear systems resulting from the finite difference discretization of partial differential equations in three space variables is introduced. A new parallel normalized explicit preconditioned conjugate gradient square method in conjunction with normalized approximate inverse matrix techniques for efficiently solving sparse linear systems on distributed memory systems, using the Message Passing Interface (MPI) communication library, is also presented along with theoretical estimates on speedups and efficiency. The implementation and performance on a distributed memory MIMD machine, using Message Passing Interface (MPI), is also investigated. Applications on characteristic initial/boundary value problems in three dimensions are discussed and numerical results are given.
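
    The flavour of the approach - a conjugate gradient squared iteration accelerated by an approximate inverse preconditioner - can be illustrated with SciPy. This is not the paper's normalized approximate inverse or its MPI parallelization, only a generic preconditioned CGS run on an assumed finite-difference test system.

        import numpy as np
        import scipy.sparse as sp
        import scipy.sparse.linalg as spla

        # 2D finite-difference Laplacian as a stand-in sparse system
        n = 50
        I = sp.identity(n)
        T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
        A = (sp.kron(I, T) + sp.kron(T, I)).tocsc()
        b = np.ones(A.shape[0])

        # approximate inverse preconditioner built from an incomplete LU factorization
        ilu = spla.spilu(A, drop_tol=1e-3)
        M = spla.LinearOperator(A.shape, matvec=ilu.solve)

        x, info = spla.cgs(A, b, M=M)   # preconditioned conjugate gradient squared
        print("converged" if info == 0 else "cgs returned info = %d" % info)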

  10. Performance Testing of Cutting Fluids

    DEFF Research Database (Denmark)

    Belluco, Walter

    within the whole range of operations, materials, cutting fluids, operating conditions, etc. Cutting fluid performance was evaluated in turning, drilling, reaming and tapping, and with respect to tool life, cutting forces, chip formation and product quality (dimensional accuracy and surface integrity...). A number of different work materials were considered, with emphasis on austenitic stainless steel. Cutting fluids from two main groups were investigated, water miscible (reviewed from previous work) and straight oils. Results show that correlation of cutting fluid performance in different operations exists... within the same group of cutting fluids, for stainless steel. A possible rationalisation of cutting fluid performance tests is suggested. In order to select a set of basic tests and optimise them for use as general and standardised testing methods, an original approach to the evaluation of cutting force...

  11. Performance Testing of Cutting Fluids

    DEFF Research Database (Denmark)

    Belluco, Walter

    The importance of cutting fluid performance testing has increased with documentation requirements of new cutting fluid formulations based on more sustainable products, as well as cutting with minimum quantity of lubrication and dry cutting. Two sub-problems have to be solved: i) which machining... tests feature repeatability, reproducibility and sensitivity to cutting fluids, and ii) to what extent results of one test ensure relevance to a wider set of machining situations. The present work is aimed at assessing the range of validity of the different testing methods, investigating correlation... within the whole range of operations, materials, cutting fluids, operating conditions, etc. Cutting fluid performance was evaluated in turning, drilling, reaming and tapping, and with respect to tool life, cutting forces, chip formation and product quality (dimensional accuracy and surface integrity...

  12. Simulation of compressible two-phase flows with topology change of fluid-fluid interface by a robust cut-cell method

    Science.gov (United States)

    Lin, Jian-Yu; Shen, Yi; Ding, Hang; Liu, Nan-Sheng; Lu, Xi-Yun

    2017-01-01

    We develop a robust cut-cell method for numerical simulation of compressible two-phase flows with topology change of the fluid-fluid interface. In cut cell methods the flows can be solved in the finite volume framework and the jump conditions at the interface are resolved by solving a local Riemann problem. Therefore, cut cell methods can obtain interface evolution with high resolution, and at the same time satisfactorily maintain the conservation of flow quantities. However, it remains a challenge for the cut cell methods to handle interfaces with topology change or very high curvature, where the mesh is not sufficiently fine to resolve the interface. Inappropriate treatment could give rise to either distorted interface advection or unphysical oscillation of flow variables, especially when the regularization process (e.g. reinitialization in the level set methods) is implemented. A robust cut-cell method is proposed here, with the interface being tracked by a level set function. The local unphysical oscillation of flow variables in the presence of topology change is shown to be greatly suppressed by using a delayed reinitialization. The method can achieve second-order accuracy with respect to the interface position in the absence of topology changes of interface, while locally degrading to first-order at the interface region where topology change occurs. Its performance is examined through a variety of numerical tests, such as Rayleigh collapse, shock-bubble interaction, and shock-induced bubble collapse in water. Numerical results are compared against either benchmark solutions or experimental observations, and good agreement has been achieved qualitatively and/or quantitatively. Finally, we apply the method to investigating the collapse process of two tandem bubbles in water.

  13. Cutting Cosmos

    DEFF Research Database (Denmark)

    Mikkelsen, Henrik Hvenegaard

    The foundation for this book is an ethnographic study of masculinity in a Bugkalot village in northern Philippines. While offering new research on the Bugkalot, widely known as the Ilongot, more than 30 years after the last important works were written on this famous hill-people, Cutting Cosmos... into egalitarian relations. Cutting Cosmos shows how these seemingly opposed characteristics of male life - the egalitarianism and the assertive ideals - are interwoven. Acts of dominance are presented as acts of transgression that are persistently ritualized, contained and isolated as spectacular events within...

  14. Goulphar: rapid access and expertise for standard two-color microarray normalization methods

    Directory of Open Access Journals (Sweden)

    Servant Nicolas

    2006-10-01

    Full Text Available Abstract Background Raw data normalization is a critical step in microarray data analysis because it directly affects data interpretation. Most of the normalization methods currently used are included in the R/BioConductor packages but it is often difficult to identify the most appropriate method. Furthermore, the use of R commands for functions and graphics can introduce mistakes that are difficult to trace. We present here a script written in R that provides a flexible means of access to and monitoring of data normalization for two-color microarrays. This script combines the power of BioConductor and R analysis functions and reduces the amount of R programming required. Results Goulphar was developed in and runs using the R language and environment. It combines and extends functions found in BioConductor packages (limma and marray to correct for dye biases and spatial artifacts. Goulphar provides a wide range of optional and customizable filters for excluding incorrect signals during the pre-processing step. It displays informative output plots, enabling the user to monitor the normalization process, and helps adapt the normalization method appropriately to the data. All these analyses and graphical outputs are presented in a single PDF report. Conclusion Goulphar provides simple, rapid access to the power of the R/BioConductor statistical analysis packages, with precise control and visualization of the results obtained. Complete documentation, examples and online forms for setting script parameters are available from http://transcriptome.ens.fr/goulphar/.
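
    The dye-bias correction that such two-color pipelines perform can be illustrated outside of Goulphar/R with a global intensity-dependent loess fit on the M-A representation. The sketch below is a minimal Python analogue (not Goulphar itself, which wraps limma/marray in R); cy5 and cy3 are assumed intensity arrays for the two channels.

        import numpy as np
        from statsmodels.nonparametric.smoothers_lowess import lowess

        def global_loess_normalize(cy5, cy3, frac=0.3):
            """Remove intensity-dependent dye bias from two-color log-ratios (M vs A)."""
            M = np.log2(cy5) - np.log2(cy3)          # log-ratio between channels
            A = 0.5 * (np.log2(cy5) + np.log2(cy3))  # average log-intensity
            fit = lowess(M, A, frac=frac, return_sorted=False)  # loess trend of M over A
            return M - fit                            # bias-corrected log-ratios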

  15. Normal Vector Projection Method used for Convex Optimization of Chan-Vese Model for Image Segmentation

    Science.gov (United States)

    Wei, W. B.; Tan, L.; Jia, M. Q.; Pan, Z. K.

    2017-01-01

    The variational level set method is one of the main methods of image segmentation. Because signed distance functions used as level sets must be maintained through numerical remedies or additional techniques during the evolution, it is not very efficient. In this paper, a normal vector projection method for image segmentation using the Chan-Vese model is proposed. An equivalent formulation of the Chan-Vese model is used by taking advantage of the properties of binary level set functions and combining them with the concept of convex relaxation. A threshold method and a projection formula are applied in the implementation. This avoids the above problems and obtains a globally optimal solution. Experimental results on both synthetic and real images validate the effectiveness of the proposed normal vector projection method, and show advantages over traditional algorithms in terms of computational efficiency.

  16. AFM method to detect differences in adhesion of silica beads to cancer and normal epithelial cells

    Science.gov (United States)

    Sokolov, Igor; Iyer, Swaminathan; Gaikwad, Ravi; Woodworth, Craig

    2009-03-01

    To date, the methods of detection of cancer cells have been mostly based on traditional techniques used in biology, such as visual identification of malignant changes, cell growth analysis, specific ligand-receptor labeling, or genetic tests. Despite being well developed, these methods are either insufficiently accurate or require a lengthy complicated analysis. A search for alternative methods for the detection of cancer cells may be a fruitful approach. Here we describe an AFM study that may result in a new method for detection of cancer cells in vitro. Here we use atomic force microscopy (AFM) to study adhesion of single silica beads to malignant and normal cells cultured from human cervix. We found that adhesion depends on the time of contact, and can be statistically different for malignant and normal cells. Using these data, one could develop an optical method of cancer detection based on adhesion of various silica beads.

  17. Automated counting of morphologically normal red blood cells by using digital holographic microscopy and statistical methods

    Science.gov (United States)

    Moon, Inkyu; Yi, Faliu

    2015-09-01

    In this paper we overview a method to automatically count morphologically normal red blood cells (RBCs) by using off-axis digital holographic microscopy and statistical methods. Three kinds of RBC are used as training and testing data. All of the RBC phase images are obtained with digital holographic microscopy (DHM) that is robust to transparent or semitransparent biological cells. For the determination of morphologically normal RBCs, the RBC's phase images are first segmented with marker-controlled watershed transform algorithm. Multiple features are extracted from the segmented cells. Moreover, the statistical method of Hotelling's T-square test is conducted to show that the 3D features from 3D imaging method can improve the discrimination performance for counting of normal shapes of RBCs. Finally, the classifier is designed by using statistical Bayesian algorithm and the misclassification rates are measured with leave-one-out technique. Experimental results show the feasibility of the classification method for calculating the percentage of each typical normal RBC shape.

  18. Evaluation of directional normalization methods for Landsat TM/ETM+ over primary Amazonian lowland forests

    Science.gov (United States)

    Van doninck, Jasper; Tuomisto, Hanna

    2017-06-01

    Biodiversity mapping in extensive tropical forest areas poses a major challenge for the interpretation of Landsat images, because floristically clearly distinct forest types may show little difference in reflectance. In such cases, the effects of the bidirectional reflection distribution function (BRDF) can be sufficiently strong to cause erroneous image interpretation and classification. Since the opening of the Landsat archive in 2008, several BRDF normalization methods for Landsat have been developed. The simplest of these consist of an empirical view angle normalization, whereas more complex approaches apply the semi-empirical Ross-Li BRDF model and the MODIS MCD43-series of products to normalize directional Landsat reflectance to standard view and solar angles. Here we quantify the effect of surface anisotropy on Landsat TM/ETM+ images over old-growth Amazonian forests, and evaluate five angular normalization approaches. Even for the narrow swath of the Landsat sensors, we observed directional effects in all spectral bands. Those normalization methods that are based on removing the surface reflectance gradient as observed in each image were adequate to normalize TM/ETM+ imagery to nadir viewing, but were less suitable for multitemporal analysis when the solar vector varied strongly among images. Approaches based on the MODIS BRDF model parameters successfully reduced directional effects in the visible bands, but removed only half of the systematic errors in the infrared bands. The best results were obtained when the semi-empirical BRDF model was calibrated using pairs of Landsat observation. This method produces a single set of BRDF parameters, which can then be used to operationally normalize Landsat TM/ETM+ imagery over Amazonian forests to nadir viewing and a standard solar configuration.

  19. Evaluation on radioactive waste disposal amount of Kori Unit 1 reactor vessel considering cutting and packaging methods

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Yu Jong; Lee, Seong Cheol; Kim, Chang Lak [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2016-06-15

    Decommissioning of nuclear power plants has become a big issue in South Korea as some of the nuclear power plants in operation, including Kori unit 1 and Wolsung unit 1, are getting old. Recently, Wolsung unit 1 received permission to continue operation while Kori unit 1 will shut down permanently in June 2017. Considering the segmentation method and disposal containers, this paper evaluated the final disposal amount of radioactive waste generated from decommissioning of the reactor pressure vessel of Kori unit 1, which will be the first to be decommissioned in South Korea. The evaluation results indicated that the final disposal amount from the hemispherical top and bottom heads of the reactor pressure vessel decreased because they could be cut into smaller pieces more effectively than the cylindrical part of the reactor pressure vessel. It was also found that the 200 L and 320 L radioactive waste disposal containers used in the Kyung-Ju disposal facility had low payload efficiency because of the loading weight limitation.

  20. A robust method for calculating interface curvature and normal vectors using an extracted local level set

    CERN Document Server

    Ervik, Åsmund; Munkejord, Svend Tollak

    2014-01-01

    The level-set method is a popular interface tracking method in two-phase flow simulations. An often-cited reason for using it is that the method naturally handles topological changes in the interface, e.g. merging drops, due to the implicit formulation. It is also said that the interface curvature and normal vectors are easily calculated. This last point is not, however, the case in the moments during a topological change, as several authors have already pointed out. Various methods have been employed to circumvent the problem. In this paper, we present a new such method which retains the implicit level-set representation of the surface and handles general interface configurations. It is demonstrated that the method extends easily to 3D. The method is validated on static interface configurations, and then applied to two-phase flow simulations where the method outperforms the standard method and the results agree well with experiments.
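
    The standard level-set formulas such a method builds on, n = ∇φ/|∇φ| and κ = ∇·(∇φ/|∇φ|), are easy to evaluate on a uniform 2D grid with finite differences. The sketch below is generic (not the authors' local-extraction scheme); phi and the grid spacing h are assumptions.

        import numpy as np

        def normals_and_curvature(phi, h=1.0, eps=1e-12):
            """Interface normals n = grad(phi)/|grad(phi)| and curvature kappa = div(n)."""
            gy, gx = np.gradient(phi, h)              # derivatives along rows (y) and columns (x)
            norm = np.sqrt(gx**2 + gy**2) + eps
            nx, ny = gx / norm, gy / norm
            dnx_dy, dnx_dx = np.gradient(nx, h)
            dny_dy, dny_dx = np.gradient(ny, h)
            kappa = dnx_dx + dny_dy                   # divergence of the unit normal
            return nx, ny, kappa

        # example: circle of radius 0.5 -> curvature near 2 on the interface
        # y, x = np.mgrid[-1:1:200j, -1:1:200j]; phi = np.sqrt(x**2 + y**2) - 0.5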

  1. Changes in Organic Matter And Nutrients in Forest Floor After Applying Several Reproductive Cutting Methods in Shortleaf Pine-Hardwood Stands

    Science.gov (United States)

    Hal O. Liechty; Michael G. Shelton

    2004-01-01

    Abstract - This study was initiated to determine the effects of various regeneration cutting methods on forest floor mass and nutrient content in shortleaf pine-hardwood communities in the Ouachita and Ozark National Forests. Clearcutting generally altered forest floor concentrations of N, P, and S as well as loss on ignition by increasing the amount...

  2. A full multigrid method for linear complementarity problems arising from elastic normal contact problems

    NARCIS (Netherlands)

    Zhao, J.; Vollebregt, E.A.H.; Oosterlee, C.W.

    2014-01-01

    This paper presents a full multigrid (FMG) technique, which combines a multigrid method, an active set algorithm and a nested iteration technique, to solve a linear complementarity problem (LCP) modeling elastic normal contact problems. The governing system in this LCP is derived from a Fredholm int
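
    The FMG machinery itself is not reproduced here, but the LCP it solves - find p ≥ 0 with g = A p + b ≥ 0 and pᵀg = 0 - can be illustrated with a basic projected Gauss-Seidel iteration of the kind often used as a smoother in such schemes. A, b and the iteration count are assumptions of this dense sketch.

        import numpy as np

        def projected_gauss_seidel(A, b, iters=200):
            """Solve the LCP: p >= 0, A p + b >= 0, p^T (A p + b) = 0 (dense sketch)."""
            n = len(b)
            p = np.zeros(n)
            for _ in range(iters):
                for i in range(n):
                    r = b[i] + A[i] @ p - A[i, i] * p[i]   # residual excluding p[i]
                    p[i] = max(0.0, -r / A[i, i])          # project onto p[i] >= 0 (no adhesion)
            return p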

  3. A study of the up-and-down method for non-normal distribution functions

    DEFF Research Database (Denmark)

    Vibholm, Svend; Thyregod, Poul

    1988-01-01

    The assessment of breakdown probabilities is examined by the up-and-down method. The exact maximum-likelihood estimates for a number of response patterns are calculated for three different distribution functions and are compared with the estimates corresponding to the normal distribution. Estimat...

  4. On convergence of the normalized elimination of the small component (NESC) method

    NARCIS (Netherlands)

    Filatov, Michael; Dyall, Kenneth G.

    2007-01-01

    The convergence behavior of the iterative solution of the normalized elimination of the small component (NESC) method is investigated. A simple and efficient computational protocol for obtaining the exact positive-energy eigenvalues of the relativistic Hamiltonian starting from the energies obtained

  5. Evaluation of normalization methods for cDNA microarray data by k-NN classification

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Wei; Xing, Eric P; Myers, Connie; Mian, Saira; Bissell, Mina J

    2004-12-17

    Non-biological factors give rise to unwanted variations in cDNA microarray data. There are many normalization methods designed to remove such variations. However, to date there have been few published systematic evaluations of these techniques for removing variations arising from dye biases in the context of downstream, higher-order analytical tasks such as classification. Ten location normalization methods that adjust spatial- and/or intensity-dependent dye biases, and three scale methods that adjust scale differences were applied, individually and in combination, to five distinct, published, cancer biology-related cDNA microarray data sets. Leave-one-out cross-validation (LOOCV) classification error was employed as the quantitative end-point for assessing the effectiveness of a normalization method. In particular, a known classifier, k-nearest neighbor (k-NN), was estimated from data normalized using a given technique, and the LOOCV error rate of the ensuing model was computed. We found that k-NN classifiers are sensitive to dye biases in the data. Using NONRM and GMEDIAN as baseline methods, our results show that single-bias-removal techniques which remove either spatial-dependent dye bias (referred later as spatial effect) or intensity-dependent dye bias (referred later as intensity effect) moderately reduce LOOCV classification errors; whereas double-bias-removal techniques which remove both spatial- and intensity effect reduce LOOCV classification errors even further. Of the 41 different strategies examined, three two-step processes, IGLOESS-SLFILTERW7, ISTSPLINE-SLLOESS and IGLOESS-SLLOESS, all of which removed intensity effect globally and spatial effect locally, appear to reduce LOOCV classification errors most consistently and effectively across all data sets. We also found that the investigated scale normalization methods do not reduce LOOCV classification error. Using LOOCV error of k-NNs as the evaluation criterion, three double
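
    The evaluation criterion itself - leave-one-out k-NN classification error on normalized data - is straightforward to reproduce; a sketch with scikit-learn follows, with X, y and k as placeholders.

        import numpy as np
        from sklearn.model_selection import LeaveOneOut, cross_val_score
        from sklearn.neighbors import KNeighborsClassifier

        def loocv_knn_error(X, y, k=3):
            """Leave-one-out cross-validation error of a k-NN classifier."""
            clf = KNeighborsClassifier(n_neighbors=k)
            acc = cross_val_score(clf, X, y, cv=LeaveOneOut(), scoring="accuracy")
            return 1.0 - acc.mean()

        # compare, e.g.: loocv_knn_error(X_raw, y) vs loocv_knn_error(X_normalized, y)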

  6. A systematic study of genome context methods: calibration, normalization and combination

    Directory of Open Access Journals (Sweden)

    Dale Joseph M

    2010-10-01

    Full Text Available Abstract Background Genome context methods have been introduced in the last decade as automatic methods to predict functional relatedness between genes in a target genome using the patterns of existence and relative locations of the homologs of those genes in a set of reference genomes. Much work has been done in the application of these methods to different bioinformatics tasks, but few papers present a systematic study of the methods and their combination necessary for their optimal use. Results We present a thorough study of the four main families of genome context methods found in the literature: phylogenetic profile, gene fusion, gene cluster, and gene neighbor. We find that for most organisms the gene neighbor method outperforms the phylogenetic profile method by as much as 40% in sensitivity, being competitive with the gene cluster method at low sensitivities. Gene fusion is generally the worst performing of the four methods. A thorough exploration of the parameter space for each method is performed and results across different target organisms are presented. We propose the use of normalization procedures as those used on microarray data for the genome context scores. We show that substantial gains can be achieved from the use of a simple normalization technique. In particular, the sensitivity of the phylogenetic profile method is improved by around 25% after normalization, resulting, to our knowledge, on the best-performing phylogenetic profile system in the literature. Finally, we show results from combining the various genome context methods into a single score. When using a cross-validation procedure to train the combiners, with both original and normalized scores as input, a decision tree combiner results in gains of up to 20% with respect to the gene neighbor method. Overall, this represents a gain of around 15% over what can be considered the state of the art in this area: the four original genome context methods combined using a

  7. Path planning method for a water-jet cutting robot

    Institute of Scientific and Technical Information of China (English)

    Wang Mei; Meng Zhengda (王玫; 孟正大)

    2012-01-01

    Taking the optimization of the cutting path for automotive interior trim parts as the research object, an improved tabu-list-based ant colony algorithm is presented to achieve cutting sequence optimization. According to the characteristics and process requirements of water-jet cutting, the water-jet cutting path planning problem is analyzed and modeled. Using a hierarchical principle, an improved tabu list is designed and divided into three sections - interior small loops, interior large loops and exterior outline sections - whose priorities decrease in that order, and the corresponding updating rules of the tabu list are determined. On this basis, a water-jet cutting path optimization method based on the improved tabu-list ant colony algorithm is given, which simultaneously optimizes the cutting sequence of the outlines and the selection of the starting point on each outline. Simulation and experimental results show that the improved tabu-list-based ant colony algorithm is feasible and effective; it can greatly shorten the teach-programming time of water-jet cutting robots and significantly improve the efficiency and quality of water-jet cutting operations.

  8. Experimental Method for Characterizing Electrical Steel Sheets in the Normal Direction

    Directory of Open Access Journals (Sweden)

    Thierry Belgrand

    2010-10-01

    Full Text Available This paper proposes an experimental method to characterise magnetic laminations in the direction normal to the sheet plane. The principle, which is based on a static excitation to avoid planar eddy currents, is explained and specific test benches are proposed. Measurements of the flux density are made with a sensor moving in and out of an air-gap. A simple analytical model is derived in order to determine the permeability in the normal direction. The experimental results for grain oriented steel sheets are presented and a comparison is provided with values obtained from literature.

  9. Evaluation of statistical methods for normalization and differential expression in mRNA-Seq experiments

    Directory of Open Access Journals (Sweden)

    Hansen Kasper D

    2010-02-01

    Full Text Available Abstract Background High-throughput sequencing technologies, such as the Illumina Genome Analyzer, are powerful new tools for investigating a wide range of biological and medical questions. Statistical and computational methods are key for drawing meaningful and accurate conclusions from the massive and complex datasets generated by the sequencers. We provide a detailed evaluation of statistical methods for normalization and differential expression (DE) analysis of Illumina transcriptome sequencing (mRNA-Seq) data. Results We compare statistical methods for detecting genes that are significantly DE between two types of biological samples and find that there are substantial differences in how the test statistics handle low-count genes. We evaluate how DE results are affected by features of the sequencing platform, such as varying gene lengths, base-calling calibration method (with and without phi X control lane), and flow-cell/library preparation effects. We investigate the impact of the read count normalization method on DE results and show that the standard approach of scaling by total lane counts (e.g., RPKM) can bias estimates of DE. We propose more general quantile-based normalization procedures and demonstrate an improvement in DE detection. Conclusions Our results have significant practical and methodological implications for the design and analysis of mRNA-Seq experiments. They highlight the importance of appropriate statistical methods for normalization and DE inference, to account for features of the sequencing platform that could impact the accuracy of results. They also reveal the need for further research in the development of statistical and computational methods for mRNA-Seq.
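
    The contrast drawn above between total-count scaling and a quantile-based alternative can be made concrete with a simplified sketch; counts is an assumed genes-by-samples matrix, and the upper-quartile variant stands in for the more general quantile-based procedures discussed in the paper.

        import numpy as np

        def total_count_scale(counts):
            """Scale each lane/sample by its total read count (RPKM-style denominator)."""
            return counts / counts.sum(axis=0, keepdims=True) * 1e6   # counts per million

        def upper_quartile_scale(counts):
            """Scale each sample by the upper quartile of its non-zero gene counts."""
            uq = np.array([np.percentile(c[c > 0], 75) for c in counts.T])
            return counts / uq[None, :]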

  10. Morphology of TiAlN Thin Film onto HSS as Cutting Tools by Using Mosaic-Styled Target RF Sputtering Method

    Directory of Open Access Journals (Sweden)

    Sigit Tri Wicaksono

    2016-05-01

    Full Text Available High Speed Steel (HSS) has been widely used in the manufacturing industry for cutting tools. Several methods have been used to improve the cutting performance of HSS in dry cutting. One of them is growing a thin layer of hard coating on the contact surface of the cutting tool material. In this research, Titanium Aluminum Nitride (TiAlN) layers were deposited on AISI M41 HSS substrates by using the Radio Frequency (RF) sputtering method with mosaic-styled target materials. The aluminum surface area ratios on the titanium target are 10, 20, 30, and 40%, respectively, and the deposition times are 15, 30, and 45 minutes, respectively. The formation of TiAlN and AlN crystalline compounds was observed by the X-ray diffraction method. The morphology of the thin film layer, with a thickness ranging from 1.4 to 5.2 µm, was observed by using scanning electron microscopy. It was found that the deposition time affects the thickness and also the roughness of the layer. The topography images by atomic force microscopy showed that the deposition time of 45 minutes produces the finest layer, with a surface roughness of 10.8 nm.

  11. Developing TOPSIS method using statistical normalization for selecting knowledge management strategies

    Directory of Open Access Journals (Sweden)

    Amin Zadeh Sarraf

    2013-09-01

    Full Text Available Purpose: Numerous companies expect their knowledge management (KM) to be performed effectively in order to leverage and transform knowledge into competitive advantages. However, this raises a critical issue of how companies can better evaluate and select a favorable KM strategy prior to a successful KM implementation. Design/methodology/approach: An extension of TOPSIS, a multi-attribute decision making (MADM) technique, to a group decision environment is investigated. TOPSIS is a practical and useful technique for ranking and selecting a number of externally determined alternatives through distance measures. The entropy method is often used for assessing weights in the TOPSIS method. Entropy in information theory is a criterion used for measuring the amount of disorder represented by a discrete probability distribution. To reduce employees' resistance to implementing a new strategy, it seems necessary to take all managers' opinions into account. The normal distribution, the most prominent probability distribution in statistics, is used to normalize the gathered data. Findings: The results of this study show that, considering six criteria for evaluating the alternatives, the most appropriate KM strategy to implement in the company was "Personalization". Research limitations/implications: In this research, there are some assumptions that might affect the accuracy of the approach, such as the normality of the sample and the population. These assumptions can be changed in future work. Originality/value: This paper proposes an effective solution based on a combined entropy and TOPSIS approach to help companies that need to evaluate and select KM strategies. In the presented solution, the opinions of all managers are gathered and normalized using the standard normal distribution and the central limit theorem. Keywords: Knowledge management; strategy; TOPSIS; Normal distribution; entropy
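
    An entropy-weighted TOPSIS ranking of the kind described can be sketched as follows. This is a generic sketch, not the paper's group-decision extension: it assumes all criteria are benefit criteria and that D is a decision matrix with one row per KM strategy and one column per criterion.

        import numpy as np

        def entropy_topsis(D):
            """Rank alternatives (rows) over benefit criteria (columns) with entropy weights."""
            P = D / D.sum(axis=0)                               # column-wise proportions
            E = -(P * np.log(P + 1e-12)).sum(axis=0) / np.log(len(D))
            w = (1 - E) / (1 - E).sum()                         # entropy weights
            R = D / np.sqrt((D**2).sum(axis=0))                 # vector-normalized matrix
            V = R * w                                           # weighted normalized matrix
            best, worst = V.max(axis=0), V.min(axis=0)          # ideal / anti-ideal points
            d_best = np.linalg.norm(V - best, axis=1)
            d_worst = np.linalg.norm(V - worst, axis=1)
            closeness = d_worst / (d_best + d_worst)            # higher closeness = better
            return np.argsort(-closeness), closeness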

  12. CODEX: a normalization and copy number variation detection method for whole exome sequencing.

    Science.gov (United States)

    Jiang, Yuchao; Oldridge, Derek A; Diskin, Sharon J; Zhang, Nancy R

    2015-03-31

    High-throughput sequencing of DNA coding regions has become a common way of assaying genomic variation in the study of human diseases. Copy number variation (CNV) is an important type of genomic variation, but detecting and characterizing CNV from exome sequencing is challenging due to the high level of biases and artifacts. We propose CODEX, a normalization and CNV calling procedure for whole exome sequencing data. The Poisson latent factor model in CODEX includes terms that specifically remove biases due to GC content, exon capture and amplification efficiency, and latent systemic artifacts. CODEX also includes a Poisson likelihood-based recursive segmentation procedure that explicitly models the count-based exome sequencing data. CODEX is compared to existing methods on a population analysis of HapMap samples from the 1000 Genomes Project, and shown to be more accurate on three microarray-based validation data sets. We further evaluate performance on 222 neuroblastoma samples with matched normals and focus on a well-studied rare somatic CNV within the ATRX gene. We show that the cross-sample normalization procedure of CODEX removes more noise than normalizing the tumor against the matched normal and that the segmentation procedure performs well in detecting CNVs with nested structures.

  13. A python module to normalize microarray data by the quantile adjustment method.

    Science.gov (United States)

    Baber, Ibrahima; Tamby, Jean Philippe; Manoukis, Nicholas C; Sangaré, Djibril; Doumbia, Seydou; Traoré, Sekou F; Maiga, Mohamed S; Dembélé, Doulaye

    2011-06-01

    Microarray technology is widely used for gene expression research targeting the development of new drug treatments. In the case of a two-color microarray, the process starts with labeling DNA samples with fluorescent markers (cyanine 635 or Cy5 and cyanine 532 or Cy3), then mixing and hybridizing them on a chemically treated glass printed with probes, or fragments of genes. The level of hybridization between a strand of labeled DNA and a probe present on the array is measured by scanning the fluorescence of spots in order to quantify the expression based on the quality and number of pixels for each spot. The intensity data generated from these scans are subject to errors due to differences in fluorescence efficiency between Cy5 and Cy3, as well as variation in human handling and quality of the sample. Consequently, data have to be normalized to correct for variations which are not related to the biological phenomena under investigation. Among many existing normalization procedures, we have implemented the quantile adjustment method using the python computer language, and produced a module which can be run via an HTML dynamic form. This module is composed of different functions for data files reading, intensity and ratio computations and visualization. The current version of the HTML form allows the user to visualize the data before and after normalization. It also gives the option to subtract background noise before normalizing the data. The output results of this module are in agreement with the results of other normalization tools.
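    The quantile adjustment idea the module implements can be written in a few lines of NumPy: every array (or channel) is forced to share the same empirical intensity distribution. The toy matrix below is a placeholder, and tie handling is left to the arbitrary ordering produced by argsort.

```python
# Minimal sketch of quantile normalization: map each probe's rank within an
# array to the mean of the sorted intensities across all arrays.
import numpy as np

def quantile_normalize(X):
    """X: (n_probes, n_arrays). Returns an array with identical column distributions."""
    order = np.argsort(X, axis=0)                 # sort order of each column
    ranks = np.argsort(order, axis=0)             # rank of each probe per array
    sorted_X = np.sort(X, axis=0)
    mean_quantiles = sorted_X.mean(axis=1)        # reference distribution
    return mean_quantiles[ranks]                  # replace each value by its reference quantile

intensities = np.array([[5.0, 4.0, 3.0],
                        [2.0, 1.0, 4.0],
                        [3.0, 4.0, 6.0],
                        [4.0, 2.0, 8.0]])
print(quantile_normalize(intensities))
```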

  14. Online, efficient and precision laser profiling of bronze-bonded diamond grinding wheels based on a single-layer deep-cutting intermittent feeding method

    Science.gov (United States)

    Deng, Hui; Chen, Genyu; He, Jie; Zhou, Cong; Du, Han; Wang, Yanyi

    2016-06-01

    In this study, an online, efficient and precision laser profiling approach that is based on a single-layer deep-cutting intermittent feeding method is described. The effects of the laser cutting depth and the track-overlap ratio of the laser cutting on the efficiency, precision and quality of laser profiling were investigated. Experiments on the online profiling of bronze-bonded diamond grinding wheels were performed using a pulsed fiber laser. The results demonstrate that an increase in the laser cutting depth caused an increase in the material removal efficiency during the laser profiling process. However, the maximum laser profiling efficiency was only achieved when the laser cutting depth was equivalent to the initial surface contour error of the grinding wheel. In addition, the selection of relatively high track-overlap ratios of laser cutting for the profiling of grinding wheels was beneficial with respect to the increase in the precision of laser profiling, whereas the efficiency and quality of the laser profiling were not affected by the change in the track-overlap ratio. After optimized process parameters were employed for online laser profiling, the circular run-out error and the parallelism error of the grinding wheel surface decreased from 83.1 μm and 324.6 μm to 11.3 μm and 3.5 μm, respectively. The surface contour precision of the grinding wheel significantly improved. The highest surface contour precision for grinding wheels of the same type that can be theoretically achieved after laser profiling is completely dependent on the peak power density of the laser. The higher the laser peak power density is, the higher the surface contour precision of the grinding wheel after profiling.

  15. CALCULATION OF LASER CUTTING COSTS

    Directory of Open Access Journals (Sweden)

    Bogdan Nedic

    2016-09-01

    Full Text Available The paper describes methods of metal cutting and the calculation of processing costs based on a model developed at the Faculty of Mechanical Engineering in Kragujevac. Based on the systematization and analysis of a large number of cost-calculation models for unconventional cutting methods, a mathematical model is derived and used to create software for calculating metal cutting costs. The software solution makes it possible to calculate the cost of laser cutting, to compare it with the costs of other unconventional methods, and to provide documentation consisting of reports on the estimated costs.
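    As an illustration of the kind of calculation such software automates, the sketch below computes a per-part laser-cutting cost from cutting time and hourly rates. The cost structure, rate names and numeric values are illustrative assumptions and do not reproduce the model developed at the Faculty of Mechanical Engineering in Kragujevac.

```python
# Hypothetical per-part cost breakdown for laser cutting; rates and structure
# are illustrative assumptions only.
def laser_cutting_cost(cut_length_m, cutting_speed_m_min,
                       machine_rate_eur_h=60.0, gas_rate_eur_h=8.0,
                       energy_rate_eur_h=4.0, setup_min=5.0):
    cutting_min = cut_length_m / cutting_speed_m_min
    total_h = (cutting_min + setup_min) / 60.0
    return total_h * (machine_rate_eur_h + gas_rate_eur_h + energy_rate_eur_h)

print(round(laser_cutting_cost(cut_length_m=12.0, cutting_speed_m_min=2.5), 2), "EUR")
```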

  16. Cut off values of laser fluorescence for different storage methods at different time intervals in comparison to frozen condition: A 1 year in vitro study

    Science.gov (United States)

    Kaul, Rudra; Kaul, Vibhuti; Farooq, Riyaz; Wazir, Nikhil Dev; Khateeb, Shafayat Ullah; Malik, Altaf H; Masoodi, Ajaz Amin

    2014-01-01

    Aims: The aim of the following study is to evaluate the change in laser fluorescence (LF) values for extracted teeth stored in different solutions over a 1-year period, to give cut-off values for different storage media at different time intervals to bring them on par with in vivo conditions, and to see which medium gives the best results with the least change in LF values, thereby enhancing the validity of DIAGNOdent in research. Materials and Methods: Ninety extracted teeth, selected from a pool of frozen teeth, were divided into nine groups of 10 each. Specimens in Groups 1-8 were stored in 1% chloramine, 10% formalin, 10% buffered formalin, 0.02% thymol, 0.12% chlorhexidine, 3% sodium hypochlorite, a commercially available saliva substitute-Wet Mouth (ICPA Pharmaceuticals) and normal saline respectively at 4°C. The last group was stored under frozen condition at −20°C without contact with any storage solution. DIAGNOdent was used to measure the change in LF values at days 30, 45, 60, 160 and 365. Statistical Analysis Used: The mean changes in LF values in the different storage media at different time intervals were compared using two-way ANOVA. Results: At the end of 1 year, a significant decrease in fluorescence (P < 0.05) was observed in Groups 1-8. The maximum drop in LF values occurred between day 1 and day 30. Group 9 (frozen specimens) did not show a significant change in fluorescence response. Conclusions: An inevitable change in LF takes place due to the various storage media commonly used in dental research at different time intervals. The values obtained from our study can remove the bias caused by the storage media, and the LF values thus obtained can hence be conveniently extrapolated to the in vivo condition. PMID:24778506

  17. Cut off values of laser fluorescence for different storage methods at different time intervals in comparison to frozen condition: A 1 year in vitro study

    Directory of Open Access Journals (Sweden)

    Rudra Kaul

    2014-01-01

    Full Text Available Aims: The aim of the following study is to evaluate the change in laser fluorescence (LF) values for extracted teeth stored in different solutions over a 1-year period, to give cut-off values for different storage media at different time intervals to bring them on par with in vivo conditions, and to see which medium gives the best results with the least change in LF values, thereby enhancing the validity of DIAGNOdent in research. Materials and Methods: Ninety extracted teeth, selected from a pool of frozen teeth, were divided into nine groups of 10 each. Specimens in Groups 1-8 were stored in 1% chloramine, 10% formalin, 10% buffered formalin, 0.02% thymol, 0.12% chlorhexidine, 3% sodium hypochlorite, a commercially available saliva substitute-Wet Mouth (ICPA Pharmaceuticals) and normal saline respectively at 4°C. The last group was stored under frozen condition at −20°C without contact with any storage solution. DIAGNOdent was used to measure the change in LF values at days 30, 45, 60, 160 and 365. Statistical Analysis Used: The mean changes in LF values in the different storage media at different time intervals were compared using two-way ANOVA. Results: At the end of 1 year, a significant decrease in fluorescence (P < 0.05) was observed in Groups 1-8. The maximum drop in LF values occurred between day 1 and day 30. Group 9 (frozen specimens) did not show a significant change in fluorescence response. Conclusions: An inevitable change in LF takes place due to the various storage media commonly used in dental research at different time intervals. The values obtained from our study can remove the bias caused by the storage media, and the LF values thus obtained can hence be conveniently extrapolated to the in vivo condition.

  18. Experimental validation of normalized uniform load surface curvature method for damage localization.

    Science.gov (United States)

    Jung, Ho-Yeon; Sung, Seung-Hoon; Jung, Hyung-Jo

    2015-10-16

    In this study, we experimentally validated the normalized uniform load surface (NULS) curvature method, which has been developed recently to assess damage localization in beam-type structures. The normalization technique allows for the accurate assessment of damage localization with greater sensitivity irrespective of the damage location. In this study, damage to a simply supported beam was numerically and experimentally investigated on the basis of the changes in the NULS curvatures, which were estimated from the modal flexibility matrices obtained from the acceleration responses under an ambient excitation. Two damage scenarios were considered, a single-damage case and a multiple-damage case, by reducing the bending stiffness (EI) of the affected element(s). Numerical simulations were performed using MATLAB as a preliminary step. During the validation experiments, a series of tests were performed. It was found that the damage locations could be identified successfully without any false-positive or false-negative detections using the proposed method. For comparison, the damage detection performance was compared with that of two other well-known methods based on the modal flexibility matrix, namely, the uniform load surface (ULS) method and the ULS curvature method. It was confirmed that the proposed method is more effective for locating damage in simply supported beams than the two conventional methods in terms of sensitivity to damage under measurement noise.
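    A minimal sketch of the underlying quantities follows: the modal flexibility assembled from a few mode shapes and natural frequencies, the uniform load surface obtained by applying a unit load at every node, and its curvature from central differences. The specific normalization used in the NULS method is not reproduced here; dividing each ULS by its maximum value and the toy beam data are illustrative assumptions.

```python
# Sketch of the modal-flexibility / uniform-load-surface (ULS) curvature idea,
# on a simulated simply supported beam. Not the exact NULS normalization.
import numpy as np

def uls_curvature(mode_shapes, freqs_hz, dx):
    """mode_shapes: (n_nodes, n_modes); freqs_hz: natural frequencies."""
    omega2 = (2 * np.pi * np.asarray(freqs_hz)) ** 2
    F = (mode_shapes / omega2) @ mode_shapes.T          # modal flexibility matrix
    uls = F @ np.ones(F.shape[0])                       # deflection under unit load at every node
    uls = uls / np.abs(uls).max()                       # assumed, illustrative normalization
    curv = np.zeros_like(uls)
    curv[1:-1] = (uls[2:] - 2 * uls[1:-1] + uls[:-2]) / dx ** 2   # central differences
    return curv

# Toy data: first three sine modes of a simply supported beam, 21 nodes.
x = np.linspace(0, 1, 21)
phi = np.column_stack([np.sin(k * np.pi * x) for k in (1, 2, 3)])
freqs = [10.0, 40.0, 90.0]
curv_intact = uls_curvature(phi, freqs, dx=x[1] - x[0])

# Damage index = change in ULS curvature between damaged and intact states.
phi_damaged = phi.copy()
phi_damaged[10, 0] *= 1.05          # crude local perturbation near midspan
index = np.abs(uls_curvature(phi_damaged, freqs, dx=x[1] - x[0]) - curv_intact)
print("suspected damage near node", int(index.argmax()))
```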

  19. Evaluation of Real-Time PCR to complement ISO 6579:2004 method for the detection of Salmonella in pork cuts

    Directory of Open Access Journals (Sweden)

    Frédérique Pasquali

    2012-10-01

    Full Text Available According to Commission Regulation (EC) No 2073/2005 of 15 November 2005 on microbiological criteria for foodstuffs, the analytical reference method for the detection of Salmonella in food is ISO 6579:2004. However, this long and labor-intensive method is not in line with the short production times of the food industry. In recent years, Real-Time PCR has been used more and more by scientists for the reliable, fast and specific detection of bacterial pathogens in food. The aim of the present study was to evaluate the Salmonella detection capability of a validated Real-Time PCR assay on naturally contaminated pork cuts in comparison with the reference method ISO 6579:2004. Three samplings were performed and included 16 pork cut packages. From each package, three aliquots of 10 g each were tested separately by the ISO 6579:2004 method and by Real-Time PCR. In particular, this molecular method was applied to DNA samples extracted from the pre-enrichment broth after 1 and 18 hours of incubation. Within the three sampling periods, Real-Time PCR detected Salmonella in 81%, 100% and 62.5% of pork cut samples respectively, whereas the corresponding detection percentages of the reference method were 56%, 81% and 62.5% respectively. In conclusion, the Real-Time PCR assay used in the present study might be a reliable tool for the fast detection of Salmonella on pork cuts, especially when a large number of samples needs to be tested. The reference method might then be applied only to positive samples for the isolation purposes that are mandatory in epidemiological investigations.

  20. Discrimination methods for biological contaminants in fresh-cut lettuce based on VNIR and NIR hyperspectral imaging

    Science.gov (United States)

    Mo, Changyeun; Kim, Giyoung; Kim, Moon S.; Lim, Jongguk; Lee, Seung Hyun; Lee, Hong-Seok; Cho, Byoung-Kwan

    2017-09-01

    The rapid detection of biological contaminants such as worms in fresh-cut vegetables is necessary to improve the efficiency of visual inspections carried out by workers. Multispectral imaging algorithms were developed using visible-near-infrared (VNIR) and near-infrared (NIR) hyperspectral imaging (HSI) techniques to detect worms in fresh-cut lettuce. The optimal wavebands that can detect worms in fresh-cut lettuce were investigated for each type of HSI using one-way ANOVA. Worm-detection imaging algorithms for VNIR and NIR imaging exhibited prediction accuracies of 97.00% (RI547/945) and 100.0% (RI1064/1176, SI1064-1176, RSI-I(1064-1173)/1064, and RSI-II(1064-1176)/(1064+1176)), respectively. The two HSI techniques revealed that spectral images with a pixel size of 1 × 1 mm or 2 × 2 mm had the best classification accuracy for worms. The results demonstrate that hyperspectral reflectance imaging techniques have the potential to detect worms in fresh-cut lettuce. Future research relating to this work will focus on a real-time sorting system for lettuce that can simultaneously detect various defects such as browning, worms, and slugs.
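    A waveband-ratio rule such as RI547/945 can be sketched as a simple ratio of two band images followed by a threshold. The hyperspectral cube, the wavelength grid and the decision threshold below are synthetic placeholders, not the calibrated values from the study.

```python
# Minimal sketch of a waveband-ratio image RI = I(lambda1)/I(lambda2) on a
# synthetic hyperspectral cube (rows x cols x bands).
import numpy as np

rng = np.random.default_rng(1)
wavelengths = np.arange(400, 1001, 5)                            # nm, hypothetical grid
cube = rng.uniform(0.2, 0.8, size=(64, 64, wavelengths.size))

def band(cube, wavelengths, target_nm):
    """Return the band image closest to the requested wavelength."""
    return cube[:, :, np.argmin(np.abs(wavelengths - target_nm))]

ratio_image = band(cube, wavelengths, 547) / band(cube, wavelengths, 945)
worm_mask = ratio_image > 1.2                                    # assumed decision threshold
print("flagged pixels:", int(worm_mask.sum()))
```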

  1. Edgeworth expansions and rates of convergence for normalized sums: Chung's 1946 method revisited

    OpenAIRE

    2010-01-01

    Abstract In this paper we revisit, correct and extend Chung's 1946 method for deriving higher order Edgeworth expansions with respect to t-statistics and generalized self-normalized sums. Thereby we provide a set of formulas which allows the computation of the approximation of any order and specify the first four polynomials in the Edgeworth expansion the first two of which are well known. It turns out that knowledge of the first four polynomials is necessary and sufficient for cha...

  2. Impact of the Choice of Normalization Method on Molecular Cancer Class Discovery Using Nonnegative Matrix Factorization

    Science.gov (United States)

    Yang, Haixuan; Seoighe, Cathal

    2016-01-01

    Nonnegative Matrix Factorization (NMF) has proved to be an effective method for unsupervised clustering analysis of gene expression data. By the nonnegativity constraint, NMF provides a decomposition of the data matrix into two matrices that have been used for clustering analysis. However, the decomposition is not unique. This allows different clustering results to be obtained, resulting in different interpretations of the decomposition. To alleviate this problem, some existing methods directly enforce uniqueness to some extent by adding regularization terms in the NMF objective function. Alternatively, various normalization methods have been applied to the factor matrices; however, the effects of the choice of normalization have not been carefully investigated. Here we investigate the performance of NMF for the task of cancer class discovery, under a wide range of normalization choices. After extensive evaluations, we observe that the maximum norm showed the best performance, although the maximum norm has not previously been used for NMF. Matlab codes are freely available from: http://maths.nuigalway.ie/~haixuanyang/pNMF/pNMF.htm. PMID:27741311
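    The maximum-norm normalization that performed best can be sketched as follows: after fitting NMF, each column of the basis matrix W is scaled so its largest entry equals one, with the inverse scaling applied to H so the product W·H is unchanged, and samples are then assigned to the metagene with the largest normalized coefficient. The random data and the choice of three components are placeholders for a real nonnegative expression matrix.

```python
# Sketch of NMF-based class discovery with max-norm normalization of the basis matrix.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
X = rng.random((500, 40))                      # genes x samples, nonnegative placeholder

model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(X)                     # genes x k (metagenes)
H = model.components_                          # k x samples

# Max-norm normalization: scale each metagene so its largest entry is 1,
# compensating in H so that W @ H is unchanged.
scale = W.max(axis=0)
W_norm = W / scale
H_norm = H * scale[:, None]

clusters = H_norm.argmax(axis=0)               # assign each sample to a metagene
print(np.bincount(clusters, minlength=3))
```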

  3. Quantitative analysis of collagen change between normal and cancerous thyroid tissues based on SHG method

    Science.gov (United States)

    Chen, Xiwen; Huang, Zufang; Xi, Gangqin; Chen, Yongjian; Lin, Duo; Wang, Jing; Li, Zuanfang; Sun, Liqing; Chen, Jianxin; Chen, Rong

    2012-03-01

    Second-harmonic generation (SHG) has proved to be a method offering high spatial resolution, large penetration depth and freedom from photobleaching. In our study, the SHG method was used to investigate normal and cancerous thyroid tissue. For SHG imaging, the system parameters were adjusted for high-contrast image acquisition. Each x-y image was recorded in pseudo-color, which matches the wavelength range in the visible spectrum. The acquisition time for a 512×512-pixel image was 1.57 sec; each acquired image was averaged over four frames to improve the signal-to-noise ratio. Our results indicated that collagen presence, determined as the ratio of SHG-positive pixels to total pixels, was 0.48+/-0.05 for normal and 0.33+/-0.06 for cancerous thyroid tissue. In addition, to quantitatively assess collagen-related changes, we applied GLCM texture analysis to the SHG images. The corresponding results showed that the correlation fell off with distance in both the normal and the cancerous groups. The calculated values of Corr50 (the distance where the correlation crossed 50% of the initial correlation) indicated a significant difference. This study demonstrates that the SHG method can be used as a complementary tool in thyroid histopathology.
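    A small sketch of the GLCM texture step follows: gray-level co-occurrence matrices at increasing pixel distances, the correlation property for each distance, and the distance at which it drops to half its initial value (Corr50). The image is synthetic rather than an SHG micrograph, and the example assumes scikit-image 0.19 or later for the graycomatrix/graycoprops names.

```python
# Sketch of GLCM correlation versus pixel distance and the Corr50 distance.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(0)
# Smooth-ish synthetic 8-bit image standing in for an SHG collagen image.
img = rng.random((128, 128))
img = (255 * (img + np.roll(img, 1, axis=1)) / 2).astype(np.uint8)

distances = np.arange(1, 31)
glcm = graycomatrix(img, distances=distances, angles=[0],
                    levels=256, symmetric=True, normed=True)
corr = graycoprops(glcm, "correlation")[:, 0]          # one value per distance

reached = np.any(corr <= 0.5 * corr[0])
corr50_idx = np.argmax(corr <= 0.5 * corr[0]) if reached else None
print("Corr50 distance (pixels):",
      distances[corr50_idx] if corr50_idx is not None else "not reached")
```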

  4. Evaluation of algorithm methods for fluorescence spectra of cancerous and normal human tissues

    Science.gov (United States)

    Pu, Yang; Wang, Wubao; Alfano, Robert R.

    2016-03-01

    The paper focuses on various algorithms that unravel fluorescence spectra by unmixing methods to identify cancerous and normal human tissues from measured fluorescence spectroscopy. The biochemical or morphologic changes that cause fluorescence spectra variations appear earlier than changes detectable by the histological approach; therefore, fluorescence spectroscopy holds great promise as a clinical tool for diagnosing early-stage carcinomas and other diseases in vivo. The method can further identify tissue biomarkers by decomposing the spectral contributions of different fluorescent molecules of interest. In this work, we investigate the performance of blind source unmixing methods (backward model) and spectral fitting approaches (forward model) in decomposing the contributions of key fluorescent molecules from the tissue mixture background when a selected excitation wavelength is applied. Pairs of adenocarcinoma and normal tissues, confirmed by a pathologist, were excited at a selective wavelength of 340 nm. The emission spectra of resected fresh tissue were used to evaluate the relative changes of collagen, reduced nicotinamide adenine dinucleotide (NADH), and flavin by various spectral unmixing methods. Two categories of algorithms, forward methods and blind source separation [such as principal component analysis (PCA), independent component analysis (ICA), and nonnegative matrix factorization (NMF)], are introduced and evaluated. The purpose of the spectral analysis is to discard the redundant information which conceals the difference between these two types of tissues, but keep their diagnostic significance. The predictions of the different methods were compared to the gold standard of histopathology. The results indicate that key fluorophores within tissue, e.g. tryptophan, collagen, NADH, and flavin, show differences in relative content among different types of human cancer and normal tissues. The
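    The forward (spectral-fitting) model mentioned here can be sketched as a non-negative least-squares fit of a measured emission spectrum to a set of basis fluorophore spectra. In the sketch below the basis spectra are synthetic Gaussians standing in for measured collagen, NADH and flavin spectra, so the recovered weights are only illustrative.

```python
# Sketch of forward spectral fitting: express a measured emission spectrum as a
# nonnegative combination of basis fluorophore spectra via non-negative least squares.
import numpy as np
from scipy.optimize import nnls

wl = np.linspace(380, 600, 221)                       # emission wavelengths, nm

def gauss(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

basis = np.column_stack([gauss(400, 30),              # stand-in for "collagen"
                         gauss(460, 35),              # stand-in for "NADH"
                         gauss(525, 30)])             # stand-in for "flavin"

true_weights = np.array([0.6, 0.3, 0.1])
measured = basis @ true_weights + 0.01 * np.random.default_rng(0).normal(size=wl.size)

weights, residual = nnls(basis, measured)
print("recovered relative contributions:", (weights / weights.sum()).round(3))
```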

  5. Effect of various normalization methods on Applied Biosystems expression array system data

    Directory of Open Access Journals (Sweden)

    Keys David N

    2006-12-01

    Full Text Available Abstract Background DNA microarray technology provides a powerful tool for characterizing gene expression on a genome scale. While the technology has been widely used in discovery-based medical and basic biological research, its direct application in clinical practice and regulatory decision-making has been questioned. A few key issues, including the reproducibility, reliability, compatibility and standardization of microarray analysis and results, must be critically addressed before any routine usage of microarrays in clinical laboratory and regulated areas can occur. In this study we investigate some of these issues for the Applied Biosystems Human Genome Survey Microarrays. Results We analyzed the gene expression profiles of two samples: brain and universal human reference (UHR), a mixture of RNAs from 10 cancer cell lines, using the Applied Biosystems Human Genome Survey Microarrays. Five technical replicates in three different sites were performed on the same total RNA samples according to manufacturer's standard protocols. Five different methods, quantile, median, scale, VSN and cyclic loess were used to normalize AB microarray data within each site. 1,000 genes spanning a wide dynamic range in gene expression levels were selected for real-time PCR validation. Using the TaqMan® assays data set as the reference set, the performance of the five normalization methods was evaluated focusing on the following criteria: (1) Sensitivity and reproducibility in detection of expression; (2) Fold change correlation with real-time PCR data; (3) Sensitivity and specificity in detection of differential expression; (4) Reproducibility of differentially expressed gene lists. Conclusion Our results showed a high level of concordance between these normalization methods. This is true, regardless of whether signal, detection, variation, fold change measurements and reproducibility were interrogated. Furthermore, we used TaqMan® assays as a reference, to generate

  6. The N/D method with non-perturbative left-hand-cut discontinuity and the $^1S_0$ $NN$ partial wave

    CERN Document Server

    Entem, D R

    2016-01-01

    In this letter we deduce an integral equation that allows one to calculate the exact left-hand-cut discontinuity for an uncoupled $S$-wave partial-wave amplitude in potential scattering for a given finite-range potential. The results obtained from the $N/D$ method for the partial-wave amplitude are rigorous, since now the discontinuities along the left-hand cut and right-hand cut are exactly known. This solves the open question with respect to the $N/D$ method and the effect on the final result of the non-perturbative iterative diagrams in the evaluation of $\Delta(A)$. A big advantage of the method is that short-range physics (corresponding to integrated-out degrees of freedom within low-energy Effective Field Theory) does not contribute to $\Delta(A)$ and manifests itself through the extra subtractions that are implemented within the method. We show the equivalence of the $N/D$ method and the Lippmann-Schwinger (LS) equation for a nonsingular $^1S_0$ $NN$ potential (Yukawa potential). The equivalence between the $N...

  7. Evaluation of normalization methods to pave the way towards large-scale LC-MS-based metabolomics profiling experiments.

    Science.gov (United States)

    Ejigu, Bedilu Alamirie; Valkenborg, Dirk; Baggerman, Geert; Vanaerschot, Manu; Witters, Erwin; Dujardin, Jean-Claude; Burzykowski, Tomasz; Berg, Maya

    2013-09-01

    Combining liquid chromatography-mass spectrometry (LC-MS)-based metabolomics experiments that were collected over a long period of time remains problematic due to systematic variability between LC-MS measurements. Until now, most normalization methods for LC-MS data are model-driven, based on internal standards or intermediate quality control runs, where an external model is extrapolated to the dataset of interest. In the first part of this article, we evaluate several existing data-driven normalization approaches on LC-MS metabolomics experiments, which do not require the use of internal standards. According to variability measures, each normalization method performs relatively well, showing that the use of any normalization method will greatly improve data-analysis originating from multiple experimental runs. In the second part, we apply cyclic-Loess normalization to a Leishmania sample. This normalization method allows the removal of systematic variability between two measurement blocks over time and maintains the differential metabolites. In conclusion, normalization allows for pooling datasets from different measurement blocks over time and increases the statistical power of the analysis, hence paving the way to increase the scale of LC-MS metabolomics experiments. From our investigation, we recommend data-driven normalization methods over model-driven normalization methods, if only a few internal standards were used. Moreover, data-driven normalization methods are the best option to normalize datasets from untargeted LC-MS experiments.
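    One common data-driven approach alluded to here, cyclic-loess-style normalization, can be illustrated for a pair of runs: fit a loess curve to the M-A plot of log intensities and subtract the fitted trend. The synthetic data, the frac smoothing parameter and the symmetric redistribution of the correction below are illustrative assumptions, not the exact procedure used in the paper.

```python
# Sketch of loess (M-A) normalization between two LC-MS runs on synthetic data.
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(0)
true = rng.lognormal(mean=10, sigma=1, size=1000)               # "true" feature abundances
run1 = true * rng.lognormal(0.0, 0.05, size=true.size)
run2 = true * 1.3 * rng.lognormal(0.0, 0.05, size=true.size)    # systematic bias in run 2

m = np.log2(run1) - np.log2(run2)                               # M: log ratio
a = 0.5 * (np.log2(run1) + np.log2(run2))                       # A: mean log intensity

trend = lowess(m, a, frac=0.3, return_sorted=False)             # intensity-dependent bias
m_corrected = m - trend

# Redistribute the correction symmetrically to both runs.
run1_norm = 2 ** (a + m_corrected / 2)
run2_norm = 2 ** (a - m_corrected / 2)
print("median log-ratio before/after:",
      np.median(m).round(3), np.median(m_corrected).round(3))
```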

  8. Force Modeling for Ultrasonic-assisted Wire Saw Cutting SiC Monocrystal Wafers

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jie; LI Shujuan; Liu Yong

    2011-01-01

    Advantages such as a small cutting force, a narrow kerf and little material waste make wire saw cutting suitable for machining precious materials like SiC and Si monocrystals and a variety of gems. In the traditional wire saw cutting of wafers, however, the cutting efficiency is low, the wear of the wire saw is severe and the surface roughness of the wafer is poor, which seriously affects the stability of the cutting process and the usability of the wafers. Ultrasonic-assisted machining is well suited for processing a variety of non-conductive hard and brittle materials such as glass, ceramics, quartz, silicon, precious stones and diamonds. In this paper, a force model of ultrasonic-assisted wire saw cutting of SiC monocrystal wafers was established based on kinematic and experimental analysis. Single-factor and orthogonal experiments for different processing parameters, such as wire saw speed, part rotation speed and part feed rate, were carried out for both traditional and ultrasonic-assisted wire saw cutting. Multiple linear regression was used to establish a static model relating the cutting force to the processing parameters and the ultrasonic vibration parameters, and the significance of the model was verified. The results show that, for ultrasonic-assisted wire saw cutting of SiC monocrystal wafers, the tangential and normal cutting forces are reduced by about 24.5%-36% and 36.6%-40%, respectively.
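    The multiple-linear-regression step described above can be sketched as a log-linear fit of an assumed power-law force model. The model form F = C·v^a·n^b·f^c, the parameter ranges and the synthetic data are illustrative assumptions; the study's actual regressors also include the ultrasonic vibration parameters, which are omitted here.

```python
# Sketch of fitting an empirical cutting-force model by multiple linear regression
# after log-linearizing an assumed power law. Data and exponents are synthetic.
import numpy as np

rng = np.random.default_rng(0)
v = rng.uniform(1.0, 3.0, 40)       # wire saw speed, m/s
n = rng.uniform(100, 400, 40)       # part rotation speed, rpm
f = rng.uniform(0.1, 0.5, 40)       # feed rate, mm/min
F = 2.0 * v**-0.3 * n**0.2 * f**0.6 * rng.lognormal(0, 0.03, 40)  # synthetic forces

# Linearize: ln F = ln C + a ln v + b ln n + c ln f, then least squares.
X = np.column_stack([np.ones_like(v), np.log(v), np.log(n), np.log(f)])
coef, *_ = np.linalg.lstsq(X, np.log(F), rcond=None)
lnC, a, b, c = coef
print(f"C={np.exp(lnC):.3f}, a={a:.3f}, b={b:.3f}, c={c:.3f}")
```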

  9. Numerical Validation of the Delaunay Normalization and the Krylov-Bogoliubov-Mitropolsky Method

    Directory of Open Access Journals (Sweden)

    David Ortigosa

    2014-01-01

    Full Text Available A scalable second-order analytical orbit propagator programme based on modern and classical perturbation methods is being developed. As a first step in the validation and verification of part of our orbit propagator programme, we only consider the perturbation produced by zonal harmonic coefficients in the Earth’s gravity potential, so that it is possible to analyze the behaviour of the mathematical expressions involved in Delaunay normalization and the Krylov-Bogoliubov-Mitropolsky method in depth and determine their limits.

  10. Establishment of a novel method for primary culture of normal human cervical keratinocytes

    Institute of Scientific and Technical Information of China (English)

    LIU Yu-zhen; LÜ Xiu-ping; PAN Zi-xuan; ZHANG Wei; CHEN Zhao-ri; WANG Hui; LIU Hua

    2013-01-01

    Background: Cervical keratinocytes are recovered in low numbers and are frequently contaminated with human fibroblasts, which rapidly overgrow the epithelial cells when cultured in medium supplemented with 10% fetal bovine serum (FBS). However, it is difficult to initiate keratinocyte cultures with serum-free keratinocyte growth medium alone because cell attachment can be poor. The culture of these cells is therefore extremely difficult. In this study, we describe a modified culture medium and coated culture plastics for growing normal human cervical epithelial cells in vitro. Methods: Normal cervical epithelial tissue pieces were obtained and digested with type I collagenase to dissociate the cells and produce a single-cell suspension. The cells were cultured on plastic tissue culture substrate alone or on substrate coated with collagen type I from rat tail, with modified keratinocyte serum-free medium (K-SFM) supplemented with 5% FBS. After attachment, the medium was replaced with K-SFM without FBS. The expression of the basal keratins of the ectocervical epithelium, K5, K14 and K19, was assayed by immunofluorescence with monoclonal antibodies to determine cell purity. Results: Our results indicate that cells attached to the culture plastic more quickly in K-SFM supplemented with 5% FBS than in K-SFM alone, as well as to tissue culture plastic coated with collagen type I rather than uncoated plastic. The modified medium composed of K-SFM and 5% FBS, combined with tissue culture plastic coated with collagen type I from rat tail, was the best method for culturing normal cervical epithelial cells. K5, K14 and K19 were assayed and keratinocyte purity was nearly 100%. Conclusion: This novel, simple and effective method can be used to rapidly obtain highly purified keratinocytes from normal human cervical epithelium.

  11. Effects of Different Cutting Patterns and Experimental Conditions on the Performance of a Conical Drag Tool

    Science.gov (United States)

    Copur, Hanifi; Bilgin, Nuh; Balci, Cemal; Tumac, Deniz; Avunduk, Emre

    2017-06-01

    This study aims at determining the effects of single-, double-, and triple-spiral cutting patterns; the effects of tool cutting speeds on the experimental scale; and the effects of the method of yield estimation on cutting performance by performing a set of full-scale linear cutting tests with a conical cutting tool. The average and maximum normal, cutting and side forces; specific energy; yield; and coarseness index are measured and compared in each cutting pattern at a 25-mm line spacing, at varying depths of cut per revolution, and using two cutting speeds on five different rock samples. The results indicate that the optimum specific energy decreases by approximately 25% with an increasing number of spirals from the single- to the double-spiral cutting pattern for the hard rocks, whereas generally little effect was observed for the soft- and medium-strength rocks. The double-spiral cutting pattern appeared to be more effective than the single- or triple-spiral cutting pattern and had an advantage of lower side forces. The tool cutting speed had no apparent effect on the cutting performance. The estimation of the specific energy by the yield based on the theoretical swept area was not significantly different from that estimated by the yield based on the muck weighing, especially for the double- and triple-spiral cutting patterns and with the optimum ratio of line spacing to depth of cut per revolution. This study also demonstrated that the cutterhead and mechanical miner designs, semi-theoretical deterministic computer simulations and empirical performance predictions and optimization models should be based on realistic experimental simulations. Studies should be continued to obtain more reliable results by creating a larger database of laboratory tests and field performance records for mechanical miners using drag tools.

  12. Non-binary decomposition trees - a method of reliability computation for systems with known minimal paths/cuts

    Energy Technology Data Exchange (ETDEWEB)

    Malinowski, Jacek

    2004-05-01

    A coherent system with independent components and known minimal paths (cuts) is considered. In order to compute its reliability, a tree structure T is constructed whose nodes contain the modified minimal paths (cuts) and numerical values. The value of a non-leaf node is a function of its child nodes' values. The values of leaf nodes are calculated from a simple formula. The value of the root node is the system's failure probability (reliability). Subsequently, an algorithm computing the system's failure probability (reliability) is constructed. The algorithm scans all nodes of T using a stack structure for this purpose. The nodes of T are alternately put on and removed from the stack, their data being modified in the process. Once the algorithm has terminated, the stack contains only the final modification of the root node of T, and its value is equal to the system's failure probability (reliability)
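    For small systems, the quantity the decomposition-tree algorithm computes can be cross-checked by brute force: with independent components and known minimal cut sets, the system failure probability follows from inclusion-exclusion over the cut events. The sketch below implements only this brute-force check, not the paper's tree algorithm, and the example system is hypothetical.

```python
# Brute-force reference: system failure probability from minimal cut sets via
# inclusion-exclusion, assuming independent components. Not the tree algorithm.
from itertools import combinations

def failure_probability(min_cuts, q):
    """min_cuts: list of frozensets of component ids; q[i]: failure prob of component i."""
    total = 0.0
    for r in range(1, len(min_cuts) + 1):
        sign = (-1) ** (r + 1)
        for combo in combinations(min_cuts, r):
            union = frozenset().union(*combo)
            p = 1.0
            for comp in union:
                p *= q[comp]          # all components in the union must fail
            total += sign * p
    return total

# Example: 2-out-of-3 style system with minimal cuts {1,2}, {1,3}, {2,3}.
q = {1: 0.1, 2: 0.2, 3: 0.05}
cuts = [frozenset({1, 2}), frozenset({1, 3}), frozenset({2, 3})]
print("system failure probability:", round(failure_probability(cuts, q), 6))
```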

  13. Value of two noninvasive methods to detect progression of fibrosis among HCV carriers with normal aminotransferases.

    Science.gov (United States)

    Colletta, Cosimo; Smirne, Carlo; Fabris, Carlo; Toniutto, Pierluigi; Rapetti, Rachele; Minisini, Rosalba; Pirisi, Mario

    2005-10-01

    The course of hepatitis C virus (HCV) infection in carriers with normal/near-normal aminotransferases (NALT) is usually mild; however, in a few, fibrosis progression occurs. We aimed to verify whether monitoring by liver biopsy might be replaced by noninvasive methods and to identify factors associated with fibrosis progression in patients with persistently normal alanine aminotransferases. We studied 40 untreated HCV-RNA-positive subjects (22 male; median age, 44 years), who underwent two liver biopsies, with a median interval of 78.5 months, during which alanine aminotransferase concentrations (median number of determinations: 12) never exceeded 1.2 times the upper normal limit. Within 9 months from the second biopsy, they were tested by the shear elasticity probe (Fibroscan) and the artificial intelligence algorithm FibroTest. METAVIR fibrosis scores were analyzed in relation to demographic, clinical, and viral parameters. Weighted kappa analysis was used to verify whether the results of the noninvasive methods agreed with histology. Significant fibrosis (> or = F2), present at the first biopsy in only one patient (2.5%), was observed at the second biopsy in 14 patients (35%). At multivariate analysis, excess alcohol consumption in the past (>20 g/d; P = .017) and viral load (>8.0 x 10(6) copies/mL; P = .021) were independent predictors of progression. In identifying patients with significant fibrosis, inter-rater agreement was excellent for Fibroscan (weighted kappa = 1.0) and poor for FibroTest (weighted kappa = -0.041). In conclusion, among HCV carriers with NALT, Fibroscan is superior to the FibroTest for the noninvasive identification of fibrosis, for which excess alcohol consumption in the past and a high viral load represent risk factors.

  14. An Interior Point Path-following Method for Nonconvex Programming With Quasi Normal Cone Condition

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Since Karmarkar's famous paper [1] on a new polynomial interior point algorithm for linear programming was published in 1984, interior point methods have proven to be a class of efficient methods for mathematical programming and have received much attention. Up till now, theories, algorithms and applications of interior point methods for linear programming as well as convex nonlinear programming have been well studied (see [2] and references therein); however, few results on nonconvex programming have been published. For nonconvex problems, the existence of an interior path to a solution, which is trivial for linear and convex programming, becomes a key problem. In [3,4], the so-called normal cone condition (NCC) was introduced and used for a class of nonconvex programming problems; that is, the outward normal cone of the feasible set cannot meet the strictly feasible set. This condition is a generalization of convexity; in other words, it imposes restrictions on the nonconvexity of the feasible set. A combined homotopy interior point method was constructed, and the existence of an interior path from a known interior point to a solution of the K-K-T system for nonconvex programming under the NCC was proven.

  15. ADVANCED CUTTINGS TRANSPORT STUDY

    Energy Technology Data Exchange (ETDEWEB)

    Stefan Miska; Nicholas Takach; Kaveh Ashenayi; Mengjiao Yu; Ramadan Ahmed; Mark Pickell; Len Volk; Lei Zhou; Zhu Chen; Aimee Washington; Crystal Redden

    2003-09-30

    The Quarter began with installing the new drill pipe, hooking up the new hydraulic power unit, completing the pipe rotation system (Task 4 has been completed), and making the SWACO choke operational. Detailed design and procurement work is proceeding on a system to elevate the drill-string section. The prototype Foam Generator Cell has been completed by Temco and delivered. Work is currently underway to calibrate the system. Literature review and preliminary model development for cuttings transportation with polymer foam under EPET conditions are in progress. Preparations for preliminary cuttings transport experiments with polymer foam have been completed. Two nuclear densitometers were re-calibrated. Drill pipe rotation system was tested up to 250 RPM. Water flow tests were conducted while rotating the drill pipe up to 100 RPM. The accuracy of weight measurements for cuttings in the annulus was evaluated. Additional modifications of the cuttings collection system are being considered in order to obtain the desired accurate measurement of cuttings weight in the annular test section. Cutting transport experiments with aerated fluids are being conducted at EPET, and analyses of the collected data are in progress. The printed circuit board is functioning with acceptable noise level to measure cuttings concentration at static condition using ultrasonic method. We were able to conduct several tests using a standard low pass filter to eliminate high frequency noise. We tested to verify that we can distinguish between different depths of sand in a static bed of sand. We tested with water, air and a mix of the two mediums. Major modifications to the DTF have almost been completed. A stop-flow cell is being designed for the DTF, the ACTF and Foam Generator/Viscometer which will allow us to capture bubble images without the need for ultra fast shutter speeds or microsecond flash system.

  16. [Modified method of constructing tissue microarray which contains keloid and normal skin].

    Science.gov (United States)

    Zhang, Zhenyu; Chen, Junjie; Cen, Ying; Zhao, Sha; Liao, Dianying; Gong, Jing

    2010-08-01

    To develop a method for constructing a tissue microarray containing keloid, skin around keloid, and normal skin. The specimens were obtained by voluntary donation from patients between March and May 2009, including tissues of keloid (27 cases), skin around keloid (13 cases), and normal skin (27 cases). The specimens were embedded in paraffin as donor blocks. The traditional methods of constructing and sectioning the tissue microarray were modified according to the histological characteristics of keloid and skin tissue and the experimental requirements. Tissue cores were drilled from the donor blocks and attached securely to a prepared adhesive platform. The adhesive platform with the tissue cores in situ was placed into an embedding mold, which was briefly preheated. Paraffin at approximately 70 degrees C was injected to fill the mold and then cooled to room temperature. HE staining and immunohistochemistry staining were then performed and the results were observed under a microscope. The constructed tissue microarray block contained 67 cores as designed and displayed a smooth surface with no cracks. All cores were distributed regularly, with no disintegration or obvious shift. HE staining of the tissue microarray sections showed that all cores had equal thickness, distinct layers, clear contrast and well-defined edges, and were consistent with the original pathological diagnoses. The immunohistochemistry staining results demonstrated that all cores contained enough tissue for group comparisons. However, in a tissue microarray made by the traditional method, many cores were missing and a few had shifted obviously. The modified method can successfully construct a tissue microarray composed of keloid, skin around keloid, and normal skin. This tissue microarray will become an effective tool for researching the pathogenesis of keloids.

  17. Cutting temperature measurement and material machinability

    Directory of Open Access Journals (Sweden)

    Nedić Bogdan P.

    2014-01-01

    Full Text Available Cutting temperature is a very important parameter of the cutting process. Around 90% of the heat generated during cutting is carried away by the chips, and the rest is transferred to the tool and workpiece. In this research, the cutting temperature was measured with artificial thermocouples, and the assessment of metal machinability from the standpoint of cutting temperature was analyzed. For the investigation of machinability during turning, an artificial thermocouple was placed just below the cutting tip of the insert, and for drilling, thermocouples were placed through screw holes on the face surface. In this way, a simple, reliable, economical and accurate method for investigating machinability was obtained.

  18. Kids Who Cut.

    Science.gov (United States)

    Coy, Doris Rhea; Simpson, Chris

    2002-01-01

    Regardless of whether it is cutting, burning or some other form of self-harm, self-injury is a serious problem requiring serious solutions. This article reviews the various types of self-harm, descriptions of self-mutilators, common myths about self-mutilation, and effective treatment methods. (GCP)

  19. Normalizing surface electromyographic measures of the masticatory muscles: Comparison of two different methods for clinical purpose.

    Science.gov (United States)

    Mapelli, Andrea; Tartaglia, Gianluca Martino; Connelly, Stephen Thaddeus; Ferrario, Virgilio Ferruccio; De Felicio, Claudia Maria; Sforza, Chiarella

    2016-10-01

    To compare a new normalization technique (wax pad, WAX) with the currently utilized cotton roll (COT) method in surface electromyography (sEMG) of the masticatory muscles. sEMG of the masseter and anterior temporalis muscles of 23 subjects was recorded while performing two repetitions of 5-s maximum voluntary clenches (MVC) on COT and WAX. For each task, the mean value of sEMG amplitude and its coefficient of variation were calculated, and the differences between the two repetitions computed. The standard error of measurement (SEM) was calculated. For each subject and muscle, the COT-to-WAX maximum activity increment was computed. Participant preference between tasks was also recorded. WAX MVC tasks had larger maximum EMG amplitude than COT MVC tasks, while the stability of the amplitude (P > 0.391) and its coefficient of variation (P > 0.180) did not differ between methods. The WAX task was the more comfortable one for 18/23 subjects (P = 0.007). WAX normalization ensures the same stability level of maximum EMG amplitude as COT normalization, but it is more repeatable, elicits larger maximum muscular contraction, and is felt to be more comfortable by subjects. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Study on compressive strength of self compacting mortar cubes under normal & electric oven curing methods

    Science.gov (United States)

    Prasanna Venkatesh, G. J.; Vivek, S. S.; Dhinakaran, G.

    2017-07-01

    In the majority of civil engineering applications, the basic building blocks are the masonry units. These masonry units are built up into a monolithic structure by the plastering process with the help of binding agents, namely mud, lime, cement and their combinations. In recent developments, the study of mortar plays an important role in crack repair, structural rehabilitation, retrofitting, pointing and plastering operations. The rheology of mortar includes flowing, passing and filling properties, which are analogous to the behaviour of self-compacting concrete. In the self-compacting (SC) mortar cubes, the cement was replaced by mineral admixtures, namely silica fume (SF) from 5% to 20% (with an increment of 5%), metakaolin (MK) from 10% to 30% (with an increment of 10%) and ground granulated blast furnace slag (GGBS) from 25% to 75% (with an increment of 25%). The ratio between cement and fine aggregate was kept constant at 1:2 for all normal and self-compacting mortar mixes. Accelerated curing, namely electric oven curing at a differential temperature of 128°C for a period of 4 hours, was adopted. It was found that the compressive strength obtained under both the normal and the electric oven curing methods was higher for self-compacting mortar cubes than for normal mortar cubes. Cement replacement by 15% SF, 20% MK and 25% GGBS gave higher strength under both curing conditions.

  1. A Preliminary Study on Ultrasonic Cutting Process for Carbon Fibre Prepreg

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    In the ultrasonic cutting process, cutting pressure remains essential for achieving a good-quality cut edge, as in the traditional mechanical cutting process. As the cutting power increases, the required cutting pressure decreases and the usable cutting velocity range is broadened considerably. The experimental results show that the ultrasonic cutting technique is an effective, clean and controllable method for machining carbon fibre prepreg.

  2. [Method for determination of individual normal values of blood pressure in humans].

    Science.gov (United States)

    Volians'kyĭ, O M

    2010-01-01

    The basis of the methodology for evaluating individual normal values of blood pressure (BP) was the determination, for each individual, of the range of homeostatic reactions of the adaptation mechanisms, which was then compared with test values of this parameter. Averaged values of dynamic arterial pressure, obtained during 24-hour monitoring and during a bicycle ergometry test, were used to construct the scale. Comparing the obtained blood pressure values with this scale allowed pre-diagnosis of both hypertensive and hypotensive disorders in each volunteer.

  3. Influence of the Magnetic High-speed Steel Cutting Tool on Cutting Capability

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The high-speed steel cutting tool has advantages among modern cutting tools because of its preferable overall performance, especially in the application of complicated cutting tools. Therefore, the study of high-speed steel cutting tools, which account for half of all cutting tools, has become an important part of research on modern cutting technology. The cutting performance of high-speed steel cutting tools can be improved by a magnetization treatment method. The microstructure of high-speed steel will be changed as a ...

  4. Prediction method for risks of coal and gas outbursts based on spatial chaos theory using gas desorption index of drill cuttings

    Institute of Scientific and Technical Information of China (English)

    Li Dingqi; Cheng Yuanping; Wang Lei; Wang Haifeng; Wang Liang; Zhou Hongxing

    2011-01-01

    Based on the evolution of geological dynamics and spatial chaos theory, we propose an advanced prediction method based on the gas desorption index of drill cuttings to predict coal and gas outbursts. We investigated and verified the prediction method using spatial series data of the gas desorption index of drill cuttings obtained from the 113112 coal roadway at the Shitai Mine. Our experimental results show that the spatial distribution of the gas desorption index of drill cuttings has some chaotic characteristics, which implies that the risk of coal and gas outbursts can be predicted by spatial chaos theory. We also found that a proper amount of sample data needs to be chosen in order to ensure the accuracy and practical maneuverability of the prediction. The relative prediction error is small when the prediction pace is chosen carefully. In our experiments, it turned out that the optimum number of sample points is 80 and the optimum prediction pace is 30. The corresponding advanced prediction pace basically meets the requirements of engineering applications.

  5. Reducing the nonconforming products by using the Six Sigma method: A case study of a polyester short cut fiber manufacturing in Indonesia

    Directory of Open Access Journals (Sweden)

    Oky Syafwiratama

    2017-03-01

    Full Text Available Polyester short cut fiber is a segment of the textile industry that is rarely explored or researched. This research explains the necessary improvement steps, using the Six Sigma method, to reduce nonconforming products in a polyester short cut fiber manufacturing plant in Indonesia. An increase in nonconforming products in the short cut fiber production process created quality problems from January to May 2015. The define, measure, analyze, improve, control (DMAIC) steps were implemented to determine the root causes of the problems and to improve the production process using a statistical approach. The results of the Six Sigma improvement indicate that the process capability increased from 2.2 to 3.1 sigma, saving $18,394.2 USD per month.

  6. Normal response function method for mass and stiffness matrix updating using complex FRFs

    Science.gov (United States)

    Pradhan, S.; Modak, S. V.

    2012-10-01

    Quite often a structural dynamic finite element model is required to be updated so as to accurately predict the dynamic characteristics like natural frequencies and the mode shapes. Since in many situations undamped natural frequencies and mode shapes need to be predicted, it has generally been the practice in these situations to seek updating of only mass and stiffness matrix so as to obtain a reliable prediction model. Updating using frequency response functions (FRFs) has been one of the widely used approaches for updating, including updating of mass and stiffness matrices. However, the problem with FRF based methods, for updating mass and stiffness matrices, is that these methods are based on use of complex FRFs. Use of complex FRFs to update mass and stiffness matrices is not theoretically correct as complex FRFs are not only affected by these two matrices but also by the damping matrix. Therefore, in situations where updating of only mass and stiffness matrices using FRFs is required, the use of complex FRFs based updating formulation is not fully justified and would lead to inaccurate updated models. This paper addresses this difficulty and proposes an improved FRF based finite element model updating procedure using the concept of normal FRFs. The proposed method is a modified version of the existing response function method that is based on the complex FRFs. The effectiveness of the proposed method is validated through a numerical study of a simple but representative beam structure. The effect of coordinate incompleteness and robustness of method under presence of noise is investigated. The results of updating obtained by the improved method are compared with the existing response function method. The performance of the two approaches is compared for cases of light, medium and heavily damped structures. It is found that the proposed improved method is effective in updating of mass and stiffness matrices in all the cases of complete and incomplete data and
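    The distinction the paper draws between complex and normal FRFs can be made concrete with a toy 2-DOF system: the normal FRF depends only on the mass and stiffness matrices, whereas the measured complex FRF also contains the damping matrix. The matrices below are arbitrary illustrative values; the extraction of normal FRFs from measured complex FRFs, which is the core of the proposed method, is not reproduced here.

```python
# Toy 2-DOF illustration: normal FRF depends on M, K only; the measured
# (complex) FRF also contains the damping matrix C.
import numpy as np

M = np.diag([2.0, 1.0])
K = np.array([[600.0, -200.0],
              [-200.0, 200.0]])
C = 0.002 * K                                   # light proportional damping (assumed)

def normal_frf(w):                              # depends on M and K only
    return np.linalg.inv(K - w**2 * M)

def complex_frf(w):                             # what is actually measured
    return np.linalg.inv(K - w**2 * M + 1j * w * C)

w = 12.0                                        # rad/s, sample frequency line
print("normal  H(1,1):", normal_frf(w)[0, 0])
print("complex H(1,1):", complex_frf(w)[0, 0])
```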

  7. A step beyond the Monte Carlo method in economics: Application of multivariate normal distribution

    Science.gov (United States)

    Kabaivanov, S.; Malechkova, A.; Marchev, A.; Milev, M.; Markovska, V.; Nikolova, K.

    2015-11-01

    In this paper we discuss the numerical algorithm of Milev-Tagliani [25] used for the pricing of discrete double barrier options. The problem can be reduced to the accurate valuation of an n-dimensional path integral with the probability density function of a multivariate normal distribution. The efficient solution of this problem with the Milev-Tagliani algorithm is a step beyond the classical application of Monte Carlo for option pricing. We explore continuous and discrete monitoring of asset path pricing, compare the error of frequently applied quantitative methods such as the Monte Carlo method and finally analyze the accuracy of the Milev-Tagliani algorithm by presenting the profound research and important results of Hong, S. Lee and T. Li [16].
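    For context, the classical Monte Carlo benchmark mentioned above can be sketched as follows for a discretely monitored double-barrier knock-out call under geometric Brownian motion. All parameters are illustrative, and this is the reference method being compared against, not the Milev-Tagliani algorithm itself.

```python
# Monte Carlo pricing of a discretely monitored double-barrier knock-out call
# under geometric Brownian motion. Parameters are illustrative.
import numpy as np

def mc_double_barrier_call(S0, K, L, U, r, sigma, T, n_monitor, n_paths, seed=0):
    rng = np.random.default_rng(seed)
    dt = T / n_monitor
    drift = (r - 0.5 * sigma**2) * dt
    vol = sigma * np.sqrt(dt)
    S = np.full(n_paths, float(S0))
    alive = np.ones(n_paths, dtype=bool)
    for _ in range(n_monitor):
        S = S * np.exp(drift + vol * rng.standard_normal(n_paths))
        alive &= (S > L) & (S < U)              # knocked out if a barrier is breached
    payoff = np.where(alive, np.maximum(S - K, 0.0), 0.0)
    return np.exp(-r * T) * payoff.mean()

price = mc_double_barrier_call(S0=100, K=100, L=80, U=120, r=0.05,
                               sigma=0.2, T=1.0, n_monitor=250, n_paths=200_000)
print(round(price, 4))
```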

  8. Normal vector method for convergence improvement using the RCWA for crossed gratings.

    Science.gov (United States)

    Schuster, Thomas; Ruoff, Johannes; Kerwien, Norbert; Rafler, Stephan; Osten, Wolfgang

    2007-09-01

    The rigorous coupled wave analysis (RCWA) is a widely used method for simulating diffraction from periodic structures. Since its recognized formulation by Moharam [J. Opt. Soc. Am. A12, 1068 and 1077 (1995)], there still has been a discussion about convergence problems. Those problems are more or less solved for the diffraction from line gratings, but there remain different concurrent proposals about the convergence improvement for crossed gratings. We propose to combine Popov and Nevière's formulation of the differential method [Light Propagation in Periodic Media (Dekker, 2003) and J. Opt. Soc. Am. A18, 2886 (2001)] with the classical RCWA. With a suitable choice of a normal vector field we obtain a better convergence than for the formulations that are known from the literature.

  9. A NEW MOMENT METHOD FOR THE FAST AND ACCURATE ANALYSIS OF NORMAL MODE HELICAL ANTENNAS

    Institute of Scientific and Technical Information of China (English)

    Ji Yicai; Sun Baohua; Liu Qizhong

    2001-01-01

    In this letter, a new moment method using helical segments is presented to model Normal Mode Helical Antenna (NMHA). Using this method, the NMHA can be modeled by a few segments. The current distributions and radiation patterns of some NMHAs are calculated.A comparison is made between results obtained using this helical segment algorithm and a linear segment algorithm, and the results of the two algorithms agree fairly well. When calculating the impedance matrix [Z], all the elements of the matrix can be obtained by only calculating a few elements with the application of the symmetric and periodic characteristics of the NMHA.Therefore, the CPU time and the memory storage are significantly reduced, with the accuracy and speed enhanced.

  10. Dominus for cut flower production

    Science.gov (United States)

    Fumigation with methyl bromide was the principal method of soilborne pest control in cut flower production. Many cut flower growers in Florida have ceased production, but those that remain are restricted in the fumigants that they are able to utilize due to proximity to potable water sources and oc...

  11. A Modified Method for Measuring Root Iron Reductase Activity Under Normal Laboratory Conditions

    Institute of Scientific and Technical Information of China (English)

    ZHENG Shao-Jian; HE Yun-Feng; TANG Cai-Xian; Y. MASAOKA

    2005-01-01

    Based on the strong chelating property of bathophenanthroline disulfonic acid (BPDS) with Fe(II), root Fe(III) chelate reductase activity is usually measured with a spectrophotometer using MES (2-morpholinoethanesulfonic acid) or HEPES (2-(4-(2-hydroxyethyl)-1-piperazinyl) ethanesulfonic acid) buffer in the dark because of the high autoreduction rate of Fe(III) in the presence of light. However, the exclusion of light is inconvenient, especially when analyzing a large number of samples. The objective of this study was to develop a new method for the determination of root reductase activity under normal laboratory conditions, using a suitable buffer composition and Fe(III) concentration to eliminate the autoreduction of Fe(III). A modified method using a Tris (2-amino-2-hydroxymethyl-1,3-propanediol) buffer at pH 7.5 instead of MES or HEPES buffer and a decreased FeEDTA (Fe ethylene diamine tetraacetic acid) concentration of 50 μmol L-1 was developed. The autoreduction of Fe(III) using the Tris buffer was undetectable at temperatures of 4 and 28 °C and was also much lower than that using the other buffers, even with sunlight present during the measurement of Fe(III) reduction. Furthermore, the differences in Fe(III) reductase activity among 5 plant species and 14 red clover cultivars (Trifolium pratense L.) could be easily detected with the modified method. The method developed in this study to measure root Fe chelate reductase activity was not only effective and reliable but also easily managed under normal laboratory light conditions.

  12. Negative refractive index metamaterials using only metallic cut wires.

    Science.gov (United States)

    Sellier, Alexandre; Burokur, Shah Nawaz; Kanté, Boubacar; de Lustrac, André

    2009-04-13

    We present, design and analyze a novel planar Left-Handed (LH) metamaterial at microwave frequencies. This metamaterial is composed of only metallic cut wires and is used under normal-to-plane incidence. Using Finite Element Method (FEM) based simulations and microwave experiments, we have investigated the material properties of the structure. Simultaneous negative values are observed for the permittivity epsilon and permeability mu by the inversion method from the transmission and reflection responses. A negative index n is verified in a bulk prism engineered by stacking several layers of the metamaterial. Our work demonstrates the feasibility of a LH metamaterial composed of only cut wires.
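    The standard S-parameter inversion (retrieval) referred to here can be sketched as follows. The S11 and S21 values, the frequency and the slab thickness are placeholder numbers, and the branch choices of the square root and the logarithm (normally fixed by requiring Re(z) ≥ 0 and Im(n) ≥ 0) are ignored, so the sketch is only safe for thin, weakly dispersive slabs.

```python
# Standard effective-parameter retrieval from transmission/reflection data
# (principal branches only; placeholder S-parameters).
import numpy as np

def retrieve(S11, S21, freq_hz, d):
    k0 = 2 * np.pi * freq_hz / 3e8                      # free-space wavenumber
    z = np.sqrt(((1 + S11)**2 - S21**2) / ((1 - S11)**2 - S21**2))
    X = S21 / (1 - S11 * (z - 1) / (z + 1))             # = exp(i*n*k0*d)
    n = -1j * np.log(X) / (k0 * d)                      # principal branch only
    eps, mu = n / z, n * z                              # from z = sqrt(mu/eps), n = sqrt(eps*mu)
    return n, z, eps, mu

# Placeholder S-parameters at 10 GHz for a 3 mm slab.
n, z, eps, mu = retrieve(S11=0.6 + 0.3j, S21=0.2 - 0.55j, freq_hz=10e9, d=3e-3)
print("n =", np.round(n, 3), " eps =", np.round(eps, 3), " mu =", np.round(mu, 3))
```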

  13. Optimization of accelerator parameters using normal form methods on high-order transfer maps

    Energy Technology Data Exchange (ETDEWEB)

    Snopok, Pavel [Michigan State Univ., East Lansing, MI (United States)

    2007-05-01

    Methods of analysis of the dynamics of ensembles of charged particles in collider rings are developed. The following problems are posed and solved using normal form transformations and other methods of perturbative nonlinear dynamics: (1) Optimization of the Tevatron dynamics: (a) Skew quadrupole correction of the dynamics of particles in the Tevatron in the presence of the systematic skew quadrupole errors in dipoles; (b) Calculation of the nonlinear tune shift with amplitude based on the results of measurements and the linear lattice information; (2) Optimization of the Muon Collider storage ring: (a) Computation and optimization of the dynamic aperture of the Muon Collider 50 x 50 GeV storage ring using higher order correctors; (b) 750 x 750 GeV Muon Collider storage ring lattice design matching the Tevatron footprint. The normal form coordinates have a very important advantage over the particle optical coordinates: if the transformation can be carried out successfully (general restrictions for that are not much stronger than the typical restrictions imposed on the behavior of the particles in the accelerator), then the motion in the new coordinates has a very clean representation allowing one to extract more information about the dynamics of particles, and they are very convenient for the purposes of visualization. All the problem formulations include the derivation of the objective functions, which are later used in the optimization process using various optimization algorithms. Algorithms used to solve the problems are specific to collider rings, and applicable to similar problems arising on other machines of the same type. The details of the long-term behavior of the systems are studied to ensure their stability for the desired number of turns. The algorithm of the normal form transformation is of great value for such problems as it gives much extra information about the disturbing factors. In addition to the fact that the dynamics of particles is represented

  14. APPLICATION OF FRACTION CUTTING METHODS IN FCC GASOLINE ADSORPTION DESULFURIZATION

    Institute of Scientific and Technical Information of China (English)

    祖运; 范跃超; 秦玉才; 宋丽娟

    2016-01-01

    The performances of modified Y zeolites (NiY, Cu(Ⅰ)Y, CeY) for selective adsorption desulfurization of FCC gasoline were investigated. The FCC gasoline was cut into fractions by three methods: a light/heavy two-fraction cut, temperature-point cutting, and equal-volume cutting. The sulfur content of each fraction before and after adsorption was analyzed by microcoulometry and gas chromatography with sulfur chemiluminescence detection (GC-SCD). The results indicate that the type and strength of the Brønsted and Lewis acid sites on the NiY, Cu(Ⅰ)Y and CeY zeolites determine their desulfurization performance for the different fractions. The weak Brønsted and weak Lewis acid sites on NiY give better desulfurization of fractions with low aromatic content, and the strong Brønsted and weak Lewis acid sites on CeY give better desulfurization of fractions with low olefin content, whereas the strong Brønsted and strong Lewis acid sites on Cu(Ⅰ)Y perform poorly for all fractions. With the equal-volume cutting method, using NiY for the front-end fraction and CeY for the back-end fraction raises the desulfurization rate of FCC gasoline by 47.54 and 22.40 percentage points, respectively, compared with a single adsorbent.

  15. Comparison test of different cutting methods for Beijing chrysanthemum

    Institute of Scientific and Technical Information of China (English)

    姜红

    2012-01-01

    Two cutting propagation methods, plug-tray and seedbed, were tested for Beijing chrysanthemum using three substrates: pure perlite, pure peat, and perlite + sand + peat at 1:1:1. Rooting traits of six cultivars were studied. The results showed that the perlite + sand + peat (1:1:1) substrate combined with plug-tray cutting gave the highest propagation survival rate, and among the six cultivars, Spectrum 2 had the highest cutting survival rate.

  16. Quality Analysis of Cutting Steel Using Laser

    Directory of Open Access Journals (Sweden)

    Vladislav Markovič

    2013-02-01

    Full Text Available The article explores how the quality of the cut edge of steel C45 LST EN 10083-1 obtained by laser cutting depends on the cutting regime and on the thickness of the trial steel. The paper presents the influence of the main operating modes of the Trulaser 3030 laser cutting equipment, including cutting speed, pressure, angle and surface thickness, on the quality characteristics of the sample. Edge quality after laser cutting is the most important indicator behind the wide industrial spread of this technology, and laser cutting is the most popular method of material cutting. Therefore, the article focuses on cutting equipment, cutting defects and methods of analysis. Research on the microstructure, roughness and micro-toughness of the edge samples has been performed. At the end of the publication, conclusions are drawn. Article in Lithuanian

  18. Chip-ejection interference in cutting processes of modern cutting tools

    Institute of Scientific and Technical Information of China (English)

    师汉民

    1999-01-01

    Based on the “principle of minimum energy”, the basic characteristics of non-free cutting are studied; the phenomenon and the nature of chip-ejection interference commonly existing in the cutting process of modern cutting tools are explored. A "synthesis method of elementary cutting tools" is suggested for modeling the cutting process of modern complex cutting tools. The general equation governing the chip-ejection motion is deduced. Real examples of non-free cutting are analyzed, and the theoretically predicted results are supported by the experimental data or facts. The sufficient and necessary conditions for eliminating chip-ejection interference and for realizing free cutting are given; the idea and the technical approach of "the principle of free cutting" are also discussed, and a feasible way for improving or optimizing the cutting performance of modern cutting tools is therefore found.

  19. A Classification Method of Normal and Overweight Females Based on Facial Features for Automated Medical Applications

    Directory of Open Access Journals (Sweden)

    Bum Ju Lee

    2012-01-01

    Full Text Available Obesity and overweight have become serious public health problems worldwide. Obesity and abdominal obesity are associated with type 2 diabetes, cardiovascular diseases, and metabolic syndrome. In this paper, we first suggest a method of predicting normal and overweight females according to body mass index (BMI) based on facial features. A total of 688 subjects participated in this study. We obtained an area under the ROC curve (AUC) value of 0.861 and kappa value of 0.521 in the Female: 21–40 (females aged 21–40 years) group, and an AUC value of 0.76 and kappa value of 0.401 in the Female: 41–60 (females aged 41–60 years) group. In the two groups, we found many features showing statistical differences between normal and overweight subjects by using an independent two-sample t-test. We demonstrated that it is possible to predict BMI status using facial characteristics. Our results provide useful information for studies of obesity and facial characteristics, and may provide useful clues in the development of applications for alternative diagnosis of obesity in remote healthcare.

  20. Normal and abnormal tissue identification system and method for medical images such as digital mammograms

    Science.gov (United States)

    Heine, John J. (Inventor); Clarke, Laurence P. (Inventor); Deans, Stanley R. (Inventor); Stauduhar, Richard Paul (Inventor); Cullers, David Kent (Inventor)

    2001-01-01

    A system and method for analyzing a medical image to determine whether an abnormality is present, for example, in digital mammograms, includes the application of a wavelet expansion to a raw image to obtain subspace images of varying resolution. At least one subspace image is selected that has a resolution commensurate with a desired predetermined detection resolution range. A functional form of a probability distribution function is determined for each selected subspace image, and an optimal statistical normal image region test is determined for each selected subspace image. A threshold level for the probability distribution function is established from the optimal statistical normal image region test for each selected subspace image. A region size comprising at least one sector is defined, and an output image is created that includes a combination of all regions for each selected subspace image. Each region has a first value when the region intensity level is above the threshold and a second value when the region intensity level is below the threshold. This permits the localization of a potential abnormality within the image.

  1. Normalized impact factor (NIF): an adjusted method for calculating the citation rate of biomedical journals.

    Science.gov (United States)

    Owlia, P; Vasei, M; Goliaei, B; Nassiri, I

    2011-04-01

    Interest in the journal impact factor (JIF) in scientific communities has grown over the last decades. JIFs are used to evaluate the quality of journals and of the papers published therein. The JIF is a discipline-specific measure, so comparisons between JIFs of different disciplines are inadequate unless a normalization is performed. In this study, the normalized impact factor (NIF) was introduced as a relatively simple method enabling JIFs to be used when evaluating the quality of journals and research works across different disciplines. The NIF index was established by multiplying the JIF by a constant factor. The constants were calculated for all 54 disciplines of the biomedical field for the years 2005, 2006, 2007, 2008 and 2009. In addition, the rankings of 393 journals in different biomedical disciplines according to the NIF and the JIF were compared to illustrate how the NIF index can be used to evaluate publications in different disciplines. The findings show that the use of the NIF enhances equality in assessing the quality of research produced by researchers who work in different disciplines.
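
    The abstract does not state how the discipline constant is computed; purely for illustration, the sketch below assumes it rescales each discipline's mean JIF to a common reference value, which would make journals from different fields directly comparable. All numbers and names are made up.

```python
# Illustrative only: the constant definition below (common reference mean
# divided by the discipline's mean JIF) is an assumption, not the paper's formula.
reference_mean = 3.0
discipline_mean_jif = {"microbiology": 2.4, "biophysics": 1.8}

journal_jif = {"J. Micro A": 3.6, "Biophys. B": 2.7}
journal_field = {"J. Micro A": "microbiology", "Biophys. B": "biophysics"}

for name, jif in journal_jif.items():
    constant = reference_mean / discipline_mean_jif[journal_field[name]]
    print(f"{name}: JIF = {jif:.1f}, NIF = {jif * constant:.2f}")
# Both journals sit 1.5x above their field mean, so both map to the same NIF.
```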

  2. Determining quasidiabatic coupled electronic state Hamiltonians using derivative couplings: A normal equations based method.

    Science.gov (United States)

    Papas, Brian N; Schuurman, Michael S; Yarkony, David R

    2008-09-28

    A self-consistent procedure for constructing a quasidiabatic Hamiltonian representing N(state) coupled electronic states in the vicinity of an arbitrary point in nuclear coordinate space is described. The matrix elements of the Hamiltonian are polynomials of arbitrary order. Employing a crude adiabatic basis, the coefficients of the linear terms are determined exactly using analytic gradient techniques. The remaining polynomial coefficients are determined from the normal form of a system of pseudolinear equations, which uses energy gradient and derivative coupling information obtained from reliable multireference configuration interaction wave functions. In a previous implementation energy gradient and derivative coupling information were employed to limit the number of nuclear configurations at which ab initio data were required to determine the unknown coefficients. Conversely, the key aspect of the current approach is the use of ab initio data over an extended range of nuclear configurations. The normal form of the system of pseudolinear equations is introduced here to obtain a least-squares fit to what would have been an (intractable) overcomplete set of data in the previous approach. This method provides a quasidiabatic representation that minimizes the residual derivative coupling in a least-squares sense, a means to extend the domain of accuracy of the diabatic Hamiltonian or refine its accuracy within a given domain, and a way to impose point group symmetry and hermiticity. These attributes are illustrated using the 1 (2)A(1) and 1 (2)E states of the 1-propynyl radical, CH(3)CC.
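
    The least-squares step described above can be illustrated generically: for an overdetermined linear system A c = b, the normal equations A^T A c = A^T b give the minimizer of ||Ac − b||². The sketch below uses random data in place of the ab initio energies, gradients and couplings, so it shows only the numerical idea, not the actual fitting code.

```python
import numpy as np

# Generic least-squares solution of an overdetermined linear system A c = b
# via the normal equations. Here A would hold polynomial basis terms evaluated
# at sampled nuclear configurations and b the ab initio data; random values
# stand in for both.
rng = np.random.default_rng(0)
A = rng.normal(size=(200, 12))                 # 200 data points, 12 unknown coefficients
c_true = rng.normal(size=12)
b = A @ c_true + 0.01 * rng.normal(size=200)   # small "noise" on the data

c_ne = np.linalg.solve(A.T @ A, A.T @ b)       # normal equations
c_ls, *_ = np.linalg.lstsq(A, b, rcond=None)   # numerically safer alternative
print(np.allclose(c_ne, c_ls, atol=1e-6))
```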

  3. Simulation of growth normal fault sandbox tests using the 2D discrete element method

    Science.gov (United States)

    Chu, Sheng-Shin; Lin, Ming-Lang; Huang, Wen-Chao; Nien, Wei-Tung; Liu, Huan-Chi; Chan, Pei-Chen

    2015-01-01

    A fault slip can cause the deformation of shallow soil layers and destroy infrastructures. The Shanchiao Fault on the west side of the Taipei Basin is one such fault. The activities of the Shanchiao Fault have caused the quaternary sediment beneath the Taipei Basin to become deformed, damaging structures, traffic construction, and utility lines in the area. Data on geological drilling and dating have been used to determine that a growth fault exists in the Shanchiao Fault. In an experiment, a sandbox model was built using noncohesive sandy soil to simulate the existence of a growth fault in the Shanchiao Fault and forecast the effect of the growth fault on shear-band development and ground differential deformation. The experimental results indicated that when a normal fault contains a growth fault at the offset of the base rock, the shear band develops upward beside the weak side of the shear band of the original-topped soil layer, and surfaces considerably faster than that of the single-topped layer. The offset ratio required is approximately one-third that of the single-cover soil layer. In this study, a numerical simulation of the sandbox experiment was conducted using a discrete element method program, PFC2D, to simulate the upper-covering sand layer shear-band development pace and the scope of a growth normal fault slip. The simulation results indicated an outcome similar to that of the sandbox experiment, which can be applied to the design of construction projects near fault zones.

  4. Performance Evaluation and Online Realization of Data-driven Normalization Methods Used in LC/MS based Untargeted Metabolomics Analysis.

    Science.gov (United States)

    Li, Bo; Tang, Jing; Yang, Qingxia; Cui, Xuejiao; Li, Shuang; Chen, Sijie; Cao, Quanxing; Xue, Weiwei; Chen, Na; Zhu, Feng

    2016-12-13

    In untargeted metabolomics analysis, several factors (e.g., unwanted experimental and biological variations and technical errors) may hamper the identification of differential metabolic features, which requires data-driven normalization approaches before feature selection. So far, ≥16 normalization methods have been widely applied for processing LC/MS based metabolomics data. However, the performance and the sample-size dependence of those methods have not yet been exhaustively compared, and no online tool for comparatively and comprehensively evaluating the performance of all 16 normalization methods has been provided. In this study, a comprehensive comparison of these methods was conducted. As a result, the 16 methods were categorized into three groups based on their normalization performance across various sample sizes. The VSN, the Log Transformation and the PQN were identified as the methods with the best normalization performance, while the Contrast method consistently underperformed across all sub-datasets of the different benchmark data. Moreover, an interactive web tool comprehensively evaluating the performance of the 16 methods specifically for normalizing LC/MS based metabolomics data was constructed and hosted at http://server.idrb.cqu.edu.cn/MetaPre/. In summary, this study could serve as useful guidance for selecting suitable normalization methods when analyzing LC/MS based metabolomics data.
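
    One of the better-performing methods named above, the PQN (probabilistic quotient normalization), is simple enough to sketch; the version below is a simplified illustration (reference spectrum taken as the feature-wise median), not the implementation that was evaluated.

```python
import numpy as np

def pqn_normalize(X):
    """Simplified probabilistic quotient normalization of an
    (n_samples, n_features) intensity matrix: each sample is divided by the
    median of its feature-wise quotients against a reference spectrum
    (here the median spectrum across samples)."""
    X = np.asarray(X, dtype=float)
    reference = np.median(X, axis=0)
    quotients = X / reference                    # feature-wise quotients
    dilution = np.median(quotients, axis=1)      # one dilution factor per sample
    return X / dilution[:, None]

# Toy data with one sample carrying a 3x dilution-like offset:
X = np.abs(np.random.default_rng(1).normal(loc=100, scale=20, size=(5, 50)))
X[2] *= 3.0
print(pqn_normalize(X).mean(axis=1).round(1))    # sample means roughly level out
```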

  5. Normal CBF values by the ARG method using IMP SPECT. Comparison with a conventional microsphere model method

    Energy Technology Data Exchange (ETDEWEB)

    Ito, Hiroshi; Koyama, Masamichi; Kawashima, Ryuta; Ono, Shuichi; Fukuda, Hiroshi [Tohoku Univ., Sendai (Japan). Inst. of Development, Aging and Cancer; Ishii, Kiyoshi; Kinoshita, Toshifumi

    1996-02-01

    N-isopropyl-p-[¹²³I]iodoamphetamine (IMP) has been used as a flow tracer for SPECT, and measurement of cerebral blood flow (CBF) using IMP has been performed by the conventional microsphere model method (MS method). Recently, the ARG method for measuring CBF using IMP with one SPECT scan and one-point blood sampling has been developed. This method is based on a two-compartment model. In the present study, normal CBF values were measured in 10 healthy male subjects (mean age±S.D.: 29.8±6.01, age range: 23-41) by the ARG and MS methods. The mean CBF values (±S.D.) for the ARG method, in which the Vd value was assumed to be 50 ml/ml, were 41.7±9.4, 31.1±5.0, 40.7±9.7, 41.5±10.0, 38.2±9.2, 39.0±9.4, 41.9±10.6, 38.7±8.0 and 30.0±7.7 ml/100 ml/min in the cerebellum, pons, thalamus, basal ganglia, frontal, temporal, parietal, occipital lobe cortex and centrum semiovale, respectively. The mean CBF values for the MS method were 46.8±8.4, 37.5±5.6, 45.8±8.6, 46.5±8.9, 43.7±8.3, 44.4±8.7, 46.8±9.3, 44.3±7.3 and 36.3±8.1 ml/100 ml/min, respectively. The mean CBF values in the cerebral cortex regions for the ARG method were lower than those previously reported by PET. This would be caused by the low first-pass extraction fraction of IMP compared with oxygen-15 labeled water. The mean CBF values for the MS method were higher than those for the ARG method, in contrast to previous studies. As a reason for this, errors in the estimation of SPECT brain counts at 8 min in the MS method were considered. (author).

  6. Analyses on normal background characteristics about deformation observation data on the basis of wavelet transform method

    Institute of Scientific and Technical Information of China (English)

    李杰; 刘希强; 李红; 毛玉华; 郑树田

    2005-01-01

    The wavelet transform method is applied to measure the time-frequency distribution characteristics of digital deformation data and noise. Based on the primary-modulus characteristics and the stochastic white-noise discrimination factor of the wavelet decomposition, we analyze the variation of the normal background and noise in digital deformation observation data from Shandong. The results indicate that: a) quasi-periodic signals such as the quarter-diurnal wave, the semi-diurnal tide, the diurnal wave and the half-lunar wave are present in the wavelet detail signals at scales 2, 3 and 4; b) the amplitude of the detail signal is largest at scale 3; c) the detail signals at scales 1 and 5 contain mainly noise; d) earthquake-related precursory anomalies may be traced by analyzing the non-earthquake wavelet detail signals at specified scales of the digital deformation observation data.
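
    A minimal sketch of such a multi-scale decomposition, assuming hourly sampling, a Daubechies-4 wavelet and five levels (choices made for this sketch, not stated in the paper), is given below using PyWavelets.

```python
import numpy as np
import pywt

# Synthetic hourly "deformation" record: diurnal + semi-diurnal tides plus
# white noise, decomposed into detail signals at several scales.
t = np.arange(0, 24 * 60)                         # 60 days of hourly samples
signal = (np.sin(2 * np.pi * t / 24.0)            # diurnal wave
          + 0.5 * np.sin(2 * np.pi * t / 12.0)    # semi-diurnal tide
          + 0.2 * np.random.default_rng(2).normal(size=t.size))

coeffs = pywt.wavedec(signal, 'db4', level=5)     # [cA5, cD5, cD4, cD3, cD2, cD1]
for k, d in enumerate(coeffs[1:][::-1], start=1): # report details from scale 1 to 5
    print(f"scale {k}: detail energy = {np.sum(d ** 2):.1f}")
```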

  7. Comparison of methods of estimating body fat in normal subjects and cancer patients

    Energy Technology Data Exchange (ETDEWEB)

    Cohn, S.H. (Brookhaven National Lab., Upton, NY); Ellis, K.J.; Vartsky, D.; Sawitsky, A.; Gartenhaus, W.; Yasumura, S.; Vaswani, A.N.

    1981-12-01

    Total body fat can be indirectly estimated by the following noninvasive techniques: determination of lean body mass by measurement of body potassium or body water, and determination of density by underwater weighing or by skinfold measurements. The measurement of total body nitrogen by neutron activation provides another technique for estimating lean body mass and hence body fat. The nitrogen measurement can also be combined with the measurement of total body potassium in a two compartment model of the lean body mass from which another estimate of body fat can be derived. All of the above techniques are subject to various errors and are based on a number of assumptions, some of which are incompletely validated. These techniques were applied to a population of normal subjects and to a group of cancer patients. The advantages and disadvantages of each method are discussed in terms of their ability to estimate total body fat.

  9. Gene expression in human skeletal muscle: alternative normalization method and effect of repeated biopsies.

    Science.gov (United States)

    Lundby, Carsten; Nordsborg, Nikolai; Kusuhara, Keiko; Kristensen, Kristina Møller; Neufer, P Darrell; Pilegaard, Henriette

    2005-10-01

    The reverse transcriptase-polymerase chain reaction (RT-PCR) method has lately become widely used to determine transcription and mRNA content in rodent and human muscle samples. However, the common use of endogenous controls for correcting for variance in cDNA between samples is not optimal. Specifically, we investigated (1) a new normalization method based on determining the cDNA content with the fluorophores PicoGreen and OliGreen, (2) the effect of repeated muscle biopsies on mRNA gene expression, and (3) the spatial heterogeneity in mRNA expression across the muscle. Standard curves using oligo standards revealed a high degree of sensitivity and linearity (2.5-45 ng; R²>0.99) with the OliGreen reagent, as was the case for OliGreen analyses with standard curves constructed from serial dilutions of representative RT samples (R²>0.99 for a ten-fold dilution range of a representative reverse-transcribed (RT) sample). Likewise, the PicoGreen reagent detected the RNA:DNA hybrid content in RT samples with great sensitivity. Standard curves constructed from both double-stranded lambda DNA (1-10 ng) and from serial dilutions of representative RT samples consistently resulted in linearity with R²>0.99. The present determination of cDNA content in reverse-transcribed human skeletal muscle RNA samples by both PicoGreen and OliGreen analyses suggests that these fluorophores provide a potential alternative normalization procedure for human gene expression studies. In addition, the present study shows that multiple muscle biopsies obtained from the same muscle do not influence the mRNA response induced by an acute exercise bout for any of the genes examined.

  10. An efficient Born normal mode method to compute sensitivity kernels and synthetic seismograms in the Earth

    Science.gov (United States)

    Capdeville, Y.

    2005-11-01

    We present an alternative to the classical mode coupling method scheme often used in global seismology to compute synthetic seismograms in laterally heterogeneous earth models and Fréchet derivatives for the tomographic inverse problem with the normal modes first-order Born approximation. We start from the first-order Born solution in the frequency domain and we use a numerical scheme for the volume integration, which means that we have to compute the effect of a finite number of scattering points and sum them with the appropriate integration weight. For each scattering point, `source to scattering point' and `scattering point to receivers' expressions are separated before applying a Fourier transform to return to the time domain. Doing so, the perturbed displacement is obtained, for each scattering point, as the convolution of a forward wavefield from the source to the scattering point with a backward wavefield from the scattering integration point to the receiver. For one scattering point and for a given number of time steps, the numerical cost of such a scheme grows as (number of receivers + number of sources) × (corner frequency)², to be compared with (number of receivers × number of sources) × (corner frequency)⁴ when the classical normal mode coupling algorithm is used. Another interesting point is that, when used for Fréchet kernels, the computing cost is (almost) independent of the number of parameters used for the inversion. This algorithm is similar to the one obtained when solving the adjoint problem. Validation tests with respect to the spectral element method solution, both in the Fréchet derivative case and as a synthetic seismogram tool, show good agreement. In the latter case, we show that non-linearity can be significant even at long periods and when using existing smooth global tomographic models.

  11. Shack-Hartmann centroid detection method based on high dynamic range imaging and normalization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Vargas, Javier; Gonzalez-Fernandez, Luis; Quiroga, Juan Antonio; Belenguer, Tomas

    2010-05-01

    In the optical quality measuring process of an optical system, including diamond-turning components, the use of a laser light source can produce an undesirable speckle effect in a Shack-Hartmann (SH) CCD sensor. This speckle noise can deteriorate the precision and accuracy of the wavefront sensor measurement. Here we present a SH centroid detection method founded on computer-based techniques and capable of measurement in the presence of strong speckle noise. The method extends the dynamic range imaging capabilities of the SH sensor through the use of a set of different CCD integration times. The resultant extended-range spot map is normalized to accurately obtain the spot centroids. The proposed method has been applied to measure the optical quality of the main optical system (MOS) of the mid-infrared instrument telescope simulator. The wavefront at the exit of this optical system is affected by speckle noise when it is illuminated by a laser source and by air turbulence because it has a long back focal length (3017 mm). Using the proposed technique, the MOS wavefront error was measured and satisfactory results were obtained.
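
    A schematic version of the two key steps, merging frames of different integration times into an extended-dynamic-range spot map and taking an intensity-weighted centroid per spot window, is sketched below; the saturation level, scaling and windowing are simplifications, not the instrument's actual processing.

```python
import numpy as np

def merge_exposures(frames, exposure_times, saturation=4000):
    """Naive extended-dynamic-range merge: average each pixel over the
    exposures where it is not saturated, after scaling by integration time."""
    frames = np.asarray(frames, dtype=float)
    times = np.asarray(exposure_times, dtype=float)[:, None, None]
    valid = frames < saturation
    scaled = np.where(valid, frames / times, 0.0)
    return scaled.sum(axis=0) / np.maximum(valid.sum(axis=0), 1)

def spot_centroid(window):
    """Intensity-weighted centroid (center of mass) of a normalized spot window."""
    w = np.asarray(window, dtype=float)
    w = w - w.min()
    w = w / w.sum()
    ys, xs = np.indices(w.shape)
    return (w * ys).sum(), (w * xs).sum()

win = np.zeros((9, 9))
win[4, 5] = 10.0
print(spot_centroid(win))   # -> (4.0, 5.0)
```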

  13. Ultrasonic method for measuring water holdup of low velocity and high-water-cut oil-water two-phase flow

    Science.gov (United States)

    Zhao, An; Han, Yun-Feng; Ren, Ying-Yu; Zhai, Lu-Sheng; Jin, Ning-De

    2016-03-01

    Oil reservoirs with low permeability and porosity that are in the middle and late exploitation periods in China's onshore oil fields are mostly in the high-water-cut production stage. This stage is associated with severely non-uniform local-velocity flow profiles and dispersed-phase concentration (of oil droplets) in oil-water two-phase flow, which makes it difficult to measure water holdup in oil wells. In this study, we use an ultrasonic method based on a transmission-type sensor in oil-water two-phase flow to measure water holdup in low-velocity and high water-cut conditions. First, we optimize the excitation frequency of the ultrasonic sensor by calculating the sensitivity of the ultrasonic field using the finite element method for multiphysics coupling. Then we calculate the change trend of sound pressure level attenuation ratio with the increase in oil holdup to verify the feasibility of the employed diameter for the ultrasonic sensor. Based on the results, we then investigate the effects of oil-droplet diameter and distribution on the ultrasonic field. To further understand the measurement characteristics of the ultrasonic sensor, we perform a flow loop test on vertical upward oil-water two-phase flow and measure the responses of the optimized ultrasonic sensor. The results show that the ultrasonic sensor yields poor resolution for a dispersed oil slug in water flow (D OS/W flow), but the resolution is favorable for dispersed oil in water flow (D O/W flow) and very fine dispersed oil in water flow (VFD O/W flow). This research demonstrates the potential application of a pulsed-transmission ultrasonic method for measuring the fraction of individual components in oil-water two-phase flow with a low mixture velocity and high water cut.

  14. ON INTERACTIVE MESH CUTTING METHOD FOR 3D ENDOCARDIAL GEOMETRIC MODEL

    Institute of Scientific and Technical Information of China (English)

    周学礼; 万旺根; 王亚男; 李金博

    2014-01-01

    For the mesh cutting module of the endocardial model in an endocardial 3D mapping system, we present a new interactive mesh cutting algorithm. After the 3D endocardial geometric model is loaded, the system initialises a cutting plane model; the cutting plane is then moved, and its normal rotated, interactively by the user. With the help of matrix transformations, a mesh cut is performed on the 3D endocardial geometric model that reflects the user's cutting intention while preserving the detail features of the mesh, and the cutting plane eliminates the sawtooth effect. Experimental comparisons show that the algorithm achieves real-time, non-serrated interactive mesh cutting of 3D endocardial geometric models to the user's satisfaction.

  15. A Gauss-Newton method for the integration of spatial normal fields in shape Space

    KAUST Repository

    Balzer, Jonathan

    2011-08-09

    We address the task of adjusting a surface to a vector field of desired surface normals in space. The described method is entirely geometric in the sense that it does not depend on a particular parametrization of the surface in question. It amounts to solving a nonlinear least-squares problem in shape space. Previously, the corresponding minimization has been performed by gradient descent, which suffers from slow convergence and susceptibility to local minima. Newton-type methods, although significantly more robust and efficient, have not been attempted as they require second-order Hadamard differentials. These are difficult to compute for the problem of interest and in general fail to be positive-definite symmetric. We propose a novel approximation of the shape Hessian, which is not only rigorously justified but also leads to excellent numerical performance of the actual optimization. Moreover, a remarkable connection to Sobolev flows is exposed. Three other established algorithms from image and geometry processing turn out to be special cases of ours. Our numerical implementation is founded on a fast finite-element formulation on the minimizing sequence of triangulated shapes. A series of examples from a wide range of different applications is discussed to underline the flexibility and efficiency of the approach. © 2011 Springer Science+Business Media, LLC.
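
    Stripped of the shape-space machinery, the underlying iteration is the classical Gauss-Newton step, illustrated below on a small curve-fitting problem; this is a generic sketch of the Gauss-Newton idea, not the paper's shape Hessian approximation.

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, n_iter=20):
    """Generic Gauss-Newton iteration for min 0.5*||r(x)||^2, using the
    J^T J approximation of the Hessian (second-order terms dropped)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        r, J = residual(x), jacobian(x)
        x = x - np.linalg.solve(J.T @ J, J.T @ r)
    return x

# Toy problem: fit y = exp(a*t) + b to noisy data.
t = np.linspace(0, 1, 50)
rng = np.random.default_rng(6)
y = np.exp(0.8 * t) + 0.3 + 0.01 * rng.normal(size=t.size)
residual = lambda p: np.exp(p[0] * t) + p[1] - y
jacobian = lambda p: np.column_stack([t * np.exp(p[0] * t), np.ones_like(t)])
print(gauss_newton(residual, jacobian, x0=[0.0, 0.0]).round(3))   # ~ [0.8, 0.3]
```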

  16. The Stochastic Galerkin Method for Darcy Flow Problem with Log-Normal Random Field Coefficients

    Directory of Open Access Journals (Sweden)

    Michal Beres

    2017-01-01

    Full Text Available This article presents a study of the Stochastic Galerkin Method (SGM) applied to the Darcy flow problem with a log-normally distributed random material field given by a mean value and an autocovariance function. We divide the solution of the problem into two parts. The first one is the decomposition of a random field into a sum of products of a random vector and a function of spatial coordinates; this can be achieved using the Karhunen-Loeve expansion. The second part is the solution of the problem using SGM. SGM is a simple extension of the Galerkin method in which the random variables represent additional problem dimensions. For the discretization of the problem, we use a finite element basis for spatial variables and a polynomial chaos discretization for random variables. The results of SGM can be utilised for the analysis of the problem, such as the examination of the average flow, or as a tool for the Bayesian approach to inverse problems.
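
    The first part, the truncated Karhunen-Loeve expansion, can be sketched on a one-dimensional grid; the exponential covariance model, its parameters and the truncation level below are assumptions made for illustration only.

```python
import numpy as np

# Truncated Karhunen-Loeve expansion of a Gaussian field on a 1-D grid with an
# (assumed) exponential covariance; exponentiating a sample gives a log-normal
# material field of the kind used as the Darcy coefficient.
n, corr_len, sigma, n_terms = 200, 0.2, 1.0, 10
x = np.linspace(0.0, 1.0, n)
C = sigma ** 2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

eigvals, eigvecs = np.linalg.eigh(C)             # ascending order
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]

xi = np.random.default_rng(3).normal(size=n_terms)              # random vector
gauss_field = eigvecs[:, :n_terms] @ (np.sqrt(eigvals[:n_terms]) * xi)
k_field = np.exp(gauss_field)                    # log-normal coefficient field
print(k_field[:5].round(3))
```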

  17. Integrating atlas and graph cut methods for right ventricle blood-pool segmentation from cardiac cine MRI

    Science.gov (United States)

    Dangi, Shusil; Linte, Cristian A.

    2017-03-01

    Segmentation of right ventricle from cardiac MRI images can be used to build pre-operative anatomical heart models to precisely identify regions of interest during minimally invasive therapy. Furthermore, many functional parameters of right heart such as right ventricular volume, ejection fraction, myocardial mass and thickness can also be assessed from the segmented images. To obtain an accurate and computationally efficient segmentation of right ventricle from cardiac cine MRI, we propose a segmentation algorithm formulated as an energy minimization problem in a graph. Shape prior obtained by propagating label from an average atlas using affine registration is incorporated into the graph framework to overcome problems in ill-defined image regions. The optimal segmentation corresponding to the labeling with minimum energy configuration of the graph is obtained via graph-cuts and is iteratively refined to produce the final right ventricle blood pool segmentation. We quantitatively compare the segmentation results obtained from our algorithm to the provided gold-standard expert manual segmentation for 16 cine-MRI datasets available through the MICCAI 2012 Cardiac MR Right Ventricle Segmentation Challenge according to several similarity metrics, including Dice coefficient, Jaccard coefficient, Hausdorff distance, and Mean absolute distance error.
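
    A toy version of the graph-cut step, with terminal capacities taken from a per-pixel foreground prior (standing in for the atlas-propagated shape prior) and a constant smoothness penalty, is sketched below with the PyMaxflow library; it is a generic binary cut, not the authors' full iterative refinement pipeline.

```python
import numpy as np
import maxflow   # PyMaxflow

def binary_graphcut(img, prior, smoothness=2.0):
    """Minimal binary graph cut: terminal capacities encode -log label costs
    derived from a foreground prior, n-links impose a constant smoothness
    penalty. With this capacity choice, high-prior pixels end up on the source
    side, so the sink-side boolean is inverted to get the foreground mask."""
    eps = 1e-6
    g = maxflow.Graph[float]()
    nodes = g.add_grid_nodes(img.shape)
    g.add_grid_edges(nodes, smoothness)                 # pairwise smoothness term
    g.add_grid_tedges(nodes,
                      -np.log(1 - prior + eps),         # penalty for the "background" label
                      -np.log(prior + eps))             # penalty for the "foreground" label
    g.maxflow()
    return ~g.get_grid_segments(nodes)

img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
prior = 0.1 + 0.8 * img                                 # stand-in for an atlas prior
mask = binary_graphcut(img, prior)
print(mask.sum())                                       # ~256 pixels labeled foreground
```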

  18. The N/D method with non-perturbative left-hand-cut discontinuity and the ¹S₀ NN partial wave

    Science.gov (United States)

    Entem, D. R.; Oller, J. A.

    2017-10-01

    In this letter we introduce an integral equation that allows to calculate the exact left-hand-cut discontinuity for an uncoupled S-wave partial-wave amplitude in potential scattering for a given finite-range potential. In particular this is applied here to the ¹S₀ nucleon-nucleon (NN) partial wave. The calculation of Δ(A) is completely fixed by the potential because short-range physics (corresponding to integrated out degrees of freedom within the low-energy Effective Field Theory) does not contribute to Δ(A). The results obtained from the N/D method for a partial-wave amplitude are rigorous, since now the discontinuities along the left-hand cut and right-hand cut are exactly known. This solves in this case the open question with respect to the N/D method and the effect on the final result of the non-perturbative iterative diagrams in the evaluation of Δ(A). The solution of this problem also implies the equivalence of the N/D method and the Lippmann-Schwinger (LS) equation for the nonsingular one-pion exchange ¹S₀ NN potential (Yukawa potential). The equivalence between the N/D method with one extra subtraction and the LS equation renormalized with one counterterm or with subtractive renormalization also holds for the singular attractive ¹S₀ NN potentials calculated by including higher orders in Chiral Perturbation Theory (ChPT). However, the N/D method is more flexible and, rather straightforwardly, it allows to evaluate partial-wave amplitudes with a higher number of extra subtractions, that we fix in terms of shape parameters within the effective range expansion. We give results up to three extra subtractions in the N/D method, which provide a rather accurate reproduction of the ¹S₀ NN phase shifts when the NNLO ChPT potential is employed. Our new method then provides a general theory to renormalize non-perturbatively singular and regular potentials in scattering that can be extended to higher partial waves as well as to coupled channel scattering.

  19. Normal mode analysis of macromolecular systems with the mobile block Hessian method

    Energy Technology Data Exchange (ETDEWEB)

    Ghysels, An; Van Speybroeck, Veronique; Van Neck, Dimitri; Waroquier, Michel [Center for Molecular Modeling, Ghent University, Technologiepark 903, 9052 Zwijnaarde (Belgium); Brooks, Bernard R. [Laboratory for Computational Biology, National Heart, Lung and Blood Institute, National Institutes of Health, 5636 Fisher's Ln, Rockville, MD 20851 (United States)

    2015-01-22

    Until recently, normal mode analysis (NMA) was limited to small proteins, not only because the required energy minimization is a computationally exhausting task, but also because NMA requires the expensive diagonalization of a 3Nₐ×3Nₐ matrix, with Nₐ the number of atoms. A series of simplified models has been proposed, in particular the Rotation-Translation Blocks (RTB) method by Tama et al. for the simulation of proteins. It makes use of the concept that a peptide chain or protein can be seen as a subsequent set of rigid components, i.e. the peptide units. A peptide chain is thus divided into rigid blocks with six degrees of freedom each. Recently we developed the Mobile Block Hessian (MBH) method, which in a sense has similar features as the RTB method. The main difference is that MBH was developed to deal with partially optimized systems. The position/orientation of each block is optimized while the internal geometry is kept fixed at a plausible - but not necessarily optimized - geometry. This reduces the computational cost of the energy minimization. Applying the standard NMA on a partially optimized structure however results in spurious imaginary frequencies and unwanted coordinate dependence. The MBH avoids these unphysical effects by taking into account energy gradient corrections. Moreover the number of variables is reduced, which facilitates the diagonalization of the Hessian. In the original implementation of MBH, atoms could only be part of one rigid block. The MBH is now extended to the case where atoms can be part of two or more blocks. Two basic linkages can be realized: (1) blocks connected by one link atom, or (2) by two link atoms, where the latter is referred to as the hinge type connection. In this work we present the MBH concept and illustrate its performance with the crambin protein as an example.

  20. A generalized crystal-cutting method for modeling arbitrarily oriented crystals in 3D periodic simulation cells with applications to crystal-crystal interfaces

    Science.gov (United States)

    Kroonblawd, Matthew P.; Mathew, Nithin; Jiang, Shan; Sewell, Thomas D.

    2016-10-01

    A Generalized Crystal-Cutting Method (GCCM) is developed that automates construction of three-dimensionally periodic simulation cells containing arbitrarily oriented single crystals and thin films, two-dimensionally (2D) infinite crystal-crystal homophase and heterophase interfaces, and nanostructures with intrinsic N-fold interfaces. The GCCM is based on a simple mathematical formalism that facilitates easy definition of constraints on cut crystal geometries. The method preserves the translational symmetry of all Bravais lattices and thus can be applied to any crystal described by such a lattice including complicated, low-symmetry molecular crystals. Implementations are presented with carefully articulated combinations of loop searches and constraints that drastically reduce computational complexity compared to simple loop searches. Orthorhombic representations of monoclinic and triclinic crystals found using the GCCM overcome some limitations in standard distributions of popular molecular dynamics software packages. Stability of grain boundaries in β-HMX was investigated using molecular dynamics and molecular statics simulations with 2D infinite crystal-crystal homophase interfaces created using the GCCM. The order of stabilities for the four grain boundaries studied is predicted to correlate with the relative prominence of particular crystal faces in lab-grown β-HMX crystals. We demonstrate how nanostructures can be constructed through simple constraints applied in the GCCM framework. Example GCCM constructions are shown that are relevant to some current problems in materials science, including shock sensitivity of explosives, layered electronic devices, and pharmaceuticals.

  1. Determination of the cut-off score of an endoscopic scoring method to predict whether elderly patients with dysphagia can eat pureed diets

    Institute of Scientific and Technical Information of China (English)

    Torao Sakamoto; Akira Horiuchi; Toshiyuki Makino; Masashi Kajiyama; Naoki Tanaka; Masamitsu Hyodo

    2016-01-01

    AIM: To identify the cut-off value for predicting the ability of elderly patients with dysphagia to swallow pureed diets using a new endoscopic scoring method. METHODS: Endoscopic swallowing evaluation of pureed diets was performed in patients ≥ 65 years with dysphagia. The Hyodo-Komagane score for endoscopic swallowing evaluation is expressed as the sum (0-12) of four grades (0-3) for four parameters: (1) salivary pooling in the vallecula and piriform sinuses; (2) the response of the glottal closure reflex induced by touching the epiglottis with the endoscope; (3) the location of the bolus at the time of swallow onset, assessed by "white-out" following swallowing of test jelly; and (4) pharyngeal clearance after swallowing of test jelly. We used receiver operating characteristic (ROC) curve analysis to retrospectively analyze the association between the total score and successful oral intake of pureed diets. RESULTS: One hundred and seventy-eight patients were enrolled, including 113 men (63%); mean age was 83 years (range, 66-98). One hundred and twenty-six patients (71%) were able to eat pureed diets during the observation period (mean ± SD, 19 ± 14 d). In the ROC analysis, the cut-off value of the score for eating pureed diets was 7 (sensitivity = 0.98; specificity = 0.91). CONCLUSION: The Hyodo-Komagane endoscopic score is useful for predicting the ability to eat pureed diets in elderly patients with dysphagia.
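
    A generic way to pick such a cut-off from an ROC curve is the Youden index, sketched below on simulated scores; the data are made up and the study's exact criterion may differ.

```python
import numpy as np
from sklearn.metrics import roc_curve

# Simulated 0-12 endoscopic scores: higher scores tend to mean "unable to eat
# pureed diets". Cut-off chosen by maximizing the Youden index (tpr - fpr).
rng = np.random.default_rng(4)
able = rng.binomial(1, 0.7, size=178)                 # 1 = able to eat pureed diet
score = np.where(able == 1,
                 rng.integers(0, 8, size=178),        # able: mostly low scores
                 rng.integers(6, 13, size=178))       # unable: mostly high scores

fpr, tpr, thresholds = roc_curve(1 - able, score)     # high score predicts "unable"
youden = tpr - fpr
print("cut-off score:", thresholds[np.argmax(youden)])
```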

  2. Air in xylem vessels of cut flowers

    NARCIS (Netherlands)

    Nijsse, J.; Meeteren, van U.; Keijzer, C.J.

    2000-01-01

    Until now all studies on the role of air emboli in the water uptake of cut flowers describe indirect methods to demonstrate the presence of air in the plant tissues. Using cut chrysanthemum flowers, this report is the first one that directly visualises both air and water in xylem ducts of cut flowers.

  3. Optical properties of human normal small intestine tissue determined by Kubelka-Munk method in vitro

    Institute of Scientific and Technical Information of China (English)

    Hua-Jiang Wei; Da Xing; Guo-Yong Wu; Ying Jin; Huai-Min Gu

    2003-01-01

    AIM: To study the optical properties of human normal small intestine tissue at the 476.5 nm, 488 nm, 496.5 nm, 514.5 nm, 532 nm and 808 nm wavelengths of laser irradiation. METHODS: A double-integrating-sphere system, the basic principle of light-radiation measurement technology, and an optical model of biological tissues were used in the study. RESULTS: The measurements showed no significant differences in the absorption coefficients of human normal small intestine tissue at 476.5 nm, 488 nm and 496.5 nm laser irradiation in the Kubelka-Munk two-flux model (P>0.05). The absorption coefficients of the tissue at 514.5 nm, 532 nm and 808 nm laser irradiation obviously increased with decreasing wavelength. The scattering coefficients of the tissue at 476.5 nm, 488 nm and 496.5 nm laser irradiation increased with decreasing wavelength, while the scattering coefficients at 496.5 nm, 514.5 nm and 532 nm laser irradiation obviously increased with increasing wavelength. The scattering coefficient of the tissue at 532 nm laser irradiation was larger than that at 808 nm. There were no significant differences in the total attenuation coefficient of the tissue at 476.5 nm and 488 nm laser irradiation (P>0.05). The total attenuation coefficient of the tissue at 488 nm, 496.5 nm, 514.5 nm, 532 nm and 808 nm laser irradiation obviously increased with decreasing wavelength, and the effective attenuation coefficient showed the same trend. There were no significant differences among the forward scattered photon flux, backward scattered photon flux and total scattered photon flux of the tissue at 476.5 nm, 488 nm and 496.5 nm laser irradiation; they were all obviously attenuated with increasing tissue thickness. The attenuation of the forward, backward and total scattered photon fluxes of the tissue at 514.5 nm laser irradiation was slower than that at 476.5 nm, 488 nm and 496.5 nm laser irradiation.
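
    For reference, the standard Kubelka-Munk two-flux inversion, which recovers the absorption (K) and scattering (S) coefficients of a slab from its diffuse reflectance and transmittance, is sketched below with made-up measurement values; the study's double-integrating-sphere processing may differ in detail.

```python
import numpy as np

def kubelka_munk_ks(R_d, T_d, d_cm):
    """Invert the Kubelka-Munk two-flux slab solution: given diffuse
    reflectance R_d and transmittance T_d of a slab of thickness d_cm,
    return the absorption K and scattering S coefficients (per cm)."""
    a = (1.0 + R_d ** 2 - T_d ** 2) / (2.0 * R_d)
    b = np.sqrt(a ** 2 - 1.0)
    S = np.arcsinh(b * R_d / T_d) / (b * d_cm)
    K = S * (a - 1.0)
    return K, S

# Illustrative (made-up) measurements for a 0.1 cm tissue slab:
K, S = kubelka_munk_ks(R_d=0.35, T_d=0.30, d_cm=0.1)
print(f"K = {K:.2f} cm^-1, S = {S:.2f} cm^-1")
```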

  4. Study on Ceramic Cutting by Plasma Arc

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Engineering ceramics are typical difficult-to-machine materials because of their high hardness and brittleness. PAC (Plasma Arc Cutting) is a very important thermal cutting process and has been successfully used in cutting stainless steel and other difficult-to-machine alloys. PAC's application in cutting ceramics, however, is still limited because most ceramics are not good electrical conductors, and a transferred plasma arc cannot be produced between the cathode and the work-piece. So we presented a method of plasma ...

  5. [Air conducted ocular VEMP: I. Determination of a method and application in normal patients].

    Science.gov (United States)

    Walther, L E; Schaaf, H; Sommer, D; Hörmann, K

    2011-07-01

    Air conducted (AC) cervical vestibular evoked myogenic potentials (AC cVEMP) and air conducted ocular VEMP (AC oVEMP) may be used for measurement of otolith function. However, AC oVEMP have been little examined until now. The aim of this pilot study was to establish a method for the use of AC oVEMP in clinical practice. AC oVEMP were recorded in healthy volunteers (n=20) using intense AC sound stimulation (500 Hz tone bursts, 100 dB nHL). Normal thermal (caloric) irrigation results and normal AC cVEMP were the inclusion criteria. Values were evaluated statistically. AC oVEMP were recorded in all healthy subjects. Mean and standard deviation were 11.35±1.00 ms for the first negative peak (n10) and 16.30±1.10 ms for the first positive peak (p15). The mean amplitude was 7.70±4.50 μV. The stability of the n10 and p15 components was the same. AC oVEMP can be obtained easily and quickly. The n10 and p15 latencies may be used as parameters for clinical interpretation. Amplitude fluctuations are relatively large. The results can be used in further clinical investigation of AC oVEMP. © Georg Thieme Verlag KG Stuttgart · New York.

  6. Reconstruction of normal and abnormal gastric electrical sources using a potential based inverse method.

    Science.gov (United States)

    Kim, J H K; Du, P; Cheng, L K

    2013-09-01

    The use of cutaneous recordings to non-invasively characterize gastric slow waves has had limited clinical acceptance, primarily due to the uncertainty in relating the recorded signal to the underlying gastric slow waves. In this study we aim to distinguish and quantitatively reconstruct different slow wave patterns using an inverse algorithm. Slow wave patterns corresponding to normal, retrograde and uncoupled activity at different frequencies were imposed on a stomach surface model. Gaussian noise (10% peak-to-peak) was added to the cutaneous potentials, and the Greensite-Tikhonov inverse method was used to reconstruct the potentials on the stomach. The effect of the number and location of electrodes on the accuracy of the inverse solutions was investigated using four different electrode configurations. Results showed that the reconstructed solutions were able to reliably distinguish the different slow wave patterns, and waves with lower frequency were better correlated to the known solution than those with higher frequency. The use of up to 228 electrodes improved the accuracy of the inverse solutions; however, 120 electrodes concentrated around the stomach achieved similar results. The most efficient electrode configuration for our model involved 120 electrodes with an inter-electrode distance of 32 mm.
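
    The Greensite-Tikhonov solution is a variant of Tikhonov regularization; a plain zero-order Tikhonov sketch on an ill-conditioned toy problem with roughly 10% noise is given below to show the basic idea (it is not the Greensite-combined estimator used in the study, and the forward matrix here is synthetic).

```python
import numpy as np

def tikhonov(A, b, lam):
    """Zero-order Tikhonov solution x = argmin ||Ax - b||^2 + lam^2 ||x||^2."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam ** 2 * np.eye(n), A.T @ b)

# Ill-conditioned toy transfer matrix standing in for the stomach-to-skin
# forward model; 10% peak-to-peak noise roughly mirrors the test conditions.
rng = np.random.default_rng(5)
A = rng.normal(size=(120, 60)) @ np.diag(1.0 / np.arange(1, 61) ** 2)
x_true = np.sin(np.linspace(0, 4 * np.pi, 60))
b = A @ x_true
b = b + 0.1 * np.ptp(b) * rng.normal(size=b.size)

x_hat = tikhonov(A, b, lam=1e-2)
print(f"relative error: {np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true):.2f}")
```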

  7. Quantitative Analysis of Differential Proteome Expression in Bladder Cancer vs. Normal Bladder Cells Using SILAC Method.

    Directory of Open Access Journals (Sweden)

    Ganglong Yang

    Full Text Available The best way to increase patient survival rate is to identify patients who are likely to progress to muscle-invasive or metastatic disease upfront and treat them more aggressively. The human cell lines HCV29 (normal bladder epithelia), KK47 (low grade nonmuscle invasive bladder cancer, NMIBC), and YTS1 (metastatic bladder cancer) have been widely used in studies of molecular mechanisms and cell signaling during bladder cancer (BC) progression. However, little attention has been paid to global quantitative proteome analysis of these three cell lines. We labeled HCV29, KK47, and YTS1 cells by the SILAC method using three stable isotopes each of arginine and lysine. Labeled proteins were analyzed by 2D ultrahigh-resolution liquid chromatography LTQ Orbitrap mass spectrometry. Among 3721 unique identified and annotated proteins in KK47 and YTS1 cells, 36 were significantly upregulated and 74 were significantly downregulated with >95% confidence. Differential expression of these proteins was confirmed by western blotting, quantitative RT-PCR, and cell staining with specific antibodies. Gene ontology (GO) term and pathway analysis indicated that the differentially regulated proteins were involved in DNA replication and molecular transport, cell growth and proliferation, cellular movement, immune cell trafficking, and cell death and survival. These proteins and the advanced proteome techniques described here will be useful for further elucidation of molecular mechanisms in BC and other types of cancer.

  8. NDT-Bobath method in normalization of muscle tone in post-stroke patients.

    Science.gov (United States)

    Mikołajewska, Emilia

    2012-01-01

    Ischaemic stroke is responsible for 80-85% of strokes. There is great interest in finding effective methods of rehabilitation for post-stroke patients. The aim of this study was to assess the results of rehabilitation carried out in the normalization of upper limb muscle tonus in patients, estimated on the Ashworth Scale for Grading Spasticity. The examined group consisted of 60 patients after ischaemic stroke. 10 sessions of NDT-Bobath therapy were provided within 2 weeks (ten days of therapy). Patient examinations using the Ashworth Scale for Grading Spasticity were done twice: the first time on admission and the second after the last session of the therapy to assess rehabilitation effects. Among the patients involved in the study, the results measured on the Ashworth Scale (where possible) were as follows: recovery in 16 cases (26.67%), relapse in 1 case (1.67%), no measurable changes (or change within the same grade of the scale) in 8 cases (13.33%). Statistically significant changes were observed in the health status of the patients. These changes, in the area of muscle tone, were favorable and reflected in the outcomes of the assessment using the Ashworth Scale for Grading Spasticity.

  9. Gluebond strength of laser cut wood

    Science.gov (United States)

    Charles W. McMillin; Henry A. Huber

    1985-01-01

    The degree of strength loss when gluing laser cut wood as compared to conventionally sawn wood and the amount of additional surface treatment needed to improve bond quality were assessed under normal furniture plant operating conditions. The strength of laser cut oak glued with polyvinyl acetate adhesive was reduced to 75 percent of sawn joints and gum was reduced 43...

  10. A high-order multi-zone cut-stencil method for numerical simulations of high-speed flows over complex geometries

    Science.gov (United States)

    Greene, Patrick T.; Eldredge, Jeff D.; Zhong, Xiaolin; Kim, John

    2016-07-01

    In this paper, we present a method for performing uniformly high-order direct numerical simulations of high-speed flows over arbitrary geometries. The method was developed with the goal of simulating and studying the effects of complex isolated roughness elements on the stability of hypersonic boundary layers. The simulations are carried out on Cartesian grids with the geometries imposed by a third-order cut-stencil method. A fifth-order hybrid weighted essentially non-oscillatory scheme was implemented to capture any steep gradients in the flow created by the geometries and a third-order Runge-Kutta method is used for time advancement. A multi-zone refinement method was also utilized to provide extra resolution at locations with expected complex physics. The combination results in a globally fourth-order scheme in space and third order in time. Results confirming the method's high order of convergence are shown. Two-dimensional and three-dimensional test cases are presented and show good agreement with previous results. A simulation of Mach 3 flow over the logo of the Ubuntu Linux distribution is shown to demonstrate the method's capabilities for handling complex geometries. Results for Mach 6 wall-bounded flow over a three-dimensional cylindrical roughness element are also presented. The results demonstrate that the method is a promising tool for the study of hypersonic roughness-induced transition.

  11. A comprehensive comparison of normalization methods for loading control and variance stabilization of reverse-phase protein array data.

    Science.gov (United States)

    Liu, Wenbin; Ju, Zhenlin; Lu, Yiling; Mills, Gordon B; Akbani, Rehan

    2014-01-01

    Loading control (LC) and variance stabilization of reverse-phase protein array (RPPA) data have been challenging mainly due to the small number of proteins in an experiment and the lack of reliable inherent control markers. In this study, we compare eight different normalization methods for LC and variance stabilization. The invariant marker set concept was first applied to the normalization of high-throughput gene expression data. A set of "invariant" markers are selected to create a virtual reference sample. Then all the samples are normalized to the virtual reference. We propose a variant of this method in the context of RPPA data normalization and compare it with seven other normalization methods previously reported in the literature. The invariant marker set method performs well with respect to LC, variance stabilization and association with the immunohistochemistry/fluorescence in situ hybridization data for three key markers in breast tumor samples, while the other methods have inferior performance. The proposed method is a promising approach for improving the quality of RPPA data.
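
    As a rough illustration of the invariant-marker idea described above, the sketch below (not the authors' implementation) treats the markers whose ranks vary least across samples as invariant, builds a virtual reference from their median profile, and rescales each sample by its median ratio to that reference; the stability criterion and the number of invariant markers are assumptions made for the example.

```python
import numpy as np

def invariant_marker_normalize(X, n_invariant=10):
    """Normalize an RPPA-like matrix X (proteins x samples).

    Generic sketch of invariant-marker normalization: markers whose rank
    varies least across samples are treated as 'invariant', their median
    profile forms a virtual reference sample, and every sample is scaled
    by its median ratio to that reference.
    """
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)   # per-sample ranks
    stability = ranks.var(axis=1)                       # rank variance per marker
    invariant = np.argsort(stability)[:n_invariant]     # most stable markers
    reference = np.median(X[invariant, :], axis=1)      # virtual reference sample
    # per-sample scaling factor: median ratio of invariant markers to reference
    factors = np.median(X[invariant, :] / reference[:, None], axis=0)
    return X / factors[None, :]

# toy example: 50 proteins, 6 samples with different loading
rng = np.random.default_rng(0)
X = rng.lognormal(mean=2.0, sigma=0.3, size=(50, 6)) * rng.uniform(0.5, 2.0, size=6)
X_norm = invariant_marker_normalize(X)
print(X_norm.std(axis=0))   # loading differences between samples are reduced
```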

  12. Multibeam fiber laser cutting

    DEFF Research Database (Denmark)

    Olsen, Flemming Ove; Hansen, Klaus Schütt; Nielsen, Jakob Skov

    2009-01-01

    The appearance of the high power high brilliance fiber laser has opened for new possibilities in laser materials processing. In laser cutting this laser has demonstrated high cutting performance compared to the dominating cutting laser, the CO2 laser. However, quality problems in fiber-laser cutting have until now limited its application to metal cutting. In this paper the first results of proof-of-principle studies applying a new approach (patent pending) for laser cutting with high brightness and short wavelength lasers will be presented. In the approach, multibeam patterns are applied to control the melt flow out of the cut kerf, resulting in improved cut quality in metal cutting. The beam patterns in this study are created by splitting up beams from two single mode fiber lasers and combining these beams into a pattern in the cut kerf. The results are obtained with a total of 550 W...

  13. Multibeam Fibre Laser Cutting

    DEFF Research Database (Denmark)

    Olsen, Flemming Ove

    The appearance of the high power high brilliance fibre laser has opened for new possibilities in laser materials processing. In laser cutting this laser has demonstrated high cutting performance compared to the dominating cutting laser, the CO2-laser. However, quality problems in fibre-laser cutting have until now limited its application in metal cutting. In this paper the first results of proof-of-principle studies applying a new approach (patent pending) for laser cutting with high brightness short wavelength lasers will be presented. In the approach, multi beam patterns are applied to control the melt flow out of the cut kerf, resulting in improved cut quality in metal cutting. The beam patterns in this study are created by splitting up beams from 2 single mode fibre lasers and combining these beams into a pattern in the cut kerf. The results are obtained with a total of 550 W of single...

  14. Feasibility of Computed Tomography-Guided Methods for Spatial Normalization of Dopamine Transporter Positron Emission Tomography Image.

    Directory of Open Access Journals (Sweden)

    Jin Su Kim

    Full Text Available Spatial normalization is a prerequisite step for analyzing positron emission tomography (PET) images both by using a volume-of-interest (VOI) template and by voxel-based analysis. Magnetic resonance (MR) or ligand-specific PET templates are currently used for spatial normalization of PET images. We used computed tomography (CT) images acquired with a PET/CT scanner for the spatial normalization of [18F]-N-3-fluoropropyl-2-betacarboxymethoxy-3-beta-(4-iodophenyl) nortropane (FP-CIT) PET images and compared target-to-cerebellar standardized uptake value ratio (SUVR) values with those obtained from MR- or PET-guided spatial normalization methods in healthy controls and patients with Parkinson's disease (PD). We included 71 healthy controls and 56 patients with PD who underwent [18F]-FP-CIT PET scans with a PET/CT scanner and T1-weighted MR scans. Spatial normalization of MR images was done with a conventional spatial normalization tool (cvMR) and with the DARTEL toolbox (dtMR) in statistical parametric mapping software. The CT images were modified in two ways, skull-stripping (ssCT) and intensity transformation (itCT). We normalized PET images with cvMR-, dtMR-, ssCT-, itCT-, and PET-guided methods by using specific templates for each modality and measured striatal SUVR with a VOI template. The SUVR values measured with FreeSurfer-generated VOIs (FSVOI) overlaid on original PET images were also used as a gold standard for comparison. The SUVR values derived from all four structure-guided spatial normalization methods were highly correlated with those measured with FSVOI (P < 0.0001). Putaminal SUVR values were highly effective for discriminating PD patients from controls. However, the PET-guided method excessively overestimated striatal SUVR values in the PD patients by more than 30% in the caudate and putamen, and thereby spoiled the linearity between the striatal SUVR values in all subjects and showed lower disease discrimination ability. Two CT-guided methods showed

  15. Methods to evaluate normal rainfall for short-term wetland hydrology assessment

    Science.gov (United States)

    Jaclyn Sumner; Michael J. Vepraskas; Randall K. Kolka

    2009-01-01

    Identifying sites meeting wetland hydrology requirements is simple when long-term (>10 years) records are available. Because such data are rare, we hypothesized that a single-year of hydrology data could be used to reach the same conclusion as with long-term data, if the data were obtained during a period of normal or below normal rainfall. Long-term (40-45 years)...

  16. Research on the technology of laser cutting LCD glass substrates based on the thermal cracking method

    Institute of Scientific and Technical Information of China (English)

    汪旭煌; 姚建华; 周国斌; 楼程华; 杨渊思

    2011-01-01

    In order to cut liquid crystal display (LCD) glass substrates with a controlled crack, a new method was put forward. First, an initial crack was prepared on the surface of the LCD glass substrate with an Nd:YAG laser. Then the substrate was heated with a CO2 laser as the heat source and cooled with Ar gas. The effects of laser spot size and LCD glass substrate thickness on cutting quality were analyzed, and the results were compared with traditional mechanical cutting. The cut surface and cross-section after laser treatment were examined by scanning electron microscopy (SEM). The results show that substrates cut by the traditional mechanical method contain many micro cracks, whereas the surface cut by the thermal-cracking laser method is smooth and flat, free of burrs and micro cracks, and the affected zone of the cut cross-section is less than 20 μm. Thermal-crack laser cutting is therefore clearly superior to mechanical cutting for LCD glass substrates.

  17. Research on Correction Method of Tool Decentration in Ultra Precision Cutting

    Institute of Scientific and Technical Information of China (English)

    王毅; 余景池

    2012-01-01

    Correction of tool decentration is the primary task in ultra-precision turning and is of great significance for improving its efficiency and accuracy. Lacking theoretical guidance, the traditional correction methods suffer from low efficiency and low accuracy, and problems remain even when imported software is used to assist the correction. The cutting errors caused by the offset of the diamond tool in ultra-precision cutting are analyzed theoretically. Based on this analysis, principles for improving the accuracy and efficiency of the correction are given, and a new tool offset correction method, a Twyman-Green interferometer online test method, is put forward. Its accuracy is verified by a practical experiment.

  18. Study of propagation of Berberis thunbergii L. by cuttings, with using less-known methods of stimulation

    Directory of Open Access Journals (Sweden)

    Martin Říha

    2007-01-01

    Full Text Available Different types of in-house produced stimulators were tested on Berberis thunbergii L. 'Green Carpet', Berberis thunbergii 'Red Shift' and Berberis thunbergii 'Aureum'. We used a combination of growth inhibitors with the quick-dip method, the quick-dip method alone in an acetone solution, and a stimulant in gel form. The growth inhibitors tested included paclobutrazol and CCC. As auxins in the quick-dip method we used IBK, NAA, IAA and nicotinic acid, with an acetone solution as the medium.

  19. A systematic study of normalization methods for Infinium 450K methylation data using whole-genome bisulfite sequencing data.

    Science.gov (United States)

    Wang, Ting; Guan, Weihua; Lin, Jerome; Boutaoui, Nadia; Canino, Glorisa; Luo, Jianhua; Celedón, Juan Carlos; Chen, Wei

    2015-01-01

    DNA methylation plays an important role in disease etiology. The Illumina Infinium HumanMethylation450 (450K) BeadChip is a widely used platform in large-scale epidemiologic studies. This platform can efficiently and simultaneously measure methylation levels at ∼480,000 CpG sites in the human genome in multiple study samples. Due to the intrinsic chip design of 2 types of chemistry probes, data normalization or preprocessing is a critical step to consider before data analysis. To date, numerous methods and pipelines have been developed for this purpose, and some studies have been conducted to evaluate different methods. However, validation studies have often been limited to a small number of CpG sites to reduce the variability in technical replicates. In this study, we measured methylation on a set of samples using both whole-genome bisulfite sequencing (WGBS) and 450K chips. We used WGBS data as a gold standard of true methylation states in cells to compare the performances of 8 normalization methods for 450K data on a genome-wide scale. Analyses on our dataset indicate that the most effective methods are peak-based correction (PBC) and quantile normalization plus β-mixture quantile normalization (QN.BMIQ). To our knowledge, this is the first study to systematically compare existing normalization methods for Illumina 450K data using novel WGBS data. Our results provide a benchmark reference for the analysis of DNA methylation chip data, particularly in white blood cells.
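
    Of the normalization pipelines compared above, plain quantile normalization is the easiest to write down; the snippet below is a minimal, generic sketch of that single step (it does not include the BMIQ correction for the two Infinium probe chemistries).

```python
import numpy as np

def quantile_normalize(beta):
    """Quantile-normalize a matrix of beta values (CpG sites x samples).

    Each sample's sorted values are replaced by the mean of the sorted
    values across samples, so all samples share one distribution.
    """
    order = np.argsort(beta, axis=0)                  # sort order per sample
    ranks = np.argsort(order, axis=0)                 # rank of each value
    mean_sorted = np.sort(beta, axis=0).mean(axis=1)  # reference distribution
    return mean_sorted[ranks]

rng = np.random.default_rng(1)
beta = rng.beta(2, 5, size=(1000, 4))                 # toy methylation matrix
print(quantile_normalize(beta).mean(axis=0))          # columns now share a distribution
```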

  20. A friction self-locking clamping method for cutting tapered-hole workpieces

    Institute of Scientific and Technical Information of China (English)

    何端

    2014-01-01

    Friction-type self-locking clamping for cutting is a manufacturing method for thin-walled workpieces with relatively large tapered holes. It yields a smooth appearance with no tool-junction marks, high coaxiality of the inner and outer diameters, no deformation of or clamping damage to the workpiece, and high productivity.

  1. Stacked propagation: a new way to grow native plants from root cuttings

    Science.gov (United States)

    David R. Dreesen; Thomas D. Landis; Jeremy R. Pinto

    2006-01-01

    Stacked propagation is a novel method of growing quaking aspen (Populus tremuloides Michx. [Salicaceae]) and other plants that reproduce from underground stems or root cuttings. Because the mother plant is not damaged, it is particularly well suited for rare plants or those that can’t be propagated by normal methods. Our initial trials indicate that...

  2. An Improved Genetic Fuzzy Logic Control Method to Reduce the Enlargement of Coal Floor Deformation in Shearer Memory Cutting Process.

    Science.gov (United States)

    Tan, Chao; Xu, Rongxin; Wang, Zhongbin; Si, Lei; Liu, Xinhua

    2016-01-01

    In order to reduce the enlargement of coal floor deformation and the manual adjustment frequency of rocker arms, an improved approach integrating an improved genetic algorithm and fuzzy logic control (GFLC) is proposed. The enlargement of coal floor deformation is analyzed and a model is built. Then, the framework of the proposed approach is built. Moreover, the constituents of the GA, such as tangent function roulette wheel selection (Tan-RWS), uniform crossover, and nonuniform mutation, are employed to enhance the performance of the GFLC. Finally, two simulation examples and an industrial application example are carried out, and the results indicate that the proposed method is feasible and efficient.
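
    The abstract names the GA operators but not their formulas, so the sketch below shows only one plausible reading of tangent-function roulette-wheel selection: fitness is rescaled through a tangent curve to sharpen selection pressure before the usual proportional draw. The tan-based mapping and its constants are assumptions for illustration, not the paper's definition.

```python
import numpy as np

def tan_roulette_select(fitness, n_select, rng=None):
    """Roulette-wheel selection with tangent-rescaled fitness (illustrative).

    Fitness is mapped into (0, pi/2) and passed through tan(), which
    exaggerates differences between good and bad individuals before the
    standard proportional draw.  The exact mapping used in the paper may differ.
    """
    rng = rng or np.random.default_rng()
    f = np.asarray(fitness, dtype=float)
    scaled = (f - f.min()) / (f.max() - f.min() + 1e-12)   # map to [0, 1]
    weights = np.tan(0.5 * np.pi * 0.98 * scaled) + 1e-6    # sharpen selection pressure
    probs = weights / weights.sum()
    return rng.choice(len(f), size=n_select, p=probs)

fitness = [1.0, 2.0, 3.5, 3.6, 0.5]
print(tan_roulette_select(fitness, n_select=3, rng=np.random.default_rng(0)))
```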

  3. Cut elimination in multifocused linear logic

    DEFF Research Database (Denmark)

    Guenot, Nicolas; Brock-Nannestad, Taus

    2015-01-01

    We study cut elimination for a multifocused variant of full linear logic in the sequent calculus. The multifocused normal form of proofs yields problems that do not appear in a standard focused system, related to the constraints in grouping rule instances in focusing phases. We show that cut elimination can be performed in a sensible way even though the proof requires some specific lemmas to deal with multifocusing phases, and discuss the difficulties arising with cut elimination when considering normal forms of proofs in linear logic.

  4. Cut elimination in multifocused linear logic

    DEFF Research Database (Denmark)

    Guenot, Nicolas; Brock-Nannestad, Taus

    2015-01-01

    We study cut elimination for a multifocused variant of full linear logic in the sequent calculus. The multifocused normal form of proofs yields problems that do not appear in a standard focused system, related to the constraints in grouping rule instances in focusing phases. We show that cut elimination can be performed in a sensible way even though the proof requires some specific lemmas to deal with multifocusing phases, and discuss the difficulties arising with cut elimination when considering normal forms of proofs in linear logic.

  5. Raman Spectroscopic Methods for Classification of Normal and Malignant Hypopharyngeal Tissues: An Exploratory Study

    Directory of Open Access Journals (Sweden)

    Parul Pujary

    2011-01-01

    Full Text Available Laryngeal cancer is more common in males. The present study is aimed at exploring the potential of conventional Raman spectroscopy for classifying normal versus malignant laryngopharyngeal tissue. We have recorded Raman spectra of twenty tissues (aryepiglottic fold) using an in-house built Raman setup. The spectral features of the mean malignant spectrum suggest an abundance of proteins, whereas those of the mean normal spectrum indicate an abundance of lipids. PCA was employed as the discriminating algorithm. Both unsupervised and supervised modes of analysis, as well as a match/mismatch “limit test” methodology, yielded clear classification among tissue types. The findings of this study demonstrate the efficacy of conventional Raman spectroscopy in the classification of normal and malignant laryngopharyngeal tissues. A rigorous evaluation of the models, together with the development of a suitable fibreoptic probe, may enable real-time Raman spectroscopic diagnosis of laryngopharyngeal cancers in the future.
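
    As a stand-in for the PCA-based discrimination described above, the snippet below projects synthetic spectra (with a protein-like and a lipid-like band) onto principal components and scores a simple supervised classifier in the reduced space; the synthetic bands, component count and classifier choice are illustrative assumptions, not the study's pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
# synthetic stand-ins for normal (lipid-like band) / malignant (protein-like band) spectra
wavenumbers = np.linspace(800, 1800, 500)
normal_mean = np.exp(-((wavenumbers - 1450) / 60) ** 2)
malignant_mean = np.exp(-((wavenumbers - 1655) / 50) ** 2)
X = np.vstack([normal_mean + 0.05 * rng.standard_normal((10, 500)),
               malignant_mean + 0.05 * rng.standard_normal((10, 500))])
y = np.array([0] * 10 + [1] * 10)

scores = PCA(n_components=5).fit_transform(X)   # unsupervised dimensionality reduction
clf = LinearDiscriminantAnalysis()              # supervised step on the PC scores
print(cross_val_score(clf, scores, y, cv=5).mean())
```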

  6. Flexible Laser Metal Cutting

    DEFF Research Database (Denmark)

    Villumsen, Sigurd; Jørgensen, Steffen Nordahl; Kristiansen, Morten

    2014-01-01

    This paper describes a new flexible and fast approach to laser cutting called ROBOCUT. Combined with CAD/CAM technology, laser cutting of metal provides the flexibility to perform one-of-a-kind cutting and hereby realises mass production of customised products. Today’s laser cutting techniques possess, despite their wide use in industry, limitations regarding speed and geometry. Research trends point towards remote laser cutting techniques which can improve speed and geometrical freedom and hereby the competitiveness of laser cutting compared to fixed-tool-based cutting technology such as punching. This paper presents the concepts and preliminary test results of the ROBOCUT laser cutting technology, a technology which potentially can revolutionise laser cutting.

  7. Cutting state identification

    Energy Technology Data Exchange (ETDEWEB)

    Berger, B.S.; Minis, I.; Rokni, M. [Univ. of Maryland, College Park, MD (United States)] [and others]

    1997-12-31

    Cutting states associated with the orthogonal cutting of stiff cylinders are identified through an analysis of the singular values of a Toeplitz matrix of third order cumulants of acceleration measurements. The ratio of the two pairs of largest singular values is shown to differentiate between light cutting, medium cutting, pre-chatter and chatter states. Sequences of cutting experiments were performed in which either depth of cut or turning frequency was varied. Two sequences of experiments with variable turning frequency and five with variable depth of cut, 42 cutting experiments in all, provided a database for the calculation of third order cumulants. Ratios of singular values of cumulant matrices find application in the analysis of control of orthogonal cutting.

  8. Performances of cutting fluids in turning. Mineral oil - RM

    DEFF Research Database (Denmark)

    Axinte, Dragos Aurelian; Belluco, Walter

    1999-01-01

    The scope of the present measurement campaign is the evaluation of cutting fluid performance. The report presents the standard routine and the results obtained when turning stainless steel and brass with a commercial vegetable-based oil called RM. The methods were developed to be applicable in normal workshop conditions using common equipment for turning as well as in a test laboratory. The evaluation tests can be carried out using the desired number of repetitions in terms of workpiece materials and tools.

  9. Research and development of cutting methods for removing piles of abandoned offshore platforms

    Institute of Scientific and Technical Information of China (English)

    王海波; 张岚; 孟庆鑫; 王喆

    2011-01-01

    Pile cutting is one of the more difficult tasks in the removal of abandoned offshore platforms. Five cutting methods applicable to platform piles are reviewed: underwater flame cutting by divers, shaped-charge blasting cutting, rotary internal mechanical cutting, high-pressure abrasive jet cutting, and diamond wire saw cutting. The development of diamond wire saw cutting of marine structures is described in detail, the structure and function of wire saws used for cutting abandoned platform piles are comprehensively introduced, the latest foreign achievements in this area are analyzed, and some typical engineering cases of wire saw pile cutting are presented. Comparative analysis shows that diamond wire saw cutting is a relatively safe, reliable, advanced and environmentally friendly cutting method.

  10. Longwall mining “cutting cantilever beam theory” and 110 mining method in China—The third mining science innovation

    OpenAIRE

    Manchao He; Guolong Zhu; Zhibiao Guo

    2015-01-01

    With the third innovation in science and technology worldwide, China has also experienced this marvelous progress. Concerning the longwall mining in China, the “masonry beam theory” (MBT) was first proposed in the 1960s, illustrating that the transmission and equilibrium method of overburden pressure using reserved coal pillar in mined-out areas can be realized. This forms the so-called “121 mining method”, which lays a solid foundation for development of mining science and technology in Chin...

  11. Virtual tissue alignment and cutting plane definition--a new method to obtain optimal longitudinal histological sections.

    Science.gov (United States)

    Danz, J C; Habegger, M; Bosshardt, D D; Katsaros, C; Stavropoulos, A

    2014-02-01

    Histomorphometric evaluation of the buccal aspects of periodontal tissues in rodents requires reproducible alignment of maxillae and highly precise sections containing central sections of buccal roots; this is a cumbersome and technically sensitive process due to the small specimen size. The aim of the present report is to describe and analyze a method to transfer virtual sections of micro-computer tomographic (CT)-generated image stacks to the microtome for undecalcified histological processing and to describe the anatomy of the periodontium in rat molars. A total of 84 undecalcified sections of all buccal roots of seven untreated rats was analyzed. The accuracy of section coordinate transfer from virtual micro-CT slice to the histological slice, right-left side differences and the measurement error for linear and angular measurements on micro-CT and on histological micrographs were calculated using the Bland-Altman method, interclass correlation coefficient and the method of moments estimator. Also, manual alignment of the micro-CT-scanned rat maxilla was compared with multiplanar computer-reconstructed alignment. The supra alveolar rat anatomy is rather similar to human anatomy, whereas the alveolar bone is of compact type and the keratinized gingival epithelium bends apical to join the junctional epithelium. The high methodological standardization presented herein ensures retrieval of histological slices with excellent display of anatomical microstructures, in a reproducible manner, minimizes random errors, and thereby may contribute to the reduction of number of animals needed.

  12. Virtual tissue alignment and cutting plane definition – a new method to obtain optimal longitudinal histological sections

    Science.gov (United States)

    Danz, J C; Habegger, M; Bosshardt, D D; Katsaros, C; Stavropoulos, A

    2014-01-01

    Histomorphometric evaluation of the buccal aspects of periodontal tissues in rodents requires reproducible alignment of maxillae and highly precise sections containing central sections of buccal roots; this is a cumbersome and technically sensitive process due to the small specimen size. The aim of the present report is to describe and analyze a method to transfer virtual sections of micro-computer tomographic (CT)-generated image stacks to the microtome for undecalcified histological processing and to describe the anatomy of the periodontium in rat molars. A total of 84 undecalcified sections of all buccal roots of seven untreated rats was analyzed. The accuracy of section coordinate transfer from virtual micro-CT slice to the histological slice, right–left side differences and the measurement error for linear and angular measurements on micro-CT and on histological micrographs were calculated using the Bland–Altman method, interclass correlation coefficient and the method of moments estimator. Also, manual alignment of the micro-CT-scanned rat maxilla was compared with multiplanar computer-reconstructed alignment. The supra alveolar rat anatomy is rather similar to human anatomy, whereas the alveolar bone is of compact type and the keratinized gingival epithelium bends apical to join the junctional epithelium. The high methodological standardization presented herein ensures retrieval of histological slices with excellent display of anatomical microstructures, in a reproducible manner, minimizes random errors, and thereby may contribute to the reduction of number of animals needed. PMID:24266502
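
    For readers unfamiliar with the agreement statistic used above, a minimal Bland-Altman computation is sketched below: the bias is the mean paired difference between the two measurement methods and the 95% limits of agreement are bias ± 1.96 standard deviations. The numbers are made up for illustration and are not the study's data.

```python
import numpy as np

def bland_altman(a, b):
    """Return bias and 95% limits of agreement between two measurement methods."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# hypothetical linear measurements (mm) on micro-CT slices vs. histological micrographs
micro_ct = [1.02, 0.98, 1.10, 1.05, 0.95, 1.00]
histology = [1.00, 0.97, 1.08, 1.07, 0.96, 0.99]
bias, loa = bland_altman(micro_ct, histology)
print(f"bias = {bias:.3f} mm, limits of agreement = ({loa[0]:.3f}, {loa[1]:.3f}) mm")
```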

  13. Laser Cutting of Leather: Tool for Industry or Designers?

    Science.gov (United States)

    Stepanov, Alexander; Manninen, Matti; Pärnänen, Inni; Hirvimäki, Marika; Salminen, Antti

    Technologies currently applied for leather cutting include slitting knives, die press techniques and manual cutting. The use of laser technology has grown significantly in recent years due to a number of advantages over conventional cutting methods: flexibility, high production speed, the possibility to cut complex geometries, easier cutting of customized parts, and less leftover leather make laser cutting increasingly attractive economically. Laser technology provides advantages in cutting complex geometries, stable cutting quality and the possibility to utilize leather material in the economically best way. Constant quality is important in industrial processes and laser technology fulfills this requirement: properly chosen laser cutting parameters provide identical cuts. Additionally, laser technology is very flexible in terms of geometries: complex geometries, individual designs, prototypes and small-scale products can be manufactured by laser cutting. Products that need to be cut in small volumes are also an application where laser cutting can be beneficial, owing to the possibility of changing production from one product to another simply by changing the geometry, without the need to change the cutting tool. Disadvantages of laser processing include high initial investment costs and some running costs due to maintenance and the required gas supply for the laser. A higher level of operator expertise is also required because the machinery is more complicated. This study investigates the advantages and disadvantages of laser cutting in different areas of application and provides a comparison between laser cutting and mechanical cutting of leather.

  14. Investigating the Effect of Normalization Norms in Flexible Manufacturing System Selection Using Multi-Criteria Decision-Making Methods

    Directory of Open Access Journals (Sweden)

    Prasenjit Chatterjee

    2014-07-01

    Full Text Available The main objective of this paper is to assess the effect of different normalization norms within multi-criteria decision-making (MCDM) models. Three well-accepted MCDM tools, namely preference ranking organization method for enrichment evaluation (PROMETHEE), grey relation analysis (GRA) and technique for order preference by similarity to ideal solution (TOPSIS), are applied for solving a flexible manufacturing system (FMS) selection problem in a discrete manufacturing environment. Finally, by introducing different normalization norms into the decision algorithms, their effect on the FMS selection problem using these MCDM models is also studied.
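
    To make the role of the normalization norm concrete, the sketch below runs a tiny TOPSIS ranking twice, once with vector (Euclidean) normalization and once with linear max normalization; the decision matrix, weights and criteria are invented for illustration and do not come from the paper's FMS data.

```python
import numpy as np

def topsis(matrix, weights, benefit, norm="vector"):
    """Rank alternatives with TOPSIS under a chosen normalization norm."""
    X = np.asarray(matrix, float)
    if norm == "vector":                      # Euclidean (vector) normalization
        R = X / np.sqrt((X ** 2).sum(axis=0))
    else:                                     # linear max normalization
        R = X / X.max(axis=0)
    V = R * weights                           # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_plus = np.linalg.norm(V - ideal, axis=1)
    d_minus = np.linalg.norm(V - anti, axis=1)
    return d_minus / (d_plus + d_minus)       # closeness coefficient (higher is better)

# 3 hypothetical FMS alternatives x 3 criteria (cost is a non-benefit criterion)
M = [[120, 0.85, 7], [150, 0.90, 9], [100, 0.80, 6]]
w = np.array([0.4, 0.4, 0.2])
benefit = np.array([False, True, True])
print("vector norm :", topsis(M, w, benefit, "vector"))
print("linear norm :", topsis(M, w, benefit, "linear"))
```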

  15. Simulation of Laser Cutting

    Science.gov (United States)

    Schulz, Wolfgang; Nießen, Markus; Eppelt, Urs; Kowalick, Kerstin

    Laser cutting is a thermal separation process widely used in shaping and contour cutting applications. There are, however, gaps in understanding the dynamics of the process, especially issues related to cut quality. This work describes the advances in fundamental physical modelling and process monitoring of laser cutting, as well as time varying processes such as contour cutting. Diagnosis of ripple and dross formation is advanced to observe the melt flow and its separation simultaneously as well as the spatial shape of the cut kerf.

  16. Histological versus stereological methods applied at spermatogonia during normal human development

    DEFF Research Database (Denmark)

    Cortes, D

    1990-01-01

    The number of spermatogonia per tubular transverse section (S/T), and the percentage of seminiferous tubulus containing spermatogonia (the fertility index (FI] were measured in 40 pairs of normal autopsy testes aged 28 weeks of gestation-40 years. S/T and FI showed similar changes during the whol...

  17. Histological versus stereological methods applied at spermatogonia during normal human development

    DEFF Research Database (Denmark)

    Cortes, Dina

    1990-01-01

    The number of spermatogonia per tubular transverse section (S/T), and the percentage of seminiferous tubulus containing spermatogonia (the fertility index (FI] were measured in 40 pairs of normal autopsy testes aged 28 weeks of gestation-40 years. S/T and FI showed similar changes during the whol...

  18. Lung lobe segmentation based on statistical atlas and graph cuts

    Science.gov (United States)

    Nimura, Yukitaka; Kitasaka, Takayuki; Honma, Hirotoshi; Takabatake, Hirotsugu; Mori, Masaki; Natori, Hiroshi; Mori, Kensaku

    2012-03-01

    This paper presents a novel method that can extract lung lobes by utilizing a probability atlas and multilabel graph cuts. Information about pulmonary structures plays a very important role in deciding the treatment strategy and in surgical planning. The human lungs are divided into five anatomical regions, the lung lobes. Precise segmentation and recognition of lung lobes are indispensable tasks in computer-aided diagnosis and computer-aided surgery systems. Many methods for lung lobe segmentation have been proposed. However, these methods only target normal cases and therefore cannot extract the lung lobes in abnormal cases, such as COPD cases. To extract lung lobes in abnormal cases, this paper proposes a lung lobe segmentation method based on a probability atlas of lobe location and multilabel graph cuts. The process consists of three components: normalization based on the patient's physique, probability atlas generation, and segmentation based on graph cuts. We applied this method to six cases of chest CT images including COPD cases. The Jaccard index was 79.1%.

  19. A robust multiple-locus method for quantitative trait locus analysis of non-normally distributed multiple traits.

    Science.gov (United States)

    Li, Z; Möttönen, J; Sillanpää, M J

    2015-12-01

    Linear regression-based quantitative trait loci/association mapping methods such as least squares commonly assume normality of residuals. In genetics studies of plants or animals, some quantitative traits may not follow normal distribution because the data include outlying observations or data that are collected from multiple sources, and in such cases the normal regression methods may lose some statistical power to detect quantitative trait loci. In this work, we propose a robust multiple-locus regression approach for analyzing multiple quantitative traits without normality assumption. In our method, the objective function is least absolute deviation (LAD), which corresponds to the assumption of multivariate Laplace distributed residual errors. This distribution has heavier tails than the normal distribution. In addition, we adopt a group LASSO penalty to produce shrinkage estimation of the marker effects and to describe the genetic correlation among phenotypes. Our LAD-LASSO approach is less sensitive to the outliers and is more appropriate for the analysis of data with skewedly distributed phenotypes. Another application of our robust approach is on missing phenotype problem in multiple-trait analysis, where the missing phenotype items can simply be filled with some extreme values, and be treated as outliers. The efficiency of the LAD-LASSO approach is illustrated on both simulated and real data sets.
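
    The LAD-LASSO criterion can be written down compactly; the snippet below only evaluates the objective for a toy multi-trait marker regression (in practice a proximal-gradient or ADMM solver would minimize this non-smooth function rather than a naive call). Variable names and the toy data are illustrative assumptions.

```python
import numpy as np

def lad_group_lasso_objective(B, X, Y, lam):
    """LAD loss with a group-LASSO penalty for multi-trait QTL regression.

    X : (n, p) marker matrix, Y : (n, q) traits, B : (p, q) effects.
    The loss is the sum of absolute residuals (multivariate-Laplace
    assumption); each marker's row of effects across traits forms one
    group, penalized by its Euclidean norm.
    """
    residuals = Y - X @ B
    lad = np.abs(residuals).sum()
    group_penalty = np.linalg.norm(B, axis=1).sum()   # one group per marker
    return lad + lam * group_penalty

rng = np.random.default_rng(3)
X = rng.integers(0, 3, size=(100, 20)).astype(float)    # toy marker genotypes
B_true = np.zeros((20, 2)); B_true[3] = [1.0, 0.8]      # one pleiotropic QTL
Y = X @ B_true + rng.laplace(scale=0.5, size=(100, 2))  # heavy-tailed residuals
print(lad_group_lasso_objective(B_true, X, Y, lam=2.0))
```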

  20. Conservation methods applied to fresh-cut melon

    Directory of Open Access Journals (Sweden)

    Anaí Peter Batista

    2013-05-01

    Full Text Available The objective of this review is to present some conservation methods that can be used to prolong the shelf life of fresh-cut melon. Among the methods discussed are edible coatings, irradiation, natural antimicrobials, antioxidants, firming agents, modified atmosphere, blanching, ultraviolet light and high pressure. Depending on the method, the changes associated with minimal processing of melon, such as water loss, changes in color and firmness, altered metabolism and growth of micro-organisms, can be reduced; the result is often dependent on the melon cultivar used.

  1. Study of normal and shear material properties for viscoelastic model of asphalt mixture by discrete element method

    DEFF Research Database (Denmark)

    Feng, Huan; Pettinari, Matteo; Stang, Henrik

    2015-01-01

    In this paper, the viscoelastic behavior of asphalt mixture was studied by using discrete element method. The dynamic properties of asphalt mixture were captured by implementing Burger’s contact model. Different ways of taking into account of the normal and shear material properties of asphalt mi...

  2. Effects of heat on cut mark characteristics.

    Science.gov (United States)

    Waltenberger, Lukas; Schutkowski, Holger

    2017-02-01

    Cut marks on bones provide crucial information about the tools used and their mode of application, both in archaeological and forensic contexts. Despite a substantial amount of research on cut mark analysis and on the influence of fire on bones (shrinkage, fracture pattern, recrystallisation), there is still a lack of knowledge in cut mark analysis on burnt remains. This study provides information about heat alteration of cut marks and whether consistent features can be observed that allow direct interpretation of the tools used. In a controlled experiment, cut marks (n=25) were inflicted on pig ribs (n=7) with a kitchen knife and examined using micro-CT and digital microscopy. The methods were compared in terms of their efficacy in recording cut marks on native and heat-treated bones. Statistical analysis demonstrates that the floor angles and the maximum slope height of cuts undergo significant alteration, whereas width, depth, floor radius, slope, and opening angle remain stable. Micro-CT and digital microscopy are both suitable methods for cut mark analysis. However, significant differences in measurements were detected between the two methods, as micro-CT is less accurate due to its lower resolution. Moreover, stabbing led to micro-fissures surrounding the cuts, which might also influence the alteration of cut marks.

  3. The Multiobjective Trajectory Optimization for Hypersonic Glide Vehicle Based on Normal Boundary Intersection Method

    Directory of Open Access Journals (Sweden)

    Zhengnan Li

    2016-01-01

    Full Text Available To solve the multiobjective optimization problem on hypersonic glider vehicle trajectory design subjected to complex constraints, this paper proposes a multiobjective trajectory optimization method that combines the boundary intersection method and pseudospectral method. The multiobjective trajectory optimization problem (MTOP is established based on the analysis of the feature of hypersonic glider vehicle trajectory. The MTOP is translated into a set of general optimization subproblems by using the boundary intersection method and pseudospectral method. The subproblems are solved by nonlinear programming algorithm. In this method, the solution that has been solved is employed as the initial guess for the next subproblem so that the time consumption of the entire multiobjective trajectory optimization problem shortens. The maximal range and minimal peak heat problem is solved by the proposed method. The numerical results demonstrate that the proposed method can obtain the Pareto front of the optimal trajectory, which can provide the reference for the trajectory design of hypersonic glider vehicle.

  4. STUDY ON SUB-DRY CUTTING GCr12

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Through a comparative study of cutting force, cutting temperature and machined surface quality between sub-dry cutting and the traditional cooling method, it is shown that sub-dry cutting can retard the wear of the cutting tool. It is beneficial for realizing pollution-free production and meeting the demands of a clean environment.

  5. NormaCurve: a SuperCurve-based method that simultaneously quantifies and normalizes reverse phase protein array data.

    Directory of Open Access Journals (Sweden)

    Sylvie Troncale

    Full Text Available MOTIVATION: Reverse phase protein array (RPPA) is a powerful dot-blot technology that allows studying protein expression levels as well as post-translational modifications in a large number of samples simultaneously. Yet, correct interpretation of RPPA data has remained a major challenge for its broad-scale application and its translation into clinical research. Satisfying quantification tools are available to assess a relative protein expression level from a serial dilution curve. However, appropriate tools allowing the normalization of the data for external sources of variation are currently missing. RESULTS: Here we propose a new method, called NormaCurve, that allows simultaneous quantification and normalization of RPPA data. For this, we modified the quantification method SuperCurve in order to include normalization for (i) background fluorescence, (ii) variation in the total amount of spotted protein and (iii) spatial bias on the arrays. Using a spike-in design with a purified protein, we test the capacity of different models to properly estimate normalized relative expression levels. The best performing model, NormaCurve, takes into account a negative control array without primary antibody, an array stained with a total protein stain and spatial covariates. We show that this normalization is reproducible and we discuss the number of serial dilutions and the number of replicates that are required to obtain robust data. We thus provide a ready-to-use method for reliable and reproducible normalization of RPPA data, which should facilitate the interpretation and the development of this promising technology. AVAILABILITY: The raw data, the scripts and the normacurve package are available at the following web site: http://microarrays.curie.fr.

  6. A method to calculate zero-signature satellite laser ranging normal points for millimeter geodesy - a case study with Ajisai

    Science.gov (United States)

    Kucharski, Daniel; Kirchner, Georg; Otsubo, Toshimichi; Koidl, Franz

    2015-03-01

    High repetition-rate satellite laser ranging (SLR) offers new possibilities for the post-processing of the range measurements. We analyze 11 years of kHz SLR passes of the geodetic satellite Ajisai delivered by Graz SLR station (Austria) in order to improve the accuracy and precision of the principal SLR data product - normal points. The normal points are calculated by three different methods: 1) the range residuals accepted by the standard 2.5 sigma filter, 2) the range residuals accepted by the leading edge filter and 3) the range residuals given by the single corner cube reflector (CCR) panels of Ajisai.
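
    The first of the three normal-point strategies, an iterative 2.5-sigma filter applied to range residuals before bin averaging, can be sketched in a few lines; the bin length, the convergence rule and the toy residuals below are assumptions, and the trend removal used in real SLR processing is omitted here.

```python
import numpy as np

def normal_points(t, residuals, bin_seconds=30.0, k=2.5, max_iter=10):
    """Form normal points from SLR range residuals by iterative k-sigma clipping per bin."""
    t = np.asarray(t, float)
    r = np.asarray(residuals, float)
    edges = np.arange(t.min(), t.max() + bin_seconds, bin_seconds)
    points = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sel = r[(t >= lo) & (t < hi)]
        if sel.size < 3:
            continue
        for _ in range(max_iter):                       # iterative 2.5-sigma filter
            keep = np.abs(sel - sel.mean()) <= k * sel.std()
            if keep.all() or keep.sum() < 3:
                break
            sel = sel[keep]
        points.append((0.5 * (lo + hi), sel.mean(), sel.size))
    return points

rng = np.random.default_rng(4)
t = np.sort(rng.uniform(0.0, 120.0, 2000))              # two minutes of kHz returns
outlier = rng.random(2000) < 0.05                       # 5% noisy returns
res = rng.normal(0.0, 0.010, 2000) + outlier * rng.normal(0.0, 0.200, 2000)  # metres
for tc, mean_res, n in normal_points(t, res):
    print(f"t = {tc:5.1f} s   normal point = {mean_res * 1e3:+6.2f} mm   (n = {n})")
```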

  7. The CCTL (Cpf1-assisted Cutting and Taq DNA ligase-assisted Ligation) method for efficient editing of large DNA constructs in vitro.

    Science.gov (United States)

    Lei, Chao; Li, Shi-Yuan; Liu, Jia-Kun; Zheng, Xuan; Zhao, Guo-Ping; Wang, Jin

    2017-01-23

    As Cpf1 cleaves double-stranded DNA in a staggered way, it can be used in DNA assembly. However, the Cpf1 cleavage was found to be inaccurate, which may cause errors in DNA assembly. Here, the Cpf1 cleavage sites were precisely characterized, where the cleavage site on the target strand was around the 22nd base relative to the protospacer adjacent motif site, but the cleavage on the non-target strand was affected by the spacer length. When the spacer length was 20 nt or longer, Cpf1 mainly cleaved around the 14th and the 18th bases on the non-target strand; otherwise, with a shorter spacer (i.e. 17-19 nt), Cpf1 mainly cleaved after the 14th base, generating 8-nt sticky ends. With this finding, Cpf1 with a 17-nt spacer crRNA was employed for in vitro substitution of the actII-orf4 promoter in the actinorhodin biosynthetic cluster with a constitutively expressing promoter. The engineered cluster yielded more actinorhodin and produced actinorhodin from an earlier phase. Moreover, Taq DNA ligase was further employed to increase both the ligation efficiency and the ligation accuracy of the method. We expect this CCTL (Cpf1-assisted Cutting and Taq DNA ligase-mediated Ligation) method can be widely used in in vitro editing of large DNA constructs.

  8. Distance Determination Method for Normally Distributed Obstacle Avoidance of Mobile Robots in Stochastic Environments

    Directory of Open Access Journals (Sweden)

    Jinhong Noh

    2016-04-01

    Full Text Available Obstacle avoidance methods require knowledge of the distance between a mobile robot and obstacles in the environment. However, in stochastic environments, distance determination is difficult because objects have position uncertainty. The purpose of this paper is to determine the distance between a robot and obstacles represented by probability distributions. Distance determination for obstacle avoidance should consider position uncertainty, computational cost and collision probability. The proposed method considers all of these conditions, unlike conventional methods. It determines the obstacle region using the collision probability density threshold. Furthermore, it defines a minimum distance function to the boundary of the obstacle region with a Lagrange multiplier method. Finally, it computes the distance numerically. Simulations were executed in order to compare the performance of the distance determination methods. Our method demonstrated a faster and more accurate performance than conventional methods. It may help overcome position uncertainty issues pertaining to obstacle avoidance, such as low accuracy sensors, environments with poor visibility or unpredictable obstacle motion.
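
    The construction described above (threshold the collision probability density to define the obstacle region, then find the closest point on its boundary, which is where the Lagrange condition enters) can be sketched for a 2-D Gaussian obstacle; the threshold, covariance and robot position below are illustrative, and the SLSQP solver handles the constrained minimization rather than an explicit Lagrange-multiplier iteration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal

# obstacle position uncertainty: 2-D Gaussian; region = {x : pdf(x) >= threshold}
mean = np.array([3.0, 2.0])
cov = np.array([[0.4, 0.1], [0.1, 0.3]])
threshold = 0.2
robot = np.array([0.0, 0.0])

pdf = multivariate_normal(mean, cov).pdf

# minimize the distance to the robot subject to lying on the region boundary;
# SLSQP enforces the equality constraint (the Lagrange-multiplier condition)
res = minimize(
    lambda x: np.linalg.norm(x - robot),
    x0=mean,                                   # start inside the region
    method="SLSQP",
    constraints=[{"type": "eq", "fun": lambda x: pdf(x) - threshold}],
)
print("closest boundary point:", res.x, " distance:", res.fun)
```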

  10. Correction of vibrational broadening in molecular dynamics clusters with the normal mode optimization method.

    Science.gov (United States)

    Hudecová, Jana; Hopmann, Kathrin H; Bouř, Petr

    2012-01-12

    Vibrational properties of solutions are frequently simulated with clusters of a solute and a few solvent molecules obtained during molecular dynamics (MD) simulations. The raw cluster geometries, however, often provide unrealistic vibrational band broadening, for both ab initio and empirical force fields. In this work, partial optimization in normal-mode coordinates is used on an empirical basis to reduce the broadening. The origin of the error is discussed on a simplified two-dimensional system, which indicates that the problem is caused by the anharmonic MD potential, mode coupling, and the neglect of quantum effects. Then the procedure of partial geometry optimization is applied to the Raman and Raman optical activity (ROA) spectra of the solvated lactamide molecule and analyzed. Comparison to experiment demonstrates that the normal-mode partial optimization technique with a suitable frequency limit can significantly reduce the broadening error. For lactamide, experimental and simulated vibrational bandwidths are compared; the most realistic theoretical spectra are obtained for partially optimized clusters with a vibrational wavenumber cutoff of about 200 cm(-1).

  11. Cutting system arrangement method of hard rock boring machine

    Institute of Scientific and Technical Information of China (English)

    毛君; 谢春雪; 梁晗; 黄华

    2013-01-01

    Hard rock roadheaders are key equipment in roadway construction. In view of the severe wear of traditional point-attack picks when breaking rock, this paper puts forward an impact-and-rolling breaking method in which an impact mechanism and a rolling mechanism work together to break rock. The study analyzes the factors affecting the life, reliability and driving efficiency of the cutters of a hard rock tunnel boring machine and uses a multidisciplinary optimization method to construct the arrangement of the rock-breaking mechanisms. With global coordinated optimization as the leading principle, the overall parameters are coordinated, and the arrangement of the breaking mechanisms is matched with respect to impact parameters, cutter forces, cutting parameters and surrounding rock properties. The results show that the proposed arrangement method for the breaking mechanisms is feasible; it reduces the energy consumption of the machine and the wear of the cutters, improves driving efficiency, and brings the overall machine performance to an optimum.

  12. Cutting as a continuous business process

    Directory of Open Access Journals (Sweden)

    Miro Gradišar

    2009-11-01

    Full Text Available A review of state-of-the-art methods for cutting stock problem optimisation shows that the current methods lead to near-optimum results for the instantaneous optimisation of trim loss. Further optimisation of this activity would not bring a considerable improvement. Therefore, the paper treats cutting stock as a continuous business process and not as an isolated activity. An exact method for a general one-dimensional cutting stock problem is presented and tested. The method is mainly suitable for smaller orders. It is then applied to continuous cutting and used to develop a method for assessing cutting costs in consecutive time periods. The modified method finds a good solution over the whole time-span, rather than just local optima.
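
    For orientation, the one-dimensional trim-loss objective itself can be illustrated with a simple first-fit-decreasing heuristic; this baseline sketch is not the exact method of the paper, nor its cost assessment over consecutive periods, and the order data are invented.

```python
def first_fit_decreasing(order_lengths, stock_length):
    """Assign ordered pieces to stock bars, returning the cutting plan and trim loss."""
    bars = []                                    # each bar: list of piece lengths
    for piece in sorted(order_lengths, reverse=True):
        for bar in bars:
            if sum(bar) + piece <= stock_length:
                bar.append(piece)
                break
        else:
            bars.append([piece])                 # open a new stock bar
    trim_loss = sum(stock_length - sum(bar) for bar in bars)
    return bars, trim_loss

order = [210, 150, 150, 120, 90, 90, 60, 45]     # ordered piece lengths (cm)
plan, loss = first_fit_decreasing(order, stock_length=300)
for i, bar in enumerate(plan, 1):
    print(f"bar {i}: {bar}  leftover = {300 - sum(bar)}")
print("total trim loss:", loss)
```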

  13. The Multiobjective Trajectory Optimization for Hypersonic Glide Vehicle Based on Normal Boundary Intersection Method

    OpenAIRE

    Zhengnan Li; Tao Yang; Zhiwei Feng

    2016-01-01

    To solve the multiobjective optimization problem on hypersonic glider vehicle trajectory design subjected to complex constraints, this paper proposes a multiobjective trajectory optimization method that combines the boundary intersection method and pseudospectral method. The multiobjective trajectory optimization problem (MTOP) is established based on the analysis of the feature of hypersonic glider vehicle trajectory. The MTOP is translated into a set of general optimization subproblems by u...

  14. Device for cutting protrusions

    Science.gov (United States)

    Bzorgi, Fariborz M.

    2011-07-05

    An apparatus for clipping a protrusion of material is provided. The protrusion may, for example, be a bolt head, a nut, a rivet, a weld bead, or a temporary assembly alignment tab protruding from a substrate surface of assembled components. The apparatus typically includes a cleaver having a cleaving edge and a cutting blade having a cutting edge. Generally, a mounting structure configured to confine the cleaver and the cutting blade and permit a range of relative movement between the cleaving edge and the cutting edge is provided. Also typically included is a power device coupled to the cutting blade. The power device is configured to move the cutting edge toward the cleaving edge. In some embodiments the power device is activated by a momentary switch. A retraction device is also generally provided, where the retraction device is configured to move the cutting edge away from the cleaving edge.

  15. The normalization of surface anisotropy effects present in SEVIRI reflectances by using the MODIS BRDF method

    DEFF Research Database (Denmark)

    Proud, Simon Richard; Zhang, Qingling; Schaaf, Crystal

    2014-01-01

    A modified version of the MODerate resolution Imaging Spectroradiometer (MODIS) bidirectional reflectance distribution function (BRDF) algorithm is presented for use in the angular normalization of surface reflectance data gathered by the Spinning Enhanced Visible and InfraRed Imager (SEVIRI) ... acquisition period than the comparable MODIS products while, at the same time, removing many of the angular perturbations present within the original MSG data. The NBAR data are validated against reflectance data from the MODIS instrument and in situ data gathered at a field location in Africa throughout 2008. ... It is found that the MSG retrievals are stable and are of high quality across much of the SEVIRI disk while maintaining a higher temporal resolution than the MODIS BRDF products. However, a number of circumstances are discovered whereby the BRDF model is unable to function correctly with the SEVIRI

  16. A Frequency Domain Method for the Generation of Partially Coherent Normal Stationary Time Domain Signals

    Directory of Open Access Journals (Sweden)

    David O. Smallwood

    1993-01-01

    The method is based on a spectral density matrix that relates pairs of elements of the vector random process {X(t)}, −∞ < t < ∞, and is used to generate partially coherent normal stationary sampled time histories, {X(t)}, of arbitrary length.
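
    A common frequency-domain construction of this kind factors the cross-spectral density at each frequency, colours independent complex Gaussian spectra with that factor, and inverse-FFTs back to the time domain; the snippet below is a minimal illustration with a hand-built spectral model and should not be read as the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(5)
n, fs, n_freq = 3, 1024.0, 2048                  # 3 correlated channels
freqs = np.fft.rfftfreq(2 * n_freq - 2, d=1.0 / fs)

# assumed spectral model: a common band-limited PSD and a constant
# inter-channel correlation of 0.6 at every frequency
psd = np.where((freqs > 20) & (freqs < 200), 1.0, 1e-12)
rho = 0.6
csd = np.array([[1.0, rho, rho], [rho, 1.0, rho], [rho, rho, 1.0]])

X = np.zeros((n, freqs.size), dtype=complex)
L = np.linalg.cholesky(csd)                      # factor the inter-channel correlation
for k, s in enumerate(psd):
    w = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
    X[:, k] = np.sqrt(s) * (L @ w)               # coloured, partially coherent spectrum

x = np.fft.irfft(X, axis=1)                      # normal stationary time histories
print(x.shape, np.corrcoef(x)[0, 1])             # channels are partially correlated
```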

  17. A new derivative with normal distribution kernel: Theory, methods and applications

    Science.gov (United States)

    Atangana, Abdon; Gómez-Aguilar, J. F.

    2017-06-01

    A new approach to the fractional derivative with a new local kernel is suggested in this paper. The kernel introduced in this work is the well-known normal distribution, a very common continuous probability distribution. This distribution is very important in statistics and is also widely used in the natural and social sciences to portray real-valued random variables whose distributions are not known. Two definitions are suggested, namely Atangana-Gómez averaging in the Liouville-Caputo and Riemann-Liouville senses. We present some relationships with existing integral transform operators. Numerical approximations of first and second order are derived in detail. Some applications of the new mathematical tools to describe real-world problems are presented in detail. This opens a new door in the fields of statistics and the natural and social sciences.

  18. Exploring Normalization and Network Reconstruction Methods using In Silico and In Vivo Models

    Science.gov (United States)

    Abstract: Lessons learned from the recent DREAM competitions include: The search for the best network reconstruction method continues, and we need more complete datasets with ground truth from more complex organisms. It has become obvious that the network reconstruction methods t...

  19. Measurement of plasma histamine: description of an improved method and normal values

    Energy Technology Data Exchange (ETDEWEB)

    Dyer, J.; Warren, K.; Merlin, S.; Metcalfe, D.D.; Kaliner, M.

    1982-08-01

    The single isotopic-enzymatic assay of histamine was modified to increase its sensitivity and to facilitate measurement of plasma histamine levels. The modification involved extracting /sup 3/H-1-methylhistamine (generated by the enzyme N-methyltransferase acting on histamine in the presence of S-(methyl-/sup 3/H)-adenosyl-L-methionine) into chloroform and isolating the /sup 3/H-1-methylhistamine by thin-layer chromatography (TLC). The TLC was developed in acetone:ammonium hydroxide (95:10), and the methylhistamine spot (Rf = 0.50) was identified with an o-phthalaldehyde spray, scraped from the plate, and assayed in a scintillation counter. The assay in plasma demonstrated a linear relationship from 200 to 5000 pg histamine/ml. Plasma always had higher readings than buffer, and dialysis of plasma returned these values to the same level as buffer, suggesting that the baseline elevations might be attributable to histamine. However, all histamine standard curves were run in dialyzed plasma to negate any additional influences plasma might exert on the assay. The arithmetic mean (+/- SEM) of normal plasma histamine was 318.4 +/- 25 pg/ml (n = 51), and the geometric mean was 280 +/- 35 pg/ml. Plasma histamine was significantly elevated by infusion of histamine at 0.05 to 1.0 micrograms/kg/min or by cold immersion of the hand of a cold-urticaria patient. Therefore this modified isotopic-enzymatic assay of histamine is extremely sensitive, capable of measuring fluctuations in plasma histamine levels within the normal range, and potentially useful in analysis of the role histamine plays in human physiology.

  20. Thermophysical problems of laser cutting of metals

    Directory of Open Access Journals (Sweden)

    Orishich Anatoliy

    2017-01-01

    Full Text Available The variety and complex interaction of physical processes during laser cutting is a critical characteristic of the laser cutting of metals. Small spatial and temporal scales significantly complicate the experimental investigation of the multi-phase fluid flow under the conditions of laser cutting of metals. Under these conditions, the surface formed during the cutting is an indicator of the character of the melt flow. The quantitative parameter reflecting the peculiarities of the multi-phase fluid flow is normally the roughness of the formed surface, and minimal roughness is the criterion of a qualitative flow [1 – 2]. The purpose of this work is to perform an experimental comparative investigation of the thermophysical pattern of the multi-phase melt flow under the conditions of laser cutting of metals with laser wavelengths of 10.6 μm and 1.07 μm.

  1. A cactus theorem for end cuts

    CERN Document Server

    Evangelidou, Anastasia

    2011-01-01

    Dinits-Karzanov-Lomonosov showed that it is possible to encode all minimal edge cuts of a graph by a tree-like structure called a cactus. We show here that minimal edge cuts separating ends of the graph rather than vertices can be `encoded' also by a cactus. We apply our methods to finite graphs as well and we show that several types of cuts can be encoded by cacti.

  2. A Finite Element Method for Computation of Structural Intensity by the Normal Mode Approach

    Science.gov (United States)

    Gavrić, L.; Pavić, G.

    1993-06-01

    A method for numerical computation of structural intensity in thin-walled structures is presented. The method is based on structural finite elements (beam, plate and shell type) enabling computation of real eigenvalues and eigenvectors of the undamped structure which then serve in evaluation of complex response. The distributed structural damping is taken into account by using the modal damping concept, while any localized damping is treated as an external loading, determined by use of impedance matching conditions and eigenproperties of the structure. Emphasis is given to aspects of accuracy of the results and efficiency of the numerical procedures used. High requirements on accuracy of the structural response (displacements and stresses) needed in intensity applications are satisfied by employing the "swept static solution", which effectively takes into account the influence of higher modes otherwise inaccessible to numerical computation. A comparison is made between the results obtained by using analytical methods and the proposed numerical procedure to demonstrate the validity of the method presented.
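
    The modal step described above (real modes of the undamped structure combined with modal damping into a complex harmonic response) can be condensed into a small example; the 3-DOF chain, damping ratio and load below are illustrative, and the structural-intensity post-processing itself (as well as the swept static solution) is omitted.

```python
import numpy as np
from scipy.linalg import eigh

# 3-DOF spring-mass chain standing in for a discretized structure
k, m = 1.0e4, 1.0
K = k * np.array([[2, -1, 0], [-1, 2, -1], [0, -1, 1]], float)
M = m * np.eye(3)

eigvals, phi = eigh(K, M)                 # real modes of the undamped structure
omega_n = np.sqrt(eigvals)                # natural angular frequencies
zeta = 0.02                               # assumed modal damping ratio for every mode

def complex_response(F, omega):
    """Steady-state displacement amplitudes U for a harmonic load F e^{i omega t}."""
    q = np.zeros(3, dtype=complex)
    for j in range(3):
        f_j = phi[:, j] @ F               # modal force (mass-normalized modes)
        q[j] = f_j / (omega_n[j] ** 2 - omega ** 2 + 2j * zeta * omega_n[j] * omega)
    return phi @ q                        # back to physical coordinates

F = np.array([0.0, 0.0, 1.0])             # unit force on the free end
U = complex_response(F, omega=0.8 * omega_n[0])
V = 1j * 0.8 * omega_n[0] * U             # complex velocity, as used for intensity
print(np.abs(U), np.angle(U))
```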

  3. A method of production of boneless chicken wings (drumettes and winglets) by separation of periosteum from bone without cutting skin and muscles.

    Science.gov (United States)

    Nakano, T; Ozimek, L

    2015-11-01

    The deboning of broiler chicken wings, including drumettes and winglets, is not common in the poultry processing industry. However, consumers who like convenient foods may be interested in boneless products. Samples of broiler wings were deboned by articular cartilage dislocation and periosteum stripping without cutting skin and muscles to obtain boneless drumettes and winglets, with each having inner space formed by bone removal. The average weight of bone-in winglets (30.7 g) was less (P winglets (80.1). There was a smaller number of muscles in the drumettes than in the winglets, but major muscles in the drumettes were larger than any muscles in the winglets. The average weight of muscle was greater (P winglets, and thus the muscle/skin ratio was approximately twice as high (P < 0.05) in the drumettes. The size and shape were different between the bone-in and boneless products, as expected. When a cooked product was examined, no appreciable inner space (resulting from bone removal) was seen on its transverse section. The advantages of boneless wing products over bone-in wing products were discussed. It was concluded that the method described in the present study is useful for the production of high-quality boneless wing products.

  4. Hydro-jet cutting: a method for selective surgical dissection of nerve tissue. An experimental study on the sciatic nerve of rats.

    Science.gov (United States)

    Kaduk, W M; Stengel, B; Pöhl, A; Nizze, H; Gundlach, K K

    1999-10-01

    The aim of this study was to answer the question: is it possible to save motor nerves when dissecting tissue with the hydro-jet dissector? In order to study the influence of the hydro-jet on motor nerves, the function of the sciatic nerves of 10 Wistar rats was evaluated. The sciatic nerves were dissected bilaterally and only the left one was exposed to the hydro-jet. The water-jet emerged from a nozzle with a diameter of 0.1 mm and was applied to the nerve for 2, 5 or 10 s and with jet pressures of 80, 85 and 90 bar, respectively. After the operation the animals were observed for 5 months in order to monitor the degree of limping using a scale with 10 clinical grades of function. Five months postoperatively the animals were sacrificed and the sciatic nerves were studied by light and electron microscopy. It was found that a hydro-jet pressure of 80 bar and an exposure time of 2 s had already led to irreversible damage to the sciatic nerve. Therefore further studies with lower pressures or shorter exposure times are required before considering hydro-jet cutting for parotid gland surgery. It must be confirmed as harmless to motor nerves before applying this method in humans.

  5. Methods and Systems for Measurement and Estimation of Normalized Contrast in Infrared Thermography

    Science.gov (United States)

    Koshti, Ajay M. (Inventor)

    2015-01-01

    Methods and systems for converting an image contrast evolution of an object to a temperature contrast evolution and vice versa are disclosed, including methods for assessing an emissivity of the object; calculating an afterglow heat flux evolution; calculating a measurement region of interest temperature change; calculating a reference region of interest temperature change; calculating a reflection temperature change; calculating the image contrast evolution or the temperature contrast evolution; and converting the image contrast evolution to the temperature contrast evolution or vice versa, respectively.
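
    The patent abstract does not give the underlying formulas, so the sketch below is only a generic illustration of how a normalized image-contrast evolution is commonly formed from a measurement and a reference region-of-interest signal; the function names, the simple (I_meas − I_ref)/I_ref definition, and the cooling curves are assumptions, not the patented procedure.

    import numpy as np

    def contrast_evolution(I_meas, I_ref):
        """Generic normalized image-contrast evolution: difference between a
        measurement ROI signal and a reference ROI signal, scaled by the
        reference. Inputs are 1-D arrays sampled over time (one value per frame)."""
        I_meas = np.asarray(I_meas, dtype=float)
        I_ref = np.asarray(I_ref, dtype=float)
        return (I_meas - I_ref) / I_ref

    def temperature_contrast(dT_meas, dT_ref):
        """Analogous contrast built from ROI temperature changes instead of
        image intensities."""
        return (np.asarray(dT_meas) - np.asarray(dT_ref)) / np.asarray(dT_ref)

    # hypothetical cooling curves after a flash heating pulse
    t = np.linspace(0.1, 10.0, 100)
    I_ref = 1000.0 / np.sqrt(t)            # reference (sound) region
    I_meas = 1.1 * 1000.0 / np.sqrt(t)     # measurement region above a defect
    print(contrast_evolution(I_meas, I_ref)[:5])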

  6. Vestibular evoked myogenic potential eliciting in normal subjects: comparison of four different methods.

    Science.gov (United States)

    Eleftheriadou, Anna; Deftereos, Spyros N; Zarikas, Vasilios; Panagopoulos, Grigoris; Korres, Stavros; Sfetsos, Sotirios; Karageorgiou, Klimentini L; Ferekidou, Elisa; Kandiloros, Dimitrios

    2008-10-01

    Vestibular evoked myogenic potential (VEMP) recording is a new method for testing the otolith receptors and vestibulospinal pathways. The aim of this study was to evaluate the characteristics of VEMPs obtained with four different techniques, to find reasons to prefer one type of recording over the others. Twenty healthy persons, 10 males and 10 females with ages ranging from 20 to 57 years (mean age 41 years), were enrolled in this study. Eliciting of VEMPs using monaural or binaural acoustic stimulation and unilateral or bilateral SCM contraction was evaluated; 105 dB NHL acoustic stimulation consisting of 145 dB rarefaction clicks was applied. Latencies of the p13, n23, n34 and p44 peaks; amplitudes p13-n23 and n34-p44; and interaural amplitude differences (IADs) were assessed. All four methods elicited consistent and clearly identifiable waveforms. The reliability coefficients of the amplitudes were high for all four methods and for both waves; however, the highest reliability scores were obtained with the monaural-ipsilateral recording. The results indicated no statistically significant difference between the right and left sides for any of the four types of VEMP eliciting. No correlation was found between IAD13-23 and IAD34-44 for any of the four methods. Statistically significant differences were found only for the n23 latency among the four methods. Although no evidence to reject or strongly favour a specific method was found, the monaural-ipsilateral recording was associated with some advantages.

  7. A comparison of chemical and electrophoretic methods of serum protein determinations in clinically normal domestic animals of various ages.

    Science.gov (United States)

    Green, S A; Jenkins, S J; Clark, P A

    1982-10-01

    The biuret total protein method and a bromcresol green (BCG) albumin method were used on the Abbott ABA-100 chemistry analyzer to assay serum proteins in clinically normal cattle, sheep, ponies, pigs, and ducks. Total proteins were also read on a refractometer and mylar supported cellulose acetate electrophoresis was performed. Globulins and A/G ratios were calculated from the chemical method and the results compared with the electrophoretic method. Total protein, albumin and A/G ratios in the ponies, sheep and older cattle were in agreement between the two methods. The younger cattle and all the pigs had higher albumin levels and A/G ratios with the chemical BCG method. Ducks had slightly higher albumin values and A/G ratios with the electrophoretic method and the presence of pre-albumin was detected. Typical mylar supported cellulose acetate electrophoretic patterns are presented which show the excellent separation using these membranes. Means and range for normal animals are given and changes of proteins with age are discussed.

  8. The Normalization of Surface Anisotropy Effects Present in SEVIRI Reflectances by Using the MODIS BRDF Method

    Science.gov (United States)

    Proud, Simon Richard; Zhang, Qingling; Schaaf, Crystal; Fensholt, Rasmus; Rasmussen, Mads Olander; Shisanya, Chris; Mutero, Wycliffe; Mbow, Cheikh; Anyamba, Assaf; Pak, Ed; Sandholt, Inge

    2014-01-01

    A modified version of the MODerate resolution Imaging Spectroradiometer (MODIS) bidirectional reflectance distribution function (BRDF) algorithm is presented for use in the angular normalization of surface reflectance data gathered by the Spinning Enhanced Visible and InfraRed Imager (SEVIRI) aboard the geostationary Meteosat Second Generation (MSG) satellites. We present early and provisional daily nadir BRDF-adjusted reflectance (NBAR) data in the visible and near-infrared MSG channels. These utilize the high temporal resolution of MSG to produce BRDF retrievals with a greatly reduced acquisition period compared with the corresponding MODIS products while, at the same time, removing many of the angular perturbations present within the original MSG data. The NBAR data are validated against reflectance data from the MODIS instrument and in situ data gathered at a field location in Africa throughout 2008. It is found that the MSG retrievals are stable and of high quality across much of the SEVIRI disk while maintaining a higher temporal resolution than the MODIS BRDF products. However, a number of circumstances are discovered whereby the BRDF model is unable to function correctly with the SEVIRI observations, primarily because of an insufficient spread of angular data due to the fixed sensor location or localized cloud contamination.

  9. Attenuated total reflectance Fourier transform infrared spectroscopy method to differentiate between normal and cancerous breast cells.

    Science.gov (United States)

    Lane, Randy; See, Seong S

    2012-09-01

    Attenuated total reflectance Fourier transform infrared spectroscopy (ATR-FTIR) is used to find the structural differences between cancerous breast cells (MCF-7 line) and normal breast cells (MCF-12F line). Gold nanoparticles were prepared, and their hydrodynamic diameter was found to be 38.45 nm. The gold nanoparticles were exposed to both MCF-7 and MCF-12F cells at increasing concentrations. Spectroscopic studies found that the nanoparticles were within the cells, and increasing the nanoparticle concentration inside the cells also resulted in sharper IR peaks as a result of localized surface plasmon resonance. Asymmetric and symmetric stretching and bending vibrations of phosphate, COO-, and CH2 groups were found to give negative shifts in wavenumbers and a decrease in peak intensities when going from noncancerous to cancerous cells. Cellular proteins produced peaks at 1542 and 1644 cm(-1), which were attributed to the amide I and amide II bands of the polypeptide bonds of proteins. Significant changes were found in the peak intensities between the cell lines in the spectral range 2854-2956 cm(-1). The concentration range of gold nanoparticles used in this research caused no significant changes in cell viability in either cell line. Therefore, we believe ATR-FTIR and gold nanotechnology can be at the forefront of cancer diagnosis for some time to come.

  10. Normalization method for asphalt mixture fatigue equation under different loading frequencies

    Institute of Scientific and Technical Information of China (English)

    吕松涛; 郑健龙

    2015-01-01

    In order to analyze the effect of different loading frequencies on the fatigue performance of asphalt mixture, the variation of asphalt mixture strength with loading speed was revealed by strength tests at different loading speeds. Fatigue equations of asphalt mixtures based on the nominal stress ratio and on the real stress ratio were established from fatigue tests at different loading frequencies. It was found that the strength of the asphalt mixture is strongly affected by the loading speed, and that the fatigue equation based on the nominal stress ratio changes with the fatigue loading speed and is therefore not unique. The fatigue equation based on the real stress ratio, by contrast, does not change with the loading frequency and is unique. The results indicate that the fatigue equation based on the real stress ratio can realize the normalization of the asphalt mixture fatigue equation under different loading frequencies, which greatly benefits the analysis of the fatigue characteristics of asphalt pavement under different vehicle speeds. A small numerical illustration of the two stress ratios is given below.
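
    The distinction between the nominal and the real stress ratio can be made concrete with a small Python sketch. The strength law, the applied stress and all coefficients are assumptions chosen only for illustration, not values from the paper; the point is that the nominal ratio is referenced to a single static strength, whereas the real ratio is referenced to the strength measured at the actual loading speed.

    import numpy as np

    def strength(v, S_static=1.2, k=0.15):
        """Assumed growth of mixture strength (MPa) with loading speed v."""
        return S_static * (1.0 + k * np.log10(v))

    sigma = 0.6                          # applied stress amplitude, MPa (assumed)
    for v in (1.0, 10.0, 100.0):         # loading speeds / frequencies
        nominal_ratio = sigma / strength(1.0)   # stress / static strength
        real_ratio = sigma / strength(v)        # stress / strength at this speed
        print(f"v = {v:6.1f}  nominal ratio = {nominal_ratio:.3f}  real ratio = {real_ratio:.3f}")

    Fitting a single fatigue law expressed in the real stress ratio to data pooled over loading speeds is the normalization described in the abstract; a fit against the nominal ratio would need speed-dependent coefficients.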

  11. Quantification of intraoral pressures during nutritive sucking: methods with normal infants.

    Science.gov (United States)

    Lang, William Christopher; Buist, Neil R M; Geary, Annmarie; Buckley, Scott; Adams, Elizabeth; Jones, Albyn C; Gorsek, Stephen; Winter, Susan C; Tran, Hanh; Rogers, Brian R

    2011-09-01

    We report quantitative measurements of ten parameters of nutritive sucking behavior in 91 normal full-term infants obtained using a novel device (an Orometer) and a data collection/analytical system (Suck Editor). The sucking parameters assessed include the number of sucks, mean pressure amplitude of sucks, mean frequency of sucks per second, mean suck interval in seconds, sucking amplitude variability, suck interval variability, number of suck bursts, mean number of sucks per suck burst, mean suck burst duration, and mean interburst gap duration. For analyses, test sessions were divided into 4 × 2-min segments. In single-study tests, 36 of 60 possible comparisons of ten parameters over six pairs of 2-min time intervals showed a p value of 0.05 or less. In 15 paired tests in the same infants at different ages, 33 of 50 possible comparisons of ten parameters over five time intervals showed p values of 0.05 or less. Quantification of nutritive sucking is feasible, showing statistically valid results for ten parameters that change during a feed and with age. These findings suggest that further research, based on our approach, may show clinical value in feeding assessment, diagnosis, and clinical management.

  12. Self-normalizing method to measure the detective quantum efficiency of a wide range of x-ray detectors.

    Science.gov (United States)

    Stierstorfer, K; Spahn, M

    1999-07-01

    The detective quantum efficiency (DQE) is widely accepted as the most relevant parameter to characterize the image quality of medical x-ray systems. In this article we describe a solid method to measure the DQE. The strength of the method lies in the fact that it is self-normalizing so measurements at very low spatial frequencies are not needed. Furthermore, it works on any system with a response function which is linear in the small-signal approximation. We decompose the DQE into several easily accessible quantities and discuss in detail how they can be measured. At the end we lead the interested reader through an example. Noise equivalent quanta and normalized contrast values are tabulated for standard radiation qualities.

  13. Preparing Fe5C2 Intermetallic Compound by Mechanical Alloying Method at Room Temperature and Normal Pressure

    Institute of Scientific and Technical Information of China (English)

    何正明; 钟敏建; 沈伟星; 张正明

    2003-01-01

    Single-phase Fe5C2 intermetallic compound was prepared by the mechanical alloying method. The phase and crystal structure of the sample were analyzed by X-ray diffraction. The decomposition temperature of the Fe5C2 compound, determined from the DSC curve, is 596.4 °C. It is further shown that the nanometer size of the crystal grains is an important condition for carrying out the solid-state reaction at room temperature and normal pressure.

  14. Adjusted normalized emissivity method for surface temperature and emissivity retrieval from optical and thermal infrared remote sensing data

    OpenAIRE

    Coll Company, César; Valor i Micó, Enric; Caselles Miralles, Vicente; Niclòs Corts, Raquel

    2003-01-01

    A methodology for the retrieval of surface temperatures and emissivities combining visible, near infrared and thermal infrared remote sensing data was applied to Digital Airborne Imaging Spectrometer (DAIS) data and validated with coincident ground measurements acquired in a multiyear experiment held in an agricultural site in Barrax, Spain. The Adjusted Normalized Emissivity Method (ANEM) is based on the use of visible and near infrared data to estimate the vegetation cover and model the max...

  15. First-order systems of linear partial differential equations: normal forms, canonical systems, transform methods

    Directory of Open Access Journals (Sweden)

    Heinz Toparkus

    2014-04-01

    Full Text Available In this paper we consider first-order systems with constant coefficients for two real-valued functions of two real variables. This is both a problem in its own right and an alternative view of the classical linear partial differential equations of second order with constant coefficients. The classification of the systems is done using elementary methods of linear algebra. Each type has its special canonical form in the associated characteristic coordinate system. Initial value problems can then be formulated in appropriate basic domains, and solutions of these problems can be sought by means of transform methods.
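
    As a small illustration of the linear-algebra classification mentioned above, the sketch below classifies a constant-coefficient system A u_x + B u_y = f by the roots of det(λA − B) = 0 (real distinct roots: hyperbolic, complex pair: elliptic, repeated root: parabolic). The criterion and the example matrices are standard textbook material and an assumption about how the paper proceeds, not taken from it.

    import numpy as np

    def classify(A, B, tol=1e-9):
        """Classify the 2x2 constant-coefficient system A u_x + B u_y = f by the
        roots of det(lambda*A - B) = 0, i.e. the eigenvalues of A^{-1} B when A
        is invertible."""
        lam = np.linalg.eigvals(np.linalg.solve(A, B))
        if np.max(np.abs(lam.imag)) > tol:
            return "elliptic"
        if abs(lam[0].real - lam[1].real) < tol:
            return "parabolic"
        return "hyperbolic"

    I = np.eye(2)
    print(classify(I, np.array([[0.0, -1.0], [1.0, 0.0]])))  # Cauchy-Riemann system -> elliptic
    print(classify(I, np.array([[0.0, 1.0], [1.0, 0.0]])))   # wave-type system      -> hyperbolic
    print(classify(I, np.array([[0.0, 1.0], [0.0, 0.0]])))   # repeated eigenvalue   -> parabolic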

  16. The analysis method of deformation for laser cutting sheet metal part%激光切割钣金件时的工件变形分析方法

    Institute of Scientific and Technical Information of China (English)

    郭建; 兰天亮; 陈康

    2011-01-01

    A three-dimensional solid model suited to the analysis of laser cutting of sheet metal parts was established with the finite element analysis software ANSYS. A method is introduced for determining the size of the workpiece deformation when the part is cut by laser under the combined action of its own gravity and the laser cutting heat source. The laser cutting heat source is simplified as a Gaussian heat source, and the deformation of an actual workpiece during laser cutting is analyzed and calculated by means of an APDL program.

  17. A fast simulation method for the Log-normal sum distribution using a hazard rate twisting technique

    KAUST Repository

    Rached, Nadhir B.

    2015-06-08

    Computing the probability density function of the sum of Log-normally distributed random variables (RVs) is a well-known challenging problem. For instance, an analytical closed-form expression of the Log-normal sum distribution does not exist and is still an open problem. A crude Monte Carlo (MC) simulation is of course an alternative approach. However, this technique is computationally expensive, especially when dealing with rare events (i.e. events with very small probabilities). Importance Sampling (IS) is a method that improves the computational efficiency of MC simulations. In this paper, we develop an efficient IS method for the estimation of the Complementary Cumulative Distribution Function (CCDF) of the sum of independent and not identically distributed Log-normal RVs. This technique is based on constructing a sampling distribution via twisting the hazard rate of the original probability measure. Our main result is that the estimation of the CCDF is asymptotically optimal using the proposed IS hazard rate twisting technique. We also offer some selected simulation results illustrating the considerable computational gain of the IS method compared to the naive MC simulation approach.
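
    For orientation, the sketch below implements only the naive crude MC estimator of the CCDF P(Σ X_i > γ) that the paper takes as its baseline, together with its relative error; the hazard-rate-twisted importance sampler itself is not reproduced here, and all parameter values are hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)

    def ccdf_crude_mc(gamma, mu, sigma, n_samples=10**6):
        """Crude Monte Carlo estimate of P(sum_i X_i > gamma) for independent
        Log-normal RVs with (possibly different) parameters mu_i, sigma_i."""
        mu = np.asarray(mu)
        sigma = np.asarray(sigma)
        X = rng.lognormal(mean=mu, sigma=sigma, size=(n_samples, mu.size))
        hits = X.sum(axis=1) > gamma
        p_hat = hits.mean()
        rel_err = hits.std(ddof=1) / (np.sqrt(n_samples) * max(p_hat, 1e-300))
        return p_hat, rel_err

    # hypothetical parameters for three non-identically distributed terms
    p, re = ccdf_crude_mc(gamma=40.0, mu=[0.0, 0.5, 1.0], sigma=[1.0, 0.8, 0.5])
    print(p, re)   # the relative error grows rapidly as gamma increases, motivating IS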

  18. Different percentages of false-positive results obtained using five methods for the calculation of reference change values based on simulated normal and ln-normal distributions of data.

    Science.gov (United States)

    Lund, Flemming; Petersen, Per Hyltoft; Fraser, Callum G; Sölétormos, György

    2016-11-01

    Background Reference change values provide objective tools to assess the significance of a change in two consecutive results for a biomarker from an individual. The reference change value calculation is based on the assumption that within-subject biological variation has random fluctuation around a homeostatic set point that follows a normal (Gaussian) distribution. This set point (or baseline in steady-state) should be estimated from a set of previous samples, but, in practice, decisions based on the reference change value are often based on only two consecutive results. The original reference change value was based on standard deviations according to the assumption of normality, but was soon changed to coefficients of variation (CV) in the formula (reference change value = ±Z · 2^(1/2) · CV), with Z depending on the desired probability of significance, which also defines the percentage of false-positive results. The aim of this study was to investigate false-positive results using five different published methods for calculation of the reference change value. Methods The five reference change value methods were examined using normally and ln-normally distributed simulated data. Results One method performed best in approaching the theoretical false-positive percentages on normally distributed data and another method performed best on ln-normally distributed data. The commonly used reference change value method based on two results (without use of an estimated set point) performed worst on both normally distributed and ln-normally distributed data. Conclusions The optimal choice of method to calculate reference change value limits requires knowledge of the distribution of data (normal or ln-normal) and, if possible, knowledge of the homeostatic set point.
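
    The basic formula quoted above can be turned into a few lines of code. The sketch below computes RCV = Z·√2·CV and then checks by simulation that, for an unchanged individual with normally distributed variation, the false-positive rate is close to the nominal 5%. Using the homeostatic set point as the denominator is an assumption of this sketch, and which of the five published variants behaves best is exactly what the paper investigates.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(1)

    def rcv(cv_within, cv_analytical=0.0, prob=0.95):
        """Two-sided reference change value RCV = Z * sqrt(2) * CV, with CV
        combining within-subject biological and analytical variation."""
        z = norm.ppf(1 - (1 - prob) / 2)
        cv_total = np.sqrt(cv_within**2 + cv_analytical**2)
        return z * np.sqrt(2) * cv_total

    def false_positive_rate(cv=0.10, prob=0.95, n_pairs=10**6):
        """Simulate result pairs for a stable individual and count how often the
        observed relative change exceeds the RCV limit (ideally close to 1 - prob)."""
        limit = rcv(cv, prob=prob)
        set_point = 100.0
        x1 = rng.normal(set_point, cv * set_point, n_pairs)
        x2 = rng.normal(set_point, cv * set_point, n_pairs)
        delta = np.abs(x2 - x1) / set_point      # change relative to the set point
        return np.mean(delta > limit)

    print(round(rcv(0.10), 4))        # about 0.277 (27.7 %) for CV = 10 %
    print(false_positive_rate())      # should be close to 0.05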

  19. Gene expression in human skeletal muscle: alternative normalization method and effect of repeated biopsies

    DEFF Research Database (Denmark)

    Lundby, Carsten; Nordsborg, Nikolai; Kusuhara, K.

    2005-01-01

    The reverse transcriptase-polymerase chain reaction (RT-PCR) method has lately become widely used to determine transcription and mRNA content in rodent and human muscle samples. However, the common use of endogenous controls for correcting for variance in cDNA between samples is not optimal. Spec...

  20. A method for unsupervised change detection and automatic radiometric normalization in multispectral data

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Canty, Morton John

    2011-01-01

    Based on canonical correlation analysis the iteratively re-weighted multivariate alteration detection (MAD) method is used to successfully perform unsupervised change detection in bi-temporal Landsat ETM+ images covering an area with villages, woods, agricultural fields and open pit mines in Nort...
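
    As a rough illustration of the idea behind MAD, the sketch below performs a plain (non-iterated) canonical correlation analysis between the bands of two co-registered images and differences the paired canonical variates; the iterative re-weighting that gives IR-MAD its name, and any use for radiometric normalization, are omitted, and the data are synthetic.

    import numpy as np
    from sklearn.cross_decomposition import CCA

    def mad_variates(X, Y, n_components=3):
        """Simple, non-iterated MAD: CCA between the bands of two co-registered
        images (rows = pixels, columns = bands), followed by differencing of the
        paired canonical variates."""
        cca = CCA(n_components=n_components)
        U, V = cca.fit_transform(X, Y)           # canonical variates for times 1 and 2
        mad = U - V                              # MAD components
        z2 = np.sum((mad / mad.std(axis=0)) ** 2, axis=1)   # chi-square-like change score
        return mad, z2

    # hypothetical bi-temporal data: 3 bands, mostly unchanged pixels plus a shift
    rng = np.random.default_rng(2)
    X = rng.normal(size=(5000, 3))
    Y = X + 0.05 * rng.normal(size=(5000, 3))
    Y[:200] += 2.0                               # simulated change in the first 200 pixels
    mad, z2 = mad_variates(X, Y)
    print(z2[:200].mean(), z2[200:].mean())      # changed pixels score markedly higher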

  1. Normal Science and the Paranormal: The Effect of a Scientific Method Course on Students' Beliefs.

    Science.gov (United States)

    Morier, Dean; Keeports, David

    1994-01-01

    A study investigated the effects of an interdisciplinary course on the scientific method on the attitudes of 34 college students toward the paranormal. Results indicated that the course substantially reduced belief in the paranormal, relative to a control group. Student beliefs in their own paranormal powers, however, did not change. (Author/MSE)

  2. Graphs of Plural Cuts

    CERN Document Server

    Dosen, K

    2011-01-01

    Plural (or multiple-conclusion) cuts are inferences made by applying a structural rule introduced by Gentzen for his sequent formulation of classical logic. As singular (single-conclusion) cuts yield trees, which underlie ordinary natural deduction derivations, so plural cuts yield graphs of a more complicated kind, related to trees, which this paper defines. Besides the inductive definition of these oriented graphs, which is based on sequent systems, a non-inductive, graph-theoretical, combinatorial, definition is given, and to reach that other definition is the main goal of the paper. As trees underlie multicategories, so the graphs of plural cuts underlie polycategories. The graphs of plural cuts are interesting in particular when the plural cuts are appropriate for sequent systems without the structural rule of permutation, and the main body of the paper deals with that matter. It gives a combinatorial characterization of the planarity of the graphs involved.

  3. Laser Cutting, Development Trends

    DEFF Research Database (Denmark)

    Olsen, Flemming Ove

    1999-01-01

    In this paper a short review of the development trends in laser cutting is given. The technology, which is the fastest expanding industrial production technology, will develop both in its core market segment, flat-bed cutting of sheet metal, and by expanding into heavy industry and into the cutting of 3-dimensional shapes. The CO2-laser will also in the near future be the dominating laser source in the market, although the new developments in Nd:YAG-lasers open new possibilities for this laser type.

  4. Laser cutting plastic materials

    Energy Technology Data Exchange (ETDEWEB)

    Van Cleave, R.A.

    1980-08-01

    A 1000-watt CO2 laser has been demonstrated as a reliable production machine tool for cutting of plastics, high strength reinforced composites, and other nonmetals. More than 40 different plastics have been laser cut, and the results are tabulated. Applications for laser cutting described include fiberglass-reinforced laminates, Kevlar/epoxy composites, fiberglass-reinforced phenolics, nylon/epoxy laminates, ceramics, and disposable tooling made from acrylic.

  5. Single-Phase Full-Wave Rectifier as an Effective Example to Teach Normalization, Conduction Modes, and Circuit Analysis Methods

    Directory of Open Access Journals (Sweden)

    Predrag Pejovic

    2013-12-01

    Full Text Available Application of a single-phase rectifier as an example in teaching circuit modeling, normalization, operating modes of nonlinear circuits, and circuit analysis methods is proposed. The rectifier, supplied from a voltage source through an inductive impedance, is analyzed in the discontinuous as well as in the continuous conduction mode. A completely analytical solution for the continuous conduction mode is derived. Appropriate numerical methods are proposed to obtain the circuit waveforms in both operating modes and to compute the performance parameters. Source code of the program that performs such computation is provided.
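
    The source code referred to in the abstract is not reproduced here; instead, the sketch below shows the kind of normalized numerical simulation such a teaching example typically relies on. The DC side is modeled as a constant normalized voltage M = V_out/V_m behind the bridge, with a small normalized resistance r; both are assumptions about the circuit rather than the paper's exact model. The clamp j ≥ 0 is what produces the discontinuous conduction mode.

    import numpy as np

    def simulate(M, r=0.2, n_periods=20, steps_per_period=5000):
        # Normalized variables: j = i*omega*L/Vm, theta = omega*t, M = V_dc/Vm,
        # r = R/(omega*L). While the bridge conducts (j > 0):
        #     dj/dtheta = |sin(theta)| - M - r*j
        # and the diodes keep j from going negative (discontinuous conduction).
        dtheta = 2 * np.pi / steps_per_period
        theta = np.arange(0.0, n_periods * 2 * np.pi, dtheta)
        j = np.zeros_like(theta)
        for k in range(1, theta.size):
            dj = abs(np.sin(theta[k - 1])) - M - r * j[k - 1]
            j[k] = max(j[k - 1] + dj * dtheta, 0.0)       # diode bridge blocks negative current
        last = slice(theta.size - steps_per_period, None)  # final period, after the start-up transient
        ccm = bool(np.all(j[last] > 0.0))                  # current never returns to zero -> CCM
        return theta[last], j[last], ccm

    for M in (0.5, 0.9):
        _, j, ccm = simulate(M)
        print(f"M = {M}: {'continuous' if ccm else 'discontinuous'} conduction, "
              f"mean normalized current = {j.mean():.3f}")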

  6. Distance Determination Method for Normally Distributed Obstacle Avoidance of Mobile Robots in Stochastic Environments

    OpenAIRE

    Jinhong Noh; Ukyoul Huh

    2016-01-01

    Obstacle avoidance methods require knowledge of the distance between a mobile robot and obstacles in the environment. However, in stochastic environments, distance determination is difficult because objects have position uncertainty. The purpose of this paper is to determine the distance between a robot and obstacles represented by probability distributions. Distance determination for obstacle avoidance should consider position uncertainty, computational cost and collision probability. The pr...

  7. Epidemiological cut-off values for Flavobacterium psychrophilum MIC data generated by a standard test protocol

    DEFF Research Database (Denmark)

    Smith, P.; Endris, R.; Kronvall, G.

    2016-01-01

    antibiotics, the data sets were of sufficient quality and quantity to allow the setting of valid epidemiological cut-off values. For these agents, the cut-off values, calculated by the application of the statistically based normalized resistance interpretation method, were ≤16 mg L-1 for erythromycin, ≤2 mg L-1 for florfenicol, ≤0.025 mg L-1 for oxolinic acid (OXO), ≤0.125 mg L-1 for oxytetracycline and ≤20 (1/19) mg L-1 for trimethoprim/sulphamethoxazole. For ampicillin and amoxicillin, the majority of putative wild-type observations were 'off scale', and therefore, statistically valid cut-off values could not be calculated. For ormetoprim/sulphadimethoxine, the data were excessively diverse and a valid cut-off could not be determined. For flumequine, the putative wild-type data were extremely skewed, and for enrofloxacin, there was inadequate separation in the MIC values for putative wild-type

  8. Detection of normal plantar fascia thickness in adults via the ultrasonographic method.

    Science.gov (United States)

    Abul, Kadir; Ozer, Devrim; Sakizlioglu, Secil Sezgin; Buyuk, Abdul Fettah; Kaygusuz, Mehmet Akif

    2015-01-01

    Heel pain is a prevalent concern in orthopedic clinics, and there are numerous pathologic abnormalities that can cause heel pain. Plantar fasciitis is the most common cause of heel pain, and the plantar fascia thickens in this process. It has been found that thickening to greater than 4 mm in ultrasonographic measurements can be accepted as meaningful in diagnoses. Herein, we aimed to measure normal plantar fascia thickness in adults using ultrasonography. We used ultrasonography to measure the plantar fascia thickness of 156 healthy adults in both feet between April 1, 2011, and June 30, 2011. These adults had no previous heel pain. The 156 participants comprised 88 women (56.4%) and 68 men (43.6%) (mean age, 37.9 years; range, 18-65 years). The weight, height, and body mass index of the participants were recorded, and statistical analyses were conducted. The mean ± SD (range) plantar fascia thickness measurements for subgroups of the sample were as follows: 3.284 ± 0.56 mm (2.4-5.1 mm) for male right feet, 3.3 ± 0.55 mm (2.5-5.0 mm) for male left feet, 2.842 ± 0.42 mm (1.8-4.1 mm) for female right feet, and 2.8 ± 0.44 mm (1.8-4.3 mm) for female left feet. The overall mean ± SD (range) thickness for the right foot was 3.035 ± 0.53 mm (1.8-5.1 mm) and for the left foot was 3.053 ± 0.54 mm (1.8-5.0 mm). There was a statistically significant and positive correlation between plantar fascia thickness and participant age, weight, height, and body mass index. The plantar fascia thickness of adults without heel pain was measured to be less than 4 mm in most participants (~92%). There was no statistically significant difference between the thickness of the right and left foot plantar fascia.

  9. Ultrasonic Cutting of Foods

    Science.gov (United States)

    Schneider, Yvonne; Zahn, Susann; Rohm, Harald

    In the field of food engineering, cutting is usually classified as a mechanical unit operation dealing with size reduction by applying external forces to a bulk product. Ultrasonic cutting is realized by superimposing a microscopic vibration of the cutting tool on the macroscopic feed motion of the cutting device or of the product. The excited tool interacts with the product and generates a number of effects. Primary energy concentration in the separation zone and the modification of contact friction along the tool flanks arise from the cyclic loading and are responsible for benefits such as reduced cutting force, a smooth cut surface, and reduced product deformation. Secondary effects such as absorption and cavitation originate from the propagation of the sound field in the product and are closely related to chemical and physical properties of the material to be cut. This chapter analyzes interactions between food products and ultrasonic cutting tools and relates these interactions to physical and chemical product properties as well as to processing parameters such as cutting velocity, ultrasonic amplitude and frequency, and tool design.

  10. Cerebral hemodynamics in normal-pressure hydrocephalus. Evaluation by 133Xe inhalation method and dynamic CT study

    Energy Technology Data Exchange (ETDEWEB)

    Tamaki, N.; Kusunoki, T.; Wakabayashi, T.; Matsumoto, S.

    1984-09-01

    Cerebral hemodynamics in 31 patients with suspected normal-pressure hydrocephalus were studied by means of the xenon-133 (133Xe) inhalation method and on dynamic computerized tomography (CT) scanning. Cerebral blood flow (CBF) is reduced in all patients with dementia. Hypoperfusion was noted in a frontal distribution in these patients compared with normal individuals. There was no difference in CBF patterns between patients with good and those with poor outcome. The CBF was increased following cerebrospinal fluid (CSF) shunting in patients who responded to that procedure: increase in flow correlated with clinical improvement, frontal and temporal lobe CBF was most markedly increased, and the CBF pattern became normal. In contrast, CBF was decreased after shunt placement in patients who were considered to have suffered from degenerative dementia, as evidenced by non-response to shunting. Dynamic computerized tomography studies demonstrated that patients with a good outcome showed a postoperative reduction in mean transit time of contrast material, most prominent in the frontal and temporal gray matter, and slight in the deep frontal structures, but not in the major cerebral vessels. Patients with poor outcome after shunting, however, had an increase in transit time in all regions. This corresponded well with the results as determined by the 133Xe inhalation method.

  11. Designing for hot-blade cutting

    DEFF Research Database (Denmark)

    Brander, David; Bærentzen, Jakob Andreas; Clausen, Kenn

    2016-01-01

    -trivial constraints of blade-cutting in a bottom-up fashion, enabling an exploration of the unique architectural potential of this fabrication approach. The method is implemented as prototype design tools in MatLAB, C++, GhPython, and Python and demonstrated through cutting of expanded polystyrene foam design...

  12. 3D modelling of the active normal fault network in the Apulian Ridge (Eastern Mediterranean Sea): Integration of seismic and bathymetric data with implicit surface methods

    Science.gov (United States)

    Bistacchi, Andrea; Pellegrini, Caludio; Savini, Alessandra; Marchese, Fabio

    2016-04-01

    The Apulian ridge (North-eastern Ionian Sea, Mediterranean), interposed between the facing Apennines and Hellenides subduction zones (to the west and east respectively), is characterized by thick Cretaceous carbonatic sequences and discontinuous Tertiary deposits crosscut by a penetrative network of NNW-SSE normal faults. These are exposed onshore in Puglia, and are well represented offshore in a dataset composed of 2D seismics and wells collected by oil companies from the '60s to the '80s, more recent seismics collected during research projects in the '90s, recent very high resolution seismics (VHRS - Sparker and Chirp-sonar data), multibeam echosounder bathymetry, and sedimentological and geo-chronological analyses of sediment samples collected on the seabed. Faults are evident in 2D seismics at all scales, and their along-strike geometry and continuity can be characterized with multibeam bathymetric data, which show continuous fault scarps on the seabed (only partly reworked by currents and covered by landslides). Fault scarps also reveal the finite displacement accumulated in the Holocene-Pleistocene. We reconstructed a 3D model of the fault network and suitable geological boundaries (mainly unconformities due to the discontinuous distribution of Quaternary and Tertiary sediments) with implicit surface methods implemented in SKUA/GOCAD. This approach proved very effective and allowed detailed reconstruction of complex structures, such as the frequent relay zones that are particularly well imaged by the seafloor geomorphology. Mutual cross-cutting relationships have been recognized between fault scarps and submarine mass-wasting deposits (Holocene-Pleistocene), indicating that, at least in places, these features are coeval, hence the fault network should be considered active. At the regional scale, the 3D model allowed measuring the horizontal WSW-ENE stretching, which can be associated with the bending moment applied to the Apulian Plate by the combined effect

  13. Multiscale simulation of nanometric cutting of single crystal Cu based on bridging domain method%基于桥域理论的Cu单晶纳米切削跨尺度仿真研究

    Institute of Scientific and Technical Information of China (English)

    梁迎春; 盆洪民; 白清顺; 卢礼华

    2011-01-01

    The bridging domain method, a mixed atomistic-continuum formulation and one of the significant multiscale simulation methods, is reviewed, and the model for atomistic/continuum coupling is introduced. The coupled method with the treatment of the overlapping subdomain is discussed, in which different scaling parameters (weight factors) are adopted to calculate the energy of the system in the overlapping subdomain, and the atomic and continuum displacements are constrained by the Lagrange multiplier method. A bridging domain model is set up to investigate the effect of cutting speed on the chip and on the workpiece atom force distribution in the nanometric cutting of single crystal copper. Simulation results show that, as the cutting speed increases, the forces on the atoms in the cutting region increase, the chip deformation coefficient decreases, and the thickness of the deformed layer of the machined surface increases. In addition, the machined surface quality at different cutting speeds is investigated. The multiscale model and simulation of nanometric cutting are accomplished based on the bridging domain method, which lays a theoretical foundation for exploring the trans-scale simulation of nanometric cutting.

  14. [Recording cervical and ocular vestibular evoked myogenic potentials: part 1: anatomy, physiology, methods and normal findings].

    Science.gov (United States)

    Walther, L E; Hörmann, K; Pfaar, O

    2010-10-01

    Vestibular evoked myogenic potentials (VEMP) have gained in clinical significance in recent years, now forming an integral part of neurootological examinations to establish the functional status of the otolith organs. They are sensitive to low-frequency acoustic stimuli. When stimulated, receptors in the sacculus and utriculus are activated. By means of reflexive connections, myogenic potentials can be recorded when the relevant muscles are tonically activated. The vestibulocolic (sacculocollic) reflex travels from the otolith organs over the central circuitry to the ipsilateral sternocleidomastoid muscle. Myogenic potentials can be recorded by means of cervical VEMP (cVEMP). The vestibulo-ocular reflex crosses contralaterally to the extraocular eye muscles. Ocular VEMP (oVEMP) are recorded periocularly, preferably from the inferior oblique muscle. Various stimulation methods are used, including air conduction and bone conduction.

  15. Influence of cutting parameters on surface characteristics of cut section in cutting of Inconel 718 sheet using CW Nd:YAG laser

    Institute of Scientific and Technical Information of China (English)

    Dong-Gyu AHN; Kyung-Won BYUN

    2009-01-01

    Recently, laser cutting technologies have begun to be used for manufacturing mechanical parts from Inconel super-alloy sheet because of the difficulty of machining the Inconel material as a result of its extremely tough nature. The objective of this work is to investigate the influence of cutting parameters on the surface characteristics of the cut section in the cutting of Inconel 718 super-alloy sheet using a CW Nd:YAG laser through laser cutting experiments. Normal cutting experiments were performed using a laser cutting system with a six-axis controlled automatic robot and an auto-tracking system for the focal distance. From the results of the experiments, the effects of the cutting parameters on the surface roughness, the striation formation and the microstructure of the cut section were examined. In addition, an optimal cutting condition, at which the surface roughness is minimized and neither the delayed cutting phenomenon nor micro-cracking is initiated, is estimated to improve both the part quality and the cutting efficiency.

  16. Antimicrobial Susceptibility of Flavobacterium psychrophilum from Chilean Salmon Farms and their Epidemiological Cut-off Values using Agar Dilution and Disk Diffusion Methods

    Directory of Open Access Journals (Sweden)

    Claudio D Miranda

    2016-11-01

    Full Text Available Flavobacterium psychrophilum is the most important bacterial pathogen for freshwater farmed salmonids in Chile. The aims of this study were to determine the susceptibility of Chilean isolates to antimicrobials used in fish farming and to calculate their epidemiological cut-off (COWT) values. A total of 125 Chilean isolates of F. psychrophilum were isolated from reared salmonids presenting clinical symptoms indicative of flavobacteriosis, and their identities were confirmed by 16S rRNA polymerase chain reaction. Susceptibility to antibacterials was tested on diluted Mueller-Hinton using an agar dilution MIC method and a disk diffusion method. The COWT values calculated by Normalised Resistance Interpretation (NRI) analysis allow isolates to be categorized either as wild-type fully susceptible (WT) or as manifesting reduced susceptibility (NWT). When MIC data were used, NRI analysis calculated a COWT of ≤0.125 μg mL-1, ≤2 μg mL-1 and ≤0.5 μg mL-1 for amoxicillin, florfenicol and oxytetracycline, respectively. For the quinolones, the COWT were ≤1 μg mL-1, ≤0.5 μg mL-1 and ≤0.125 μg mL-1 for oxolinic acid, flumequine and enrofloxacin, respectively. The disc diffusion data sets obtained in this work were extremely diverse and were spread over a wide range. For the quinolones there was a close agreement between the frequencies of NWT isolates calculated using MIC and disc data. For oxolinic acid, flumequine and enrofloxacin the frequencies were 45, 39 and 38% using MIC data, and 42, 41 and 44% when disc data were used. There was less agreement with the other antimicrobials, because the NWT frequencies obtained using MIC and disc data, respectively, were 24% and 10% for amoxicillin, 8% and 2% for florfenicol and 70% and 64% for oxytetracycline. Considering that the MIC data were more precise than the disc diffusion data, MIC determination would be the preferred method for susceptibility testing for this species and the NWT frequencies

  17. Saving Seal Cutting

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    On April 20, the graduation ceremony of China’s seal-cutting art postgraduates and visiting experts from the Institute of Seal Cutting Art under the China Art Academy was held in Beijing. On the same day, the exhibition of the works of the teachers and graduates of the institute was also held.

  18. Cutting Class Harms Grades

    Science.gov (United States)

    Taylor, Lewis A., III

    2012-01-01

    An accessible business school population of undergraduate students was investigated in three independent, but related studies to determine effects on grades due to cutting class and failing to take advantage of optional reviews and study quizzes. It was hypothesized that cutting classes harms exam scores, attending preexam reviews helps exam…

  19. Fundamentals of cutting.

    Science.gov (United States)

    Williams, J G; Patel, Y

    2016-06-06

    The process of cutting is analysed in fracture mechanics terms with a view to quantifying the various parameters involved. The model used is that of orthogonal cutting with a wedge removing a layer of material or chip. The behaviour of the chip is governed by its thickness: for large radii of curvature the chip is elastic and smooth cutting occurs. For smaller thicknesses there is a transition, first to plastic bending and then to plastic shear, and smooth chips are formed. The governing parameters are the tool geometry, principally the wedge angle, and the material properties of elastic modulus, yield stress and fracture toughness. Friction can also be important. It is demonstrated that the cutting process may be quantified via these parameters, which could be useful in the study of cutting in biology.

  20. A partial least squares based spectrum normalization method for uncertainty reduction for laser-induced breakdown spectroscopy measurements

    Energy Technology Data Exchange (ETDEWEB)

    Li, Xiongwei [State Key Lab of Power Systems, Department of Thermal Engineering, Tsinghua-BP Clean Energy Center, Tsinghua University, Beijing (China); Wang, Zhe, E-mail: zhewang@tsinghua.edu.cn [State Key Lab of Power Systems, Department of Thermal Engineering, Tsinghua-BP Clean Energy Center, Tsinghua University, Beijing (China); Lui, Siu-Lung; Fu, Yangting; Li, Zheng [State Key Lab of Power Systems, Department of Thermal Engineering, Tsinghua-BP Clean Energy Center, Tsinghua University, Beijing (China); Liu, Jianming [China Guodian Science and Technology Research Institute, Nanjing 100034 (China); Ni, Weidou [State Key Lab of Power Systems, Department of Thermal Engineering, Tsinghua-BP Clean Energy Center, Tsinghua University, Beijing (China)

    2013-10-01

    A bottleneck of the wide commercial application of laser-induced breakdown spectroscopy (LIBS) technology is its relatively high measurement uncertainty. A partial least squares (PLS) based normalization method was proposed to improve pulse-to-pulse measurement precision for LIBS based on our previous spectrum standardization method. The proposed model utilized multi-line spectral information of the measured element and characterized the signal fluctuations due to the variation of plasma characteristic parameters (plasma temperature, electron number density, and total number density) for signal uncertainty reduction. The model was validated by the application of copper concentration prediction in 29 brass alloy samples. The results demonstrated an improvement on both measurement precision and accuracy over the generally applied normalization as well as our previously proposed simplified spectrum standardization method. The average relative standard deviation (RSD), average of the standard error (error bar), the coefficient of determination (R²), the root-mean-square error of prediction (RMSEP), and average value of the maximum relative error (MRE) were 1.80%, 0.23%, 0.992, 1.30%, and 5.23%, respectively, while those for the generally applied spectral area normalization were 3.72%, 0.71%, 0.973, 1.98%, and 14.92%, respectively. - Highlights: • Multiple pairs of lines are used to compensate plasma temperature fluctuations. • Multi-line information is utilized to represent the elemental concentration. • Advantage of PLS algorithm is exploited by the model. • Both uncertainty reduction and accuracy improvement are achieved.
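
    The sketch below conveys only the general flavor of a PLS-based calibration that exploits multi-line spectral information; the data are synthetic, the way the shot-to-shot fluctuation enters is an assumption, and the paper's specific spectrum-standardization model built on plasma temperature and number densities is not reproduced.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    # Hypothetical data: rows are laser shots, columns are intensities of several
    # analyte and reference emission lines; y is the known Cu concentration of the
    # standard each shot came from.
    rng = np.random.default_rng(3)
    n_shots, n_lines = 200, 8
    y = rng.uniform(55, 95, n_shots)                      # Cu concentration, wt.%
    fluct = rng.normal(1.0, 0.15, n_shots)[:, None]       # shot-to-shot plasma fluctuation
    X = fluct * (np.outer(y, rng.uniform(0.5, 2.0, n_lines)) +
                 rng.normal(0.0, 1.0, (n_shots, n_lines)))

    # PLS uses the correlated multi-line information to reduce the influence of
    # the common fluctuation on the predicted concentration.
    pls = PLSRegression(n_components=3)
    y_cv = cross_val_predict(pls, X, y, cv=5).ravel()
    rmsep = np.sqrt(np.mean((y_cv - y) ** 2))
    rsd = 100 * np.std(y_cv - y) / y.mean()
    print(f"RMSEP = {rmsep:.2f} wt.%, residual RSD = {rsd:.2f} %")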

  1. ChIPnorm: a statistical method for normalizing and identifying differential regions in histone modification ChIP-seq libraries.

    Science.gov (United States)

    Nair, Nishanth Ulhas; Sahu, Avinash Das; Bucher, Philipp; Moret, Bernard M E

    2012-01-01

    The advent of high-throughput technologies such as ChIP-seq has made possible the study of histone modifications. A problem of particular interest is the identification of regions of the genome where different cell types from the same organism exhibit different patterns of histone enrichment. This problem turns out to be surprisingly difficult, even in simple pairwise comparisons, because of the significant level of noise in ChIP-seq data. In this paper we propose a two-stage statistical method, called ChIPnorm, to normalize ChIP-seq data, and to find differential regions in the genome, given two libraries of histone modifications of different cell types. We show that the ChIPnorm method removes most of the noise and bias in the data and outperforms other normalization methods. We correlate the histone marks with gene expression data and confirm that histone modifications H3K27me3 and H3K4me3 act as respectively a repressor and an activator of genes. Compared to what was previously reported in the literature, we find that a substantially higher fraction of bivalent marks in ES cells for H3K27me3 and H3K4me3 move into a K27-only state. We find that most of the promoter regions in protein-coding genes have differential histone-modification sites. The software for this work can be downloaded from http://lcbb.epfl.ch/software.html.

  2. ChIPnorm: a statistical method for normalizing and identifying differential regions in histone modification ChIP-seq libraries.

    Directory of Open Access Journals (Sweden)

    Nishanth Ulhas Nair

    Full Text Available The advent of high-throughput technologies such as ChIP-seq has made possible the study of histone modifications. A problem of particular interest is the identification of regions of the genome where different cell types from the same organism exhibit different patterns of histone enrichment. This problem turns out to be surprisingly difficult, even in simple pairwise comparisons, because of the significant level of noise in ChIP-seq data. In this paper we propose a two-stage statistical method, called ChIPnorm, to normalize ChIP-seq data, and to find differential regions in the genome, given two libraries of histone modifications of different cell types. We show that the ChIPnorm method removes most of the noise and bias in the data and outperforms other normalization methods. We correlate the histone marks with gene expression data and confirm that histone modifications H3K27me3 and H3K4me3 act as respectively a repressor and an activator of genes. Compared to what was previously reported in the literature, we find that a substantially higher fraction of bivalent marks in ES cells for H3K27me3 and H3K4me3 move into a K27-only state. We find that most of the promoter regions in protein-coding genes have differential histone-modification sites. The software for this work can be downloaded from http://lcbb.epfl.ch/software.html.

  3. Determining the Long-term Effect of Antibiotic Administration on the Human Normal Intestinal Microbiota Using Culture and Pyrosequencing Methods.

    Science.gov (United States)

    Rashid, Mamun-Ur; Zaura, Egijia; Buijs, Mark J; Keijser, Bart J F; Crielaard, Wim; Nord, Carl Erik; Weintraub, Andrej

    2015-05-15

    The purpose of the study was to assess the effect of ciprofloxacin (500 mg twice daily for 10 days) or clindamycin (150 mg 4 times daily for 10 days) on the fecal microbiota of healthy humans for a period of 1 year as compared to placebo. Two different methods, culture and microbiome analysis, were used. Fecal samples were collected for analyses at 6 time-points. The interval needed for the normal microbiota to be normalized after ciprofloxacin or clindamycin treatment differed for various bacterial species. It took 1-12 months to normalize the human microbiota after antibiotic administration, with the most pronounced effect on day 11. Exposure to ciprofloxacin or clindamycin had a strong effect on the diversity of the microbiome, and changes in microbial composition were observed until the 12th month, with the most pronounced microbial shift at month 1. No Clostridium difficile colonization or C. difficile infections were reported. Based on the pyrosequencing results, it appears that clindamycin has more impact than ciprofloxacin on the intestinal microbiota.

  4. Impact of PET/CT image reconstruction methods and liver uptake normalization strategies on quantitative image analysis.

    Science.gov (United States)

    Kuhnert, Georg; Boellaard, Ronald; Sterzer, Sergej; Kahraman, Deniz; Scheffler, Matthias; Wolf, Jürgen; Dietlein, Markus; Drzezga, Alexander; Kobe, Carsten

    2016-02-01

    In oncological imaging using PET/CT, the standardized uptake value has become the most common parameter used to measure tracer accumulation. The aim of this analysis was to evaluate ultra high definition (UHD) and ordered subset expectation maximization (OSEM) PET/CT reconstructions for their potential impact on quantification. We analyzed 40 PET/CT scans of lung cancer patients who had undergone PET/CT. Standardized uptake values corrected for body weight (SUV) and lean body mass (SUL) were determined in the single hottest lesion in the lung and normalized to the liver for UHD and OSEM reconstruction. Quantitative uptake values and their normalized ratios for the two reconstruction settings were compared using the Wilcoxon test. The distribution of quantitative uptake values and their ratios in relation to the reconstruction method used were demonstrated in the form of frequency distribution curves, box-plots and scatter plots. The agreement between OSEM and UHD reconstructions was assessed through Bland-Altman analysis. A significant difference was observed after OSEM and UHD reconstruction for SUV and SUL data tested (p < 0.0005 in all cases). The mean values of the ratios after OSEM and UHD reconstruction showed equally significant differences (p < 0.0005 in all cases). Bland-Altman analysis showed that the SUV and SUL and their normalized values were, on average, up to 60 % higher after UHD reconstruction as compared to OSEM reconstruction. OSEM and HD reconstruction brought a significant difference for SUV and SUL, which remained constantly high after normalization to the liver, indicating that standardization of reconstruction and the use of comparable SUV measurements are crucial when using PET/CT.

  5. Impact of PET/CT image reconstruction methods and liver uptake normalization strategies on quantitative image analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kuhnert, Georg; Sterzer, Sergej; Kahraman, Deniz; Dietlein, Markus; Drzezga, Alexander; Kobe, Carsten [University Hospital of Cologne, Department of Nuclear Medicine, Cologne (Germany); Boellaard, Ronald [VU University Medical Centre, Department of Radiology and Nuclear Medicine, Amsterdam (Netherlands); Scheffler, Matthias; Wolf, Juergen [University Hospital of Cologne, Lung Cancer Group Cologne, Department I of Internal Medicine, Center for Integrated Oncology Cologne Bonn, Cologne (Germany)

    2016-02-15

    In oncological imaging using PET/CT, the standardized uptake value has become the most common parameter used to measure tracer accumulation. The aim of this analysis was to evaluate ultra high definition (UHD) and ordered subset expectation maximization (OSEM) PET/CT reconstructions for their potential impact on quantification. We analyzed 40 PET/CT scans of lung cancer patients who had undergone PET/CT. Standardized uptake values corrected for body weight (SUV) and lean body mass (SUL) were determined in the single hottest lesion in the lung and normalized to the liver for UHD and OSEM reconstruction. Quantitative uptake values and their normalized ratios for the two reconstruction settings were compared using the Wilcoxon test. The distribution of quantitative uptake values and their ratios in relation to the reconstruction method used were demonstrated in the form of frequency distribution curves, box-plots and scatter plots. The agreement between OSEM and UHD reconstructions was assessed through Bland-Altman analysis. A significant difference was observed after OSEM and UHD reconstruction for SUV and SUL data tested (p < 0.0005 in all cases). The mean values of the ratios after OSEM and UHD reconstruction showed equally significant differences (p < 0.0005 in all cases). Bland-Altman analysis showed that the SUV and SUL and their normalized values were, on average, up to 60 % higher after UHD reconstruction as compared to OSEM reconstruction. OSEM and HD reconstruction brought a significant difference for SUV and SUL, which remained constantly high after normalization to the liver, indicating that standardization of reconstruction and the use of comparable SUV measurements are crucial when using PET/CT. (orig.)

  6. Probing the effect of human normal sperm morphology rate on cycle outcomes and assisted reproductive methods selection.

    Directory of Open Access Journals (Sweden)

    Bo Li

    Full Text Available Sperm morphology is the best predictor of fertilization potential and critical predictive information for supporting the selection of assisted reproductive methods. Given its important predictive value and the decline in semen quality observed in recent years, the threshold of normal sperm morphology rate (NSMR) is constantly being corrected and remains controversial, from the 4th edition (14%) to the 5th edition (4%). We retrospectively analyzed 4756 cases of infertility patients treated with conventional IVF (c-IVF) or ICSI, which were divided into three groups according to NSMR: ≥14%, 4%-14% and <4%. Here, we demonstrate that, with decrease in NSMR (≥14%, 4%-14%, <4%), in the c-IVF group, the rate of fertilization, normal fertilization, high-quality embryo, multi-pregnancy and the birth weight of twins gradually decreased significantly (P<0.05), while the miscarriage rate was significantly increased (p<0.01), and the implantation rate, clinical pregnancy rate, ectopic pregnancy rate, preterm birth rate, live birth rate, sex ratio, and birth weight (singleton) showed no significant change. In the ICSI group, with decrease in NSMR (≥14%, 4%-14%, <4%), the high-quality embryo rate, multi-pregnancy rate and birth weight of twins gradually decreased significantly (p<0.05), while other parameters showed no significant difference. Considering clinical assisted method selection, in the NSMR ≥14% group, the normal fertilization rate of c-IVF was significantly higher than in the ICSI group (P<0.05); in the 4%-14% group, the birth weight (twins) of c-IVF was significantly higher than in the ICSI group; in the <4% group, the miscarriage rate of IVF was significantly higher than in the ICSI group. Therefore, we conclude that NSMR is positively related to embryo reproductive potential; when NSMR<4% (5th edition), ICSI should be considered first, while when NSMR≥4%, c-IVF assisted reproduction might be preferred.

  7. Utilization of Digital Image Processing In Process of Quality Control of The Primary Packaging of Drug Using Color Normalization Method

    Science.gov (United States)

    Erwanto, Danang; Arttini Dwi Prasetyowati, Sri; Nuryanto Budi Susila, Eka

    2017-04-01

    In the process of quality control, accuracy is required so that improperly packaged drugs do not pass into the next production process. An automatic inspection system using digital image processing can be applied to replace the manual inspection performed by humans. The image captured from the vision sensor is an RGB image, which is then converted into grayscale. The conversion from RGB to grayscale is performed using the color normalization method, which spreads the RGB color data at each pixel. The image processing software built with the color normalization method produces grayscale images in which the drug object has a higher gray level than the packaging background whenever the R, G or B value of the drug is higher than the corresponding R, G or B value of the background. After determination of a threshold value, the binary image of the drug is white and the binary image of the drug packaging background is black.
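    A minimal sketch of the described pipeline (RGB capture, color-normalized grayscale, fixed-threshold binary image) is given below. The exact normalization used by the authors is not specified, so normalized chromaticity (each channel divided by R+G+B) is assumed, and the channel index and threshold value are illustrative.

```python
# Hedged sketch: RGB image -> color-normalized grayscale -> thresholded binary image.
# The authors' exact normalization is not given; normalized chromaticity is assumed.
import numpy as np

def color_normalized_gray(rgb: np.ndarray, channel: int = 0) -> np.ndarray:
    """Return one normalized-chromaticity channel of an H x W x 3 uint8 image as floats in [0, 1]."""
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=2) + 1e-9          # avoid division by zero on black pixels
    return rgb[:, :, channel] / total

def binarize(gray: np.ndarray, threshold: float = 0.4) -> np.ndarray:
    """Drug pixels (brighter in the normalized channel) become white, background black."""
    return (gray > threshold).astype(np.uint8) * 255

# Usage (hypothetical): mask = binarize(color_normalized_gray(frame_from_vision_sensor))
```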

  8. Online Cake Cutting

    CERN Document Server

    Walsh, Toby

    2010-01-01

    We propose an online form of the cake cutting problem. This models situations where players arrive and depart during the process of dividing a resource. We show that well known fair division procedures like cut-and-choose and the Dubins-Spanier moving knife procedure can be adapted to apply to such online problems. We propose some desirable properties that online cake cutting procedures might possess, like online forms of proportionality and envy-freeness, and identify which properties are in fact possessed by the different online cake cutting procedures.

  9. Model-free methods of analyzing domain motions in proteins from simulation : A comparison of normal mode analysis and molecular dynamics simulation of lysozyme

    NARCIS (Netherlands)

    Hayward, S; Kitao, A; Berendsen, HJC

    1997-01-01

    Model-free methods are introduced to determine quantities pertaining to protein domain motions from normal mode analyses and molecular dynamics simulations. For the normal mode analysis, the methods are based on the assumption that in low frequency modes, domain motions can be well approximated by m

  10. Rational cutting height for large cutting height fully mechanized top-coal caving

    Institute of Scientific and Technical Information of China (English)

    Huang Bingxiang; Li Hongtao; Liu Changyou; Xing Shijun; Xue Weichao

    2011-01-01

    Large cutting height fully mechanized top-coal caving is a new mining method that improves recovery ratio and single-pass production. It also allows safe and efficient mining. A rational cutting height is one key parameter of this technique. Numerical simulation and a granular-media model experiment were used to analyze the effect of cutting height on the rock pressure of a fully mechanized top-coal caving work face. The recovery ratio was also studied. As the cutting height increases the top-coal thickness is reduced. Changing the ratio of cutting to drawing height intensifies the face pressure and the top-coal shattering. A maximum cutting height exists under a given set of conditions due to issues with surrounding rock-mass control. An increase in cutting height makes the top-coal cave better, and the recovery ratio when drawing top-coal is then improved. A method of adjusting the face rock pressure is presented: changing the cutting to drawing height ratio is the technique used to control face rock pressure. The recovery ratio when cutting coal exceeds that when caving top-coal, so the face recovery ratio may be improved by oversizing the cutting height and increasing the top-coal drawing ratio. An optimum ratio of cutting to drawing height exists that maximizes the face recovery ratio. A rational cutting height is determined by comprehensively considering the surrounding rock-mass control and the recovery ratio. At the same time, increasing the cutting height can improve single-pass mining during fully mechanized top-coal caving.

  11. Experimental testing of exchangeable cutting inserts cutting ability

    OpenAIRE

    Čep, Robert; Janásek, Adam; Čepová Lenka; Petrů, Jana; Ivo HLAVATÝ; Car, Zlatan; Hatala, Michal

    2013-01-01

    The article deals with experimental testing of the cutting ability of exchangeable cutting inserts. Eleven types of exchangeable cutting inserts from five different manufacturers were tested. The tested cutting inserts were of the same shape and were different especially in material and coating types. The main aim was both to select a suitable test for determination of the cutting ability of exchangeable cutting inserts and to design such testing procedure that could make it possible...

  12. Regularities of dust formation during stone cutting for construction works

    Directory of Open Access Journals (Sweden)

    V.G. Lebedev

    2016-09-01

    Full Text Available When cutting stone, a large amount of dust is released, which is a mixture of small, mostly sharp, mineral particles. Inhaled fine dry dust causes pathological changes in organs as a consequence of infiltration by sharp, solid particles. Despite the importance of this problem, dust generation during the various working processes and the distribution of the dust into fractions are practically not considered, although these determine how long the dust stays suspended in the air and its negative impact on a person. Aim: The aim of this research is to study the process of dusting during stone cutting and the regularities of dust distribution by fractions, and to quantify the dust formation process in order to improve production equipment and the individual and collective safety equipment of staff. Materials and Methods: The many types of cutting can be divided into two groups: “dry” cutting and cutting with fluid. During “dry” cutting the dust is a set of micro-chips cut off by the abrasive grains. The size of such chips is very small, from a fraction of a micrometer to a few micrometers. Thus, the chip size enables the formation of a dust suspension with a low settling velocity, which is present in the working space in large concentrations. Results: The following characteristic dependences were obtained as a result of the research: the dust settling time as a function of dust particle size, the dust particle size as a function of minute feed and wheel grain range, the specific amount of dust as a function of the abrasive wheel grit number, and the temperature of the dust particles as a function of the feed per wheel revolution. It was shown that the distribution of chips (dust) by size follows a normal distribution law. The chip dimensions during cutting are in the range of 0.4...6 μm. Thus, a dust suspension is formed whose particles take several hours to settle. This creates a considerable minute dust concentration, within 0.28∙10^8...1.68∙10^8 units/m3.

  13. Testing Of Choiced Ceramics Cutting Tools At Irregular Interrupted Cut

    Science.gov (United States)

    Kyncl, Ladislav; Malotová, Šárka; Nováček, Pavel; Nicielnik, Henryk; Šoková, Dagmar; Hemžský, Pavel; Pitela, David; Holubjak, Jozef

    2015-12-01

    This article discusses the testing of exchangeable ceramic cutting inserts during machining with an irregular interrupted cut. The tests were performed on a lathe with a fixture that simulated the interrupted cut. By changing the number of plates mounted in the fixture, either a regular or an irregular interrupted cut could be simulated: with four plates the interrupted cut was regular, while the remaining three variants produced an irregular cut. It was examined whether the irregular interrupted cut has an effect on the insert and how it changes the life of the inserts during an irregular interrupted cut (variable delay between shocks).

  14. Distribution of contact loads over the flank-land of the cutter with a rounded cutting edge

    Science.gov (United States)

    Kozlov, V.; Gerasimov, A.; Kim, A.

    2016-04-01

    In this paper, contact conditions between a tool and a workpiece material for wear-simulating turning by a cutter with a sharp-cornered edge and with a rounded cutting edge are analysed. The results of the experimental study of specific contact load distribution over the artificial flank wear-land of the cutter in free orthogonal turning of a disk from titanium alloy (Ti6Al2Mo2Cr), ductile (63Cu) and brittle (57Cu1Al3Mn) brasses are described. Investigations were carried out by the method of ‘split cutter’ and by the method of the artificial flank-land of variable width. The experiments with a variable feed rate and cutting speed show that in titanium alloy machining with a sharp-cornered cutting edge the highest normal contact load (σh max = 3400…2200 MPa) is observed immediately at the cutting edge, and the curve has a horizontal region with a length of 0.2…0.6 mm. At a distance from the cutting edge, the value of specific normal contact load is dramatically reduced to 1100…500 MPa. The character of the normal contact load for a rounded cutting edge is different - it is uniform, and its value is approximately 2 times smaller compared to machining with a sharp-cornered cutting edge. In the authors’ opinion this is connected with the generation of a seizure zone in the chip formation region and explains the ability of heavily worn cutting tools to machine titanium alloys. The paper analyses the distribution of tangential contact loads over the flank land, the pattern of which differs considerably between machining with a sharp-cornered edge and with a rounded cutting edge. Abbreviation and symbols: m/s - meter per second (cutting speed v); mm/r - millimeter per revolution (feed rate f); MPa - mega Pascal (specific contact load as a stress σ or τ); hf - the width of the flank wear land (chamfer) of the cutting tool, which can be natural or artificial as in this paper [mm]; xh - distance from the cutting edge on the surface of the flank-land [mm

  15. Cutting Cakes Correctly

    CERN Document Server

    Hill, Theodore P

    2008-01-01

    Without additional hypotheses, Proposition 7.1 in Brams and Taylor's book "Fair Division" (Cambridge University Press, 1996) is false, as are several related Pareto-optimality theorems of Brams, Jones and Klamler in their 2006 cake-cutting paper.

  16. Laser cutting system

    Science.gov (United States)

    Dougherty, Thomas J

    2015-03-03

    A workpiece cutting apparatus includes a laser source, a first suction system, and a first finger configured to guide a workpiece as it moves past the laser source. The first finger includes a first end provided adjacent a point where a laser from the laser source cuts the workpiece, and the first end of the first finger includes an aperture in fluid communication with the first suction system.

  17. ADVANCED CUTTINGS TRANSPORT STUDY

    Energy Technology Data Exchange (ETDEWEB)

    Troy Reed; Stefan Miska; Nicholas Takach; Kaveh Ashenayi; Mark Pickell; Len Volk; Mike Volk; Evren Ozbayoglu; Lei Zhou

    2002-07-30

    This is the fourth quarterly progress report for Year-3 of the ACTS Project. It includes a review of progress made in: (1) Flow Loop construction and development and (2) research tasks during the period of time between April 1, 2002 and June 30, 2002. This report presents a review of progress on the following specific tasks: (a) Design and development of an Advanced Cuttings Transport Facility (Task 3: Addition of a Cuttings Injection/Separation System), (b) Research project (Task 6): ''Study of Cuttings Transport with Foam Under LPAT Conditions (Joint Project with TUDRP)''; (c) Research project (Task 9b): ''Study of Foam Flow Behavior Under EPET Conditions''; (d) Research project (Task 10): ''Study of Cuttings Transport with Aerated Mud Under Elevated Pressure and Temperature Conditions''; (e) Research on three instrumentation tasks to measure: Cuttings concentration and distribution in a flowing slurry (Task 11), Foam texture while transporting cuttings. (Task 12), and Viscosity of Foam under EPET (Task 9b); (f) Development of a Safety program for the ACTS Flow Loop. Progress on a comprehensive safety review of all flow-loop components and operational procedures. (Task 1S); (g) Activities towards technology transfer and developing contacts with Petroleum and service company members, and increasing the number of JIP members.

  18. ADVANCED CUTTINGS TRANSPORT STUDY

    Energy Technology Data Exchange (ETDEWEB)

    Troy Reed; Stefan Miska; Nicholas Takach; Kaveh Ashenayi; Mark Pickell; Len Volk; Mike Volk; Lei Zhou; Zhu Chen; Crystal Redden; Aimee Washington

    2003-01-30

    This is the second quarterly progress report for Year-4 of the ACTS Project. It includes a review of progress made in: (1) Flow Loop construction and development and (2) research tasks during the period of time between October 1, 2002 and December 30, 2002. This report presents a review of progress on the following specific tasks. (a) Design and development of an Advanced Cuttings Transport Facility Task 3: Addition of a Cuttings Injection/Separation System, Task 4: Addition of a Pipe Rotation System. (b) New research project (Task 9b): ''Development of a Foam Generator/Viscometer for Elevated Pressure and Elevated Temperature (EPET) Conditions''. (d) Research project (Task 10): ''Study of Cuttings Transport with Aerated Mud Under Elevated Pressure and Temperature Conditions''. (e) Research on three instrumentation tasks to measure: Cuttings concentration and distribution in a flowing slurry (Task 11), Foam texture while transporting cuttings. (Task 12), and Viscosity of Foam under EPET (Task 9b). (f) New Research project (Task 13): ''Study of Cuttings Transport with Foam under Elevated Pressure and Temperature Conditions''. (g) Development of a Safety program for the ACTS Flow Loop. Progress on a comprehensive safety review of all flow-loop components and operational procedures. (Task 1S). (h) Activities towards technology transfer and developing contacts with Petroleum and service company members, and increasing the number of JIP members.

  19. "ECG variability contour" method reveals amplitude changes in both ischemic patients and normal subjects during Dipyridamole stress: a preliminary report.

    Science.gov (United States)

    Dori, Guy; Gershinsky, Michal; Ben-Haim, Simona; Lewis, Basil S; Bitterman, Haim

    2011-11-01

    To detect and quantify consistent ECG amplitude changes, the "ECG variability contour" (EVC) method was proposed. Using this method we investigated amplitude changes in subjects undergoing myocardial perfusion imaging (MPI) with Dipyridamole (Dp). Fifty-three patients having reversible perfusion defects and 19 normal subjects (NS) who were free of: perfusion defects on their MPI, standard ST-T changes during Dp stress, and a negative clinical follow up. Mean ∏¹() was similar for the NS and patient group (6.2 ± 6.1 vs. 6.3 ± 6.2, P = 0.95). was 4.6 ± 3.0 in patients not having ST-T changes during Dp stress (n = 42), whereas in patients having ST-T changes (n = 11) it was 13.1 ± 10.2 (P was smaller than , which in turn was smaller than . The values of , , and for the NS, patients without and with ST-T changes were: 26.8 ± 28.6, 42.6 ± 41.8, 44.9 ± 36.5; 19.6 ± 20.8, 26.4 ± 31.4, 38.7 ± 27.3; 51.0 ± 30.0, 71.0 ± 36.8, 75.1 ± 20.9, respectively (P EVC method. The EVC method did not distinguish between NS and patients in this clinical setting.

  20. Nanometric mechanical cutting of metallic glass investigated using atomistic simulation

    Science.gov (United States)

    Wu, Cheng-Da; Fang, Te-Hua; Su, Jih-Kai

    2017-02-01

    The effects of cutting depth, tool nose radius, and temperature on the cutting mechanism and mechanics of amorphous NiAl workpieces are studied using molecular dynamics simulations based on the second-moment approximation of the many-body tight-binding potential. These effects are investigated in terms of atomic trajectories and flow field, shear strain, cutting force, resistance factor, cutting ratio, and pile-up characteristics. The simulation results show that a nanoscale chip with a shear plane of 135° is extruded by the tool from a workpiece surface during the cutting process. The workpiece atoms underneath the tool flow upward due to the adhesion force and elastic recovery. The required tangential force and normal force increase with increasing cutting depth and tool nose radius; both forces also increase with decreasing temperature. The resistance factor increases with increasing cutting depth and temperature, and decreases with increasing tool nose radius.

  1. Geometric Modelling by Recursively Cutting Vertices

    Institute of Scientific and Technical Information of China (English)

    吕伟; 梁友栋; 等

    1989-01-01

    In this paper, a new method for curve and surface modelling is introduced which generates curves and surfaces by recursively cutting and grinding polygons and polyhedra. It is a generalization of the existing corner-cutting methods. Properties such as geometric continuity, representation and shape preservation are studied, together with the algorithm, showing that such curves and surfaces are suitable for geometric design in CAD, computer graphics and their application fields.

  2. FIELD EVALUATION OF IMPROVED METHODS FOR MEASURING THE AIR LEAKAGE OF DUCT SYSTEMS UNDER NORMAL OPERATING CONDITIONS IN 51 HOMES

    Energy Technology Data Exchange (ETDEWEB)

    Paul W. Francisco; Larry Palmiter; Erin Kruse; Bob Davis

    2003-10-18

    Duct leakage in forced-air distribution systems has been recognized for years as a major source of energy losses in residential buildings. Unfortunately, the distribution of leakage across homes is far from uniform, and measuring duct leakage under normal operating conditions has proven to be difficult. Recently, two new methods for estimating duct leakage at normal operating conditions have been devised. These are called the nulling test and the Delta-Q test. Small exploratory studies have been done to evaluate these tests, but previously no large-scale study on a broad variety of homes has been performed to determine the accuracy of these new methods in the field against an independent benchmark of leakage. This sort of study is important because it is difficult in a laboratory setting to replicate the range of leakage types found in real homes. This report presents the results of a study on 51 homes to evaluate these new methods relative to an independent benchmark and a method that is currently used. An evaluation of the benchmark procedure found that it worked very well for supply-side leakage measurements, but not as well on the return side. The nulling test was found to perform well, as long as wind effects were minimal. Unfortunately, the time and difficulty of setup can be prohibitive, and it is likely that this method will not be practical for general use by contractors except in homes with no return ducts. The Delta-Q test was found to have a bias resulting in overprediction of the leakage, which qualitatively confirms the results of previous laboratory, simulation, and small-scale field studies. On average the bias was only a few percent of the air handler flow, but in about 20% of the homes the bias was large. A primary flaw with the Delta-Q test is its assumption that the pressure difference between the ducts and the house remains constant during the test, as this assumption does not hold true. Various modifications to the Delta-Q method were evaluated as

  3. In Vitro Toxicity Evaluation of Caffeine Imprinted Polymer (CAF-MIP for Decaffeination Method on Normal Chang Liver Cells

    Directory of Open Access Journals (Sweden)

    Fatimah Hashim

    2017-04-01

    Full Text Available Overconsumption of caffeine is one of the contributing factors to several health problems such as insomnia, hypertension and cardiovascular disease. This preliminary study was conducted to evaluate the toxicity of a Caffeine-Imprinted Polymer (CAF-MIP) synthesized as a new alternative method for decaffeination. It is crucial to evaluate the toxicity of CAF-MIP because this product could potentially be used as a complement to any drink containing caffeine. In this study, the toxicity potential of CAF-MIP was assessed on Normal Chang Liver cells (NCLC) based on its IC50 value and on acridine orange and propidium iodide (AO/PI) staining for observation of the mode of cell death. A proliferation assay was also conducted after 24, 48 and 72 hours at 30 µg/ml on NCLC, and it showed that CAF-MIP promotes NCLC growth, as various concentrations of CAF-MIP increased the percentage of NCLC viability. Observation under a light microscope of NCLC incubated with CAF-MIP and NIP showed normal, viable cell morphology with cuboidal, monolayer cells, which appeared as green fluorescence when viewed under a fluorescence microscope. In conclusion, this study shows that CAF-MIP does not initiate toxic effects on human liver cells, while induction of cell proliferation was observed.

  4. A NEW METHOD TO CONSTRUCT A FULL-LENGTH cDNA LIBRARY OF HUMAN NORMAL BLADDER TISSUE

    Institute of Scientific and Technical Information of China (English)

    成瑜; 李旭; 陈葳; 杨玉琮; 赵乐

    2003-01-01

    Objective To use the template-switch mechanism at the 5'-end of mRNA (SMART) technique to construct a full-length cDNA library of human normal bladder tissue. Methods The novel procedure used the template-switching activity of PowerScript reverse transcriptase to synthesize and anchor first-strand cDNA in one step. Following reverse transcription, 5 cycles of PCR were performed using a modified oligo(dT) primer and an anchor primer to enrich the full-length cDNA population from 1.0 µg of human normal bladder poly(A)+ RNA, then double-strand cDNA was synthesized. After digestion with SfiI and size-fractionation on CHROMA SPIN-400 columns, the double-strand cDNA was ligated into the λTriplEx2 vector and packaged. We determined the titer of the primary library and the percentage of recombinant clones, and finally amplified the library. Results The titer of the constructed cDNA library was 2.1×10^6 pfu*mL^-1, and that of the amplified cDNA library was 6×10^11 pfu*mL^-1; the percentage of recombinant clones was 99%. Conclusion Using the SMART technique helps us to construct a full-length cDNA library with high efficiency and high capacity, which lays a solid foundation for screening target genes of bladder diseases with probes and antibodies.

  5. Benefits of explosive cutting for nuclear-facility applications

    Energy Technology Data Exchange (ETDEWEB)

    Hazelton, R.F.; Lundgren, R.A.; Allen, R.P.

    1981-06-01

    The study discussed in this report was a cost/benefit analysis to determine: (1) whether explosive cutting is cost effective in comparison with alternative metal sectioning methods and (2) whether explosive cutting would reduce radiation exposure or provide other benefits. Two separate approaches were pursued. The first was to qualitatively assess cutting methods and factors involved in typical sectioning cases and then compare the results for the cutting methods. The second was to prepare estimates of work schedules and potential radiation exposures for candidate sectioning methods for two hypothetical, but typical, sectioning tasks. The analysis shows that explosive cutting would be cost effective and would also reduce radiation exposure when used for typical nuclear facility sectioning tasks. These results indicate that explosive cutting should be one of the principal cutting methods considered whenever steel or similar metal structures or equipment in a nuclear facility are to be sectioned for repair or decommissioning. 13 figures, 7 tables. (DLC)

  6. Functional anatomy of the water transport system in cut chrysanthemum

    NARCIS (Netherlands)

    Nijsse, J.

    2001-01-01

    Cut flowers show a wide variance in keepability. The market increasingly demands a guaranteed quality. Therefore, methods must be developed to predict the vase life of cut flowers. Chrysanthemum (Dendranthema x grandiflorum Tzvelev) and some other cut flowers suffer from unpredicted early leaf wilting

  8. NORMAL INCIDENCE SOUND ABSORPTION COEFFICIENT OF DIRECT PIERCING CARVED WOOD PANEL WITH DAUN SIREH MOTIF USING BOUNDARY ELEMENT METHOD

    Directory of Open Access Journals (Sweden)

    Mohd Zamri Jusoh

    2013-06-01

    Full Text Available The Direct Piercing Carved Wood Panel (DPCWP installed in Masjid Abidin, Kuala Terengganu, is one example that carries much aesthetic and artistic value. The use of DPCWP in earlier mosques was envisaged to improve the intelligibility of indoor speech because the perforated panels allow some of the sound energy to pass through. In this paper, the normal incidence sound absorption coefficient of DPCWP with Daun Sireh motif, which is a form of floral pattern, is discussed. The Daun Sireh motif was chosen and investigated for 30%, 35%, 40%, and 45% perforation ratios. The simulations were conducted using BEASY Acoustic Software based on the boundary element method. The simulation results were compared with measurements obtained by using the sound intensity technique. An accompanying discussion on both the numerical and the measurement tendencies of the sound absorption characteristics of the DPCWP is provided. The results show that the DPCWP with Daun Sireh motif can act as a good sound absorber.

  9. Attenuation of Lg waves in the New Madrid seismic zone of the central United States using the coda normalization method

    Science.gov (United States)

    Nazemi, Nima; Pezeshk, Shahram; Sedaghati, Farhad

    2017-08-01

    Unique properties of coda waves are employed to evaluate the frequency dependent quality factor of Lg waves using the coda normalization method in the New Madrid seismic zone of the central United States. Instrument and site responses are eliminated and source functions are isolated to construct the inversion problem. For this purpose, we used 121 seismograms from 37 events with moment magnitudes, M, ranging from 2.5 to 5.2 and hypocentral distances from 120 to 440 km recorded by 11 broadband stations. A singular value decomposition (SVD) algorithm is used to extract Q values from the data, while the geometric spreading exponent is assumed to be a constant. Inversion results are then fitted with a power law equation from 3 to 12 Hz to derive the frequency dependent quality factor function. The final results of the analysis are Q_Lg^V(f) = (410 ± 38) f^(0.49 ± 0.05) for the vertical component and Q_Lg^H(f) = (390 ± 26) f^(0.56 ± 0.04) for the horizontal component, where the term after the ± sign represents one standard error. For stations within the Mississippi embayment with an average sediment depth of 1 km around the Memphis metropolitan area, estimation of quality factor using the coda normalization method is not well-constrained at low frequencies (f < 3 Hz). There may be several reasons contributing to this issue, such as low frequency surface wave contamination, site effects, or even a change in coda wave scattering regime which can exacerbate the scatter of the data.
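    The final fitting step described above reduces to estimating Q0 and the frequency exponent of a power law. A minimal sketch, with made-up numbers, is shown below; it fits Q(f) = Q0 * f^eta by least squares in log-log space.

```python
# Hedged sketch of the power-law fit Q(f) = Q0 * f**eta; the data below are synthetic.
import numpy as np

def fit_q_power_law(freqs_hz: np.ndarray, q_values: np.ndarray) -> tuple[float, float]:
    """Return (Q0, eta) via least squares on log Q = log Q0 + eta * log f."""
    eta, log_q0 = np.polyfit(np.log(freqs_hz), np.log(q_values), deg=1)
    return float(np.exp(log_q0)), float(eta)

freqs = np.array([3.0, 4.0, 6.0, 8.0, 10.0, 12.0])
q_obs = 410.0 * freqs ** 0.49 * np.exp(np.random.default_rng(0).normal(0, 0.05, freqs.size))
q0, eta = fit_q_power_law(freqs, q_obs)   # recovers values close to (410, 0.49)
print(q0, eta)
```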

  10. Computing Normal Shock-Isotropic Turbulence Interaction With Tetrahedral Meshes and the Space-Time CESE Method

    Science.gov (United States)

    Venkatachari, Balaji Shankar; Chang, Chau-Lyan

    2016-11-01

    The focus of this study is scale-resolving simulations of the canonical normal shock- isotropic turbulence interaction using unstructured tetrahedral meshes and the space-time conservation element solution element (CESE) method. Despite decades of development in unstructured mesh methods and its potential benefits of ease of mesh generation around complex geometries and mesh adaptation, direct numerical or large-eddy simulations of turbulent flows are predominantly carried out using structured hexahedral meshes. This is due to the lack of consistent multi-dimensional numerical formulations in conventional schemes for unstructured meshes that can resolve multiple physical scales and flow discontinuities simultaneously. The CESE method - due to its Riemann-solver-free shock capturing capabilities, non-dissipative baseline schemes, and flux conservation in time as well as space - has the potential to accurately simulate turbulent flows using tetrahedral meshes. As part of the study, various regimes of the shock-turbulence interaction (wrinkled and broken shock regimes) will be investigated along with a study on how adaptive refinement of tetrahedral meshes benefits this problem. The research funding for this paper has been provided by Revolutionary Computational Aerosciences (RCA) subproject under the NASA Transformative Aeronautics Concepts Program (TACP).

  11. A New Quantitative Method for the Non-Invasive Documentation of Morphological Damage in Paintings Using RTI Surface Normals

    Directory of Open Access Journals (Sweden)

    Marcello Manfredi

    2014-07-01

    Full Text Available In this paper we propose a reliable surface imaging method for the non-invasive detection of morphological changes in paintings. Usually, the evaluation and quantification of changes and defects results mostly from an optical and subjective assessment, through the comparison of the previous and subsequent state of conservation and by means of condition reports. Using quantitative Reflectance Transformation Imaging (RTI we obtain detailed information on the geometry and morphology of the painting surface with a fast, precise and non-invasive method. Accurate and quantitative measurements of deterioration were acquired after the painting experienced artificial damage. Morphological changes were documented using normal vector images while the intensity map succeeded in highlighting, quantifying and describing the physical changes. We estimate that the technique can detect a morphological damage slightly smaller than 0.3 mm, which would be difficult to detect with the eye, considering the painting size. This non-invasive tool could be very useful, for example, to examine paintings and artwork before they travel on loan or during a restoration. The method lends itself to automated analysis of large images and datasets. Quantitative RTI thus eases the transition of extending human vision into the realm of measuring change over time.

  12. A new quantitative method for the non-invasive documentation of morphological damage in paintings using RTI surface normals.

    Science.gov (United States)

    Manfredi, Marcello; Bearman, Greg; Williamson, Greg; Kronkright, Dale; Doehne, Eric; Jacobs, Megan; Marengo, Emilio

    2014-07-09

    In this paper we propose a reliable surface imaging method for the non-invasive detection of morphological changes in paintings. Usually, the evaluation and quantification of changes and defects results mostly from an optical and subjective assessment, through the comparison of the previous and subsequent state of conservation and by means of condition reports. Using quantitative Reflectance Transformation Imaging (RTI) we obtain detailed information on the geometry and morphology of the painting surface with a fast, precise and non-invasive method. Accurate and quantitative measurements of deterioration were acquired after the painting experienced artificial damage. Morphological changes were documented using normal vector images while the intensity map succeeded in highlighting, quantifying and describing the physical changes. We estimate that the technique can detect a morphological damage slightly smaller than 0.3 mm, which would be difficult to detect with the eye, considering the painting size. This non-invasive tool could be very useful, for example, to examine paintings and artwork before they travel on loan or during a restoration. The method lends itself to automated analysis of large images and datasets. Quantitative RTI thus eases the transition of extending human vision into the realm of measuring change over time.
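    One common way to obtain per-pixel surface normals from an RTI capture is a Lambertian photometric-stereo least-squares fit, sketched below under the assumption of known light directions; the authors' actual fitting procedure (e.g. PTM- or HSH-based) may differ, and the inputs are illustrative.

```python
# Hedged sketch: per-pixel surface normals from an RTI image stack, assuming a Lambertian
# surface and known light directions. This is not the authors' exact fitting procedure.
import numpy as np

def normals_from_rti(intensities: np.ndarray, light_dirs: np.ndarray) -> np.ndarray:
    """
    intensities: (K, H, W) stack of images, one per light position.
    light_dirs:  (K, 3) unit light direction vectors.
    Returns an (H, W, 3) array of unit surface normals (Lambertian model I = L @ n).
    """
    k, h, w = intensities.shape
    i_flat = intensities.reshape(k, -1)                       # (K, H*W)
    g = np.linalg.lstsq(light_dirs, i_flat, rcond=None)[0]    # (3, H*W), albedo-scaled normals
    norm = np.linalg.norm(g, axis=0, keepdims=True) + 1e-12
    return (g / norm).T.reshape(h, w, 3)

# Change over time could then be quantified, e.g., as the per-pixel angle between
# normal maps acquired before and after a damage event.
```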

  13. A Study on the Allowable Safety Factor of Cut-Slopes for Nuclear Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Myung Soo; Yee, Eric [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-10-15

    In this study, issues with the allowable safety factor design criteria for cut-slopes in nuclear facilities are derived through case analysis. The construction-work slope design criteria, which provide relatively detailed conditions, can be applied for the dry season, while some unclear parts of the slope design criteria should be modified for the rainy season. The safety factor is subdivided into two cases, normal and earthquake conditions: a factor of 1.5 is applied for normal conditions and a factor of 1.2 for seismic conditions, taking into consideration the effect of ground water and rainfall. However, no criteria for cut-slopes in nuclear facilities and their response to seismic conditions are clearly defined, which can cause uncertainty in design. Therefore, this paper investigates the allowable safety factor for cut-slopes in nuclear facilities, reviews conditions of both local and international cut-slope models and finally suggests an alternative method of analysis. It is expected that the new design criteria will adequately ensure the stability of the cut-slope and reflect clear conditions for both the supervising and design engineers.

  14. Improved cutting performance in high power laser cutting

    DEFF Research Database (Denmark)

    Olsen, Flemming Ove

    2003-01-01

    Recent results in high power laser cutting especially with focus on cutting of mild grade steel types for shipbuilding are described.

  16. Transformation of an empirical distribution to normal distribution by the use of Johnson system of translation and symmetrical quantile method

    OpenAIRE

    Ludvík Friebel; Jana Friebelová

    2006-01-01

    This article deals with the approximation of an empirical distribution by the standard normal distribution using the Johnson transformation. This transformation enables us to approximate a wide spectrum of continuous distributions with a normal distribution. The estimation of the parameters of the transformation formulas is based on percentiles of the empirical distribution. Theoretical probability distribution functions of the random variable obtained by the backward transformation of the standard normal ...
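    A minimal sketch of the forward Johnson transformation is given below. The article estimates the parameters from percentiles (the symmetrical quantile method); scipy's maximum-likelihood fit of a Johnson S_U distribution is used here purely as an illustration, and the generated sample is synthetic.

```python
# Hedged sketch: transforming a skewed sample toward standard normality with a Johnson S_U fit.
# The article uses percentile-based parameter estimation; scipy's MLE fit is used here only
# for illustration on synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.lognormal(mean=0.0, sigma=0.6, size=2000)        # skewed empirical sample

a, b, loc, scale = stats.johnsonsu.fit(x)                # gamma, delta, xi, lambda
z = a + b * np.arcsinh((x - loc) / scale)                # Johnson S_U forward transform

# z should now be approximately standard normal; the backward transform is
# x = loc + scale * np.sinh((z - a) / b).
stat, p = stats.shapiro(z[:500])                         # rough normality check
print(stat, p)
```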

  17. Two stage optimization method of cut order planning for apparel mass customization

    Institute of Scientific and Technical Information of China (English)

    刘艳梅; 颜少聪; 纪杨建; 祁国宁

    2012-01-01

    To solve the problems of the large number of sizes and the irregular quantity of each size in cut order planning (COP) for apparel mass customization, a mathematical model of COP was built and a two-stage optimization method based on probability search and a genetic algorithm was proposed. In the first stage, several initial cut-table ply plans satisfying the production constraints are generated randomly; combining a search algorithm with probability, the optimal size-combination plan and the corresponding initial cut-table ply plans are obtained according to the principle of using the minimum number of cut tables. In the second stage, based on the optimal size-combination plan from the first stage, a genetic algorithm optimizes the ply counts again, under the constraint that the proportion of surplus garments produced beyond the order does not exceed the maximum allowed by the enterprise, yielding the optimal cut-table ply plan. A practical production case was solved with both the two-stage method and the manual, experience-based method. The results and their comparison show that, under the same production conditions, the two-stage optimization method can rapidly produce a cut order plan, reduce the number of spreads, save fabric and lower cost.
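    A much-simplified, hedged sketch of the stage-two idea is given below: for fixed markers (size combinations per cut table) from stage one, a small genetic algorithm searches for ply counts that satisfy the order while keeping overproduction within an allowed ratio. The demand figures, markers, fitness shape and GA settings are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of the stage-two search: choose ply counts per cut table so the order is met
# and overproduction stays within a cap, minimizing total plies. All numbers are illustrative.
import numpy as np

order = np.array([120, 340, 510, 260])            # demand per size (illustrative)
markers = np.array([[1, 1, 0, 0],                 # garments cut per ply, for each cut table
                    [0, 1, 2, 0],
                    [0, 0, 1, 2]])
max_over = 0.05 * order.sum()                     # overproduction cap (assumption)
rng = np.random.default_rng(0)

def cost(plies: np.ndarray) -> float:
    """Penalized objective: minimize total plies, forbid shortage and excessive surplus."""
    produced = plies @ markers
    shortage = np.clip(order - produced, 0, None).sum()
    surplus = np.clip(produced - order, 0, None).sum()
    penalty = 1000.0 * shortage + 1000.0 * max(0.0, surplus - max_over)
    return plies.sum() + penalty

pop = rng.integers(0, 300, size=(60, markers.shape[0]))        # initial population of ply plans
for _ in range(400):                                           # generations
    scores = np.array([cost(p) for p in pop])
    parents = pop[np.argsort(scores)[:30]]                     # truncation selection
    mates = parents[rng.integers(0, 30, size=30)]
    mask = rng.random(parents.shape) < 0.5
    children = np.where(mask, parents, mates)                  # uniform crossover
    children = children + rng.integers(-5, 6, size=children.shape) * (rng.random(children.shape) < 0.2)
    pop = np.vstack([parents, np.clip(children, 0, None)])     # elitism + mutated offspring

best = min(pop, key=cost)
print(best, best @ markers, cost(best))
```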

  18. Determination of cut front position in laser cutting

    Science.gov (United States)

    Pereira, M.; Thombansen, U.

    2016-07-01

    Laser cutting is of great importance to the manufacturing industry. Laser cutting machines operate with fixed technological parameters, and this does not guarantee the best productivity. Adjusting the cutting parameters during operation can improve machine performance. Based on a coaxial measuring device it is possible to identify the cut front position during the cutting process. This paper describes the data analysis approach used to determine the cut front position for different feed rates. The cut front position was determined with good resolution, but improvements are needed to make the whole process more stable.

  19. Investigation of formation mechanisms of chips in orthogonal cutting process

    Directory of Open Access Journals (Sweden)

    Ma W.

    2012-08-01

    Full Text Available This work investigates the formation mechanisms of chips in orthogonal cutting of mild steel and the transformation conditions between chips of various morphologies. The workpiece material is assumed to follow the Johnson-Cook constitutive model. In the orthogonal cutting process, both the plastic flow and the instability behavior of the chip material are caused by plane strain loading. Therefore, the general instability behavior of materials in the plane strain state is first analyzed with a linear perturbation method and a universal instability criterion is established. Based on the analytical results, the formation mechanisms of chips and the transformation conditions between continuous and serrated chips are further studied by the instability phase diagram method. The results show that chip formation strongly depends on the intensity ratios between shear and normal stresses. The ratios of the dissipation rates of plastic work done by compression and shear stresses govern the transformation from continuous to serrated chips. These results are verified by numerical simulations of the orthogonal cutting process.
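    For reference, the Johnson-Cook constitutive model named above can be evaluated as sketched below; the parameter values are illustrative assumptions for a mild steel and are not taken from the paper.

```python
# Hedged sketch of the Johnson-Cook flow stress; parameter values are illustrative assumptions.
import numpy as np

def johnson_cook_stress(strain, strain_rate, temp_k,
                        A=360e6, B=635e6, n=0.28, C=0.075, m=1.0,
                        ref_rate=1.0, t_room=293.0, t_melt=1793.0):
    """Flow stress sigma = (A + B*eps^n) * (1 + C*ln(rate/ref)) * (1 - T*^m), in Pa."""
    t_star = np.clip((temp_k - t_room) / (t_melt - t_room), 0.0, 1.0)
    return (A + B * strain ** n) * (1.0 + C * np.log(strain_rate / ref_rate)) * (1.0 - t_star ** m)

# Example: flow stress at 30% plastic strain, a cutting-like strain rate of 1e4 1/s, 600 K.
print(johnson_cook_stress(0.30, 1.0e4, 600.0) / 1e6, "MPa")
```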

  20. ADVANCED CUTTINGS TRANSPORT STUDY

    Energy Technology Data Exchange (ETDEWEB)

    Troy Reed; Stefan Miska; Nicholas Takach; Kaveh Ashenayi; Mark Pickell; Len Volk, Mike Volk; Lei Zhou; Zhu Chen; Crystal Redden; Aimee Washington

    2002-10-30

    This is the first quarterly progress report for Year-4 of the ACTS Project. It includes a review of progress made in: (1) Flow Loop construction and development and (2) research tasks during the period of time between July 1, 2002 and Sept. 30, 2002. This report presents a review of progress on the following specific tasks: (a) Design and development of an Advanced Cuttings Transport Facility Task 3: Addition of a Cuttings Injection/Separation System, Task 4: Addition of a Pipe Rotation System, (b) New Research project (Task 9b): ''Development of a Foam Generator/Viscometer for Elevated Pressure and Elevated Temperature (EPET) Conditions'', (d) Research project (Task 10): ''Study of Cuttings Transport with Aerated Mud Under Elevated Pressure and Temperature Conditions'', (e) Research on three instrumentation tasks to measure: Cuttings concentration and distribution in a flowing slurry (Task 11), Foam texture while transporting cuttings (Task 12), Viscosity of Foam under EPET (Task 9b). (f) Development of a Safety program for the ACTS Flow Loop. Progress on a comprehensive safety review of all flow-loop components and operational procedures. (Task 1S). (g) Activities towards technology transfer and developing contacts with Petroleum and service company members, and increasing the number of JIP members.

  1. Investigation of cutting-induced damage in CMC bend bars

    Directory of Open Access Journals (Sweden)

    Neubrand A.

    2015-01-01

    Full Text Available Ceramic matrix composites (“CMC”) with a strong fibre-matrix interface can be made damage-tolerant by introducing a highly porous matrix. Such composites typically have only a low interlaminar shear strength, which can potentially promote damage when preparing specimens or components by cutting. In order to investigate the damage induced by different cutting methods, waterjet cutting with and without abrasives, laser-cutting, wire eroding and cutoff grinding were used to cut plates of two different CMCs with a matrix porosity up to 35 vol.-%. For each combination of cutting method and composite, the flexural and interlaminar shear strength of the resulting specimens was determined. Additionally, the integrity of the regions near the cut surfaces was investigated by high-resolution x-ray computer tomography. It could be shown that the geometrical quality of the cut is strongly affected by the cutting method employed. Laser cut and waterjet cut specimens showed damage and delaminations near the cut surface leading to a reduced interlaminar shear strength of short bend bars in extreme cases.

  2. Automated method to compute Evans index for diagnosis of idiopathic normal pressure hydrocephalus on brain CT images

    Science.gov (United States)

    Takahashi, Noriyuki; Kinoshita, Toshibumi; Ohmura, Tomomi; Matsuyama, Eri; Toyoshima, Hideto

    2017-03-01

    The early diagnosis of idiopathic normal pressure hydrocephalus (iNPH), considered a treatable dementia, is important. The iNPH causes enlargement of the lateral ventricles (LVs). The degree of the enlargement of the LVs on CT or MR images is evaluated by using a diagnostic imaging criterion, Evans index. Evans index is defined as the ratio of the maximal width of the frontal horns (FH) of the LVs to the maximal width of the inner skull (IS). Evans index is the most commonly used parameter for the evaluation of ventricular enlargement. However, manual measurement of Evans index is a time-consuming process. In this study, we present an automated method to compute Evans index on brain CT images. The algorithm of the method consists of five major steps: standardization of CT data to an atlas, extraction of FH and IS regions, the search for the outermost points of the bilateral FH regions, determination of the maximal widths of both the FH and the IS, and calculation of Evans index. The standardization to the atlas was performed by using linear affine transformation and non-linear warping techniques. The FH regions were segmented by using a three-dimensional region growing technique. This scheme was applied to CT scans from 44 subjects, including 13 iNPH patients. The average difference in Evans index between the proposed method and manual measurement was 0.01 (1.6%), and the correlation coefficient of these data for the Evans index was 0.98. Therefore, this computerized method may have the potential to accurately compute Evans index for the diagnosis of iNPH on CT images.
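    Once binary masks of the frontal horns and the inner skull are available on an axial slice (steps 1-2 of the algorithm above), the remaining steps reduce to width measurements and a ratio. The sketch below assumes such masks and an isotropic in-plane pixel spacing; it is not the authors' code.

```python
# Hedged sketch: Evans index from binary masks on an axial slice (columns = left-right).
# Mask preparation (atlas registration, region growing) is assumed to have been done elsewhere.
import numpy as np

def max_width_mm(mask2d: np.ndarray, pixel_spacing_mm: float) -> float:
    """Maximal left-right extent of a 2-D binary mask, in mm."""
    cols = np.where(mask2d.any(axis=0))[0]
    return (cols.max() - cols.min() + 1) * pixel_spacing_mm if cols.size else 0.0

def evans_index(fh_mask: np.ndarray, is_mask: np.ndarray, pixel_spacing_mm: float) -> float:
    """Ratio of maximal frontal-horn width to maximal inner-skull width."""
    return max_width_mm(fh_mask, pixel_spacing_mm) / max_width_mm(is_mask, pixel_spacing_mm)

# Values above roughly 0.3 are commonly read as ventricular enlargement.
```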

  3. A novel mean-centering method for normalizing microRNA expression from high-throughput RT-qPCR data

    Directory of Open Access Journals (Sweden)

    Wylie Dennis

    2011-12-01

    Full Text Available Abstract Background Normalization is critical for accurate gene expression analysis. A significant challenge in the quantitation of gene expression from biofluids samples is the inability to quantify RNA concentration prior to analysis, underscoring the need for robust normalization tools for this sample type. In this investigation, we evaluated various methods of normalization to determine the optimal approach for quantifying microRNA (miRNA) expression from biofluids and tissue samples when using the TaqMan® Megaplex™ high-throughput RT-qPCR platform with low RNA inputs. Findings We compared seven normalization methods in the analysis of variation of miRNA expression from biofluid and tissue samples. We developed a novel variant of the common mean-centering normalization strategy, herein referred to as mean-centering restricted (MCR) normalization, which is adapted to the TaqMan Megaplex RT-qPCR platform, but is likely applicable to other high-throughput RT-qPCR-based platforms. Our results indicate that MCR normalization performs comparably to or better than both standard mean-centering and other normalization methods. We also propose an extension of this method to be used when migrating biomarker signatures from Megaplex to singleplex RT-qPCR platforms, based on the identification of a small number of normalizer miRNAs that closely track the mean of expressed miRNAs. Conclusions We developed the MCR method for normalizing miRNA expression from biofluids samples when using the TaqMan Megaplex RT-qPCR platform. Our results suggest that normalization based on the mean of all fully observed (fully detected) miRNAs minimizes technical variance in normalized expression values, and that a small number of normalizer miRNAs can be selected when migrating from Megaplex to singleplex assays. In our study, we find that normalization methods that focus on a restricted set of miRNAs tend to perform better than methods that focus on all miRNAs, including
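    A minimal sketch of the mean-centering restricted idea as described is given below: each sample's Cq values are shifted by that sample's mean Cq over the miRNAs detected in every sample. Detection calls and Megaplex-specific handling are simplified assumptions.

```python
# Hedged sketch of mean-centering restricted (MCR)-style normalization for Cq data.
# Detection handling is simplified; the authors' implementation may differ in detail.
import numpy as np

def mean_center_restricted(cq: np.ndarray) -> np.ndarray:
    """
    cq: (n_samples, n_mirnas) Cq matrix with np.nan for undetected assays.
    Returns Cq values mean-centered per sample using only fully detected miRNAs.
    """
    fully_detected = ~np.isnan(cq).any(axis=0)               # miRNAs observed in all samples
    sample_means = cq[:, fully_detected].mean(axis=1, keepdims=True)
    return cq - sample_means                                  # delta-Cq style normalized values
```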

  4. Dealing with Cuts (For Parents)

    Science.gov (United States)

  5. Theoretical Models for Orthogonal Cutting

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo

    This review of simple models for orthogonal cutting was extracted from: “L. De Chiffre: Metal Cutting Mechanics and Applications, D.Sc. Thesis, Technical University of Denmark, 1990.”

  7. Comparison between grafting and cutting as vegetative propagation methods for conilon coffee plants - doi: 10.4025/actasciagron.v35i4.16917

    Directory of Open Access Journals (Sweden)

    Saul de Andrade Júnior

    2013-05-01

    Full Text Available The purpose of this study was to assess the growth of conilon coffee tree plantlets that were propagated by grafting and cutting. The experiment was conducted at the plantlet production site of Incaper’s Experimental Farm in the city of Marilândia, Espírito Santo State. For grafting, plantlets derived from the seed propagation of Coffea canephora cv. Robusta Tropical (ENCAPER 8151) were used as rootstocks, and six clones of cv. Conilon Vitória (INCAPER 8142) were used as the grafts. The cutting was performed with the same six clones that were used for grafting. The experimental design consisted of randomized blocks of twelve treatments with five repetitions composed of twelve plantlets. On the hundred and fifth day, the averages of the variables were assessed and compared by the Scheffé test at a probability of 5%. The grafted plantlets were superior for almost all of the characteristics assessed, which suggests that it is possible to propagate conilon coffee trees by grafting.

  8. ADVANCED CUTTINGS TRANSPORT STUDY

    Energy Technology Data Exchange (ETDEWEB)

    Troy Reed; Stefan Miska; Nicholas Takach; Kaveh Ashenayi; Gerald Kane; Mark Pickell; Len Volk; Mike Volk; Barkim Demirdal; Affonso Lourenco; Evren Ozbayoglu; Paco Vieira; Lei Zhou

    2000-01-30

    This is the second quarterly progress report for Year 2 of the ACTS project. It includes a review of progress made in Flow Loop development and research during the period of time between Oct 1, 2000 and December 31, 2000. This report presents a review of progress on the following specific tasks: (a) Design and development of an Advanced Cuttings Transport Facility (Task 2: Addition of a foam generation and breaker system), (b) Research project (Task 6): ''Study of Cuttings Transport with Foam Under LPAT Conditions (Joint Project with TUDRP)'', (c) Research project (Task 7): ''Study of Cuttings Transport with Aerated Muds Under LPAT Conditions (Joint Project with TUDRP)'', (d) Research project (Task 8): ''Study of Flow of Synthetic Drilling Fluids Under Elevated Pressure and Temperature Conditions'', (e) Research project (Task 9): ''Study of Foam Flow Behavior Under EPET Conditions'', (f) Research project (Task 10): ''Study of Cuttings Transport with Aerated Mud Under Elevated Pressure and Temperature Conditions'', (g) Research on instrumentation tasks to measure: Cuttings concentration and distribution in a flowing slurry (Task 11), and Foam properties while transporting cuttings. (Task 12), (h) Development of a Safety program for the ACTS Flow Loop. Progress on a comprehensive safety review of all flow-loop components and operational procedures. (Task 1S). (i) Activities towards technology transfer and developing contacts with Petroleum and service company members, and increasing the number of JIP members. The tasks Completed During This Quarter are Task 7 and Task 8.

  9. The perfect cut

    DEFF Research Database (Denmark)

    Scozzafava, G.; Mueller Loose, Simone; Corsi, A.

    (organic, standard, GMO free). The cross-price elasticity provides insights to which degree different cuts compete against each other from a consumer perspective and how price premiums can be achieved by producers and marketers with certification and labeling strategies. The paper will also provide...... other from the consumer perspective dependent on price, intrinsic and extrinsic product characteristics as well as intended usage. So far, there is limited knowledge about optimal marketing and pricing of meat cuts simultaneously offered at the retail shelf. Results from an online choice experiment...

  10. Analysis of Theoretical Calculation Method of Blasting Cut Height in Demolishing Building by Controlled Blasting

    Institute of Scientific and Technical Information of China (English)

    林大能; 邓新文

    2001-01-01

    In this paper, the development of research on calculating the blasting cut height when demolishing buildings by controlled blasting is reviewed. Based on energy conversion theory, criteria for determining the minimum blasting cut height are given. The criteria are simplified by means of series theory, and a practical method of calculating the blasting cut height is given. The analysis of practical examples illustrates that this practical calculation method provides useful guidance for demolishing buildings by controlled blasting.

  11. ADVANCED CUTTINGS TRANSPORT STUDY

    Energy Technology Data Exchange (ETDEWEB)

    Stefan Miska; Troy Reed; Ergun Kuru

    2004-09-30

    The Advanced Cuttings Transport Study (ACTS) was a 5-year JIP project undertaken at the University of Tulsa (TU). The project was sponsored by the U.S. Department of Energy (DOE) and JIP member companies. The objectives of the project were: (1) to develop and construct a new research facility that would allow three-phase (gas, liquid and cuttings) flow experiments under ambient and EPET (elevated pressure and temperature) conditions, and at different angle of inclinations and drill pipe rotation speeds; (2) to conduct experiments and develop a data base for the industry and academia; and (3) to develop mechanistic models for optimization of drilling hydraulics and cuttings transport. This project consisted of research studies, flow loop construction and instrumentation development. Following a one-year period for basic flow loop construction, a proposal was submitted by TU to the DOE for a five-year project that was organized in such a manner as to provide a logical progression of research experiments as well as additions to the basic flow loop. The flow loop additions and improvements included: (1) elevated temperature capability; (2) two-phase (gas and liquid, foam etc.) capability; (3) cuttings injection and removal system; (4) drill pipe rotation system; and (5) drilling section elevation system. In parallel with the flow loop construction, hydraulics and cuttings transport studies were preformed using drilling foams and aerated muds. In addition, hydraulics and rheology of synthetic drilling fluids were investigated. The studies were performed under ambient and EPET conditions. The effects of temperature and pressure on the hydraulics and cuttings transport were investigated. Mechanistic models were developed to predict frictional pressure loss and cuttings transport in horizontal and near-horizontal configurations. Model predictions were compared with the measured data. Predominantly, model predictions show satisfactory agreements with the measured data. As a

  12. Investigation of Normalization Methods using Plasma Parameters for Laser Induced Breakdown Spectroscopy (LIBS) under simulated Martian Conditions

    OpenAIRE

    Vogt, David; Schröder, Susanne; Hübers, H.-W.

    2017-01-01

    Laser Induced Breakdown Spectroscopy data need to be normalized, especially in the field of planetary exploration. We investigated plasma parameters such as temperature and electron density for this purpose.

  13. A comparison of methods used to calculate normal background concentrations of potentially toxic elements for urban soil

    Energy Technology Data Exchange (ETDEWEB)

    Rothwell, Katherine A., E-mail: k.rothwell@ncl.ac.uk; Cooke, Martin P., E-mail: martin.cooke@ncl.ac.uk

    2015-11-01

    To meet the requirements of regulation and to provide realistic remedial targets there is a need for the background concentration of potentially toxic elements (PTEs) in soils to be considered when assessing contaminated land. In England, normal background concentrations (NBCs) have been published for several priority contaminants for a number of spatial domains however updated regulatory guidance places the responsibility on Local Authorities to set NBCs for their jurisdiction. Due to the unique geochemical nature of urban areas, Local Authorities need to define NBC values specific to their area, which the national data is unable to provide. This study aims to calculate NBC levels for Gateshead, an urban Metropolitan Borough in the North East of England, using freely available data. The ‘median + 2MAD’, boxplot upper whisker and English NBC (according to the method adopted by the British Geological Survey) methods were compared for test PTEs lead, arsenic and cadmium. Due to the lack of systematically collected data for Gateshead in the national soil chemistry database, the use of site investigation (SI) data collected during the planning process was investigated. 12,087 SI soil chemistry data points were incorporated into a database and 27 comparison samples were taken from undisturbed locations across Gateshead. The SI data gave high resolution coverage of the area and Mann–Whitney tests confirmed statistical similarity for the undisturbed comparison samples and the SI data. SI data was successfully used to calculate NBCs for Gateshead and the median + 2MAD method was selected as most appropriate by the Local Authority according to the precautionary principle as it consistently provided the most conservative NBC values. The use of this data set provides a freely available, high resolution source of data that can be used for a range of environmental applications. - Highlights: • The use of site investigation data is proposed for land contamination studies
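    A minimal sketch of the 'median + 2MAD' statistic named above is given below. Whether the MAD is scaled for consistency with a normal distribution, and whether concentrations are log-transformed first, are choices the abstract does not specify, so the plainest version is shown with illustrative numbers.

```python
# Hedged sketch of a 'median + 2MAD' normal background concentration, with MAD taken as the
# raw median absolute deviation (no consistency scaling, no log transform); values are made up.
import numpy as np

def nbc_median_2mad(concentrations: np.ndarray) -> float:
    x = np.asarray(concentrations, dtype=float)
    mad = np.median(np.abs(x - np.median(x)))
    return float(np.median(x) + 2.0 * mad)

lead_mg_kg = np.array([95.0, 120.0, 150.0, 180.0, 210.0, 260.0, 340.0, 410.0])  # illustrative
print(nbc_median_2mad(lead_mg_kg))
```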

  14. Antiepidermal growth factor variant III scFv fragment: effect of radioiodination method on tumor targeting and normal tissue clearance

    Energy Technology Data Exchange (ETDEWEB)

    Shankar, Sriram [Department of Radiology, Duke University Medical Center, Durham, NC 27710 (United States); Vaidyanathan, Ganesan [Department of Radiology, Duke University Medical Center, Durham, NC 27710 (United States); Kuan, C.-T. [Department of Pathology, Duke University Medical Center, Durham, NC 27710 (United States); Bigner, Darell D. [Department of Pathology, Duke University Medical Center, Durham, NC 27710 (United States); Zalutsky, Michael R. [Department of Radiology, Duke University Medical Center, Durham, NC 27710 (United States) and Department of Pathology, Duke University Medical Center, Durham, NC 27710 (United States) and Department of Biomedical Engineering, Duke University, Durham, NC 27708 (United States)]. E-mail: zalut001@duke.edu

    2006-01-15

    Introduction: MR1-1 is a single-chain Fv (scFv) fragment that binds with high affinity to epidermal growth factor receptor variant III, which is overexpressed on gliomas and other tumors but is not present on normal tissues. The objective of this study was to evaluate four different methods for labeling MR1-1 scFv that had previously been investigated for the radioiodination of an intact anti-epidermal growth factor receptor variant III (anti-EGFRvIII) monoclonal antibody (mAb), L8A4. Methods: The MR1-1 scFv was labeled with ¹²⁵I/¹³¹I using the Iodogen method, and was also radiohalogenated with acylation agents bearing substituents that were positively charged (N-succinimidyl-3-[*I]iodo-5-pyridine carboxylate and N-succinimidyl-4-guanidinomethyl-3-[*I]iodobenzoate ([*I]SGMIB)) or negatively charged (N-succinimidyl-3-[*I]iodo-4-phosphonomethylbenzoate ([*I]SIPMB)). In vitro internalization assays were performed with the U87MGΔEGFR cell line, and the tissue distribution of the radioiodinated scFv fragments was evaluated in athymic mice bearing subcutaneous U87MGΔEGFR xenografts. Results and Conclusion: As seen previously with the anti-EGFRvIII IgG mAb, retention of radioiodine activity in U87MGΔEGFR cells in the internalization assay was labeling-method dependent, with SGMIB and SIPMB yielding the most prolonged retention. However, unlike the case with the intact mAb, the results of the internalization assays were not predictive of the in vivo tumor localization capacity of the labeled scFv. Renal activity was dependent on the nature of the labeling method. With MR1-1 labeled using SIPMB, kidney uptake was highest and most prolonged; catabolism studies indicated that this uptake was primarily in the form of ε-N-3-[*I]iodo-4-phosphonomethylbenzoyl lysine.

  15. Comparison of normalization methods for differential gene expression analysis in RNA-Seq experiments: A matter of relative size of studied transcriptomes.

    Science.gov (United States)

    Maza, Elie; Frasse, Pierre; Senin, Pavel; Bouzayen, Mondher; Zouine, Mohamed

    2013-11-01

    In recent years, RNA-Seq technologies have become a powerful tool for transcriptome studies. However, computational methods dedicated to the analysis of high-throughput sequencing data are yet to be standardized. In particular, it is known that the choice of a normalization procedure leads to great variability in the results of differential gene expression analysis. The present study compares the most widespread normalization procedures and proposes a novel one aiming at removing an inherent bias of the studied transcriptomes related to their relative size. Comparisons of the normalization procedures are performed on real and simulated data sets. Analyses of real RNA-Seq data sets, performed with all the different normalization methods, show that only 50% of the significantly differentially expressed genes are common. This result highlights the influence of the normalization step on the differential expression analysis. Real and simulated data set analyses give similar results, showing 3 different groups of procedures having the same behavior. The group including the novel method, named "Median Ratio Normalization" (MRN), gives the lowest number of false discoveries. Within this group the MRN method is less sensitive to the modification of parameters related to the relative size of transcriptomes, such as the number of down- and upregulated genes and the gene expression levels. The newly proposed MRN method efficiently deals with the intrinsic bias resulting from the relative size of the studied transcriptomes. Validation with real and simulated data sets confirmed that MRN is more consistent and robust than existing methods.
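
    The median-of-ratios idea behind this family of procedures can be shown in a short sketch. The Python code below computes per-sample scaling factors as the median of gene-wise log-ratios to a pseudo-reference built from the geometric mean across samples; it is a generic illustration with invented count data and is not claimed to reproduce the exact MRN variant proposed in the paper.

```python
import numpy as np

def median_ratio_factors(counts):
    """counts: genes x samples matrix of raw read counts.
    Returns one scaling factor per sample (median-of-ratios idea)."""
    counts = np.asarray(counts, dtype=float)
    # geometric-mean pseudo-reference, ignoring genes with any zero count
    keep = np.all(counts > 0, axis=1)
    log_ref = np.mean(np.log(counts[keep]), axis=1)
    # per-sample median of log-ratios to the reference
    log_ratios = np.log(counts[keep]) - log_ref[:, None]
    return np.exp(np.median(log_ratios, axis=0))

# Hypothetical counts for 5 genes in 3 samples
counts = np.array([[100, 210,  95],
                   [ 50, 100,  55],
                   [ 30,  66,  28],
                   [400, 820, 390],
                   [  0,  12,   3]])
factors = median_ratio_factors(counts)
normalized = counts / factors          # library-size corrected counts
print(factors)
```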

  16. Comparison between Free Abrasive Method and Diamond Wire Method for SiC Materials Cutting

    Institute of Scientific and Technical Information of China (English)

    王磊; 王添依; 张弛; 张海磊; 冯玢

    2014-01-01

    SiC ingots were cut using two methods: copper-plated steel wire with free abrasive slurry, and diamond wire with slurry. On the basis of a comparison of the cutting results of the two methods, their effects on the surface quality and warp of the SiC wafers were examined, and the advantages of each method were compared. The comparison shows that the copper-plated steel wire with free abrasive method produces SiC wafers with better surface quality but takes longer, making it suitable for small-scale experiments, whereas the diamond wire method produces wafers with better and more stable geometric parameters at higher processing efficiency, making it suitable for large-scale production.

  17. "Kid Cuts" by Broderbund.

    Science.gov (United States)

    Martin, Ron

    1994-01-01

    Describes "Kid Cuts," an arts and crafts computer software program for students in prekindergarten through sixth grade that provides 22 activities in 6 curriculum areas. An example is given of an activity for kindergarten and first graders related to counting that includes library media skills objectives and mathematics objectives. (LRW)

  18. Cutting Cakes Carefully

    Science.gov (United States)

    Hill, Theodore P.; Morrison, Kent E.

    2010-01-01

    This paper surveys the fascinating mathematics of fair division, and provides a suite of examples using basic ideas from algebra, calculus, and probability which can be used to examine and test new and sometimes complex mathematical theories and claims involving fair division. Conversely, the classical cut-and-choose and moving-knife algorithms…
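
    For readers who have not seen the classical procedure named here, the following minimal Python sketch shows two-agent cut-and-choose over the unit-interval "cake": the cutter splits the cake into two pieces she values equally, and the chooser takes the piece he prefers. The valuation densities are invented, and the sketch illustrates only the classical algorithm, not the paper's examples.

```python
from scipy.integrate import quad
from scipy.optimize import brentq

# The "cake" is the interval [0, 1]; each agent has a valuation density.
def density_cutter(x):   # hypothetical: the cutter values the left end more
    return 2.0 - 2.0 * x

def density_chooser(x):  # hypothetical: the chooser values the right end more
    return 2.0 * x

def value(density, a, b):
    return quad(density, a, b)[0]

# The cutter picks c so that she values [0, c] and [c, 1] equally.
total = value(density_cutter, 0.0, 1.0)
c = brentq(lambda x: value(density_cutter, 0.0, x) - total / 2.0, 0.0, 1.0)

# The chooser takes whichever piece he values more; the cutter gets the other.
left, right = value(density_chooser, 0.0, c), value(density_chooser, c, 1.0)
chooser_piece = (0.0, c) if left >= right else (c, 1.0)
print(f"cut at {c:.3f}, chooser takes {chooser_piece}")
```

    Each agent ends up with a piece worth at least half of the cake by her or his own valuation, which is the proportionality guarantee discussed in the fair-division literature.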

  19. Simultaneous Cake Cutting

    DEFF Research Database (Denmark)

    Balkanski, Eric; Branzei, Simina; Kurokawa, David;

    2014-01-01

    We introduce the simultaneous model for cake cutting (the fair allocation of a divisible good), in which agents simultaneously send messages containing a sketch of their preferences over the cake. We show that this model enables the computation of divisions that satisfy proportionality — a popular...

  1. Cut Locus Construction using Deformable Simplicial Complexes

    DEFF Research Database (Denmark)

    Misztal, Marek Krzysztof; Bærentzen, Jakob Andreas; Anton, François;

    2011-01-01

    In this paper we present a method for approximating cut loci for a given point p on Riemannian 2D manifolds, closely related to the notion of Voronoi diagrams. Our method finds the cut locus by advecting a front of points equally distant from p along the geodesics originating at p and finding the lines of self-intersections of the front in the parametric space. This becomes possible by using the deformable simplicial complexes (DSC, [1]) method for deformable interface tracking. DSC provide a simple collision detection mechanism, allow for interface topology control, and do not require the domain to have disk topology. We test our method for tori of revolution and compare our results to the benchmark ones from [2]. The method, however, is generic and can be easily adapted to construct cut loci for other manifolds of genera other than 1.

  2. ADVANCED CUTTINGS TRANSPORT STUDY

    Energy Technology Data Exchange (ETDEWEB)

    Troy Reed; Stefan Miska; Nicholas Takach; Kaveh Ashenayi; Gerald Kane; Mark Pickell; Len Volk; Mike Volk; Barkim Demirdal; Affonso Lourenco; Evren Ozbayoglu; Paco Vieira

    2000-10-30

    This is the first quarterly progress report for Year 2 of the ACTS project. It includes a review of progress made in Flow Loop development and research during the period of time between July 14, 2000 and September 30, 2000. This report presents information on the following specific tasks: (a) Progress in Advanced Cuttings Transport Facility design and development (Task 2), (b) Progress on research project (Task 8): ''Study of Flow of Synthetic Drilling Fluids Under Elevated Pressure and Temperature Conditions'', (c) Progress on research project (Task 6): ''Study of Cuttings Transport with Foam Under LPAT Conditions (Joint Project with TUDRP)'', (d) Progress on research project (Task 7): ''Study of Cuttings Transport with Aerated Muds Under LPAT Conditions (Joint Project with TUDRP)'', (e) Progress on research project (Task 9): ''Study of Foam Flow Behavior Under EPET Conditions'', (f) Initiate research on project (Task 10): ''Study of Cuttings Transport with Aerated Mud Under Elevated Pressure and Temperature Conditions'', (g) Progress on instrumentation tasks to measure: Cuttings concentration and distribution (Tasks 11), and Foam properties (Task 12), (h) Initiate a comprehensive safety review of all flow-loop components and operational procedures. Since the previous Task 1 has been completed, we will now designate this new task as: (Task 1S). (i) Activities towards technology transfer and developing contacts with Petroleum and service company members, and increasing the number of JIP members.

  3. DEVELOPMENT OF THE METHOD AND U.S. NORMALIZATION DATABASE FOR LIFE CYCLE IMPACT ASSESSMENT AND SUSTAINABILITY METRICS

    Science.gov (United States)

    Normalization is an optional step within Life Cycle Impact Assessment (LCIA) that may be used to assist in the interpretation of life cycle inventory data as well as, life cycle impact assessment results. Normalization transforms the magnitude of LCI and LCIA results into relati...

  4. Comparison of advanced cutting techniques on hardox 500 steel material and the effect of structural properties of the material

    Directory of Open Access Journals (Sweden)

    L. Dahil

    2014-07-01

    Full Text Available The purpose of this study is to determine the most advantageous cutting method, for a better competitive position, for Hardox 500 steel, which presents high hardness, high strength and superior toughness. Samples were cut by plasma, laser, wire erosion and abrasive water jet (AWJ) methods from among advanced cutting technologies. By taking microstructure photographs of the surfaces produced by the different cutting methods, the effects of each cutting method on the metallurgical structure of the material were compared.

  5. Research on Cutting Tool Layout Method of Earth Pressure Balance Shield

    Institute of Scientific and Technical Information of China (English)

    蒲毅; 刘建琴; 郭伟; 裴瑞英

    2011-01-01

    To ensure the rationality and practicability of the cutter configuration, it is necessary to propose a cutting tool layout method for earth pressure balance (EPB) shields. According to the equal-life principle of tool wear, the wear coefficient method and the tunneling coefficient method are proposed to determine the number of tools. For a specific construction project, the approach can also predict the tool wear and the tunneling distance attainable under the allowable wear within the shield tunnel excavation interval, which offers a reference for the safety of the project. Based on the Archimedes spiral layout method, the arrangement curve of the drag bits on the cutter head is calculated according to the equal-life principle of tool wear. In order to obtain the actual cutting depth of a single tool, a rule of plane-symmetric layout of the cutting tools is put forward and the cutting process is analyzed; a three-dimensional arrangement method for the pilot cutters of the EPB shield is established, from which the actual cutting depth of a single tool is obtained. The accuracy of the above cutting tool layout theory is examined with the example of section TA07 of Nanjing Metro line two. The research content and method lay a theoretical foundation for EPB shield cutter selection and design theory.

  6. Optimization on cut-hole of mining tunnel excavation

    Institute of Scientific and Technical Information of China (English)

    ZHOU Chuan-bo; WANG peng; LEI Yong-jian; YIN Xiao-peng

    2009-01-01

    The efficiency of excavating a mining tunnel is closely linked to the cut-hole pattern. According to experience and methods of engineering analogy, the double-wedge cut, the 9-hole cut and the single spiral cut were initially determined by considering the production conditions and blasting environment of the mining tunnels of the -74 m level in the Daye iron mine. Based on the selected cut-hole patterns, the effect of the cut was studied, on the one hand, by a numerical simulation method with the aid of LS-DYNA3D, a nonlinear dynamic finite element program; on the other hand, a field experiment was carried out in the mining tunnels. Both the numerical simulation and the field experiment demonstrated and agreed that the single spiral cut provides the optimum excavation effect.

  7. Automatic Lifting Method of Cutting Part of Shearer Based on C8051F021

    Institute of Scientific and Technical Information of China (English)

    任勇; 李伟

    2015-01-01

    Starting from the shortcomings of the memory cutting method used in shearers, the application background of the shearer was studied. By improving the traditional memory cutting method, an adaptive height-adjustment method for the shearer cutting part based on the C8051F021 was put forward. The system adopts the C8051F021 system-on-chip (SoC) as the core of the hardware circuit design and uses a pilot-operated proportional flow control valve as the main control component, realizing intelligent raising and lowering of the shearer ranging arm; a wireless transmission module is also adopted to provide remote monitoring.

  8. NANOSCALE CUTTING OF MONOCRYSTALLINE SILICON USING MOLECULAR DYNAMICS SIMULATION

    Institute of Scientific and Technical Information of China (English)

    LI Xiaoping; CAI Minbo; RAHMAN Mustafizur

    2007-01-01

    It has been found that the brittle material monocrystalline silicon can be machined in ductile mode in nanoscale cutting when the tool cutting edge radius is reduced to the nanoscale and the undeformed chip thickness is smaller than the tool edge radius. In order to better understand the mechanism of ductile mode cutting of silicon, the molecular dynamics (MD) method is employed to simulate the nanoscale cutting of monocrystalline silicon. The simulated variation of the cutting forces with the tool cutting edge radius is compared with the cutting force results from experimental cutting tests, and they show good agreement. The results also indicate that there is a silicon phase transformation from monocrystalline to amorphous in the chip formation zone, which can be used to explain the cause of ductile mode cutting. Moreover, the simulated stress results explain the two necessary conditions for ductile mode cutting: the tool cutting edge radius must be reduced to the nanoscale and the undeformed chip thickness must be smaller than the tool cutting edge radius.

  9. Rooting greenwood tip cuttings of several Populus clones hydroponically (hydroponic rooting of Populus cuttings)

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, H.M.; Hansen, E.A.; Tolsted, D.N.

    1980-01-01

    Greenwood cuttings of several Populus clones were successfully rooted with a relatively simple hydroponic method. Indolebutyric acid and naphthaleneacetic acid at concentrations of 500 to 5000 ppm applied as a quick dip to the cutting bases, a complete nutrient solution at 20 to 40% of full strength, and a solution temperature between 27 and 30 °C generally produced the best rooting performance of most clones. Cuttings propagated by the hydroponic procedure rooted faster and generally outgrew those produced by a standard method after being transplanted to pots and grown in the greenhouse.

  10. Superpixel-based graph cuts for accurate stereo matching

    Science.gov (United States)

    Feng, Liting; Qin, Kaihuai

    2017-06-01

    Estimating the surface normal vector and disparity of a pixel simultaneously, also known as the three-dimensional label method, has been widely used in recent continuous stereo matching problems to achieve sub-pixel accuracy. However, due to the infinite label space, it is extremely hard to assign each pixel an appropriate label. In this paper, we present an accurate and efficient algorithm, integrating PatchMatch with graph cuts, to approach this critical computational problem. Besides, to get a robust and precise matching cost, we use a convolutional neural network to learn a similarity measure on small image patches. Compared with other MRF-related methods, our method has several advantages: its submodular property ensures a sub-problem optimality which is easy to perform in parallel; graph cuts can simultaneously update multiple pixels, avoiding local minima caused by sequential optimizers like belief propagation; it uses segmentation results for better local expansion moves; and local propagation and randomization can easily generate the initial solution without using external methods. Middlebury experiments show that our method achieves higher accuracy than other MRF-based algorithms.

  11. Selected Malaysian Wood CO2 -Laser Cutting Parameters And Cut Quality

    Directory of Open Access Journals (Sweden)

    Nukman Yusoff

    2008-01-01

    Full Text Available Lasers have been used to cut most non-metallic materials very efficiently and successfully because these materials are highly absorptive at the CO2 laser wavelength of 10.6 µm. The laser cutting process has been found to be reliable in many applications, with several advantages over other mechanical means in producing successful cuts of even thermally sensitive materials such as wood. Various works have been conducted to resolve the interaction between laser and wood, but an ultimate guideline for producing the best cutting results is still undecided. This latest experiment was performed on Malaysian light hardwoods, namely Nyatoh (Palaquium spp.), Kembang Semangkok (Scaphium spp.) and Meranti (Shorea spp.), and on normal plywood using a low power carbon dioxide laser machine with a 500 W maximum output. The low power laser machine (Zech Laser model ZL 1010), equipped with a slow-flow CO2 laser producing a maximum output power of 500 W in TEM01 beam mode, was employed. The processing variables taken into investigation were laser power, nozzle standoff distance (SOD) or focal point position, nozzle size, assist gas pressure, type of assist gas, cutting speed and delay time. The wood properties observed were thickness, density and moisture content of the wood. The analyses considered the geometric and dimensional accuracy (straight sideline length, diameter of circle, kerf width, and percent overcut), material removal rate, and severity of burns upon machining with compressed air or other assist gases. The relationships between processing parameters and types of wood with different properties were outlined in terms of optimum cutting conditions, the minimum burnt effect achievable and the best cut quality obtained with minimal surface deterioration and acceptable accuracy. From this study a guideline for cutting a wide range of Malaysian wood has been outlined.

  12. Research on Synchronous Control Method of Honeycomb Paper Core in High Speed Paper Feed and Linear Cutting

    Institute of Scientific and Technical Information of China (English)

    杜建铭; 刘庆国; 罗世洲; 钟仲洪

    2013-01-01

    In the development of a new horizontal high-speed shearing production line for honeycomb paper core, the cutting speed became the key factor restricting production efficiency. A linear motor replaces the conventional variable-frequency motor with slider-crank mechanism to drive a 1600 mm long moving blade in a reciprocating linear cutting motion of 600-800 cuts per minute over a 25 mm stroke. A cam-type intermittent paper feed mechanism driven by a high-response stepping motor or torque motor replaces the original free-fall paper feed in order to meet the needs of high-speed cutting; the synchronous control of cutting and paper feeding then becomes one of the main control difficulties. The trigger conditions of the high-speed intermittent paper feed were studied, the timing of the synchronization between cutting and feeding was calculated and analyzed, and a synchronous control method was proposed. Experimental results show that the method can effectively achieve synchronous control of intermittent feeding under high-speed shearing.

  13. The optimization of the cutting process of diamonds with a YAG laser

    Directory of Open Access Journals (Sweden)

    A. J. Lubbe

    1993-07-01

    Full Text Available A laser cannot, as generally assumed by the layman, cut right through a diamond with a single cut. A couple of hundred cuts may be necessary to "chip carve" through a diamond. There are several parameters, for example cutting speed, focus point, overlapping of cuts, etc., that influence the cutting process. With a view to optimizing the cutting process, laser cuts in diamonds were studied in a systematic way with the aid of an electron microscope. The method, technique and the results of the research are discussed in this article.

  14. ADVANCED CUTTINGS TRANSPORT STUDY

    Energy Technology Data Exchange (ETDEWEB)

    Troy Reed; Stefan Miska; Nicholas Takach; Kaveh Ashenayi; Mark Pickell; Len Volk; Mike Volk; Lei Zhou; Zhu Chen; Crystal Redden; Aimee Washington

    2003-04-30

    Experiments on the flow loop are continuing. Improvements to the software for data acquisition are being made as additional experience with three-phase flow is gained. Modifications are being made to the Cuttings Injection System in order to improve control and the precision of cuttings injection. The design details for a drill-pipe Rotation System have been completed. A US Patent was filed on October 28, 2002 for a new design for an instrument that can generate a variety of foams under elevated pressures and temperatures and then transfer the test foam to a viscometer for measurements of viscosity. Theoretical analyses of cuttings transport phenomena based on a layered model are under development. Calibrations of two nuclear densitometers have been completed. Baseline tests have been run to determine wall roughness in the 4 different test sections (i.e. 2-in, 3-in, 4-in pipes and 5.76-in by 3.5-in annulus) of the flow loop. Tests have also been conducted with aerated fluids at EPET conditions. Preliminary experiments on the two candidate aqueous foam formulations were conducted, which included rheological tests of the base fluid and foam stability reports. These were conducted after acceptance of the proposal on the Study of Cuttings Transport with Foam Under Elevated Pressure and Elevated Temperature Conditions. Preparation of a test matrix for cuttings-transport experiments with foam in the ACTF is also under way. A controller for instrumentation to measure cuttings concentration and distribution has been designed that can control four transceivers at a time. A prototype of the control circuit board was built and tested. Tests showed that there was a problem with radiated noise. An improved circuit board was designed and sent to an external expert to verify the new design. The new board is being fabricated and will first be tested with static water and gravel in an annulus at elevated temperatures. A series of viscometer tests to measure foam properties have

  15. Thickness-Independent Ultrasonic Imaging Applied to Abrasive Cut-Off Wheels: An Advanced Aerospace Materials Characterization Method for the Abrasives Industry. A NASA Lewis Research Center Technology Transfer Case History

    Science.gov (United States)

    Roth, Don J.; Farmer, Donald A.

    1998-01-01

    Abrasive cut-off wheels are at times unintentionally manufactured with nonuniformity that is difficult to identify and sufficiently characterize without time-consuming, destructive examination. One particular nonuniformity is a density variation condition occurring around the wheel circumference or along the radius, or both. This density variation, depending on its severity, can cause wheel warpage and wheel vibration resulting in unacceptable performance and perhaps premature failure of the wheel. Conventional nondestructive evaluation methods such as ultrasonic c-scan imaging and film radiography are inaccurate in their attempts at characterizing the density variation because a superimposing thickness variation exists as well in the wheel. In this article, the single transducer thickness-independent ultrasonic imaging method, developed specifically to allow more accurate characterization of aerospace components, is shown to precisely characterize the extent of the density variation in a cut-off wheel having a superimposing thickness variation. The method thereby has potential as an effective quality control tool in the abrasives industry for the wheel manufacturer.

  16. Cut Locus Construction using Deformable Simplicial Complexes

    DEFF Research Database (Denmark)

    Misztal, Marek Krzysztof; Bærentzen, Jakob Andreas; Anton, François

    2011-01-01

    In this paper we present a method for approximating cut loci for a given point p on Riemannian 2D manifolds, closely related to the notion of Voronoi diagrams. Our method finds the cut locus by advecting a front of points equally distant from p along the geodesics originating at p and finding the lines of self-intersections of the front in the parametric space. This becomes possible by using the deformable simplicial complexes (DSC, [1]) method for deformable interface tracking. DSC provide a simple collision detection mechanism, allow for interface topology control, and do not require the domain to have disk topology.

  17. Cutting forces during turning with variable depth of cut

    Directory of Open Access Journals (Sweden)

    M. Sadílek

    2016-03-01

    The research proposed in this paper is experimental work: measuring cutting forces and monitoring tool wear on the cutting edge. It compares turning in which a standard roughing cycle is used with turning in which the proposed roughing cycle with variable depth of cut is applied.

  18. Cutting Out Continuations

    DEFF Research Database (Denmark)

    Bahr, Patrick; Hutton, Graham

    2016-01-01

    In the field of program transformation, one often transforms programs into continuation-passing style to make their flow of control explicit, and then immediately removes the resulting continuations using defunctionalisation to make the programs first-order. In this article, we show how these two transformations can be fused together into a single transformation step that cuts out the need to first introduce and then eliminate continuations. Our approach is calculational, uses standard equational reasoning techniques, and is widely applicable.

  19. Manual bamboo cutting tool.

    Science.gov (United States)

    Bezerra, Mariana Pereira; Correia, Walter Franklin Marques; da Costa Campos, Fabio Ferreira

    2012-01-01

    The paper presents the development of a cutting tool guide specifically for the harvest of bamboo. The development was based on the precepts of eco-design and ergonomics, prioritizing the physical health of the operator and the preservation of the environment, as well as meeting the specific requirements of bamboo. The main goal is to spread the use of bamboo as a material for construction, handicrafts and other uses, through a handy, easy-to-assemble tool made from readily available materials.

  20. Making the cut

    OpenAIRE

    Millard, Chris

    2013-01-01

    ‘Deliberate self-harm’, ‘self-mutilation’ and ‘self-injury’ are just some of the terms used to describe one of the most prominent issues in British mental health policy in recent years. This article demonstrates that contemporary literature on ‘self-harm’ produces this phenomenon (to varying extents) around two key characteristics. First, this behaviour is predominantly performed by those identified as female. Second, this behaviour primarily involves cutting the skin. These constitutive char...

  1. Soluble oil cutting fluid

    Energy Technology Data Exchange (ETDEWEB)

    Rawlinson, A.P.; White, J.

    1987-06-23

    A soluble oil, suitable when diluted with water, for use as a cutting fluid comprises an alkali or alkaline-earth metal alkyl benzene sulphonate, a fatty acid diethanolamide, a mixed alkanolamine borate, a polyisobutenesuccinimide and a major proportion of mineral oil. The soluble oil is relatively stable without the need for a conventional coupling agent and some soluble oil emulsions are bio-static even though conventional biocides are not included.

  2. ADVANCED CUTTINGS TRANSPORT STUDY

    Energy Technology Data Exchange (ETDEWEB)

    Ergun Kuru; Stefan Miska; Nicholas Takach; Kaveh Ashenayi; Gerald Kane; Mark Pickell; Len Volk; Mike Volk; Barkim Demirdal; Affonso Lourenco; Evren Ozbayoglu; Paco Vieira; Neelima Godugu

    2000-07-30

    The ACTS flow loop is now operational under elevated pressure and temperature. Currently, experiments with synthetic based drilling fluids under pressure and temperature are being conducted. Based on the analysis of Fann 70 data, empirical correlations defining the shear stress as a function of temperature, pressure and shear rate have been developed for Petrobras synthetic drilling fluids. PVT equipment has been modified for testing synthetic oil base drilling fluids. PVT tests with Petrobras synthetic base mud have been conducted and the results are being analyzed. Foam flow experiments have been conducted and the analysis of the data has been carried out to characterize the rheology of the foam. A comparison of pressure loss predictions from the available foam hydraulic models with the test results has been made. Cuttings transport experiments in the horizontal annulus section have been conducted using air, water and cuttings. Currently, cuttings transport tests in the inclined test section are being conducted. Foam PVT analysis tests have been conducted. Foam stability experiments have also been conducted. The effects of salt and oil concentration on foam stability have been investigated. The design of the ACTS flow loop modification for foam and aerated mud flow has been completed. A flow loop operation procedure for conducting foam flow experiments under EPET conditions has been prepared. The design of the lab-scale flow loop for dynamic foam characterization and cuttings monitoring instrumentation tests has been completed. The construction of the test loop is underway. As part of the technology transfer efforts, an Advisory Board Meeting with ACTS-JIP industry members was organized on May 13, 2000.

  3. Hemoglobin cut-off values in healthy Turkish infants

    Institute of Scientific and Technical Information of China (English)

    Ahmet Arvas; Emel Gür; DurmuşDoğan

    2014-01-01

    Background: Anemia is a widespread public health problem associated with an increased risk of morbidity and mortality. This study was undertaken to determine the cut-off values of hemoglobin for infant anemia. Methods: A cross-sectional retrospective study was carried out at the well-baby clinics of a tertiary care hospital. A total of 1484 healthy infants aged between 4 and 24 months were included in the study. The relationship of hemoglobin (Hb) levels with maternal age, birth weight, weight gain rate, feeding, and gender was evaluated. Results: The Hb levels were assessed in four age groups (4 months, 6 months, 9-12 months, and 15-24 months) and the cut-off values of Hb were determined. Hb cut-off values (5th percentile for age) were 97 g/L and 93 g/L at 4 months and 6 months, respectively. In older infants, the 5th percentile was 90.5 g/L and 93.4 g/L at 9-12 months and 15-24 months, respectively. The two values were lower than the World Health Organization criteria for anemia, which could be partly due to the lack of information on iron status in our population. However, this difference highlights the need for further studies on normal Hb levels in healthy infants in developing countries. Hb levels of females were higher in all age groups; however, a statistically significant gender difference was found only in 6-month-old infants. No statistically significant association was found between Hb levels and maternal age, birth weight, weight gain rate, or nutritional status. Conclusion: Hb cut-off values in infants should be re-evaluated and be compatible with the growth and development of children in that community.
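
    The age-specific cut-offs described here are simply the 5th percentile of the Hb distribution within each age group. A minimal Python sketch of that computation, with invented Hb values, is shown below; it illustrates the percentile step only, not the study's sampling or exclusion criteria.

```python
import numpy as np

# Hypothetical hemoglobin values (g/L) grouped by age band
hb_by_age = {
    "4 months":    [96, 102, 108, 111, 115, 118, 120, 124, 126, 131],
    "6 months":    [92,  99, 104, 107, 112, 116, 119, 121, 125, 129],
    "9-12 months": [90,  97, 101, 106, 110, 113, 117, 122, 124, 127],
}

# Cut-off for anemia screening = 5th percentile of Hb within each age group
cutoffs = {age: np.percentile(values, 5) for age, values in hb_by_age.items()}
for age, cut in cutoffs.items():
    print(f"{age}: Hb cut-off ~ {cut:.1f} g/L")
```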

  4. Methods for determining optical power, for power-normalizing laser measurements, and for stabilizing power of lasers via compliance voltage sensing

    Energy Technology Data Exchange (ETDEWEB)

    Taubman, Matthew S; Phillips, Mark C

    2015-04-07

    A method is disclosed for power normalization of spectroscopic signatures obtained from laser based chemical sensors that employs the compliance voltage across a quantum cascade laser device within an external cavity laser. The method obviates the need for a dedicated optical detector used specifically for power normalization purposes. A method is also disclosed that employs the compliance voltage developed across the laser device within an external cavity semiconductor laser to power-stabilize the laser mode of the semiconductor laser by adjusting drive current to the laser such that the output optical power from the external cavity semiconductor laser remains constant.

  5. ADVANCED CUTTINGS TRANSPORT STUDY

    Energy Technology Data Exchange (ETDEWEB)

    Ergun Kuru; Stefan Miska; Nicholas Takach; Kaveh Ashenayi; Gerald Kane; Len Volk; Mark Pickell; Evren Ozbayoglu; Barkim Demirdal; Paco Vieira; Affonso Lourenco

    1999-10-15

    This report includes a review of the progress made in ACTF Flow Loop development and research during 90 days pre-award period (May 15-July 14, 1999) and the following three months after the project approval date (July15-October 15, 1999) The report presents information on the following specific subjects; (a) Progress in Advanced Cuttings Transport Facility design and development, (b) Progress report on the research project ''Study of Flow of Synthetic Drilling Fluids Under Elevated Pressure and Temperature Conditions'', (c) Progress report on the research project ''Study of Cuttings Transport with Foam Under LPAT Conditions (Joint Project with TUDRP)'', (d) Progress report on the research project ''Study of Cuttings Transport with Aerated Muds Under LPAT Conditions (Joint Project with TUDRP)'', (e) Progress report on the research project ''Study of Foam Flow Behavior Under EPET Conditions'', (f) Progress report on the instrumentation tasks (Tasks 11 and 12) (g) Activities towards technology transfer and developing contacts with oil and service company members.

  6. ADVANCED CUTTINGS TRANSPORT STUDY

    Energy Technology Data Exchange (ETDEWEB)

    Troy Reed; Stefan Miska; Nicholas Takach; Kaveh Ashenayi; Mark Pickell; Len Volk; Mike Volk; Lei Zhou; Zhu Chen; Crystal Redden; Aimee Washington

    2003-07-30

    This Quarter has been divided between running experiments and the installation of the drill-pipe rotation system. In addition, valves and piping were relocated, and three viewports were installed. Detailed design work is proceeding on a system to elevate the drill-string section. Design of the first prototype version of a Foam Generator has been finalized, and fabrication is underway. This will be used to determine the relationship between surface roughness and ''slip'' of foams at solid boundaries. Additional cups and rotors are being machined with different surface roughness. Some experiments on cuttings transport with aerated fluids have been conducted at EPET. Theoretical modeling of cuttings transport with aerated fluids is proceeding. The development of theoretical models to predict frictional pressure losses of flowing foam is in progress. The new board design for instrumentation to measure cuttings concentration is now functioning with an acceptable noise level. The ultrasonic sensors are stable up to 190 F. Static tests with sand in an annulus indicate that the system is able to distinguish between different sand concentrations. Viscometer tests with foam, generated by the Dynamic Test Facility (DTF), are continuing.

  7. Vitamin C, total phenolics and antioxidative activity in tip-cut green beans (Phaseolus vulgaris) and swede rods (Brassica napus var. napobrassica) processed by methods used in catering.

    Science.gov (United States)

    Baardseth, Pernille; Bjerke, Frøydis; Martinsen, Berit K; Skrede, Grete

    2010-05-01

    Retention of nutrients in vegetables during blanching/freezing, cooking and warm-holding is crucial in the preparation of both standard and therapeutic diets. In the present study, conventional cooking in water and cooking by pouch technology (boil-in-bag, sous vide) were compared in their ability to retain vitamin C, total phenolics and antioxidative activity (DPPH and FRAP) in industrially blanched/frozen tip-cut green beans and swede rods. After conventional cooking, 50.4% total ascorbic acid, 76.7% total phenolics, 55.7% DPPH and 59.0% FRAP were recovered in the drained beans. After boil-in-bag cooking, significantly (P < 0.05) higher recoveries were found, and recoveries after sous vide cooking were comparable to those of boil-in-bag cooking. By conventional cooking, 13.5-42.8% of the nutrients leaked into the cooking water; by sous vide about 10% leaked into the exuded liquid, while no leakage occurred with boil-in-bag cooking. Warm-holding beans after cooking reduced the recoveries of all components. Recoveries in swede rods were comparable but overall slightly lower. Industrially blanched/frozen vegetables should preferably be cooked by pouch technology rather than conventional cooking in water. Including the cooking water or exuded liquid in the final dish will increase the level of nutrients in a meal. Warm-holding of vegetables after cooking should be avoided.

  8. Accurate graphic method for determining the normal gear shape of a hob

    Institute of Scientific and Technical Information of China (English)

    熊炜; 陈文

    2001-01-01

    With an AutoLisp program, the normal gear shape of a hob can be determined by a graphical method. The program and the drawing procedure for determining the normal gear shape of a rectangular spline hob are introduced.

  9. J-R Curve Determination for Disk-shaped Compact Specimens Based on the Normalization Method and Direct Current Potential Drop Technique

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Xiang [ORNL; Nanstad, Randy K [ORNL; Sokolov, Mikhail A [ORNL

    2014-01-01

    Material ductile fracture toughness can be described by the J-integral versus crack extension relationship (J-R curve). As a conventional J-R curve measurement method, unloading compliance (UC) becomes impractical in elevated temperature testing due to relaxation of the material and a friction-induced back-up shape of the J-R curve. In addition, the UC method may underpredict the crack extension for standard disk-shaped compact (DC(T)) specimens. In order to address these issues, the normalization method and the direct current potential drop (DCPD) technique were applied for determining J-R curves at 24 C and 500 C for 0.18T DC(T) specimens made from type 316L stainless steel. For comparison purposes, the UC method was also applied in the 24 C tests. The normalization method was able to yield valid J-R curves in all tests. The J-R curves from the DCPD technique need adjustment to account for the potential drop induced by plastic deformation, crack blunting, etc., and after applying a newly developed DCPD adjustment procedure, the post-adjusted DCPD J-R curves essentially matched the J-R curves from the normalization method. In contrast, the UC method underpredicted the crack extension in all tests, resulting in substantial deviation in the derived J-R curves manifested by higher Jq values than the normalization or DCPD method. Only for tests where the UC method underpredicted the crack extension by a very small value were the J-R curves determined by the UC method similar to those determined by the normalization or DCPD method.

  10. New Cutting Force Modeling Approach for Flat End Mill

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    A new mechanistic cutting force model for flat end milling using instantaneous cutting force coefficients is proposed. An in-depth analysis shows that the total cutting forces can be separated into two terms: a nominal component independent of the runout and a perturbation component induced by the runout. The instantaneous value of the nominal component is used to calibrate the cutting force coefficients. With the help of the perturbation component and the cutting force coefficients obtained above, the cutter runout is identified. Based on simulation and experimental results, the validity of the identification approach is demonstrated. The advantage of the proposed method lies in the fact that the calibration performed with data from one cutting test under a specific regime can be applied over a great range of cutting conditions.
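
    As a rough illustration of the kind of mechanistic model discussed here, the Python sketch below evaluates tangential and radial cutting forces for a flat end mill from an instantaneous chip thickness that includes a simple runout perturbation. The coefficients, runout values and geometry are invented for demonstration, and the paper's own separation and calibration procedure is not reproduced.

```python
import numpy as np

# Hypothetical cutter and cutting conditions
n_teeth = 2
ap      = 2.0e-3        # axial depth of cut, m
ft      = 0.05e-3       # feed per tooth, m
Kt, Kr  = 1.8e9, 0.3    # tangential coefficient (Pa) and radial ratio (assumed)
runout  = np.array([3e-6, -3e-6])   # radial runout of each tooth, m (assumed)

def tooth_forces(phi, j):
    """Tangential/radial force of tooth j at immersion angle phi (radians)."""
    if not (0.0 < phi < np.pi):     # slotting assumed: tooth cuts for 0 < phi < pi
        return 0.0, 0.0
    # nominal chip thickness plus runout perturbation relative to the previous tooth
    h = max(ft * np.sin(phi) + (runout[j] - runout[j - 1]), 0.0)
    f_tan = Kt * ap * h
    f_rad = Kr * f_tan
    return f_tan, f_rad

# Sample one cutter revolution and sum x/y forces over all teeth
angles = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
Fx, Fy = np.zeros_like(angles), np.zeros_like(angles)
for i, theta in enumerate(angles):
    for j in range(n_teeth):
        phi = (theta + 2.0 * np.pi * j / n_teeth) % (2.0 * np.pi)
        f_tan, f_rad = tooth_forces(phi, j)
        # project tangential/radial components onto the machine x/y axes
        Fx[i] += -f_tan * np.cos(phi) - f_rad * np.sin(phi)
        Fy[i] +=  f_tan * np.sin(phi) - f_rad * np.cos(phi)
print(Fx.max(), Fy.max())
```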

  11. Cutting Forces and Chip Morphology during Wood Plastic Composites Orthogonal Cutting

    Directory of Open Access Journals (Sweden)

    Xiaolei Guo

    2014-02-01

    Full Text Available The effects of chip thickness, rake angle, and edge radius on cutting forces and chip morphology in wood plastic composites (WPCs) orthogonal cutting were investigated. The three types of WPCs that were tested, wood flour/polyethylene composite (WFPEC), wood flour/polypropylene composite (WFPPC), and wood flour/polyvinyl chloride composite (WFPVCC), exhibited different behavior with respect to the machinability aspects. The cutting forces of WFPVCC were the highest, followed by WFPPC and WFPEC. The most significant factor on the parallel cutting force of these three types of WPCs was the chip thickness, which explained more than 90% of the total variation, while the rake angle, edge radius, and the interactions between these factors had small contributions. The most significant factor on the normal cutting force of WPCs was also the chip thickness, which accounted for more than 60% of the total variation. The chips produced included long continuous chips, short continuous chips, flake chips, and granule chips when cutting these three types of WPCs.

  12. Thermal modelling of cooling tool cutting when milling by electrical analogy

    Directory of Open Access Journals (Sweden)

    Benmoussa H.

    2010-06-01

    Full Text Available Temperature measurements by some devices are applied immediately after shut-down and may be corrected for the temperature drop that occurs in the interval between shut-down and measurement. This paper presents a new procedure for thermal modelling of the cutting tool used just after machining, when the tool is out of the chip, in order to extrapolate the cutting temperature from the temperature measured when the tool is at standstill. A fin approximation enhancing heat loss (by conduction and convection) to the air stream is used. In the modelling we introduce an equivalent thermal network to estimate the cutting temperature as a function of specific energy. On the other hand, a local modified lumped-element conduction equation is used to predict the temperature gradient with time when the tool is being cooled, with initial and boundary conditions. These predictions provide a detailed view of the global heat transfer coefficient as a function of cutting speed, because the heat loss for the tool in an air stream is an order of magnitude larger than in a normal environment. Finally we deduce the cutting temperature by an inverse method.

  13. Thermal modelling of cooling tool cutting when milling by electrical analogy

    Science.gov (United States)

    Benabid, F.; Arrouf, M.; Assas, M.; Benmoussa, H.

    2010-06-01

    Temperature measurements by some devices are applied immediately after shut-down and may be corrected for the temperature drop that occurs in the interval between shut-down and measurement. This paper presents a new procedure for thermal modelling of the cutting tool used just after machining, when the tool is out of the chip, in order to extrapolate the cutting temperature from the temperature measured when the tool is at standstill. A fin approximation enhancing heat loss (by conduction and convection) to the air stream is used. In the modelling we introduce an equivalent thermal network to estimate the cutting temperature as a function of specific energy. On the other hand, a local modified lumped-element conduction equation is used to predict the temperature gradient with time when the tool is being cooled, with initial and boundary conditions. These predictions provide a detailed view of the global heat transfer coefficient as a function of cutting speed, because the heat loss for the tool in an air stream is an order of magnitude larger than in a normal environment. Finally we deduce the cutting temperature by an inverse method.
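
    The back-extrapolation step described here can be illustrated with a lumped-capacitance (Newton cooling) sketch. In the Python example below, a temperature measured a short time after shut-down is extrapolated back to the instant the tool left the cut; the thermal properties and the measured value are invented, and the electrical-analogy network of the paper is not reproduced.

```python
import numpy as np

# Hypothetical lumped-capacitance properties of the tool-tip region
h     = 150.0    # global heat transfer coefficient, W/(m^2 K) (assumed)
A     = 2.0e-4   # exchange surface area, m^2 (assumed)
m     = 5.0e-3   # mass of the lumped element, kg (assumed)
c     = 500.0    # specific heat, J/(kg K) (assumed)
T_air = 25.0     # ambient / air-stream temperature, deg C

tau = m * c / (h * A)    # time constant of the exponential decay, s

def cooling(T0, t):
    """Temperature t seconds after shut-down, starting from T0 (Newton cooling)."""
    return T_air + (T0 - T_air) * np.exp(-t / tau)

def extrapolate_back(T_measured, t_delay):
    """Invert the cooling law to recover the temperature at shut-down."""
    return T_air + (T_measured - T_air) * np.exp(t_delay / tau)

# A thermocouple reads 310 deg C five seconds after the tool leaves the chip
T_cut = extrapolate_back(310.0, 5.0)
assert abs(cooling(T_cut, 5.0) - 310.0) < 1e-9   # round-trip sanity check
print(f"estimated cutting temperature: {T_cut:.1f} deg C")
```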

  14. The Method of Tool Compensation Direction Calculation in Wire Cutting Automatic Programming

    Institute of Scientific and Technical Information of China (English)

    莫秀波; 张秋菊

    2013-01-01

    In order to improve the degree of automation of the wire cutting automatic programming system and to reduce the number of operation steps, a method for automatically determining the tool compensation direction is put forward, based on the process parameters and machining conditions that are set when generating the wire cutting tool path. With this method the automatic programming system no longer needs a manual definition of the compensation direction, which effectively improves programming efficiency and avoids human error.
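
    One common ingredient of such an automatic decision, illustrated in the hedged Python sketch below, is the winding direction of the programmed contour: its signed (shoelace) area tells whether the path runs clockwise or counter-clockwise, which, together with whether an outside cut (punch) or an inside cut (die cavity) is required, fixes the offset side (G41 offset-left or G42 offset-right under the usual CNC convention). This is only an illustrative rule of thumb, not the specific judgment criteria proposed in the paper.

```python
def signed_area(contour):
    """Shoelace formula; positive for a counter-clockwise contour."""
    area = 0.0
    for (x1, y1), (x2, y2) in zip(contour, contour[1:] + contour[:1]):
        area += x1 * y2 - x2 * y1
    return area / 2.0

def compensation_side(contour, is_punch):
    """Pick the wire offset side for a closed contour.
    is_punch=True  -> the retained part lies inside the contour (outside cut)
    is_punch=False -> the contour is a cavity / die opening (inside cut)
    Returns 'G41' (offset left of travel) or 'G42' (offset right of travel)."""
    ccw = signed_area(contour) > 0.0
    # CCW around a punch puts the material on the left of travel, so the wire
    # is offset to the right (G42); flipping the winding or the cut type
    # flips the side.
    return "G42" if ccw == is_punch else "G41"

square = [(0, 0), (10, 0), (10, 10), (0, 10)]          # CCW square, hypothetical part
print(compensation_side(square, is_punch=True))         # outside cut -> 'G42'
print(compensation_side(square, is_punch=False))        # inside cut  -> 'G41'
```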

  15. Optimisation of the Laser Cutting Process

    DEFF Research Database (Denmark)

    Dragsted, Birgitte; Olsen, Flemmming Ove

    1996-01-01

    The problem in optimising the laser cutting process is outlined. Basic optimisation criteria and principles for adapting an optimisation method, the simplex method, are presented. The results of implementing a response function in the optimisation are discussed with respect to the quality as well...

  16. A Study on Ultrasonic Elliptical Vibration Cutting of Inconel 718

    Directory of Open Access Journals (Sweden)

    Zhao Haidong

    2016-01-01

    Full Text Available Inconel 718 is a nickel-based alloy that is widely used in the aerospace and nuclear industries owing to its high-temperature mechanical properties. Cutting of Inconel 718 by conventional cutting (CC) is a big challenge in modern industry. Few studies have investigated cutting of Inconel 718 with a single-point diamond tool applying the UEVC (ultrasonic elliptical vibration cutting) method. This paper presents an experimental study on UEVC of Inconel 718 using polycrystalline diamond (PCD) coated tools. Firstly, cutting tests were carried out to study the effect of the machining parameters in UEVC in terms of surface finish and flank wear during machining of Inconel 718. The tests clearly showed that PCD coated tools cutting Inconel 718 by UEVC perform better at a 0.1 mm depth of cut than at the lower 0.05 mm depth of cut or the higher 0.12 or 0.15 mm depths of cut. Secondly, as in the CC method, the cutting performance in UEVC increases with decreasing feed rate and cutting speed. CC tests were also carried out to compare the performance of the CC and UEVC methods.

  17. Application of water jet assisted drag bit and pick cutter for the cutting of coal measure rocks. Final technical report. [Tests of combination in different rocks

    Energy Technology Data Exchange (ETDEWEB)

    Ropchan, D.; Wang, F.D.; Wolgamott, J.

    1980-04-01

    A laboratory investigation was made of the effects of high pressure water jets on the cutting forces of drag bit cutters in sedimentary rocks. A hard and a soft sandstone, a shale and a limestone were tested with commercially obtainable conical and plow type drag bits on the EMI linear cutting machine. About 1200 cuts were made at different bit penetrations, jet orientations, and water pressures to determine the reduction of cutting forces on the bit from the use of the water jet. Both independent and interactive cutting were used. The greatest reductions in cutting forces were with the two sandstones; the drag forces were reduced about 30 percent and the normal forces about 60 percent at 5000 psi water pressure with the nozzle behind the bit. The method was less effective in the shale, except that at 10,000 psi water pressure the reduction in drag force was about 55 percent. Of the rocks tested, the limestone was least affected by the water jet. The cutting forces for the plow bit showed continuous change with wear, so a machined conical bit was used for most of the testing. Tests with the plow bit did show a large reduction in cutting forces by using the water jet with worn bits. An economic analysis of equipping a drag bit tunnel boring machine indicated that the water jet system could reduce costs per foot in sandstone by up to 40 percent.

  18. Melt Flow and Energy Limitation of Laser Cutting

    Directory of Open Access Journals (Sweden)

    Pavel Hudeček

    2016-01-01

    Full Text Available Laser technology is a versatile technology for processing a great variety of parts in most materials. Laser material processing for industrial manufacturing applications is today a widespread procedure for welding, cutting, marking and micro machining of metal and plastic parts and components. To support this huge mass-production industry of laser cutting, new technologies and dry processes using lasers were and are being actively developed. Fundamentally, industrial laser cutting and other industrial applications should satisfy four key practical issues: "Quality or Performance", "Throughput or Speed", "Cost or Total Ownership Cost", and "Reliability". Laser cutting requires, for example, several complicated physical factors to be resolved, including die strength that enables good wire bonding and survival of severe cycling tests, a clean cut-wall surface, good cutting of die attach film, and a proper cutting speed to achieve economical throughput. Examples of the maximum cutting rate, which is normally limited by the laser energy, of the dependence of cutting speed on the type of laser, of the difference between cutting with a single laser beam and with a beam pattern, and of the influence of applied laser power and material thickness will be introduced in this paper.

  19. Feasibility Study of Cryogenic Cutting Technology by Using a Computer Simulation and Manufacture of Main Components for Cryogenic Cutting System

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung Kyun; Lee, Dong Gyu; Lee, Kune Woo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Song, Oh Seop [Chungnam National University, Deajeon (Korea, Republic of)

    2009-06-15

    Cryogenic cutting technology is one of the most suitable technologies for dismantling nuclear facilities because no secondary waste is generated during the cutting process. In this paper, the feasibility of cryogenic cutting technology was investigated by using a computer simulation. In the computer simulation, a hybrid method combining the SPH (smoothed particle hydrodynamics) method and the FE (finite element) method was used. In addition, a penetration depth equation was used for the design of the cryogenic cutting system, and the design variables and operating conditions needed to cut 10 mm thick steel were determined. Finally, the main components of the cryogenic cutting system were manufactured on the basis of the obtained design variables and operating conditions.

  20. An efficient reliable method to estimate the vaporization enthalpy of pure substances according to the normal boiling temperature and critical properties

    Directory of Open Access Journals (Sweden)

    Babak Mehmandoust

    2014-03-01

    Full Text Available The heat of vaporization of a pure substance at its normal boiling temperature is a very important property in many chemical processes. In this work, a new empirical method was developed to predict the vaporization enthalpy of pure substances. The equation is a function of the normal boiling temperature, critical temperature, and critical pressure. The presented model is simple to use and provides an improvement over the existing equations for 452 pure substances over a wide boiling range. The results showed that the proposed correlation is more accurate than the literature methods for pure substances over a wide boiling range (20.3–722 K).
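
    The paper's own correlation is not reproduced in the record above, but correlations of this general form are well established. The Python sketch below evaluates the classical Riedel estimate of the vaporization enthalpy at the normal boiling point from the same three inputs (normal boiling temperature, critical temperature and critical pressure); it is shown only to illustrate the shape of such a correlation, not the new method of the paper, and the water properties used are approximate.

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def riedel_dhvap(tb, tc, pc_bar):
    """Riedel estimate of the enthalpy of vaporization at the normal
    boiling point, in J/mol. tb and tc in K, pc_bar in bar."""
    tbr = tb / tc
    return 1.093 * R * tc * tbr * (math.log(pc_bar) - 1.013) / (0.930 - tbr)

# Approximate properties of water: Tb = 373.15 K, Tc = 647.1 K, Pc = 220.6 bar
dh = riedel_dhvap(373.15, 647.1, 220.6)
print(f"{dh / 1000.0:.1f} kJ/mol")   # about 42 kJ/mol vs. roughly 40.7 measured
```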