WorldWideScience

Sample records for workpiece temperature analyzer

  1. Development and evaluation of a workpiece temperature analyzer for industrial furnaces

    Energy Technology Data Exchange (ETDEWEB)

    1990-05-01

    An instrument capable of measuring the bulk temperature of a workpiece while it is being heated could have a variety of applications. If such an instrument were reasonably priced, it would have a tremendous impact upon national energy usage. The Department of Energy has recognized the importance of this type of instrument and has sponsored three concurrent programs to evaluate three different technologies for it. In one of these programs, Surface Combustion is the prime contractor to develop a pulsed-laser, polarizing-interferometer-based sensor to be used as a workpiece temperature analyzer (WPTA). The overall goal of the program is to develop a workpiece temperature analyzer for industrial furnaces to significantly improve product quality, productivity, and energy efficiency. The workpiece temperature analyzer concept in this program uses a pulsed laser polarizing interferometer (PLPI) for measuring sound velocity through a workpiece. This type of instrument has high resolution and can detect surface motion as small as 10 picometres. The sound velocity measurement can be converted to an average workpiece temperature through a mathematical equation programmed into the control microprocessor. 76 refs., 12 figs., 14 tabs.
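
    The abstract does not give the velocity-to-temperature equation itself; the sketch below illustrates the conversion step under the assumption of a linear sound-velocity calibration v(T) = v0 - b·T for steel. The coefficients and the pulse-echo geometry are illustrative placeholders, not the WPTA's actual calibration.

```python
# Sketch: convert an ultrasonic time-of-flight measurement to an average
# workpiece temperature, assuming a linear calibration v(T) = v0 - b*T.
# The coefficients below are illustrative placeholders, not the WPTA's
# actual calibration data.

V0 = 5920.0   # m/s, assumed longitudinal sound velocity at 0 degC
B = 0.6       # m/s per degC, assumed temperature coefficient

def average_temperature(thickness_m: float, time_of_flight_s: float) -> float:
    """Pulse-echo geometry: the sound traverses the workpiece twice."""
    velocity = 2.0 * thickness_m / time_of_flight_s   # m/s
    return (V0 - velocity) / B                        # degC

# Example: a 50 mm thick workpiece with a 17.2 us round-trip echo
print(average_temperature(0.050, 17.2e-6))  # ~177 degC
```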

  2. Development and evaluation of a workpiece temperature analyzer (WPTA) for industrial furnaces (Phase 1)

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-01

    This project is directed toward the research, development, and evaluation of a viable commercial product: a workpiece temperature measurement analyzer (WPTA) for fired furnaces based on unique radiation properties of surfaces. This WPTA will provide for more uniform, higher-quality products and reduce product rejects, as well as permit the optimum use of energy. The WPTA may also be utilized in control system applications including metal heat treating, forging furnaces, and ceramic firing furnaces. A large market also exists in the chemical process and refining industry. WPTA applications include the verification of product temperature/time cycles and use as a front-end sensor for automatic feedback control systems. This report summarizes the work performed in Phase 1 of this three-phase project. The Phase 1 work included the application evaluation; the evaluation of present technologies and their limitations; and the development of a preliminary conceptual WPTA design, including identification of technical and economic benefits. Recommendations based on the findings of this report include near-term enhancement of the capabilities of the Pyrolaser, and long-term development of an instrument based on Raman spectroscopy. Development of the Pyrofiber, a fiber-optic version of the Pyrolaser, will be key to solving present problems involving specularity, measurement angle, and the cost of multipoint measurement. Extending the instrument's measurement range to include temperatures below 600°C will make the product useful for a wider range of applications. The development of Raman spectroscopy would result in an instrument that could easily be adapted to incorporate a wealth of additional nondestructive analytical capabilities, including stress/strain indication, crystallography, species concentrations, corrosion studies, and catalysis studies, in addition to temperature measurement. 9 refs., 20 figs., 16 tabs.

  3. Development and evaluation of a workpiece temperature analyzer for industrial furnaces

    Energy Technology Data Exchange (ETDEWEB)

    Berthod, J.W.

    1993-06-01

    Tests were done to determine whether ultrasound could be generated in, propagated through, and detected in typical steel specimens at temperatures up to approximately 1020°C. All specimens were subjected to room-temperature tests in which ultrasound was generated via a 1.0 J Nd:YAG laser. Two specimens were also tested up to the higher temperature. Ultrasound detection was performed with a Fabry-Perot interferometer. The tests and results are described, and test plans are presented.

  4. Study of the influence of ultrasonic honing parameters on workpiece surface temperature

    Directory of Open Access Journals (Sweden)

    Zhang Xiaoqiang

    2016-01-01

    Full Text Available Ultrasonic vibration honing (UVH), a machining technology, has many advantages. A lower grinding temperature is one of its significant characteristics and is beneficial for both the process and the workpiece surface. However, the high temperature caused by large honing pressure is the main cause of heat damage to the workpiece in the grinding zone, and its magnitude differs across combinations of honing parameters. Based on classical thermodynamic theory, the heat transfer equation for the grinding zone was established, the model was simplified, and a two-dimensional temperature field expression for the workpiece was obtained; the temperature trend was then simulated under a variety of conditions. The results show that the main temperature lies in the range of 700 K to 1200 K and varies strongly with each parameter. The study provides a theoretical basis for selecting reasonable machining parameters and obtaining better workpiece quality.
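
    The paper's own two-dimensional field expression is not reproduced in the abstract. As a stand-in, the sketch below evaluates the classical moving line-source (Jaeger) solution for the surface temperature rise in a half-space, which is the usual starting point for grinding-zone models; all parameter values are illustrative assumptions, not the paper's data.

```python
import numpy as np
from scipy.special import k0

# Sketch: classical 2-D moving line-source (Jaeger) solution for the
# temperature rise in a half-space, used here as a stand-in for the
# grinding-zone temperature field discussed above. All parameter values
# are illustrative, not taken from the paper.

k_th   = 40.0     # W/(m K), thermal conductivity of steel (assumed)
alpha  = 1.1e-5   # m^2/s, thermal diffusivity (assumed)
q_line = 2.0e4    # W/m, heat input per unit width of contact (assumed)
v      = 0.5      # m/s, workpiece speed relative to the heat source

def temp_rise(x, z):
    """Temperature rise at (x, z); x > 0 lies ahead of the moving source."""
    r = np.sqrt(x**2 + z**2)
    return (q_line / (np.pi * k_th)) * np.exp(-v * x / (2 * alpha)) \
           * k0(v * r / (2 * alpha))

# Peak surface temperature rise just behind the source (x < 0, z ~ 0):
xs = np.linspace(-2e-3, -1e-5, 50)
print(temp_rise(xs, 1e-6).max())  # K, a few hundred kelvin for these values
```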

  5. Workpiece temperature distribution for deep penetration welding with high energy focused beams

    Science.gov (United States)

    Peretz, R.

    1986-01-01

    A solution for the two-dimensional temperature field in a workpiece during welding by laser or electron beam, which takes into consideration the solid-to-liquid phase change of the material, is presented. This leads to more precise correlations of the process parameters.

  6. Effects of high power ultrasonic vibration on temperature distribution of workpiece in dry creep feed up grinding.

    Science.gov (United States)

    Paknejad, Masih; Abdullah, Amir; Azarhoushang, Bahman

    2017-11-01

    The temperature history and distribution of a steel workpiece (X20Cr13) were measured by a high-tech infrared camera during ultrasonic-assisted dry creep feed up-grinding. For this purpose, a special experimental setup was designed and fabricated to vibrate only the workpiece, along two directions, by a high-power ultrasonic transducer. In this study, ultrasonic effects were investigated with respect to the grinding parameters: depth of cut (a_e), feed speed (v_w), and cutting speed (v_s). The results indicate that the ultrasonic vibration considerably reduces the temperature, the depth of thermal damage of the workpiece, and the width of the temperature contours. A maximum temperature reduction of 25.91% was reported at v_s = 15 m/s, v_w = 500 mm/min, a_e = 0.4 mm in the presence of ultrasonic vibration.

  7. Temperature measurement of flat glass edge during grinding and effect of wheel and workpiece speeds

    Science.gov (United States)

    Moussa, Tala; Garnier, Bertrand; Peerhossaini, Hassan

    2017-06-01

    Flat glass temperature in the vicinity of the grinding wheel can become very high during grinding and reach that of the glass transition (typically around 550-600 °C). In such cases, the mechanical strength of the glass is greatly affected and the grinding process cannot be carried out properly. Hence, thermal phenomena must be managed by adjusting the machining parameters to avoid overheating. For this purpose, it is very important to be able to measure the glass temperature, especially at the grinding interface. However, measuring the interfacial glass temperature is difficult, and none of the existing methods for metal grinding is adequate for glass grinding. This work presents a novel temperature-measurement method that uses constantan and copper strips on both sides of the glass plates, the thermoelectric contact being provided by the metallic binder of the diamond particles in the grinding wheel. This new technique allows measurement of the glass edge temperature as the wheel travels around the glass plate. The experimental results show an average glass edge temperature between 300 and 600 °C depending on the machining parameters, such as work speed, wheel speed, depth of cut, and water coolant flow rate. As this new thermal instrumentation is rather intrusive, the glass temperature biases were analysed using a 3D heat transfer model with a moving source. Model computations performed using finite elements show that the temperature biases are less than 70 °C, which is smaller than the standard deviation of the glass edge temperatures measured during grinding.

  8. Influence of the cutting parameters on the workpiece temperature during face milling

    Directory of Open Access Journals (Sweden)

    Nowakowski Lukasz

    2017-01-01

    Full Text Available This thesis presents the outcome of experimental research on the impact of changes in cutting speed and in the volume of material removed during face milling on the temperature of a workpiece made of M1Ez4-class copper. The temperature of the workpiece was measured at six points with K-type thermocouples. The theoretical amount of heat released per unit of time was also calculated for the particular machining parameters.
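
    The abstract does not spell out how the theoretical heat release per unit time was computed; a common estimate, sketched below, multiplies the material removal rate by a specific cutting energy, P ≈ k_c · MRR. The k_c value for copper and the cutting data are assumptions for illustration, not the paper's numbers.

```python
# Sketch: the "released heat per unit of time" can be estimated from the
# material removal rate and a specific cutting energy, P ~ k_c * MRR.
# The k_c value for copper below is an assumption, not the paper's data.

def cutting_power_w(ap_mm, ae_mm, vf_mm_min, kc_J_mm3):
    mrr_mm3_s = ap_mm * ae_mm * vf_mm_min / 60.0  # material removal rate
    return kc_J_mm3 * mrr_mm3_s                   # J/s = W

# Example: 1 mm depth, 40 mm width of cut, 300 mm/min feed, k_c ~ 1.8 J/mm^3
print(cutting_power_w(1.0, 40.0, 300.0, 1.8))  # -> 360 W
```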

  9. Analyzing the effect of tool edge radius on cutting temperature in micro-milling process

    Science.gov (United States)

    Liang, Y. C.; Yang, K.; Zheng, K. N.; Bai, Q. S.; Chen, W. Q.; Sun, G. Y.

    2010-10-01

    Cutting heat is one of the important physical subjects in the cutting process. Cutting heat, together with the cutting temperature it produces, directly affects tool wear and tool life as well as workpiece precision and surface quality. In micro-milling the feature size of the workpiece is usually several microns, so even small changes in cutting temperature affect the surface quality and accuracy of the workpiece; cutting heat and temperature generated in micro-milling therefore behave significantly differently from those in conventional cutting. In this paper, a two-dimensional coupled thermal-mechanical finite element model is adopted to determine the thermal fields and cutting temperature during the micro-milling process, using the software Deform-2D. The effects of tool edge radius on effective stress, effective strain, velocity field, and cutting temperature distribution in micro-milling of aluminum alloy Al2024-T6 were investigated and analyzed, and the transient cutting temperature distribution was simulated dynamically. The simulation results show that the cutting temperature in micro-milling is lower than that occurring in conventional milling processes due to the small loads and low cutting velocity. With increasing tool edge radius, the maximum-temperature region gradually moves to the contact region between the finished surface and the flank face of the micro-cutter, instead of the rake face or the corner of the micro-cutter; this phenomenon shows an obvious size effect.

  10. Experimental and Numerical Investigations in Shallow Cut Grinding by Workpiece Integrated Infrared Thermopile Array

    Directory of Open Access Journals (Sweden)

    Marcel Reimers

    2017-09-01

    Full Text Available The purpose of our study is to investigate the heat distribution and the temperatures occurring during grinding. We therefore carried out both experimental and numerical investigations. In the first part, we present the integration of an infrared thermopile array in a steel workpiece; experiments were done by acquiring data from the thermopile array during grinding of a groove in the workpiece. In the second part, we present numerical investigations of the grinding process to further understand its thermal characteristics. Increasing the feed speed leads to two things: higher heat flux densities in the workpiece and higher temperature gradients in the material.

  11. A study on the edge chipping according to spindle speed and inclination angle of workpiece in laser-assisted milling of silicon nitride

    Science.gov (United States)

    Woo, Wan-Sik; Lee, Choon-Man

    2018-02-01

    Ceramics are difficult to machine due to their high hardness and brittleness. As an effective method for machining ceramics, laser-assisted machining (LAM) has been studied by many researchers; in particular, many studies have addressed improving the machinability of silicon nitride using LAM. However, there is little research on the effect of the inclination angle of the workpiece, because varying the angle increases the difficulty of controlling the laser preheating and the tool path. This paper investigates the effect of preheating temperature, spindle speed, and inclination angle of the workpiece on edge chipping of silicon nitride in an effort to obtain an enhanced surface finish using laser-assisted milling (LAMill). The machining conditions were determined by considering the parameters that can reduce edge chipping, using related theory. Experimental results showed a reduction in edge chipping with increases in preheating temperature, spindle speed, and inclination angle of the workpiece. Also, increasing the spindle speed and the inclination angle of the workpiece decreased the surface roughness due to the reduction in cutting force. Finally, the energy efficiency of LAMill is analyzed by comparing the specific cutting energy across the machining conditions.
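
    The specific cutting energy used for the efficiency comparison has a standard definition, u = P_c / MRR; a minimal sketch follows. The force, speed, and removal-rate numbers are hypothetical and only illustrate how a conventional and a laser-assisted cut would be compared; they are not the paper's measurements.

```python
# Sketch: specific cutting energy as commonly defined, u = P_c / MRR.
# All numbers are hypothetical; the paper's measured values are not
# reproduced here.

def specific_cutting_energy(fc_N, vc_m_s, mrr_mm3_s):
    """u in J/mm^3 from main cutting force, cutting speed and removal rate."""
    power_w = fc_N * vc_m_s
    return power_w / mrr_mm3_s

# Comparing a conventional and a laser-assisted cut (hypothetical forces):
print(specific_cutting_energy(120.0, 1.5, 2.0))  # conventional milling
print(specific_cutting_energy(70.0, 1.5, 2.0))   # laser-assisted milling
```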

  12. Method of and apparatus for thermomagnetically processing a workpiece

    Science.gov (United States)

    Kisner, Roger A.; Rios, Orlando; Wilgen, John B.; Ludtka, Gerard M.; Ludtka, Gail M.

    2014-08-05

    A method of thermomagnetically processing a material includes disposing a workpiece within a bore of a magnet; exposing the workpiece to a magnetic field of at least about 1 Tesla generated by the magnet; and, while exposing the workpiece to the magnetic field, applying heat energy to the workpiece at a plurality of frequencies to achieve spatially-controlled heating of the workpiece. An apparatus for thermomagnetically processing a material comprises: a high field strength magnet having a bore extending therethrough for insertion of a workpiece therein; and an energy source disposed adjacent to an entrance to the bore. The energy source is an emitter of variable frequency heat energy, and the bore comprises a waveguide for propagation of the variable frequency heat energy from the energy source to the workpiece.

  13. Classification system to describe workpieces definitions

    CERN Document Server

    Macconnell, W R

    2013-01-01

    A Classification System to Describe Workpieces provides information pertinent to the fundamental aspects and principles of coding. This book discusses the various applications of the classification system of coding.Organized into three chapters, this book begins with an overview of the requirements of a system of classification pertaining adequately and equally to design, production, and work planning. This text then examines the purpose of the classification system in production to determine the most suitable means of machining a component. Other chapters consider the optimal utilization of m

  14. Demonstration of capabilities of high temperature composites analyzer code HITCAN

    Science.gov (United States)

    Singhal, Surendra N.; Lackney, Joseph J.; Chamis, Christos C.; Murthy, Pappu L. N.

    1990-01-01

    The capabilities of HITCAN, a high-temperature composites analyzer code that predicts the global structural and local stress-strain response of multilayered metal matrix composite structures, are demonstrated. The response can be determined both at the constituent (fiber, matrix, and interphase) level and at the structure level, and includes fabrication process effects. The thermo-mechanical properties of the constituents are considered to be nonlinearly dependent on several parameters, including temperature, stress, and stress rate. The computational procedure employs an incremental, iterative nonlinear approach utilizing a multifactor-interactive constituent material behavior model. Various features of the code are demonstrated through example problems for typical structures.

  15. Motion characteristic between die and workpiece in spline rolling process with round dies

    Directory of Open Access Journals (Sweden)

    Da-Wei Zhang

    2016-06-01

    Full Text Available In the spline rolling process with round dies, additional kinematic compensation is an essential mechanism for improving the division of teeth and pitch accuracy as well as surface quality. The motion characteristic between the die and workpiece under varied center distance in the spline rolling process was investigated. Mathematical models of the instantaneous center of rotation, transmission ratio, and centrodes in the rolling process were established. The models were used to analyze the rolling process of an involute spline with circular dedendum, and the results indicated that (1) with the reduction in the center distance, the instantaneous center moves toward the workpiece, and the transmission ratio increases at first and then decreases; (2) the variations in the instantaneous center and transmission ratio are discontinuous, presenting an interruption when the involute flank begins to be formed; (3) the change in transmission ratio at the forming stage of the workpiece with the involute flank is negligible; and (4) the centrode of the workpiece is an Archimedean spiral whose polar radius decreases, and the centrode of the rolling die is close to an Archimedean spiral once the workpiece has an involute flank.

  16. Properties isotropy of magnesium alloy strip workpieces

    Directory of Open Access Journals (Sweden)

    R. Kawalla

    2016-12-01

    Full Text Available The paper discusses the production of high-quality cast workpieces of magnesium alloys by strip roll-casting. Producing strips of magnesium alloys by combining the processes of casting and rolling, with liquid melt fed continuously to fast rolls, is quite promising and economical. In sheet stamping, considerable losses of metal occur on festoons formed due to the anisotropy of properties of the strip workpiece, which is determined by the macro- and microstructure and by the rolling and annealing regimes. The principal causes of anisotropic mechanical properties in metal strips produced by the combined casting and rolling technique are the distribution of intermetallic compounds in the strip, the orientation of phases and metal defects, and the residual stresses. One of the tasks in increasing the yield of acceptable products in stamping operations is minimizing the number of defects. Various ways of treating the melt during casting are suggested to lower the level of anisotropy in mechanical properties. Development of the technology for producing strips of magnesium alloys opens the possibility of using them in the automobile industry to manufacture lightweight body elements instead of steel ones.

  17. Workpiece Machining Accuracy Prediction Based on Milling Simulation

    Directory of Open Access Journals (Sweden)

    Lv Yan-peng

    2016-01-01

    Full Text Available To ensure the machining accuracy of a workpiece, it is necessary to predict the workpiece deformation during machining by establishing a high-precision deformation forecast model. To this end, a more efficient variable-stiffness analysis model is proposed, which can obtain quantitative stiffness values of the machining surface. By applying simulated cutting forces at sampling points using the finite element analysis software ABAQUS, the single-direction variable-stiffness rule can be obtained. First, a finite element simulation model of face milling is established with the Johnson-Cook material and failure models for 7050 aluminum alloy. A prediction model is then built based on SVM, with input data provided by the ABAQUS simulations. Results show that the relative error of the model predictions is less than 5%. The model captures the effects of milling parameters on workpiece machining deformation and provides practical guidance for production.
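
    A minimal sketch of the SVM-regression step described above, using scikit-learn's SVR. The synthetic samples and the deformation response below merely stand in for the ABAQUS simulation results; the parameter ranges, kernel settings, and response function are all assumptions for illustration.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Sketch of the SVM-based deformation predictor described above. The
# synthetic samples below stand in for the ABAQUS simulation results;
# they are not the paper's data.

rng = np.random.default_rng(0)
X = rng.uniform([0.5, 100.0, 1000.0],      # [depth of cut mm, feed mm/min, rpm]
                [3.0, 600.0, 6000.0], size=(200, 3))
# Hypothetical deformation response with noise, for illustration only:
y = 0.02 * X[:, 0] * X[:, 1] / 100.0 + 1e-6 * X[:, 2] + rng.normal(0, 0.002, 200)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.001))
model.fit(X, y)

# Predict deformation (mm) for a new parameter set:
print(model.predict([[1.5, 300.0, 3000.0]]))
```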

  18. Physical-chemical quality of onion analyzed under drying temperature

    Science.gov (United States)

    Djaeni, M.; Arifin, U. F.; Sasongko, S. B.

    2017-03-01

    Drying is a conventional process to extend the shelf life of onion. However, active compounds such as vitamins and anthocyanin (reflected in the red color) degrade due to the heat introduced during the process. The objective of this research was to evaluate the thiamine content as well as the color of onion dried at different temperatures. As indicators, the thiamine content and color were observed every 30 minutes for 2 hours. Results showed that thiamine content and color were sensitively influenced by temperature changes. For example, after 2 hours of drying at 50°C the thiamine degradation was 55.37%, whereas at 60°C with the same drying time the degradation was 74.01%. The quality degradation also increased with longer drying times.
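
    If first-order degradation kinetics is assumed (an assumption on our part; the paper only reports percentage losses), rate constants and a rough Arrhenius activation energy can be back-calculated from the two reported values, as sketched below.

```python
import math

# Sketch: back-calculating first-order degradation rate constants from the
# thiamine losses reported above (55.37% after 2 h at 50 degC, 74.01% after
# 2 h at 60 degC). First-order kinetics is an assumption, not a claim of
# the paper.

def rate_constant(fraction_lost, time_min):
    return -math.log(1.0 - fraction_lost) / time_min  # 1/min

k50 = rate_constant(0.5537, 120.0)
k60 = rate_constant(0.7401, 120.0)

# Rough Arrhenius activation energy from the two temperatures:
R = 8.314  # J/(mol K)
Ea = R * math.log(k60 / k50) / (1.0 / 323.15 - 1.0 / 333.15)
print(k50, k60, Ea / 1000.0)  # ~0.0067/min, ~0.0112/min, ~46 kJ/mol
```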

  19. Simulation of the Process of Simultaneous Finishing Batch of Work-pieces

    Directory of Open Access Journals (Sweden)

    N. G. Nazarov

    2015-01-01

    Full Text Available The paper analyzes different approaches to creating mathematical models of abrasive finishing. It highlights the features of abrasive finishing of precision parts at the final processing stages. The paper argues that it is necessary to take into consideration the law of size distribution in the batch of simultaneously treated work-pieces in double-sided lapping of precision plane-parallel and cylindrical parts, such as gauge blocks and cylinders for measuring the pitch diameter. The paper also proposes geometric models of the batch of simultaneously treated work-pieces, for plane-parallel and cylindrical parts lapped in a double-disk lapping machine, as a "general work-piece". On the basis of these geometric models, an expression is derived to calculate the change of working pressure or distributed load (for cylindrical parts) in the process of abrasive finishing. The paper presents experimental results on the change of geometrical dimensions in a batch of gauge-block work-pieces during abrasive processing, confirming the validity of the proposed modeling approach. It is argued that the mathematical model should include expressions describing the change of abrasiveness of the abrasive used for different kinds of lapping, including final lapping. A mathematical model is presented in the form of integro-differential equations describing how the rate of linear material removal in double-sided finishing depends on the speed of relative movement of the work-pieces on the lap (sanding block), on the pressure or distributed load (for cylindrical parts), and on the acceleration of relative motion, with appropriate exponents. An expression taking into account the change of abrasiveness of the abrasive used is introduced in the differential equation. The model is presented in a generalized form, taking into account all possible processing
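
    The paper's integro-differential model is described only in words above. The sketch below integrates a generalized Preston-type law, dh/dt = C·p^a·v^b, as a simplified stand-in; the constant, exponents, and load geometry are hypothetical, and the paper's abrasiveness-decay and acceleration terms are omitted.

```python
import numpy as np

# Sketch: a generalized Preston-type removal law of the kind described
# above, dh/dt = C * p**a * v**b, integrated explicitly over the lapping
# time. The constant and exponents are hypothetical placeholders; the
# paper's integro-differential model additionally includes the decay of
# abrasiveness and an acceleration term, both omitted here.

C, a, b = 1.0e-13, 1.0, 1.0   # assumed Preston constant (Pa^-1) and exponents
dt = 0.1                      # s, integration time step

def material_removed_um(load_N, area_m2, v_m_s, t_end_s):
    p = load_N / area_m2      # working pressure, Pa
    removed = 0.0
    for _ in np.arange(0.0, t_end_s, dt):
        removed += C * p**a * v_m_s**b * dt   # linear removal rate, m/s
    return removed * 1e6      # total linear removal, micrometres

# Example: 50 N over 4 cm^2 at 0.8 m/s relative speed, 60 s of lapping
print(material_removed_um(50.0, 4e-4, 0.8, 60.0))  # ~0.6 um
```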

  20. INFLUENCE OF WORKPIECE SURFACE PREPARATION ON THERMAL BARRIER COATING DURABILITY

    Directory of Open Access Journals (Sweden)

    M. A. Petrova

    2014-01-01

    Full Text Available The article deals with the impact of workpiece surface quality on the adhesive strength and durability of thermal barrier coatings. The results revealed that the roughness of the metal layer influences the adhesion of the ceramic coating and affects the thickness of the ceramic crystals when the electron-beam deposition method is used.

  1. Study on the induction heating of the workpiece before gear rolling process

    Science.gov (United States)

    Ji, Hongchao; Wang, Baoyu; Fu, Xiaobin

    2017-10-01

    Gear forming by cross rolling with local induction heating is an alternative method for manufacturing large-diameter gear parts, and the temperature distribution has a great influence on the subsequent gear forming process. To obtain a satisfactory temperature distribution in the heated workpiece, a model of the induction heating process coupling the electromagnetic and thermal fields and incorporating the rotational motion is established, and the heating rate and temperature distribution along the axial and circumferential directions are investigated. Moreover, a corresponding experiment was carried out, and the heating parameters and temperature distribution were measured by multimeter and infrared thermographic imaging. The comparison of the experimental and simulation results verifies the established model and shows that the simulation results are reliable.

  2. Resultant geometric variation of a fixtured workpiece Part I: a simulation

    Directory of Open Access Journals (Sweden)

    Supapan Sangnui Chaiprapat

    2006-01-01

    Full Text Available When a workpiece is fixtured for a machining or inspection operation, the accuracy of the operation is mainly determined by the efficiency of the fixturing method. Variability in manufactured workpieces is hardly avoidable. When such variability is present at the contact areas between the workpiece and the fixture, errors in location are expected. These errors affect the quality of the features to be produced. This paper develops an algorithm to determine the variant final locations of a displaced workpiece given normally distributed errors at the contact points. The resultant geometric variation of the workpiece location reveals interesting information which is beneficial in tolerance planning.
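
    The abstract does not give the algorithm itself; the sketch below illustrates the general idea with a Monte Carlo propagation of Gaussian height errors at a three-point primary locating plane to the location of a feature point. The locator layout, error sigma, and feature position are all assumptions for illustration.

```python
import numpy as np

# Sketch: Monte Carlo propagation of normally distributed contact-point
# errors to the workpiece location, in the spirit of the simulation
# described above. A 3-point primary locating plane is assumed; the
# locator layout and error sigma are illustrative.

rng = np.random.default_rng(1)
locators = np.array([[0.0, 0.0], [100.0, 0.0], [50.0, 80.0]])  # mm, (x, y)
sigma = 0.01   # mm, std. dev. of height error at each contact

def plane_from_heights(z):
    # Solve z = c0 + c1*x + c2*y through the three contact points.
    A = np.column_stack([np.ones(3), locators])
    return np.linalg.solve(A, z)

# Height deviation of a feature point on the datum plane, over many trials:
feature_xy = np.array([80.0, 40.0])
devs = []
for _ in range(10000):
    c = plane_from_heights(rng.normal(0.0, sigma, 3))
    devs.append(c[0] + c[1] * feature_xy[0] + c[2] * feature_xy[1])
print(np.std(devs))  # spread of the feature's location error, mm
```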

  3. Numerical Simulation of a Grinding Process for the Spatial Work-pieces: Modeling of Grinding Forces and System Dynamics

    Directory of Open Access Journals (Sweden)

    I. A. Kiselev

    2015-01-01

    Full Text Available The paper describes a computational-experimental technique to determine the coefficients of a grinding force model using the Nelder-Mead algorithm. As the error function, the paper proposes a measure of the deviation between calculated and experimental grinding forces averaged over a single pass of the grinding wheel. As an example, the calculation of cutting-force model coefficients is analyzed for a linear model in which the grinding forces depend on the uncut chip thickness. The coefficients vary with the geometric parameters of the abrasive grains and are determined by the authors' method based on the Nelder-Mead technique, using forces measured during plane grinding of a test work-piece. The model coefficients are identified by comparing the measured data with the results of modeling for grinding by a tool with uniformly distributed abrasive grains having a triangular cutting-edge shape. Grinding dynamics simulation applying the determined coefficients was carried out for the machining of a cantilevered plane work-piece as a test example. The work-piece was processed by transverse passes of the grinding wheel made at different distances from the fixation. For some technological parameters of grinding, a self-oscillating process accompanied by a high level of vibration was observed. The simulation showed qualitative and quantitative agreement with the experiment. It was shown that the intensity of the self-oscillating process arising during machining depends on the work-piece rigidity and the cutting conditions. The results of the modeling can be applied in practice when developing technological processes for grinding spatial work-pieces.

  4. Experiments for practical education in process parameter optimization for selective laser sintering to increase workpiece quality

    Science.gov (United States)

    Reutterer, Bernd; Traxler, Lukas; Bayer, Natascha; Drauschke, Andreas

    2016-04-01

    Selective laser sintering (SLS) is considered one of the most important additive manufacturing processes due to component stability and its broad range of usable materials. However, the influence of the different process parameters on mechanical workpiece properties is still poorly studied, so further optimization is necessary to increase workpiece quality. In order to investigate the impact of various process parameters, laboratory experiments were implemented to improve the understanding of the limitations and advantages of SLS at an educational level. The experiments are based on two different workstations used to teach students the fundamentals of SLS. First, a 50 W CO2 laser workstation is used to investigate the interaction of the laser beam with the material for varied process parameters, analyzing a single-layered test piece. Second, the FORMIGA P110 laser sintering system from EOS is used to print different 3D test pieces as a function of various process parameters. Finally, quality attributes are tested, including warpage, dimensional accuracy, and tensile strength. For dimension measurements and evaluation of the surface structure, a telecentric lens in combination with a camera is used; a tensile testing machine allows measurement of the tensile strength and interpretation of stress-strain curves. The developed laboratory experiments are suitable for teaching students the influence of processing parameters; students learn to optimize the input parameters depending on the component to be manufactured and to increase the overall quality of the final workpiece.

  5. Method for analyzing passive silicon carbide thermometry with a continuous dilatometer to determine irradiation temperature

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Anne A., E-mail: campbellaa@ornl.gov [Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Porter, Wallace D.; Katoh, Yutai [Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Snead, Lance L. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States)

    2016-03-01

    Highlights: • Annealing of SiC via continuous dilatometry to determine irradiation temperature. • Wrote a program to analyze dilatometry results to determine irradiation temperature. • Dilatometry results are consistent with results from a historical technique. • Computer program was written in an open-source language and is available for others. - Abstract: Silicon carbide is used as a passive post-irradiation temperature monitor because the irradiation defects will anneal out above the irradiation temperature. The irradiation temperature is determined by measuring a property change after isochronal annealing, i.e., lattice spacing, dimensions, electrical resistivity, thermal diffusivity, or bulk density. However, such methods are time-consuming since the steps involved must be performed in a serial manner. This work presents the use of thermal expansion from continuous dilatometry to calculate the SiC irradiation temperature, which is an automated process requiring minimal setup time. Analysis software was written that performs the calculations to obtain the irradiation temperature and removes possible user-introduced error while standardizing the analysis. This method has been compared to an electrical resistivity and isochronal annealing investigation, and the results revealed agreement of the calculated temperatures. These results show that dilatometry is a reliable and less time-intensive process for determining irradiation temperature from passive SiC thermometry.

  6. The effect of ambient temperature on the anti-D assay using the Auto Analyzer

    Science.gov (United States)

    Gunson, H. H.; Phillips, P. K.; Stratton, F.

    1974-01-01

    Using the AutoAnalyzer, the percentage agglutination effected by the anti-D antisera studied showed a varied dependence on the ambient temperature over the manifold subsequent to the incubation period at 37°C. This leads to assays which are a function of the ambient temperature. It is suggested that the entry of a relatively large volume of rouleaux-dispersing agent results in an elution of bound antibody to a new position of equilibrium, the shift being dependent on the particular equilibrium constant of the antibody and the rate of its attainment on the ambient temperature. A constant ambient temperature will lead to greater accuracy of anti-D assay. PMID:4212401

  7. 3D pose estimation of large and complicated workpieces based on binocular stereo vision.

    Science.gov (United States)

    Luo, Zhifeng; Zhang, Ke; Wang, Zhigang; Zheng, Jian; Chen, Yixin

    2017-08-20

    A binocular stereo vision method is proposed for automatically locating the position and posture of workpieces, which is especially important when processing large, complicated structures, for example in the laser hardening and laser cladding of automotive dies. First, a binocular stereo vision positioning system was designed and modeled, and a background-subtraction method was proposed to extract the edge lines of the foreground area. Furthermore, the intersection points of the workpiece contour lines were taken as the characteristic points of the workpiece, and an algorithm that combines the epipolar constraint with gray-value similarity was proposed to quickly and accurately match the feature points. Finally, experiments show that the workpiece can be positioned accurately, with the precision of position recognition controlled within ±0.5 mm when the camera was 1 m away from the workpiece, meeting the requirements of robotic processing.
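
    A minimal sketch of the matching idea named above (epipolar constraint plus gray-value similarity), assuming rectified images so the epipolar line of a left-image point is simply the same row in the right image, and using normalized cross-correlation as the similarity measure. The window size and disparity range are illustrative; the paper's actual implementation details are not reproduced.

```python
import numpy as np

# Sketch: match a feature point by searching along its epipolar line
# (here, the same image row, assuming rectified images) and scoring
# candidates by gray-value similarity (normalized cross-correlation).

def ncc(a, b):
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else -1.0

def match_point(left, right, u, v, half=7, max_disp=120):
    """Find the right-image column matching left-image pixel (row v, col u)."""
    tmpl = left[v - half:v + half + 1, u - half:u + half + 1]
    best, best_u = -1.0, u
    for d in range(max_disp):            # search along the epipolar row
        ur = u - d
        if ur - half < 0:
            break
        cand = right[v - half:v + half + 1, ur - half:ur + half + 1]
        score = ncc(tmpl, cand)
        if score > best:
            best, best_u = score, ur
    return best_u, best                  # matched column and its NCC score
```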

  8. Influence of Workpiece Material on Tool Wear Performance and Tribofilm Formation in Machining Hardened Steel

    Directory of Open Access Journals (Sweden)

    Junfeng Yuan

    2016-04-01

    Full Text Available In addition to the bulk properties of a workpiece material, characteristics of the tribofilms formed as a result of workpiece material mass transfer to the friction surface play a significant role in friction control. This is especially true in cutting of hardened materials, where it is very difficult to use liquid-based lubricants. To better understand wear performance and the formation of beneficial tribofilms, this study presents an assessment of uncoated mixed alumina ceramic tools (Al2O3+TiC) in the turning of two grades of steel, AISI T1 and AISI D2. Both workpiece materials were hardened to 59 HRC and then machined under identical cutting conditions. Comprehensive characterization of the resulting wear patterns and the tribofilms formed at the tool/workpiece interface was made using X-ray photoelectron spectroscopy and scanning electron microscopy. Metallographic studies of the workpiece material were performed before machining, and the surface integrity of the machined part was investigated after machining. Tool life was 23% higher when turning D2 than T1. This improvement in cutting tool life and wear behaviour was attributed to differences in (1) tribofilm generation on the friction surface and (2) the amount and distribution of carbide phases in the workpiece materials. The results show that wear performance depends both on the properties of the workpiece material and on the characteristics of the tribofilms formed on the friction surface.

  9. Analyzing the impact of ambient temperature indicators on transformer life in different regions of Chinese mainland.

    Science.gov (United States)

    Bai, Cui-fen; Gao, Wen-Sheng; Liu, Tong

    2013-01-01

    Regression analysis is applied to quantitatively analyze the impact of different ambient temperature characteristics on transformer life at different locations in the Chinese mainland. 200 typical locations are selected for the study and divided into six regions so that the subsequent analysis can be done in a regional context. For each region, the local historical ambient temperature and load data are provided as input variables to the life consumption model in IEEE Std. C57.91-1995 to estimate the transformer life at every location. Five ambient temperature indicators related to transformer life are entered into a partial least squares regression to describe their impact on transformer life. According to a contribution measurement criterion of partial least squares regression, three indicators are found to be the most important factors influencing transformer life, and an explicit expression is provided to describe the relationship between the indicators and the transformer life for every region. The analysis is applicable to areas where the temperature characteristics are similar to those of the Chinese mainland, and the expressions obtained can be applied to other locations not included in this paper if these three indicators are known.

  10. Analyzing the Impact of Ambient Temperature Indicators on Transformer Life in Different Regions of Chinese Mainland

    Directory of Open Access Journals (Sweden)

    Cui-fen Bai

    2013-01-01

    Full Text Available Regression analysis is applied to quantitatively analyze the impact of different ambient temperature characteristics on transformer life at different locations in the Chinese mainland. 200 typical locations are selected for the study and divided into six regions so that the subsequent analysis can be done in a regional context. For each region, the local historical ambient temperature and load data are provided as input variables to the life consumption model in IEEE Std. C57.91-1995 to estimate the transformer life at every location. Five ambient temperature indicators related to transformer life are entered into a partial least squares regression to describe their impact on transformer life. According to a contribution measurement criterion of partial least squares regression, three indicators are found to be the most important factors influencing transformer life, and an explicit expression is provided to describe the relationship between the indicators and the transformer life for every region. The analysis is applicable to areas where the temperature characteristics are similar to those of the Chinese mainland, and the expressions obtained can be applied to other locations not included in this paper if these three indicators are known.
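
    The life consumption model cited above, IEEE Std C57.91-1995, defines a per-interval aging acceleration factor based on the winding hot-spot temperature (110 °C reference for 65 °C rise insulation). The sketch below evaluates it over a short series of hourly hot-spot values; those values are hypothetical stand-ins for temperatures derived from the ambient and load data.

```python
import math

# Sketch: the aging acceleration factor from IEEE Std C57.91-1995 used in
# the life-consumption model referenced above (65 degC rise insulation,
# 110 degC reference hot-spot). The hourly hot-spot temperatures below are
# hypothetical stand-ins for values derived from ambient and load data.

def aging_acceleration(theta_hs_c):
    return math.exp(15000.0 / 383.0 - 15000.0 / (theta_hs_c + 273.0))

hot_spot_c = [85.0, 92.0, 101.0, 110.0, 118.0, 104.0]  # hourly values
f_eqa = sum(aging_acceleration(t) for t in hot_spot_c) / len(hot_spot_c)
print(f_eqa)  # equivalent aging factor over the period (1.0 at 110 degC)
```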

  11. Analysis of students’ higher-order thinking skills on the concept of heat and temperature

    Science.gov (United States)

    Slamet Budiarti, Indah; Suparmi, A.; Sarwanto; Harjana

    2017-11-01

    Higher-order thinking skills refer to the three highest domains of the revised Bloom taxonomy. The aim of the research was to analyze students’ higher-order thinking skills on the concept of heat and temperature. The sample, taken by a purposive random sampling technique, consisted of 85 high school students from 3 senior high schools in Jayapura city. A descriptive qualitative method was employed in this study. The data were collected by using tests and interviews on the subject matter of heat and temperature. Based on the results of the data analysis, it was concluded that 68.24% of the students show higher-order thinking skills in analyzing, 3.53% in evaluating, and 0% in creating.

  12. OPTIMIZATION OF HARDENING TEMPERATURE FOR IMPROVING THE HEAT RESISTANCE OF TOOL STEEL 4X5MФ1С IN VARIOUS WORKPIECES. Part 1. INFLUENCE OF A 1040 °С HEATING TEMPERATURE FOR OIL HARDENING AND TEMPERING ON THE HARDNESS AND STRUCTURE OF FORGINGS AND CASTINGS MADE OF STEEL 4X5MФ1C

    Directory of Open Access Journals (Sweden)

    V. N. Fedulov

    2017-01-01

    Full Text Available The influence of oil quenching from a heating temperature of 1040 °С (held for about 1 hour) on the microstructure of tool steel 4X5МФ1С forgings and castings, and on their capacity for hardening after high-temperature tempering at 500-650 °C for 1.5 hours, is examined. It is shown that an increase of the hardening level above the required index was not achieved.

  13. Effect of feed rate, workpiece hardness and cutting edge on subsurface residual stress in the hard turning of bearing steel using chamfer + hone cutting edge geometry

    Energy Technology Data Exchange (ETDEWEB)

    Hua, Jiang; Shivpuri, Rajiv; Cheng, Xiaomin; Bedekar, Vikram (Industrial, Welding and Systems Engineering, The Ohio State University, Columbus, OH, United States); Matsumoto, Yoichi; Hashimoto, Fukuo (Timken Research, The Timken Company, Canton, OH, United States); Watkins, Thomas R. (High Temperature Materials Laboratory, Metals and Ceramics Division, Oak Ridge National Laboratory, Oak Ridge, TN, United States)

    2005-03-15

    Residual stress in the machined surface and subsurface is known to influence the service quality of a component, such as fatigue life, tribological properties, and distortion; it is therefore essential to predict and control it for enhanced performance. In this paper, a newly proposed hardness-based flow stress model is incorporated into an elastic-viscoplastic finite element model of hard turning to analyze the process variables that affect the residual stress profile of the machined surface. The effects of cutting edge geometry and workpiece hardness, as well as cutting conditions such as feed rate and cutting speed, are investigated. Numerical analysis shows that a hone-plus-chamfer cutting edge and an aggressive feed rate help to increase both the compressive residual stress and its penetration depth. These predictions are validated by face turning experiments conducted with a chamfer-plus-hone cutting edge for different material hardnesses and cutting parameters. The residual stresses under the machined surface were measured by the X-ray diffraction/electropolishing method. A maximum circumferential residual stress of about 1700 MPa at a depth of 40 µm is reached for a hardness of 62 HRC and a feed rate of 0.56 mm/rev; this represents a significant increase over previously reported results in the literature. It is found from this analysis that using a medium hone radius (0.02-0.05 mm) plus chamfer is good for keeping the tool temperature and cutting force low while obtaining the desired residual stress profile.

  14. Cutter-workpiece engagement determination for general milling using triangle mesh modeling

    Directory of Open Access Journals (Sweden)

    Xun Gong

    2016-04-01

    Full Text Available Cutter-workpiece engagement (CWE) is the instantaneous contact geometry between the cutter and the in-process workpiece during machining. It plays an important role in machining process simulation and directly affects the calculation of the predicted cutting forces and torques. The difficulty and challenge of CWE determination come from the complexity of the changing geometry of the in-process workpiece and the curved tool path of the cutter movement, especially in multi-axis milling. This paper presents a new method to determine the CWE for general milling processes. To fulfill the requirement of generality (any cutter type, any in-process workpiece shape, and any tool path, even with self-intersections), all the associated geometries are modeled as triangle meshes. The required triangle-to-triangle intersection calculations are carried out by an effective method in order to realize the multiple subtraction Boolean operations between the tool and the workpiece mesh models and to determine the CWE. The presented method has been validated by a series of case studies of increasing machining complexity to demonstrate its applicability to general milling processes.

  15. Estimating the workpiece-backingplate heat transfer coefficient in friction stir welding

    DEFF Research Database (Denmark)

    Larsen, Anders; Stolpe, Mathias; Hattel, Jesper Henri

    2012-01-01

    Purpose - The purpose of this paper is to determine the magnitude and spatial distribution of the heat transfer coefficient between the workpiece and the backingplate in a friction stir welding process using inverse modelling. Design/methodology/approach - The magnitude and distribution of the heat transfer coefficient are treated as unknowns in an inverse modeling approach, which yields optimal values for both; four different parameterisations of the coefficient's distribution are considered. Findings - It is found that the heat transfer coefficient between the workpiece and the backingplate is non-uniform and takes its maximum value in a region below the welding tool. © Emerald Group Publishing Limited.

  16. Cutting temperature measurement and material machinability

    Directory of Open Access Journals (Sweden)

    Nedić Bogdan P.

    2014-01-01

    Full Text Available Cutting temperature is a very important parameter of the cutting process. Around 90% of the heat generated during cutting is carried away by the chips, and the rest is transferred to the tool and workpiece. In this research the cutting temperature was measured with artificial thermocouples, and the assessment of metal machinability from the aspect of cutting temperature was analyzed. For machinability investigations in turning, an artificial thermocouple was placed just below the cutting tip of the insert; for drilling, thermocouples were placed through screw holes on the face surface. In this way a simple, reliable, economical, and accurate method for investigating machinability was obtained.

  17. Specific Features of Chip Making and Work-piece Surface Layer Formation in Machining Thermal Coatings

    Directory of Open Access Journals (Sweden)

    V. M. Yaroslavtsev

    2016-01-01

    Full Text Available Wear- and erosion-resistant high-temperature coatings made by thermal spraying methods are characterized by a wide range of unique engineering, structural, and performance properties inherent in metallic composites. This allows their use both in manufacturing processes, to enhance the wear strength of products that have to operate under cyclic loading, high contact pressures, corrosion, and high temperatures, and in product renewal. Thermal coatings contribute to a qualitative improvement in the technical level of production and product restoration using ceramic composite materials. However, the potential for significantly increased product performance and for reduced labour hours and material consumption in manufacturing and restoration largely depends on the surface layer quality achieved at the finishing stage, which is usually provided by various kinds of machining. When machining plasma-sprayed thermal coatings, the removal of the cut-off layer is governed by the coating's distinctive features: a layered structure, high internal stresses, low ductility, a strong tendency of the surface layer to strengthening and rehardening, porosity, high abrasiveness, etc. These properties result in specific characteristics of chip formation and of the formation of the workpiece surface layer. The chip formation of plasma-sprayed coatings was studied at micro-velocities using an experimental tool-setting microscope-based setup created at BMSTU, which allowed simultaneous recording of the individual stages (phases) of the chip formation process and the operating force factors. It is found that the formation of individual chip elements is accompanied by multiple micro-cracks that cause chipping-off of small particles of material. The main crack emerging in the cut-off layer leads to separation of the largest chip element. Then all the stages

  18. Extracting and Analyzing the Warming Trend in Global and Hemispheric Temperatures

    NARCIS (Netherlands)

    Estrada, Francisco; Perron, Pierre

    2017-01-01

    This article offers an updated and extended attribution analysis based on recently published versions of temperature and forcing datasets. It shows that both temperature and radiative forcing variables can be best represented as trend stationary processes with structural changes occurring in the

  19. Analyzing of chromaticity temperature of novel bulb composed of PDMS and phosphor

    Science.gov (United States)

    Novak, M.; Fajkus, M.; Jargus, J.; Bednarek, L.; Cubik, J.; Cvejn, D.; Vasinek, V.

    2017-10-01

    The authors of this article focus on measuring the chromaticity temperature of proposed bulbs made from polydimethylsiloxane (PDMS) as a function of the bulb temperature. The advantage of this solution is its immunity to electromagnetic interference (EMI) and its usability, for example, in dangerous environments (such as mines and factories). The bulbs were made from the transparent two-component elastomer Sylgard 184: a mixture of PDMS and a curing agent in a defined ratio (10:1), with an admixture of the garnet phosphor YAG:Ce, was cured in a temperature box at 90°C ± 3°C in the shape of a bulb. All experiments were realized with eight different weight ratios of phosphor to Sylgard 184. Optical power (5 W) from a laser with a wavelength of 455 nm was fed to the proposed bulbs through a cylindrical polydimethylsiloxane waveguide with a diameter of 5 mm. The chromaticity temperature was measured for 12 h, with two temperature sensors monitoring the bulb temperature. The outcome of this study is an evaluation of the chromaticity temperature of the output light as a function of temperature variations of the proposed bulbs due to the conversion of optical power into heat.

  20. Crossed, Small-Deflection Energy Analyzer for Wind/Temperature Spectrometer

    Science.gov (United States)

    Herrero, Federico A.; Finne, Theodore T.

    2010-01-01

    Determination of neutral winds and ion drifts in low-Earth-orbit missions requires measurements of the angular and energy distributions of the flux of neutrals and ions entering the satellite from the ram direction. The magnitude and direction of the neutral-wind (or ion-drift) determine the location of the maximum in the angular distribution of the flux. Knowledge of the angle of maximum flux with respect to satellite coordinates (pointing) is essential to determine the wind (or ion-drift) vector. The crossed Small-Deflection Energy Analyzer (SDEA) spectrometer (see Figure 1) occupies minimal volume and consumes minimal power. Designed for upper atmosphere/ionosphere investigations at Earth altitudes above 100 km, the spectrometer operates by detecting the angular and energy distributions of neutral atoms/molecules and ions in two mutually perpendicular planes. In this configuration, the two detection planes actually cross at the spectrometer center. It is possible to merge two SDEAs so they share a common optical axis and alternate measurements between two perpendicular planes, and reduce the number of ion sources from two to one. This minimizes the volume and footprint significantly and reduces the ion source power by a factor of two. The area of the entrance aperture affects the number of ions detected/second and also determines the energy resolution. Thermionic emitters require heater power of about 100 mW to produce 1 mA of electron beam current. Typically, electron energy is about 100 eV and requires a 100-V supply for electron acceleration to supply an additional 100 mW of power. Thus, ion source power is at most 200 mW. If two ion sources were to be used, the ion source power would be, at most, 400 mW. Detector power, deflection voltage power, and microcontroller and other functions require less than 150 mW. A WTS (wind/ temperature spectrometer) with two separate optical axes would consume about 650 mW, while the crossed SDEA described here consumes about

  1. Effect of changing polarity of graphite tool/Hadfield steel workpiece couple on machining performances in die sinking EDM

    Directory of Open Access Journals (Sweden)

    Özerkan Haci Bekir

    2017-01-01

    Full Text Available In this study, machining performance output parameters, such as machined surface roughness (SR), material removal rate (MRR), and tool wear rate (TWR), were experimentally examined and analyzed under varied machining parameters in electrical discharge machining (EDM). The processing (input) parameters of this research are the tool material, peak current (I), pulse duration (t_on), and pulse interval (t_off). The experimental machining was carried out on a Hadfield steel workpiece using prismatic and cylindrical graphite electrodes with a kerosene dielectric at different machining current, polarity, and pulse time settings. The experiments have shown that the type of tool material, the polarity (direct polarity yields higher MRR, SR, and TWR), the current (high current lowers TWR and enhances MRR and SR), and the pulse on time (t_on = 48 µs is a critical threshold value for MRR and TWR) were influential on machining performance in electrical discharge machining.

  2. Abnormal Condition Monitoring of Workpieces Based on RFID for Wisdom Manufacturing Workshops.

    Science.gov (United States)

    Zhang, Cunji; Yao, Xifan; Zhang, Jianming

    2015-12-03

    Radio Frequency Identification (RFID) technology has been widely used in many fields. However, previous studies have mainly focused on product life-cycle tracking, and there are few studies on real-time status monitoring of workpieces in manufacturing workshops. In this paper, a wisdom manufacturing model is introduced, a sensing-aware environment for a wisdom manufacturing workshop is constructed, and RFID event models are defined. A synthetic data cleaning method is applied to clean the raw RFID data, and Complex Event Processing (CEP) technology is adopted to monitor abnormal conditions of workpieces in real time. The RFID data cleaning method and data mining technology are examined by simulation and physical experiments. The results show that the synthetic data cleaning method preprocesses the data well, and that CEP based on the Rifidi® Edge Server technology accomplished real-time abnormal condition monitoring of workpieces. This paper reveals the importance of RFID spatial and temporal data analysis for real-time status monitoring of workpieces in wisdom manufacturing workshops.

  3. Abnormal Condition Monitoring of Workpieces Based on RFID for Wisdom Manufacturing Workshops

    Directory of Open Access Journals (Sweden)

    Cunji Zhang

    2015-12-01

    Full Text Available Radio Frequency Identification (RFID) technology has been widely used in many fields. However, previous studies have mainly focused on product life-cycle tracking, and there are few studies on real-time status monitoring of workpieces in manufacturing workshops. In this paper, a wisdom manufacturing model is introduced, a sensing-aware environment for a wisdom manufacturing workshop is constructed, and RFID event models are defined. A synthetic data cleaning method is applied to clean the raw RFID data, and Complex Event Processing (CEP) technology is adopted to monitor abnormal conditions of workpieces in real time. The RFID data cleaning method and data mining technology are examined by simulation and physical experiments. The results show that the synthetic data cleaning method preprocesses the data well, and that CEP based on the Rifidi® Edge Server technology accomplished real-time abnormal condition monitoring of workpieces. This paper reveals the importance of RFID spatial and temporal data analysis in real-time status monitoring of workpieces in wisdom manufacturing workshops.
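
    The paper's CEP rules run on the Rifidi® Edge Server's event engine and are not given in the abstract. The plain-Python sketch below illustrates one typical abnormal-condition pattern of this kind: a workpiece whose tag is read at a station entry but produces no exit read within a time limit. Reader names, the timeout, and tag IDs are hypothetical.

```python
# Sketch of one typical CEP rule for workpiece monitoring: flag a workpiece
# as overdue when its tag is read at a station entry but no exit read
# arrives within a time limit. This plain-Python stand-in only illustrates
# the pattern; the paper uses the Rifidi Edge Server's CEP engine.

TIMEOUT_S = 300.0
pending = {}            # tag id -> entry timestamp

def on_rfid_event(tag, reader, t_s, alerts):
    if reader == "station_entry":
        pending[tag] = t_s
    elif reader == "station_exit":
        pending.pop(tag, None)
    # Check all open workpieces against the timeout:
    for wp, t_in in list(pending.items()):
        if t_s - t_in > TIMEOUT_S:
            alerts.append((wp, "overdue at station"))
            del pending[wp]

alerts = []
on_rfid_event("EPC01", "station_entry", 0.0, alerts)
on_rfid_event("EPC02", "station_entry", 10.0, alerts)
on_rfid_event("EPC01", "station_exit", 120.0, alerts)
on_rfid_event("EPC03", "station_entry", 400.0, alerts)  # triggers EPC02 alert
print(alerts)
```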

  4. Influence of Lubricant Pocket Geometry upon Lubrication Mechanisms on Tool-Workpiece Interfaces in Metal Forming

    DEFF Research Database (Denmark)

    Shimizu, I; Martins, P.A.F.; Bay, Niels

    2004-01-01

    Micro lubricant pockets located on the surface of plastically deforming workpieces are recognized to improve the performance of fluid lubrication in metal forming processes. This work investigates the joint influence of pocket geometry and process working conditions on micro lubrication mechani...

  5. Using basic metrics to analyze high-resolution temperature data in the subsurface

    Science.gov (United States)

    Shanafield, Margaret; McCallum, James L.; Cook, Peter G.; Noorduijn, Saskia

    2017-08-01

    Time-series temperature data can be summarized to provide valuable information on spatial variation in subsurface flow, using simple metrics. Such computationally light analysis is often discounted in favor of more complex models. However, this study demonstrates the merits of summarizing high-resolution temperature data, obtained from a fiber optic cable installation at several depths within a water delivery channel, into daily amplitudes and mean temperatures. These results are compared to fluid flux estimates from a one-dimensional (1D) advection-conduction model and to the results of a previous study that used a full three-dimensional (3D) model. At a depth of 0.1 m below the channel, plots of amplitude suggested areas of advective water movement (as confirmed by the 1D and 3D models). Due to lack of diurnal signal at depths below 0.1 m, mean temperature was better able to identify probable areas of water movement at depths of 0.25-0.5 m below the channel. The high density of measurements provided a 3D picture of temperature change over time within the study reach, and would be suitable for long-term monitoring in man-made environments such as constructed wetlands, recharge basins, and water-delivery channels, where a firm understanding of spatial and temporal variation in infiltration is imperative for optimal functioning.
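
    Such a reduction is easy to reproduce; the Python sketch below, with synthetic data standing in for the fiber-optic records, computes the two metrics used in the study, daily mean and daily amplitude. The sampling frequency and noise level are illustrative assumptions.

        import numpy as np
        import pandas as pd

        idx = pd.date_range("2015-01-01", periods=4 * 24 * 14, freq="15min")  # 14 days
        hours = np.arange(len(idx)) * 0.25
        temp = 20 + 3 * np.sin(2 * np.pi * hours / 24) + np.random.normal(0, 0.1, len(idx))
        series = pd.Series(temp, index=idx, name="T")

        daily = series.resample("D").agg(["mean", "max", "min"])
        daily["amplitude"] = daily["max"] - daily["min"]   # diurnal peak-to-peak
        print(daily[["mean", "amplitude"]].head())
        # A damped amplitude with depth suggests advective water movement; where the
        # diurnal signal vanishes, the daily mean carries the remaining information.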

  6. A Model for Analyzing Temperature Profiles in Pipe Walls and Fluids Using Mathematical Experimentation

    Directory of Open Access Journals (Sweden)

    Moses E. Emetere

    2014-09-01

    Full Text Available Temperature profiling in both fluid and pipe walls has not previously been explained theoretically. The equations of energy balance and heat conductivity were queried by introducing known parameters to solve heat transfer using virtual mathematical experimentation. This was achieved by remodeling Poiseuille's equation. The distribution of temperature profiles between the pipe wall, the fluid flow, and the surrounding air was investigated and validated by comparison with experimental results. A new dimensionless parameter, the unified number (U), was introduced with the aim of resolving known errors of the Reynolds and Nusselt numbers.

  7. Real-Time Deflection Monitoring for Milling of a Thin-Walled Workpiece by Using PVDF Thin-Film Sensors with a Cantilevered Beam as a Case Study

    Directory of Open Access Journals (Sweden)

    Ming Luo

    2016-09-01

    Full Text Available Thin-walled workpieces, such as aero-engine blisks and casings, are usually made of hard-to-cut materials. The wall thickness is very small and the wall deflects easily during the milling process under dynamic cutting forces, leading to inaccurate workpiece dimensions and poor surface integrity. To understand the workpiece deflection behavior in a machining process, a new real-time nonintrusive method for deflection monitoring is presented, and a detailed analysis of workpiece deflection for the different stages of the whole machining process is discussed. A thin-film polyvinylidene fluoride (PVDF) sensor is attached to the non-machining surface of the workpiece to copy the deflection excited by the dynamic cutting force. The relationship between the input deflection and the output voltage of the monitoring system is calibrated by testing. Monitored workpiece deflection results show that the workpiece experiences obvious vibration as the cutter enters it, and vibration during the machining process can be easily tracked by monitoring the deflection of the workpiece. As the cutter exits the workpiece, the workpiece first experiences forced vibration, followed by free vibration whose amplitude decays to zero after the cutter has left. Machining results confirmed the suitability of the deflection monitoring system for machining thin-walled workpieces with the application of PVDF sensors.

  8. Change in Water-Holding Capacity in Mushroom with Temperature Analyzed by Flory-Rehner Theory

    NARCIS (Netherlands)

    Paudel, Ekaraj; Boom, R.M.; Sman, van der R.G.M.

    2015-01-01

    The change in water-holding capacity of mushroom with temperature was interpreted using the Flory-Rehner theory for the swelling of polymeric networks, extended with the Debye-Hückel theory for electrolytic interactions. The validity of these theories has been verified with independent sorption

  9. Triclustering Georeferenced Time Series for Analyzing Patterns of Intra-Annual Variability in Temperature

    NARCIS (Netherlands)

    Wu, Xiaojing; Zurita-Milla, R.; Izquierdo Verdiguier, E.; Kraak, M.J.

    2017-01-01

    Clustering is often used to explore patterns in georeferenced time series (GTS). Most clustering studies, however, only analyze GTS from one or two dimension(s) and are not capable of the simultaneous analysis of the data from three dimensions: spatial, temporal, and any third (e.g., attribute)

  10. [X-ray hardening correction for ICT in testing workpiece].

    Science.gov (United States)

    Peng, Guang-han; Cai, Xin-hua; Han, Zhong; Yang, Xue-heng

    2008-06-01

    Since the energy spectrum of the X-ray source in industrial computerized tomography is polychromatic, and the attenuation coefficient varies with energy, the lower-energy portion of the radiation is absorbed preferentially as X-rays pass through a material; the higher the X-ray energy, the lower the attenuation coefficient. With increasing transmission thickness it therefore becomes easier for the remaining beam to penetrate the matter, and the energy spectrum of the X-ray hardens as a result of the interaction between the X-rays and the material. This produces false images in reconstructed X-ray industrial computerized tomography, so the hardening of the X-ray energy spectrum has to be corrected. In the present paper, the hardening phenomenon of X-rays transmitting the material is analyzed, and the relation between the X-ray beam sum and the X-ray transmission thickness is discussed. According to the Beer law and the characteristics of the interaction of X-rays with matter, and using measured beam-sum data, a relation equation is fitted between the X-ray beam sum and the X-ray transmission thickness. An equivalence method is then applied to correct the measured beam sums, and an equivalent monochromatic attenuation coefficient fitted for the X-rays transmitting the material is derived. This fitted attenuation coefficient is used in back-projection image reconstruction, so the effect caused by X-ray beam hardening is effectively removed in X-ray industrial computerized tomography.
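
    The correction chain described above (calibrate the beam-sum versus thickness relation, then map each measured beam sum to an equivalent monochromatic projection) can be sketched in a few lines; the calibration numbers and the cubic fit order below are invented for illustration and are not the paper's values.

        import numpy as np

        t_cal = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 4.0])      # step-wedge thickness, cm
        p_cal = np.array([0.0, 0.45, 0.85, 1.55, 2.15, 2.65])  # measured beam sums, sub-linear

        coeffs = np.polyfit(p_cal, t_cal, deg=3)   # invert: thickness as a polynomial in p
        mu_eff = p_cal[1] / t_cal[1]               # effective attenuation at small thickness

        def correct_projection(p_measured):
            """Map polychromatic beam sums to equivalent monochromatic projections."""
            t_equiv = np.polyval(coeffs, p_measured)
            return mu_eff * t_equiv                # now approximately linear in thickness

        print(correct_projection(np.array([0.85, 2.15])))   # feed these to back-projection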

  11. Method of testing carbide inserts for premature fracture by face milling of cylindrical workpieces

    Science.gov (United States)

    Kitagawa, R.; Akasawa, T.; Okusa, K.

    1984-12-01

    Methods are proposed for face milling solid cylindrical workpieces or half-cut and hollow cylindrical workpieces prepared from rectangular blocks by continuously changing both or either of the angles of engagement and disengagement. Carbide inserts are tested for premature fracture before the onset of steady wear using these face-milling methods. The premature fracture indicates the insufficient toughness of carbides to perform a given machining job. As carbides of higher wear resistance have lower shock resistance in general, they must be tested for premature fracture due to the lack of toughness to select suitable carbides for specific cutting applications. The test results obtained under the present study show that the premature fracture of carbides, whose toughness was classified by static toughness tests, can be evaluated dynamically and easily by the proposed face-milling methods.

  12. Impact of Machined Workpiece Position in Workspace on Precision of Manufacturing Using an Angular Robot

    Science.gov (United States)

    Kusá, Martina; Bárta, Jozef

    2016-09-01

    Use of an industrial robot in the production process as a spindle carrier is currently an interesting topic. However, if the stiffness of the robot is not sufficient, various imprecisions may occur during machining. The article deals with monitoring and evaluating the impact of cutting conditions and positions of the workpiece in the working area on the machined surfaces oriented in orthogonal planes. The aim of the experiment was to analyse the precision of simple planar surfaces milled using a robot.

  13. Accuracy of Setting Work-pieces on Automatic Lines with Pallet-fixtures

    Directory of Open Access Journals (Sweden)

    L. A. Kolesnikov

    2015-01-01

    Full Text Available The accuracy of positioning surfaces to be processed on automatic lines with pallet-fixtures essentially depends on the setting error of the pallet-fixtures with work-pieces in the ready-to-work position. The currently used methods for calculating the setting error do not give a complete picture of the possible coordinates of a point when the pallet is displaced in different directions. The aim of the work was to determine the accuracy of setting work-pieces on automatic lines with pallet-fixtures, using a computational and analytical method, in order to improve the manufacturing precision of parts. The paper offers a method of an equivalent mechanism to determine the whole variety of displacements in the horizontal plane with diverse combinations of angular and plane-parallel displacements. Using a four-bar linkage as the equivalent mechanism allows us to define the zone of possible positions of any point of the work-piece pallet platform as the zone bounded by the coupler curve. If the gaps in the nodes of the two fixtures are equal, the zone of possible positions of the point is determined by a circle for parallel displacement of the platform and by an ellipse for angular displacement. The obtained analytical dependences allow the error to be determined at the design stage for given gaps in the fixture nodes. The above method of calculation makes it possible to define the zone of appropriate placement of the work-piece on its platform, for the specified parameters of the pallet, to meet the conditions for ensuring the coordinate accuracy of the processed axes of holes.

  14. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Directory of Open Access Journals (Sweden)

    Yin Peili

    2017-08-01

    Full Text Available Validity and correctness test verification of the measuring software has been a thorny issue hindering the development of Gear Measuring Instrument (GMI. The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI to independently validate the measuring software. The triangular patch model with accurately controlled precision was taken as the virtual workpiece and a universal collision detection model was established. The whole process simulation of workpiece measurement is implemented by VGMI replacing GMI and the measuring software is tested in the proposed virtual environment. Taking involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on the involute master in a GMI. The experiment results indicate a consistency of tooth profile deviation and calibration results, thus verifying the accuracy of gear measuring system which includes the measurement procedures. It is shown that the VGMI presented can be applied in the validation of measuring software, providing a new ideal platform for testing of complex workpiece-measuring software without calibrated artifacts.

  15. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Science.gov (United States)

    Yin, Peili; Wang, Jianhua; Lu, Chunxia

    2017-08-01

    Validity and correctness test verification of the measuring software has been a thorny issue hindering the development of Gear Measuring Instrument (GMI). The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI) to independently validate the measuring software. The triangular patch model with accurately controlled precision was taken as the virtual workpiece and a universal collision detection model was established. The whole process simulation of workpiece measurement is implemented by VGMI replacing GMI and the measuring software is tested in the proposed virtual environment. Taking involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on the involute master in a GMI. The experiment results indicate a consistency of tooth profile deviation and calibration results, thus verifying the accuracy of gear measuring system which includes the measurement procedures. It is shown that the VGMI presented can be applied in the validation of measuring software, providing a new ideal platform for testing of complex workpiece-measuring software without calibrated artifacts.

  16. Core-ion temperature measurement of the ADITYA tokamak using passive charge exchange neutral particle energy analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Santosh P.; Ajay, Kumar; Mishra, Priyanka; Dhingra, Rajani D.; Govindarajan, J. [Institute for Plasma Research, Bhat, Gandhinagar 382 428, Gujarat (India)

    2013-02-15

    Core-ion temperature measurements have been carried out by the energy analysis of passive charge exchange (CX) neutrals escaping out of the ADITYA tokamak plasma (minor radius a = 25 cm and major radius R = 75 cm) using a 45° parallel plate electrostatic energy analyzer. The neutral particle analyzer (NPA) uses a gas cell configuration for re-ionizing the CX-neutrals and channel electron multipliers (CEMs) as detectors. Energy calibration of the NPA has been carried out using an ion source, and the ΔE/E of the high-energy channel has been found to be ≈10%. A low signal to noise ratio (SNR) due to VUV reflections on the CEMs was identified during the operation of the NPA with ADITYA plasma discharges. This problem was rectified by upgrading the system with additional components and arrangements to suppress VUV radiation and improve its VUV rejection capability. The noise rejection capability of the NPA was experimentally confirmed using a standard UV source and also during plasma discharges, giving an adequate SNR (>30) at the energy channels. The core-ion temperature Ti(0) during the flattop of the plasma current has been measured to be up to 150 eV during ohmically heated plasma discharges, which is nearly 40% of the average core-electron temperature (typically Te(0) ≈ 400 eV). The present paper describes the principle of tokamak ion temperature measurement and the NPA's design, development, and calibration, along with the modifications carried out to minimize the interference of plasma radiation in the CX-spectrum. The performance of the NPA during plasma discharges and experimental results on the measurement of ion temperature are also reported.
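
    For orientation, the temperature extraction from a CX spectrum is often reduced to a slope fit: assuming the simplified kernel flux(E) ∝ E·exp(-E/Ti) for a Maxwellian plasma, Ti follows from the slope of ln(flux/E) versus E. The real analysis also folds in charge-exchange cross sections and the analyzer response; the channel energies and noise below are invented.

        import numpy as np

        E = np.array([100., 200., 300., 400., 500., 600.])   # channel energies, eV
        Ti_true = 150.0
        flux = E * np.exp(-E / Ti_true) * (1 + np.random.normal(0, 0.02, E.size))

        slope, intercept = np.polyfit(E, np.log(flux / E), 1)
        print(f"fitted Ti = {-1.0 / slope:.1f} eV")          # close to 150 eV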

  17. Effect of cutting parameters on workpiece and tool properties during drilling of Ti-6Al-4V

    Energy Technology Data Exchange (ETDEWEB)

    Celik, Yahya Hisman; Yildiz, Hakan [Batman Univ. (Turkey). Dept. of Mechanical Engineering; Oezek, Cebeli [Firat Univ., Elazig (Turkey)

    2016-08-01

    The main aim of machining is to provide the dimensional preciseness together with the surface and geometric quality of the workpiece to be manufactured within the desired limits. Today, it is quite hard to drill the widely utilized Ti-6Al-4V alloys owing to their superior features. Therefore, in this study, temperature, chip formation, thrust forces, surface roughness, burr heights, hole diameter deviations and tool wear in the drilling of Ti-6Al-4V were investigated under dry cutting conditions with different cutting speeds and feed rates, using tungsten carbide (WC) and high speed steel (HSS) drills. Moreover, mathematical models of the thrust force, surface roughness, burr height and tool wear were formed using Matlab. It was found that the feed rate, cutting speed and type of drill have a major effect on the thrust forces, surface roughness, burr heights, hole diameter deviations and tool wear. Optimum results in the Ti-6Al-4V alloy drilling process were obtained using the WC drill.

  18. Study and Testing of Processing Trajectory Measurement Method of Flexible Workpiece

    Directory of Open Access Journals (Sweden)

    Yaohua Deng

    2013-01-01

    Full Text Available Flexible workpieces include materials such as elastic spline, textile fabric, and polyurethane sponge. Because the processing trajectory is composed of small-arc or small-line-segment primitives, and because the flexible workpiece deforms during processing, the captured image of the processing trajectory is not clear: the edge of the processing image has locally uneven gray levels, and the pixel boundaries between the processing trajectory edge and the background are not obvious. This paper takes the corner search of the processing trajectory as its starting point. The slope-angle curve of the starting and terminal points of each primitive is constructed, and a search algorithm is put forward that uses Daubechies (4) as the wavelet operator to perform a multi-scale wavelet transform of the slope-angle curve; whether a point of the curve is a corner point is determined by judging whether the wavelet transform exhibits an extremum there, based on the wavelet-edge modulus maxima extraction principle. Finally, a decomposition/reconstruction design method for FIR filters based on the wavelet transform of the processing image is proposed. An eight-tap transposed FIR filter is used to implement the Daubechies (4) decomposition and a reconfigurable computing IP core. The total time consumed by the IP-core wavelet decomposition increases by only 5.561% in comparison with a PC. The trajectory angle relative error is 2.2%, and the average measurement time is 212.38 ms.
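
    The corner-search idea (multi-scale Daubechies-4 transform of the slope-angle curve, then modulus maxima aggregated across scales) can be sketched with PyWavelets as below; the synthetic slope-angle signal, the scale count and the relative threshold are illustrative choices, not the paper's.

        import numpy as np
        import pywt

        n = 256
        theta = np.zeros(n)                       # slope-angle curve of the trajectory
        theta[100:180] = 0.9                      # two corners, at i = 100 and i = 180
        theta += np.random.normal(0, 0.01, n)

        coeffs = pywt.swt(theta, "db4", level=3)  # stationary transform keeps the length
        details = np.array([np.abs(cD) for _, cD in coeffs])
        energy = details.sum(axis=0)              # modulus aggregated over scales

        thr = 0.3 * energy.max()
        is_max = (energy[1:-1] > energy[:-2]) & (energy[1:-1] >= energy[2:]) & (energy[1:-1] > thr)
        print(np.flatnonzero(is_max) + 1)         # indices clustered near 100 and 180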

  19. Method and device for determining the position of a cutting tool relative to the rotational axis of a spindle-mounted workpiece

    Science.gov (United States)

    Williams, R.R.

    1980-09-03

    The present invention is directed to a method and device for determining the location of a cutting tool with respect to the rotational axis of a spindle-mounted workpiece. A vacuum cup supporting a machinable sacrificial pin is secured to the workpiece at a location where the pin will project along and encompass the rotational axis of the workpiece. The pin is then machined into a cylinder. The position of the surface of the cutting tool contacting the machine cylinder is spaced from the rotational axis of the workpiece a distance equal to the radius of the cylinder.

  20. Parametric study on numerical simulation of the electromagnetic forming of DP780 steel workpiece with aluminium driver sheet

    Science.gov (United States)

    Park, Hyeonil; Lee, Jinwoo; Kim, Se-Jong; Lee, Youngseon; Kim, Daeyong

    2016-08-01

    The purpose of this study is to investigate the influence of numerical parameters on electromagnetic forming (EMF) simulation. Three-dimensional coupled electromagnetic-mechanical simulations were conducted to predict the deformation behavior of an advanced high strength steel (AHSS) sheet supported by an aluminum driver sheet in EMF. A dual phase (DP) 780 steel workpiece was formed into a hemi-elliptical protrusion shape with an aluminum alloy AA1050 driver sheet using a flat spiral coil actuator and an open cavity die. The deformed shape of the DP780 workpiece and the computation time were analysed with respect to the element size, the number of cycles N, and the time step of the electromagnetic (EM) solver.

  1. Influence of Workpiece Surface Topography on the Mechanisms of Liquid Lubrication in Strip Drawing

    DEFF Research Database (Denmark)

    Shimizu, I; Andreasen, Jan Lasson; Bech, Jakob Ilsted

    2001-01-01

    The workpiece surface topography is an important factor controlling the mechanisms of lubrication in metal forming processes. In the present work, the microscopic lubrication mechanisms induced by lubricant trapped in pockets of the surface in strip drawing are studied. The experiments are performed with macroscopic model pockets in the surface, studying the influence of the shape of the pockets on the lubrication mechanisms. A large radius of curvature on the rear edge and a small angle to the edge of the lubricant pocket induce a large area of backward escape of lubricant caused by MicroPlasto HydroDynamic Lubrication (MPHDL). On the other hand, when the radius on the edge is small, MPHDL is impeded and MicroPlasto HydroStatic Lubrication (MPHSL) appears instead, implying forward escape of the lubricant. The occurrence of these mechanisms is quantitatively explained by a mathematical model...

  2. T-Spline Based Unifying Registration Procedure for Free-Form Surface Workpieces in Intelligent CMM

    Directory of Open Access Journals (Sweden)

    Zhenhua Han

    2017-10-01

    Full Text Available With the development of the modern manufacturing industry, the free-form surface is widely used in various fields, and the automatic detection of free-form surfaces is an important function of future intelligent three-coordinate measuring machines (CMMs). To improve the intelligence of CMMs, a new visual system is designed based on the characteristics of CMMs. A unified model of the free-form surface is proposed based on T-splines, and a discretization method for the T-spline surface formula model is proposed. Under this discretization, the position and orientation of the workpiece are recognized by point cloud registration. A high-accuracy evaluation method is proposed between the measured point cloud and the T-spline surface formula. The experimental results demonstrate that the proposed method has the potential to realize the automatic detection of different free-form surfaces and improve the intelligence of CMMs.
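
    The abstract leaves the registration algorithm unspecified; as a generic stand-in, the sketch below shows the SVD-based (Kabsch) rigid alignment that underlies one ICP iteration with known correspondences, which is how a measured cloud is typically mapped onto points sampled from the T-spline model.

        import numpy as np

        def rigid_align(P, Q):
            """Find R, t minimizing ||R @ P + t - Q|| for corresponding 3xN point sets."""
            Pc, Qc = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
            H = (P - Pc) @ (Q - Qc).T
            U, _, Vt = np.linalg.svd(H)
            D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard against reflections
            R = Vt.T @ D @ U.T
            t = Qc - R @ Pc
            return R, t

        model = np.random.rand(3, 50)             # points sampled from the CAD surface
        R_true = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
        measured = R_true @ model + np.array([[0.1], [0.2], [0.0]])

        R, t = rigid_align(model, measured)
        print(np.allclose(R, R_true, atol=1e-6), t.ravel())    # True [0.1 0.2 0. ]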

  3. Characteristics of Speed Line Cutter and Fringe Analysis of Workpiece Surface

    Directory of Open Access Journals (Sweden)

    Shuai Wang

    2014-02-01

    Full Text Available Being easy to operate, the speed line cutter offers a high machining cost-performance ratio and is therefore very popular among users. The precision of the guide rails, screws and nuts used in most of these machines is not high, and the machine control cannot compensate for screw pitch error, transmission clearance or machining error due to electrode wear; control signals may also be lost during the control process. The development of the speed line cutter therefore focuses on the quality and machining stability of the CNC speed line cutter. This article analyzes the impact of the machine's inherent characteristics on the machined workpiece surface and concludes that the irregular fringe should be analyzed in order to raise the machining precision.

  4. MATHEMATICAL DESCRIPTION OF TWO-DIMENSIONAL PERIODIC CIRCULAR MOVEMENT OF WORK-PIECE BEING TURNED IN MODERNIZED SAWING UNIT

    Directory of Open Access Journals (Sweden)

    M. G. Kiselev

    2013-01-01

    Full Text Available The paper presents a theoretical investigation into the nature of the two-dimensional periodic circular movement of a work-piece attached to the end of a modernized sawing unit boom. The boom oscillation axis is mounted on an elastic suspension that performs forced oscillations along a horizontal axis. The paper provides a calculation model of the oscillatory system of the boom swing block for the mathematical description of the trajectory of a point belonging to the work-piece. The model permits a system of two coupled equations describing this movement to be obtained. Numerical solution of these equations establishes that the work-piece performs a two-dimensional periodic circular movement along an elliptical trajectory. The paper gives data that reveal the influence of the elastic, inertial and dissipative characteristics of the oscillatory system on the parameters of the elliptical trajectory of the work-piece movement. It demonstrates that regulating the frequency of the forced oscillations transferred to the boom swing block of the sawing unit is the simplest to implement and the most efficient method for controlling the required parameters.
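
    The elliptical trajectory can be reproduced qualitatively with any lightly damped two-degree-of-freedom linear system under single-frequency forcing; the stiffness, damping and coupling values below are generic illustrations, not the sawing-unit constants of the paper.

        import numpy as np
        from scipy.integrate import solve_ivp

        w = 2 * np.pi * 5.0                        # forcing frequency, rad/s
        kx, ky, kc, c = (2*np.pi*7)**2, (2*np.pi*9)**2, 800.0, 4.0

        def rhs(t, s):
            x, vx, y, vy = s
            ax = -kx * x + kc * y - c * vx + np.cos(w * t)   # forced horizontal axis
            ay = -ky * y + kc * x - c * vy                   # coupled vertical axis
            return [vx, ax, vy, ay]

        sol = solve_ivp(rhs, (0, 10), [0, 0, 0, 0], max_step=1e-3)
        tail = sol.t > 8.0                         # steady state, transients decayed
        x, y = sol.y[0][tail], sol.y[2][tail]
        print(np.ptp(x), np.ptp(y))                # plotting (x, y) traces an ellipse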

  5. On-Line Hydrogen-Isotope Measurements of Organic Samples Using Elemental Chromium : An Extension for High Temperature Elemental-Analyzer Techniques

    NARCIS (Netherlands)

    Gehre, Matthias; Renpenning, Julian; Gilevska, Tetyana; Qi, Haiping; Coplen, Tyler B.; Meijer, Harro A. J.; Brand, Willi A.; Schimmelmann, Arndt

    2015-01-01

    The high temperature conversion (HTC) technique using an elemental analyzer with a glassy carbon tube and filling (temperature conversion/elemental analysis, TC/EA) is a widely used method for hydrogen isotopic analysis of water and many solid and liquid organic samples with analysis by

  6. Seasonal variations in groundwater upwelling zones in a Danish lowland stream analyzed using Distributed Temperature Sensing (DTS)

    DEFF Research Database (Denmark)

    Matheswaran, Karthikeyan; Blemmer, Morten; Rosbjerg, Dan

    2014-01-01

    The distribution of groundwater inflows in a stream reach plays a major role in controlling the stream temperature, a vital component shaping the riverine ecosystem. In this study, the Distributed Temperature Sensing (DTS) system was installed in a small Danish lowland stream, Elverdamsåen, to as...

  7. Analysis of Measured Workpiece's Form Errors Influence on the Accuracy of Probe Heads Used on Five-Axis Measuring Systems

    Directory of Open Access Journals (Sweden)

    Wiktor Harmatys

    2017-12-01

    Full Text Available Five-axis measuring systems are one of the most modern inventions in coordinate measuring technique. They are capable of performing measurements using only the rotary pairs present in their kinematic structure. This possibility is very useful because it may significantly reduce total measurement time and costs. However, it was noted that high values of the measured workpiece's form errors may cause a significant reduction of five-axis measuring system accuracy. The relation between these two parameters was investigated in this paper, and possible reasons for the decrease in measurement accuracy were discussed, using the example of measurements of workpieces with form errors ranging from 0.5 to 1.7 millimetres.

  8. HIGH-TEMPERATURE EXAFS EXPERIMENTS ON LIQUID KPB ALLOYS ANALYZED WITH THE REVERSE MONTE-CARLO METHOD

    NARCIS (Netherlands)

    BRAS, W; XU, R; WICKS, JD; VANDERHORST, F; OVERSLUIZEN, M; MCGREEVY, RL; VANDERLUGT, W

    1994-01-01

    A new sample chamber has been designed which allows high temperature Extended X-ray Absorption Fine Structure (EXAFS) experiments on metallic melts which offer a number of special experimental problems: they are highly corrosive, have high vapour pressures and strongly absorb X-rays. The EXAFS

  9. The influence of temperature calibration on the OC–EC results from a dual-optics thermal carbon analyzer

    Science.gov (United States)

    The Sunset Laboratory Dual-Optical Carbonaceous Analyzer that simultaneously measures transmission and reflectance signals is widely used in thermal-optical analysis of particulate matter samples. Most often this instrument is used to measure total carbon (TC), organic carbon (O...

  10. Analyzing land surface temperature variations during Fogo Island (Cape Verde) 2014-2015 eruption with Landsat 8 images

    Science.gov (United States)

    Vieira, D.; Teodoro, A.; Gomes, A.

    2016-10-01

    Land Surface Temperature (LST) is an important parameter related to land surface processes that changes continuously through time. Assessing its dynamics during a volcanic eruption has both environmental and socio-economic interest. Lava flows and other volcanic materials produced and deposited throughout an eruption transform the landscape, contributing to its heterogeneity and altering LST measurements. This paper aims to assess variations of satellite-derived LST and to detect patterns during the latest Fogo Island (Cape Verde) eruption, extending from November 2014 through February 2015. LST data were obtained from four processed Landsat 8 images, focused on the caldera where the Pico do Fogo volcano sits. The QGIS plugin Semi-Automatic Classification was used to apply atmospheric corrections and radiometric calibrations. The algorithm used to retrieve LST values is a single-channel method in which emissivity values are known. The absence of in situ measurements is compensated by the use of MODIS-derived LST data, used for comparison with the Landsat-retrieved measurements. The LST data analysis shows, as expected, that the highest LST values are located inside the caldera. High temperature values were also found on the south-facing flank of the caldera. Although the spatial patterns observed in the retrieved data remained roughly the same during the time period considered, temperature values changed throughout the area and over time, as was also expected. LST values followed the eruption dynamics, experiencing a growth followed by a decline. Moreover, it seems possible to recognize areas affected by lava flows of previous eruptions, owing to well-defined LST spatial patterns.
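
    For reference, the single-channel retrieval with known emissivity reduces to a short chain of band-10 conversions: scale DN to radiance, convert to brightness temperature, then apply the emissivity correction. The scale/offset and thermal constants below are the published Landsat 8 band-10 values; the pixel DNs and the emissivity (e.g. from NDVI thresholding) are illustrative.

        import numpy as np

        ML, AL = 3.342e-4, 0.1                 # band-10 radiance multiplier / offset
        K1, K2 = 774.8853, 1321.0789           # band-10 thermal conversion constants
        LAM, RHO = 10.895e-6, 1.438e-2         # effective wavelength (m), h*c/k_B (m K)

        dn = np.array([24000.0, 26500.0])      # example quantized pixel values
        radiance = ML * dn + AL
        bt = K2 / np.log(K1 / radiance + 1.0)  # at-sensor brightness temperature, K

        emissivity = 0.96
        lst = bt / (1.0 + (LAM * bt / RHO) * np.log(emissivity))
        print(lst - 273.15)                    # land surface temperature, °C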

  11. Analyzing the Effects of Climate Change on Sea Surface Temperature in Monitoring Coral Reef Health in the Florida Keys Using Sea Surface Temperature Data

    Science.gov (United States)

    Jones, Jason; Burbank, Renane; Billiot, Amanda; Schultz, Logan

    2011-01-01

    This presentation discusses use of 4 kilometer satellite-based sea surface temperature (SST) data to monitor and assess coral reef areas of the Florida Keys. There are growing concerns about the impacts of climate change on coral reef systems throughout the world. Satellite remote sensing technology is being used for monitoring coral reef areas with the goal of understanding the climatic and oceanic changes that can lead to coral bleaching events. Elevated SST is a well-documented cause of coral bleaching events. Some coral monitoring studies have used 50 km data from the Advanced Very High Resolution Radiometer (AVHRR) to study the relationships of sea surface temperature anomalies to bleaching events. In partnership with NOAA's Office of National Marine Sanctuaries and the University of South Florida's Institute for Marine Remote Sensing, this project utilized higher resolution SST data from the Terra's Moderate Resolution Imaging Spectroradiometer (MODIS) and AVHRR. SST data for 2000-2010 was employed to compute sea surface temperature anomalies within the study area. The 4 km SST anomaly products enabled visualization of SST levels for known coral bleaching events from 2000-2010.

  12. Calibration of a miniaturized retarding field analyzer for low-temperature plasmas: geometrical transparency and collisional effects

    Energy Technology Data Exchange (ETDEWEB)

    Baloniak, Tim; Reuter, Ruediger; Floetgen, Christoph; Von Keudell, Achim [Research Group Reactive Plasmas, Ruhr-Universitaet Bochum, 44780 Bochum (Germany)

    2010-02-10

    Retarding field analyzers (RFAs) are important diagnostics to measure fluxes and energies of ions impinging onto the wall of a plasma reactor. Any quantitative use of the data requires a proper calibration, which is here performed for a miniaturized RFA. The calibration accounts for the transparencies of the RFA grids as well as for collisions inside the RFA. An analytical model is derived which covers both geometrical and collisional effects. The model is calibrated and experimentally verified using a Langmuir probe. We find that the transparency of an RFA is a random variable which depends on the individual alignment of the RFA grids. Collisions inside the RFA limit the ion current transfer through the RFA at higher pressures. A simple method is presented which allows one to remove these artefacts from the RFA data and to obtain quantitative ion velocity distributions.
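
    The two calibration effects discussed above can be hedged into a toy correction: an overall geometric grid transparency and a pressure-dependent collisional attenuation of the ion current inside the analyzer in Beer-Lambert form. The transparency, path length, cross section and gas temperature below are illustrative values only, not the paper's calibrated numbers.

        import numpy as np

        K_B = 1.380649e-23                       # Boltzmann constant, J/K

        def rfa_correction(i_measured, p_pa, t_geom=0.15, L=2e-3,
                           sigma=5e-19, T_gas=300.0):
            """Recover the ion current entering the RFA from the collected current."""
            n = p_pa / (K_B * T_gas)             # neutral gas density, m^-3
            collisional = np.exp(-n * sigma * L) # survival probability inside the RFA
            return i_measured / (t_geom * collisional)

        print(rfa_correction(i_measured=1e-9, p_pa=5.0))   # corrected current, A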

  13. Quantification of the Effects of Various Levels of Several Critical Shot Peen Process Variables on Workpiece Surface Integrity and the Resultant Effect on Workpiece Fatigue Life Behavior. Phase 2

    Science.gov (United States)

    1988-10-31

    [Fragmentary OCR text; recoverable content follows.] At 700X, peened surface features were beaten down to conform to the underlying surface and were visually undetectable in surface inspection (Figure 111: 7075-T73 at 0.0100A). Lathe-turned only and lathe-turned-and-polished 7075-T6 specimens exhibited fatigue life as a function of Almen intensity (Conference on Impact Treatment Processes, 22-26 September 1986, p. 101). Visual inspection for 100-percent workpiece coverage was conducted using a 70X magnifier.

  14. Temperature analysis in CFRP drilling

    Science.gov (United States)

    Matsumura, Takashi; Tamura, Shoichi

    2016-10-01

    The cutting temperature in the drilling of carbon fiber reinforced plastics (CFRPs) is simulated numerically by finite difference analysis. The cutting force is predicted to estimate the heat generation on the shear plane and the rake face by an energy approach. In the force model, three-dimensional chip flow is interpreted as a piling up of orthogonal cuttings in the planes containing the cutting velocities and the chip flow velocities, in which the chip flow direction is determined so as to minimize the cutting energy. The cutting force is then predicted from the determined chip flow model. The cutting temperature distribution is simulated with the thermal conduction, thermal convection and heat generation in the discrete elements of the tool, the chip and the workpiece. The heat generation on the shear plane and the rake face is given by stress distributions based on the predicted cutting force. The cutting temperature is analyzed on the assumption that all mechanical work contributes to heat generation. The temperature of CFRP is compared with that of carbon steel in the numerical simulation. The maximum temperature of CFRP is much lower than that of carbon steel, and the position of the maximum temperature is near the tool tip due to the low thermal conductivity of CFRP.
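
    The conduction part of such an analysis is easy to sketch with an explicit finite-difference scheme. The toy model below applies the same surface heat flux to CFRP-like and steel-like property sets: it reproduces the localization of the temperature maximum near the heated face (the tool tip) that follows from CFRP's low conductivity, but it deliberately omits the lower heat generation in CFRP cutting that drives the lower absolute temperatures reported above. All property and flux values are rough illustrative numbers.

        import numpy as np

        def heated_bar(k, rho, cp, q_in=1e6, L=5e-3, n=50, t_end=0.5, T0=25.0):
            """Explicit FTCS conduction: constant flux at x = 0, ambient held at x = L."""
            alpha = k / (rho * cp)
            dx = L / n
            dt = 0.4 * dx * dx / alpha         # safely below the FTCS stability limit
            T = np.full(n + 1, T0)
            for _ in range(int(t_end / dt)):
                Ti = T.copy()
                T[1:-1] = Ti[1:-1] + alpha * dt / dx**2 * (Ti[2:] - 2*Ti[1:-1] + Ti[:-2])
                T[0] = T[1] + q_in * dx / k    # prescribed heat flux at the heated face
                T[-1] = T0
            return T

        for name, props in [("CFRP", (0.8, 1600.0, 900.0)), ("steel", (45.0, 7850.0, 490.0))]:
            T = heated_bar(*props)
            rise = T - 25.0
            depth_mm = np.argmax(rise < 0.05 * rise[0]) * 0.1   # dx = 0.1 mm
            print(f"{name}: surface {T[0]:.0f} °C, heat confined to ~{depth_mm:.1f} mm")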

  15. Analyzing the temperature control of steam purging of 660mw ultra-supercritical once-through boiler with pressure-reducing method

    Science.gov (United States)

    Wu, Ying; Zhong, Yong-lu; Liu, Fa-sheng; Chen, Wen; Gui, Liang-ming; Xia, Yong-jun; Wan, Zhong-hai; Yan, Tao

    2017-11-01

    This paper introduces the steam purging process of the ultra-supercritical once-through boilers of the Jiangxi Xinchang 2×660 MW power plant using the pressure-reducing method. The key points of steam temperature control are analyzed and summarized. The successful experience can serve as a reference for preventing steam overtemperature in similar ultra-supercritical once-through boilers purged with the pressure-reducing method.

  16. Analysis of the influence of the workpiece self-weight in precision optics manufacturing using FEM simulation

    Science.gov (United States)

    Sitzberger, Sebastian; Trum, Christian; Rascher, Rolf

    2017-06-01

    In this publication, the effects, extent and importance of the deformation of a workpiece, in particular of lenses, caused by its own weight before and during the manufacturing process in precision optics are examined. Since the deformations lie in the single-digit to two-digit nanometer range, the investigation is carried out in a first step with FEM calculations. This has the advantage that current situations can be quickly assessed and improved without having to carry out time-consuming and cost-intensive experiments from the outset. A major part of the work is therefore the investigation of current workpiece holder and clamping situations in optics manufacturing. The present work concentrates exclusively on the theoretical calculation of the deformations occurring in various clamping situations modeled on real practice. Ansys Workbench is used as the tool for the calculations. Test series under laboratory conditions for the validation of the theoretical results will be part of further work.

  17. Effect of Carbon in the Dielectric Fluid and Workpieces on the Characteristics of Recast Layers Machined by Electrical Discharge Machining

    Science.gov (United States)

    Muttamara, Apiwat; Kanchanomai, Chaosuan

    2016-06-01

    Electrical discharge machining (EDM) is a popular non-traditional machining technique that is usually performed in kerosene. Carbon from the kerosene is mixed into the recast layer during EDM, increasing its hardness. EDM can be performed in deionized water, which causes decarburization. We studied the effects of carbon in the dielectric fluid and workpiece on the characteristics of recast layers. Experiments were conducted using gray cast iron and mild steel workpieces in deionized water or kerosene under identical operating conditions. Scanning electron microscopy revealed that the recast layer formed on gray iron was rougher than that produced on mild steel. Moreover, the dispersion of graphite flakes in the gray iron seemed to cause subsurface cracks, even when EDM was performed in deionized water. Dendritic structures and iron carbides were found in the recast layer of gray iron treated in deionized water. Kerosene caused more microcracks to form and increased surface roughness compared with deionized water. The microcrack length per unit area of mild steel treated in deionized water was greater than that treated in kerosene, but the cracks formed in kerosene were wider. The effect of the diffusion of carbon during cooling on the characteristics of the recast layer was discussed.

  18. Numerical Simulation of a Grinding Process Model for the Spatial Work-pieces: Development of Modeling Techniques

    Directory of Open Access Journals (Sweden)

    S. A. Voronov

    2015-01-01

    Full Text Available The article presents a literature review on the simulation of grinding processes, taking into consideration statistical, energy-based, and imitation approaches to the simulation of grinding forces. The main stages of interaction between abrasive grains and the machined surface are shown. The article describes the main approaches to modeling the geometry of the new surfaces formed during grinding, and reviews approaches to the numerical modeling of chip formation and the pile-up effect. Advantages and disadvantages of modeling grain-to-surface interaction by means of the finite element method and the molecular dynamics method are considered. The article points out that it is necessary to take into consideration the system dynamics and its effect on the finished surface. From the literature review, the structure of a complex imitation model of grinding process dynamics for flexible work-pieces with spatial surface geometry is proposed. The proposed model of spatial grinding includes a model of the work-piece dynamics, a model of the grinding wheel dynamics, and a phenomenological model of the grinding forces based on a 3D geometry modeling algorithm. The model gives the following results for the spatial grinding process: vibration of the machined part and the grinding wheel, machined surface geometry, static deflection of the surface, and grinding forces under various cutting conditions.

  19. Analyzing the effects of urban expansion on land surface temperature patterns by landscape metrics: a case study of Isfahan city, Iran.

    Science.gov (United States)

    Madanian, Maliheh; Soffianian, Ali Reza; Koupai, Saeid Soltani; Pourmanafi, Saeid; Momeni, Mehdi

    2018-03-03

    Urban expansion can cause extensive changes in land use and land cover (LULC), leading to changes in temperature conditions. Land surface temperature (LST) is one of the key parameters that should be considered in the study of urban temperature conditions. The purpose of this study was, therefore, to investigate the effects of changes in LULC due to the expansion of the city of Isfahan on LST using landscape metrics. To this aim, two Landsat 5 and Landsat 8 images, which had been acquired, respectively, on August 2, 1985, and July 4, 2015, were used. The support vector machine method was then used to classify the images. The results showed that Isfahan city had been encountered with an increase of impervious surfaces; in fact, this class covered 15% of the total area in 1985, while this value had been increased to 30% in 2015. Then LST zoning maps were created, indicating that the bare land and impervious surfaces categories were dominant in high temperature zones, while in the zones where water was present or NDVI was high, LST was low. Then, the landscape metrics in each of the LST zones were analyzed in relation to the LULC changes, showing that LULC changes due to urban expansion changed such landscape properties as the percentage of landscape, patch density, large patch index, and aggregation index. This information could be beneficial for urban planners to monitor and manage changes in the LULC patterns.

  20. Molecular dynamics simulation of subnanometric tool-workpiece contact on a force sensor-integrated fast tool servo for ultra-precision microcutting

    Energy Technology Data Exchange (ETDEWEB)

    Cai, Yindi [Department of Nanomechanics, Tohoku University, Sendai 980-8579 (Japan); Chen, Yuan-Liu, E-mail: yuanliuchen@nano.mech.tohoku.ac.jp [Department of Nanomechanics, Tohoku University, Sendai 980-8579 (Japan); Shimizu, Yuki; Ito, So; Gao, Wei [Department of Nanomechanics, Tohoku University, Sendai 980-8579 (Japan); Zhang, Liangchi [School of Mechanical and Manufacturing Engineering, The University of New South Wales, NSW 2052 (Australia)

    2016-04-30

    Highlights: • Subnanometric contact between a diamond tool and a copper workpiece surface is investigated by MD simulation. • A multi-relaxation time technique is proposed to eliminate the influence of the atom vibrations. • The accuracy of the elastic-plastic transition contact depth estimation is improved by observing the residual defects. • The simulation results are beneficial for optimization of the next-generation microcutting instruments. - Abstract: This paper investigates the contact characteristics between a copper workpiece and a diamond tool in a force sensor-integrated fast tool servo (FS-FTS) for single point diamond microcutting and in-process measurement of ultra-precision surface forms of the workpiece. Molecular dynamics (MD) simulations are carried out to identify the subnanometric elastic-plastic transition contact depth, at which the plastic deformation in the workpiece is initiated. This critical depth can be used to optimize the FS-FTS as well as the cutting/measurement process. It is clarified that the vibrations of the copper atoms in the MD model have a great influence on the subnanometric MD simulation results. A multi-relaxation time method is then proposed to reduce the influence of the atom vibrations based on the fact that the dominant vibration component has a certain period determined by the size of the MD model. It is also identified that for a subnanometric contact depth, the position of the tool tip for the contact force to be zero during the retracting operation of the tool does not correspond to the final depth of the permanent contact impression on the workpiece surface. The accuracy for identification of the transition contact depth is then improved by observing the residual defects on the workpiece surface after the tool retracting.

  1. OPTIMIZING THE PLACEMENT OF A WORK-PIECE AT A MULTI-POSITION ROTARY TABLE OF TRANSFER MACHINE WITH VERTICAL MULTI-SPINDLE HEAD

    Directory of Open Access Journals (Sweden)

    N. N. Guschinski

    2015-01-01

    Full Text Available The problem of minimizing the weight of a transfer machine with a multi-position rotary table, by choosing the placement of a work-piece on the table for processing a homogeneous batch of work-pieces, is considered. To solve this problem, a mathematical model and a heuristic particle swarm optimization algorithm are proposed. The results of numerical experiments for two real problems of this type are given. The experiments revealed that the particle swarm optimization algorithm is more effective for solving the problem than random search and LP-search methods.
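
    A minimal particle swarm optimizer of the kind referenced is sketched below on a stand-in two-dimensional objective; the paper's actual objective (machine weight as a function of work-piece placement) and its constraints are not reproduced, and the hyperparameters are common textbook defaults.

        import numpy as np

        rng = np.random.default_rng(0)

        def objective(xy):                      # placeholder for the weight model
            x, y = xy[..., 0], xy[..., 1]
            return (x - 0.3) ** 2 + (y + 0.7) ** 2 + 0.1 * np.sin(5 * x)

        n, iters, w, c1, c2 = 30, 200, 0.7, 1.5, 1.5
        pos = rng.uniform(-2, 2, (n, 2))
        vel = np.zeros((n, 2))
        pbest, pbest_val = pos.copy(), objective(pos)
        gbest = pbest[pbest_val.argmin()].copy()

        for _ in range(iters):
            r1, r2 = rng.random((n, 1)), rng.random((n, 1))
            vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
            pos += vel
            val = objective(pos)
            improved = val < pbest_val
            pbest[improved], pbest_val[improved] = pos[improved], val[improved]
            gbest = pbest[pbest_val.argmin()].copy()

        print(gbest, objective(gbest))          # near the minimum of the stand-in model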

  2. MM98.04 Measurement of temperature and determination of heat transfer coefficient in backward can extrusion

    DEFF Research Database (Denmark)

    Henningsen, Poul; Hattel, Jesper Henri; Wanheim, Tarras

    1998-01-01

    in the die insert. The die insert is divided into two halves where the thermocouples are welded to the end of milled grooves in the lower part. The temperature of the workpiece is measured by welding a thermocouple directly onto the free surface. All of the temperature measurements in the tool... and the workpiece are compared with a number of FEM simulations computed with different heat transfer coefficients. The current heat transfer coefficient is determined from the simulations.
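
    The inverse determination described (compare measured temperatures with simulations run for a range of heat transfer coefficients and pick the best match) can be illustrated with a lumped-capacitance stand-in for the FEM model; the material constants, trial grid and noise are invented for illustration.

        import numpy as np

        def simulate(h, t, T0=450.0, T_die=200.0, rho_cp_v_over_a=5.5e3):
            """Lumped workpiece cooling against the die for contact conductance h."""
            tau = rho_cp_v_over_a / h          # thermal time constant, s
            return T_die + (T0 - T_die) * np.exp(-t / tau)

        t = np.linspace(0.0, 2.0, 40)
        measured = simulate(3000.0, t) + np.random.normal(0, 1.0, t.size)  # stand-in data

        trials = np.arange(500.0, 8001.0, 250.0)      # candidate h, W/(m^2 K)
        errors = [np.sum((simulate(h, t) - measured) ** 2) for h in trials]
        print("best-fit h =", trials[int(np.argmin(errors))], "W/(m^2 K)")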

  3. Specific Features of Chip Making and Work-piece Surface Layer Formation in Machining Thermal Coatings

    OpenAIRE

    V. M. Yaroslavtsev

    2016-01-01

    A wide range of unique engineering structural and performance properties inherent in metallic composites characterizes wear- and erosion-resistant high-temperature coatings made by thermal spraying methods. This allows their use both in manufacturing processes, to enhance the wear strength of products that have to operate under cyclic loading, high contact pressures, corrosion and high temperatures, and in product renewal. Thermal coatings contribute to the qualitative improvement of the t...

  4. Comparative analysis between the SPIF and DPIF variants for die-less forming process for an automotive workpiece

    Directory of Open Access Journals (Sweden)

    Adrian José Benitez Lozano

    2015-07-01

    Full Text Available Over time, the die-less incremental forming process has been developed in many variants to meet the needs of flexible production with no investment in tooling and low production costs. Two of its configurations are the SPIF (single point incremental forming) and DPIF (double point incremental forming) techniques. The aim of this study is to compare both techniques in order to expose their advantages and disadvantages in the production of industrial parts, as well as to present die-less forming as an alternative manufacturing process. Experiments with the exhaust pipe cover of a vehicle are performed, the main process parameters are described, and formed workpieces without evidence of defects are achieved. Significant differences between the two techniques in terms of production times and accuracy with respect to the original model are also detected. Finally, it is suggested when it is more convenient to use each of them.

  5. Viscosity Measurements of Dilute Poly(2-ethyl-2-oxazoline) Aqueous Solutions Near Theta Temperature Analyzed within the Joint Rouse-Zimm Model

    Directory of Open Access Journals (Sweden)

    Jana Tóthová

    2015-01-01

    Full Text Available The steady-state shear viscosity of low-concentration Poly(2-ethyl-2-oxazoline) (PEOX) aqueous solutions is measured near the presumed theta temperature using the falling-ball viscometry technique. The experimental data are analyzed within the model that joins the Rouse and Zimm bead-spring theories of polymer dynamics at the theta condition, which means that the polymer coils are considered to be partially permeable to the solvent. The polymer characteristics thus depend on the draining parameter h, which is related to the strength of the hydrodynamic interaction between the polymer segments. The Huggins coefficient was found to be 0.418 at a temperature of 20°C, as predicted by the theory. This value corresponds to h = 2.92, contrary to the usual assumption of infinite h. This result indicates that the theta temperature for the PEOX water solutions is 20°C rather than the 25°C of previous studies. The experimental intrinsic viscosity is well described starting from the Arrhenius equation for the shear viscosity.
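
    How a Huggins coefficient like the 0.418 above is read off dilute-solution data can be shown in a few lines: fit the reduced viscosity ηsp/c = [η] + kH·[η]²·c and take kH from the slope. The concentrations and viscosities below are synthetic illustrative values.

        import numpy as np

        c = np.array([0.002, 0.004, 0.006, 0.008, 0.010])   # concentration, g/mL
        intrinsic, kH = 35.0, 0.418                         # mL/g, target values
        eta_sp = intrinsic * c + kH * intrinsic**2 * c**2   # ηsp = [η]c + kH[η]²c²
        reduced = eta_sp / c

        slope, intercept = np.polyfit(c, reduced, 1)
        print(f"[η] = {intercept:.1f} mL/g, kH = {slope / intercept**2:.3f}")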

  6. Determining the ion temperature and energy distribution in a lithium-plasma interaction test stand with a retarding field energy analyzer

    Science.gov (United States)

    Christenson, M.; Stemmley, S.; Jung, S.; Mettler, J.; Sang, X.; Martin, D.; Kalathiparambil, K.; Ruzic, D. N.

    2017-08-01

    The ThermoElectric-driven Liquid-metal plasma-facing Structures (TELS) experiment at the University of Illinois is a gas-puff driven, theta-pinch plasma source that is used as a test stand for off-normal plasma events incident on materials in the edge and divertor regions of a tokamak. The ion temperatures and resulting energy distributions are crucial for understanding how well a TELS pulse can simulate an extreme event in a larger, magnetic confinement device. A retarding field energy analyzer (RFEA) has been constructed for use with such a transient plasma due to its inexpensive and robust nature. The innovation surrounding the use of a control analyzer in conjunction with an actively sampling analyzer is presented and the conditions of RFEA operation are discussed, with results presented demonstrating successful performance under extreme conditions. Such extreme conditions are defined by heat fluxes on the order of 0.8 GW m-2 and on time scales of nearly 200 μs. Measurements from the RFEA indicate two primary features for a typical TELS discharge, following closely with the pre-ionizing coaxial gun discharge characteristics. For the case using the pre-ionization pulse (PiP) and the theta pinch, the measured ion signal showed an ion temperature of 23.3 ± 6.6 eV for the first peak and 17.6 ± 1.9 eV for the second peak. For the case using only the PiP, the measured signal showed an ion temperature of 7.9 ± 1.1 eV for the first peak and 6.6 ± 0.8 eV for the second peak. These differences illustrate the effectiveness of the theta pinch for imparting energy on the ions. This information also highlights the importance of TELS as being one of the few linear pulsed plasma sources whereby moderately energetic ions will strike targets without the need for sample biasing.

  7. A probabilistic-based approach to monitoring tool wear state and assessing its effect on workpiece quality in nickel-based alloys

    Science.gov (United States)

    Akhavan Niaki, Farbod

    The objective of this research is first to investigate the applicability and advantages of statistical state estimation methods for predicting tool wear in the machining of nickel-based superalloys, as compared with deterministic methods, and second to study the effects of cutting tool wear on the quality of the part. Nickel-based superalloys are among those classes of materials known as hard-to-machine alloys. These materials exhibit a unique combination of retained strength at high temperature and high resistance to corrosion and creep. These characteristics make them an ideal candidate for harsh environments like the combustion chambers of gas turbines. However, the same characteristics that make nickel-based alloys suitable for aggressive conditions introduce difficulties when machining them. High strength and low thermal conductivity accelerate cutting tool wear and increase the possibility of in-process tool breakage. A blunt tool nominally deteriorates the surface integrity and damages the quality of the machined part by inducing high tensile residual stresses, generating micro-cracks, altering the microstructure or leaving a poor roughness profile behind. As a consequence, the expensive superalloy would have to be scrapped. The current dominant solution in industry is to sacrifice the productivity rate by replacing the tool in the early stages of its life, or to choose conservative cutting conditions in order to lower the wear rate and preserve workpiece quality. Thus, monitoring the state of the cutting tool and estimating its effects on part quality is a critical task for increasing productivity and profitability in machining superalloys. This work aims first to introduce a probabilistic-based framework for estimating tool wear in the milling and turning of superalloys, and second to study the detrimental effects of the functional state of the cutting tool, in terms of wear and wear rate, on part quality. In the milling operation, the
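
    This excerpt does not name the estimator, so the sketch below uses a scalar Kalman filter as a representative statistical state-estimation scheme: a constant-rate wear growth model updated by noisy indirect wear observations (e.g. inferred from force signals). The growth rate, noise levels and data are invented.

        import numpy as np

        rng = np.random.default_rng(1)
        steps, rate = 60, 0.004                    # wear growth per pass, mm (assumed)
        true_wear = np.cumsum(np.full(steps, rate) + rng.normal(0, 5e-4, steps))
        z = true_wear + rng.normal(0, 0.01, steps) # noisy indirect measurements

        x, P = 0.0, 1e-2                           # state estimate and its variance
        Q, R = 1e-6, 1e-4                          # process / measurement noise variances
        est = []
        for zk in z:
            x, P = x + rate, P + Q                 # predict one machining pass ahead
            K = P / (P + R)                        # Kalman gain
            x, P = x + K * (zk - x), (1 - K) * P   # update with the measurement
            est.append(x)

        print(f"final wear: true = {true_wear[-1]:.3f} mm, estimate = {est[-1]:.3f} mm")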

  8. Applying Petroleum the Pressure Buildup Well Test Procedure on Thermal Response Test—A Novel Method for Analyzing Temperature Recovery Period

    Directory of Open Access Journals (Sweden)

    Tomislav Kurevija

    2018-02-01

    Full Text Available The theory of Thermal Response Testing (TRT) is a well-known part of the sizing process for geothermal exchange systems. Multiple parameters influence the accuracy of the effective ground thermal conductivity measurement, such as testing time, variable power, climate interference and groundwater effects. To improve the accuracy of the TRT, we introduced a procedure to additionally analyze the falloff temperature decline after the power test. The method is based on the premise of an analogy between TRT and petroleum well testing, since the origin of both procedures lies in the diffusivity equation, with solutions for heat conduction or pressure analysis during radial flow. By applying pressure build-up test interpretation techniques to borehole heat exchanger testing, greater accuracy can be achieved, since the ground conductivity can also be obtained from this period. The analysis was conducted on a coaxial exchanger with five different power steps, in both direct and reverse flow regimes. Each test consisted of 96 h of classical TRT followed by 96 h of temperature decline, making almost 2000 h of cumulative borehole testing. The results showed that the ground conductivity value derived from the heating period could vary by as much as 25%, depending on test time, seasonal period and power fluctuations, while the thermal conductivity obtained from the falloff period provided more stable values, with only a 10% variation.
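
    The pressure-buildup analogy can be made concrete: by superposition of the infinite line-source solution, the recovery temperature follows T = T0 + (q/4πλ)·ln((tp + Δt)/Δt), the thermal analogue of the Horner plot, so the ground conductivity λ comes from the slope. The sketch below recovers a preset conductivity from synthetic falloff data; the power, duration and noise level are illustrative.

        import numpy as np

        q, lam, tp, T0 = 50.0, 2.2, 96 * 3600.0, 12.0    # W/m, W/(m K), s, °C
        dt = np.linspace(2 * 3600, 96 * 3600, 60)        # recovery times, s
        horner = (tp + dt) / dt
        T = T0 + q / (4 * np.pi * lam) * np.log(horner)
        T += np.random.normal(0, 0.02, T.size)           # measurement noise

        slope, _ = np.polyfit(np.log(horner), T, 1)
        print(f"recovered conductivity: {q / (4 * np.pi * slope):.2f} W/(m K)")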

  9. High-throughput simultaneous determination of plasma water deuterium and 18-oxygen enrichment using a high-temperature conversion elemental analyzer with isotope ratio mass spectrometry.

    Science.gov (United States)

    Richelle, M; Darimont, C; Piguet-Welsch, C; Fay, L B

    2004-01-01

    This paper presents a high-throughput method for the simultaneous determination of the deuterium and oxygen-18 (18O) enrichment of water samples isolated from blood. This analytical method enables rapid and simple determination of these enrichments in microgram quantities of water. Water is converted into hydrogen and carbon monoxide gases by a high-temperature conversion elemental analyzer (TC-EA); the gases are then transferred on-line into the isotope ratio mass spectrometer. Accuracy, determined with the Standard Light Antarctic Precipitation (SLAP) and Greenland Ice Sheet Precipitation (GISP) references, is reliable for deuterium and 18O enrichments. The range of linearity is from 0 up to 0.09 atom percent excess (APE, i.e. -78 up to 5725 delta per mil (dpm)) for deuterium enrichment and from 0 up to 0.17 APE (-11 up to 890 dpm) for 18O enrichment. Memory effects do exist but can be avoided by analyzing the biological samples in quintuplicate. This method allows the determination of 1440 samples per week, i.e. 288 biological samples per week. Copyright 2004 John Wiley & Sons, Ltd.

  10. Welding Current Distribution in the Workpiece and Pool in Arc Welding

    Directory of Open Access Journals (Sweden)

    A. M. Rybachuk

    2015-01-01

    In order to select the optimal configuration of controlling magnetic fields and to design rational magnetic systems, the distribution of the welding current in the molten metal of the weld pool must be known. The objective of this work is therefore to establish calculation methods for determining the current density in the weld pool during arc welding. The distribution of the welding current in the pool depends on the field of electrical resistance, which is determined by the temperature field as it is deformed by the arc moving at the welding speed. Previous works have shown, experimentally and by simulation on conductive paper, that the deformation of the temperature field defines the deformation of the electric field. On this basis, and under certain boundary conditions, the problem has been solved to give a general solution of the differential equation relating the potential distribution to the temperature in the product during arc welding. This solution is obtained under the following boundary conditions: (1) the metal is homogeneous; (2) the input and output surfaces of the heat flux and the electric current coincide; (3) the input and output surfaces of the heat flux and the electric current are insulated and equipotential; (4) the other (lateral) surfaces are adiabatic boundaries. The paper therefore concentrates on obtaining the analytical solution of the general differential equation relating the distribution of potential to the temperature in the product. It considers the temperature field of a heat source that moves at the welding speed with a normal-circular distribution of the heat flow at a certain concentration factor. The distribution of current density is calculated on the assumption that the welding current is introduced through the same surface as the heat flux and that the distribution of current density is likewise normal-circular with a certain concentration factor. As a result, we obtain an expression that allows us to calculate the current density from the known
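
    The closing sentences of the record describe a heat source moving at the welding speed with a normal-circular (Gaussian) flux distribution. For orientation, the sketch below gives the standard textbook forms of such a source and of the quasi-stationary moving point-source (Rosenthal-type) temperature field it generalizes; these are classical illustrative results, not the paper's exact solution.

```latex
% Normal-circular surface flux with concentration factor k and total input q:
\[ q''(r) \;=\; \frac{k\,q}{\pi}\, e^{-k r^{2}} \]
% Quasi-stationary moving point source (Rosenthal) temperature field, with
% travel-frame coordinate \xi, conductivity \lambda, diffusivity a, speed v:
\[ T(\xi,y,z) - T_{0} \;=\; \frac{q}{2\pi \lambda R}\,
   \exp\!\left(-\frac{v\,(\xi + R)}{2a}\right),
   \qquad R = \sqrt{\xi^{2}+y^{2}+z^{2}} \]
```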

  11. On-line hydrogen-isotope measurements of organic samples using elemental chromium: an extension for high temperature elemental-analyzer techniques.

    Science.gov (United States)

    Gehre, Matthias; Renpenning, Julian; Gilevska, Tetyana; Qi, Haiping; Coplen, Tyler B; Meijer, Harro A J; Brand, Willi A; Schimmelmann, Arndt

    2015-01-01

    The high temperature conversion (HTC) technique using an elemental analyzer with a glassy carbon tube and filling (temperature conversion/elemental analysis, TC/EA) is a widely used method for hydrogen isotopic analysis of water and many solid and liquid organic samples with analysis by isotope-ratio mass spectrometry (IRMS). However, the TC/EA IRMS method may produce inaccurate δ(2)H results, with values deviating by more than 20 mUr (milliurey = 0.001 = 1‰) from the true value for some materials. We show that a single-oven, chromium-filled elemental analyzer coupled to an IRMS substantially improves the measurement quality and reliability for hydrogen isotopic compositions of organic substances (Cr-EA method). Hot chromium maximizes the yield of molecular hydrogen in a helium carrier gas by irreversibly and quantitatively scavenging all reactive elements except hydrogen. In contrast, under TC/EA conditions, heteroelements like nitrogen or chlorine (and other halogens) can form hydrogen cyanide (HCN) or hydrogen chloride (HCl) and this can cause isotopic fractionation. The Cr-EA technique thus expands the analytical possibilities for on-line hydrogen-isotope measurements of organic samples significantly. This method yielded reproducibility values (1-sigma) for δ(2)H measurements on water and caffeine samples of better than 1.0 and 0.5 mUr, respectively. To overcome handling problems with water as the principal calibration anchor for hydrogen isotopic measurements, we have employed an effective and simple strategy using reference waters or other liquids sealed in silver-tube segments. These crimped silver tubes can be employed in both the Cr-EA and TC/EA techniques. They simplify considerably the normalization of hydrogen-isotope measurement data to the VSMOW-SLAP (Vienna Standard Mean Ocean Water-Standard Light Antarctic Precipitation) scale, and their use improves accuracy of the data by eliminating evaporative loss and associated isotopic fractionation while

  12. On-line hydrogen-isotope measurements of organic samples using elemental chromium: An extension for high temperature elemental-analyzer techniques

    Science.gov (United States)

    Gehre, Matthias; Renpenning, Julian; Gilevska, Tetyana; Qi, Haiping; Coplen, Tyler B.; Meijer, Harro A.J.; Brand, Willi A.; Schimmelmann, Arndt

    2015-01-01

    The high temperature conversion (HTC) technique using an elemental analyzer with a glassy carbon tube and filling (temperature conversion/elemental analysis, TC/EA) is a widely used method for hydrogen isotopic analysis of water and many solid and liquid organic samples with analysis by isotope-ratio mass spectrometry (IRMS). However, the TC/EA IRMS method may produce inaccurate δ2H results, with values deviating by more than 20 mUr (milliurey = 0.001 = 1‰) from the true value for some materials. We show that a single-oven, chromium-filled elemental analyzer coupled to an IRMS substantially improves the measurement quality and reliability for hydrogen isotopic compositions of organic substances (Cr-EA method). Hot chromium maximizes the yield of molecular hydrogen in a helium carrier gas by irreversibly and quantitatively scavenging all reactive elements except hydrogen. In contrast, under TC/EA conditions, heteroelements like nitrogen or chlorine (and other halogens) can form hydrogen cyanide (HCN) or hydrogen chloride (HCl) and this can cause isotopic fractionation. The Cr-EA technique thus expands the analytical possibilities for on-line hydrogen-isotope measurements of organic samples significantly. This method yielded reproducibility values (1-sigma) for δ2H measurements on water and caffeine samples of better than 1.0 and 0.5 mUr, respectively. To overcome handling problems with water as the principal calibration anchor for hydrogen isotopic measurements, we have employed an effective and simple strategy using reference waters or other liquids sealed in silver-tube segments. These crimped silver tubes can be employed in both the Cr-EA and TC/EA techniques. They simplify considerably the normalization of hydrogen-isotope measurement data to the VSMOW-SLAP (Vienna Standard Mean Ocean Water-Standard Light Antarctic Precipitation) scale, and their use improves accuracy of the data by eliminating evaporative loss and associated isotopic fractionation while

  13. Fast Measurement and Reconstruction of Large Workpieces with Freeform Surfaces by Combining Local Scanning and Global Position Data

    Directory of Open Access Journals (Sweden)

    Zhe Chen

    2015-06-01

    In this paper, we propose a new approach for the measurement and reconstruction of large workpieces with freeform surfaces. The system consists of a handheld laser scanning sensor and a position sensor. The laser scanning sensor acquires the surface and geometry information, and the position sensor unifies the individual scans into a global coordinate system. The measurement process includes data collection, multi-sensor data fusion, and surface reconstruction. Through multi-sensor data fusion, errors accumulated during the image alignment and registration process are minimized, and the measuring precision is significantly improved. After dense, accurate acquisition of the three-dimensional (3-D) coordinates, the surface is reconstructed with commercial software based on Non-Uniform Rational B-Spline (NURBS) surfaces. The system has been evaluated, both qualitatively and quantitatively, against reference measurements provided by a commercial laser scanning sensor. The method has been applied to the reconstruction of a large gear rim, achieving an accuracy of 0.0963 mm. The results show that this combined method is promising for measuring and reconstructing large-scale objects with complex surface geometry. Compared with reported methods of large-scale shape measurement, it offers high freedom of motion, high precision, and high measurement speed over a wide measurement range.
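
    The coordinate-unification step the record describes amounts to mapping each locally scanned point cloud into the global frame reported by the position sensor. A minimal sketch of that step is below, assuming the pose arrives as a rotation matrix and translation vector; the pose values and point data are placeholders, not data from the paper.

```python
# Map points from the scanner's local frame into the global frame given
# by the position sensor. Pose and point values are placeholders.
import numpy as np

def to_global(points_local, R, t):
    """Apply a rigid transform (rotation R, translation t) to Nx3 points."""
    return points_local @ R.T + t

theta = np.deg2rad(30.0)                      # assumed scanner yaw
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([1.5, 0.2, 0.0])                 # assumed scanner position, m

scan = np.random.default_rng(0).random((1000, 3)) * 0.1   # one local patch
print(to_global(scan, R, t)[:3])
```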

  14. Analyzing the Potential Risk of Climate Change on Lyme Disease in Eastern Ontario, Canada Using Time Series Remotely Sensed Temperature Data and Tick Population Modelling

    Directory of Open Access Journals (Sweden)

    Angela Cheng

    2017-06-01

    The number of Lyme disease (Lyme borreliosis) cases in Ontario, Canada has increased over the last decade, and that figure is projected to continue to rise. The northern limit of Lyme disease cases has also been progressing northward from the northeastern United States into southeastern Ontario. Several factors, such as climate change, changes in host abundance, host and vector migration, or a combination of these, likely contribute to the emergence of Lyme disease cases in eastern Ontario. This study first determined areas of warming within Ontario using time series remotely sensed temperature data, then analyzed possible spatial-temporal changes in Lyme disease risk in eastern Ontario from 2000 to 2013 due to climate change using tick population modeling. The model outputs were validated using tick surveillance data from 2002 to 2012. Our results indicate areas of Ontario where conditions changed from unable to sustain to able to sustain Ixodes scapularis (black-legged tick) populations. This study provides evidence that climate change has facilitated the northward expansion of the black-legged tick's geographic range over the past decade. The results demonstrate that remote sensing data can be used to increase the spatial detail of Lyme disease risk mapping and to provide risk maps for better awareness of possible Lyme disease cases. Further studies are required to determine the contributions of host migration and abundance to changes in eastern Ontario's Lyme disease risk.

  15. Analyzing the Surface Temperature Depression in Hot Stage Atomic Force Microscopy with Unheated Cantilevers: Application to the Crystallization of Poly(ethylene oxide)

    NARCIS (Netherlands)

    Schönherr, Holger; Bailey, Larry E.; Frank, Curtis W.

    2002-01-01

    The in situ study of phase transitions in polymers by real-time atomic force microscopy (AFM) has received much attention recently. In this paper we report on the accuracy of surface temperatures measured during variable-temperature AFM experiments. In AFM studies on organic and polymeric samples at

  16. Analyses of Effects of Cutting Parameters on Cutting Edge Temperature Using Inverse Heat Conduction Technique

    Directory of Open Access Journals (Sweden)

    Marcelo Ribeiro dos Santos

    2014-01-01

    During machining, energy is transformed into heat through plastic deformation of the workpiece surface and friction between the tool and the workpiece. High temperatures are generated in the region of the cutting edge, and they have a very important influence on the wear rate of the cutting tool and on tool life. This work proposes the estimation of the heat flux at the chip-tool interface using inverse techniques. Factors that influence the temperature distribution at the rake face of an AISI M32C high speed steel tool during machining of an ABNT 12L14 steel workpiece were also investigated. The temperature distribution was predicted using finite volume elements. A transient 3D numerical code using an irregular, nonstaggered mesh was developed to solve the nonlinear heat diffusion equation. To validate the software, experimental tests were made. The inverse problem was solved using the function specification method. Heat fluxes at the tool-workpiece interface were estimated using inverse problem techniques and experimental temperatures. Tests were performed to study the effect of cutting parameters on cutting edge temperature. The results were compared with those of the tool-work thermocouple technique, and fair agreement was obtained.
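
    The inverse step the record describes can be pictured, in its simplest linear whole-domain form, as recovering a flux history from temperatures through a known sensitivity matrix, with Tikhonov regularization standing in for the stabilizing role of the function specification method. The sketch below uses a synthetic response kernel and invented numbers throughout.

```python
# Linear whole-domain inverse: recover flux history q from noisy
# temperatures T = X q. The kernel, noise, and flux levels are invented.
import numpy as np

n, dt = 60, 0.1
phi = np.sqrt(np.arange(1, n + 1) * dt)       # assumed unit-flux kernel
X = np.zeros((n, n))                          # lower-triangular sensitivity
for i in range(n):
    X[i, :i + 1] = phi[:i + 1][::-1]

q_true = np.where(np.arange(n) < 30, 1.0e5, 4.0e4)   # step-down flux, W/m^2
T = X @ q_true + np.random.default_rng(1).normal(0.0, 0.5, n)

alpha = 1e-2                                  # Tikhonov weight (stabilizer)
q_est = np.linalg.solve(X.T @ X + alpha * np.eye(n), X.T @ T)
print(q_est[:3], q_est[-3:])
```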

  17. Trace impurity analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, W.J.; Edwards, D. Jr.

    1979-01-01

    The need for long-term reliability of large-scale helium refrigerator systems used with superconducting accelerator magnets has necessitated the detection of impurities at levels of a few ppm. An analyzer that measures trace impurity levels of condensable contaminants in concentrations of less than a ppm in 15 atm of He is described. The instrument uses the desorption temperature at an indicated pressure of the various impurities to determine the type of contaminant. The pressure rise at that temperature yields a measure of the contaminant level of the impurity. An LN2 cryogenic charcoal trap is also employed to measure air impurities (nitrogen and oxygen) to cover the full range of possible contaminants. Results from this detector, which will be used on the research and development helium refrigerator of the ISABELLE First-Cell, are described.

  18. Effect of the Preheating Temperature on Process Time in Friction Stir Welding of Al 6061-T6

    DEFF Research Database (Denmark)

    Jabbari, Masoud

    2013-01-01

    This paper presents the results obtained and the deductions made from analytical modeling of friction stir welding of Al 6061-T6. A new database was developed to simulate the contact temperature between the tool and the workpiece. A second-order equation is proposed for simulating
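
    The record is truncated, but the kind of second-order relation it refers to can be sketched as a quadratic fit of contact temperature against a process parameter. Rotational speed is assumed as that parameter here, and the data points are invented for illustration.

```python
# Quadratic (second-order) fit of contact temperature vs. rotational
# speed. Both the parameter choice and the data points are invented.
import numpy as np

omega = np.array([800.0, 1000.0, 1200.0, 1400.0, 1600.0])  # rev/min
T_contact = np.array([400.0, 440.0, 470.0, 490.0, 500.0])  # deg C

a, b, c = np.polyfit(omega, T_contact, 2)
print(f"T(w) = {a:.3e}*w^2 + {b:.3e}*w + {c:.1f}")
```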

  19. Metallurgical analysis and nanoindentation characterization of Ti-6Al-4V workpiece and chips in high-throughput drilling

    Energy Technology Data Exchange (ETDEWEB)

    Li Rui [Department of Mechanical Engineering, University of Michigan, Ann Arbor, MI 48109 (United States); Riester, Laura; Watkins, Thomas R.; Blau, Peter J. [Materials Science and Technology Division, Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Shih, Albert J. [Department of Mechanical Engineering, University of Michigan, Ann Arbor, MI 48109 (United States)], E-mail: shiha@umich.edu

    2008-01-15

    The metallurgical analyses, including scanning electron microscopy (SEM), X-ray diffraction (XRD), electron microprobe, and nanoindentation characterization, are conducted to study the Ti-6Al-4V hole surface and subsurface and the chips in high-throughput drilling tests. The influence of high temperature, large strain, and high strain rate deformation on the {beta} {yields} {alpha} phase transformation and on mechanical properties is investigated. Diffusionless {beta} {yields} {alpha} phase transformation in the subsurface layer adjacent to the hole surface is observed in dry drilling, but not in the other drilling conditions with a supply of cutting fluid. Nanoindentation tests identify a 15-20 {mu}m thick high-hardness subsurface layer adjacent to the hole surface in dry drilling, with peak hardness over 9 GPa relative to the 4-5 GPa bulk material hardness. For the drilling chips, the {beta} phase is retained under all conditions tested due to rapid cooling. On the chips, saw-tooth features and narrow shear bands form only at the outermost edge, and no significant change of hardness across the shear bands is found in nanoindentation.

  20. Metallurgical Analysis and Nanoindentation Characterization of Ti-6Al-4V Workpiece and Chips in High-Throughput Drilling

    Energy Technology Data Exchange (ETDEWEB)

    Li, Rui [ORNL; Riester, Laura [ORNL; Watkins, Thomas R [ORNL; Blau, Peter Julian [ORNL; Shih, Albert J. [University of Michigan

    2008-01-01

    The metallurgical analyses, including scanning electron microscopy (SEM), X-ray diffraction (XRD), electron microprobe, and nanoindentation characterization, are conducted to study the Ti-6Al-4V hole surface and subsurface and the chips in high-throughput drilling tests. The influence of high temperature, large strain, and high strain rate deformation on the {beta}-{alpha} phase transformation and on mechanical properties is investigated. Diffusionless {beta}-{alpha} phase transformation in the subsurface layer adjacent to the hole surface is observed in dry drilling, but not in the other drilling conditions with a supply of cutting fluid. Nanoindentation tests identify a 15-20 {micro}m thick high-hardness subsurface layer adjacent to the hole surface in dry drilling, with peak hardness over 9 GPa relative to the 4-5 GPa bulk material hardness. For the drilling chips, the {beta} phase is retained under all conditions tested due to rapid cooling. On the chips, saw-tooth features and narrow shear bands form only at the outermost edge, and no significant change of hardness across the shear bands is found in nanoindentation.

  1. Regolith Evolved Gas Analyzer

    Science.gov (United States)

    Hoffman, John H.; Hedgecock, Jud; Nienaber, Terry; Cooper, Bonnie; Allen, Carlton; Ming, Doug

    2000-01-01

    The Regolith Evolved Gas Analyzer (REGA) is a high-temperature furnace and mass spectrometer instrument for determining the mineralogical composition and reactivity of soil samples. REGA provides key mineralogical and reactivity data needed to understand the soil chemistry of an asteroid, which then aids in determining, in situ, which materials should be selected for return to Earth. REGA is capable of conducting a number of direct soil measurements that are unique to this instrument. These experimental measurements include: (1) mass spectrum analysis of evolved gases from soil samples as they are heated from ambient temperature to 900 °C; and (2) identification of liberated chemicals, e.g., water, oxygen, sulfur, chlorine, and fluorine. REGA would be placed on the surface of a near-Earth asteroid. It is an autonomous instrument that is controlled from Earth but analyzes regolith materials automatically. The REGA instrument consists of four primary components: (1) a flight-proven mass spectrometer, (2) a high-temperature furnace, (3) a soil handling system, and (4) a microcontroller. An external arm containing a scoop or drill gathers regolith samples. A sample is placed in the inlet orifice, where the finest-grained particles are sifted into a metering volume and subsequently moved into a crucible. A movable arm then places the crucible in the furnace. The furnace is closed, sealing the inner volume to collect the evolved gases for analysis. Owing to the very low g forces on an asteroid compared to Mars or the Moon, the sample must be moved from inlet to crucible by mechanical means rather than by gravity. As the soil sample is heated through a programmed pattern, the gases evolved at each temperature are passed through a transfer tube to the mass spectrometer for analysis and identification. Return data from the instrument will lead to new insights and discoveries, including: (1) identification of the molecular masses of all of the gases

  2. An integrated systems calculation of a steam generator tube rupture in a modular prismatic HTGR (high-temperature gas-cooled reactor) conceptual design using ATHENA (Advanced Thermal-Hydraulic Energy Network Analyzer)

    Energy Technology Data Exchange (ETDEWEB)

    Beelman, R.J. (Idaho National Engineering Laboratory, Idaho Falls (USA))

    1989-11-01

    The capability to perform integrated systems calculations of modular high-temperature gas-cooled reactor (MHTGR) transients has been developed at the Idaho National Engineering Laboratory (INEL) using the Advanced Thermal-Hydraulic Energy Network Analyzer (ATHENA) computer code. A scoping calculation of a steam generator tube rupture (SGTR) water ingress event in a prismatic 2 {times} 350-MW(thermal) MHTGR conceptual design has been completed at INEL using ATHENA. The proposed MHTGR design incorporates dual graphite-moderated, helium-cooled, 350-MW(thermal), annular prismatic core reactor plants, each configured with an individual helical once-through steam generator supplying steam to a common 280-MW(electric) turbine generator set.

  3. Thermodynamic study of residual heat from a high temperature nuclear reactor to analyze its viability in cogeneration processes; Estudio termodinamico del calor residual de un reactor nuclear de alta temperatura para analizar su viabilidad en procesos de cogeneracion

    Energy Technology Data Exchange (ETDEWEB)

    Santillan R, A.; Valle H, J.; Escalante, J. A., E-mail: santillanaura@gmail.com [Universidad Politecnica Metropolitana de Hidalgo, Boulevard acceso a Tolcayuca 1009, Ex-Hacienda San Javier, 43860 Tolcayuca, Hidalgo (Mexico)

    2015-09-15

    In this paper, the thermodynamic study of a high-temperature gas-turbine nuclear power plant (GTHTR300) is presented in order to estimate the exploitable waste heat for a seawater desalination process. Nuclear energy is one of the most studied and viable sustainable options for producing electricity without the emission of greenhouse gases. Fourth-generation nuclear power plants offer advantages over currently installed plants in safety, higher efficiency, and suitability for coupling to electrical cogeneration processes. The thermodynamic study addresses a GTHTR300-type plant, selected for its higher efficiency and its optimal conditions for use in electrical cogeneration processes owing to its high operating temperatures, between 700 and 950 degrees Celsius. The aim of the study is to determine the heat losses and the work done at each stage of the system, identifying where the greatest losses occur and analyzing in which processes they can be exploited. The study showed that most of the energy losses take the form of heat rejected in the coolers, which is usually released to the atmosphere unused. Based on these results, seawater desalination is proposed as the electrical cogeneration process. This paper contains a brief description of the operation of the nuclear power plant, focusing on the operating conditions and thermodynamic characteristics relevant to implementing the cogeneration process; a thermodynamic analysis based on mass and energy balances was developed. The results allow the thermal energy losses to be quantified and the optimal section for coupling the reactor to the desalination process to be determined, with the aim of achieving high overall efficiency. (Author)
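
    The mass-and-energy-balance bookkeeping the record describes can be reduced, for the waste-heat question, to a first-law tally across the coolers. The sketch below shows that tally; the enthalpy drop and flow rate are invented placeholders, not GTHTR300 data.

```python
# Toy first-law tally of the reject heat available for desalination.
# The enthalpy values and flow rate are invented placeholders.
h_in, h_out = 900.0, 520.0          # helium enthalpy across the precooler, kJ/kg
mdot = 440.0                        # helium mass flow, kg/s
Q_reject = mdot * (h_in - h_out)    # rejected heat, kW
print(f"waste heat available ~ {Q_reject / 1e3:.0f} MW")
```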

  4. Project ATLANTA (Atlanta Land use Analysis: Temperature and Air Quality): Use of Remote Sensing and Modeling to Analyze How Urban Land Use Change Affects Meteorology and Air Quality Through Time

    Science.gov (United States)

    Quattrochi, Dale A.; Luvall, Jeffrey C.; Estes, Maurice G., Jr.

    1999-01-01

    This paper presents an overview of Project ATLANTA (ATlanta Land use ANalysis: Temperature and Air-quality), an investigation that seeks to observe, measure, model, and analyze how the rapid growth of the Atlanta, Georgia metropolitan area since the early 1970s has impacted the region's climate and air quality. The primary objectives of this research effort are: (1) to investigate and model the relationships between land cover change in the Atlanta metropolitan area and the development of the urban heat island phenomenon through time; (2) to investigate and model the temporal relationships between Atlanta urban growth and land cover change on air quality; and (3) to model the overall effects of urban development on surface energy budget characteristics across the Atlanta urban landscape through time. Our key goal is to derive a better scientific understanding of how land cover changes associated with urbanization in the Atlanta area, principally the transformation of forest lands to urban land covers through time, has affected, and will affect, local and regional climate, surface energy flux, and air quality characteristics. Allied with this goal is the prospect that the results from this research can be applied by urban planners, environmental managers, and other decision-makers to determine how urbanization has impacted the climate and overall environment of the Atlanta area. Multiscaled remote sensing data, particularly high resolution thermal infrared data, are integral to this study for the analysis of thermal energy fluxes across the Atlanta urban landscape.

  5. System and method of adjusting the equilibrium temperature of an inductively-heated susceptor

    Science.gov (United States)

    Matsen, Marc R; Negley, Mark A; Geren, William Preston

    2015-02-24

    A system for inductively heating a workpiece may include an induction coil, at least one susceptor face sheet, and a current controller. The induction coil may be configured to conduct an alternating current and generate a magnetic field in response to the alternating current. The susceptor face sheet may be configured to have a workpiece positioned against it. The susceptor face sheet may be formed of a ferromagnetic alloy having a Curie temperature and may be inductively heatable to an equilibrium temperature approaching the Curie temperature in response to the magnetic field. The current controller may be coupled to the induction coil and may be configured to adjust the alternating current in a manner causing a change in at least one heating parameter of the susceptor face sheet.
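
    The self-limiting behavior the patent record relies on, and the controller's role in shifting the equilibrium point, can be illustrated with a toy lumped model: inductive coupling collapses as the susceptor nears its Curie temperature, and a slow current trim moves the point where heating balances losses. Every constant below is an invented placeholder.

```python
# Toy lumped model: induction heating that self-limits near the Curie
# point, plus a slow current trim toward a setpoint. All constants are
# invented placeholders, not values from the patent.
T_CURIE = 720.0                        # assumed Curie temperature, deg C

def heating_power(current, temp):
    """Coupling collapses as temp approaches the Curie temperature."""
    coupling = max(0.0, 1.0 - (temp / T_CURIE) ** 8)
    return 40.0 * current ** 2 * coupling          # W, illustrative

temp, current = 20.0, 10.0             # initial state (deg C, A)
setpoint = 650.0                       # desired equilibrium, deg C
for _ in range(5000):
    loss = 0.8 * (temp - 20.0)                     # ambient losses, W
    temp += 0.001 * (heating_power(current, temp) - loss)
    current += 1e-4 * (setpoint - temp)            # proportional trim
print(f"equilibrium ~ {temp:.0f} deg C at {current:.1f} A")
```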

  6. 40 CFR 90.313 - Analyzers required.

    Science.gov (United States)

    2010-07-01

    ... of the exhaust gas at the sample probe is below 190 °C, the temperature of the valves, pipe work, and... temperature of the exhaust gas at the sample probe is above 190 °C, the temperature of the valves, pipe work... the HFID analyzer, the detector, oven, and sample-handling components within the oven must be suitable...

  7. DOG optical gas analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Azbukin, A.A.; Buldakov, M.A.; Korolev, B.V.; Korol'kov, V.A.; Matrosov, I.I. [Siberian Branch of the Russian Academy of Sciences, Tomsk (Russian Federation). Inst. of Optical Monitoring

    2002-01-01

    Stationary gas analyzers for continuous monitoring of sulfur and nitrogen oxides in the exhaust gases of electric power plants burning fossil fuels have been developed. The DOG series of gas analyzers uses non-laser UV radiation sources and the differential absorption lidar (DIAL) measurement technique. Operation of the gas analyzers at Russian electric power plants has shown high efficiency, reliability, and ease of operation at lower cost compared with similar foreign devices. 8 refs., 3 figs., 1 tab.
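
    For reference, the DIAL technique named in the record retrieves the mean number density N of the absorbing species between ranges R1 and R2 from the signal powers P at on- and off-absorption wavelengths; the standard retrieval, with differential absorption cross-section Δσ, is:

```latex
\[ N \;=\; \frac{1}{2\,\Delta\sigma\,(R_{2}-R_{1})}\,
   \ln\!\left[
   \frac{P(\lambda_{\mathrm{off}},R_{2})\,P(\lambda_{\mathrm{on}},R_{1})}
        {P(\lambda_{\mathrm{on}},R_{2})\,P(\lambda_{\mathrm{off}},R_{1})}
   \right] \]
```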

  8. Universal MOSFET parameter analyzer

    Science.gov (United States)

    Klekachev, A. V.; Kuznetsov, S. N.; Pikulev, V. B.; Gurtov, V. A.

    2006-05-01

    The MOSFET analyzer was developed to extract the most important parameters of transistors. Beyond routine DC transfer and output characteristics, the analyzer provides an evaluation of interface state density by applying the charge pumping technique. Two features distinguish the analyzer from similar products of other vendors: it is compact (100 × 80 × 50 mm³) and lightweight, with an ultra-low-power supply. The instrument was designed on the basis of component parts from CYPRESS and ANALOG DEVICES (USA).

  9. Analyzing Peace Pedagogies

    Science.gov (United States)

    Haavelsrud, Magnus; Stenberg, Oddbjorn

    2012-01-01

    Eleven articles on peace education published in the first volume of the Journal of Peace Education are analyzed. This selection comprises peace education programs that have been planned or carried out in different contexts. In analyzing peace pedagogies as proposed in the 11 contributions, we have chosen network analysis as our method--enabling…

  10. Gearbox vibration diagnostic analyzer

    Science.gov (United States)

    1992-01-01

    This report describes the Gearbox Vibration Diagnostic Analyzer installed in the NASA Lewis Research Center's 500 HP Helicopter Transmission Test Stand to monitor gearbox testing. The vibration of the gearbox is analyzed using diagnostic algorithms to calculate a parameter indicating damaged components.

  11. Analyzing in the present

    DEFF Research Database (Denmark)

    Revsbæk, Line; Tanggaard, Lene

    2015-01-01

    The article presents a notion of “analyzing in the present” as a source of inspiration for analyzing qualitative research materials. The term emerged from extensive listening to interview recordings during everyday commuting to the university campus. Paying attention to the way different parts of various interviews conveyed diverse significance to the listening researcher at different times became a method of continuously opening up the empirical material in a reflexive, breakdown-oriented process of analysis. We argue that situating analysis in the present of analyzing emphasizes and acknowledges...

  12. Software Design Analyzer System

    Science.gov (United States)

    Tausworthe, R. C.

    1985-01-01

    The CRISP80 software design analyzer system is a set of programs that supports top-down, hierarchic, modular, structured design and programming methodologies. CRISP80 allows a design to be expressed as a picture of the program.

  13. Miniature mass analyzer

    CERN Document Server

    Cuna, C; Lupsa, N; Cuna, S; Tuzson, B

    2003-01-01

    The paper presents the concepts of different mass analyzers specifically designed as small-dimension instruments able to detect the main environmental pollutants with great sensitivity and accuracy. Mass spectrometers are well-suited instruments for the chemical and isotopic analysis needed in environmental surveillance. Usually, this is done by sampling the soil, air, or water, followed by laboratory analysis. To avoid the drawbacks caused by sample alteration during sampling and transport, 'in situ' analysis is preferred. Theoretically, any type of mass analyzer can be miniaturized, but some are more appropriate than others. Quadrupole mass filters and traps, magnetic sector, time-of-flight, and ion cyclotron mass analyzers can all be successfully shrunk; for each of them some performance is sacrificed, and one must know which parameters must be kept unchanged. To satisfy the miniaturization criteria of the analyzer, it is necessary to use asymmetrical geometries, with ion beam obl...

  14. American options analyzed differently

    NARCIS (Netherlands)

    Nieuwenhuis, J.W.

    2003-01-01

    In this note we analyze American options in a discrete-time context with a finite outcome space, starting from the idea that every tradable should be a martingale under a certain measure. We believe that in this way American options become more understandable to people with a good working

  15. Analyzing Stereotypes in Media.

    Science.gov (United States)

    Baker, Jackie

    1996-01-01

    A high school film teacher studied how students recognized messages in film, examining how film education could help students identify and analyze racial and gender stereotypes. Comparison of students' attitudes before and after the film course found that the course was successful in raising students' consciousness. (SM)

  16. Ocular Response Analyzer

    OpenAIRE

    Kaushik, Sushmita; Pandav, Surinder Singh

    2012-01-01

    Until recently, corneal biomechanical properties could not be measured in vivo. The ocular response analyzer (ORA) is a new, noninvasive device that analyses corneal biomechanical properties simply and rapidly. The ORA allows cornea-compensated IOP measurements and can estimate corneal hysteresis (CH) and corneal resistance factor (CRF). It is designed to improve the accuracy of IOP measurement by using corneal biomechanical data to calculate a biomechanically adjusted estimate of intraocular press...

  17. Analyzing sustainable competitive advantage

    OpenAIRE

    Abdul Malek Nurul Aida; Shahzad Khuram; Takala Josu; Bojnec Stefan; Papler Drago; Liu Yang

    2016-01-01

    In today’s dynamic business environment, a key challenge for all companies is to make adaptive adjustments to their manufacturing strategy. This study examines the competitive priorities of manufacturing strategy in a hydro-power case company to evaluate the level of sustainable competitive advantage, and also to further analyze how business strategies are aligned with manufacturing strategies. This research is based on a new holistic analytical evaluation of manufacturing strategy index, sens...

  18. Inductive dielectric analyzer

    Science.gov (United States)

    Agranovich, Daniel; Polygalov, Eugene; Popov, Ivan; Ben Ishai, Paul; Feldman, Yuri

    2017-03-01

    One of the approaches to bypassing the problem of electrode polarization in dielectric measurements is the free electrode method. The advantage of this technique is that the probing electric field in the material is supplied not by contact electrodes but by electromagnetic induction. We have designed an inductive dielectric analyzer based on a sensor comprising two concentric toroidal coils. In this work, we present an analytic derivation of the relationship between the impedance measured by the sensor and the complex dielectric permittivity of the sample. The obtained relationship was successfully employed to measure the dielectric permittivity and conductivity of various alcohols and aqueous salt solutions.

  19. Field Deployable DNA analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Wheeler, E; Christian, A; Marion, J; Sorensen, K; Arroyo, E; Vrankovich, G; Hara, C; Nguyen, C

    2005-02-09

    This report details the feasibility of a field-deployable DNA analyzer. Steps for swabbing cells from surfaces and extracting DNA in an automatable way are presented. Since enzymatic amplification reactions are highly sensitive to environmental contamination, sample preparation is a crucial step in making an autonomous deployable instrument. We perform sample clean-up and concentration in a flow-through packed bed. For small initial samples, whole genome amplification is performed in the packed bed, yielding enough product for subsequent PCR amplification. In addition to DNA, which can be used to identify a subject, protein is also left behind; its analysis can be used to determine exposure to certain substances, such as radionuclides. Our preparative step for DNA analysis left behind the protein complement as a waste stream, so we sought to determine whether the proteins themselves could be analyzed in a fieldable device. We successfully developed a two-step lateral flow assay for protein analysis and demonstrate a proof-of-principle assay.

  20. Ring Image Analyzer

    Science.gov (United States)

    Strekalov, Dmitry V.

    2012-01-01

    Ring Image Analyzer software analyzes images to recognize elliptical patterns. It determines the ellipse parameters (axes ratio, centroid coordinates, tilt angle). The program attempts to recognize elliptical fringes (e.g., Newton rings) in a photograph and determine their centroid position, the short-to-long-axis ratio, and the angle of rotation of the long axis relative to the horizontal direction in the photograph. These capabilities are important in interferometric imaging and control of surfaces. In particular, this program has been developed and applied for determining the rim shape of precision-machined optical whispering gallery mode resonators. The program relies on a unique image recognition algorithm aimed at recognizing elliptical shapes, but it can easily be adapted to other geometric shapes. It is robust against non-elliptical details of the image and against noise. Interferometric analysis of precision-machined surfaces remains an important technological instrument in hardware development and quality analysis. This software automates and increases the accuracy of this technique. The software was developed for the needs of an R&TD-funded project and has become an important asset for future research proposals to NASA as well as other agencies.

  1. MM98.34 Experimental Measurements of Die Temperatures and Determination of the Heat Transfer Coefficient in Backward Can Extrusion

    DEFF Research Database (Denmark)

    Henningsen, Poul; Hattel, Jesper Henri; Wanheim, Tarras

    1998-01-01

    from the surface. The thermocouples are welded to the ends of grooves milled in a small plug, which is pressed into a hole in the punch nose. All the temperature measurements in the tool and the workpiece are compared with a number of FEM simulations computed with different heat transfer coefficients. The heat transfer coefficient is determined as the one resulting in the best agreement between the measurements and the simulations.

  2. Analyzing complicity in risk.

    Science.gov (United States)

    Busby, Jerry

    2008-12-01

    When risks generate anger rather than fear, there is at least someone who regards the imposition of those risks as wrongdoing; and it then makes sense to speak of the involvement in producing those risks as complicity. It is particularly relevant to examine the complicity of risk bearers, because this is likely to have a strong influence on how far other actors should go in providing them with protection. This article makes a case for analyzing complicity explicitly, in parallel with normal processes of risk assessment, and proposes a framework for this analysis. It shows how it can be applied in a case study of maritime transportation, and examines the practical and theoretical difficulties of this kind of analysis. The conclusion is that the analysis has to be formative rather than summative, but that it could provide a useful way of exposing differences in the assumptions of different actors about agency and responsibility.

  3. PhosphoSiteAnalyzer

    DEFF Research Database (Denmark)

    Bennetzen, Martin V; Cox, Jürgen; Mann, Matthias

    2012-01-01

    Phosphoproteomic experiments are routinely conducted in laboratories worldwide, and because of the fast development of mass spectrometric techniques and efficient phosphopeptide enrichment methods, researchers frequently end up with lists of tens of thousands of phosphorylation sites for further interrogation. To answer biologically relevant questions from these complex data sets, it becomes essential to apply computational, statistical, and predictive analytical methods. Here we provide an advanced bioinformatic platform termed "PhosphoSiteAnalyzer" to explore large phosphoproteomic data sets. The platform includes an algorithm to retrieve kinase predictions from the public NetworKIN webpage in a semiautomated way and thereafter applies advanced statistics to facilitate a user-tailored in-depth analysis of the phosphoproteomic data sets. The interface of the software provides a high degree of analytical flexibility...

  4. System performance analyzer

    Science.gov (United States)

    Helbig, H. R.

    1981-01-01

    The System Performance Analyzer (SPA), designed to provide accurate real-time information about the operation of complex systems and developed for use on the Airborne Data Analysis/Monitor System (ADAMS), a ROLM 1666-based system, is described. The system uses an external processor to operate an intelligent, simulated control panel. Functions are also provided to trace operations, determine the frequency of use of memory areas, and time or count user tasks in a multitask environment. This augments the information available from the standard debugger and control panel and reduces the time and effort needed by ROLM 1666 users in optimizing their systems, as well as providing documentation of the effect of any changes. The operation and state of the system are evaluated.

  5. PDA: Pooled DNA analyzer

    Directory of Open Access Journals (Sweden)

    Lin Chin-Yu

    2006-04-01

    Abstract Background Association mapping using abundant single nucleotide polymorphisms is a powerful tool for identifying disease susceptibility genes for complex traits and exploring possible genetic diversity. Genotyping large numbers of SNPs individually is performed routinely but is cost prohibitive for large-scale genetic studies. DNA pooling is a reliable and cost-saving alternative genotyping method. However, no software had been developed for complete pooled-DNA analyses, including data standardization, allele frequency estimation, and single/multipoint DNA pooling association tests. This motivated the development of the software 'PDA' (Pooled DNA Analyzer) to analyze pooled DNA data. Results We developed the software PDA for the analysis of pooled-DNA data. PDA is originally implemented in the MATLAB® language, but it can also be executed on a Windows system without installing MATLAB®. PDA provides estimates of the coefficient of preferential amplification and allele frequency. PDA considers an extended single-point association test, which can compare allele frequencies between two DNA pools constructed under different experimental conditions. Moreover, PDA also provides novel chromosome-wide multipoint association tests based on p-value combinations and a sliding-window concept. This new multipoint testing procedure overcomes a computational bottleneck of conventional haplotype-oriented multipoint methods in DNA pooling analyses and can handle data sets having a large pool size and/or large numbers of polymorphic markers. All of the PDA functions are illustrated in four bona fide examples. Conclusion PDA is simple to operate and does not require users to have a strong statistical background. The software is available at http://www.ibms.sinica.edu.tw/%7Ecsjfann/first%20flow/pda.htm.
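
    The two estimates the record highlights, the coefficient of preferential amplification and the pooled allele frequency, fit together as in the sketch below: the coefficient is estimated from heterozygous individuals and then rescales the pooled peak-height ratio. The peak heights here are invented, and PDA's exact estimators may differ in detail.

```python
# Correct a pooled peak-height ratio for preferential amplification.
# Peak heights are invented; PDA's exact estimators may differ in detail.
import numpy as np

# Allele-A vs allele-B peak heights in known heterozygotes -> estimate k.
het_A = np.array([1.20, 1.15, 1.25, 1.18])
het_B = np.array([1.00, 0.98, 1.05, 1.00])
k = np.mean(het_A / het_B)         # preferential amplification of A over B

H_A, H_B = 310.0, 540.0            # pooled-sample peak heights
p_naive = H_A / (H_A + H_B)
p_corr = H_A / (H_A + k * H_B)     # corrected frequency of allele A
print(f"k = {k:.2f}, naive p = {p_naive:.3f}, corrected p = {p_corr:.3f}")
```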

  6. Analyzing Spacecraft Telecommunication Systems

    Science.gov (United States)

    Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric

    2004-01-01

    Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.

  7. Downhole Fluid Analyzer Development

    Energy Technology Data Exchange (ETDEWEB)

    Bill Turner

    2006-11-28

    A novel fiber optic downhole fluid analyzer has been developed for operation in production wells. This device will allow real-time determination of the oil, gas and water fractions of fluids from different zones in a multizone or multilateral completion environment. The device uses near infrared spectroscopy and induced fluorescence measurement to unambiguously determine the oil, water and gas concentrations at all but the highest water cuts. The only downhole components of the system are the fiber optic cable and windows. All of the active components--light sources, sensors, detection electronics and software--will be located at the surface, and will be able to operate multiple downhole probes. Laboratory testing has demonstrated that the sensor can accurately determine oil, water and gas fractions with a less than 5 percent standard error. Once installed in an intelligent completion, this sensor will give the operating company timely information about the fluids arising from various zones or multilaterals in a complex completion pattern, allowing informed decisions to be made on controlling production. The research and development tasks are discussed along with a market analysis.

  8. Bios data analyzer.

    Science.gov (United States)

    Sabelli, H; Sugerman, A; Kovacevic, L; Kauffman, L; Carlson-Sabelli, L; Patel, M; Konecki, J

    2005-10-01

    The Bios Data Analyzer (BDA) is a set of computer programs (CD-ROM, in Sabelli et al., Bios. A Study of Creation, 2005) for new time series analyses that detect and measure creative phenomena, namely diversification, novelty, complexes, and nonrandom complexity. We define a process as creative when its time series displays these properties. They are found in heartbeat interval series, the exemplar of bios, just as turbulence is the exemplar of chaos; in many other empirical series (galactic distributions and meteorological, economic, and physiological series); in biotic series generated mathematically by bipolar feedback; and in stochastic noise, but not in chaotic attractors. Differencing, consecutive recurrence, and partial autocorrelation indicate nonrandom causation, thereby distinguishing chaos and bios from random processes and random walks. Embedding plots distinguish causal creative processes (e.g., bios), which include both simple and complex components of variation, from stochastic processes (e.g., Brownian noise), which include only complex components, and from chaotic processes, which decay from order to randomness as the number of dimensions is increased. Varying bin size and dimensionality show that entropy measures symmetry and variety, and that complexity is associated with asymmetry. Trigonometric transformations measure coexisting opposites in time series and demonstrate bipolar, partial, and uncorrelated opposites in empirical processes and bios, supporting the hypothesis that bios is generated by bipolar feedback, a concept at variance with standard concepts of polar and complementary opposites.

  9. Thermocouple and infrared sensor-based measurement of temperature distribution in metal cutting.

    Science.gov (United States)

    Kus, Abdil; Isik, Yahya; Cakir, M Cemal; Coşkun, Salih; Özdemir, Kadir

    2015-01-12

    In metal cutting, the magnitude of the temperature at the tool-chip interface is a function of the cutting parameters. This temperature directly affects production; therefore, increased research on the role of cutting temperatures can lead to improved machining operations. In this study, tool temperature was estimated by simultaneous temperature measurement employing both a K-type thermocouple and an infrared radiation (IR) pyrometer to measure the tool-chip interface temperature. Due to the complexity of the machining processes, the integration of different measuring techniques was necessary in order to obtain consistent temperature data. The thermal analysis results were compared via the ANSYS finite element method. Experiments were carried out in dry machining using workpiece material of AISI 4140 alloy steel that was heat treated by an induction process to a hardness of 50 HRC. A PVD TiAlN-TiN-coated WNVG 080404-IC907 carbide insert was used during the turning process. The results showed that with increasing cutting speed, feed rate and depth of cut, the tool temperature increased; the cutting speed was found to be the most effective parameter in assessing the temperature rise. The heat distribution of the cutting tool, tool-chip interface and workpiece provided effective and useful data for the optimization of selected cutting parameters during orthogonal machining.

  10. Thermocouple and Infrared Sensor-Based Measurement of Temperature Distribution in Metal Cutting

    Directory of Open Access Journals (Sweden)

    Abdil Kus

    2015-01-01

    In metal cutting, the magnitude of the temperature at the tool-chip interface is a function of the cutting parameters. This temperature directly affects production; therefore, increased research on the role of cutting temperatures can lead to improved machining operations. In this study, tool temperature was estimated by simultaneous temperature measurement employing both a K-type thermocouple and an infrared radiation (IR) pyrometer to measure the tool-chip interface temperature. Due to the complexity of the machining processes, the integration of different measuring techniques was necessary in order to obtain consistent temperature data. The thermal analysis results were compared via the ANSYS finite element method. Experiments were carried out in dry machining using workpiece material of AISI 4140 alloy steel that was heat treated by an induction process to a hardness of 50 HRC. A PVD TiAlN-TiN-coated WNVG 080404-IC907 carbide insert was used during the turning process. The results showed that with increasing cutting speed, feed rate and depth of cut, the tool temperature increased; the cutting speed was found to be the most effective parameter in assessing the temperature rise. The heat distribution of the cutting tool, tool-chip interface and workpiece provided effective and useful data for the optimization of selected cutting parameters during orthogonal machining.

  11. Thermocouple and Infrared Sensor-Based Measurement of Temperature Distribution in Metal Cutting

    Science.gov (United States)

    Kus, Abdil; Isik, Yahya; Cakir, M. Cemal; Coşkun, Salih; Özdemir, Kadir

    2015-01-01

    In metal cutting, the magnitude of the temperature at the tool-chip interface is a function of the cutting parameters. This temperature directly affects production; therefore, increased research on the role of cutting temperatures can lead to improved machining operations. In this study, tool temperature was estimated by simultaneous temperature measurement employing both a K-type thermocouple and an infrared radiation (IR) pyrometer to measure the tool-chip interface temperature. Due to the complexity of the machining processes, the integration of different measuring techniques was necessary in order to obtain consistent temperature data. The thermal analysis results were compared via the ANSYS finite element method. Experiments were carried out in dry machining using workpiece material of AISI 4140 alloy steel that was heat treated by an induction process to a hardness of 50 HRC. A PVD TiAlN-TiN-coated WNVG 080404-IC907 carbide insert was used during the turning process. The results showed that with increasing cutting speed, feed rate and depth of cut, the tool temperature increased; the cutting speed was found to be the most effective parameter in assessing the temperature rise. The heat distribution of the cutting tool, tool-chip interface and workpiece provided effective and useful data for the optimization of selected cutting parameters during orthogonal machining. PMID:25587976

  12. Digital Microfluidics Sample Analyzer

    Science.gov (United States)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples, in ways that can benefit point-of-care needs for patients such as premature infants, for whom drawing blood for continuous tests can be life-threatening in its own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. The programming system also allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a. lab-on-a-printed-circuit-board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system, including the chip, that performs many of the routine operations of a central lab-based chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  13. Soft Decision Analyzer

    Science.gov (United States)

    Steele, Glen; Lansdowne, Chatwin; Zucha, Joan; Schlensinger, Adam

    2013-01-01

    The Soft Decision Analyzer (SDA) is an instrument that combines hardware, firmware, and software to perform real-time closed-loop end-to-end statistical analysis of single- or dual-channel serial digital RF communications systems operating in very low signal-to-noise conditions. As an innovation, the unique SDA capabilities allow it to analyze situations where the receiving communication system slips bits due to low signal-to-noise conditions or experiences constellation rotations resulting in channel polarity inversions or channel assignment swaps. The SDA's closed-loop detection allows it to instrument a live system and correlate observations with frame, codeword, and packet losses, as well as Quality of Service (QoS) and Quality of Experience (QoE) events. The SDA's abilities are not confined to analysis in low signal-to-noise conditions; its analysis provides in-depth insight into a communication system's receiver performance in a variety of operating conditions. The SDA incorporates two techniques for identifying slips. The first is an examination of the received data stream's relation to the transmitted data content, and the second is a direct examination of the receiver's recovered clock signals relative to a reference. Both techniques provide benefits in different ways and allow the communication engineer evaluating test results increased confidence in and understanding of receiver performance. Direct examination of data contents is performed by two different techniques, power correlation or a modified Massey correlation, and can be applied to soft decision data widths of 1 to 12 bits over a correlation depth ranging from 16 to 512 samples. The SDA detects receiver bit slips within a 4-bit window and can handle systems with up to four quadrants (QPSK, SQPSK, and BPSK systems). The SDA continuously monitors correlation results to characterize slips and quadrant changes and is capable of performing analysis even when the

  14. Transient, three-dimensional heat transfer model for the laser assisted machining of silicon nitride: 1. Comparison of predictions with measured surface temperature histories

    Energy Technology Data Exchange (ETDEWEB)

    Rozzi, J.C.; Pfefferkorn, F.E.; Shin, Y.C. [Purdue University, (United States). Laser Assisted Materials Processing Laboratory, School of Mechanical Engineering; Incropera, F.P. [University of Notre Dame, (United States). Aerospace and Mechanical Engineering Department

    2000-04-01

    Laser assisted machining (LAM), in which the material is locally heated by an intense laser source prior to material removal, provides an alternative machining process with the potential to yield higher material removal rates, as well as improved control of workpiece properties and geometry, for difficult-to-machine materials such as structural ceramics. To assess the feasibility of the LAM process and to obtain an improved understanding of governing physical phenomena, experiments have been performed to determine the thermal response of a rotating silicon nitride workpiece undergoing heating by a translating CO{sub 2} laser and material removal by a cutting tool. Using a focused laser pyrometer, surface temperature histories were measured to determine the effect of the rotational and translational speeds, the depth of cut, the laser-tool lead distance, and the laser beam diameter and power on thermal conditions. The measurements are in excellent agreement with predictions based on a transient, three-dimensional numerical solution of the heating and material removal processes. The temperature distribution within the unmachined workpiece is most strongly influenced by the laser power and laser-tool lead distance, as well as by the laser/tool translational velocity. A minimum allowable operating temperature in the material removal region corresponds to the YSiAlON glass transition temperature, below which tool fracture may occur. In a companion paper, the numerical model is used to further elucidate thermal conditions associated with laser assisted machining. (author)

  15. Simulation of a Hyperbolic Field Energy Analyzer

    CERN Document Server

    Gonzalez-Lizardo, Angel

    2016-01-01

    Energy analyzers are important plasma diagnostic tools with applications in a broad range of disciplines including molecular spectroscopy, electron microscopy, basic plasma physics, plasma etching, plasma processing, and ion sputtering technology. The Hyperbolic Field Energy Analyzer (HFEA) is a novel device able to determine ion and electron energy spectra and temperatures. The HFEA is well suited for ion temperature and density diagnostics in situations where ions are scarce. A simulation of the capacity of the HFEA to discriminate particles of a particular energy level, as well as to determine temperature and density, is performed in this work. The electric field due to the combination of the applied voltages on the conical elements, collimator lens, and Faraday cup was computed on a suitable three-dimensional grid. The field is then used to compute the trajectories of a set of particles with a predetermined energy distribution. The results include the observation of the particle trajectories inside the sens...
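
    A particle-pushing step of the kind such a simulation requires can be sketched as follows; `efield` is a hypothetical callable interpolating the precomputed three-dimensional field, and the leapfrog scheme is one common choice, not necessarily the one used in the paper.

```python
import numpy as np

def trace_ion(pos, vel, efield, q_over_m, dt=1e-9, steps=20000):
    """Illustrative trajectory integrator: push a charged particle through
    a precomputed electric field using the leapfrog scheme. efield(pos) is
    a hypothetical callable returning the interpolated E vector (V/m) at a
    position (m); q_over_m is the charge-to-mass ratio (C/kg).
    """
    pos = np.array(pos, dtype=float)
    vel = np.array(vel, dtype=float)
    vel += 0.5 * dt * q_over_m * efield(pos)   # half-step velocity kick
    traj = [pos.copy()]
    for _ in range(steps):
        pos += dt * vel                        # position drift
        vel += dt * q_over_m * efield(pos)     # full-step velocity kick
        traj.append(pos.copy())
    return np.array(traj)
```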

  16. PM 3655 PHILIPS Logic analyzer

    CERN Multimedia

    A logic analyzer is an electronic instrument that captures and displays multiple signals from a digital system or digital circuit. A logic analyzer may convert the captured data into timing diagrams, protocol decodes, state machine traces, assembly language, or may correlate assembly with source-level software. Logic Analyzers have advanced triggering capabilities, and are useful when a user needs to see the timing relationships between many signals in a digital system.

  17. Temperature Measurement and Numerical Prediction in Machining Inconel 718.

    Science.gov (United States)

    Díaz-Álvarez, José; Tapetado, Alberto; Vázquez, Carmen; Miguélez, Henar

    2017-06-30

    Thermal issues are critical when machining Ni-based superalloy components designed for high temperature applications. The low thermal conductivity and extreme strain hardening of this family of materials results in elevated temperatures around the cutting area. This elevated temperature could lead to machining-induced damage such as phase changes and residual stresses, resulting in reduced service life of the component. Measurement of temperature during machining is crucial in order to control the cutting process, avoiding workpiece damage. On the other hand, the development of predictive tools based on numerical models helps in the definition of machining processes and the determination of difficult-to-measure parameters, such as the penetration of the heated layer. However, the validation of numerical models strongly depends on the accurate measurement of physical parameters such as temperature, ensuring the calibration of the model. This paper focuses on the measurement and prediction of temperature during the machining of Ni-based superalloys. The temperature sensor was based on a fiber-optic two-color pyrometer developed for localized temperature measurements in turning of Inconel 718. The sensor is capable of measuring temperature in the range of 250 to 1200 °C. Temperature evolution is recorded in a lathe at different feed rates and cutting speeds. Measurements were used to calibrate a simplified numerical model for prediction of temperature fields during turning.
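
    The principle of a two-color pyrometer can be shown with a short worked sketch: under the Wien approximation and a gray-body assumption (equal emissivity in both bands), the ratio of the two band signals determines temperature independently of the emissivity value. The function below is illustrative, not the paper's calibration; the wavelengths in the example are assumptions.

```python
import numpy as np

C2 = 1.4388e-2  # second radiation constant, m*K

def two_color_temperature(s1, s2, lam1, lam2):
    """Gray-body ratio pyrometry under the Wien approximation:
    L(lam, T) ~ eps * lam**-5 * exp(-C2 / (lam * T)), so with equal
    emissivity in both bands the signal ratio s1/s2 yields
    T = C2 * (1/lam2 - 1/lam1) / (ln(s1/s2) - 5 * ln(lam2/lam1)).
    Wavelengths lam1, lam2 are in meters.
    """
    ratio = s1 / s2
    return C2 * (1.0 / lam2 - 1.0 / lam1) / (
        np.log(ratio) - 5.0 * np.log(lam2 / lam1))

# Example with assumed bands at 1.3 um and 1.6 um:
# two_color_temperature(0.3546, 1.0, 1.3e-6, 1.6e-6) -> ~1000 K
```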

  18. Analyzing data files in SWAN

    CERN Document Server

    Gajam, Niharika

    2016-01-01

    Traditionally analyzing data happens via batch-processing and interactive work on the terminal. The project aims to provide another way of analyzing data files: A cloud-based approach. It aims to make it a productive and interactive environment through the combination of FCC and SWAN software.

  19. CSTT Update: Fuel Quality Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Brosha, Eric L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lujan, Roger W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mukundan, Rangachary [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockward, Tommy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Romero, Christopher J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Stefan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wilson, Mahlon S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-06

    These are slides from a presentation. The following topics are covered: project background (scope and approach), developing the prototype (timeline), update on intellectual property, analyzer comparisons (improving humidification, stabilizing the baseline, applying clean-up strategy, impact of ionomer content and improving clean-up), proposed operating mode, considerations for testing in real-world conditions (Gen 1 analyzer electronics development, testing partner identified, field trial planning), summary, and future work.

  20. Study of Cutting Edge Temperature and Cutting Force of End Mill Tool in High Speed Machining

    Directory of Open Access Journals (Sweden)

    Kiprawi Mohammad Ashaari

    2017-01-01

    Full Text Available Wear of cutting tools during the machining process is unavoidable due to the frictional forces present while removing unwanted workpiece material. It cannot be eliminated, but it can be slowed if the cutting speed is fixed at a point that achieves optimum cutting conditions. Tool wear is closely related to the thermal deformation that occurs at the frictional contact point between the cutting edge of the tool and the workpiece. This research paper focuses on determining the relationships among cutting temperature, cutting speed, cutting forces, and radial depth of cut. The cutting temperature is determined by using Indium Arsenide (InAs) and Indium Antimonide (InSb) photocells to measure the infrared radiation emitted from the cutting tool, and the cutting forces are determined by using a dynamometer. The high speed machining process is performed by end milling the outer surface of carbon steel. The signal from the photocell is visualized on a digital oscilloscope. Based on the results, the cutting temperature increased as the radial depth and cutting speed increased. The cutting forces increased when the radial depth increased but decreased when the cutting speed increased. The calibration setup and a discussion of the experiment are presented in this paper.

  1. Precision radiometric surface temperature (PRST) sensor

    Science.gov (United States)

    Daly, James T.; Roberts, Carson; Bodkin, Andrew; Sundberg, Robert; Beaven, Scott; Weinheimer, Jeffrey

    2013-05-01

    There is a need for a Precision Radiometric Surface Temperature (PRST) measurement capability that can achieve noncontact profiling of a sample's surface temperature when heated dynamically during laser processing, aerothermal heating, or metal cutting/machining. Target surface temperature maps within and near the heated spot provide critical quantitative diagnostic data for laser-target coupling effectiveness and laser damage assessment. In the case of metal cutting, this type of measurement provides information on plastic deformation in the primary shear zone where the cutting tool is in contact with the workpiece. The challenge in these cases is to measure the temperature of a target while its surface temperature and emissivity are changing rapidly, and with incomplete knowledge of how the emissivity and surface texture (scattering) change with temperature. Bodkin Design and Engineering, LLC (BD&E), with partners Spectral Sciences, Inc. (SSI) and Space Computer Corporation (SCC), has developed a PRST Sensor that is based on a hyperspectral MWIR imager spanning the wavelength range 2-5 μm and providing a hyperspectral datacube of 20-24 wavelengths at a 60 Hz frame rate or faster. This imager is integrated with software and algorithms to extract surface temperature from radiometric measurements over the range from ambient to 2000K with a precision of 20K, even without a priori knowledge of the target's emissivity, and even as the target emissivity may be changing with time and temperature. In this paper, we will present a description of the PRST system as well as laser heating test results which show the PRST system mapping target surface temperatures in the range 600-2600K on a variety of materials.

  2. On-Demand Urine Analyzer

    Science.gov (United States)

    Farquharson, Stuart; Inscore, Frank; Shende, Chetan

    2010-01-01

    A lab-on-a-chip was developed that is capable of extracting biochemical indicators from urine samples and generating their surface-enhanced Raman spectra (SERS) so that the indicators can be quantified and identified. The development was motivated by the need to monitor and assess the effects of extended weightlessness, which include space motion sickness and loss of bone and muscle mass. The results may lead to developments of effective exercise programs and drug regimes that would maintain astronaut health. The analyzer containing the lab-on-a-chip includes materials to extract 3-methylhistidine (a muscle-loss indicator) and Risedronate (a bone-loss indicator) from the urine sample and detect them at the required concentrations using a Raman analyzer. The lab-on-a-chip has both an extractive material and a SERS-active material. The analyzer could be used to monitor the onset of diseases, such as osteoporosis.

  3. Thermo Scientific Ozone Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The primary measurement output from the Thermo Scientific Ozone Analyzer is the concentration of the analyte (O3) reported at 1-s resolution in units of ppbv in ambient air. Note that because of internal pneumatic switching limitations the instrument only makes an independent measurement every 4 seconds. Thus, the same concentration number is repeated roughly 4 times at the uniform, monotonic 1-s time base used in the AOS systems. Accompanying instrument outputs include sample temperatures, flows, chamber pressure, lamp intensities and a multiplicity of housekeeping information. There is also a field for operator comments made at any time while data is being collected.
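
    A user post-processing the 1-s stream may wish to recover the independent 4-s measurements. A hypothetical downstream sketch (not part of the instrument or the AOS software) is shown below; note that genuinely equal consecutive independent readings would also be merged by this simple rule.

```python
import numpy as np

def collapse_repeats(times, o3_ppbv):
    """Collapse the ~4x-repeated 1-s ozone records to one sample per
    independent measurement by keeping only points where the reported
    concentration changes. Illustrative post-processing only.
    """
    o3 = np.asarray(o3_ppbv, dtype=float)
    keep = np.ones(o3.size, dtype=bool)
    keep[1:] = o3[1:] != o3[:-1]        # True where the value changes
    return np.asarray(times)[keep], o3[keep]
```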

  4. Software-Design-Analyzer System

    Science.gov (United States)

    Tausworthe, Robert C.

    1991-01-01

    CRISP-90 software-design-analyzer system, update of CRISP-80, is set of computer programs constituting software tool for design and documentation of other software and supporting top-down, hierarchical, modular, structured methodologies for design and programming. Written in Microsoft QuickBasic.

  5. Analyzing Software Piracy in Education.

    Science.gov (United States)

    Lesisko, Lee James

    This study analyzes the controversy of software piracy in education. It begins with a real world scenario that presents the setting and context of the problem. The legalities and background of software piracy are explained and true court cases are briefly examined. Discussion then focuses on explaining why individuals and organizations pirate…

  6. Methods of analyzing crude oil

    Science.gov (United States)

    Cooks, Robert Graham; Jjunju, Fred Paul Mark; Li, Anyin; Rogan, Iman S.

    2017-08-15

    The invention generally relates to methods of analyzing crude oil. In certain embodiments, methods of the invention involve obtaining a crude oil sample, and subjecting the crude oil sample to mass spectrometry analysis. In certain embodiments, the method is performed without any sample pre-purification steps.

  7. Analyzing Student Difficulties in Reading.

    Science.gov (United States)

    Ediger, Marlow

    According to this paper, a good reading teacher is able to analyze problems faced by students in reading and remediate that which is necessary. The paper stresses that the reading teacher needs to be a good observer of student reading habits to notice where to intervene to improve the skills and attitudes of the reader. It discusses diagnosis and…

  8. Methods for Analyzing Social Media

    DEFF Research Database (Denmark)

    Jensen, Jakob Linaa; [Ukendt], editors

    2013-01-01

    Social media is becoming increasingly attractive for users. It is a fast way to communicate ideas and a key source of information. It is therefore one of the most influential mediums of communication of our time and an important area for audience research. The growth of social media invites many new questions such as: How can we analyze social media? Can we use traditional audience research methods and apply them to online content? Which new research strategies have been developed? Which ethical research issues and controversies do we have to pay attention to? This book focuses on research strategies and methods for analyzing social media and will be of interest to researchers and practitioners using social media, as well as those wanting to keep up to date with the subject.

  9. Image analyzers for bioscience applications.

    Science.gov (United States)

    Ramm, P

    1990-01-01

    Image analysis systems are becoming more sophisticated, less costly, and very common in research laboratories. Therefore, the bioscience researcher is faced with a bewildering array of choices in establishing an image analysis facility. Critical components and characteristics of commercial image analyzers are discussed. State-of-the-art systems feature a graphical user interface, a powerful operating system (e.g., Microsoft OS/2), 1000-line image acquisition, processing and display, true color imaging, and very flexible scanner interfaces. Such systems are best suited to technically difficult applications, such as ratio fluorescence, or to automated analysis of anatomical features, particularly in stained material. Less powerful image analyzers offer medium resolution, and typically work with monochrome data acquired from video cameras. Such systems are suitable for many bioscience applications, including quantitative autoradiography and routine morphometry.

  10. Methods for Analyzing Social Media

    DEFF Research Database (Denmark)

    Jensen, Jakob Linaa

    2013-01-01

    Social media is becoming increasingly attractive for users. It is a fast way to communicate ideas and a key source of information. It is therefore one of the most influential mediums of communication of our time and an important area for audience research. The growth of social media invites many new questions such as: How can we analyze social media? Can we use traditional audience research methods and apply them to online content? Which new research strategies have been developed? Which ethical research issues and controversies do we have to pay attention to? This book focuses on research strategies and methods for analyzing social media and will be of interest to researchers and practitioners using social media, as well as those wanting to keep up to date with the subject.

  11. Analyzing Change Management in Organization

    OpenAIRE

    Yonardy, Charles; Mekel, Peggy A.

    2014-01-01

    Every company, whether a business or a non-business organization, will face change. Change happens because of technological change, industry change, and changes in institutional rules. If a company fails to adapt to changing competition, business cycles, technology, and institutional rules, it may face bankruptcy. Why? Because a successful change requires defined steps and a process of change. The research objectives are to analyze what a company should do in order to carry out change management a...

  12. Remote Laser Diffraction PSD Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2000-06-01

    Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified "off-the-shelf" classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a "hot cell" (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not previously achievable, making this technology far superior to the traditional methods used before. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the gain of tremendously useful fundamental engineering data. Successful acquisition of these data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  13. Sulfur Dioxide Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, Stephen R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-05-01

    The Sulfur Dioxide Analyzer measures sulfur dioxide based on absorbance of UV light at one wavelength by SO2 molecules, which then decay to a lower energy state by emitting UV light at a longer wavelength. Specifically: SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort Software. However, this software is somewhat cumbersome and inflexible. Brookhaven National Laboratory (BNL) has written an interface program in National Instruments LabView that both controls the Model 43i-TLE Analyzer and queries the unit for all measurement and housekeeping data. The LabView vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS and described more fully in Section 6.0 below.

  14. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Directory of Open Access Journals (Sweden)

    Mike W.-L. Cheung

    2016-05-01

    Full Text Available Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists – and probably the most crucial one – is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  15. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    Science.gov (United States)

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists-and probably the most crucial one-is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  16. 40 CFR 86.221-94 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    40 CFR 86.221-94 - Hydrocarbon analyzer calibration. Title 40, Protection of Environment; Environmental Protection Agency (continued); Air ... New Medium-Duty Passenger Vehicles; Cold Temperature Test Procedures; § 86.221-94 Hydrocarbon analyzer...

  17. The Aqueduct Global Flood Analyzer

    Science.gov (United States)

    Iceland, Charles

    2015-04-01

    As population growth and economic growth take place, and as climate change accelerates, many regions across the globe are finding themselves increasingly vulnerable to flooding. A recent OECD study of the exposure of the world's large port cities to coastal flooding found that 40 million people were exposed to a 1-in-100-year coastal flood event in 2005, and the total value of exposed assets was about US$3,000 billion, or 5% of global GDP. By the 2070s, those numbers were estimated to increase to 150 million people and US$35,000 billion, or roughly 9% of projected global GDP. Impoverished people in developing countries are particularly at risk because they often live in flood-prone areas and lack the resources to respond. WRI and its Dutch partners - Deltares, IVM-VU University Amsterdam, Utrecht University, and PBL Netherlands Environmental Assessment Agency - are in the initial stages of developing a robust set of river flood and coastal storm surge risk measures that show the extent of flooding under a variety of scenarios (both current and future), together with the projected human and economic impacts of these flood scenarios. These flood risk data and information will be accessible via an online, easy-to-use Aqueduct Global Flood Analyzer. We will also investigate the viability, benefits, and costs of a wide array of flood risk reduction measures that could be implemented in a variety of geographic and socio-economic settings. Together, the activities we propose have the potential for saving hundreds of thousands of lives and strengthening the resiliency and security of many millions more, especially those who are most vulnerable. Mr. Iceland will present Version 1.0 of the Aqueduct Global Flood Analyzer and provide a preview of additional elements of the Analyzer to be released in the coming years.

  18. Fuel analyzer; Analisador de combustiveis

    Energy Technology Data Exchange (ETDEWEB)

    Cozzolino, Roberval [RS Motors, Indaiatuba, SP (Brazil)

    2008-07-01

    The current technology, 'COMBUSTIMETRO', aims to examine the fuel through the performance of the engine, since the role of the fuel is to produce energy for the combustion engine in an amount directly proportional to the quality and type of the fuel. The 'COMBUSTIMETRO' has an engine that always keeps the same air intake, fuel supply, and fixed ignition point. Its operation is monitored by sensors (lambda probe, RPM, and gas analyzer) connected to a processor that performs calculations, records the information, and generates reports and graphs. (author)

  19. VOSA: A VO SED Analyzer

    Science.gov (United States)

    Rodrigo, C.; Bayo, A.; Solano, E.

    2017-03-01

    VOSA (VO Sed Analyzer, http://svo2.cab.inta-csic.es/theory/vosa) is a public web-tool developed by the Spanish Virtual Observatory (http://svo.cab.inta-csic.es/) and designed to help users to (1) build Spectral Energy Distributions (SEDs) combining private photometric measurements with data available in VO services, (2) obtain relevant properties of these objects (distance, extinction, etc) from VO catalogs, (3) analyze them comparing observed photometry with synthetic photometry from different collections of theoretical models or observational templates, using different techniques (chi-square minimization, Bayesian analysis) to estimate physical parameters of the observed objects (teff, logg, metallicity, stellar radius/distance ratio, infrared excess, etc), and use these results to (4) estimate masses and ages via interpolation of collections of isochrones and evolutionary tracks from the VO. In particular, VOSA offers the advantage of deriving physical parameters using all the available photometric information instead of a restricted subset of colors. The results can be downloaded in different formats or sent to other VO tools using SAMP. We have upgraded VOSA to provide access to Gaia photometry and give a homogeneous estimation of the physical parameters of thousands of objects at a time. This upgrade has required the implementation of a new computation paradigm, including a distributed environment, the capability of submitting and processing jobs in an asynchronous way, the use of parallelized computing to speed up processes (~ ten times faster) and a new design of the web interface.

  20. Analyzing the effect of cutting parameters on surface roughness and tool wear when machining nickel based hastelloy - 276

    Energy Technology Data Exchange (ETDEWEB)

    Khidhir, Basim A; Mohamed, Bashir, E-mail: Basim@student.uniten.edu.my [Department of Mechanical Engineering, College of Engineering, University Tenaga Nasional, 43009 Kajang, Selangor (Malaysia)

    2011-02-15

    Machining parameters have an important effect on tool wear and surface finish; manufacturers therefore need to obtain optimal operating parameters with a minimum number of experiments, as well as a minimum of simulations, in order to reduce machining set-up costs. The cutting speed is one of the most important cutting parameters to evaluate: on the one hand it most clearly influences tool life, tool stability, and cutting process quality, and on the other hand it controls production flow. Due to more demanding manufacturing systems, the requirements for reliable technological information have increased. A reliable analysis of cutting must consider the cutting zone (tip insert-workpiece-chip system), where the mechanics of cutting are very complicated: the chip is formed in the shear plane (entering the shear zone) and is shaped in the sliding plane. The temperatures contributed in the primary shear, chamfer, and sticking/sliding zones are expressed as a function of the unknown shear angle on the rake face and the temperature-modified flow stress in each zone. Machining experiments were carried out on a CNC lathe, and surface finish and tool tip wear were measured in process. Reasonable agreement is observed under turning with high depth of cut. The results of this research help to guide the design of new cutting tool materials and studies on the evaluation of machining parameters, to further advance the productivity of machining the nickel-based alloy Hastelloy-276.

  1. Compact Microwave Fourier Spectrum Analyzer

    Science.gov (United States)

    Savchenkov, Anatoliy; Matsko, Andrey; Strekalov, Dmitry

    2009-01-01

    A compact photonic microwave Fourier spectrum analyzer [a Fourier-transform microwave spectrometer (FTMWS)] with no moving parts has been proposed for use in remote sensing of weak, natural microwave emissions from the surfaces and atmospheres of planets, to enable remote analysis and determination of the chemical composition and abundances of critical molecular constituents in space. The instrument is based on Bessel-beam (light modes with non-zero angular momenta) fiber-optic elements. It features low power consumption, low mass, and high resolution, without a need for any cryogenics, beyond what is achievable by the current state of the art in space instruments. The instrument can also be used in a wide-band scatterometer mode in active radar systems.

  2. Analyzing Agricultural Agglomeration in China

    Directory of Open Access Journals (Sweden)

    Erling Li

    2017-02-01

    Full Text Available There has been little scholarly research on Chinese agriculture’s geographic pattern of agglomeration and its evolutionary mechanisms, which are essential to sustainable development in China. By calculating the barycenter coordinates, the Gini coefficient, spatial autocorrelation and specialization indices for 11 crops during 1981–2012, we analyze the evolutionary pattern and mechanisms of agricultural agglomeration. We argue that the degree of spatial concentration of Chinese planting has been gradually increasing and that regional specialization and diversification have progressively been strengthened. Furthermore, Chinese crop production is moving from the eastern provinces to the central and western provinces. This is in contrast to Chinese manufacturing growth which has continued to be concentrated in the coastal and southeastern regions. In Northeast China, the Sanjiang and Songnen plains have become agricultural clustering regions, and the earlier domination of aquaculture and rice production in Southeast China has gradually decreased. In summary, this paper provides a political economy framework for understanding the regionalization of Chinese agriculture, focusing on the interaction among the objectives, decisionmaking behavior, path dependencies and spatial effects.

  3. Measurement of Transient Tool Internal Temperature Fields by Novel Micro Thin Film Sensors Embedded in Polycrystalline Cubic Boron Nitride Cutting Inserts

    Science.gov (United States)

    Werschmoeller, Dirk

    Monitoring and control of thermomechanical phenomena in tooling are imperative for advancing fundamental understanding, enhancing reliability, and improving workpiece quality in material removal processes. Polycrystalline cubic boron nitride (PCBN) tools are used heavily in numerous machining processes, e.g., machining of hardened low carbon steel and superalloys. These processes are very sensitive to variations in local cutting conditions at, or close to, the tool-workpiece interface, but a thorough understanding of the fundamental transient thermomechanical phenomena present is lacking. As a result, abrupt catastrophic tool failures and degraded machined surfaces frequently occur. Existing sensors are not suitable for process control and monitoring, as they are either destructively embedded and/or do not possess the necessary spatial and temporal resolution to provide relevant data during machining. This research presents a novel approach for obtaining thermomechanical data from the close vicinity (i.e., tens of micrometers) of the tool-workpiece interface. Arrays of micro thin film thermocouples with a junction size of 5 × 5 μm were fabricated by standard microfabrication methods and have been successfully embedded into PCBN using diffusion bonding. Electron microscopy and X-ray spectroscopy were employed to examine material interactions at the bonding interface and to determine optimal bonding parameters. Static and dynamic sensor performances have been characterized. The sensors exhibit excellent linearity up to 1300 °C, a fast rise time of 150 ns, and good sensitivity. The inserts instrumented with embedded thin film C-type thermocouples were successfully applied to measure internal tool temperatures as close as 70 μm to the cutting edge while machining aluminum and hardened steel workpieces at industrially relevant cutting parameters. Acquired temperature data follow theoretical trends very well. Correlations between temperature and cutting parameters have ...

  4. Properties of Free-Machining Aluminum Alloys at Elevated Temperatures

    Science.gov (United States)

    Faltus, Jiří; Karlík, Miroslav; Haušild, Petr

    In areas close to the cutting tool the workpieces being dry machined could be heated up to 350°C and they may be impact loaded. Therefore it is of interest to study mechanical properties of corresponding materials at elevated temperatures. Free-machining alloys of Al-Cu and Al-Mg-Si systems containing Pb, Bi and Sn additions (AA2011, AA2111B, AA6262, and AA6023) were subjected to Charpy U notch impact test at the temperatures ranging from 20 to 350°C. The tested alloys show a sharp drop in notch impact strength KU at different temperatures. This drop of KU is caused by liquid metal embrittlement due to the melting of low-melting point dispersed phases which is documented by differential scanning calorimetry. Fracture surfaces of the specimens were observed using a scanning electron microscope. At room temperature, the fractures of all studied alloys exhibited similar ductile dimple fracture micromorphology, at elevated temperatures, numerous secondary intergranular cracks were observed.

  5. Finite element analysis of spot laser of steel welding temperature history

    Directory of Open Access Journals (Sweden)

    Shibib Khalid S.

    2009-01-01

    Full Text Available The laser welding process reduces the heat input to the work-piece, which is the main goal in the aerospace and electronics industries. A finite element model for axisymmetric transient heat conduction has been used to predict the temperature distribution through a steel cylinder subjected to a CW laser beam with a rectangular beam profile. Many numerical improvements were used to reduce the calculation time and the size of the program, so as to achieve the task in the minimum time required. An experimentally determined absorptivity was used to determine the heat induced when the laser interacts with the material. The heat affected zone and welding zone were estimated to determine the effect of welding on the material. The ratio of depth to width of the welding zone can be changed by proper selection of beam power to meet specific production requirements. The temperature history obtained numerically has been compared with experimental data, indicating good agreement.

  6. Portable Programmable Multifunction Body Fluids Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced Liquid Logic proposes to develop a very capable analyzer based on its digital microfluidic technology. Such an analyzer would be:  Capable of both...

  7. A mathematical approach based on finite differences method for analyzing the temperature field in arc welding of stainless steel thin sheets; Desarrollo de un modelo matematico de diferencias finitas para el analisis del campo de temperaturas en la soldadura por arco de chapas finas de acero inoxidable

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Conesa, E.J.; Estrems, M.; Miguel, V.

    2010-07-01

    This work develops a finite difference method to evaluate the temperature field in the heat affected zone in butt welding of AISI 304 stainless steel thin sheet by the GTAW process. A computer program has been developed and implemented in Visual Basic for Applications (VBA) in an MS-Excel spreadsheet. The results obtained using the numerical application predict the thermal behaviour of arc welding processes. An experimental methodology has been developed to validate the mathematical model, allowing the temperature to be measured at several points close to the weld bead. The methodology is applied to a stainless steel sheet with a thickness lower than 3 mm, although it may be used for other steels and welding processes such as MIG/MAG and SMAW. The data obtained from the experimental procedure were used to validate the results calculated by the finite difference numerical method. The mathematical model adjustment has been carried out taking into account the experimental results. The differences found between the experimental and theoretical approaches are due to convection and radiation heat losses, which have not been considered in the simulation model. With this simple model, the designer will be able to calculate the thermal cycles that take place in the process as well as predict the temperature field in the proximity of the weld bead. (Author). 18 refs.
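
    A minimal explicit finite-difference sketch of the same idea is given below, written here in Python rather than the paper's VBA/Excel implementation; the material properties, arc power, and travel speed are illustrative placeholders, and convection/radiation losses are omitted, mirroring the simplification noted above.

```python
import numpy as np

def weld_temperature_field(nx=200, ny=80, dx=1e-3, dt=1e-3, steps=2000,
                           k=16.0, rho=7900.0, cp=500.0, thick=2e-3,
                           q=500.0, speed=5e-3, t0=25.0):
    """Explicit finite-difference sketch of the in-plane temperature field
    of a thin sheet heated by a moving arc. All values are illustrative;
    np.roll gives periodic boundaries only to keep the sketch short.
    """
    alpha = k / (rho * cp)                        # thermal diffusivity, m^2/s
    assert alpha * dt / dx**2 <= 0.25, "explicit scheme stability limit"
    T = np.full((ny, nx), t0)
    for n in range(steps):
        lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
               np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4.0 * T) / dx**2
        T = T + dt * alpha * lap                  # diffusion update
        # deposit the absorbed arc power over a 3x3 cell patch at the torch
        ix = 1 + int(speed * n * dt / dx) % (nx - 2)
        patch = np.s_[ny // 2 - 1: ny // 2 + 2, ix - 1: ix + 2]
        T[patch] += dt * q / (rho * cp * 9.0 * dx * dx * thick)
    return T
```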

  8. 40 CFR 91.313 - Analyzers required.

    Science.gov (United States)

    2010-07-01

    ... gas at the sample probe is below 190 °C, the temperature of the valves, pipe work, and so forth, must... gas at the sample probe is above 190 °C, the temperature of the valves, pipe work, and so forth, must..., the detector, oven, and sample-handling components within the oven must be suitable for continuous...

  9. Temperature measurement

    Science.gov (United States)

    ... an oral temperature. Other factors to take into account are: In general, rectal temperatures are considered to ...

  10. Electrical spectrum & network analyzers a practical approach

    CERN Document Server

    Helfrick, Albert D

    1991-01-01

    This book presents fundamentals and the latest techniques of electrical spectrum analysis. It focuses on instruments and techniques used in spectrum and network analysis, rather than theory. The book covers the use of spectrum analyzers, tracking generators, and network analyzers. Filled with practical examples, the book presents techniques that are widely used in signal processing and communications applications, yet are difficult to find in most literature. Key features: presents numerous practical examples, including actual spectrum analyzer circuits; instruction on how to use ...

  11. ADAM: Analyzer for Dialectal Arabic Morphology

    Directory of Open Access Journals (Sweden)

    Wael Salloum

    2014-12-01

    Full Text Available While Modern Standard Arabic (MSA has many resources, Arabic Dialects, the primarily spoken local varieties of Arabic, are quite impoverished in this regard. In this article, we present ADAM (Analyzer for Dialectal Arabic Morphology. ADAM is a poor man’s solution to quickly develop morphological analyzers for dialectal Arabic. ADAM has roughly half the out-of-vocabulary rate of a state-of-the-art MSA analyzer and is comparable in its recall performance to an Egyptian dialectal morphological analyzer that took years and expensive resources to build.

  12. A portable, integrated analyzer for microfluidic - based molecular analysis.

    Science.gov (United States)

    Qiu, Xianbo; Chen, Dafeng; Liu, Changchun; Mauk, Michael G; Kientz, Terry; Bau, Haim H

    2011-10-01

    A portable, fully automated analyzer that provides actuation and flow control to a disposable, self-contained, microfluidic cassette ("chip") for point-of-care, molecular testing is described. The analyzer provides mechanical actuation to compress pouches that pump liquids in the cassette, to open and close diaphragm valves for flow control, and to induce vibrations that enhance stirring. The analyzer also provides thermal actuation for the temperature cycling needed for polymerase chain reaction (PCR) amplification of nucleic acids and for various drying processes. To improve the temperature uniformity of the PCR chamber, the system utilizes a double-sided heating/cooling scheme with a custom feedforward, variable, structural proportional-integral-derivative (FVSPID) controller. The analyzer includes a programmable central processing unit that directs the sequence and timing of the various operations and that is interfaced with a computer. The disposable cassette receives a sample, and it carries out cell lysis, nucleic acid isolation, concentration, and purification, thermal cycling, and either real time or lateral flow (LF) based detection. The system's operation was demonstrated by processing saliva samples spiked with B. cereus cells. The amplicons were detected with a lateral flow assay using upconverting phosphor reporter particles. This system is particularly suited for use in regions lacking centralized laboratory facilities and skilled personnel.
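
    The thermal-cycling control loop can be illustrated with a textbook PID sketch; this is a simplified stand-in for the paper's feedforward, variable, structural PID (FVSPID) controller, with hypothetical gains, limits, and usage.

```python
class PID:
    """Textbook PID loop for a PCR block heater -- a simplified stand-in
    for the paper's FVSPID scheme; gains and output limits are illustrative.
    """
    def __init__(self, kp, ki, kd, out_min=0.0, out_max=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral, self.prev_err = 0.0, None

    def update(self, setpoint, measured, dt):
        err = setpoint - measured
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        u = self.kp * err + self.ki * self.integral + self.kd * deriv
        return min(self.out_max, max(self.out_min, u))  # clamp heater duty

# Thermal cycling: feed update() the denaturation setpoint (e.g. 95 C) or
# the annealing/extension setpoint on each control tick, per PCR stage.
```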

  13. Analyzing Information in Complex Collaborative Tasks

    NARCIS (Netherlands)

    Zaad, Lambert; Dick Lenior; Els van der Pool; Thea van der Geest

    2017-01-01

    In this article, we present a method for analyzing the communication of people who exchange dynamic and complex information to come to a shared understanding of situations and of the actions planned and monitored by one party, but executed remotely by another. To examine this situation, we analyzed ...

  14. How to Analyze Paired Comparison Data

    Science.gov (United States)

    2011-05-01

    How to Analyze Paired Comparison Data. Kristi Tsukida and Maya R. Gupta, Department of Electrical Engineering, University of Washington, Seattle, WA ...

  15. Systems Analyze Water Quality in Real Time

    Science.gov (United States)

    2010-01-01

    A water analyzer developed under Small Business Innovation Research (SBIR) contracts with Kennedy Space Center now monitors treatment processes at water and wastewater facilities around the world. Originally designed to provide real-time detection of nutrient levels in hydroponic solutions for growing plants in space, the ChemScan analyzer, produced by ASA Analytics Inc., of Waukesha, Wisconsin, utilizes spectrometry and chemometric algorithms to automatically analyze multiple parameters in the water treatment process with little need for maintenance, calibration, or operator intervention. The company has experienced a compound annual growth rate of 40 percent over its 15-year history as a direct result of the technology's success.

  16. Analyzing machine noise for real time maintenance

    Science.gov (United States)

    Yamato, Yoji; Fukumoto, Yoshifumi; Kumazaki, Hiroki

    2017-02-01

    Recently, IoT technologies have progressed, and applications in the maintenance area are expected. However, IoT maintenance applications have not yet spread in Japan because of one-off sensing and analysis solutions for each case, the high cost of collecting sensing data, and insufficient maintenance automation. This paper proposes a maintenance platform which analyzes sound data at the edge, analyzes only anomaly data in the cloud, and orders maintenance automatically, to resolve these problems with existing technology. We also implement a sample application and compare it with related work.

  17. Random Bin for Analyzing Neuron Spike Trains

    Directory of Open Access Journals (Sweden)

    Shinichi Tamura

    2012-01-01

    Full Text Available When analyzing neuron spike trains, there is always the problem of how to set the time bin. Bin width strongly affects the analyzed results, such as the periodicity of the spike trains. Many approaches have been proposed to determine the bin setting; however, these bins are fixed throughout the analysis. In this paper, we propose randomizing the bin width and location instead of the conventional fixed bin setting. This technique is applied to analyzing the periodicity of an interspike interval train. The sensitivity of the method is also presented.
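
    The randomized-bin idea can be sketched as follows; the parameter names and ranges are illustrative, and the downstream periodicity analysis (e.g., averaging spectra over the ensemble of binnings) is left out.

```python
import numpy as np

def random_bin_counts(spike_times, n_trials=100,
                      width_range=(5e-3, 20e-3), rng=None):
    """Sketch of the randomized-bin idea: instead of one fixed bin grid,
    draw many (width, offset) pairs at random, histogram the spike train
    under each, and let downstream analysis average over the ensemble.
    """
    rng = np.random.default_rng(rng)
    spikes = np.asarray(spike_times, dtype=float)
    t_end = spikes.max()
    counts = []
    for _ in range(n_trials):
        width = rng.uniform(*width_range)      # random bin width
        offset = rng.uniform(0.0, width)       # random bin location
        edges = np.arange(-offset, t_end + width, width)
        counts.append(np.histogram(spikes, bins=edges)[0])
    return counts                              # one count vector per trial
```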

  18. On-line chemical composition analyzer development

    Energy Technology Data Exchange (ETDEWEB)

    Garrison, A.A.

    1993-01-01

    This report relates to the development of an on-line Raman analyzer for control of a distillation column. It is divided into: program issues, experimental control system evaluation, energy savings analysis, and reliability analysis. (DLC)

  19. On-Demand Urine Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an analyzer that can be integrated into International Space Station (ISS) toilets to measure key...

  20. Analyzing the economic impacts of transportation projects.

    Science.gov (United States)

    2013-09-01

    The main goal of the study is to explore methods, approaches and analytical software tools for analyzing economic activity that results from large-scale transportation investments in Connecticut. The primary conclusion is that the transportation...

  1. Ultrasensitive Atmospheric Analyzer for Miniature UAVs Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In this SBIR Phase I effort, Los Gatos Research (LGR) proposes to develop a highly-accurate, lightweight, low-power gas analyzer for quantification of water vapor...

  2. Low Gravity Drug Stability Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed program through Phase III is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation. It will be...

  3. Low Gravity Drug Stability Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this proposed program (through Phase III) is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation....

  4. The Effect of Process and Model Parameters in Temperature Prediction for Hot Stamping of Boron Steel

    Directory of Open Access Journals (Sweden)

    Chaoyang Sun

    2013-01-01

    Full Text Available Finite element models of the hot stamping and cold die quenching process for boron steel sheet were developed using either rigid or elastic tools. The effect of tool elasticity and process parameters on workpiece temperature was investigated. The heat transfer coefficient between blank and tools was modelled as a function of gap and contact pressure. The temperature distribution and thermal history in the blank were predicted, and the thickness distribution of the blank was obtained. Tests were carried out, and the test results are used for the validation of the numerical predictions. The effect of holding load and the size of cooling ducts on the temperature distribution during the forming and cold die quenching process was also studied using the two models. The results show that higher-accuracy predictions of blank thickness and temperature distribution during deformation were obtained using the elastic tool model. However, temperature results obtained using the rigid tool model were close to those using the elastic tool model for a range of holding loads.
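
    A gap- and pressure-dependent interface heat transfer coefficient of the kind described can be sketched as below; the functional form and all constants are illustrative placeholders, not the paper's calibrated model.

```python
import numpy as np

def contact_htc(gap, pressure, h0=1500.0, h_slope=4.0, g0=1.0e-4):
    """Blank-tool interface heat transfer coefficient (W/m^2/K) modelled
    as a function of gap (m) and contact pressure (MPa): conductance
    decays as a gap opens and grows with contact pressure. Hypothetical
    form and constants, for illustration only.
    """
    gap = np.asarray(gap, dtype=float)
    pressure = np.asarray(pressure, dtype=float)
    h_gap = h0 * np.exp(-gap / g0)              # open-gap conductance decay
    h_contact = h0 + h_slope * pressure * 1e3   # pressure-enhanced contact
    return np.where(pressure > 0.0, h_contact, h_gap)
```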

  5. Response surface and neural network based predictive models of cutting temperature in hard turning

    Directory of Open Access Journals (Sweden)

    Mozammel Mia

    2016-11-01

    Full Text Available The present study aimed to develop the predictive models of average tool-workpiece interface temperature in hard turning of AISI 1060 steels by coated carbide insert. The Response Surface Methodology (RSM and Artificial Neural Network (ANN were employed to predict the temperature in respect of cutting speed, feed rate and material hardness. The number and orientation of the experimental trials, conducted in both dry and high pressure coolant (HPC environments, were planned using full factorial design. The temperature was measured by using the tool-work thermocouple. In RSM model, two quadratic equations of temperature were derived from experimental data. The analysis of variance (ANOVA and mean absolute percentage error (MAPE were performed to suffice the adequacy of the models. In ANN model, 80% data were used to train and 20% data were employed for testing. Like RSM, herein, the error analysis was also conducted. The accuracy of the RSM and ANN model was found to be ⩾99%. The ANN models exhibit an error of ∼5% MAE for testing data. The regression coefficient was found to be greater than 99.9% for both dry and HPC. Both these models are acceptable, although the ANN model demonstrated a higher accuracy. These models, if employed, are expected to provide a better control of cutting temperature in turning of hardened steel.
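
    The RSM part of such a study amounts to fitting a full quadratic response surface by least squares; a generic sketch for three factors (cutting speed, feed rate, hardness) is given below, with no claim to reproduce the paper's fitted coefficients.

```python
import numpy as np

def fit_quadratic_rsm(X, y):
    """Least-squares fit of a full quadratic response surface
    T = b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj)
    for three factors. X is (n_samples, 3); returns the 10 coefficients
    [b0, b1, b2, b3, b11, b22, b33, b12, b13, b23].
    """
    X = np.asarray(X, dtype=float)
    x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
    A = np.column_stack([np.ones_like(x1), x1, x2, x3,
                         x1**2, x2**2, x3**2,
                         x1 * x2, x1 * x3, x2 * x3])
    coef, *_ = np.linalg.lstsq(A, np.asarray(y, dtype=float), rcond=None)
    return coef
```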

  6. Response surface and neural network based predictive models of cutting temperature in hard turning.

    Science.gov (United States)

    Mia, Mozammel; Dhar, Nikhil R

    2016-11-01

    The present study aimed to develop the predictive models of average tool-workpiece interface temperature in hard turning of AISI 1060 steels by coated carbide insert. The Response Surface Methodology (RSM) and Artificial Neural Network (ANN) were employed to predict the temperature in respect of cutting speed, feed rate and material hardness. The number and orientation of the experimental trials, conducted in both dry and high pressure coolant (HPC) environments, were planned using full factorial design. The temperature was measured by using the tool-work thermocouple. In RSM model, two quadratic equations of temperature were derived from experimental data. The analysis of variance (ANOVA) and mean absolute percentage error (MAPE) were performed to suffice the adequacy of the models. In ANN model, 80% data were used to train and 20% data were employed for testing. Like RSM, herein, the error analysis was also conducted. The accuracy of the RSM and ANN model was found to be ⩾99%. The ANN models exhibit an error of ∼5% MAE for testing data. The regression coefficient was found to be greater than 99.9% for both dry and HPC. Both these models are acceptable, although the ANN model demonstrated a higher accuracy. These models, if employed, are expected to provide a better control of cutting temperature in turning of hardened steel.

  7. Kundt's Tube: An Acoustic Gas Analyzer

    Science.gov (United States)

    Aristov, Natasha; Habekost, Gehsa; Habekost, Achim

    2011-01-01

    A Kundt tube is normally used to measure the speed of sound in gases. Therefore, from known speeds of sound, a Kundt tube can be used to identify gases and their fractions in mixtures. In these experiments, the speed of sound is determined by measuring the frequency of a standing sound wave at a fixed tube length, temperature, and pressure. This…
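
    The working relations are simple enough to state in a short sketch: adjacent nodes of the standing wave are half a wavelength apart, so v = 2·f·d, and comparing the measured speed against v = sqrt(γRT/M) for candidate gases identifies the gas. The code below is illustrative.

```python
import numpy as np

def speed_of_sound(freq_hz, node_spacing_m):
    """In a Kundt tube, adjacent nodes are half a wavelength apart,
    so v = f * lambda = f * 2 * d_node."""
    return freq_hz * 2.0 * node_spacing_m

def ideal_gas_speed(gamma, molar_mass_kg, temp_k):
    """Theoretical speed for an ideal gas, v = sqrt(gamma * R * T / M);
    comparing the measured v against candidate gases identifies the gas."""
    R = 8.314462618  # J/(mol*K)
    return np.sqrt(gamma * R * temp_k / molar_mass_kg)

# Example: air (gamma = 1.4, M = 0.02897 kg/mol) at 293 K -> ~343 m/s
# ideal_gas_speed(1.4, 0.02897, 293.0)
```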

  8. World Ocean Atlas 2005, Temperature

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — World Ocean Atlas 2005 (WOA05) is a set of objectively analyzed (1° grid) climatological fields of in situ temperature, salinity, dissolved oxygen, Apparent Oxygen...

  9. Analyzing Log Files using Data-Mining

    Directory of Open Access Journals (Sweden)

    Marius Mihut

    2008-01-01

    Full Text Available Information systems (i.e. servers, applications and communication devices create a large amount of monitoring data that are saved as log files. For analyzing them, a data-mining approach is helpful. This article presents the steps which are necessary for creating an ‘analyzing instrument’, based on an open source software called Waikato Environment for Knowledge Analysis (Weka [1]. For exemplification, a system log file created by a Windows-based operating system, is used as input file.

  10. Analyzing migration phenomena with spatial autocorrelation techniques

    Directory of Open Access Journals (Sweden)

    Giuseppe Borruso

    2013-12-01

    Full Text Available In less than one century, Italy has experienced a strong intensification of immigration, changing from a country originating large migration flows to a country that is the destination of migration flows. The aim of this paper is to examine foreign immigration in Italy, distinguishing according to the nationality of the foreigners. The spatial dimension of migration flows is analyzed in this paper using spatial autocorrelation techniques, and more particularly Local Indicators of Spatial Association (LISA), in order to analyze the highest values of a foreign group in relation to the surrounding municipalities.
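
    The global form of the statistic underlying LISA can be sketched as follows; W is a spatial weights matrix, and the code is a generic formula sketch rather than the authors' implementation (the paper's maps use the local decomposition of this statistic).

```python
import numpy as np

def morans_i(values, W):
    """Global Moran's I for a vector of regional values (e.g., the share
    of a foreign-national group per municipality) and a spatial weights
    matrix W, where W[i, j] > 0 when regions i and j are neighbours:
    I = (n / S0) * (z' W z) / (z' z), with z the mean-centred values.
    """
    x = np.asarray(values, dtype=float)
    W = np.asarray(W, dtype=float)
    z = x - x.mean()
    n = x.size
    s0 = W.sum()                     # total weight
    return (n / s0) * (z @ W @ z) / (z @ z)
```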

  11. Influence of formulated neem seed oil and jatropha curcas seed oil on wire drawing of mild steel and medium carbon steel at elevated temperatures

    Directory of Open Access Journals (Sweden)

    Mamuda Muhammad

    2016-09-01

    Full Text Available Many facets of the hot wire drawing process, despite its extensive and long-standing industrial use, remain unclear for want of systematic investigation. This work investigated the influence of formulated neem seed and jatropha seed oils as lubricants, using antimony dialkyl dithiocarbamate (ADTC) as an additive, on the wire drawing process. The suitability of the bio-based oils for friction and wear control during wire drawing was investigated using a four-ball tester. Experimental drawing, using a tungsten carbide die and the formulated lubricants, was carried out on mild steel and medium carbon steel rods (6 and 8 mm diameter, respectively) at temperatures from 20°C to 750°C on a drawing bench. The stresses and the temperature distribution profiles along the workpiece are reported. Reductions in area of up to 45%, without wire fracture, achieved in drawing the medium carbon steel, are equally reported.

  12. Laser Welding of Ultrahigh Strength Steels at Subzero Temperatures

    Science.gov (United States)

    Gerhards, Benjamin; Reisgen, Uwe; Olschok, Simon

    Ultrahigh strength steels such as press-hardened 22MnB5 and S1100QL make good construction materials; however, when fusion welded they suffer the disadvantage of softening in the heat affected zone (HAZ). State-of-the-art research on laser welding ultrahigh strength steels shows that neither increasing the welding velocity nor post-weld heat treatment has an effect on the strength and hardness drop in the HAZ. The key to improving these material characteristics is to increase heat dissipation out of the workpiece. To do so, the cooling conditions while laser welding press-hardened 22MnB5 and S1100QL were dramatically intensified. Experiments were carried out at subzero temperatures down to -98°C using a mixture of liquid and gaseous nitrogen. To further improve heat dissipation, the clamping jaws were made entirely of copper. Hardness measurements and tensile tests were conducted to compare joints welded at room temperature with those welded at -98°C. While an improvement in hardness values was achieved for the press-hardened 22MnB5 and the S1100, no change in mechanical behavior in the tensile tests was observed. Thus, there is no possibility of improving the strength levels of martensitic steels through varying process parameters (such as welding velocity) or utilizing active cooling, even if subzero temperatures down to -98°C are applied. Further improvement at lower temperatures is unlikely because heat dissipation in the workpiece itself is the limiting factor.

  13. Industrial solvents analyzed on SolGel-WAX(TM)

    National Research Council Canada - National Science Library

    Angus Hibberd; Gerard Sharp; Dan DiFeo

    2002-01-01

    ... phase-bonding process in which the polyethylene glycol phase is encapsulated into a sol-gel matrix. The SolGel-WAX column, by the nature of the sol-gel bonding, is an extremely inert, low-bleed, high-temperature column. This inertness gives excellent peak shape for difficult-to-analyze polar solvents. The high thermal stability gives a low-bleed column and therefore a higher signal-to-noise ratio, allowing lower detection limits, which is essential in many industrial processes. Also note the excellent separa...

  14. Huygens/ASI plasma wave analyzer capabilities for aerosol measurement

    Science.gov (United States)

    Borucki, William J.; Fulchignoni, Marcello

    1992-01-01

    The capabilities of the Huygens Atmospheric Structure Instrument (HASI) are described. These include measurement of atmospheric electrical conductivity by the plasma wave analyzer. How measurement of the atmospheric conductivity can lead to an estimate of the total surface area of the aerosols as a function of altitude is outlined. It is concluded that since the ASI is designed to measure the electrical conductivity as well as the pressure and temperature as a function of altitude, its results should provide a useful check on other instruments that are designed to measure the abundance of aerosols in a limited range of sizes.

  15. Analyzing the Control Structure of PEPA

    DEFF Research Database (Denmark)

    Yang, Fan; Nielson, Hanne Riis

    ... to PEPA programs, the approximation result is very precise. Based on the analysis, we also develop algorithms for validating the deadlock property of PEPA programs. The techniques have been implemented in a tool which is able to analyze processes with a control structure of more than one thousand states....

  16. Construction of an Ion Energy Analyzer

    Science.gov (United States)

    Little, Steven; Bellan, Paul

    1999-11-01

    An ion energy analyzer is being constructed and will be used to observe energetic ions emitted by the solar prominence simulation experiment at Caltech. The analyzer contains three stacked grids that are mounted on modular frames from Kimball Physics (eV Parts), followed by a collector plate. The first grid is negatively biased to repel electrons. The second grid is positively biased (discriminator grid) and is varied to filter the velocity distribution and give information about ion energy. The third grid is negatively biased relative to the collector to suppress secondary electrons resulting from ions striking the collector. The collector plate is also negatively biased, and the ion flux impinging on it is measured. The electroformed mesh grid has spacings of the order of the Debye length. A bake-out heating element is also incorporated by using small molybdenum wire that can be resistively heated. The entire analyzer is designed for ease of assembly/disassembly and is mounted on a soft copper tube that can be bent to point in different directions. The analyzer will be located about 1 meter from the main plasma and has a diameter of 3 cm and a length of 2 cm.
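
    For a retarding-grid analyzer of this kind, the ion energy distribution is commonly recovered from the sweep of collector current versus discriminator voltage, since the distribution is proportional to the negative derivative of that curve. The sketch below is a minimal illustration of that recovery step, using a synthetic I-V sweep rather than data from this instrument.

```python
import numpy as np

# Synthetic retarding-potential sweep: collector current vs discriminator voltage.
# Assumed here: a drifting ion population (20 V beam energy, 5 V spread).
v = np.linspace(0.0, 60.0, 200)                    # discriminator bias, volts
current = 0.5 * (1 - np.tanh((v - 20.0) / 5.0))    # collector current, arbitrary units

# The ion energy distribution (per unit charge) is proportional to -dI/dV.
distribution = -np.gradient(current, v)

peak_ev = v[np.argmax(distribution)]
print(f"inferred beam energy: ~{peak_ev:.1f} eV per unit charge")
```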

  17. Graphic method for analyzing common path interferometers

    DEFF Research Database (Denmark)

    Glückstad, J.

    1998-01-01

    Common path interferometers are widely used for visualizing phase disturbances and fluid flows. They are attractive because of the inherent simplicity and robustness in the setup. A graphic method will be presented for analyzing and optimizing filter parameters in common path interferometers....

  18. Analyzing the Information Economy: Tools and Techniques.

    Science.gov (United States)

    Robinson, Sherman

    1986-01-01

    Examines methodologies underlying studies which measure the information economy and considers their applicability and limitations for analyzing policy issues concerning libraries and library networks. Two studies provide major focus for discussion: Porat's "The Information Economy: Definition and Measurement" and Machlup's…

  19. Thermal and Evolved-Gas Analyzer Illustration

    Science.gov (United States)

    2008-01-01

    This is a computer-aided drawing of the Thermal and Evolved-Gas Analyzer, or TEGA, on NASA's Phoenix Mars Lander. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  20. Analyzing and Interpreting Research in Health Education ...

    African Journals Online (AJOL)

    While qualitative research is used when little or nothing is known about the subject, quantitative research is required when there are quantifiable variables to be measured. By implication, health education research is based on phenomenological, ethnographical and/or grounded theoretical approaches that are analyzable ...

  1. Total Cost Management: Analyzing Operational Support Costs.

    Science.gov (United States)

    Jenny, Hans J.

    1996-01-01

    Total cost management, an innovation useful in higher education, is best implemented in the institution's support services. Total cost management is the practice of analyzing and improving an institution's financial and qualitative performance when producing a particular product or service, paying attention to the complete work process and all…

  2. Analyzing Vessel Behavior Using Process Mining

    NARCIS (Netherlands)

    Maggi, F.M.; Mooij, A.J.; Aalst, W.M.P. van der

    2013-01-01

    In the maritime domain, electronic sensors such as AIS receivers and radars collect large amounts of data about the vessels in a certain geographical area. We investigate the use of process mining techniques for analyzing the behavior of the vessels based on these data. In the context of maritime

  3. How to Analyze Company Using Social Network?

    Science.gov (United States)

    Palus, Sebastian; Bródka, Piotr; Kazienko, Przemysław

    Every company or institution wants to utilize its resources in the most efficient way. In order to do so, it has to have a good structure. This paper presents a new way to analyze company structure by utilizing the natural social network that exists within the company, with an example of its usage on the Enron company.

  4. Quantifying the Analyzability of Software Architectures

    NARCIS (Netherlands)

    Bouwers, E.M.; Correia, J.P.; Van Deursen, A.; Visser, J.

    2011-01-01

    The decomposition of a software system into components is a major decision in any software architecture, having a strong influence on many of its quality aspects. A system’s analyzability, in particular, is influenced by its decomposition into components. But into how many components should a system

  5. Consideration Regarding Diagnosis Analyze of Corporate Management

    Directory of Open Access Journals (Sweden)

    Mihaela Ciopi OPREA

    2009-01-01

    Full Text Available Diagnosis management aims to identify critical situations and positive aspects of corporate management. An effective diagnosis, made by a team that is independent of the organization's management, provides managers with useful feedback for improving performance. The work presented focuses on the methodology for achieving an effective diagnosis, considering the multitude of criteria and variables to be analyzed.

  6. Assessment of the Sperm Quality Analyzer.

    Science.gov (United States)

    Johnston, R C; Clarke, G N; Liu, D Y; Baker, H W

    1995-05-01

    To assess the relationship between the results of the Sperm Quality Analyzer (United Medical Systems Inc., Santa Ana, CA), which measures motile sperm concentration by light scattering, conventional manual semen analysis characteristics, and computer-assisted sperm motility analyses. Sperm Quality Analyzer measurements and manual and computer-assisted semen analyses were performed on 150 (50, 62, and 38) samples in three laboratories and the results were compared. The study was performed in the Andrology Laboratory of Prince Henry's Institute of Medical Research, Monash Medical Centre, and Andrology Laboratory and Reproductive Biology Unit at the Royal Women's Hospital, Melbourne, Victoria, Australia. Patients presented to the laboratories for routine fertility evaluation in the male and were selected at random to reflect the range of normal and abnormal samples seen in the laboratories. None. Sperm count, motility (percent motility, motility index, velocity, and amplitude of lateral head displacement [ALH]), morphology, and normal acrosomes were evaluated by manual and computer-assisted semen analysis and sperm quality analyzer motility index. Spearman nonparametric univariate analysis showed strong correlations between sperm motility index and manual sperm concentration, motility, abnormal morphology, and normal acrosomes by Pisum sativum agglutinin; and computer-assisted sperm motility analysis sperm concentration, motile concentration, and percent static. Curvilinear velocity, straight-line velocity (VSL), and linearity also were related significantly to sperm motility index values. By multiple regression analysis, the significant covariates of the sperm motility index were motile sperm concentration, abnormal morphology, ALH, and straight-line velocity and these accounted for 85.5% of the variance of the sperm motility index. The Sperm Quality Analyzer is easy to use. The good correlation between the sperm motility index, motile sperm concentration, and, in

  7. Calibration of optical particle-size analyzer

    Science.gov (United States)

    Pechin, William H.; Thacker, Louis H.; Turner, Lloyd J.

    1979-01-01

    This invention relates to a system for the calibration of an optical particle-size analyzer of the light-intercepting type for spherical particles, wherein a rotary wheel or disc is provided with radially-extending wires of differing diameters, each wire corresponding to a particular equivalent spherical particle diameter. These wires are passed at an appropriate frequency between the light source and the light detector of the analyzer. The reduction of light as received at the detector is a measure of the size of the wire, and the electronic signal may then be adjusted to provide the desired signal for corresponding spherical particles. This calibrator may be operated at any time without interrupting other processing.
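
    Each wire's mapping to an equivalent spherical diameter follows from light-interception geometry: a wire of diameter d crossing a beam of effective width w blocks roughly d·w of beam area, while a sphere of diameter D blocks πD²/4. Equating the two gives the sketch below; the beam width is an assumed value, not a figure from the patent.

```python
import math

def equivalent_sphere_diameter(wire_d_um, beam_width_um=500.0):
    """Equivalent spherical diameter for a wire crossing a light beam.

    Assumes the blocked light is proportional to intercepted area:
    a wire blocks d * w, a sphere blocks pi * D^2 / 4.
    """
    return math.sqrt(4.0 * wire_d_um * beam_width_um / math.pi)

for d in (10.0, 25.0, 50.0, 100.0):  # wire diameters on the calibration wheel, um
    print(f"wire {d:5.1f} um -> equivalent sphere {equivalent_sphere_diameter(d):6.1f} um")
```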

  8. CRIE: An automated analyzer for Chinese texts.

    Science.gov (United States)

    Sung, Yao-Ting; Chang, Tao-Hsing; Lin, Wei-Chun; Hsieh, Kuan-Sheng; Chang, Kuo-En

    2016-12-01

    Textual analysis has been applied to various fields, such as discourse analysis, corpus studies, text leveling, and automated essay evaluation. Several tools have been developed for analyzing texts written in alphabetic languages such as English and Spanish. However, currently there is no tool available for analyzing Chinese-language texts. This article introduces a tool for the automated analysis of simplified and traditional Chinese texts, called the Chinese Readability Index Explorer (CRIE). Composed of four subsystems and incorporating 82 multilevel linguistic features, CRIE is able to conduct the major tasks of segmentation, syntactic parsing, and feature extraction. Furthermore, the integration of linguistic features with machine learning models enables CRIE to provide leveling and diagnostic information for texts in language arts, texts for learning Chinese as a foreign language, and texts with domain knowledge. The usage and validation of the functions provided by CRIE are also introduced.

  9. Analyzing negative ties in social networks

    Directory of Open Access Journals (Sweden)

    Mankirat Kaur

    2016-03-01

    Full Text Available Online social networks are a source of sharing information and maintaining personal contacts with other people through social interactions, thus forming virtual communities online. Social networks are crowded with positive and negative relations. Positive relations are formed by support, endorsement and friendship, and thus create a network of well-connected users, whereas negative relations are a result of opposition, distrust and avoidance, creating disconnected networks. Due to the increase in illegal activities such as masquerading, conspiring and creating fake profiles on online social networks, exploring and analyzing these negative activities has become the need of the hour. Negative ties are usually treated in the same way as positive ties in many theories, such as balance theory and blockmodeling analysis, but the standard concepts of social network analysis do not yield the same results for each kind of tie. This paper presents a survey on analyzing negative ties in social networks through the various types of network analysis techniques used for examining ties, such as status, centrality and power measures. Due to the difference in the characteristics of flow in positive- and negative-tie networks, some of these measures are not applicable to negative ties. The paper also discusses new methods that have been developed specifically for analyzing negative ties, such as negative degree and the h∗ measure, along with measures based on a mixture of positive and negative ties. The different types of social network analysis approaches are reviewed and compared to determine the best approach for identifying negative ties in online networks. The analysis shows that only a few measures, such as degree and PN centrality, are applicable for identifying outsiders in a network. For applicability in online networks, the performance of the PN measure needs to be verified, and new measures should be developed based upon the negative-clique concept.
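
    As a minimal illustration of one of the simpler measures discussed, the sketch below computes positive and negative degree on a small signed network; the toy edge list is invented for the example, and this is plain degree counting rather than the PN centrality described in the paper.

```python
from collections import defaultdict

# Toy signed edge list: (node, node, sign); +1 friendship/support, -1 distrust.
edges = [("a", "b", +1), ("a", "c", +1), ("b", "c", -1),
         ("c", "d", -1), ("d", "a", +1)]

pos_deg = defaultdict(int)
neg_deg = defaultdict(int)
for u, v, sign in edges:
    deg = pos_deg if sign > 0 else neg_deg
    deg[u] += 1
    deg[v] += 1

# Net signed degree: a high negative degree flags potential "outsiders".
for node in sorted(set(pos_deg) | set(neg_deg)):
    print(node, "positive:", pos_deg[node], "negative:", neg_deg[node],
          "net:", pos_deg[node] - neg_deg[node])
```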

  10. Statistical network analysis for analyzing policy networks

    DEFF Research Database (Denmark)

    Robins, Garry; Lewis, Jenny; Wang, Peng

    2012-01-01

    To analyze social network data using standard statistical approaches is to risk incorrect inference. The dependencies among observations implied in a network conceptualization undermine standard assumptions of the usual general linear models. One of the most quickly expanding areas of social...... and policy network methodology is the development of statistical modeling approaches that can accommodate such dependent data. In this article, we review three network statistical methods commonly used in the current literature: quadratic assignment procedures, exponential random graph models (ERGMs...

  11. Analyzing Worms and Network Traffic using Compression

    OpenAIRE

    Wehner, Stephanie

    2005-01-01

    Internet worms have become a widespread threat to system and network operations. In order to fight them more efficiently, it is necessary to analyze newly discovered worms and attack patterns. This paper shows how techniques based on Kolmogorov Complexity can help in the analysis of internet worms and network traffic. Using compression, different species of worms can be clustered by type. This allows us to determine whether an unknown worm binary could in fact be a later version of an existin...
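
    Compression-based clustering of this kind is typically built on the normalized compression distance, NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)), where C(·) is a compressed length approximating Kolmogorov complexity. The sketch below assumes zlib as the compressor and toy byte strings in place of real worm binaries.

```python
import zlib

def clen(data: bytes) -> int:
    """Compressed length, used as an approximation of Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: ~0 for near-identical data, ~1 for unrelated."""
    cx, cy, cxy = clen(x), clen(y), clen(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Toy stand-ins for worm binaries: two variants of one "family" and one outlier.
worm_a = b"GET /default.ida?NNNN" * 40
worm_b = b"GET /default.ida?XNNN" * 40
other = bytes(range(256)) * 4

print("a vs b:", round(ncd(worm_a, worm_b), 3))     # expected: small (same family)
print("a vs other:", round(ncd(worm_a, other), 3))  # expected: near 1
```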

  12. Analyzing the Existing Undergraduate Engineering Leadership Skills

    OpenAIRE

    Hamed M. Almalki; Luis Rabelo; Charles Davis; Hammad Usmani; Debra Hollister; Alfonso Sarmiento

    2016-01-01

    Purpose: Studying and analyzing the undergraduate engineering students' leadership skills to discover their potential leadership strengths and weaknesses. This study will unveil potential ways to enhance the ways we teach engineering leadership. The research has great insights that might assist engineering programs to improve curricula for the purpose of better engineering preparation to meet industry's demands. Methodology and Findings: 441 undergraduate engineering students have been s...

  13. Monitoring and Analyzing a Game Server Scenario

    OpenAIRE

    Jelmert, Stian Opsahl

    2008-01-01

    Master's thesis in network and system administration. Today, most literature about services in system administration is about conventional services like email servers. How could one monitor and analyze a scenario where the service in question is a game server? As these two services are technologically different, conventional monitoring tools may miss vital information in the context of game servers. This thesis focuses on developing a monitoring system for a game server in order to...

  14. MORPHOLOGICAL ANALYZER MYSTEM 3.0

    Directory of Open Access Journals (Sweden)

    A. I. Zobnin

    2015-01-01

    Full Text Available A large part of the Russian National Corpus has automatic morphological markup. It is based on the morphological analyzer Mystem, developed at Yandex, with some postprocessing of the results (for example, all indeclinable nouns acquire the tag '0', verbs are divided into separate paradigms by aspect, etc.). Recently a new (third) version of Mystem has been released (see https://tech.yandex.ru/mystem/). In this article we give an overview of its capabilities.
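
    A common way to call Mystem from Python is the pymystem3 wrapper, which downloads the Mystem binary and exposes lemmatization and full morphological analysis. The snippet below sketches that typical usage (the wrapper, not Mystem 3.0 itself, is the assumption here), with a short Russian sample phrase.

```python
# pip install pymystem3  (wraps the Yandex Mystem binary; downloads it on first use)
from pymystem3 import Mystem

m = Mystem()
text = "Красивая мама красиво мыла раму"  # sample Russian sentence

# Lemmatize: returns a list of lemmas and separators.
print("".join(m.lemmatize(text)))

# Full analysis: each token comes with its grammatical interpretation(s).
for token in m.analyze(text):
    if token.get("analysis"):
        print(token["text"], "->", token["analysis"][0]["gr"])
```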

  15. Analyzing Consumer Behavior Towards Contemporary Food Retailers

    OpenAIRE

    E.Dursun; M.O. Azabagaoglu

    2008-01-01

    The objective of this research is to analyze consumer behavior toward contemporary food retailers. Food retailing has been changing in recent years in Turkey, and foreign investors have been captivated by the market potential of food retailing. Retail formats have changed, and large-scale retailers featuring extended product variety and full service are spreading rapidly nationwide. Consumers tend to shop for their household needs at contemporary retailers due mainly to urbanism, ...

  16. Analyzing Reliability Change in Legal Case

    OpenAIRE

    Jirakunkanok, Pimolluck; Sano, Katsuhiko; Tojo, Satoshi

    2015-01-01

    A consideration of the reliability plays a significant role in agent communication. An agent can change her belief about the reliability ordering between the other agents with respect to new incoming information. In order to analyze reliability change of an agent, this paper proposes a logical formalization with two dynamic operators, i.e., downgrade and upgrade operators. The downgrade operator allows an agent to downgrade some specified agents to be less reliable corresponding to the degree...

  17. Upgrade of the mini spectrum analyzer

    Science.gov (United States)

    Montebugnoli, Stelio; Bortolotti, Claudio; Buttaccio, Salvo; Cattani, Alessandro; Maccaferri, Andrea; Maccaferri, Giuseppe; Miani, Cristiano; Orfei, Alessandro; Roma, Mauro; Tuccari, Gino; Amico, Nicola D.; Grueff, Gavril

    1997-01-01

    The upgrade of the mini spectrum analyzer, built at the Medicina radiotelescope station laboratories and devoted to the Jupiter-SL9 impact in July 1994, is presented. The new version of the spectrometer allows precise spectroscopy measurements, and it has just been used for the Comet Hyakutake observations (May 1996) with very promising results. The same system could be used in small SETI activities, with a possible future involvement of the Medicina/Noto antennas in this program.

  18. Analyzing Gender Stereotyping in Bollywood Movies

    OpenAIRE

    Madaan, Nishtha; Mehta, Sameep; Agrawaal, Taneea S; Malhotra, Vrinda; Aggarwal, Aditi; Saxena, Mayank

    2017-01-01

    The presence of gender stereotypes in many aspects of society is a well-known phenomenon. In this paper, we focus on studying such stereotypes and bias in Hindi movie industry (Bollywood). We analyze movie plots and posters for all movies released since 1970. The gender bias is detected by semantic modeling of plots at inter-sentence and intra-sentence level. Different features like occupation, introduction of cast in text, associated actions and descriptions are captured to show the pervasiv...

  19. Semantic analyzability in children's understanding of idioms.

    Science.gov (United States)

    Gibbs, R W

    1991-06-01

    This study investigated the role of semantic analyzability in children's understanding of idioms. Kindergartners and first, third, and fourth graders listened to idiomatic expressions either alone or at the end of short story contexts. Their task was to explain verbally the intended meanings of these phrases and then to choose their correct idiomatic interpretations. The idioms presented to the children differed in their degree of analyzability. Some idioms were highly analyzable or decomposable, with the meanings of their parts contributing independently to their overall figurative meanings. Other idioms were nondecomposable because it was difficult to see any relation between a phrase's individual components and the idiom's figurative meaning. The results showed that younger children (kindergartners and first graders) understood decomposable idioms better than they did nondecomposable phrases. Older children (third and fourth graders) understood both kinds of idioms equally well in supporting contexts, but were better at interpreting decomposable idioms than they were at understanding nondecomposable idioms without contextual information. These findings demonstrate that young children better understand idiomatic phrases whose individual parts independently contribute to their overall figurative meanings.

  20. Thermo Scientific Sulfur Dioxide Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The Sulfur Dioxide Analyzer measures sulfur dioxide based on absorbance of UV light at one wavelength by SO2 molecules which then decay to a lower energy state by emitting UV light at a longer wavelength. Specifically, SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort Software. However, this software is somewhat cumbersome and inflexible. BNL has written an interface program in National Instruments LabView that both controls the Model 43i-TLE Analyzer AND queries the unit for all measurement and housekeeping data. The LabView vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS and described more fully in Section 6.0 below.
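
    Since the emitted UV light is proportional to the SO2 concentration in the optical cell, converting detector counts to a mixing ratio reduces to a linear calibration. The sketch below uses made-up calibration constants (zero offset and span), not values from the Model 43i-TLE.

```python
def so2_ppb(fluorescence_counts, zero_counts=120.0, counts_per_ppb=35.0):
    """Linear fluorescence calibration: concentration = (signal - zero) / span.

    zero_counts and counts_per_ppb are hypothetical calibration constants,
    determined in practice from zero air and a known SO2 span gas.
    """
    return (fluorescence_counts - zero_counts) / counts_per_ppb

print(f"{so2_ppb(820.0):.1f} ppb")  # -> 20.0 ppb for the assumed constants
```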

  1. Visual analyzer as anticipatory system (functional organization)

    Science.gov (United States)

    Kirvelis, Dobilas

    2000-05-01

    A hypothetical functional organization of the visual analyzer is presented. Visual perception, the anatomical and morphological structure of animal visual systems, and neurophysiological, psychological and psychophysiological data, interpreted in the light of a number of theoretical solutions for image recognition and the simulation of visual processes, point to active information processing. The activities in specialized areas of cortex include focused attention, prediction with analysis and synthesis of visual scenes, and predictive mental images. In the projection zone of the visual cortex, Area Striata or V1, a "sensory" screen (SS) and a "reconstruction" screen (RS) are supposed to exist. The functional structure of the visual analyzer consists of: analysis of visual scenes projected onto the SS; "tracing" of images; preliminary recognition; reverse image reconstruction onto the RS; comparison of images projected onto the SS with images reconstructed onto the RS; and "correction" of the preliminary recognition. Special attention is paid to the quasiholographic principles of neuronal organization within the brain, of image "tracing," and of reverse image reconstruction. Tachistoscopic experiments revealed that the duration of one such hypothesis-testing cycle of the human visual analyzer is about 8-10 milliseconds.

  2. Optical multichannel analyzer techniques for high resolution optical spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Chao, J.L.

    1980-06-01

    The development of optical multichannel analyzer techniques for UV/VIS spectroscopy is presented. The research focuses on the development of spectroscopic techniques for measuring high resolution spectral lineshape functions from the exciton phosphorescence in H2-1,2,4,5-tetrachlorobenzene. It is found that the temperature dependent frequency shifts and widths confirm a theoretical model based on an exchange theory. The exchange of low energy phonon modes which couple with excited state exciton transitions is shown to display the proper temperature dependent behavior. In addition to the techniques for using the optical multichannel analyzer (OMA) to perform low light level target integration, the use of the OMA for capturing spectral information in transient pulsed laser applications is discussed. An OMA data acquisition system developed for real-time signal processing is described. Both hardware and software interfacing considerations for control and data acquisition by a microcomputer are described. The OMA detector is described in terms of the principles behind its photoelectron detection capabilities, and its design is compared with other optoelectronic devices.

  3. Development of a test facility for analyzing supercritical fluid blowdown

    Energy Technology Data Exchange (ETDEWEB)

    Roberto, Thiago D.; Alvim, Antonio C.M., E-mail: thiagodbtr@gmail.com [Coordenacao dos Programas de Pos-Graduacao em Engenharia (PEN/COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear; Silva, Mario A.B. da, E-mail: mabs500@gmail.com [Universidade Federal de Pernambuco (CTG/UFPE), Recife, PE (Brazil). Departamento de Energia Nuclear; Lapa, Celso M.F., E-mail: lapa@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2015-07-01

    The generation IV nuclear reactors under development mostly use supercritical fluids as the working fluid because higher temperatures improve the thermal efficiency. Supercritical fluids are used by modern nuclear power plants to achieve thermal efficiencies of around 45%. With water as the supercritical working fluid, these plants operate at a high temperature and pressure. However, experiments on supercritical water are limited by technical and financial difficulties. These difficulties can be overcome by using model fluids, which have more feasible supercritical conditions and exhibit a lower critical pressure and temperature. Experimental research is normally used to determine the conditions under which model fluids represent supercritical fluids under steady-state conditions. A fluid-to-fluid scaling approach has been proposed to determine model fluids that can represent supercritical fluids in a transient state. This paper presents an application of fractional scale analysis to determine the simulation parameters for a depressurization test facility. Carbon dioxide (CO2) and R134a were considered as the model fluids because their critical point conditions are more feasible than those of water. The similarities of water (prototype), CO2 (model) and R134a (model) for depressurization in a pressure vessel were analyzed. (author)
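
    The feasibility argument rests on the critical points: a model fluid is attractive when matching reduced conditions (T/Tc, p/pc) requires far milder absolute temperatures and pressures than water. The sketch below compares the three fluids using their well-known critical constants; the chosen reduced state (Tr = 1.05, pr = 1.10) is an arbitrary illustration, not a condition from the paper.

```python
# Critical constants: (Tc in K, pc in MPa)
fluids = {
    "water": (647.1, 22.06),
    "CO2":   (304.1, 7.38),
    "R134a": (374.2, 4.06),
}

Tr, pr = 1.05, 1.10  # an illustrative supercritical reduced state

for name, (tc, pc) in fluids.items():
    t_abs = Tr * tc  # absolute temperature needed, K
    p_abs = pr * pc  # absolute pressure needed, MPa
    print(f"{name:6s}: T = {t_abs:6.1f} K ({t_abs - 273.15:6.1f} C), p = {p_abs:5.2f} MPa")
```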

  4. Sensing temperature.

    Science.gov (United States)

    Sengupta, Piali; Garrity, Paul

    2013-04-22

    Temperature is an omnipresent physical variable reflecting the rotational, vibrational and translational motion of matter, what Richard Feynman called the "jiggling" of atoms. Temperature varies across space and time, and this variation has dramatic effects on the physiology of living cells. It changes the rate and nature of chemical reactions, and it alters the configuration of the atoms that make up nucleic acids, proteins, lipids and other biomolecules, significantly affecting their activity. While life may have started in a "warm little pond", as Charles Darwin mused, the organisms that surround us today have only made it this far by devising sophisticated systems for sensing and responding to variations in temperature, and by using these systems in ways that allow them to persist and thrive in the face of thermal fluctuation. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Air sampling unit for breath analyzers

    Science.gov (United States)

    Szabra, Dariusz; Prokopiuk, Artur; Mikołajczyk, Janusz; Ligor, Tomasz; Buszewski, Bogusław; Bielecki, Zbigniew

    2017-11-01

    The paper presents a portable breath sampling unit (BSU) for human breath analyzers. The developed unit can be used to probe air from the upper airway and alveolar for clinical and science studies. The BSU is able to operate as a patient interface device for most types of breath analyzers. Its main task is to separate and to collect the selected phases of the exhaled air. To monitor the so-called I, II, or III phase and to identify the airflow from the upper and lower parts of the human respiratory system, the unit performs measurements of the exhaled CO2 (ECO2) in the concentration range of 0%-20% (0-150 mm Hg). It can work in both on-line and off-line modes according to American Thoracic Society/European Respiratory Society standards. A Tedlar bag with a volume of 5 dm3 is mounted as a BSU sample container. This volume allows us to collect ca. 1-25 selected breath phases. At the user panel, each step of the unit operation is visualized by LED indicators. This helps us to regulate the natural breathing cycle of the patient. There is also an operator's panel to ensure monitoring and configuration setup of the unit parameters. The operation of the breath sampling unit was preliminarily verified using the gas chromatography/mass spectrometry (GC/MS) laboratory setup. At this setup, volatile organic compounds were extracted by solid phase microextraction. The tests were performed by the comparison of GC/MS signals from both exhaled nitric oxide and isoprene analyses for three breath phases. The functionality of the unit was proven because there was an observed increase in the signal level in the case of the III phase (approximately 40%). The described work made it possible to construct a prototype of a very efficient breath sampling unit dedicated to breath sample analyzers.
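
    Phase identification in such units is driven by the capnogram: low CO2 marks dead-space air (phase I), a steep rise marks mixed air (phase II), and a plateau marks alveolar air (phase III). The sketch below classifies samples of a CO2 waveform; the threshold values are assumptions for illustration, not the BSU's actual settings.

```python
def classify_phase(co2_percent, rise_rate):
    """Classify a capnogram sample into breath phases I-III.

    Thresholds are assumed for illustration: a real device calibrates
    these per ATS/ERS-style protocols and per patient.
    """
    if co2_percent < 0.5:
        return "I (dead space)"
    if rise_rate > 0.5:          # %CO2 per 100 ms, still climbing steeply
        return "II (transition)"
    return "III (alveolar)"

# Samples: (CO2 %, rise rate) along one exhalation.
for sample in [(0.1, 0.05), (2.0, 1.2), (4.8, 0.1)]:
    print(sample, "->", classify_phase(*sample))
```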

  6. IRISpy: Analyzing IRIS Data in Python

    Science.gov (United States)

    Ryan, Daniel; Christe, Steven; Mumford, Stuart; Baruah, Ankit; Timothy, Shelbe; Pereira, Tiago; De Pontieu, Bart

    2017-08-01

    IRISpy is a new community-developed open-source software library for analysing IRIS level 2 data. It is written in Python, a free, cross-platform, general-purpose, high-level programming language. A wide array of scientific computing software packages have already been developed in Python, from numerical computation (NumPy, SciPy, etc.), to visualization and plotting (matplotlib), to solar-physics-specific data analysis (SunPy). IRISpy is currently under development as a SunPy-affiliated package, which means it depends on the SunPy library, follows similar standards and conventions, and is developed with the support of the SunPy development team. IRISpy has two primary data objects, one for analyzing slit-jaw imager data and another for analyzing spectrograph data. Both objects contain basic slicing, indexing, plotting, and animating functionality to allow users to easily inspect, reduce and analyze the data. As part of this functionality the objects can output SunPy Maps, TimeSeries, Spectra, etc. of relevant data slices for easier inspection and analysis. Work is also ongoing to provide additional data analysis functionality, including derivation of systematic measurement errors (e.g., readout noise), exposure time correction, residual wavelength calibration, radiometric calibration, and fine-scale pointing corrections. IRISpy's code base is publicly available through github.com and can be contributed to by anyone. In this poster we demonstrate IRISpy's functionality and the future goals of the project. We also encourage interested users to become involved in further developing IRISpy.

  7. Bifocal: A Multifunctional, Next Generation Electrostatic Analyzer

    Science.gov (United States)

    Andreone, G. D.; Halekas, J. S.

    2016-12-01

    We describe the design and initial development of a next generation charged particle analyzer capable of taking both routine survey measurements and targeted high angular resolution measurements of the particle distribution. Space physics missions are constrained by both mass and power considerations. Each instrument on a spacecraft must maximize its usefulness while minimizing the drain on resources. The proposed Bifocal electrostatic analyzer fulfills this requirement by making both coarse and fine resolution in-situ electron measurements. Bifocal is a modified tophat analyzer with 2 sets of electrostatic deflectors which divide the entrance of the instrument into two distinct apertures. The top aperture makes fine measurements that allow a detailed look at fine-scale features of the plasma such as loss cones. The lower aperture makes coarse measurements. We performed extensive computer simulations to optimize the angular resolution of the Bifocal sensor. Following the optics, transmitted charged particles hit a microchannel plate (MCP) detector. Below the MCPs, Bifocal utilizes multiple imaging anodes to achieve fine azimuthal resolution. To optimize detection efficiency and imaging resolution, we performed simulations varying both voltage and distance between the MCP exit face and the anodes. Fine azimuthal resolution within the fine aperture is achieved using imaging anodes. Each anode is divided into two sections with multiple wedge electrodes, with each section attached to a separate preamplifier whose signals provide the inputs to a signal divider circuit. Using the normalized signal difference between the two parts of the circuit, Bifocal determines the azimuthal location of incident particles to high accuracy. We describe the results of initial design and testing of the preamplifier and divider circuitry.
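
    The position readout described is a classic charge-division scheme: with two wedge-shaped electrode sets, the normalized difference of the two preamplifier signals varies monotonically with where the charge cloud lands. A minimal sketch, assuming an idealized linear wedge geometry rather than Bifocal's actual anode layout:

```python
def azimuth_from_charges(q_a, q_b, span_deg=22.5):
    """Estimate azimuthal position within one anode from divided charges.

    Assumes an idealized linear wedge pair: the fraction of charge on
    electrode A varies linearly from 0 to 1 across the anode's angular span.
    """
    ratio = (q_a - q_b) / (q_a + q_b)      # normalized difference in [-1, 1]
    return 0.5 * (ratio + 1.0) * span_deg  # map to [0, span_deg] degrees

# An event depositing 60% of its charge on electrode A:
print(f"{azimuth_from_charges(0.6, 0.4):.1f} deg within the anode")
```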

  8. Mass spectrometer calibration of Cosmic Dust Analyzer

    Science.gov (United States)

    Ahrens, Thomas J.; Gupta, Satish C.; Jyoti, G.; Beauchamp, J. L.

    2003-02-01

    The time-of-flight (TOF) mass spectrometer (MS) of the Cosmic Dust Analyzer (CDA) instrument aboard the Cassini spacecraft is expected to be placed in orbit about Saturn to sample submicrometer-diameter ring particles and impact ejecta from Saturn's satellites. The CDA measures a mass spectrum of each particle that impacts the chemical analyzer sector of the instrument. Particles impact a Rh target plate at velocities of 1-100 km/s and produce some 10^-8 to 10^-5 times the particle mass of positive valence, single-charged ions. These are analyzed via a TOF MS. In initial tests, a pulsed N2 laser acting on samples of kamacite, pyrrhotite, serpentine, olivine, and Murchison meteorite induced bursts of ions, which were detected with a microchannel plate and a charge sensitive amplifier (CSA). Pulses from the N2 laser (10^11 W/cm^2) are assumed to simulate particle impact. Using aluminum alloy as a test sample, each pulse produces a charge of ~4.6 pC (mostly Al+1), whereas irradiation of a stainless steel target produces a ~2.8 pC (Fe+1) charge. Thus the present system yields ~10^-5% of the laser energy in resulting ions. A CSA signal indicates that at the position of the microchannel plate, the ion detector geometry is such that some 5% of the laser-induced ions are collected in the CDA geometry. Employing a multichannel plate detector in this MS yields, for Al-Mg-Cu alloy and kamacite targets, well-defined peaks at 24 (Mg+1), 27 (Al+1), and 64 (Cu+1) and at 56 (Fe+1), 58 (Ni+1), and 60 (Ni+1) daltons, respectively.
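
    In a TOF spectrometer, ions accelerated through a potential U arrive after a flight time t over a drift length L, so m/q = 2U(t/L)^2; inverting arrival times into mass lines is what produces the peaks quoted above. A minimal sketch, with an assumed drift length and accelerating potential (not the CDA's actual geometry):

```python
# m/q = 2 e U (t / L)^2 for a singly charged ion accelerated through U volts.
E_CHARGE = 1.602176634e-19   # C
AMU = 1.66053906660e-27      # kg

def mass_amu(t_s, drift_len_m=0.23, accel_volts=1000.0):
    """Ion mass (daltons, charge +1) from time of flight; geometry is assumed."""
    m = 2.0 * E_CHARGE * accel_volts * (t_s / drift_len_m) ** 2
    return m / AMU

# Flight time such an instrument might record for Fe+ (56 Da):
t_fe = 0.23 * (56 * AMU / (2 * E_CHARGE * 1000.0)) ** 0.5
print(f"{mass_amu(t_fe):.1f} Da")  # -> 56.0
```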

  9. Blood Gas Analyzer Accuracy of Glucose Measurements.

    Science.gov (United States)

    Liang, Yafen; Wanderer, Jonathan; Nichols, James H; Klonoff, David; Rice, Mark J

    2017-07-01

    To investigate the comparability of glucose levels measured with blood gas analyzers (BGAs) and by central laboratories (CLs). Glucose measurements obtained between June 1, 2007, and March 1, 2016, at the Vanderbilt University Medical Center were reviewed. The agreement between CL and BGA results were assessed using Bland-Altman, consensus error grid (CEG), and surveillance error grid (SEG) analyses. We further analyzed the BGAs' performance against the US Food and Drug Administration (FDA) 2014 draft guidance and 2016 final guidance for blood glucose monitoring and the International Organization for Standardization (ISO) 15197:2013 standard. We analyzed 2671 paired glucose measurements, including 50 pairs of hypoglycemic values (1.9%). Bland-Altman analysis yielded a mean bias of -3.1 mg/dL, with 98.1% of paired values meeting the 95% limits of agreement. In the hypoglycemic range, the mean bias was -0.8 mg/dL, with 100% of paired values meeting the 95% limits of agreement. When using CEG analysis, 99.9% of the paired values fell within the no risk zone. Similar results were found using SEG analysis. For the FDA 2014 draft guidance, our data did not meet the target compliance rate. For the FDA 2016 final guidance, our data partially met the target compliance rate. For the ISO standard, our data met the target compliance rate. In this study, the agreement for glucose measurement between common BGAs and CL instruments met the ISO 2013 standard. However, BGA accuracy did not meet the stricter requirements of the FDA 2014 draft guidance or 2016 final guidance. Fortunately, plotting these results on either the CEG or the SEG revealed no results in either the great or extreme clinical risk zones. Copyright © 2017 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.
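
    Bland-Altman agreement, as used in this study, reduces to the mean of the paired differences (the bias) and bias ± 1.96 SD (the 95% limits of agreement). The sketch below computes these for a handful of made-up CL/BGA glucose pairs, not the study's data.

```python
import statistics

# Hypothetical paired glucose values, mg/dL: (central lab, blood gas analyzer)
pairs = [(98, 95), (142, 138), (76, 75), (187, 182), (55, 56), (110, 106)]

diffs = [bga - cl for cl, bga in pairs]
bias = statistics.mean(diffs)
sd = statistics.stdev(diffs)

lower, upper = bias - 1.96 * sd, bias + 1.96 * sd
print(f"bias: {bias:.1f} mg/dL, 95% limits of agreement: [{lower:.1f}, {upper:.1f}]")
```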

  10. Analyzing Dendritic Morphology in Columns and Layers.

    Science.gov (United States)

    Ting, Chun-Yuan; McQueen, Philip G; Pandya, Nishith; McCreedy, Evan S; McAuliffe, Matthew; Lee, Chi-Hon

    2017-03-23

    In many regions of the central nervous systems, such as the fly optic lobes and the vertebrate cortex, synaptic circuits are organized in layers and columns to facilitate brain wiring during development and information processing in developed animals. Postsynaptic neurons elaborate dendrites in type-specific patterns in specific layers to synapse with appropriate presynaptic terminals. The fly medulla neuropil is composed of 10 layers and about 750 columns; each column is innervated by dendrites of over 38 types of medulla neurons, which match with the axonal terminals of some 7 types of afferents in a type-specific fashion. This report details the procedures to image and analyze dendrites of medulla neurons. The workflow includes three sections: (i) the dual-view imaging section combines two confocal image stacks collected at orthogonal orientations into a high-resolution 3D image of dendrites; (ii) the dendrite tracing and registration section traces dendritic arbors in 3D and registers dendritic traces to the reference column array; (iii) the dendritic analysis section analyzes dendritic patterns with respect to columns and layers, including layer-specific termination and planar projection direction of dendritic arbors, and derives estimates of dendritic branching and termination frequencies. The protocols utilize custom plugins built on the open-source MIPAV (Medical Imaging Processing, Analysis, and Visualization) platform and custom toolboxes in the matrix laboratory language. Together, these protocols provide a complete workflow to analyze the dendritic routing of Drosophila medulla neurons in layers and columns, to identify cell types, and to determine defects in mutants.

  11. Remote Laser Diffraction Particle Size Distribution Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2001-03-01

    In support of a radioactive slurry sampling and physical characterization task, an “off-the-shelf” laser diffraction (classical light scattering) particle size analyzer was utilized for remote particle size distribution (PSD) analysis. Spent nuclear fuel was previously reprocessed at the Idaho Nuclear Technology and Engineering Center (INTEC—formerly recognized as the Idaho Chemical Processing Plant) which is on DOE’s INEEL site. The acidic, radioactive aqueous raffinate streams from these processes were transferred to 300,000 gallon stainless steel storage vessels located in the INTEC Tank Farm area. Due to the transfer piping configuration in these vessels, complete removal of the liquid can not be achieved. Consequently, a “heel” slurry remains at the bottom of an “emptied” vessel. Particle size distribution characterization of the settled solids in this remaining heel slurry, as well as suspended solids in the tank liquid, is the goal of this remote PSD analyzer task. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a “hot cell” (gamma radiation) environment. This technology provides rapid and simple PSD analysis, especially down in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries down in this smaller range was not previously achievable—making this technology far superior than the traditional methods used. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  12. A Conceptual Framework for Analyzing Terrorist Groups,

    Science.gov (United States)

    1985-06-01

    [OCR fragments from the report: a table of international terrorist violence by region, 1980-1982, with columns for total incidents, international incidents, and attacks on Americans, and rows for Libyan, Other Middle East, Other European, Sub-Saharan African, Latin American (e.g., MIR, Movimiento de...), and other groups; also part of a data-collection worksheet for recording key group members.]

  13. Analyzing Argumentation In Rich, Natural Contexts

    Directory of Open Access Journals (Sweden)

    Anita Reznitskaya

    2008-02-01

    Full Text Available The paper presents the theoretical and methodological aspects of research on the development of argumentation in elementary school children. It presents a theoretical framework detailing psychological mechanisms responsible for the acquisition and transfer of argumentative discourse and demonstrates several applications of the framework, described in sufficient detail to guide future empirical investigations of oral, written, individual, or group argumentation performance. Software programs capable of facilitating data analysis are identified and their uses illustrated. The analytic schemes can be used to analyze large amounts of verbal data with reasonable precision and efficiency. The conclusion addresses more generally the challenges for and possibilities of empirical study of the development of argumentation.

  14. CRISP90 - SOFTWARE DESIGN ANALYZER SYSTEM

    Science.gov (United States)

    Tausworthe, R. C.

    1994-01-01

    The CRISP90 Software Design Analyzer System, an update of CRISP-80, is a set of programs forming a software design and documentation tool which supports top-down, hierarchic, modular, structured design and programming methodologies. The quality of a computer program can often be significantly influenced by the design medium in which the program is developed. The medium must foster the expression of the programmer's ideas easily and quickly, and it must permit flexible and facile alterations, additions, and deletions to these ideas as the design evolves. The CRISP90 software design analyzer system was developed to provide the PDL (Programmer Design Language) programmer with such a design medium. A program design using CRISP90 consists of short, English-like textual descriptions of data, interfaces, and procedures that are imbedded in a simple, structured, modular syntax. The display is formatted into two-dimensional, flowchart-like segments for a graphic presentation of the design. Together with a good interactive full-screen editor or word processor, the CRISP90 design analyzer becomes a powerful tool for the programmer. In addition to being a text formatter, the CRISP90 system prepares material that would be tedious and error prone to extract manually, such as a table of contents, module directory, structure (tier) chart, cross-references, and a statistics report on the characteristics of the design. Referenced modules are marked by schematic logic symbols to show conditional, iterative, and/or concurrent invocation in the program. A keyword usage profile can be generated automatically and glossary definitions inserted into the output documentation. Another feature is the capability to detect changes that were made between versions. Thus, "change-bars" can be placed in the output document along with a list of changed pages and a version history report. Also, items may be marked as "to be determined" and each will appear on a special table until the item is

  15. Nonlinear single-spin spectrum analyzer.

    Science.gov (United States)

    Kotler, Shlomi; Akerman, Nitzan; Glickman, Yinnon; Ozeri, Roee

    2013-03-15

    Qubits have been used as linear spectrum analyzers of their environments. Here we solve the problem of nonlinear spectral analysis, required for discrete noise induced by a strongly coupled environment. Our nonperturbative analytical model shows a nonlinear signal dependence on noise power, resulting in a spectral resolution beyond the Fourier limit as well as frequency mixing. We develop a noise characterization scheme adapted to this nonlinearity. We then apply it using a single trapped ion as a sensitive probe of strong, non-Gaussian, discrete magnetic field noise. Finally, we experimentally compared the performance of equidistant vs Uhrig modulation schemes for spectral analysis.

  16. Analyzing PICL trace data with MEDEA

    Energy Technology Data Exchange (ETDEWEB)

    Merlo, A.P. [Pavia Univ. (Italy). Dipt di Informatica e Sistemistica; Worley, P.H. [Oak Ridge National Lab., TN (United States)

    1993-11-01

    Execution traces and performance statistics can be collected for parallel applications on a variety of multiprocessor platforms by using the Portable Instrumented Communication Library (PICL). The static and dynamic performance characteristics of performance data can be analyzed easily and effectively with the facilities provided within the MEasurements Description Evaluation and Analysis tool (MEDEA). This report describes the integration of the PICL trace file format into MEDEA. A case study is then outlined that uses PICL and MEDEA to characterize the performance of a parallel benchmark code executed on different hardware platforms and using different parallel algorithms and communication protocols.

  17. Analyzing PICL trace data with MEDEA

    Energy Technology Data Exchange (ETDEWEB)

    Merlo, A.P. [Pavia Univ., (Italy). Dipt. Informatica e Sistemistica; Worley, P.H. [Oak Ridge National Lab., TN (United States)

    1994-04-01

    Execution traces and performance statistics can be collected for parallel applications on a variety of multiprocessor platforms by using the Portable Instrumented Communication Library (PICL). The static and dynamic performance characteristics of performance data can be analyzed easily and effectively with the facilities provided within the MEasurements Description Evaluation and Analysis tool (MEDEA). A case study is then outlined that uses PICL and MEDEA to characterize the performance of a parallel benchmark code executed on different hardware platforms and using different parallel algorithms and communication protocols.

  18. Timed Automata Semantics for Analyzing Creol

    Directory of Open Access Journals (Sweden)

    Mohammad Mahdi Jaghoori

    2010-07-01

    Full Text Available We give a real-time semantics for the concurrent, object-oriented modeling language Creol, by mapping Creol processes to a network of timed automata. We can use our semantics to verify real time properties of Creol objects, in particular to see whether processes can be scheduled correctly and meet their end-to-end deadlines. Real-time Creol can be useful for analyzing, for instance, abstract models of multi-core embedded systems. We show how analysis can be done in Uppaal.

  19. Analyzing Mode Confusion via Model Checking

    Science.gov (United States)

    Luettgen, Gerald; Carreno, Victor

    1999-01-01

    Mode confusion is one of the most serious problems in aviation safety. Today's complex digital flight decks make it difficult for pilots to maintain awareness of the actual states, or modes, of the flight deck automation. NASA Langley leads an initiative to explore how formal techniques can be used to discover possible sources of mode confusion. As part of this initiative, a flight guidance system was previously specified as a finite Mealy automaton, and the theorem prover PVS was used to reason about it. The objective of the present paper is to investigate whether state-exploration techniques, especially model checking, are better able to achieve this task than theorem proving and also to compare several verification tools for the specific application. The flight guidance system is modeled and analyzed in Murphi, SMV, and Spin. The tools are compared regarding their system description language, their practicality for analyzing mode confusion, and their capabilities for error tracing and for animating diagnostic information. It turns out that their strengths are complementary.

  20. Basis-neutral Hilbert-space analyzers.

    Science.gov (United States)

    Martin, Lane; Mardani, Davood; Kondakci, H Esat; Larson, Walker D; Shabahang, Soroush; Jahromi, Ali K; Malhotra, Tanya; Vamivakas, A Nick; Atia, George K; Abouraddy, Ayman F

    2017-03-27

    Interferometry is one of the central organizing principles of optics. Key to interferometry is the concept of optical delay, which facilitates spectral analysis in terms of time-harmonics. In contrast, when analyzing a beam in a Hilbert space spanned by spatial modes - a critical task for spatial-mode multiplexing and quantum communication - basis-specific principles are invoked that are altogether distinct from that of 'delay'. Here, we extend the traditional concept of temporal delay to the spatial domain, thereby enabling the analysis of a beam in an arbitrary spatial-mode basis - exemplified using Hermite-Gaussian and radial Laguerre-Gaussian modes. Such generalized delays correspond to optical implementations of fractional transforms; for example, the fractional Hankel transform is the generalized delay associated with the space of Laguerre-Gaussian modes, and an interferometer incorporating such a 'delay' obtains modal weights in the associated Hilbert space. By implementing an inherently stable, reconfigurable spatial-light-modulator-based polarization-interferometer, we have constructed a 'Hilbert-space analyzer' capable of projecting optical beams onto any modal basis.

  1. Analyzing endocrine system conservation and evolution.

    Science.gov (United States)

    Bonett, Ronald M

    2016-08-01

    Analyzing variation in rates of evolution can provide important insights into the factors that constrain trait evolution, as well as those that promote diversification. Metazoan endocrine systems exhibit apparent variation in evolutionary rates of their constituent components at multiple levels, yet relatively few studies have quantified these patterns and analyzed them in a phylogenetic context. This may be in part due to historical and current data limitations for many endocrine components and taxonomic groups. However, recent technological advancements such as high-throughput sequencing provide the opportunity to collect large-scale comparative data sets for even non-model species. Such ventures will produce a fertile data landscape for evolutionary analyses of nucleic acid and amino acid based endocrine components. Here I summarize evolutionary rate analyses that can be applied to categorical and continuous endocrine traits, and also those for nucleic acid and protein-based components. I emphasize analyses that could be used to test whether other variables (e.g., ecology, ontogenetic timing of expression, etc.) are related to patterns of rate variation and endocrine component diversification. The application of phylogenetic-based rate analyses to comparative endocrine data will greatly enhance our understanding of the factors that have shaped endocrine system evolution. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Fiber optic multiple blood gas analyzer

    Science.gov (United States)

    Rademaker, Diane M.; Zimmerman, Donald E.; James, Kenneth A.; Quick, William H.

    1994-07-01

    Blood gas analysis has been shown to be the most critical factor in determining patient survivability in a trauma care environment. Present techniques of non-invasive measurement of blood gases in the trauma care unit such as optical pulse oximetry and transcutaneous electrodes are inadequate due to complexity and inaccuracy. The crux of the solution to this problem is the application of a recent, DOD/NASA developed micro-optic spectrophotometer to perform blood gas analysis via fiber optic transmission. The newly developed blood gas analyzer described here will not only overcome the aforementioned drawbacks but also be highly accurate, durable, and safe in hazardous environments: e.g., oxygen rich environments. This spectrophotometer is driven by a microprocessor based `Kalman filter' algorithm which not only controls the monitoring of all the patients in the care center but also separates the patient's superimposed blood gas spectra into its individual components to allow a number of gases critical for trauma care to be analyzed simultaneously.
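
    The spectral separation described can be framed as estimating component concentrations x from a measured spectrum y = Hx + noise, where H holds each gas's reference absorption spectrum; a Kalman filter then refines x as spectra stream in. The sketch below is a toy version with invented two-component spectra, not the device's actual algorithm or calibration data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Reference absorption spectra of two hypothetical gas components (4 wavelengths).
H = np.array([[0.9, 0.1],
              [0.7, 0.3],
              [0.2, 0.8],
              [0.1, 0.9]])
x_true = np.array([0.6, 0.4])  # true concentrations (arbitrary units)

# Kalman filter with a static state: estimate x from repeated noisy spectra.
x = np.zeros(2)          # initial concentration estimate
P = np.eye(2) * 10.0     # initial estimate covariance
R = np.eye(4) * 0.01**2  # measurement noise covariance

for _ in range(25):
    y = H @ x_true + rng.normal(0.0, 0.01, size=4)  # one noisy spectrum
    S = H @ P @ H.T + R                             # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
    x = x + K @ (y - H @ x)                         # state update
    P = (np.eye(2) - K @ H) @ P                     # covariance update

print("estimated concentrations:", np.round(x, 3))
```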

  3. CALIBRATION OF ONLINE ANALYZERS USING NEURAL NETWORKS

    Energy Technology Data Exchange (ETDEWEB)

    Rajive Ganguli; Daniel E. Walsh; Shaohai Yu

    2003-12-05

    Neural networks were used to calibrate an online ash analyzer at the Usibelli Coal Mine, Healy, Alaska, by relating the Americium and Cesium counts to the ash content. A total of 104 samples were collected from the mine, with 47 being from screened coal, and the rest being from unscreened coal. Each sample corresponded to 20 seconds of coal on the running conveyor belt. Neural network modeling used the quick stop training procedure. Therefore, the samples were split into training, calibration and prediction subsets. Special techniques, using genetic algorithms, were developed to representatively split the sample into the three subsets. Two separate approaches were tried. In one approach, the screened and unscreened coal was modeled separately. In another, a single model was developed for the entire dataset. No advantage was seen from modeling the two subsets separately. The neural network method performed very well on average but not individually, i.e. though each prediction was unreliable, the average of a few predictions was close to the true average. Thus, the method demonstrated that the analyzers were accurate at 2-3 minutes intervals (average of 6-9 samples), but not at 20 seconds (each prediction).
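
    The calibration task is a small regression: map the two gamma-count channels (Americium and Cesium) to ash content. The sketch below uses scikit-learn's MLPRegressor with early stopping as a stand-in for the paper's quick-stop training, on synthetic counts generated from an assumed relationship, not mine data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Synthetic calibration set: Am and Cs counts with an assumed dependence on ash %.
ash = rng.uniform(5.0, 35.0, size=104)  # ash content, percent
am_counts = 5000.0 - 60.0 * ash + rng.normal(0, 50, ash.size)
cs_counts = 3000.0 - 25.0 * ash + rng.normal(0, 40, ash.size)
X = np.column_stack([am_counts, cs_counts])

# Early stopping holds out a validation split, analogous to quick-stop training.
model = MLPRegressor(hidden_layer_sizes=(8,), early_stopping=True,
                     validation_fraction=0.2, max_iter=5000, random_state=0)
model.fit(X, ash)

# As in the paper, averaging several consecutive predictions stabilizes the estimate.
preds = model.predict(X[:9])
print(f"mean of 9 predictions: {preds.mean():.1f}% vs true mean {ash[:9].mean():.1f}%")
```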

  4. Analyzing Virtual Physics Simulations with Tracker

    Science.gov (United States)

    Claessens, Tom

    2017-12-01

    In the physics teaching community, Tracker is well known as a user-friendly, open-source video analysis program, authored by Douglas Brown. With this tool, the user can trace markers indicated on a video or on stroboscopic photos and perform kinematic analyses. Tracker also includes a data modeling tool that allows one to fit theoretical equations of motion onto experimentally obtained data. In the field of particle mechanics, Tracker has been effectively used for learning and teaching about projectile motion, "toss up" and free-fall vertical motion, and to explain the principle of mechanical energy conservation. Also, Tracker has been successfully used in rigid body mechanics to interpret the results of experiments with rolling/slipping cylinders and moving rods. In this work, I propose an original method in which Tracker is used to analyze virtual computer simulations created with a physics-based motion solver, instead of analyzing video recordings or stroboscopic photos. This could be an interesting approach to study kinematics and dynamics problems in physics education, in particular when there is no or limited access to physical labs. I demonstrate the working method with a typical (but quite challenging) problem in classical mechanics: a slipping/rolling cylinder on a rough surface.
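
    The proposed workflow needs a physics-based solver to generate the "virtual video" positions that Tracker then traces. A minimal stand-in, assuming a uniform cylinder released sliding without spin on a rough surface, where kinetic friction slows the slide and spins the cylinder up until rolling without slipping sets in:

```python
import numpy as np

MU, G, R = 0.3, 9.81, 0.05   # kinetic friction, gravity (m/s^2), radius (m)
V0 = 2.0                     # initial sliding speed, zero initial spin (m/s)

T_ROLL = V0 / (3.0 * MU * G)                    # slipping ends when v = omega * R
V_ROLL = 2.0 * V0 / 3.0                         # speed once rolling without slipping
X_ROLL = V0 * T_ROLL - 0.5 * MU * G * T_ROLL**2 # distance covered while slipping

def position(t):
    """Center-of-mass position of a uniform cylinder sliding, then rolling."""
    if t < T_ROLL:
        return V0 * t - 0.5 * MU * G * t**2  # decelerated by friction
    return X_ROLL + V_ROLL * (t - T_ROLL)    # constant speed while rolling

# Emit marker positions at 30 fps, like frames of a virtual video for Tracker.
for t in np.arange(0.0, 0.5, 1.0 / 30.0):
    print(f"t={t:.3f} s  x={position(t):.4f} m")
```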

  5. Analyzing rare diseases terms in biomedical terminologies

    Directory of Open Access Journals (Sweden)

    Erika Pasceri

    2012-03-01

    Full Text Available Rare disease patients too often face common problems, including the lack of access to correct diagnosis, lack of quality information on the disease, lack of scientific knowledge of the disease, inequities and difficulties in access to treatment and care. These things could be changed by implementing a comprehensive approach to rare diseases, increasing international cooperation in scientific research, by gaining and sharing scientific knowledge about and by developing tools for extracting and sharing knowledge. A significant aspect to analyze is the organization of knowledge in the biomedical field for the proper management and recovery of health information. For these purposes, the sources needed have been acquired from the Office of Rare Diseases Research, the National Organization of Rare Disorders and Orphanet, organizations that provide information to patients and physicians and facilitate the exchange of information among different actors involved in this field. The present paper shows the representation of rare diseases terms in biomedical terminologies such as MeSH, ICD-10, SNOMED CT and OMIM, leveraging the fact that these terminologies are integrated in the UMLS. At the first level, it was analyzed the overlap among sources and at a second level, the presence of rare diseases terms in target sources included in UMLS, working at the term and concept level. We found that MeSH has the best representation of rare diseases terms.

  6. Improving respiration measurements with gas exchange analyzers.

    Science.gov (United States)

    Montero, R; Ribas-Carbó, M; Del Saz, N F; El Aou-Ouad, H; Berry, J A; Flexas, J; Bota, J

    2016-12-01

    Dark respiration measurements with open-flow gas exchange analyzers are often questioned for their low accuracy, as the low values often approach the precision limit of the instrument. Respiration was measured in five species: two hypostomatous (Vitis vinifera L. and Acanthus mollis) and three amphistomatous, one with a similar amount of stomata on both sides (Eucalyptus citriodora) and two with different stomatal densities (Brassica oleracea and Vicia faba). The CO2 differential (ΔCO2) increased two-fold, with no change in apparent Rd, when the two leaves with higher stomatal density faced outside. These results showed a clear effect of the position of the stomata on ΔCO2. It can therefore be concluded that leaf position is important: increasing ΔCO2 improves respiration measurements without affecting the respiration results per leaf or per unit mass. This method will help to increase the accuracy of leaf respiration measurements using gas exchange analyzers.
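
    A back-of-envelope sketch (all numbers assumed) of the open-flow arithmetic: respiration per leaf area follows from the molar flow and the CO2 differential, so doubling ΔCO2 at fixed flow doubles the signal relative to the instrument's precision floor, which motivates the leaf-orientation advice above.

```python
# Open-flow gas exchange arithmetic (illustrative values, not the paper's data).
flow = 300e-6          # air flow through the chamber, mol s^-1
delta_co2 = 1.5e-6     # CO2 differential, mol CO2 per mol air
leaf_area = 6e-4       # leaf area in the chamber, m^2

rd = flow * delta_co2 / leaf_area            # mol CO2 m^-2 s^-1
print(f"Rd = {rd * 1e6:.2f} umol m^-2 s^-1")  # -> 0.75
```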

  7. Analyzing delay causes in Egyptian construction projects

    Directory of Open Access Journals (Sweden)

    Mohamed M. Marzouk

    2014-01-01

    Construction delays are common problems in civil engineering projects in Egypt. These problems occur frequently during project lifetimes, leading to disputes and litigation; it is therefore essential to study and analyze the causes of construction delays. This research presents a list of construction delay causes retrieved from the literature. The feedback of construction experts was obtained through interviews, and a questionnaire survey was subsequently prepared and distributed to thirty-three construction experts representing owners, consultants, and contractors' organizations. A Frequency Index, Severity Index, and Importance Index are calculated, and according to their highest values the top ten delay causes of construction projects in Egypt are determined. A case study is analyzed and compared against the most important delay causes identified in the research. Statistical analysis is carried out using the analysis of variance (ANOVA) method to test the delay causes obtained from the survey. The test results reveal good correlation between groups, while significant differences exist between them for some delay causes; finally, a roadmap for prioritizing delay cause groups is presented.

  8. Analyzing delay causes in Egyptian construction projects.

    Science.gov (United States)

    Marzouk, Mohamed M; El-Rasas, Tarek I

    2014-01-01

    Construction delays are common problems in civil engineering projects in Egypt. These problems occur frequently during project lifetimes, leading to disputes and litigation; it is therefore essential to study and analyze the causes of construction delays. This research presents a list of construction delay causes retrieved from the literature. The feedback of construction experts was obtained through interviews, and a questionnaire survey was subsequently prepared and distributed to thirty-three construction experts representing owners, consultants, and contractors' organizations. A Frequency Index, Severity Index, and Importance Index are calculated, and according to their highest values the top ten delay causes of construction projects in Egypt are determined. A case study is analyzed and compared against the most important delay causes identified in the research. Statistical analysis is carried out using the analysis of variance (ANOVA) method to test the delay causes obtained from the survey. The test results reveal good correlation between groups, while significant differences exist between them for some delay causes; finally, a roadmap for prioritizing delay cause groups is presented.
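
    A sketch of one common formulation of the three indices used in studies of this kind (the paper may weight the Likert responses differently); the survey counts below are invented.

```python
# One common formulation (an assumption, not necessarily the paper's exact one):
#   FI (%) = sum(a * n_a) / (4 * N) * 100   for response levels a in 0..4,
#   SI analogous, and Importance Index II = FI * SI / 100.
def index(counts):
    """counts[a] = number of respondents answering level a (0..4)."""
    N = sum(counts)
    return sum(a * n for a, n in enumerate(counts)) / (4 * N) * 100

freq_counts = [2, 5, 10, 10, 6]   # illustrative survey counts, levels 0..4
sev_counts  = [1, 4, 8, 12, 8]

FI, SI = index(freq_counts), index(sev_counts)
print(f"FI={FI:.1f}%  SI={SI:.1f}%  Importance={FI * SI / 100:.1f}%")
```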

  9. Ambulatory bruxism recording system with sleep-stage analyzing function.

    Science.gov (United States)

    Mizumori, Takahiro; Inano, Shinji; Sumiya, Masakazu; Kobayashi, Yasuyoshi; Watamoto, Takao; Yatani, Hirofumi

    2009-07-01

    The aim of this study was to develop an ambulatory bruxism recording system capable of sleep-stage analysis. A portable EMG system was used to record masseter muscle activity, with an EMG sensor attached to the masseter muscle belly on either side. EMG data were stored on a notebook personal computer. A sound level meter was used to assess the sound level of bruxism; sound level (dB) readings were taken every second and recorded on the same computer. A prototype sleep sensor, a wristwatch-style biological signal sensor-recorder, recorded pulse wave, acceleration and temperature on a memory card. All stored data were transferred to a personal computer and analyzed. The whole system was transportable in a protective case and weighed approximately 5 kg. Raw EMG signals were processed to derive integrated EMG data. The TOSHIBA Sleep Analysis Program classified sleep stages as awake, shallow sleep, deep sleep and REM based on the activity of the autonomic nervous system, estimated from the fluctuations of pulse intervals. An EMG, sound level and sleep-stage analysis program was developed to analyze all data simultaneously; using this program, the masseter muscle activity, sound level and sleep stage could be quantified and correlated. We developed an ambulatory bruxism recording system that analyzes sleep stage, and we expect that it will enable us to measure sleep bruxism activity in each sleep stage on an electromyographic and auditory basis at the subject's home.

  10. Performance evaluation of Samsung LABGEO(HC10) Hematology Analyzer.

    Science.gov (United States)

    Park, Il Joong; Ahn, Sunhyun; Kim, Young In; Kang, Seon Joo; Cho, Sung Ran

    2014-08-01

    The Samsung LABGEO(HC10) Hematology Analyzer (LABGEO(HC10)) is a recently developed automated hematology analyzer that uses impedance technologies. The analyzer provides 18 parameters, including a 3-part differential, at a maximum rate of 80 samples per hour. Our aim was to evaluate the performance of the LABGEO(HC10). We evaluated precision, linearity, carryover, and the relationship for complete blood cell count parameters between the LABGEO(HC10) and the LH780 (Beckman Coulter Inc) in a university hospital in Korea according to the Clinical and Laboratory Standards Institute guidelines. Sample stability and differences due to the anticoagulant used (K₂EDTA versus K₃EDTA) were also evaluated. The LABGEO(HC10) showed linearity over a wide range and minimal carryover. Correlation between the two analyzers was high (r > 0.92) except for mean corpuscular hemoglobin concentration. The bias estimated was acceptable for all parameters investigated except for monocyte count. Most parameters were stable for up to 24 hours both at room temperature and at 4°C. The difference by anticoagulant type was statistically insignificant for all parameters except for a few red cell parameters. The accuracy achievable and the simplicity of operation make the unit recommendable for small to medium-sized laboratories.
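
    A minimal sketch of the CLSI-style carryover check mentioned above, with invented WBC values: a high sample is run three times, then a low sample three times, and the residual influence of the high runs on the first low run is expressed as a percentage.

```python
# CLSI-style carryover check (values are illustrative WBC counts, x10^3/uL).
high = [98.5, 98.9, 98.2]   # high sample, runs 1-3
low  = [2.62, 2.58, 2.57]   # low sample, runs 1-3

carryover = (low[0] - low[2]) / (high[2] - low[2]) * 100
print(f"carryover = {carryover:.2f} %")   # commonly required to be < 1 %
```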

  11. Low temperature (≤ 100 °C) amorphous silicon films by HWCVD

    NARCIS (Netherlands)

    Rath, J.K.; de Jong, M.; Schropp, R.E.I.

    2008-01-01

    Amorphous silicon films have been made by HWCVD at a very low substrate temperature of ≤ 100 °C (in a dynamic substrate heating mode) without artificial substrate cooling, through a substantial increase of the filament–substrate distance (80 mm) and the use of one straight tantalum filament.

  12. Analyzing Options for Airborne Emergency Wireless Communications

    Energy Technology Data Exchange (ETDEWEB)

    Michael Schmitt; Juan Deaton; Curt Papke; Shane Cherry

    2008-03-01

    In the event of large-scale natural or manmade catastrophic events, access to reliable and enduring commercial communication systems is critical. Hurricane Katrina provided a recent example of the need to ensure communications during a national emergency. To ensure that communication demands are met during these critical times, Idaho National Laboratory (INL), under the guidance of United States Strategic Command, has studied infrastructure issues, concerns, and vulnerabilities associated with an airborne wireless communications capability. Such a capability could provide emergency wireless communications until public/commercial nodes can be systematically restored. This report focuses on the airborne cellular restoration concept, analyzing basic infrastructure requirements, identifying related infrastructure issues, concerns, and vulnerabilities, and offering recommended solutions.

  13. Buccal microbiology analyzed by infrared spectroscopy

    Science.gov (United States)

    de Abreu, Geraldo Magno Alves; da Silva, Gislene Rodrigues; Khouri, Sônia; Favero, Priscila Pereira; Raniero, Leandro; Martin, Airton Abrahão

    2012-01-01

    Rapid microbiological identification and characterization are very important in dentistry and medicine. In addition to dental diseases, pathogens are directly linked to cases of endocarditis, premature delivery, low birth weight, and loss of organ transplants. Fourier Transform Infrared Spectroscopy (FTIR) was used to analyze the oral pathogens Aggregatibacter actinomycetemcomitans ATCC 29523, Aggregatibacter actinomycetemcomitans JP2, and a clinical isolate of Aggregatibacter actinomycetemcomitans from human blood (CI). Significant spectral differences were found among the organisms, allowing the identification and characterization of each bacterial species. Vibrational modes in the regions of 3500-2800 cm⁻¹, 1484-1420 cm⁻¹, and 1000-750 cm⁻¹ were used in this differentiation. The identification and classification of each strain were performed by cluster analysis, achieving 100% separation of strains. This study demonstrated that FTIR can be used to decrease the identification time of fastidious buccal microorganisms associated with the etiology of periodontitis, compared to traditional methods.
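
    A compact sketch of the clustering step: Ward-linkage hierarchical clustering of spectra, cut into three clusters, one per strain. The spectra below are synthetic stand-ins, not FTIR data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# Mock absorbance spectra (rows) for three strains, four replicates each;
# each strain gets a slightly different spectral "fingerprint".
base = rng.normal(size=(3, 200))
spectra = np.vstack([b + rng.normal(0, 0.1, size=(4, 200)) for b in base])

# Ward linkage on the spectra, then cut the dendrogram into 3 clusters.
labels = fcluster(linkage(spectra, method="ward"), t=3, criterion="maxclust")
print(labels)   # replicates of the same strain should share a label
```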

  14. Analyzing Demand: Hegemonic Masculinity and Feminine Prostitution

    Directory of Open Access Journals (Sweden)

    Beatriz Ranea Triviño

    2016-12-01

    In this article, we present an exploratory study analyzing the relationship between the construction of hegemonic masculinity and the consumption of female prostitution. We focused on the experiences, attitudes and perceptions of young heterosexual men who have paid for sex. Using a qualitative method of analysis, we conducted six semi-structured interviews with men between 18 and 35 years old. The analysis of the interviews shows differences in frequency of payment for sexual services, diversity of motivations, the spaces where prostitutes are sought, and opinions on prostitution and prostitutes. The main conclusions of this study are that the discourses of the interviewees reproduce gender stereotypes and gendered sexual roles, and it is suggested that prostitution can be interpreted as a scenario in which these men perform their hegemonic masculinity.

  15. Modeling and analyzing architectural change with alloy

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Ingstrup, Mads

    2010-01-01

    Although adaptivity based on reconfiguration has the potential to improve the dependability of systems, the cost of a failed attempt at reconfiguration is prohibitive in precisely the applications where high dependability is required. Existing work on formal modeling and verification of architectural reconfigurations partly achieves the goal of ensuring correctness; however, the formalisms used often lack tool support, and the resulting models have an uncertain relation to a concrete implementation. Thus a practical way to ensure with formal certainty that specific architectural changes are correct remains a barrier to the uptake of reconfiguration techniques in industry. Using the Alloy language and its associated tool, we propose a practical way to formally model and analyze runtime architectural change expressed as architectural scripts. Our evaluation shows the performance to be acceptable.

  16. Analyzing petabytes of data with Hadoop

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    Abstract The open source Apache Hadoop project provides a powerful suite of tools for storing and analyzing petabytes of data using commodity hardware. After several years of production use inside of web companies like Yahoo! and Facebook and nearly a year of commercial support and development by Cloudera, the technology is spreading rapidly through other disciplines, from financial services and government to life sciences and high energy physics. The talk will motivate the design of Hadoop and discuss some key implementation details in depth. It will also cover the major subprojects in the Hadoop ecosystem, go over some example applications, highlight best practices for deploying Hadoop in your environment, discuss plans for the future of the technology, and provide pointers to the many resources available for learning more. In addition to providing more information about the Hadoop platform, a major goal of this talk is to begin a dialogue with the ATLAS research team on how the tools commonly used in t...

  17. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    This thesis addresses the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) a strong social network of actors, and (iv) robust business models. The thesis documents the groundwork towards addressing the challenges faced by telemedical technologies today and establishing telemedicine as a means of patient diagnosis and treatment. Furthermore, it serves as an empirical example of designing a software ecosystem.

  18. Stackable differential mobility analyzer for aerosol measurement

    Science.gov (United States)

    Cheng, Meng-Dawn; Chen, Da-Ren

    2007-05-08

    A multi-stage differential mobility analyzer (MDMA) for aerosol measurements includes a first electrode or grid including at least one inlet or injection slit for receiving an aerosol including charged particles for analysis. A second electrode or grid is spaced apart from the first electrode. The second electrode has at least one sampling outlet disposed at a plurality of different distances along its length. A volume between the first and the second electrode or grid, between the inlet or injection slit and a distal one of the plurality of sampling outlets, forms a classifying region, the first and second electrodes being charged to suitable potentials to create an electric field within the classifying region. At least one inlet or injection slit in the second electrode receives a sheath gas flow into an upstream end of the classifying region, wherein each sampling outlet functions as an independent DMA stage and classifies different size ranges of charged particles based on electrical mobility simultaneously.
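
    A short sketch of the quantity a DMA stage classifies on, the electrical mobility Zp = n·e·Cc/(3πμd), using a standard Cunningham slip-correction form; the constants are textbook approximations, not values from the patent.

```python
import math

def electrical_mobility(d_m, n_charges=1, P=101325.0):
    """Electrical mobility Zp = n*e*Cc / (3*pi*mu*d) of a charged particle."""
    e   = 1.602e-19                      # elementary charge, C
    mu  = 1.81e-5                        # air viscosity, Pa*s (approx., ~20 C)
    lam = 66e-9 * (101325.0 / P)         # mean free path of air, m (approx.)
    Kn  = 2 * lam / d_m                  # Knudsen number
    Cc  = 1 + Kn * (1.257 + 0.4 * math.exp(-1.1 / Kn))   # slip correction
    return n_charges * e * Cc / (3 * math.pi * mu * d_m)

# Smaller particles have higher mobility, so each stage samples a size band.
for d_nm in (10, 50, 100, 300):
    print(f"{d_nm:4d} nm -> Zp = {electrical_mobility(d_nm * 1e-9):.3e} m^2/(V*s)")
```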

  19. Analyzing DNA replication checkpoint in budding yeast.

    Science.gov (United States)

    Hustedt, Nicole; Shimada, Kenji

    2014-01-01

    Checkpoints are conserved mechanisms that prevent progression into the next phase of the cell cycle when cells are unable to accomplish the previous event properly. Cells also possess a surveillance mechanism called the DNA replication checkpoint, which consists of a conserved kinase cascade that is provoked by insults that block or slow down replication fork progression. In the budding yeast Saccharomyces cerevisiae, the DNA replication checkpoint controls the timing of S-phase events such as origin firing and spindle elongation. This checkpoint also upregulates dNTP pools and maintains the replication fork structure in order to resume DNA replication after replication block. Many replication checkpoint factors have been found to be tumor suppressors, highlighting the importance of this checkpoint pathway in human health. Here we describe a series of protocols to analyze the DNA replication checkpoint in S. cerevisiae.

  20. ANALYZING COMPLAINTS BY INDONESIAN EFL SPEAKERS

    Directory of Open Access Journals (Sweden)

    Anna Marietta da Silva

    2014-12-01

    The English language competence of an EFL learner can be reflected in his pragmatic competence. Yet, for language learners and teachers, mastery of pragmatic competence may unconsciously be neglected; in other words, it may not be taught alongside grammatical competence from the initial period of learning. The article centers on two problems: (1) the similarities and differences in the speech act of complaining among Indonesian EFL learners, Indonesian EFL teachers and American native speakers, and (2) the evidence of any pragmatic transfer in complaint performance. A discourse completion task (DCT) was used to gather the data, which were then analyzed using Rinnert, Nogami and Iwai's (2006) aspects of complaining. It was found that there were both differences and similarities in the complaints performed by the native and non-native speakers of English when power and social status were involved. Some evidence of pragmatic transfer was also found, mainly due to cultural differences.

  1. Structural factoring approach for analyzing stochastic networks

    Science.gov (United States)

    Hayhurst, Kelly J.; Shier, Douglas R.

    1991-01-01

    The problem of finding the distribution of the shortest path length through a stochastic network is investigated. A general algorithm for determining the exact distribution of the shortest path length is developed based on the concept of conditional factoring, in which a directed, stochastic network is decomposed into an equivalent set of smaller, generally less complex subnetworks. Several network constructs are identified and exploited to reduce significantly the computational effort required to solve a network problem relative to complete enumeration. This algorithm can be applied to two important classes of stochastic path problems: determining the critical path distribution for acyclic networks and the exact two-terminal reliability for probabilistic networks. Computational experience with the algorithm was encouraging and allowed the exact solution of networks that have been previously analyzed only by approximation techniques.
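
    For contrast with the conditional-factoring approach, a complete-enumeration baseline on a tiny four-edge network with discrete edge-length distributions; the algorithm in the paper is designed to avoid exactly this exponential blow-up. The network and probabilities are invented.

```python
from itertools import product

# Tiny directed network: edge -> list of (length, probability) outcomes.
edges = {
    ("s", "a"): [(1, 0.5), (3, 0.5)],
    ("s", "b"): [(2, 1.0)],
    ("a", "t"): [(1, 0.7), (4, 0.3)],
    ("b", "t"): [(2, 0.6), (5, 0.4)],
}

def shortest(lengths):
    """Shortest s->t length for one realization of all edge lengths."""
    sa, sb, at, bt = (lengths[e] for e in edges)
    return min(sa + at, sb + bt)

dist = {}
names = list(edges)
for combo in product(*edges.values()):          # complete enumeration
    p = 1.0
    lengths = {}
    for name, (val, prob) in zip(names, combo):
        lengths[name] = val
        p *= prob
    L = shortest(lengths)
    dist[L] = dist.get(L, 0.0) + p

print(dict(sorted(dist.items())))               # exact P(shortest path = L)
```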

  2. Analyzing and forecasting the European social climate

    Directory of Open Access Journals (Sweden)

    Liliana DUGULEANĂ

    2015-06-01

    The paper uses the results of the Eurobarometer sample survey, which is commissioned by the European Commission. The social climate index measures the level of population perceptions, taking into account people's personal situation and their perspective at the national level. The paper analyzes the evolution of the social climate indices for the countries of the European Union and offers information about the expectations of the populations of the analyzed countries. The obtained results can be compared with the Eurobarometer forecasts, over the short term of one year and the medium term of five years. Modelling the social climate index and its influence factors offers useful information about the efficiency of social protection and inclusion policies.

  3. Statistical models for jointly analyzing multiple allometries.

    Science.gov (United States)

    Gao, Huijiang; Liu, Yongxin; Zhang, Tingting; Yang, Runqing; Yang, Huanmin

    2013-02-07

    As the reciprocal of the simple allometry equation, the power allometry equation can also be used to define allometric scaling, but its scaling exponent has the opposite meaning to that of the simple allometry equation. Based on this observation, a joint static allometry scaling model of entire body size on multiple partial body sizes is established, which can not only simultaneously evaluate allometric scaling of multiple partial body sizes, but also take into account the correlations among them, facilitating subsequent statistical inference and practice. Since ontogenetic allometry may be time-dependent, it is estimated by jointly analyzing changes of entire and multiple partial body sizes with growth time using multivariate stepwise analysis. Joint analysis of allometric scaling is suitable for multiple biological traits and functions with the same property or comparability, as illustrated by two examples.
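
    A minimal sketch of estimating a single allometric scaling exponent by log-log regression (the joint multivariate model in the paper generalizes this to several partial body sizes at once); the data are simulated.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated allometry: partial size x scales with entire body size y as
# x = a * y**b (the "power" form y = (x/a)**(1/b) is its reciprocal, with
# exponent 1/b carrying the opposite meaning).
a, b = 0.2, 0.75
y = np.linspace(10, 100, 40)                        # entire body size
x = a * y**b * np.exp(rng.normal(0, 0.03, y.size))  # partial size + noise

# Fit on log-log axes: log x = log a + b * log y.
b_hat, loga_hat = np.polyfit(np.log(y), np.log(x), 1)
print(f"estimated exponent b = {b_hat:.3f}, a = {np.exp(loga_hat):.3f}")
```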

  4. Orthopedic surgical analyzer for percutaneous vertebroplasty

    Science.gov (United States)

    Tack, Gye Rae; Choi, Hyung Guen; Lim, Do H.; Lee, Sung J.

    2001-05-01

    Since the spine is one of the most complex joint structures in the human body, its surgical treatment requires careful planning and a high degree of precision to avoid any unwanted neurological compromise. In addition, comprehensive biomechanical analysis can be very helpful because the spine is subject to a variety of loads. The osteoporotic spine, in which structural integrity has been compromised, poses a double challenge for the surgeon, both clinically and biomechanically. Thus, we have been developing an integrated medical image system capable of doing both. This system, called the orthopedic surgical analyzer, combines clinical results from image-guided examination with biomechanical data from finite element analysis. To demonstrate its feasibility, the system was applied to percutaneous vertebroplasty, a surgical procedure recently introduced for the treatment of compression fractures of osteoporotic vertebrae. It involves puncturing the vertebrae and filling them with polymethylmethacrylate (PMMA). Recent studies have shown that the procedure can provide structural reinforcement for the osteoporotic vertebrae while being minimally invasive and safe, with immediate pain relief. However, treatment failures due to excessive PMMA injection volume have been reported as one of its complications, and control of PMMA volume is believed to be one of the most critical factors for reducing the incidence of complications. Since the degree of osteoporosis can influence the porosity of the cancellous bone in the vertebral body, the injection volume can differ from patient to patient. In this study, the optimal volume of PMMA injection for vertebroplasty was predicted based on image analysis of a given patient. In addition, biomechanical effects due to changes in PMMA volume and bone mineral density (BMD) level were investigated by constructing clinically

  5. Modelling of semi-liquid aluminium flow in extrusion with temperature effect

    Directory of Open Access Journals (Sweden)

    G. Skorulski

    2007-04-01

    Semi-solid material remains stiff and holds its shape so it can be readily handled, but rapidly thins and flows like a liquid when sheared. It is this behaviour that is the key to the thixoforming process, where material flows as a semi-liquid slurry into a die, as in conventional die-casting. This work analyses the modelling of the influence of temperature distribution heterogeneity on deformation mechanisms during extrusion of aluminium alloys in the semi-liquid phase, together with the way of preparing samples and the experimental technique. The influence of the temperature distribution obtained in the recipient during heating on the course of the extrusion process was analysed, and conclusions concerning the stability of the process and the deformation mechanisms appearing during it were drawn from the results. Plasticine and rape oil were chosen as substitute materials, and several variants were investigated using a special experimental stand. The results of the tests prove that the proposed technique can provide valuable insight into material flow during deformation of aluminium alloys in the semi-liquid state, and thus can give some guidance concerning the desirable temperature distribution within the workpiece.

  6. Analyzing Design Heating Loads in Superinsulated Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Arena, Lois [Consortium for Advanced Residential Buildings, Norwalk, CT (United States)

    2015-06-01

    Super-insulated homes offer many benefits, including improved comfort, reduced exterior noise penetration, lower energy bills, and the ability to withstand power and fuel outages under much more comfortable conditions than a typical home. While these homes aren't necessarily constructed with excessive mass in the form of concrete floors and walls, the amount of insulation and the increase in the thickness of the building envelope can lead to a mass effect, giving the structure the ability to store much more heat than a code-built home. The result is a high thermal inertia, making the building much less sensitive to drastic temperature swings and thereby decreasing the peak heating demand. During the winter of 2013/2014, CARB monitored the energy use of three homes in climate zone 6 to evaluate the accuracy of two different mechanical system sizing methods for low-load homes. Based on the results, it is recommended that internal and solar gains be included, and some credit for thermal inertia be used, in sizing calculations for super-insulated homes.
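
    A lumped-capacitance sketch of the mass effect: with an assumed envelope conductance UA and thermal capacitance C, the time constant τ = C/UA is large, so indoor temperature drifts only slowly through an outage. Values are rough assumptions, not measurements from the monitored homes.

```python
import numpy as np

# Lumped RC model of a superinsulated envelope: C*dT/dt = -UA*(T - T_out).
UA = 80.0       # overall conductance, W/K (assumed, superinsulated)
C  = 40e6      # effective thermal capacitance, J/K (assumed)
tau = C / UA
print(f"time constant ~ {tau / 3600:.0f} h")

# Indoor temperature drift after a heating outage in a -20 C cold snap,
# sampled every 12 hours for two days.
t = np.arange(0, 49, 12) * 3600.0
T_in, T_out = 21.0, -20.0
print(np.round(T_out + (T_in - T_out) * np.exp(-t / tau), 1))
```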

  7. Mass Analyzers Facilitate Research on Addiction

    Science.gov (United States)

    2012-01-01

    The famous go/no go command for Space Shuttle launches comes from a place called the Firing Room. Located at Kennedy Space Center in the Launch Control Center (LCC), there are actually four Firing Rooms that take up most of the third floor of the LCC. These rooms comprise the nerve center for Space Shuttle launch and processing. Test engineers in the Firing Rooms operate the Launch Processing System (LPS), which is a highly automated, computer-controlled system for assembly, checkout, and launch of the Space Shuttle. LPS monitors thousands of measurements on the Space Shuttle and its ground support equipment, compares them to predefined tolerance levels, and then displays values that are out of tolerance. Firing Room operators view the data and send commands about everything from propellant levels inside the external tank to temperatures inside the crew compartment. In many cases, LPS will automatically react to abnormal conditions and perform related functions without test engineer intervention; however, firing room engineers continue to look at each and every happening to ensure a safe launch. Some of the systems monitored during launch operations include electrical, cooling, communications, and computers. One of the thousands of measurements derived from these systems is the amount of hydrogen and oxygen inside the shuttle during launch.

  8. Analyzing Music Services Positioning Through Qualitative Research

    Directory of Open Access Journals (Sweden)

    Manuel Cuadrado

    2015-12-01

    Information technologies have produced new ways of distributing and consuming music, mainly among youth, in relation to both goods and services. In the case of goods, there has been a dramatic shift from traditional ways of buying and listening to music to new digital platforms. There has also been an evolution in music services: live music concerts have been losing their audiences over the past few years, as have music radio stations, in favor of streaming platforms. Curious about this phenomenon, we conducted exploratory research in order to analyze how all these services, both traditional and new, are perceived. Specifically, we aimed to study young people's assessment of the three most relevant music service categories: music radio stations, digital streaming platforms, and pop-rock music festivals. We used the projective technique of image association to gather information. The population of the study consisted of individuals between 18 and 25 years of age. Our initial results, after content analysis, were poor because they relied on spontaneous recall, so we repeated the study in a more focused way. The information gathered this time allowed us not only to better understand how these services are positioned but also to obtain a list of descriptors to be used in a subsequent descriptive research study.

  9. Glaucoma suspect & Humphrey Field Analyzer a correlation

    Directory of Open Access Journals (Sweden)

    P Dahal

    2012-09-01

    The term glaucoma, originally meaning "clouded" in Greek, refers to a group of diseases that share a characteristic optic neuropathy with associated visual field loss, for which elevated intraocular pressure is one of the primary risk factors. The purpose of this study was to correlate clinically diagnosed cases of glaucoma suspect with the Humphrey Field Analyzer (HFA). Fifty glaucoma suspects who attended the glaucoma clinic of Nepal Eye Hospital, Tripureswor, Kathmandu, Nepal, and who met at least two of the four criteria for glaucoma suspects, were referred for HFA testing. Of the 50 patients, 36 (72%) had normal visual fields and 14 (28%) had thinning of the neural retinal rim (NRR) in both eyes. A significant relation between thinning of the neural retinal rim and the glaucoma hemifield test was found in the study. Journal of College of Medical Sciences-Nepal, 2012, Vol-8, No-1, 23-28. DOI: http://dx.doi.org/10.3126/jcmsn.v8i1.6822

  10. Analyzing the Pension System of the USSR

    Directory of Open Access Journals (Sweden)

    Aleksei V. Pudovkin

    2015-01-01

    The article deals with numerous aspects of the development of the pension system of the former USSR. Since improvement of the Russian pension system is presently high on the agenda, the author believes that analyzing the country's own historical experience is an essential first step toward creating a sound and efficient pension system in Russia. The study presented in the article executes an in-depth analysis of legislation on the Soviet pension system with a view to recreating the architecture of the pension system of the USSR. In addition, the study reflects on the official statistics for the period in order to reach a qualified and well-founded conclusion on the efficiency of the Soviet pension system. The evolution of the pension system, traced through statistical data, demonstrates the efficiency of the Soviet pension system. It is highly recommended that the positive aspects of the Soviet pension system be taken into consideration when reforming the present pension system of the Russian Federation.

  11. A framework to analyze emissions implications of ...

    Science.gov (United States)

    Future-year emissions depend heavily on the evolution of the economy, technology, and current and future regulatory drivers. A scenario framework was adopted to analyze various technology development pathways and societal changes, while considering existing regulations and future regulatory uncertainty, and to evaluate the resulting emissions growth patterns. The framework integrates EPA's energy systems model with an economic input-output (I/O) life cycle assessment model. The EPAUS9r MARKAL database is assembled from a set of technologies representing the U.S. energy system within the MARKAL bottom-up, technology-rich energy modeling framework. The general state of the economy and the consequent demands for goods and services from these sectors are taken exogenously in MARKAL, so it is important to characterize these exogenous inputs in order to appropriately represent the industrial sector outlook for each of the scenarios and case studies evaluated. An economic input-output (I/O) model of the US economy is constructed to link up with MARKAL. The I/O model enables the user to change input requirements (e.g., energy intensity) for different sectors or the share of consumer income expended on a given good. This gives end users a mechanism for modeling change along the two dimensions of technological progress and consumer preferences that define the future scenarios. The framework will then be extended to include an environmental I/O framework to track life cycle emissions associated
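
    A toy sketch of the I/O linkage described above: the Leontief identity x = (I - A)⁻¹ d turns final demand into total sectoral output, which assumed energy intensities then convert into demands that could be handed to the energy-system side. All coefficients are invented.

```python
import numpy as np

# 3-sector toy economy. A[i, j] = input from sector i per $1 of sector j
# output (illustrative coefficients, not the EPA model's data).
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.20]])
final_demand = np.array([100.0, 150.0, 80.0])

# Leontief: total output x satisfies x = A x + d  =>  x = (I - A)^(-1) d.
x = np.linalg.solve(np.eye(3) - A, final_demand)
print("total output by sector:", np.round(x, 1))

# Assumed energy intensity per $ of output turns output into energy demand.
energy_intensity = np.array([0.8, 1.5, 0.4])   # MJ per $, illustrative
print(f"total energy demand: {energy_intensity @ x:.0f} MJ")
```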

  12. Analyzing the Existing Undergraduate Engineering Leadership Skills

    Directory of Open Access Journals (Sweden)

    Hamed M. Almalki

    2016-12-01

    Purpose: To study and analyze undergraduate engineering students' leadership skills in order to discover their potential leadership strengths and weaknesses. The study unveils potential ways to enhance how we teach engineering leadership, offering insights that might assist engineering programs in improving curricula to better prepare engineers for industry's demands. Methodology and findings: 441 undergraduate engineering students in two undergraduate engineering programs were surveyed to assess their leadership skills. The results in both programs reveal that undergraduate engineering students lag behind in visionary leadership skills compared with the directing, including and cultivating leadership styles. Recommendation: A practical framework is proposed to enhance the lagging leadership skills by utilizing the Matrix of Change (MOC) and the Balanced Scorecard (BSC) to capture the best leadership scenarios and to design a virtual simulation environment targeting the lacking skills, in this case visionary leadership. The virtual simulation will then provide experiential learning by replacing human beings with avatars that can be managed or dramatized by real people, enabling the creation of live, practical, measurable, and customizable leadership development programs.

  13. Analyzing Design Heating Loads in Superinsulated Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Arena, Lois [Consortium for Advanced Residential Buildings, Norwalk, CT (United States)

    2015-06-16

    The U.S. Department of Energy's Building America research team Consortium for Advanced Residential Buildings (CARB) worked with the EcoVillage cohousing community in Ithaca, New York, on the Third Residential EcoVillage Experience neighborhood. This community-scale project consists of 40 housing units: 15 apartments and 25 single-family residences. Units range in size from 450 ft² to 1,664 ft² and cost from $80,000 for a studio apartment to $235,000 for a three- or four-bedroom single-family home. For the research component of this project, CARB analyzed current heating system sizing methods for superinsulated homes in cold climates to determine whether changes in building load calculation methodology should be recommended. Actual heating energy use was monitored and compared to results from the Air Conditioning Contractors of America's Manual J8 (MJ8) and the Passive House Planning Package software. Results from that research indicate that MJ8 significantly oversizes heating systems for superinsulated homes and that thermal inertia and internal gains should be considered for more accurate load calculations.
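
    A back-of-envelope sketch of the sizing question: crediting internal and solar gains against the UA-based envelope loss at the design temperature. All values are assumed for illustration, not taken from the monitored homes or MJ8.

```python
# Rough design-load sketch for a superinsulated home (all values assumed).
UA = 80.0                 # envelope + ventilation conductance, W/K
T_in, T_design = 21.0, -26.0

envelope_loss = UA * (T_in - T_design)          # W, gains-free sizing basis
internal_gains = 400.0                          # W, occupants + appliances
solar_gains = 200.0                             # W, conservative winter solar

net_load = envelope_loss - internal_gains - solar_gains
print(f"envelope loss: {envelope_loss:.0f} W, net design load: {net_load:.0f} W")
# Ignoring the gains oversizes the system by roughly 20 % in this example.
```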

  14. Photomask pattern viewer and analyzer: HOTSCOPE

    Science.gov (United States)

    Narukawa, Shogo; Yamasaki, Kiyoshi; Machiya, Yuji; Hayashi, Naoya

    2005-06-01

    Recently, photomask pattern features have diverged from LSI layout pattern features because of OPC processing and CMP dummy pattern insertion, and photomask pattern data volumes have become very large compared with LSI layout data volumes. As a result, usual JOBDECK pattern viewer software has difficulty drawing such huge pattern data smoothly and quickly. Moreover, various resolution enhancement technologies (RET) have been proposed by companies and organizations and are discussed by various societies. With RET, mask pattern features and structures become more complicated than at present, and mask difficulty and mask cost threaten to increase considerably. HOTSCOPE, the photomask pattern viewer and analyzer we developed, is not only a high-speed viewer: it can also superpose patterns in other mask formats and in GDS2 format for comparison, handling pattern magnification and mirror processing by itself. HOTSCOPE fully incorporates the functions required by mask manufacturers, such as mask planning, JOBDECK preparation, and mask pattern analysis.

  15. Analyzing dialect variation in historical speech corpora.

    Science.gov (United States)

    Renwick, Margaret E L; Olsen, Rachel M

    2017-07-01

    The Linguistic Atlas of the Gulf States is an extensive audio corpus of sociolinguistic interviews with 1121 speakers from eight southeastern U.S. states. Complete interviews have never been fully transcribed, leaving a wealth of phonetic information unexplored. This paper details methods for large-scale acoustic analysis of this historical speech corpus, providing a fuller picture of Southern speech than offered by previous impressionistic analyses. Interviews from 10 speakers (∼36 h) in southeast Georgia were transcribed and analyzed for dialectal features associated with the Southern Vowel Shift and African American Vowel Shift, also considering the effects of age, gender, and race. Multiple tokens of common words were annotated (N = 6085), and formant values of their stressed vowels were extracted. The effects of shifting on relative vowel placement were evaluated via Pillai scores, and vowel dynamics were estimated via functional data analysis and modeled with linear mixed-effects regression. Results indicate that European American speakers show features of the Southern Vowel Shift, though certain speakers shift in more ways than others, and African American speakers' productions are consistent with the African American Vowel Shift. Wide variation is apparent, even within this small geographic region, contributing evidence of the complexity of Southern speech.
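
    A small sketch of the Pillai score used above to gauge vowel overlap, computed directly from between- and within-group scatter matrices on mock F1/F2 data (0 means merged categories, 1 fully distinct). The formant values are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

# Mock F1/F2 measurements (Hz) for two vowel classes of one speaker.
v1 = rng.normal([600, 1800], [60, 120], size=(40, 2))
v2 = rng.normal([650, 1650], [60, 120], size=(40, 2))

def pillai(a, b):
    """Pillai's trace: trace(B @ inv(B + W)) from between/within scatter."""
    grand = np.vstack([a, b]).mean(axis=0)
    B = sum(len(g) * np.outer(g.mean(0) - grand, g.mean(0) - grand)
            for g in (a, b))
    W = sum((g - g.mean(0)).T @ (g - g.mean(0)) for g in (a, b))
    return np.trace(B @ np.linalg.inv(B + W))

print(f"Pillai score = {pillai(v1, v2):.3f}")
```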

  16. A Method for Analyzing Volunteered Geographic Information ...

    Science.gov (United States)

    Volunteered geographic information (VGI) can be used to identify public valuation of ecosystem services in a defined geographic area, using photos as a representation of lived experiences. This method can help researchers better survey and report on the values and preferences of stakeholders involved in rehabilitation and revitalization projects. Current research utilizes VGI in the form of geotagged social media photos from three platforms: Flickr, Instagram, and Panoramio. Social media photos have been obtained for the neighborhoods next to the St. Louis River in Duluth, Minnesota, and are being analyzed along several dimensions: the spatial distribution of each platform, the characteristics of the physical environment portrayed in the photos, and the ecosystem service depicted. In this poster, we focus on the photos from the Irving and Fairmount neighborhoods of Duluth, MN to demonstrate the method at the neighborhood scale. This study demonstrates a method for translating the values expressed in social media photos into ecosystem services and spatially explicit data to be used in multiple settings, including the City of Duluth's comprehensive planning and community revitalization efforts, habitat restoration in a Great Lakes Area of Concern, and the USEPA's Office of Research and Development.

  17. PSAIA – Protein Structure and Interaction Analyzer

    Directory of Open Access Journals (Sweden)

    Vlahoviček Kristian

    2008-04-01

    Background: PSAIA (Protein Structure and Interaction Analyzer) was developed to compute geometric parameters for large sets of protein structures in order to predict and investigate protein-protein interaction sites. Results: In addition to the most relevant established algorithms, PSAIA offers a new method, PIADA (Protein Interaction Atom Distance Algorithm), for the determination of residue interaction pairs. We found that PIADA produced more satisfactory results than comparable algorithms implemented in PSAIA. Particular advantages of PSAIA include its capacity to combine different methods to detect the locations and types of interactions between residues, and its ability, without any further automation steps, to handle large numbers of protein structures and complexes. Generally, the integration of a variety of methods enables PSAIA to offer easier automation of analysis and greater reliability of results. PSAIA can be used either via a graphical user interface or from the command line, and results are generated in either tabular or XML format. Conclusion: In a straightforward fashion and for large sets of protein structures, PSAIA enables the calculation of protein geometric parameters and the determination of the location and type of protein-protein interaction sites. XML-formatted output enables easy conversion of results to various formats suitable for statistical analysis. Results from smaller data sets demonstrated the influence of geometry on protein interaction sites, and comprehensive analysis of the properties of large data sets leads to new information useful in the prediction of protein-protein interaction sites.
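
    A simplified sketch of the atom-distance idea behind determining residue interaction pairs (not the published PIADA algorithm itself): two residues count as an interaction pair when their closest atoms fall under a cutoff. The coordinates and the 4.5 Å threshold are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy atom coordinates for two chains: {residue_id: (n_atoms, 3) array}.
chain_a = {i: rng.uniform(0, 30, size=(5, 3)) for i in range(10)}
chain_b = {j: rng.uniform(0, 30, size=(5, 3)) for j in range(10)}

CUTOFF = 4.5   # angstroms; a typical atom-contact threshold (assumed)

def min_atom_distance(res1, res2):
    """Smallest pairwise atom-atom distance between two residues."""
    d = np.linalg.norm(res1[:, None, :] - res2[None, :, :], axis=-1)
    return d.min()

pairs = [(i, j) for i, ra in chain_a.items() for j, rb in chain_b.items()
         if min_atom_distance(ra, rb) < CUTOFF]
print(f"{len(pairs)} interacting residue pairs:", pairs[:5])
```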

  18. Analyzing organizational practices in local health departments.

    Science.gov (United States)

    Studnicki, J; Steverson, B; Blais, H N; Goley, E; Richards, T B; Thornton, J N

    1994-01-01

    Few researchers have examined the problem of comparing the performances of local health departments. A contributing factor is the lack of a uniform method for describing the range of public health activities. The Centers for Disease Control and Prevention's Public Health Practice Program Office has identified 10 organizational practices that may be used to assure that the core functions of public health are being carried out at a local health department. The researchers determined the percentage of time devoted to each of the 10 practices by individual employees at a local public health unit in Tampa, FL. They identified the manpower expenditures and hours allocated to each of the 10 practices within the major program divisions of the unit. They found that the largest portion of manpower resources was allocated to implementing programs. A much smaller fraction of agency resources was devoted to analysis of the health needs of the community and to the development of plans and policies. Together, primary care and communicable disease programs accounted for fully three-quarters of the resources, environmental health for 11 percent, and administrative support services for 13 percent. With continuing refinement and modification, the methodology could provide a highly effective basis for describing and analyzing the activities and performances of local health departments.

  19. Analyzing Spatiotemporal Anomalies through Interactive Visualization

    Directory of Open Access Journals (Sweden)

    Tao Zhang

    2014-06-01

    As we move into the big data era, data grows not just in size but also in complexity, containing a rich set of attributes including location and time information, as in data from mobile devices (e.g., smart phones), natural disasters (e.g., earthquakes and hurricanes), epidemic spread, etc. Motivated by this rising challenge, we built a visualization tool for exploring generic spatiotemporal data, i.e., records containing time and location information and numeric attribute values. Since the values often evolve over time and across geographic regions, we are particularly interested in detecting and analyzing anomalous changes over time and space. Our analytic tool is based on a geographic information system and is combined with spatiotemporal data mining algorithms as well as various data visualization techniques, such as anomaly grids and anomaly bars superimposed on the map. We study how effectively the tool may guide users to find potential anomalies by demonstrating and evaluating it over publicly available spatiotemporal datasets. The tool for spatiotemporal anomaly analysis and visualization is useful in many domains, such as security investigation and monitoring, situation awareness, etc.
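
    A minimal sketch of one way to produce "anomaly grid" values (an assumed method, not necessarily the paper's): z-score the latest time slice of each grid cell against that cell's own history. The counts are synthetic, with one anomaly injected.

```python
import numpy as np

rng = np.random.default_rng(11)

# Weekly event counts per grid cell: (weeks, rows, cols).
counts = rng.poisson(5, size=(52, 8, 8)).astype(float)
counts[-1, 2, 3] = 40          # inject a spatiotemporal anomaly

# z-score of the latest week against each cell's own history; cells with
# large |z| would be rendered as anomaly grids on the map.
mu = counts[:-1].mean(axis=0)
sd = counts[:-1].std(axis=0) + 1e-9
z = (counts[-1] - mu) / sd
rows, cols = np.where(np.abs(z) > 3)
print(list(zip(rows.tolist(), cols.tolist())))   # should flag (2, 3)
```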

  20. Solid Hydrogen Particles Analyzed for Atomic Fuels

    Science.gov (United States)

    Palaszewski, Bryan A.

    2001-01-01

    Solid hydrogen particles have been selected as a means of storing atomic propellants in future launch vehicles (refs. 1 to 2). In preparation for this, hydrogen particle formation in liquid helium was tested experimentally. These experiments were conducted to visually characterize the particles and to observe their formation and molecular transformations (aging) while in liquid helium. The particle sizes, molecular transformations, and agglomeration times were estimated from video image analyses. The experiments were conducted at the NASA Glenn Research Center in the Supplemental Multilayer Insulation Research Facility (SMIRF, ref. 3). The facility has a vacuum tank, into which the experimental setup was placed. The vacuum tank prevented heat leaks and subsequent boiloff of the liquid helium, and the supporting systems maintained the temperature and pressure of the liquid helium bath where the solid particles were created. As the operation of the apparatus was developed, the hydrogen particles were easily visualized. The figures (ref. 1) show images from the experimental runs. The first image shows the initial particle freezing, and the second image shows the particles after the small particles have agglomerated. The particles finally all clump, but stick together loosely. The solid particles tended to agglomerate within a maximum of 11 min, and the agglomerate was very weak. Because the hydrogen particles are buoyant in the helium, the agglomerate tends to compact itself into a flat pancake on the surface of the helium. This pancake agglomerate is easily broken apart by reducing the pressure above the liquid. The weak agglomerate implies that the particles can be used as a gelling agent for the liquid helium, as well as a storage medium for atomic boron, carbon, or hydrogen. The smallest particle sizes that resulted from the initial freezing experiments were about 1.8 mm. About 50 percent of the particles formed were between 1.8 to 4.6 mm in diameter. These very

  1. Corrosion studies by use of the thermogravimetric analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Brinker, G.M.

    1995-11-01

    A series of tests has been performed using the thermogravimetric analyzer. Weight gain vs. time graphs were generated by exposing 1 in. x 2 in. x 0.65 in. low-carbon (1020) steel specimens to a range of relative humidities (65%-90%) and temperatures (50-70°C). Data collected from these studies give insight into both the kinetics of oxide formation and the material's critical relative humidity. Two separate rates and mechanisms of oxide formation have been observed: it is believed that dry oxidation is prevalent at low relative humidities, while aqueous electrochemical corrosion persists at high relative humidities. The relative humidities and temperatures at which oxide formation transforms from one rate and mechanism to the other are of interest. The critical relative humidity is defined as the relative humidity at which oxide formation becomes highly accelerated with respect to its normal growth rate. A better understanding of 1020 steel's oxide formation kinetics and the alloy's critical relative humidity will aid in waste package designs for the proposed nuclear waste containment center at Yucca Mountain.
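
    A small sketch of how weight-gain curves like these can be screened for mechanism: fit the parabolic law w² = kp·t, typical of diffusion-controlled dry oxidation, against the linear law w = kl·t, typical when aqueous electrochemical attack dominates. The data values below are invented; comparing which law fits better across humidities helps locate the transition near the critical relative humidity.

```python
import numpy as np

# Illustrative weight-gain data (mg/cm^2) vs time (h) for a 1020 steel coupon.
t = np.array([1, 2, 4, 8, 16, 24], dtype=float)
w = np.array([0.11, 0.16, 0.22, 0.31, 0.45, 0.55])

# Parabolic rate law (dry, diffusion-controlled oxidation): w^2 = kp * t.
kp = np.polyfit(t, w**2, 1)[0]
print(f"kp = {kp:.4f} (mg/cm^2)^2 per h")

# Linear rate law (aqueous electrochemical corrosion): w = kl * t.
kl = np.polyfit(t, w, 1)[0]
print(f"kl = {kl:.4f} mg/cm^2 per h")
```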

  2. Analyzing personalized policies for online biometric verification.

    Science.gov (United States)

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
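
    A compact sketch of the likelihood-ratio decision and the two-stage acquisition idea, assuming Gaussian score models for genuine and imposter comparisons; all parameters and thresholds below are invented, not fitted to the India program data (the paper builds a joint model over all 12 scores).

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Assumed per-modality similarity-score models.
mu_g, sd_g = 0.8, 0.10      # genuine scores (illustrative)
mu_i, sd_i = 0.3, 0.15      # imposter scores (illustrative)

def log_lr(scores):
    """Sum of per-score log-likelihood ratios (scores treated as independent)."""
    return np.sum(norm.logpdf(scores, mu_g, sd_g)
                  - norm.logpdf(scores, mu_i, sd_i))

threshold = 0.0             # in practice tuned to meet the FAR constraint

# Two-stage policy sketch: start with 2 scores; acquire 2 more if unclear.
llr = log_lr(rng.normal(mu_g, sd_g, size=2))
if abs(llr) < 5.0:                          # inconclusive band (assumed)
    llr += log_lr(rng.normal(mu_g, sd_g, size=2))
print("accept" if llr > threshold else "reject", f"(log-LR = {llr:.1f})")
```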

  3. Analyzing personalized policies for online biometric verification.

    Directory of Open Access Journals (Sweden)

    Apaar Sadhwani

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.

  4. Novel topological descriptors for analyzing biological networks

    Directory of Open Access Journals (Sweden)

    Varmuza Kurt K

    2010-06-01

    Background: Topological descriptors, other graph measures, and, in a broader sense, graph-theoretical methods have been proven powerful tools for performing biological network analysis. However, the majority of the developed descriptors and graph-theoretical methods cannot take vertex and edge labels into account, e.g., atom and bond types when considering molecular graphs. This feature is important for characterizing biological networks more meaningfully instead of considering only pure topological information. Results: In this paper, we put the emphasis on analyzing a special type of biological network, namely bio-chemical structures. First, we derive entropic measures to calculate the information content of vertex- and edge-labeled graphs and investigate some useful properties thereof. Second, we apply these measures, combined with other well-known descriptors, to supervised machine learning methods for predicting Ames mutagenicity. Moreover, we investigate the influence of our topological descriptors (measures for unlabeled vs. labeled graphs) on the prediction performance of the underlying graph classification problem. Conclusions: Our study demonstrates that applying entropic measures to molecules represented as graphs is useful for characterizing such structures meaningfully. For instance, we found that extending the measures for determining the structural information content of unlabeled graphs to labeled graphs increases the uniqueness of the resulting indices. Because measures to structurally characterize labeled graphs are clearly underrepresented so far, the further development of such methods might be valuable and fruitful for solving problems within biological network analysis.
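
    A minimal sketch of the point about labels: the Shannon entropy of a vertex-label distribution is zero when all labels are identical (the unlabeled view) and positive when atom types are kept, so label-aware measures carry extra discriminating information. The molecule is a toy example.

```python
import math
from collections import Counter

def label_entropy(labels):
    """Shannon entropy (bits) of a graph's vertex-label distribution."""
    counts = Counter(labels)
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Heavy-atom skeleton of acetic acid, with and without atom labels.
unlabeled = ["*", "*", "*", "*"]            # all vertices identical
labeled   = ["C", "C", "O", "O"]            # atom types kept
print(label_entropy(unlabeled), label_entropy(labeled))   # 0.0 vs 1.0
```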

  5. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.

    2015-12-01

    Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for comprehensive and innovative evaluation of climate models, with the synergistic use of global satellite observations, in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based, multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally and data intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and have implemented it as a web-service-based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA) and is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, there are three key technology components: (1) a diagnostic analysis methodology; (2) web-service-based, cloud-enabled technology; and (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution functions, conditional sampling, and time-lagged correlation maps. We have implemented the
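
    A small sketch of one of the named diagnostics, a time-lagged correlation: scan lags between two series and report where the coupling peaks. The series below are mock data with a known 3-step lag, not satellite or CMIP5 output.

```python
import numpy as np

rng = np.random.default_rng(9)

# Two mock monthly climate series where y lags x by 3 time steps.
x = rng.normal(size=240)
y = np.roll(x, 3) + rng.normal(0, 0.5, size=240)

def lagged_corr(x, y, max_lag=12):
    """Pearson correlation of x(t) with y(t + lag) for each lag."""
    return {lag: np.corrcoef(x[:-lag or None], y[lag:])[0, 1]
            for lag in range(max_lag + 1)}

corrs = lagged_corr(x, y)
best = max(corrs, key=lambda k: abs(corrs[k]))
print(f"strongest coupling at lag {best} (r = {corrs[best]:.2f})")
```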

  6. Modeling and Analyzing Academic Researcher Behavior

    Directory of Open Access Journals (Sweden)

    Phuc Huu Nguyen

    2016-12-01

    Full Text Available Abstract. This paper suggests a theoretical framework for analyzing the mechanism of the behavior of academic researchers whose interests are tangled and vary widely across academic factors (the intrinsic satisfaction in conducting research, the improvement in individual research ability, etc.) and non-academic factors (career rewards, financial rewards, etc.). Furthermore, each researcher also has a different academic stance in his/her preferences regarding academic freedom and academic entrepreneurship. Understanding the behavior of academic researchers will contribute to nurturing young researchers, improving the standard of research and education, and boosting academia-industry collaboration. In particular, as open innovation is increasingly in need of the involvement of university researchers, to establish a successful approach to enticing researchers into enterprises' research, companies must comprehend the behavior of university researchers, who have multiple complex motivations. The paper explores academic researchers' behaviors by optimizing their utility functions, i.e. the satisfaction obtained from their research outputs. This paper characterizes these outputs as the results of researchers' 3C: Competence (the ability to implement the research), Commitment (the effort to do the research), and Contribution (finding meaning in the research). Most previous research utilized empirical methods to study researchers' motivation. Without adopting economic theory into the analysis, the past literature could not offer a deeper understanding of researchers' behavior. Our contribution is important both conceptually and practically because it provides the first theoretical framework to study the mechanism of researchers' behavior. Keywords: Academia-Industry, researcher behavior, Ulrich model's 3C.

  7. Analyzing the attributes of Indiana's STEM schools

    Science.gov (United States)

    Eltz, Jeremy

    "Primary and secondary schools do not seem able to produce enough students with the interest, motivation, knowledge, and skills they will need to compete and prosper in the emerging world" (National Academy of Sciences [NAS], 2007a, p. 94). This quote indicated that there are changing expectations for today's students which have ultimately led to new models of education, such as charters, online and blended programs, career and technical centers, and for the purposes of this research, STEM schools. STEM education as defined in this study is a non-traditional model of teaching and learning intended to "equip them [students] with critical thinking, problem solving, creative and collaborative skills, and ultimately establishes connections between the school, work place, community and the global economy" (Science Foundation Arizona, 2014, p. 1). Focusing on science, technology, engineering, and math (STEM) education is believed by many educational stakeholders to be the solution for the deficits many students hold as they move on to college and careers. The National Governors Association (NGA; 2011) believes that building STEM skills in the nation's students will lead to the ability to compete globally with a new workforce that has the capacity to innovate and will in turn spur economic growth. In order to accomplish the STEM model of education, a group of educators and business leaders from Indiana developed a comprehensive plan for STEM education as an option for schools to use in order to close this gap. This plan has been promoted by the Indiana Department of Education (IDOE, 2014a) with the goal of increasing STEM schools throughout Indiana. To determine what Indiana's elementary STEM schools are doing, this study analyzed two of the elementary schools that were certified STEM by the IDOE. This qualitative case study described the findings and themes from two elementary STEM schools. Specifically, the research looked at the vital components to accomplish STEM

  8. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2013-12-01

    The latest Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with newly available global observations. The traditional approach to climate model evaluation, which compares a single parameter at a time, identifies symptomatic model biases and errors but fails to diagnose the model problems. The model diagnosis process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. To address these challenges, we are developing a parallel, distributed web-service system that enables the physics-based multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation and (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, and (4) the calculation of difference between two variables. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use
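
    The Python-wrapper approach described above can be illustrated with a minimal Flask sketch; the endpoint, routine, and parameters here are hypothetical stand-ins, not CMDA's actual interface:

    ```python
    from flask import Flask, jsonify, request
    import numpy as np

    app = Flask(__name__)

    def seasonal_mean(data, months):
        """Stand-in for an existing science routine being wrapped."""
        return float(np.mean(data[months]))

    @app.route("/seasonal-mean")
    def seasonal_mean_service():
        # Hypothetical query parameter, e.g. /seasonal-mean?months=0,1,2
        months = [int(m) for m in request.args.get("months", "0,1,2").split(",")]
        data = np.arange(12.0)  # placeholder for a real dataset lookup
        return jsonify({"mean": seasonal_mean(data, months)})

    if __name__ == "__main__":
        app.run()  # in production, served by Gunicorn as the abstract notes
    ```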

  9. The Albuquerque Seismological Laboratory Data Quality Analyzer

    Science.gov (United States)

    Ringler, A. T.; Hagerty, M.; Holland, J.; Gee, L. S.; Wilson, D.

    2013-12-01

    The U.S. Geological Survey's Albuquerque Seismological Laboratory (ASL) has several efforts underway to improve data quality at its stations. The Data Quality Analyzer (DQA) is one such development. The DQA is designed to characterize station data quality in a quantitative and automated manner. Station quality is based on the evaluation of various metrics, such as timing quality, noise levels, sensor coherence, and so on. These metrics are aggregated into a measurable grade for each station. The DQA consists of a website, a metric calculator (Seedscan), and a PostgreSQL database. The website allows the user to make requests for various time periods, review specific networks and stations, adjust weighting of the station's grade, and plot metrics as a function of time. The website dynamically loads all station data from a PostgreSQL database. The database is central to the application; it acts as a hub where metric values and limited station descriptions are stored. Data is stored at the level of one sensor's channel per day. The database is populated by Seedscan. Seedscan reads and processes miniSEED data to generate metric values. Seedscan, written in Java, compares hashes of metadata and data to detect changes and perform subsequent recalculations. This ensures that the metric values are up to date and accurate. Seedscan can be run as a scheduled task or on demand, computing the metrics specified in its configuration file. While many metrics are currently in development, some are completed and being actively used. These include: availability, timing quality, gap count, deviation from the New Low Noise Model, deviation from a station's noise baseline, inter-sensor coherence, and data-synthetic fits. In all, 20 metrics are planned, but any number could be added. ASL is actively using the DQA on a daily basis for station diagnostics and evaluation. As Seedscan is scheduled to run every night, data quality analysts are able to then use the
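
    The aggregation of per-metric results into a single, user-adjustable station grade might look like the following sketch; the metric names, 0-100 scale, and weighting scheme are assumptions for illustration, not the DQA's actual implementation:

    ```python
    def station_grade(scores, weights):
        """Weighted average of per-metric scores (assumed 0-100); weights
        need not sum to 1, mirroring the website's adjustable weighting."""
        total_w = sum(weights[m] for m in scores)
        return sum(scores[m] * weights[m] for m in scores) / total_w

    scores  = {"availability": 99.2, "timing_quality": 95.0, "gap_count": 88.5}
    weights = {"availability": 2.0,  "timing_quality": 1.0,  "gap_count": 1.0}
    print(round(station_grade(scores, weights), 1))  # 95.5
    ```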

  10. Technology for collecting and analyzing relational data

    Directory of Open Access Journals (Sweden)

    E. N. Fedorova

    2016-01-01

    summarize the information there is a mechanism of data grouping, which provides summary data on the number of entries and the maximum, minimum, and average values for different groups of records.Results. This technology has been tested in monitoring the requirements for services of additional professional education and in determining the educational needs of teachers and executives of educational organizations of the Irkutsk region. The survey involved 2,780 respondents in 36 municipalities. Creating the data model took several hours. The survey was conducted over the course of a month.Conclusion. The proposed technology makes it possible to collect information in relational form in a short time and then analyze it without programming, with flexible assignment of the operating logic for each form.
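
    The grouping mechanism described above corresponds to a standard relational aggregation; a hypothetical sketch in Python with pandas (column names invented for illustration):

    ```python
    import pandas as pd

    # Hypothetical survey records of the kind the technology collects.
    df = pd.DataFrame({
        "municipality": ["A", "A", "B", "B", "B"],
        "respondent_age": [34, 45, 29, 51, 40],
    })

    # Count, max, min, and mean per group, as the abstract describes.
    summary = df.groupby("municipality")["respondent_age"].agg(
        ["count", "max", "min", "mean"]
    )
    print(summary)
    ```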

  11. analyzers in overweight/obese renal patients

    Directory of Open Access Journals (Sweden)

    Mariusz Kusztal

    2015-05-01

    Full Text Available Bioelectrical impedance analysis (BIA) is an affordable, non-invasive and fast alternative method to assess body composition. The purpose of this study was to compare two different tetrapolar BIA devices for estimating body fluid volumes and body cell mass (BCM) in a clinical setting among patients with kidney failure. All double measurements were performed by multi-frequency (MF) and single-frequency (SF) BIA analyzers: a Body Composition Monitor (Fresenius Medical Care, Germany) and a BIA-101 (Akern, Italy), respectively. All procedures were conducted according to the manufacturers' instructions (dedicated electrodes, measurement sites, positions, etc.). Total body water (TBW), extracellular water (ECW), intracellular water (ICW) and BCM were compared. The study included 39 chronic kidney disease patients (stage III-V) with a mean age of 45.8 ± 8 years (21 men and 18 women) who had a wide range of BMI (17-34 kg/m2; mean 26.6 ± 5). A comparison of results from patients with BMI <25 vs ≥25 revealed a significant discrepancy in measurements between the two BIA devices. Namely, in the group with BMI <25 (n=16), acceptable correlations were obtained in TBW (r=0.99; p<0.01), ICW (r=0.92; p<0.01), BCM (r=0.68; p<0.01), and ECW (r=0.96; p<0.05), but those with BMI ≥25 (n=23) showed a discrepancy (lower correlations) in TBW (r=0.82; p<0.05), ICW (r=0.78; p<0.05), BCM (r=0.52; p<0.05), and ECW (r=0.76; p<0.01). Since estimates of TBW, ICW and BCM by the present BIA devices do not differ in patients with BMI <25, they might be interchangeable. This does not hold true for overweight/obese renal patients.

  12. Analyzers Measure Greenhouse Gases, Airborne Pollutants

    Science.gov (United States)

    2012-01-01

    In complete darkness, a NASA observatory waits. When an eruption of boiling water billows from a nearby crack in the ground, the observatory's sensors seek particles in the fluid, measure shifts in carbon isotopes, and analyze samples for biological signatures. NASA has landed the observatory in this remote location, far removed from air and sunlight, to find life unlike any that scientists have ever seen. It might sound like a scene from a distant planet, but this NASA mission is actually exploring an ocean floor right here on Earth. NASA established a formal exobiology program in 1960, which expanded into the present-day Astrobiology Program. The program, which celebrated its 50th anniversary in 2010, not only explores the possibility of life elsewhere in the universe, but also examines how life begins and evolves, and what the future may hold for life on Earth and other planets. Answers to these questions may be found not only by launching rockets skyward, but by sending probes in the opposite direction. Research here on Earth can revise prevailing concepts of life and biochemistry and point to the possibilities for life on other planets, as was demonstrated in December 2010, when NASA researchers discovered microbes in Mono Lake in California that subsist and reproduce using arsenic, a toxic chemical. The Mono Lake discovery may be the first of many that could reveal possible models for extraterrestrial life. One primary area of interest for NASA astrobiologists lies with the hydrothermal vents on the ocean floor. These vents expel jets of water heated and enriched with chemicals from off-gassing magma below the Earth's crust. Also potentially within the vents: microbes that, like the Mono Lake microorganisms, defy the common characteristics of life on Earth. "Basically all organisms on our planet generate energy through the Krebs Cycle," explains Mike Flynn, research scientist at NASA's Ames Research Center. This metabolic process breaks down sugars for energy

  13. ICAN - INTEGRATED COMPOSITE ANALYZER (IBM PC VERSION)

    Science.gov (United States)

    Murthy, P. L.

    1994-01-01

    The Integrated Composite Analyzer (ICAN) is a computer program designed to carry out a comprehensive linear analysis of multilayered fiber composites. The analysis contains the essential features required to effectively design structural components made from fiber composites. ICAN includes the micromechanical design features of the Intraply Hybrid Composite Design (INHYD) program to predict ply level hygral, thermal, and mechanical properties. The laminate analysis features of the Multilayered Filamentary Composite Analysis (MFCA) program are included to account for interply layer effects. ICAN integrates these and additional features to provide a comprehensive analysis capability for composite structures. Additional features unique to ICAN include the following: 1) ply stress-strain influence coefficients, 2) microstresses and microstrain influence coefficients, 3) concentration factors around a circular hole, 4) calculation of probable delamination locations around a circular hole, 5) Poisson's ratio mismatch details near a straight edge, 6) free-edge stresses, 7) material card input for finite element analysis using NASTRAN (available separately from COSMIC) or MARC, 8) failure loads based on maximum stress criterion, and laminate failure stresses based on first-ply failures and fiber breakage criteria, 9) transverse shear stresses, normal and interlaminar stresses, and 10) durability/fatigue type analyses for thermal as well as mechanical cyclic loads. The code can currently assess degradation due to mechanical and thermal cyclic loads with or without a defect. ICAN includes a dedicated data bank of constituent material properties, and allows the user to build a database of material properties of commonly used fibers and matrices so the user need only specify code names for constituents. Input to ICAN includes constituent material properties (or code names), factors reflecting the fabrication process, and composite geometry. ICAN performs micromechanics

  14. ICAN - INTEGRATED COMPOSITE ANALYZER (IBM 370 VERSION)

    Science.gov (United States)

    Murthy, P. L.

    1994-01-01

    The Integrated Composite Analyzer (ICAN) is a computer program designed to carry out a comprehensive linear analysis of multilayered fiber composites. The analysis contains the essential features required to effectively design structural components made from fiber composites. ICAN includes the micromechanical design features of the Intraply Hybrid Composite Design (INHYD) program to predict ply level hygral, thermal, and mechanical properties. The laminate analysis features of the Multilayered Filamentary Composite Analysis (MFCA) program are included to account for interply layer effects. ICAN integrates these and additional features to provide a comprehensive analysis capability for composite structures. Additional features unique to ICAN include the following: 1) ply stress-strain influence coefficients, 2) microstresses and microstrain influence coefficients, 3) concentration factors around a circular hole, 4) calculation of probable delamination locations around a circular hole, 5) Poisson's ratio mismatch details near a straight edge, 6) free-edge stresses, 7) material card input for finite element analysis using NASTRAN (available separately from COSMIC) or MARC, 8) failure loads based on maximum stress criterion, and laminate failure stresses based on first-ply failures and fiber breakage criteria, 9) transverse shear stresses, normal and interlaminar stresses, and 10) durability/fatigue type analyses for thermal as well as mechanical cyclic loads. The code can currently assess degradation due to mechanical and thermal cyclic loads with or without a defect. ICAN includes a dedicated data bank of constituent material properties, and allows the user to build a database of material properties of commonly used fibers and matrices so the user need only specify code names for constituents. Input to ICAN includes constituent material properties (or code names), factors reflecting the fabrication process, and composite geometry. ICAN performs micromechanics

  15. Core Outlet Temperature Study

    Energy Technology Data Exchange (ETDEWEB)

    Moisseytsev, A. [Argonne National Laboratory (ANL), Argonne, IL (United States); Hoffman, E. [Argonne National Laboratory (ANL), Argonne, IL (United States); Majumdar, S. [Argonne National Laboratory (ANL), Argonne, IL (United States)

    2008-07-28

    Power conversion plant efficiency is known to increase with the heat addition temperature. Higher efficiency means better utilization of the available resources: a higher output in terms of electricity production can be achieved for the same size and power of the reactor core or, alternatively, a lower-power core could be used to produce the same electrical output. Since any nuclear power plant, such as the Advanced Burner Reactor, is ultimately built to produce electricity, a higher electrical output is always desirable. However, the benefits of higher efficiency and electricity production usually come at a price. Both the benefits and the disadvantages of higher reactor outlet temperatures are analyzed in this work.
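
    As a hedged illustration of why efficiency rises with the heat-addition temperature, consider the ideal Carnot bound; this shows only the trend, not the actual plant model analyzed in the report:

    ```latex
    % Ideal-cycle bound: efficiency rises monotonically with the
    % heat-addition temperature T_H for a fixed rejection temperature T_C.
    \eta_{\mathrm{Carnot}} = 1 - \frac{T_C}{T_H},
    \qquad
    \frac{\partial \eta_{\mathrm{Carnot}}}{\partial T_H} = \frac{T_C}{T_H^{2}} > 0
    ```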

  16. [Preliminary design for a VI system combining the voice acoustic analyzing and glottal image analyzing].

    Science.gov (United States)

    Pan, Yan; Liu, Yan; Cai, Xiaolan; Li, Qiao; Meng, Yan; Xu, Xin; Sun, Wenhong; Zhang, Yuhua; Li, Xin; Qi, Yan

    2008-04-01

    This work is directed at developing a virtual instrument system as an auxiliary diagnostic instrument for laryngeal diseases. Programmed with LabWindows/CVI, the system combines a voice acoustic analyzing function with a glottal image measuring function. The voice acoustic analyzing system can sample, store and replay vocal signals; can extract and analyze parameters including fundamental frequency (F0), frequency perturbation quotient (FPQ), amplitude perturbation quotient (APQ), harmonics-to-noise ratio (HNR), jitter frequency (JF) and shimmer; and can perform 3D sound graph analysis. The glottal image analyzing system can sample and store the image observed by the laryngostroboscope; can display any phase in one cycle of the vibration of the vocal cords, or a slow and continuous movement of the vibrating vocal cords; can snap and save the diagnostic frame of the image; and can extract parameters of the image such as the length and area of the glottis, the length and area of the vocal cords, and the diseased part.
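
    Among the perturbation parameters listed, jitter is representative; below is a minimal sketch of one common definition (mean absolute difference of consecutive pitch periods relative to the mean period), though the system's exact formula may differ:

    ```python
    def jitter_percent(periods):
        """Local jitter: mean absolute difference of consecutive pitch
        periods, normalized by the mean period. Input in seconds."""
        diffs = [abs(a - b) for a, b in zip(periods, periods[1:])]
        mean_period = sum(periods) / len(periods)
        return 100.0 * (sum(diffs) / len(diffs)) / mean_period

    # Pitch periods near 5 ms (F0 ~ 200 Hz) with slight perturbation.
    print(round(jitter_percent([0.0050, 0.0051, 0.0049, 0.0050]), 2))  # 2.67
    ```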

  17. Residential Indoor Temperature Study

    Energy Technology Data Exchange (ETDEWEB)

    Booten, Chuck [National Renewable Energy Lab. (NREL), Golden, CO (United States); Robertson, Joseph [National Renewable Energy Lab. (NREL), Golden, CO (United States); Christensen, Dane [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heaney, Mike [Arrow Electronics, Centennial, CO (United States); Brown, David [Univ. of Virginia, Charlottesville, VA (United States); Norton, Paul [Norton Energy Research and Development, Boulder, CO (United States); Smith, Chris [Ingersoll-Rand Corp., Dublin (Ireland)

    2017-04-07

    In this study, we add to the body of knowledge around the question: what are good assumptions for HVAC set points in U.S. homes? We collected and analyzed indoor temperature data from U.S. homes with funding from the U.S. Department of Energy's Building America (BA) program, which relies on accurate energy simulation of homes. Simulations are used to set Building America goals, predict the impact of new building techniques and technologies, inform research objectives, evaluate home performance, optimize efficiency packages to meet savings goals, customize savings approaches to specific climate zones, and serve myriad other uses.

  18. CHIP MORPHOLOGY AND HOLE SURFACE TEXTURE IN THE DRILLING OF CAST ALUMINUM ALLOYS. (R825370C057)

    Science.gov (United States)

    The effects of cutting fluid and other process variables on chip morphology when drilling cast aluminium alloys are investigated. The effects of workpiece material, speed, feed, hole depth, cutting-fluid presence and percentage oil concentration, workpiece temperature, drill t...

  19. AN EXPERIMENTAL STUDY OF CUTTING FLUID EFFECTS IN DRILLING. (R825370C057)

    Science.gov (United States)

    Experiments were designed and conducted on aluminum alloys and gray cast iron to determine the function of cutting fluid in drilling. The variables examined included speed, feed, hole depth, tool and workpiece material, cutting fluid condition, workpiece temperatures and drill...

  20. Solid Hydrogen Particles and Flow Rates Analyzed for Atomic Fuels

    Science.gov (United States)

    Palaszewski, Bryan A.

    2003-01-01

    The experiments were conducted at Glenn's Small Multipurpose Research Facility (SMIRF, ref. 5). The experimental setup was placed in the facility's vacuum tank to prevent heat leaks and subsequent boiloff of the liquid helium. Supporting systems maintained the temperature and pressure of the liquid helium bath where the solid particles were created. Solid hydrogen particle formation was tested from February 23 to April 2, 2001. Millimeter-sized solid-hydrogen particles were formed in a Dewar of liquid helium as a prelude to creating atomic fuels and propellants for aerospace vehicles. Atomic fuels or propellants are created when atomic boron, carbon, or hydrogen is stored in solid hydrogen particles. The current testing characterized the solid hydrogen particles without the atomic species, as a first step to creating a feed system for the atomic fuels and propellants. This testing did not create atomic species, but only sought to understand the solid hydrogen particle formation and behavior in the liquid helium. In these tests, video images of the solid particle formation were recorded, and the total mass flow rate of the hydrogen was measured. The mass of hydrogen that went into the gaseous phase was also recorded using a commercially available residual gas analyzer. The temperatures, pressures, and flow rates of the liquids and gases in the test apparatus were recorded as well. Testing conducted in 1999 recorded particles as small as 2 to 5 mm in diameter. The current testing extended the testing conditions to a very cold Dewar ullage gas of about 20 to 90 K above the 4 K liquid helium. With the very cold Dewar gas, the hydrogen freezing process took on new dimensions, in some cases creating particles so small that they seemed to be microscopic, appearing as infinitesimally small scintillations on the videotaped images.

  1. Calculation of the electrical parameters of induction heating coils in two-dimensional axisymmetric geometry

    Energy Technology Data Exchange (ETDEWEB)

    Nerg, J.; Partanen, J. [Lappeenranta University of Technology (Finland). Department of Energy Technology, Laboratory of Electrical Engineering

    1997-12-31

    The effect of the workpiece temperature on the electrical parameters of a plane, spiral inductor is discussed. The effects of workpiece temperature on the electrical efficiency, power transfer to the workpiece and electromagnetic distortion are also presented. The calculation is performed in two-dimensional axisymmetric geometry using a FEM program. (orig.) 5 refs.

  2. Retarding field energy analyzer for the Saskatchewan Torus-Modified plasma boundary

    Science.gov (United States)

    Dreval, M.; Rohraff, D.; Xiao, C.; Hirose, A.

    2009-10-01

    The retarding field energy analyzer (RFA) is a simple and reliable diagnostic technique to measure the ion temperature in the scrape-off layer and edge of magnetic fusion devices. Design and operation features of a single-sided (facing the ion flow) RFA for ion temperature measurements in the Saskatchewan Torus-Modified (STOR-M) tokamak are described. Its compact size (21×15×20 mm3) allows RFA measurements without significantly perturbing the plasma. Both ion and electron temperatures have been measured by the RFA in the STOR-M tokamak. A method is proposed to correct the effects of ion flow on the ion temperature using the simultaneously measured Mach number. The measured electron temperature is consistent with previously reported Langmuir probe data. Abnormal behavior of the RFA has been observed in both ion and electron modes when the RFA is inserted deep into the plasma.
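
    For context, a standard relation often used to extract the ion temperature from a retarding field characteristic is sketched below (generic Maxwellian form with generic symbols; the STOR-M analysis, including the Mach-number correction mentioned above, is more involved):

    ```latex
    % Collector current vs. retarding voltage V for a Maxwellian ion
    % population; I_0 is the ion saturation current, V_s the plasma
    % (space) potential, and T_i the ion temperature, which follows
    % from the slope of ln I versus V.
    I(V) = I_0 \exp\!\left[-\frac{e\,(V - V_s)}{k_B T_i}\right],
    \qquad V > V_s
    ```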

  3. 40 CFR 90.317 - Carbon monoxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... analyzer to optimize performance on the most sensitive range to be used. (2) Zero the carbon monoxide... monoxide analyzer. (1) Adjust the analyzer to optimize performance. (2) Zero the carbon monoxide analyzer... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Carbon monoxide analyzer calibration...

  4. Assessing corneal hysteresis using the Ocular Response Analyzer.

    Science.gov (United States)

    McMonnies, Charles W

    2012-03-01

    An examination of studies that have assessed corneal biomechanical performance using the Ocular Response Analyzer (ORA: Reichert Ophthalmic Instruments, Depew, NY) raises some questions regarding the influence of measurement variables and the interpretation of the findings obtained with this instrument. This analysis of those questions describes additional factors which do or may contribute to the assessment of corneal hysteresis (CH). Using key words CH and ORA, English language articles relevant to this analysis were selected after a PubMed search with the addition of some articles referenced in the selected publications. Corneal thickness, the level of edema, intraocular pressure, and corneal temperature as well as the area, location, rate, duration, and sequence of corneal unloading and loading may need to be considered as significant variables when assessing CH. CH values may be specific to measurement method and conditions rather than representing an unequivocal corneal property. Consideration of the uncontrolled variables involved may help explain some of the findings obtained with the ORA. That a CH measurement might vary with the sequence of unloading and loading suggests that the ORA CH finding may not represent the CH, but instead represents a hysteresis value better described as central, applanation-derived hysteresis, which is based on a very short unloading/loading sequence. The potential for the ORA to contribute to improved clinical management appears to be considerable but so does the need for better understanding and further development of its functions and applications.
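
    For reference, the ORA derives its CH value from the pressures at the two applanation events of its measurement cycle:

    ```latex
    % Corneal hysteresis from the ORA's two applanation pressures:
    % P_1 at inward applanation, P_2 at outward applanation.
    \mathrm{CH} = P_1 - P_2
    ```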

  5. Calculation of Moments from Measurements by the Los Alamos Magnetospheric Plasma Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    M. F. Thomsen; E. Noveroske; J. E. Borovsky; D. J. McComas

    1999-05-01

    The various steps involved in computing the moments (density, velocity, and temperature) of the ion and electron distributions measured with the Los Alamos Magnetospheric Plasma Analyzer (MPA) are described. The assumptions, constants, and algorithms contained in the FORTRAN code are presented, as well as the output parameters produced by the code.
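
    For orientation, the textbook velocity-space moments behind the density, velocity, and temperature are shown below; the MPA code evaluates discrete analogs of these integrals over its energy and angle bins, with the instrument-specific constants described in the report:

    ```latex
    n = \int f(\mathbf{v})\, d^{3}v, \qquad
    \mathbf{u} = \frac{1}{n}\int \mathbf{v}\, f(\mathbf{v})\, d^{3}v, \qquad
    T = \frac{m}{3 n k_B}\int |\mathbf{v}-\mathbf{u}|^{2}\, f(\mathbf{v})\, d^{3}v
    ```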

  6. 40 CFR 86.223-94 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer calibration. 86.223-94 Section 86.223-94 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... Trucks and New Medium-Duty Passenger Vehicles; Cold Temperature Test Procedures § 86.223-94 Oxides of...

  7. Experimental Study of the Temperature Distribution and Microstructure of Plunge Stage in Friction Stir Welding Process by the Tool with Triangle Pin

    Directory of Open Access Journals (Sweden)

    Bisadi Hossain

    2014-12-01

    Full Text Available Considering the developing role of friction stir welding in the manufacturing industry, a complete study of the process is necessary. Studies of each stage of the process in particular provide a better understanding of friction stir welding, and especially of friction stir spot welding. In this study, the plunge stage has been studied by experimental methods to investigate the temperature distribution around the tool during the plunge stage and the microstructural changes of the workpiece. Experiments were performed on aluminium 7050 plates with simultaneous measurement of temperature. A tool with a triangle pin was used. The results of this study are used as initial conditions for theoretical analysis of the welding process. The results show that the temperature distribution around the tool is quite asymmetric, owing to the nonuniform load distribution underneath the tool and its tilt angle. The temperatures of points behind the tool are higher than those of points ahead of the tool. Microstructural studies showed that four regions with different microstructures form around the tool during the process. These regions were distinguished by differences in grain size and elongation. Grains near the tool are elongated in a particular direction that shows the material flow direction.

  8. Thermography and sonic anemometry to analyze air heaters in Mediterranean greenhouses.

    Science.gov (United States)

    López, Alejandro; Valera, Diego L; Molina-Aiz, Francisco; Peña, Araceli

    2012-10-16

    The present work has developed a methodology based on thermography and sonic anemometry for studying the microclimate in Mediterranean greenhouses equipped with air heaters and polyethylene distribution ducts to distribute the warm air. Sonic anemometry allows us to identify the airflow pattern generated by the heaters and to analyze the temperature distribution inside the greenhouse, while thermography provides accurate crop temperature data. Air distribution by means of perforated polyethylene ducts at ground level, widely used in Mediterranean-type greenhouses, can generate heterogeneous temperature distributions inside the greenhouse when the system is not correctly designed. The system analyzed in this work used a polyethylene duct with a row of hot air outlet holes (all of equal diameter) that expel warm air toward the ground to avoid plant damage. We have observed that this design (the most widely used in Almería's greenhouses) produces stagnation of hot air in the highest part of the structure, reducing the heating of the crop zone. Using 88 kW of heating power (146.7 W·m⁻²), the temperature inside the greenhouse is maintained 7.2 to 11.2 °C above the outside temperature. The crop temperature (17.6 to 19.9 °C) was maintained above the minimum recommended value of 10 °C.

  9. Grinding temperature and energy ratio coefficient in MQL grinding of high-temperature nickel-base alloy by using different vegetable oils as base oil

    Directory of Open Access Journals (Sweden)

    Li Benkai

    2016-08-01

    Full Text Available Vegetable oil can be used as a base oil in minimum quantity lubrication (MQL). This study compared the performances of MQL grinding using castor oil, soybean oil, rapeseed oil, corn oil, sunflower oil, peanut oil, and palm oil as base oils. A K-P36 numerical-control precision surface grinder was used to perform plain grinding on a high-temperature nickel-base alloy workpiece. A YDM–III 99 three-dimensional dynamometer was used to measure grinding force, and a clip-type thermocouple was used to determine grinding temperature. The grinding force, grinding temperature, and energy ratio coefficient of MQL grinding were compared among the seven vegetable oil types. Results revealed that (1) castor oil-based MQL grinding yields the lowest grinding force but exhibits the highest grinding temperature and energy ratio coefficient; (2) palm oil-based MQL grinding generates the second lowest grinding force but shows the lowest grinding temperature and energy ratio coefficient; (3) MQL grinding based on the five other vegetable oils produces similar grinding forces, grinding temperatures, and energy ratio coefficients, with values ranging between those of castor oil and palm oil; (4) viscosity influences grinding force and grinding temperature more significantly than the fatty acid varieties and contents in the vegetable oils; (5) although a more viscous vegetable oil exhibits greater lubrication and a significantly lower grinding force than a less viscous one, high viscosity reduces the heat exchange capability of the oil and thus yields a high grinding temperature; (6) saturated fatty acid is a more efficient lubricant than unsaturated fatty acid; and (7) a short carbon chain transfers heat more effectively than a long carbon chain. Palm oil is the optimum base oil for MQL grinding, and this base oil yields 26.98 N tangential grinding force, 87.10 N normal grinding force, 119.6 °C grinding temperature, and 42.7% energy

  10. 21 CFR 884.2050 - Obstetric data analyzer.

    Science.gov (United States)

    2010-04-01

    ...) MEDICAL DEVICES OBSTETRICAL AND GYNECOLOGICAL DEVICES Obstetrical and Gynecological Monitoring Devices § 884.2050 Obstetric data analyzer. (a) Identification. An obstetric data analyzer (fetal status data analyzer) is a device used during labor to analyze electronic signal data obtained from fetal and maternal...

  11. 40 CFR 89.319 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 89... Equipment Provisions § 89.319 Hydrocarbon analyzer calibration. (a) The FID hydrocarbon analyzer shall... and at least annually thereafter, adjust the FID hydrocarbon analyzer for optimum hydrocarbon response...

  12. 40 CFR 90.320 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... operation. Adjust the analyzer to optimize performance. (2) Zero the carbon dioxide analyzer with either... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration... Emission Test Equipment Provisions § 90.320 Carbon dioxide analyzer calibration. (a) Prior to its initial...

  13. 40 CFR 86.122-78 - Carbon monoxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... sensitive range to be used. (2) Zero the carbon monoxide analyzer with either zero-grade air or zero-grade... calibrated. (1) Adjust the analyzer to optimize performance. (2) Zero the carbon monoxide analyzer with... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Carbon monoxide analyzer calibration...

  14. 40 CFR 86.1322-84 - Carbon monoxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... be used. (2) Zero the carbon monoxide analyzer with either zero-grade air or zero-grade nitrogen. (3... calibrated. (1) Adjust the analyzer to optimize performance. (2) Zero the carbon monoxide analyzer with... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Carbon monoxide analyzer calibration...

  15. 40 CFR 86.522-78 - Carbon monoxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... performance on the most sensitive range. (2) Zero the carbon monoxide analyzer with either zero grade air or... carbon monoxide analyzer shall be calibrated. (1) Adjust the analyzer to optimize performance. (2) Zero the carbon monoxide analyzer with either zero grade air or zero grade nitrogen. (3) Calibrate on each...

  16. Usefulness of thresholds for smear review of neutropenic samples analyzed with a Sysmex XN-10 analyzer.

    Science.gov (United States)

    Ronez, Emily; Geara, Carole; Coito, Sylvie; Jacqmin, Hugues; Cornet, Edouard; Troussard, Xavier; Chatelain, Bernard; Mullier, François

    2017-10-01

    Neutropenia is one of the main criteria for a blood smear review. The objective of this study was to compare the thresholds proposed by the international consensus group for hematology review (1.0 × 10⁹/L) and the French-speaking Group for Cellular Haematology (1.5 × 10⁹/L) in terms of the number of useless smears. We collected 112,097 analyzed samples from four laboratories equipped with XN instruments (Sysmex, Kobe, Japan) during early 2016. The only exclusion criterion was a leucocyte count below 0.5 × 10⁹/L. In the absence of abnormal cells and/or morphology suggesting haematological disease, samples were classified as 'negative for morphology' and the differential from the XN-10 was reported. These smear procedures were considered uninformative. Some 2202 samples met the criterion for neutropenia review, representing 1.96% of the total. These included 1031 with neutropenia alone and 1171 with neutropenia plus other abnormalities. Of the 1031 with neutropenia alone, 886 had a neutrophil count between 1.0 × 10⁹/L and 1.5 × 10⁹/L. The smear was uninformative for all of these samples. In conclusion, microscopic examination of a blood smear provided very limited information in cases of neutropenia without other abnormalities.

  17. Superhigh Temperatures and Acoustic Cavitation

    CERN Document Server

    Belyaev, V B; Miller, M B; Sermyagin, A V; Topolnikov, A S

    2003-01-01

    The experimental results on thermonuclear synthesis under acoustic cavitation have been analyzed, taking into account the latest data and the discussion around them. The analysis testifies that this avenue of research is a very promising one. Numerical calculations of the D(d, n)^{3}He reaction rate in deuterated acetone (C_{3}D_{6}O) under the influence of ultrasound, as a function of the environment temperature T within the range T=249-295 K, have been carried out within the framework of a hydrodynamic model. The results show that it is possible to substantially improve the effect/background ratio in experiments by decreasing the fluid temperature to twenty to thirty degrees below zero.

  18. Crystal face temperature determination means

    Science.gov (United States)

    Nason, D.O.; Burger, A.

    1994-11-22

    An optically transparent furnace having a detection apparatus, with a pedestal enclosed in an evacuated ampule for growing a crystal thereon, is disclosed. A temperature differential is provided by a source heater, a base heater and a cold finger such that material migrates from a polycrystalline source material to grow the crystal. A quartz halogen lamp projects a collimated beam onto the crystal; the reflected beam is analyzed by a double monochromator and photomultiplier detection spectrometer, and the detected peak position in the energy spectrum of the reflected beam is interpreted to determine the surface temperature of the crystal. 3 figs.

  19. A Quantitative Approach for Analyzing the Relationship between Urban Heat Islands and Land Cover

    Directory of Open Access Journals (Sweden)

    Vanessa da Silva Brum Bastos

    2012-11-01

    Full Text Available With more than 80% of Brazilians living in cities, urbanization has had an important impact on climatic variations. São José dos Campos is located in a region experiencing rapid urbanization, which has produced a remarkable Urban Heat Island (UHI) effect. This effect influences the climate, environment and socio-economic development on a regional scale. In this study, the brightness temperatures and land cover types from Landsat TM images of São José dos Campos from 1986, 2001 and 2010 were analyzed for the spatial distribution of changes in temperature and land cover. A quantitative approach was used to explore the relationships among temperature, land cover areas and several indices, including the Normalized Difference Vegetation Index (NDVI), Normalized Difference Water Index (NDWI) and Normalized Difference Built-up Index (NDBI). The results showed that urban and bare areas were associated with high temperatures, whereas areas covered by vegetation and water were associated with low temperatures. The correlations of the NDVI and NDWI with temperature were low (<0.5); however, a moderate correlation was found between the NDBI and temperature.
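
    The indices named above are simple band ratios; a small sketch, assuming the usual Landsat TM band roles (band 3 red, band 4 near-infrared, band 5 shortwave-infrared) and invented per-pixel values:

    ```python
    import numpy as np

    def ndvi(nir, red):
        """Normalized Difference Vegetation Index."""
        return (nir - red) / (nir + red)

    def ndbi(swir, nir):
        """Normalized Difference Built-up Index."""
        return (swir - nir) / (swir + nir)

    # Hypothetical per-pixel reflectances and brightness temperatures (°C).
    nir  = np.array([0.40, 0.50, 0.20])
    red  = np.array([0.10, 0.10, 0.15])
    swir = np.array([0.20, 0.15, 0.30])
    temp = np.array([24.0, 22.5, 31.0])

    r_veg   = np.corrcoef(ndvi(nir, red), temp)[0, 1]   # expected negative
    r_built = np.corrcoef(ndbi(swir, nir), temp)[0, 1]  # expected positive
    print(round(float(r_veg), 2), round(float(r_built), 2))
    ```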

  20. Temperature Dependence of Dark Current in Quantum Well Infrared Detectors

    National Research Council Canada - National Science Library

    Hickey, Thomas

    2002-01-01

    ...) /cu cm were gathered and analyzed for various temperatures. The device was cooled with a closed cycle refrigerator, and the data were acquired using the Agilent 4155B Semiconductor Parameter Analyzer...

  1. Looking for a Framework for Analyzing Eco-innovation Dynamics

    DEFF Research Database (Denmark)

    Yang, Yan

    2011-01-01

    Looking for a Framework for Analyzing Eco-innovation Dynamics: A Triple Helix Model of Innovation Perspective.

  2. ARC Code TI: Inference Kernel for Open Static Analyzers (IKOS)

    Data.gov (United States)

    National Aeronautics and Space Administration — IKOS is a C++ library designed to facilitate the development of sound static analyzers based on Abstract Interpretation. Specialization of a static analyzer for an...

  3. Lab-on-a-chip Astrobiology Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an analyzer to measure chemical signatures of life in extraterrestrial settings. The analyzer will...

  4. 40 CFR 89.318 - Analyzer interference checks.

    Science.gov (United States)

    2010-07-01

    ... be used. (2) Zero the carbon monoxide analyzer with either zero-grade air or zero-grade nitrogen. (3... checks. Prior to its introduction into service and annually thereafter, the NDIR carbon monoxide analyzer...

  5. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and

  6. Prototype partial one-third octave band spectrum analyzer for acoustic, vibration and other wideband data for flight applications

    Science.gov (United States)

    1973-01-01

    The design refinement of a compact frequency analyzer for measurement and analysis on board flight vehicles is discussed. The analyzer has been constructed in a partial one-third octave band configuration with six filters and detectors spaced by the square root of 10 from 316 Hz to 100,000 Hz, plus a broadband detector channel. The analyzer has been tested over a temperature range of 40 to 120 °F at a pressure of one atmosphere, and at a temperature of 75 °F at an absolute pressure of 0.000001 torr, and has demonstrated at least 60 dB of dynamic range.
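
    The stated filter spacing can be reproduced directly; a small sketch of the six center frequencies (a factor of √10 apart, starting at 316 Hz, per the description above):

    ```python
    import math

    # Six filter center frequencies spaced by sqrt(10), per the abstract.
    f0 = 316.0  # Hz
    centers = [f0 * math.sqrt(10) ** k for k in range(6)]
    print([round(f) for f in centers])
    # [316, 999, 3160, 9993, 31600, 99928] -- i.e., ~316 Hz to ~100 kHz
    ```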

  7. 21 CFR 868.1400 - Carbon dioxide gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Carbon dioxide gas analyzer. 868.1400 Section 868...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1400 Carbon dioxide gas analyzer. (a) Identification. A carbon dioxide gas analyzer is a device intended to measure the concentration of carbon dioxide...

  8. 21 CFR 868.1670 - Neon gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Neon gas analyzer. 868.1670 Section 868.1670 Food... DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1670 Neon gas analyzer. (a) Identification. A neon gas analyzer is a device intended to measure the concentration of neon in a gas mixture exhaled by a...

  9. 21 CFR 868.2385 - Nitrogen dioxide analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Nitrogen dioxide analyzer. 868.2385 Section 868...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Monitoring Devices § 868.2385 Nitrogen dioxide analyzer. (a) Identification. The nitrogen dioxide analyzer is a device intended to measure the concentration of nitrogen...

  10. 21 CFR 868.1700 - Nitrous oxide gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Nitrous oxide gas analyzer. 868.1700 Section 868...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1700 Nitrous oxide gas analyzer. (a) Identification. A nitrous oxide gas analyzer is a device intended to measure the concentration of nitrous oxide...

  11. 40 CFR 86.521-90 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 86... Regulations for 1978 and Later New Motorcycles; Test Procedures § 86.521-90 Hydrocarbon analyzer calibration. (a) The FID hydrocarbon analyzer shall receive the following initial and periodic calibration. The...

  12. 40 CFR 86.1221-90 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 86...-Fueled and Methanol-Fueled Heavy-Duty Vehicles § 86.1221-90 Hydrocarbon analyzer calibration. The FID hydrocarbon analyzer shall receive the following initial and periodic calibrations. (a) Initial and periodic...

  13. 40 CFR 90.316 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 90... Equipment Provisions § 90.316 Hydrocarbon analyzer calibration. (a) Calibrate the FID and HFID hydrocarbon... thereafter, adjust the FID and HFID hydrocarbon analyzer for optimum hydrocarbon response as specified in...

  14. 40 CFR 86.331-79 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 86....331-79 Hydrocarbon analyzer calibration. The following steps are followed in sequence to calibrate the hydrocarbon analyzer. It is suggested, but not required, that efforts be made to minimize relative response...

  15. 40 CFR 1065.272 - Nondispersive ultraviolet analyzer.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Nondispersive ultraviolet analyzer. 1065.272 Section 1065.272 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Nondispersive ultraviolet analyzer. (a) Application. You may use a nondispersive ultraviolet (NDUV) analyzer to...

  16. 21 CFR 868.1640 - Helium gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Helium gas analyzer. 868.1640 Section 868.1640...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1640 Helium gas analyzer. (a) Identification. A helium gas analyzer is a device intended to measure the concentration of helium in a gas...

  17. 40 CFR 86.524-78 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... operation. Adjust the analyzer to optimize performance. (2) Zero the carbon dioxide analyzer with either zero grade air or zero grade nitrogen. (3) Calibrate on each normally used operating range with carbon... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration...

  18. 40 CFR 91.317 - Carbon monoxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... to optimize performance on the most sensitive range to be used. (2) Zero the carbon monoxide analyzer... optimize performance. (2) Zero the carbon monoxide analyzer with either purified synthetic air or zero... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Carbon monoxide analyzer calibration...

  19. 40 CFR 86.124-78 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... optimize performance. (b) Zero the carbon dioxide analyzer with either zero-grade air or zero-grade... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration... Complete Heavy-Duty Vehicles; Test Procedures § 86.124-78 Carbon dioxide analyzer calibration. Prior to its...

  20. 40 CFR 89.322 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... instrument start-up and operation. Adjust the analyzer to optimize performance. (2) Zero the carbon dioxide... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration... Test Equipment Provisions § 89.322 Carbon dioxide analyzer calibration. (a) Prior to its introduction...

  1. 40 CFR 89.320 - Carbon monoxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... carbon monoxide analyzer with either zero-grade air or zero-grade nitrogen. (3) Calibrate on each used... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Carbon monoxide analyzer calibration... Test Equipment Provisions § 89.320 Carbon monoxide analyzer calibration. (a) Calibrate the NDIR carbon...

  2. 40 CFR 86.1324-84 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... performance. (b) Zero the carbon dioxide analyzer with either zero-grade air or zero-grade nitrogen. (c... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration... Exhaust Test Procedures § 86.1324-84 Carbon dioxide analyzer calibration. Prior to its introduction into...

  3. 40 CFR 91.320 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... the analyzer to optimize performance. (2) Zero the carbon dioxide analyzer with either purified synthetic air or zero-grade nitrogen. (3) Calibrate on each normally used operating range with carbon... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration...

  4. 40 CFR 90.318 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer... Emission Test Equipment Provisions § 90.318 Oxides of nitrogen analyzer calibration. (a) Calibrate the chemiluminescent oxides of nitrogen analyzer as described in this section. (b) Initial and Periodic Interference...

  5. 40 CFR 91.318 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer... Provisions § 91.318 Oxides of nitrogen analyzer calibration. (a) Calibrate the chemiluminescent oxides of... introduction into service, and monthly thereafter, check the chemiluminescent oxides of nitrogen analyzer for...

  6. 40 CFR 89.321 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer... Test Equipment Provisions § 89.321 Oxides of nitrogen analyzer calibration. (a) The chemiluminescent oxides of nitrogen analyzer shall receive the initial and periodic calibration described in this section...

  7. 21 CFR 862.2500 - Enzyme analyzer for clinical use.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Enzyme analyzer for clinical use. 862.2500 Section... Instruments § 862.2500 Enzyme analyzer for clinical use. (a) Identification. An enzyme analyzer for clinical use is a device intended to measure enzymes in plasma or serum by nonkinetic or kinetic measurement of...

  8. Temperature response of photosynthesis in C3, C4, and CAM plants: temperature acclimation and temperature adaptation.

    Science.gov (United States)

    Yamori, Wataru; Hikosaka, Kouki; Way, Danielle A

    2014-02-01

    Most plants show considerable capacity to adjust their photosynthetic characteristics to their growth temperatures (temperature acclimation). The most typical case is a shift in the optimum temperature for photosynthesis, which can maximize the photosynthetic rate at the growth temperature. These plastic adjustments can allow plants to photosynthesize more efficiently at their new growth temperatures. In this review article, we summarize the basic differences in photosynthetic reactions in C3, C4, and CAM plants. We review the current understanding of the temperature responses of C3, C4, and CAM photosynthesis, and then discuss the underlying physiological and biochemical mechanisms for temperature acclimation of photosynthesis in each photosynthetic type. Finally, we use the published data to evaluate the extent of photosynthetic temperature acclimation in higher plants, and analyze which plant groups (i.e., photosynthetic types and functional types) have a greater inherent ability for photosynthetic acclimation to temperature than others, since interspecific variations in this ability have been reported. We found that the inherent ability for temperature acclimation of photosynthesis differed: (1) among C3, C4, and CAM species; and (2) among functional types within C3 plants. C3 plants generally had a greater ability for temperature acclimation of photosynthesis across a broad temperature range, CAM plants acclimated day and night photosynthetic processes differentially to temperature, and C4 plants were adapted to warm environments. Moreover, within C3 species, evergreen woody plants and perennial herbaceous plants showed greater temperature homeostasis of photosynthesis (i.e., the photosynthetic rate at high-growth temperature divided by that at low-growth temperature was close to 1.0) than deciduous woody plants and annual herbaceous plants, indicating that photosynthetic acclimation would be particularly important in perennial, long-lived species that

  9. Research of fuel temperature control in fuel pipeline of diesel engine using positive temperature coefficient material

    Directory of Open Access Journals (Sweden)

    Xiaolu Li

    2016-01-01

    Full Text Available As fuel temperature increases, both its viscosity and surface tension decrease, which helps improve fuel atomization and, in turn, the combustion and emission performance of the engine. Based on the self-regulated temperature property of positive temperature coefficient (PTC) materials, this article used a PTC material as the electric heating element to heat diesel fuel in the fuel pipeline of a diesel engine. A BaTiO3-based PTC material, with a Curie temperature of 230°C and a rated voltage of 24 V, was developed, and its micrograph and element composition were analyzed. With the fuel pipeline wrapped in six PTC ceramics, its resistivity-temperature and heating characteristics were tested on a fuel pump bench. The experiments showed that, in this installation, the surface temperature of the six PTC ceramics rose to the equilibrium temperature in only 100 s at rated voltage. At the rated power supply for the six PTC ceramics, the temperature of the injected fuel increased by 21°C-27°C within 100 s and could then be kept constant. Using PTC material to heat diesel in the fuel pipeline of a diesel engine, the injection mass per cycle changed little, approximately 0.3%/°C. This study provides a beneficial reference for improving the atomization of high-viscosity liquids by employing PTC material without any control methods.

  10. Nuclear deformation at finite temperature.

    Science.gov (United States)

    Alhassid, Y; Gilbreth, C N; Bertsch, G F

    2014-12-31

    Deformation, a key concept in our understanding of heavy nuclei, is based on a mean-field description that breaks the rotational invariance of the nuclear many-body Hamiltonian. We present a method to analyze nuclear deformations at finite temperature in a framework that preserves rotational invariance. The auxiliary-field Monte Carlo method is used to generate a statistical ensemble and calculate the probability distribution associated with the quadrupole operator. Applying the technique to nuclei in the rare-earth region, we identify model-independent signatures of deformation and find that deformation effects persist to temperatures higher than the spherical-to-deformed shape phase-transition temperature of mean-field theory.

  11. Body temperature norms

    Science.gov (United States)

    Normal body temperature; Temperature - normal ... Morrison SF. Regulation of body temperature. In: Boron WF, Boulpaep EL, eds. Medical Physiology. 3rd ed. Philadelphia, PA: Elsevier; 2017: chap 59. Sajadi MM, Mackowiak ...

  12. Analyzing the Spectra of Accreting X-Ray Pulsars

    Science.gov (United States)

    Wolff, Michael

    In the first year of this study, we will develop the new software module (essentially a computer code representing the theoretical model) necessary to perform the analysis of accretion-powered pulsar X-ray spectra in the XSPEC spectral analysis environment. Also in this first year we will analyze new Suzaku Cycle 6 Target of Opportunity observations of GX 304-1 and 4U 0115+63, two known cyclotron line sources, that we have recently carried out. In the second year of this study we will apply our new XSPEC spectral continuum module to the archival X-ray observational data from a number of accreting X-ray pulsars from the RXTE/PCA/HEXTE and Suzaku/XIS/HXD instruments to extract basic accretion parameters. Our source list contains eight pulsars, seven of which have observed cyclotron scattering lines. These pulsars span a range in magnetic field strength, luminosity, expected accretion rate, expected polar cap size, and Comptonizing temperature. In the second year of this work we also plan to make our new, fully tested XSPEC continuum analysis module available to the Goddard Space Flight Center HEASARC for distribution to the astrophysical research community. The development and analysis tasks proposed here will provide for the first time a physical basis for the analysis and interpretation of data on accreting X-ray pulsar spectra.

  13. A Comparison of Infrared Gas Analyzers Above a Subalpine Forest

    Science.gov (United States)

    Burns, S. P.; Metzger, S.; Blanken, P.; Burba, G. G.; Swiatek, E.; Li, J.; Conrad, B.

    2014-12-01

    Infrared gas analyzers (IRGAs) are a key component in the eddy-covariance measurement of water vapor and carbon dioxide exchange between the surface and atmosphere. Historically, closed-path IRGAs have been used for the fast (> 10 Hz) measurement of atmospheric H2O and CO2. In order to use them in the field, these IRGAs were typically housed in temperature-controlled enclosures or buildings that were tens of meters away from the measurement location. This necessitated the use of long tubing and pumps to bring the air sample to the IRGA cell. Attenuation of H2O and CO2 fluctuations within the tubing was a persistent problem with such a setup, especially for H2O. As an alternative, open-path IRGAs have frequently been utilized, but the key trade-offs with the open-path design are: (i) precipitation and dew-related data gaps, and (ii) the need to account for WPL density effects. Over the past five years a new type of closed-path IRGA has emerged which is weather-proof, compact, and low-maintenance. Because of its small size, short intake tubing can be used, which places the sampling cell close to the sonic anemometer and reduces high-frequency signal loss. Two such IRGAs are the LI-COR LI-7200 and the Campbell Scientific EC155, which is part of the CPEC200 eddy covariance system. The Niwot Ridge AmeriFlux tower has used a LI-COR LI-6262 IRGA to measure CO2 fluxes above a subalpine forest since November 1998. Starting in summer 2013, a LI-7200 (along with an open-path LI-7500) was deployed at 21.5 m on the AmeriFlux tower. In Fall 2013, an EC155/CPEC200 was added so that a side-by-side comparison between all four IRGAs was possible. The preliminary results presented in our study compare the CO2 and H2O mean and variance measured by each IRGA, the vertical wind statistics from three side-by-side sonic anemometers, and the corresponding spectra and cospectra from these sensors, as well as other important aspects of system performance.

  14. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, comprising among others precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed.

  15. Digital signal processing in the radio science stability analyzer

    Science.gov (United States)

    Greenhall, C. A.

    1995-01-01

    The Telecommunications Division has built a stability analyzer for testing Deep Space Network installations during flight radio science experiments. The low-frequency part of the analyzer operates by digitizing wave signals with bandwidths between 80 Hz and 45 kHz. Processed outputs include spectra of signal, phase, amplitude, and differential phase; time series of the same quantities; and Allan deviation of phase and differential phase. This article documents the digital signal-processing methods programmed into the analyzer.

  16. Oxygen analyzers: failure rates and life spans of galvanic cells.

    Science.gov (United States)

    Meyer, R M

    1990-07-01

    Competing technologies exist for measuring oxygen concentrations in breathing circuits. Over a 4-year period, two types of oxygen analyzers were studied prospectively in routine clinical use to determine the incidence and nature of malfunctions. Newer AC-powered galvanic analyzers (North American Dräger O2med) were compared with older, battery-powered polarographic analyzers (Ohmeda 201) by recording all failures and necessary repairs. The AC-powered galvanic analyzer had a significantly lower incidence of failures (0.12 +/- 0.04 failures per machine-month) than the battery-powered polarographic analyzer (4.0 +/- 0.3 failures per machine-month). Disposable capsules containing the active galvanic cells lasted 12 +/- 7 months. Although the galvanic analyzers tended to remain out of service longer, awaiting the arrival of costly parts, the polarographic analyzers were more expensive to keep operating when calculations included the cost of time spent on repairs. Stocking galvanic capsules would have decreased the amount of time the galvanic analyzers were out of service, while increasing costs. In conclusion, galvanic oxygen analyzers appear capable of delivering more reliable service at a lower overall cost. By keeping the galvanic capsules exposed to room air during periods of storage, it should be possible to prolong their life span, further decreasing the cost of using them. In addition, recognizing the aberrations in their performance that warn of the exhaustion of the galvanic cells should permit timely recording and minimize downtime.

  17. Quantum interferometric measurements of temperature

    Science.gov (United States)

    Jarzyna, Marcin; Zwierz, Marcin

    2015-09-01

    We provide a detailed description of the quantum interferometric thermometer, which is a device that estimates the temperature of a sample from the measurements of the optical phase. We rigorously analyze the operation of such a device by studying the interaction of the optical probe system prepared in a single-mode Gaussian state with a heated sample modeled as a dissipative thermal reservoir. We find that this approach to thermometry is capable of measuring the temperature of a sample in the nanokelvin regime. Furthermore, we compare the fundamental precision of quantum interferometric thermometers with the theoretical precision offered by the classical idealized pyrometers, which infer the temperature from a measurement of the total thermal radiation emitted by the sample. We find that the interferometric thermometer provides a superior performance in temperature sensing even when compared with this idealized pyrometer. We predict that interferometric thermometers will prove useful for ultraprecise temperature sensing and stabilization of quantum optical experiments based on the nonlinear crystals and atomic vapors.

  18. Analysis of temperature trends in Northern Serbia

    Science.gov (United States)

    Tosic, Ivana; Gavrilov, Milivoj; Unkašević, Miroslava; Marković, Slobodan; Petrović, Predrag

    2017-04-01

    An analysis of air temperature trends in Northern Serbia for the annual and seasonal time series is performed for two periods: 1949-2013 and 1979-2013. Three data sets of surface air temperatures (monthly mean, monthly maximum, and monthly minimum temperatures) are analyzed at 9 stations with altitudes varying between 75 m and 102 m. Monthly mean temperatures are obtained as the average of the daily mean temperatures, while monthly maximum (minimum) temperatures are the maximum (minimum) values of daily temperatures in the corresponding month. Positive trends were found in 29 out of 30 time series; a negative trend was found only in winter during the period 1979-2013. Applying the Mann-Kendall test, significant positive trends were found in 15 series, 7 in the period 1949-2013 and 8 in the period 1979-2013, and no significant trend was found in the remaining 15 series. Significant positive trends dominate in the annual, spring, and summer series, where they were found in 14 out of 18 cases. Significant positive trends were found 7, 5, and 3 times in the mean, maximum, and minimum temperatures, respectively. It was found that positive temperature trends are dominant in Northern Serbia.
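
    For reference, the trend test used above can be sketched in a few lines: the Mann-Kendall statistic S counts concordant minus discordant pairs in the series and is compared against its large-sample normal approximation. A minimal Python sketch; the variance formula below ignores tied values, which is a simplifying assumption, and the sample series is invented:

        import math

        def mann_kendall(x):
            # S = number of increasing pairs minus number of decreasing pairs.
            n = len(x)
            s = sum((x[j] > x[i]) - (x[j] < x[i])
                    for i in range(n - 1) for j in range(i + 1, n))
            var_s = n * (n - 1) * (2 * n + 5) / 18.0   # no-ties approximation
            z = 0.0 if s == 0 else (s - math.copysign(1, s)) / math.sqrt(var_s)
            return s, z   # |z| > 1.96 suggests a significant trend at the 5% level

        # Invented example: a weakly increasing annual mean-temperature series.
        series = [10.2, 10.5, 10.1, 10.8, 10.9, 11.0, 10.7, 11.2, 11.4, 11.3]
        print(mann_kendall(series))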

  19. Highly Sensitive and Rapid Fluorescence Detection with a Portable FRET Analyzer.

    Science.gov (United States)

    Kim, Haseong; Han, Gui Hwan; Fu, Yaoyao; Gam, Jongsik; Lee, Seung Goo

    2016-10-01

    Recent improvements in Förster resonance energy transfer (FRET) sensors have enabled their use to detect various small molecules including ions and amino acids. However, the innate weak signal intensity of FRET sensors is a major challenge that prevents their application in various fields and makes the use of expensive, high-end fluorometers necessary. Previously, we built a cost-effective, high-performance FRET analyzer that can specifically measure the ratio of two emission wavelength bands (530 and 480 nm) to achieve high detection sensitivity. More recently, it was discovered that FRET sensors with bacterial periplasmic binding proteins detect ligands with maximum sensitivity in the critical temperature range of 50 - 55 °C. This report describes a protocol for assessing sugar content in commercially-available beverage samples using our portable FRET analyzer with a temperature-specific FRET sensor. Our results showed that the additional preheating process of the FRET sensor significantly increases the FRET ratio signal, to enable more accurate measurement of sugar content. The custom-made FRET analyzer and sensor were successfully applied to quantify the sugar content in three types of commercial beverages. We anticipate that further size reduction and performance enhancement of the equipment will facilitate the use of hand-held analyzers in environments where high-end equipment is not available.

  20. Analyzing atomic noise with a consumer sound card

    Science.gov (United States)

    Schulte, Carsten H. H.; Müller, Georg M.; Horn, Hauke; Hübner, Jens; Oestreich, Michael

    2012-03-01

    We discuss an advanced undergraduate project on spin noise spectroscopy of atomic rubidium vapor. The spin noise is digitized using a consumer sound card and analyzed by a fast Fourier transform. Students gain competence in digital data acquisition and processing, and the idea of analyzing a noise signal is emphasized.
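
    The core analysis step, estimating a noise power spectrum from digitized sound-card samples, can be sketched as an averaged periodogram. The sketch below assumes NumPy, and all parameters (sample rate, block size, test signal) are illustrative rather than taken from the project:

        import numpy as np

        def noise_spectrum(samples, fs, block=4096):
            # Average |FFT|^2 over consecutive windowed blocks to reduce variance.
            nblocks = len(samples) // block
            psd = np.zeros(block // 2 + 1)
            for k in range(nblocks):
                seg = samples[k * block:(k + 1) * block]
                psd += np.abs(np.fft.rfft(seg * np.hanning(block)))**2
            freqs = np.fft.rfftfreq(block, d=1.0 / fs)
            return freqs, psd / max(nblocks, 1)

        # Synthetic test: white noise plus a weak resonance at 5 kHz.
        fs = 44100.0
        t = np.arange(int(fs * 10)) / fs
        x = np.random.randn(t.size) + 0.1 * np.sin(2 * np.pi * 5000 * t)
        f, p = noise_spectrum(x, fs)
        print("peak near", f[p[1:].argmax() + 1], "Hz")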

  1. HPT as a Manager's Tool for Analyzing Individual Employee Performance

    Science.gov (United States)

    Kyle-Needs, Denise A.; Lindbeck, Robin

    2011-01-01

    Typically the human performance technology (HPT) process is regarded as a tool for use when analyzing performance gaps in functional or larger organizational units. This case study demonstrates the application of the HPT process in a one-to-one relationship between a manager and a direct report. Specifically, the process is used to analyze the…

  2. Using High-Content Imaging to Analyze Toxicological Tipping ...

    Science.gov (United States)

    Slide presentation at the International Conference on Toxicological Alternatives & Translational Toxicology (ICTATT), held in China, discussing the possibility of using High Content Imaging to analyze toxicological tipping points.

  3. 40 CFR 86.317-79 - Hydrocarbon analyzer specifications.

    Science.gov (United States)

    2010-07-01

    § 86.317-79 Hydrocarbon analyzer specifications. (a) Hydrocarbon measurements are to be made with a heated... measures hydrocarbon emissions on a dry basis is permitted for gasoline-fueled testing; Provided, That...

  4. Analyzing Population Genetics Data: A Comparison of the Software

    Science.gov (United States)

    Choosing a software program for analyzing population genetic data can be a challenge without prior knowledge of the methods used by each program. There are numerous web sites listing programs by type of data analyzed, type of analyses performed, or other criteria. Even with programs categorized in ...

  5. Analyzing FCS Professionals in Higher Education: A Case Study

    Science.gov (United States)

    Hall, Scott S.; Harden, Amy; Pucciarelli, Deanna L.

    2016-01-01

    A national study of family and consumer sciences (FCS) professionals in higher education was analyzed as a case study to illustrate procedures useful for investigating issues related to FCS. The authors analyzed response rates of more than 1,900 FCS faculty and administrators by comparing those invited to participate and the 345 individuals who…

  6. 21 CFR 868.1120 - Indwelling blood oxyhemoglobin concentration analyzer.

    Science.gov (United States)

    2010-04-01

    § 868.1120 Indwelling blood oxyhemoglobin concentration analyzer. (a) Identification. An indwelling blood oxyhemoglobin concentration analyzer is a photoelectric device used to measure, in vivo, the oxygen-carrying capacity of...

  7. Empirical Validity of Ertl's Brain-Wave Analyzer (BWA02).

    Science.gov (United States)

    Fischer, Donald G.; And Others

    1978-01-01

    The empirical validity of Ertl's brain wave analyzer was investigated by the known contrasted groups method. Thirty-two academically talented and 16 academically handicapped students were compared on four Primary Mental Abilities tests, two Sequential Tests of Educational Progress measures, and seven Brain Wave Analyzer measures. (Author/JKS)

  8. 40 CFR 89.311 - Analyzer calibration frequency.

    Science.gov (United States)

    2010-07-01

    § 89.311 Analyzer calibration frequency. (a) Prior to initial use and after major...

  9. Theoretical study on ignition compensating temperature sensitivity

    Directory of Open Access Journals (Sweden)

    Mingfang Liu

    2015-09-01

    The temperature sensitivity of the propellant has a significant influence on the interior ballistic performance of guns. Many physical and chemical approaches are employed to decrease this temperature sensitivity. In this article, it is proposed that the temperature sensitivity of the propellant can be changed by altering the factors governing ignition. A one-dimensional two-phase flow interior ballistic model is established to analyze the relation between ignition factors and temperature sensitivity. The simulation results show that the propellant temperature sensitivity is changed by altering the ignition factors; that is, the interior ballistic performance is affected by altering the size of the fire hole, the liner breaking pressure, and the ignition location. Based on the simulation results, the temperature sensitivity can be controlled by the matching of charges and an intelligent ignition control system.

  10. Pseudocode Interpreter (Pseudocode Integrated Development Environment with Lexical Analyzer and Syntax Analyzer using Recursive Descent Parsing Algorithm)

    OpenAIRE

    Christian Lester D. Gimeno

    2017-01-01

    This research study focused on the development of a software that helps students design, write, validate and run their pseudocode in a semi Integrated Development Environment (IDE) instead of manually writing it on a piece of paper. Specifically, the study aimed to develop a lexical analyzer or lexer, a syntax analyzer or parser using the recursive descent parsing algorithm, and an interpreter. The lexical analyzer reads the pseudocode source as a sequence of symbols or characters called lexemes. Th...

  11. Parity-Violating in e⁻e⁺ Scattering at Finite Temperature

    Science.gov (United States)

    Chekerker, M.; Santos, A. F.; Khanna, Faqir C.; Ladrem, M.

    2017-09-01

    Parity violation implies that the laws of physics are not invariant under spatial coordinate reversal. Electron-positron scattering is a process that displays parity violation. Using the Thermo Field Dynamics formalism, this scattering is analyzed at finite temperature. The transition amplitude is calculated as a function of temperature. The parity violation tends to go to zero at very high temperatures.

  12. Experimental analysis of a new retarding field energy analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Yu-Xiang [Shanghai Institute of Mechanical and Electrical Engineering, No. 3888, Yuanjiang Road, Minhang District, Shanghai 201109 (China); Institute of Electronics, Chinese Academy of Sciences, No. 19, North 4th Ring Road West, Haidian District, Beijing 100190 (China); Liu, Shu-Qing; Li, Xian-Xia; Shen, Hong-Li; Huang, Ming-Guang [Institute of Electronics, Chinese Academy of Sciences, No. 19, North 4th Ring Road West, Haidian District, Beijing 100190 (China); Liu, Pu-Kun, E-mail: pkliu@pku.edu.cn [School of Electronics Engineering and Computer Science, Peking University, No. 5, Yiheyuan Road, Haidian District, Beijing 100871 (China)

    2015-06-11

    In this paper, a new compact retarding field energy analyzer (RFEA) is designed for diagnosing the electron beams of a K-band space travelling-wave tube (TWT). This analyzer has an aperture plate to sample electron beams and a cylindrical electrode to overcome defocusing effects. The front end of the analyzer, constructed as a multistage depression collector (MDC) structure, is intended to shape the field to prevent electrons from being accelerated to escape. The direct-current (DC) beams of the K-band space TWTs with the MDC removed can be investigated on the beam measurement system. The current density distribution of the DC beams is determined by the analyzer while the anode voltage and helix voltage of the TWTs are 7000 V and 6850 V, respectively. The slope of the current curve due to the reflection of secondary electrons on the copper collector of the analyzer is discussed. The experimental analysis shows that this RFEA has an energy resolution good enough to satisfy the requirements of beam measurement. - Highlights: • A new retarding field energy analyzer (RFEA) is designed to diagnose the electron beam of a K-band space TWT. • The current density distribution of the direct-current beam is determined. • The reflection effect of secondary electrons on the copper collector of the analyzer is discussed.

  13. Electrooptic approach to an integrated optics spectrum analyzer.

    Science.gov (United States)

    Thylen, L; Stensland, L

    1981-05-15

    An integrated optics spectrum analyzer based on using the linear electrooptic effect is investigated. This spectrum analyzer performs Fourier analysis of sampled electronic signals, where each signal is fed to an electrode of an electrode array. The electrode array acts as a spatial light modulator, and the diffracted light field, representing a weighted discrete Fourier transform (DFT), is focused on a detector array by an integrated transform lens. The theory of operation of the spectrum analyzer is outlined, numerical results relating to this theory are presented, and questions concerning efficiency, dynamic range, design, and implementation are discussed.

  14. Emergency response training with the BNL plant analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, H.S.; Guppy, J.G.; Mallen, A.N.; Wulff, W.

    1987-01-01

    Presented is the experience in the use of the BNL Plant Analyzer for NRC emergency response training to simulated accidents in a BWR. The unique features of the BNL Plant Analyzer that are important for the emergency response training are summarized. A closed-loop simulation of all the key systems of a power plant in question was found essential to the realism of the emergency drills conducted at NRC. The faster than real-time simulation speeds afforded by the BNL Plant Analyzer have demonstrated its usefulness for the timely conduct of the emergency response training.

  15. Separation analysis, a tool for analyzing multigrid algorithms

    Science.gov (United States)

    Costiner, Sorin; Taasan, Shlomo

    1995-01-01

    The separation of vectors by multigrid (MG) algorithms is applied to the study of convergence and to the prediction of the performance of MG algorithms. The separation operator for a two-level cycle algorithm is derived. It is used to analyze the efficiency of the cycle when mixing of eigenvectors occurs. In particular cases the separation analysis reduces to Fourier-type analysis. The separation operator of a two-level cycle for a Schrödinger eigenvalue problem is derived and analyzed in a Fourier basis. Separation analysis gives information on how to choose relaxations and inter-level transfers for performance. Separation analysis is a tool for analyzing and designing algorithms, and for optimizing their performance.

  16. Validation of ESR analyzer using Westergren ESR method.

    Science.gov (United States)

    Sikka, Meera; Tandon, Rajesh; Rusia, Usha; Madan, Nishi

    2007-07-01

    Erythrocyte sedimentation rate (ESR) is one of the most frequently ordered laboratory tests. ESR analyzers were developed to provide a quick and efficient measure of ESR. We compared the results of ESR obtained by an ESR analyzer with those by the Westergren method in a group of 75 patients. Linear regression analysis showed a good correlation between the two results (r = 0.818, p < 0.01). The intraclass correlation was 0.82. The analyzer method had the advantages of safety, decreased technician time, and improved patient care by providing quick results.
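
    The agreement statistic reported above is the Pearson correlation between paired ESR results; a minimal Python sketch (the sample values are invented for illustration, not taken from the study):

        import math

        def pearson(x, y):
            # Pearson r: covariance normalized by the two standard deviations.
            n = len(x)
            mx, my = sum(x) / n, sum(y) / n
            sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
            sxx = sum((a - mx) ** 2 for a in x)
            syy = sum((b - my) ** 2 for b in y)
            return sxy / math.sqrt(sxx * syy)

        analyzer = [12, 30, 45, 8, 60, 22, 15]     # mm/h, invented
        westergren = [10, 33, 41, 9, 66, 20, 18]   # mm/h, invented
        print("r = %.3f" % pearson(analyzer, westergren))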

  17. Mini Total Organic Carbon Analyzer (miniTOCA) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Total Organic Carbon (TOC) analyzers function by converting (oxidizing) all organic compounds (contaminants) in the water sample to carbon dioxide gas (CO2), then...

  18. Lab-on-a-chip astrobiology analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an astrobiology analyzer to measure chemical signatures of life in extraterrestrial settings. The...

  19. Analyzing Software Errors in Safety-Critical Embedded Systems

    Science.gov (United States)

    Lutz, Robyn R.

    1994-01-01

    This paper analyzes the root causes of safety-related software faults. Software faults identified as potentially hazardous to the system are distributed somewhat differently over the set of possible error causes than non-safety-related software faults.

  20. Analyzing the Romanian Healthcare Bureaucracy Using a Tree Diagram

    Directory of Open Access Journals (Sweden)

    Ruxandra Dinulescu

    2016-01-01

    Our paper tries to analyze the main problem, along with its root causes, in order to propose viable solutions that could be adopted so our healthcare system could be among the other European healthcare systems.

  1. Testing and assessment of portable seismic property analyzer.

    Science.gov (United States)

    2014-02-01

    Investigator will thoroughly test and assess the Portable Seismic Property Analyzer (PSPA), a hand-held device that focuses on : pavement layer properties. The device can be utilized on both rigid and flexible pavements. When used on rigid pavements,...

  2. Josephson junction spectrum analyzer for millimeter and submillimeter wavelengths

    Energy Technology Data Exchange (ETDEWEB)

    Larkin, S.Y.; Anischenko, S.E.; Khabayev, P.V. [State Research Center, Kiev (Ukraine)

    1994-12-31

    A prototype of the Josephson-effect spectrum analyzer developed for the millimeter-wave band is described. The measurement results for spectra obtained in the frequency band from 50 to 250 GHz are presented.

  3. Airspace Analyzer for Assessing Airspace Directional Permeability Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We build a software tool which enables the user (airline or Air Traffic Service Provider (ATSP)) the ability to analyze the flight-level-by-flight-level permeability...

  4. Multisensor Analyzed Sea Ice Extent - Northern Hemisphere (MASIE-NH)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Multisensor Analyzed Sea Ice Extent Northern Hemisphere (MASIE-NH) products provide measurements of daily sea ice extent and sea ice edge boundary for the...

  5. AmAMorph: Finite State Morphological Analyzer for Amazighe

    Directory of Open Access Journals (Sweden)

    Fatima Zahra Nejme

    2016-03-01

    This paper presents AmAMorph, a morphological analyzer for the Amazighe language using a system based on the NooJ linguistic development environment. The paper begins with the development of Amazighe lexicons with large-coverage formalization. The built electronic lexicons, named 'NAmLex', 'VAmLex' and 'PAmLex' (which stand for 'Noun Amazighe Lexicon', 'Verb Amazighe Lexicon' and 'Particles Amazighe Lexicon'), link inflectional, morphological, and syntactic-semantic information to the list of lemmas. Automated inflectional and derivational routines are applied to each lemma, producing the inflected forms. To our knowledge, AmAMorph is the first morphological analyzer for Amazighe. It identifies the component morphemes of the forms using large-coverage morphological grammars. Along with the description of how the analyzer is implemented, this paper gives an evaluation of the analyzer.

  6. Analyzing Spread of Influence in Social Networks for Transportation Application.

    Science.gov (United States)

    2016-09-02

    This project analyzed the spread of influence in social media, in particular, the Twitter social media site, and identified the individuals who exert the most influence to those they interact with. There are published studies that use social media to...

  7. The Photo-Pneumatic CO2 Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We are proposing to build a new technology, the photo-pneumatic analyzer. It is small, solid-state, inexpensive, and appropriate for observations of atmospheric...

  8. CASSINI COSMIC DUST ANALYZER CALIBRATED/RESAMPLED DATA

    Data.gov (United States)

    National Aeronautics and Space Administration — The Cosmic Dust Analyzer (CDA) is an instrument on the Cassini Orbiter that studies the physical properties of dust particles hitting the detector. This data set...

  9. GIOTTO JOHNSTONE PARTICLE ANALYZER MERGED DATA V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — This dataset contains results from the Implanted Ion Sensor (IIS) 4DH mode and the Fast Ion Sensor SW and HAR modes of the Three- Dimensional Particle Analyzer (JPA)...

  10. Mars & Multi-Planetary Electrical Environment Spectrum Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Our objective is to develop MENSA as a highly integrated planetary radio and digital spectrum analyzer cubesat payload that can be deployed as a satellite instrument...

  11. Analyzing Spread of Influence in Social Networks for Transportation Applications

    Science.gov (United States)

    2016-09-02

    This project analyzed the spread of influence in social media, in particular, the Twitter social media site, and identified the individuals who exert the most influence to those they interact with. There are published studies that use social media to...

  12. The quality infrastructure measuring, analyzing, and improving library services

    CERN Document Server

    Murphy, Sarah Anne

    2013-01-01

    Summarizing specific tools for measuring service quality alongside tips for using these tools most effectively, this book helps libraries of all kinds take a programmatic approach to measuring, analyzing, and improving library services.

  13. Proteomics: an efficient tool to analyze nematode proteins

    Science.gov (United States)

    Proteomic technologies have been successfully used to analyze protein structure and characteristics in plants, animals, microbes, and humans. We used proteomics methodologies to separate and characterize soybean cyst nematode (SCN) proteins. Optimizing the quantity of proteins required to separat...

  14. Triple Isotope Water Analyzer for Extraplanetary Studies Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Los Gatos Research (LGR) proposes to employ Off-Axis ICOS to develop triple-isotope water analyzers for lunar and other extraplanetary exploration. This instrument...

  15. Mobile Greenhouse Gas Flux Analyzer for Unmanned Aerial Vehicles Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Los Gatos Research (LGR) proposes to develop highly-accurate, lightweight, low-power gas analyzers for measurements of carbon dioxide (CO2) and water vapor (H2O)...

  16. Mobile Greenhouse Gas Flux Analyzer for Unmanned Aerial Vehicles Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In this SBIR Phase I effort, Los Gatos Research (LGR) proposes to develop a highly-accurate, lightweight, low-power gas analyzer for eddy flux covariance...

  17. Six tesla analyzing magnet for heavy-ion beam transport

    Energy Technology Data Exchange (ETDEWEB)

    Smith, R.P.; Bollinger, L.; Erskine, J.; Genens, L.; Hoffman, J.

    1980-01-01

    A superconducting analyzer magnet for particle beam deflection has been designed and is being fabricated for use at the Argonne Tandem-Linac Accelerator System (ATLAS). This six tesla magnet will provide 45° of deflection for the heavy-ion beams from the ATLAS tandem electrostatic accelerator and together with its twin will replace the existing conventional 90° analyzer magnet which will become inadequate when ATLAS is completed.

  18. Pseudomonas aeruginosa Virulence Analyzed in a Dictyostelium discoideum Host System

    OpenAIRE

    Cosson, Pierre; Zulianello, Laurence; Join-Lambert, Olivier; Faurisson, François; Gebbie, Leigh; Benghezal, Mohammed; Van Delden, Christian; Kocjancic Curty, Lasta; Köhler, Thilo

    2002-01-01

    Pseudomonas aeruginosa is an important opportunistic pathogen that produces a variety of cell-associated and secreted virulence factors. P. aeruginosa infections are difficult to treat effectively because of the rapid emergence of antibiotic-resistant strains. In this study, we analyzed whether the amoeba Dictyostelium discoideum can be used as a simple model system to analyze the virulence of P. aeruginosa strains. The virulent wild-type strain PAO1 was shown to inhibit growth of D. discoide...

  19. Analyzing Control Challenges for Thermal Energy Storage in Foodstuffs

    DEFF Research Database (Denmark)

    Hovgaard, Tobias Gybel; Larsen, Lars F. S.; Skovrup, Morten Juel

    2012-01-01

    of refrigerated goods in a supermarket to shift the load of the system in time without deteriorating the quality of the foodstuffs. The analyses in this paper go before closing any control loops. In the first part, we introduce and validate a new model with which we can estimate the actual temperatures... foodstuffs make them behave differently when exposed to changes in air temperature. We present a novel analysis based on Biot and Fourier numbers for the different foodstuffs. This provides a valuable tool for determining how different items can be utilized in load-shifting schemes on different timescales...
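
    For reference, the two dimensionless groups used in that analysis are the Biot number, Bi = hL/k (surface convection relative to internal conduction), and the Fourier number, Fo = αt/L² (dimensionless time). A minimal Python sketch with illustrative property values, not taken from the paper:

        def biot(h, L, k):
            return h * L / k          # convection/conduction ratio

        def fourier(alpha, t, L):
            return alpha * t / L**2   # dimensionless time

        h = 10.0        # W/(m^2 K), assumed air-side heat transfer coefficient
        L = 0.02        # m, assumed characteristic half-thickness of the item
        k = 0.5         # W/(m K), assumed thermal conductivity of the foodstuff
        alpha = 1.4e-7  # m^2/s, assumed thermal diffusivity

        print("Bi =", biot(h, L, k))   # Bi > 0.1: interior gradients matter
        print("Fo after 1 h =", fourier(alpha, 3600.0, L))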

  20. Maine River Temperature Monitoring

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — We collect seasonal and annual temperature measurements on an hourly or quarter hourly basis to monitor habitat suitability for ATS and other species. Temperature...

  1. High temperature measuring device

    Science.gov (United States)

    Tokarz, Richard D.

    1983-01-01

    A temperature measuring device for very high design temperatures (up to 2,000°C). The device comprises a homogeneous base structure, preferably in the form of a sphere or cylinder. The base structure contains a large number of individual walled cells. The base structure has a decreasing coefficient of elasticity within the temperature range being monitored. A predetermined quantity of inert gas is confined within each cell. The cells are dimensionally stable at the normal working temperature of the device. Increases in gaseous pressure within the cells will permanently deform the cell walls at temperatures within the high temperature range to be measured. Such deformation can be correlated to temperature by calibrating similarly constructed devices under known time and temperature conditions.
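
    The peak-temperature record relies on the isochoric pressure rise of the trapped gas; in standard ideal-gas form (a textbook relation, not quoted from the patent):

        P(T) = P_0 \, \frac{T}{T_0} \qquad (V \approx \mathrm{const.}, \; T \text{ in kelvin})

    A cell wall deforms permanently once P(T) exceeds its yield pressure at that temperature, so the pattern of deformed cells encodes the peak temperature reached.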

  2. GISS Surface Temperature Analysis

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The GISTEMP dataset is a global 2x2 gridded temperature anomaly dataset. Temperature data is updated around the middle of every month using current data files from...

  3. Temperature-amplitude coupling for stable biological rhythms at different temperatures.

    Science.gov (United States)

    Kurosawa, Gen; Fujioka, Atsuko; Koinuma, Satoshi; Mochizuki, Atsushi; Shigeyoshi, Yasufumi

    2017-06-01

    Most biological processes accelerate with temperature, for example cell division. In contrast, the circadian rhythm period is robust to temperature fluctuation, termed temperature compensation. Temperature compensation is peculiar because a system-level property (i.e., the circadian period) is stable under varying temperature while individual components of the system (i.e., biochemical reactions) are usually temperature-sensitive. To understand the mechanism for period stability, we measured the time series of circadian clock transcripts in cultured C6 glioma cells. The amplitudes of Cry1 and Dbp circadian expression increased significantly with temperature. In contrast, other clock transcripts demonstrated no significant change in amplitude. To understand these experimental results, we analyzed mathematical models with different network topologies. It was found that the geometric mean amplitude of gene expression must increase to maintain a stable period with increasing temperatures and reaction speeds for all models studied. To investigate the generality of this temperature-amplitude coupling mechanism for period stability, we revisited data on the yeast metabolic cycle (YMC) period, which is also stable under temperature variation. We confirmed that the YMC amplitude increased at higher temperatures, suggesting temperature-amplitude coupling as a common mechanism shared by circadian and 4 h-metabolic rhythms.

  4. ICAN/DAMP-integrated composite analyzer with damping analysis capabilities: User's manual

    Science.gov (United States)

    Saravanos, Dimitrious A.; Sanfeliz, Jose G.

    1992-01-01

    This manual describes the use of the computer code ICAN/DAMP (Integrated Composite Analyzer with Damping Analysis Capabilities) for the prediction of damping in polymer-matrix composites. The code is written in FORTRAN 77 and is a version of the ICAN (Integrated Composite ANalyzer) computer program. The code incorporates a new module for synthesizing the material damping from the micromechanics to the laminate level. Explicit micromechanics equations based on hysteretic damping are programmed, relating the on-axis damping capacities to the fiber and matrix properties and the fiber volume ratio. The damping capacities of unidirectional composites subjected to off-axis loading are synthesized from on-axis damping values. The hygrothermal effect on the damping performance of unidirectional composites caused by temperature and moisture variation is modeled, along with the damping contributions from interfacial friction between broken fibers and matrix. The temperature rise in continuously vibrating composite plies and composite laminates is also estimated. The ICAN/DAMP user's manual provides descriptions of the damping analysis module's functions, structure, input requirements, output interpretation, and execution requirements. It only addresses the changes required to conduct the damping analysis and is used in conjunction with the 'Second Generation Integrated Composite Analyzer (ICAN) Computer Code' user's manual (NASA TP-3290).

  5. Rescaling Temperature and Entropy

    Science.gov (United States)

    Olmsted, John, III

    2010-01-01

    Temperature and entropy traditionally are expressed in units of kelvin and joule/kelvin. These units obscure some important aspects of the natures of these thermodynamic quantities. Defining a rescaled temperature using the Boltzmann constant, T' = k_B T, expresses temperature in energy units, thereby emphasizing the close relationship…
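
    Written out, the rescaling and its natural companion for entropy (the dimensionless form implied by the abstract) are

        T' = k_B T \quad [\mathrm{J}], \qquad S' = S / k_B \quad [\text{dimensionless}],

    so that the product is unchanged, T'S' = TS, and thermodynamic relations keep their form in the rescaled variables.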

  6. Development of a Thermal/Optical Carbon Analyzer with Multi-Wavelength Capabilities

    Science.gov (United States)

    Sumlin, B.; Chow, J. C.; Watson, J. G.; Wang, X.; Gronstal, S.; Chen, L. W. A. A.; Trimble, D.

    2014-12-01

    A thermal/optical carbon analyzer (DRI Model 2015) equipped with a novel seven-wavelength light source (405, 445, 532, 635, 780, 808, and 980 nm) was developed to analyze the chemical and optical properties of particles collected on quartz-fiber filters. Based on the DRI Model 2001 carbon analyzer at 633 nm, major modifications were made to the mechanical and electrical components, flow control, and carbon detector to adopt modern technologies, increase instrument reliability, and reduce costs and maintenance. The correlation between wavelength-dependent light attenuation and organic and elemental carbon (OC and EC, respectively) content allows estimation of the amount of brown and black carbon (BrC and BC, respectively) on filters. Continuous monitoring of the light reflected from and transmitted through the filter, along with the carbon evolved from the filter when heated to different temperatures under either inert or oxidizing gas environments, provides insights into the optical properties of the carbon released from the filter; it also allows examination of the charring process, as pyrolyzed char has been one of the major uncertainties in quantifying OC and EC. The objectives of this study are: 1) to establish performance equivalency between the Model 2015 and Model 2001 DRI carbon analyzers when comparing similar laser wavelengths, to maintain consistency for long-term network sample analysis; and 2) to analyze the multi-wavelength signal to quantify BrC and BC and to optimize char correction. A selection of samples, including standard chemicals, rural and urban ambient filters, and emission sources from biomass burning, diesel and gasoline engine exhaust, and resuspended dust, was measured by both the Model 2015 and Model 2001 analyzers. The instrument design, calibration, comparison with the legacy analyzer, and interpretation of the multi-wavelength measurements will be presented.

  7. Thermochemical stability of Li-Cu-O ternary compounds stable at room temperature analyzed by experimental and theoretical methods

    Energy Technology Data Exchange (ETDEWEB)

    Lepple, Maren [Karlsruhe Institute of Technology, Eggenstein-Leopoldshafen (Germany). Inst. for Applied Materials - Applied Materials Physics; Technische Univ. Darmstadt (Germany). Eduard-Zintl-Inst. of Inorganic and Physical Chemistry; Rohrer, Jochen; Albe, Karsten [Technische Univ. Darmstadt (Germany). Fachgebiet Materialmodellierung; Adam, Robert; Rafaja, David [Technical Univ. Freiberg (Germany). Inst. of Materials Science; Cupid, Damian M. [Karlsruhe Institute of Technology, Eggenstein-Leopoldshafen (Germany). Inst. for Applied Materials - Applied Materials Physics; Austrian Institute of Technology GmbH, Vienna (Austria). Center for Low-Emission Transport TECHbase; Seifert, Hans J. [Karlsruhe Institute of Technology, Eggenstein-Leopoldshafen (Germany). Inst. for Applied Materials - Applied Materials Physics

    2017-11-15

    Compounds in the Li-Cu-O system are of technological interest due to their electrochemical properties, which make them attractive as electrode materials, e.g., in future lithium-ion batteries. In order to select promising compositions for such applications, reliable thermochemical data are a prerequisite. Although various groups have investigated individual ternary phases using different experimental setups, up to now no systematic study of all relevant phases has been available in the literature. In this study, we combine drop solution calorimetry with density functional theory calculations to systematically investigate the thermodynamic properties of ternary Li-Cu-O phases. In particular, we present a consistently determined set of enthalpies of formation, Gibbs energies and heat capacities for LiCuO, Li2CuO2 and LiCu2O2 and compare our results with the existing literature.

  8. Contactless dynamic tests for analyzing effects of speed and temperature on the natural frequency of a machine tool spindle

    National Research Council Canada - National Science Library

    Atsushi Matsubara; Kohei Asano; Toshiyuki Muraki

    2015-01-01

    This paper presents a contactless dynamic test for characterizing several effects on the dynamic stiffness, in particular the first mode frequency, in the radial direction of a rigidly preloaded spindle...

  9. Analyzing microporosity with vapor thermogravimetry and gas pycnometry

    NARCIS (Netherlands)

    Dral, A. Petra; ten Elshof, Johan E.

    2018-01-01

    The complementary use of thermogravimetry and pycnometry is demonstrated to expand the toolbox for experimental micropore analysis <1 nm. Thermogravimetry is employed to assess the uptake of water, methanol, ethanol, 1-propanol and cyclohexane vapors in microporous structures at room temperature and

  10. Harmonic analysis utilizing a Phonodeik and an Henrici analyzer

    Science.gov (United States)

    Fickinger, William J.; Hanson, Roger J.; Hoekje, Peter L.

    2004-05-01

    Dayton C. Miller of the Case School of Applied Science assembled a series of instruments for accurate analysis of sound [D. C. Miller, J. Franklin Inst. 182, 285-322 (1916)]. He created the Phonodeik to display and record sound waveforms of musical instruments, voices, fog horns, and so on. Waveforms were analyzed with the Henrici harmonic analyzer, built in Switzerland by G. Coradi. In this device, the motion of a stylus along the curve to be analyzed causes a series of spheres to rotate; two moveable rollers in contact with the nth sphere record the contributions of the sine(nx) and cosine(nx) components of the wave. Corrections for the measured spectra are calculated from analysis of the response of the Phonodeik. Finally, the original waveform could be reconstructed from the corrected spectral amplitudes and phases by a waveform synthesizer, also built at Case. Videos will be presented that show the motion of the gears, spheres, and dials of a working Henrici analyzer, housed at the Department of Speech Pathology and Audiology at the University of Iowa. Operation of the Henrici analyzer and the waveform synthesizer will be explained.

  11. Health Services Cost Analyzing in Tabriz Health Centers 2008

    Directory of Open Access Journals (Sweden)

    Massumeh gholizadeh

    2015-08-01

    Background and objectives: Health services cost analysis is an important management tool for evidence-based decision making in the health system. This study was conducted with the purpose of analyzing costs and identifying the proportion of different factors in the total cost of health services provided in urban health centers in Tabriz. Material and methods: This was a descriptive and analytic study. The Activity Based Costing (ABC) method was used for cost analysis. This cross-sectional survey analyzed and identified the proportion of different factors in the total cost of health services provided in Tabriz urban health centers. The statistical population of this study comprised the urban community health centers in Tabriz. A multi-stage sampling method was used to collect data. Excel software was used for data analysis. The results are described with tables and graphs. Results: The study results showed the share of different factors in various health services. Human resources accounted for 58% of the costs of health services in Tabriz urban health centers, physical space for 8%, and medical equipment for 1.3%. Conclusion: Since human resources accounted for the highest share of health service costs in Tabriz urban health centers, balancing workload with staff numbers, institutionalizing performance-based management, and using multidisciplinary staff may reduce the costs of services.

  12. Pseudocode Interpreter (Pseudocode Integrated Development Environment with Lexical Analyzer and Syntax Analyzer using Recursive Descent Parsing Algorithm

    Directory of Open Access Journals (Sweden)

    Christian Lester D. Gimeno

    2017-11-01

    This research study focused on the development of a software that helps students design, write, validate and run their pseudocode in a semi Integrated Development Environment (IDE) instead of manually writing it on a piece of paper. Specifically, the study aimed to develop a lexical analyzer or lexer, a syntax analyzer or parser using the recursive descent parsing algorithm, and an interpreter. The lexical analyzer reads the pseudocode source as a sequence of symbols or characters called lexemes. The lexemes are then analyzed by the lexer, which matches patterns for valid tokens and passes them to the syntax analyzer or parser. The syntax analyzer or parser takes those valid tokens and builds meaningful commands, using the recursive descent parsing algorithm, in the form of an abstract syntax tree. The generation of the abstract syntax tree is based on the grammar rules specified by the researcher, expressed in Extended Backus-Naur Form. The interpreter takes the generated abstract syntax tree and starts the evaluation or interpretation to produce the pseudocode output. The software was evaluated using white-box testing by several ICT professionals and black-box testing by several computer science students, based on the International Organization for Standardization (ISO) 9126 software quality standards. The overall results of the evaluation, both for white-box and black-box testing, were described as 'Excellent' in terms of functionality, reliability, usability, efficiency, maintainability and portability.
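
    As a compact illustration of the lexer / recursive-descent parser / interpreter pipeline described above, the Python sketch below handles only integer arithmetic expressions, not the study's full pseudocode grammar, and for brevity it evaluates during the parse instead of building an explicit abstract syntax tree:

        import re

        TOKEN = re.compile(r"\s*(?:(\d+)|(.))")   # integers or single-char operators

        def lex(src):
            # The lexer turns the source string into (kind, value) tokens.
            for num, op in TOKEN.findall(src):
                yield ("NUM", int(num)) if num else ("OP", op)
            yield ("EOF", None)

        class Parser:
            def __init__(self, tokens):
                self.tokens, self.pos = list(tokens), 0

            def peek(self):
                return self.tokens[self.pos]

            def eat(self, kind, value=None):
                tok = self.tokens[self.pos]
                assert tok[0] == kind and (value is None or tok[1] == value), tok
                self.pos += 1
                return tok

            def expr(self):    # expr := term (('+'|'-') term)*
                v = self.term()
                while self.peek() in (("OP", "+"), ("OP", "-")):
                    v = v + self.term() if self.eat("OP")[1] == "+" else v - self.term()
                return v

            def term(self):    # term := factor (('*'|'/') factor)*
                v = self.factor()
                while self.peek() in (("OP", "*"), ("OP", "/")):
                    v = v * self.factor() if self.eat("OP")[1] == "*" else v // self.factor()
                return v

            def factor(self):  # factor := NUM | '(' expr ')'
                if self.peek()[0] == "NUM":
                    return self.eat("NUM")[1]
                self.eat("OP", "(")
                v = self.expr()
                self.eat("OP", ")")
                return v

        print(Parser(lex("1 + 2 * (3 - 4)")).expr())   # prints -1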

  13. Investigation of Asphaltene Precipitation at Elevated Temperature

    DEFF Research Database (Denmark)

    Andersen, Simon Ivar; Lindeloff, Niels; Stenby, Erling Halfdan

    1998-01-01

    In order to obtain quantitative data on the asphaltene precipitation induced by the addition of n-alkane (heptane) at temperatures above the normal boiling point of the precipitant, a high temperature/high pressure filtration apparatus has been constructed. Oil and alkane are mixed... in the extracted fraction, hence there is no room for stirring. The equipment, as well as solutions to some of the problems, is presented along with precipitation data from 40 to 200 degrees C. The separated asphaltenes are analyzed using FT-IR. The filtrate containing the maltenes was cooled to room temperature...

  14. Effects of radiant temperature on thermal comfort

    Energy Technology Data Exchange (ETDEWEB)

    Atmaca, Ibrahim; Kaynakli, Omer; Yigit, Abdulvahap [Uludag University, Bursa (Turkey). Faculty of Engineering and Architecture, Department of Mechanical Engineering

    2007-09-15

    The aim of this paper is to investigate the local differences between body segments caused by high radiant temperature, and to analyze the interior surface temperatures of different wall and ceiling constructions and their effect on thermal comfort. For the segment-wise thermal interactions between the human body and its surroundings, simulations have been conducted by appropriately extending the Gagge two-node model to the multi-segment case to demonstrate the local differences. Simulation results are found to be in good agreement with experimental and simulation results reported in the literature. To calculate the interior surface temperatures of the wall and ceiling, the sol-air temperature approach is used for convenience. It is shown in the paper that body segments close to relatively hot surfaces are affected more than others, and that interior surface temperatures of un-insulated walls and ceilings exposed to strong solar radiation reach high levels, both of which cause thermal discomfort for occupants of buildings. (author)
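
    The sol-air temperature mentioned above folds the absorbed solar load into an equivalent outdoor air temperature; in its common form (standard symbols, since the paper's exact notation is not reproduced here):

        T_{sol\text{-}air} = T_{out} + \frac{\alpha I}{h_o} - \frac{\varepsilon \, \Delta R}{h_o}

    where α is the surface solar absorptance, I the incident solar irradiance, h_o the combined outside heat-transfer coefficient, ε the surface emittance, and ΔR the correction for long-wave radiation exchange with the sky. The interior surface temperature then follows from the steady conduction balance through the wall or ceiling.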

  15. Room temperature and productivity in office work

    Energy Technology Data Exchange (ETDEWEB)

    Seppanen, O.; Fisk, W.J.; Lei, Q.H.

    2006-07-01

    Indoor temperature is one of the fundamental characteristics of the indoor environment. It can be controlled with a degree of accuracy dependent on the building and its HVAC system. The indoor temperature affects several human responses, including thermal comfort, perceived air quality, sick building syndrome symptoms, and performance at work. In this study, we focused on the effects of temperature on performance in office work. We included those studies that used objective indicators of performance that are likely to be relevant in office-type work, such as text processing, simple calculations (addition, multiplication), length of telephone customer service time, and total handling time per customer for call-center workers. We excluded data from studies of industrial work performance. We calculated from all studies the percentage of performance change per degree increase in temperature, and statistically analyzed the measured work performance against temperature. The results show that performance increases with temperature up to 21-22 °C, and decreases with temperature above 23-24 °C. The highest productivity is at a temperature of around 22 °C. For example, at a temperature of 30 °C, the performance is only 91.1% of the maximum, i.e., the reduction in performance is 8.9%.

  16. Temperature-Compensated Clock Skew Adjustment

    Directory of Open Access Journals (Sweden)

    Joaquín Olivares

    2013-08-01

    This work analyzes several drift compensation mechanisms in wireless sensor networks (WSNs). Temperature is an environmental factor that greatly affects the oscillators shipped in every WSN mote. This behavior creates the need for improved drift compensation mechanisms in synchronization protocols. Using the Flooding Time Synchronization Protocol (FTSP), this work demonstrates that crystal oscillators are affected by temperature variations; the influence of temperature thus degrades the performance of FTSP under changing temperature conditions. This article proposes an innovative correction factor that minimizes the impact of temperature on the clock skew. By means of this factor, two new mechanisms are proposed in this paper: the Adjusted Temperature (AT) and the Advanced Adjusted Temperature (A2T). These mechanisms have been combined with FTSP to produce AT-FTSP and A2T-FTSP. Both have been tested in a network of TelosB motes running TinyOS. Results show that both AT-FTSP and A2T-FTSP improve the average synchronization errors compared to FTSP and other temperature-compensated protocols: Environment-Aware Clock Skew Estimation and Synchronization for WSN (EACS) and Temperature Compensated Time Synchronization (TCTS).
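
    A schematic of the kind of correction the AT/A2T mechanisms apply: the clock skew estimated at synchronization time is scaled by a temperature-dependent factor before extrapolating the global time. The linear model and its coefficient in the Python sketch below are placeholders, not the paper's actual correction factor:

        BETA = 0.035e-6   # assumed skew change per degC, illustrative only
        T_REF = 25.0      # reference temperature at synchronization, degC

        def compensated_skew(skew_ref, temp_c):
            # Adjust the skew estimated at T_REF for the current temperature.
            return skew_ref + BETA * (temp_c - T_REF)

        def estimate_global_time(local_t, offset, skew_ref, temp_c, t_sync):
            # Extrapolate the reference clock from the last synchronization point.
            return local_t + offset + compensated_skew(skew_ref, temp_c) * (local_t - t_sync)

        # Example: 10 ppm nominal skew, node now 10 degC hotter than at sync time.
        print(estimate_global_time(local_t=1000.0, offset=0.002,
                                   skew_ref=10e-6, temp_c=35.0, t_sync=900.0))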

  17. Analyzing the acoustic spectra of sound velocity and absorption in amphiphilic liquids

    Directory of Open Access Journals (Sweden)

    Sergei V. Mysik

    2015-10-01

    The paper analyzes theoretical approaches to the study of the acoustic spectra of sound velocity and absorption in the frequency range up to 10 GHz in liquid systems. Using ethoxylated derivatives of normal decyl alcohol (EDDn), which belong to the nonionic surfactants, it is shown that at room temperature and low degrees of ethoxylation n the acoustic spectra can be described in terms of relaxation theory. It is shown that, within experimental error, the acoustic spectra of EDDn in the studied range of frequencies and temperatures are composed of two principal regions of acoustic dispersion. The results of calculations of the relaxation and thermodynamic parameters of the fast and ultrafast restructuring processes of EDDn can be used in the development of combined technologies of enhanced oil recovery using surfactant solutions and various physical fields and factors.

  18. Advanced Stirling Radioisotope Generator Thermal Power Model in Thermal Desktop SINDA/FLUINT Analyzer

    Science.gov (United States)

    Wang, Xiao-Yen; Fabanich, William A.; Schmitz, Paul C.

    2012-01-01

    This paper presents a three-dimensional Advanced Stirling Radioisotope Generator (ASRG) thermal power model that was built using the Thermal Desktop SINDA/FLUINT thermal analyzer. The model was correlated with ASRG engineering unit (EU) test data and ASRG flight unit predictions from Lockheed Martin's Ideas TMG thermal model. ASRG performance under (1) ASC hot-end temperatures, (2) ambient temperatures, and (3) years of mission for the general purpose heat source fuel decay was predicted using this model for the flight unit. The results were compared with those reported by Lockheed Martin and showed good agreement. In addition, the model was used to study the performance of the ASRG flight unit for operations on the ground and on the surface of Titan, and the concept of using gold film to reduce thermal loss through insulation was investigated.

  19. Extreme thermal episodes analyzed with MODIS products during the boreal winter (2000-2016)

    Directory of Open Access Journals (Sweden)

    J. Gomis-Cebolla

    2016-06-01

    The beginning of the 21st century has been characterized by an intensification of global warming and by a series of drastic global meteorological events. In particular, during the winter season a series of extreme temperature episodes affecting large areas of the northern hemisphere have occurred. In this paper, these episodes are studied by analyzing the spatial distribution and temporal evolution of thermal anomalies in the period 2001-2016, using Land Surface Temperature (LST) products obtained from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor. Eight study regions in the northern hemisphere are considered. The results obtained for the heating and cooling episodes do not reveal an important discrepancy; however, an increase in the area affected by heating versus cooling is observed.

  20. Electrooptic integrated optics spectrum analyzer: an experimental investigation.

    Science.gov (United States)

    Arvidsson, G; Thylén, L

    1982-03-01

    An integrated optics spectrum analyzer based on using the linear electrooptic effect was theoretically investigated earlier. This spectrum analyzer performs Fourier analysis of sampled electronic signals, where each voltage sample is fed to an electrode of a metal electrode array. The electrode array acts as a spatial light modulator, and the diffracted light field is focused on a detector array. The intensity values at the detector elements represent a weighted discrete Fourier transform of the voltage values on the electrodes. The first experimental results to verify the principle of operation of the device are now reported and compared with theory. The prism coupling technique has been used to investigate the influence of an electrode array with voltages on a collimated beam in a LiNbO(3) waveguide. The voltage distributions analyzed correspond to a square wave function and rect functions. The theory has also been extended, and some pertinent signal processing properties of the device are discussed.

  1. Development of a Telemetric, Miniaturized Electrochemical Amperometric Analyzer

    Directory of Open Access Journals (Sweden)

    Jaehyo Jung

    2017-10-01

    In this research, we developed a portable, three-electrode electrochemical amperometric analyzer that can transmit data to a PC or a tablet via Bluetooth communication. We performed experiments using an indium tin oxide (ITO) glass electrode to confirm the performance and reliability of the analyzer. The proposed analyzer uses a current-to-voltage (I/V) converter to convert the current generated by the reduction-oxidation (redox) reaction of the buffer solution to a voltage signal. This signal is then digitized by the processor. The configuration of the power and ground of the printed circuit board (PCB) layer is divided into digital and analog parts to minimize the noise interference of each part. The proposed analyzer occupies an area of 5.9 × 3.25 cm² with a current resolution of 0.4 nA. A potential of 0~2.1 V can be applied between the working and the counter electrodes. The results of this study showed the accuracy of the proposed analyzer by measuring the Ruthenium(III) chloride (RuIII) concentration in 10 mM phosphate-buffered saline (PBS) solution with a pH of 7.4. The measured data can be transmitted to a PC or a mobile device such as a smartphone or a tablet PC using the included Bluetooth module. The proposed analyzer uses a 3.7 V, 120 mAh lithium polymer battery and can be operated for 60 min when fully charged, including data processing and wireless communication.

  2. Development of a Telemetric, Miniaturized Electrochemical Amperometric Analyzer.

    Science.gov (United States)

    Jung, Jaehyo; Lee, Jihoon; Shin, Siho; Kim, Youn Tae

    2017-10-23

    In this research, we developed a portable, three-electrode electrochemical amperometric analyzer that can transmit data to a PC or a tablet via Bluetooth communication. We performed experiments using an indium tin oxide (ITO) glass electrode to confirm the performance and reliability of the analyzer. The proposed analyzer uses a current-to-voltage (I/V) converter to convert the current generated by the reduction-oxidation (redox) reaction of the buffer solution to a voltage signal. This signal is then digitized by the processor. The configuration of the power and ground of the printed circuit board (PCB) layer is divided into digital and analog parts to minimize the noise interference of each part. The proposed analyzer occupies an area of 5.9 × 3.25 cm² with a current resolution of 0.4 nA. A potential of 0~2.1 V can be applied between the working and the counter electrodes. The results of this study showed the accuracy of the proposed analyzer by measuring the Ruthenium(III) chloride (RuIII) concentration in 10 mM phosphate-buffered saline (PBS) solution with a pH of 7.4. The measured data can be transmitted to a PC or a mobile device such as a smartphone or a tablet PC using the included Bluetooth module. The proposed analyzer uses a 3.7 V, 120 mAh lithium polymer battery and can be operated for 60 min when fully charged, including data processing and wireless communication.
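
    As a concrete illustration of the I/V conversion stage described in the two records above, the following Python sketch recovers an electrode current from a raw ADC code. The feedback resistance, ADC reference voltage and bit depth are assumed values for demonstration, not the published design parameters.

      R_FEEDBACK = 1.0e7    # transimpedance gain in ohms (assumed)
      V_REF = 2.5           # ADC reference voltage in volts (assumed)
      ADC_BITS = 16         # ADC resolution in bits (assumed)

      def adc_code_to_current(code):
          """Convert a raw ADC code back to the electrode current in amperes."""
          v_out = code * V_REF / (2 ** ADC_BITS - 1)
          return v_out / R_FEEDBACK   # I = V / R for an ideal I/V converter

      # Example: a mid-scale reading corresponds to ~125 nA with these values.
      print(adc_code_to_current(2 ** 15))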

  3. Evaluation of performance of veterinary in-clinic hematology analyzers.

    Science.gov (United States)

    Rishniw, Mark; Pion, Paul D

    2016-12-01

    A previous study provided information regarding the quality of in-clinic veterinary biochemistry testing. However, no similar studies for in-clinic veterinary hematology testing have been conducted. The objective of this study was to assess the quality of hematology testing in veterinary in-clinic laboratories using results obtained from testing 3 levels of canine EDTA blood samples. Clinicians prepared blood samples to achieve measurand concentrations within, below, and above their RIs and evaluated the samples in triplicate using their in-clinic analyzers. Quality was assessed by comparison of calculated total error with quality requirements, determination of sigma metrics, use of a quality goal index, and agreement between in-clinic and reference laboratory instruments. Suitability for statistical quality control was determined using adaptations from the computerized program, EZRules3. Evaluation of 10 veterinary in-clinic hematology analyzers showed that these instruments often fail to meet quality requirements. At least 60% of analyzers reasonably determined RBC, WBC, HCT, and HGB, when assessed by most quality goal criteria; platelets were less reliably measured, with 80% deemed suitable for low platelet counts, but only 30% for high platelet counts, and automated differential leukocyte counts were generally considered unsuitable for clinical use with fewer than 40% of analyzers meeting the least stringent quality goal requirements. Fewer than 50% of analyzers were able to meet requirements for statistical quality control for any measurand. These findings reflect the current status of in-clinic hematology analyzer performance and provide a basis for future evaluations of the quality of veterinary laboratory testing. © 2016 American Society for Veterinary Clinical Pathology.
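
    The sigma metrics mentioned above follow the definition commonly used in laboratory quality control, sketched below in Python. The allowable total error (TEa), bias and CV figures in the example are invented; the study's specific quality goals and EZRules3 adaptations are not reproduced here.

      def sigma_metric(tea_pct, bias_pct, cv_pct):
          """Sigma = (allowable total error - |bias|) / imprecision, all in percent."""
          return (tea_pct - abs(bias_pct)) / cv_pct

      # Example: TEa of 10%, bias of 2% and CV of 2% give a sigma of 4.0.
      print(sigma_metric(10.0, 2.0, 2.0))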

  4. JaDA – the Java Deadlock Analyzer

    OpenAIRE

    Garcia, Abel; Laneve, Cosimo

    2017-01-01

    JaDA is a static deadlock analyzer that targets Java byte-code. The core of JaDA is a behavioral type system especially designed to record dependencies between concurrent code. These behavioural types are thereafter analyzed by means of a fixpoint algorithm that reports potential deadlocks in the original Java code. We give a practical presentation of JaDA, highlighting the main connections between the tool and the theory behind it. We also present some of the features...

  5. Collecting, Analyzing and Visualizing Tweets using Open Source Tools

    OpenAIRE

    Yang, Seungwon; Kavanaugh, Andrea L.

    2011-01-01

    This tutorial will teach participants how to collect, analyze and visualize results from Twitter data. We will demonstrate several different free, open-source web-based tools that participants can use to collect Twitter data (e.g., Archivist, 140kit.com, TwapperKeeper), and show them a few different methods, tools or programs they can use to analyze the data in a given collection. Finally, we will show participants visualization tools and programs they can use to present the analyses, such ...

  6. Error Sources in the ETA Energy Analyzer Measurement

    Energy Technology Data Exchange (ETDEWEB)

    Nexsen, W E

    2004-12-13

    At present the ETA beam energies measured by the ETA energy analyzer and the DARHT spectrometer differ by ~12%. This discrepancy is due to two sources: an overestimate of the effective length of the ETA energy analyzer bending field, and data reduction methods that are not valid. The discrepancy can be eliminated if we return to the original process of measuring the angular deflection of the beam and use a value of 43.2 cm for the effective length of the axial field profile.

  7. International Space Station Major Constituent Analyzer On-Orbit Performance

    Science.gov (United States)

    Gardner, Ben D.; Erwin, Phillip M.; Cougar, Tamara; Ulrich, BettyLynn

    2017-01-01

    The Major Constituent Analyzer (MCA) is a mass spectrometer based system that measures the major atmospheric constituents on the International Space Station. A number of limited-life components require periodic change-out, including the ORU 02 analyzer and the ORU 08 Verification Gas Assembly. The most recent ORU 02 and ORU 08 assemblies in the LAB MCA are operating nominally. For ORU 02, the ion source filaments and ion pump lifetime continue to be key determinants of MCA performance. Finally, the Node 3 MCA is being brought to an operational configuration.

  8. Wind energy system time-domain (WEST) analyzers

    Science.gov (United States)

    Dreier, M. E.; Hoffman, J. A.

    1981-01-01

    A portable analyzer which simulates in real time the complex nonlinear dynamics of horizontal axis wind energy systems was constructed. Math models for an aeroelastic rotor featuring nonlinear aerodynamic and inertial terms were implemented with high speed digital controllers and analog calculation. This model was combined with other math models of elastic supports, control systems, a power train and gimballed rotor kinematics. A stroboscopic display system graphically depicting distributed blade loads, motion, and other aerodynamic functions on a cathode ray tube is included. Limited correlation efforts showed good comparison between the results of this analyzer and other sophisticated digital simulations. The digital simulation results were successfully correlated with test data.

  9. Design and Research on Automotive Controller Area Network Bus Analyzer

    Directory of Open Access Journals (Sweden)

    Hongwei CUI

    2014-03-01

    The detection of faults on the automotive controller area network (CAN) bus is researched in this paper. Fault identification for the CAN bus under different working conditions has been realized. To realize intelligent fault diagnosis, a data fusion approach is put forward. The composition of the analysis and detection system is introduced. By analyzing and processing data from the CAN bus and from sensors, the working condition of the vehicle is determined. A multi-pattern data fusion model and algorithm for fault diagnosis are investigated. The analyzer and detection system designed in this paper can be applied to automotive fault analysis, troubleshooting and maintenance.

  10. [Correlativity analysis based on radiation spectrum of correlated color temperature and thermodynamic temperature of a radiating source].

    Science.gov (United States)

    Fu, Tai-ran; Cheng, Xiao-fang; Zhong, Mao-hua; Yang, Zang-jian

    2006-11-01

    Correlated color temperature, which describes the chromaticity characteristics of a radiating source, is different from its thermodynamic temperature derived from primary spectrum pyrometry. However, establishing their mathematical relationship is feasible. The authors therefore theoretically analyzed how the difference between the correlated color temperature of a source and its thermodynamic temperature varies with the emissivity parameter, and give the corresponding numerical simulation results. These theoretical and numerical results make it possible for a colorimeter, used to measure the correlated color temperature, to serve as a pyrometer for measuring the thermodynamic temperature.

  11. Influence of temperature changes on ambient air NOx chemiluminescence measurements.

    Science.gov (United States)

    Miñarro, Marta Doval; Ferradás, Enrique González; Martínez, Francisco J Marzal

    2012-09-01

    Users of automatic air pollution monitors are largely unaware of how certain parameters, like temperature, can affect readings. The present work examines the influence of temperature changes on chemiluminescence NOx measurements made with a Thermo Scientific 42i analyzer, a model widely used in air monitoring networks and air pollution studies. These changes are grouped into two categories according to European Standard EN 14211: (1) changes in the air surrounding the analyzers and (2) changes in the sampled air. First, the sensitivity tests described in Standard EN 14211 were performed to determine whether the analyzer performance was adapted to the requirements of the standard. The analyzer met the performance criteria of both tests; however, some differences were detected in readings with temperature changes even though the temperature compensator was on. Sample temperature changes were studied more deeply as they were the most critical (they cannot be controlled and differences of several tens of degrees can be present in a single day). Significant differences in readings were obtained when changing sample temperature; however, maximum deviations were around 3% for temperature ranges of 15°C. If other possible uncertainty contributions are controlled and temperature variations with respect to the calibration temperature are not higher than 15°C, the effect of temperature changes could be acceptable and no data correction should have to be applied.

  12. Lock-in amplifier- based rotating- analyzer spectroscopic ellipsometer with micro-controlled angular frequency

    Energy Technology Data Exchange (ETDEWEB)

    Flores C, J.M.; Nunez O, O.F.; Rodriguez P, G.; Lastras M, A.; Lastras M, L.F. [Instituto de Investigacion en Comunicacion Optica, Universidad Autonoma de San Luis Potosi, Alvaro Obregon 64, 78000 San Luis Potosi (Mexico)

    2005-07-01

    We report on the development of a fully operational rotating-analyzer spectroscopic ellipsometer. This instrument employs a phase-sensitive amplifier to process the optical signal as an alternative to Fast Fourier Transform analysis. We describe electronic hardware designed to stabilize the rotation frequency of the analyzer prism as well as to drive the device that positions the polarizer prism azimuth. The ellipsometer allows for dielectric function measurement in the energy range from 1.7 to 5.5 eV, in both ambient air and ultra-high vacuum (UHV). UHV measurements can be carried out at temperatures as low as 150 K. To evaluate the ellipsometer performance we present results of the determination of the complex dielectric function of a number of semiconductors, namely GaSb, GaAs, InGaAs, CdTe and CdHgTe. (Author)

  13. Comparison of blood gas, electrolyte and metabolite results measured with two different blood gas analyzers and a core laboratory analyzer.

    Science.gov (United States)

    Uyanik, Metin; Sertoglu, Erdim; Kayadibi, Huseyin; Tapan, Serkan; Serdar, Muhittin A; Bilgi, Cumhur; Kurt, Ismail

    2015-04-01

    Blood gas analyzers (BGAs) are important in assessing and monitoring critically ill patients. However, the random use of BGAs to measure blood gases, electrolytes and metabolites increases the variability in test results. Therefore, this study aimed to investigate the correlation of blood gas, electrolyte and metabolite results measured with two BGAs and a core laboratory analyzer. A total of 40 arterial blood gas samples were analyzed with two BGAs [Nova Stat Profile Critical Care Xpress (Nova Biomedical, Waltham, MA, USA) and Siemens Rapidlab 1265 (Siemens Healthcare Diagnostics Inc., Tarrytown, NY, USA)] and a core laboratory analyzer [Olympus AU 2700 autoanalyzer (Beckman-Coulter, Inc., Fullerton, CA, USA)]. The results of pH, pCO₂, pO₂, SO₂, sodium (Na⁺), potassium (K⁺), calcium (Ca²⁺), chloride (Cl⁻), glucose, and lactate were compared by Passing-Bablok regression analysis and Bland-Altman plots. The present study showed that there was negligible variability of blood gases (pCO₂, pO₂, SO₂), K⁺ and lactate values between the blood gas and core laboratory analyzers. However, the differences in pH were modest, while Na⁺, Cl⁻, Ca²⁺ and glucose showed poor correlation according to the concordance correlation coefficient. BGAs and the core laboratory autoanalyzer demonstrated variable performances, and not all tests met minimum performance goals. It is important that clinicians and laboratories are aware of the limitations of their assays.
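
    For readers unfamiliar with the Bland-Altman comparison used in this study, the Python sketch below computes its two core quantities: the mean difference (bias) and the 95% limits of agreement. The paired sodium values are invented for the example, and the Passing-Bablok regression step is not reproduced.

      import numpy as np

      def bland_altman(a, b):
          """Return bias and 95% limits of agreement for paired results."""
          diff = a - b
          bias = diff.mean()
          sd = diff.std(ddof=1)
          return bias, bias - 1.96 * sd, bias + 1.96 * sd

      # Made-up paired sodium results (mmol/L) from a BGA and a core analyzer:
      bga = np.array([138.0, 141.0, 135.0, 144.0, 139.0])
      core = np.array([140.0, 142.0, 137.0, 143.0, 141.0])
      print(bland_altman(bga, core))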

  14. Partitioning and analyzing temporal variability of wash and bed ...

    Indian Academy of Sciences (India)

    Journal of Earth System Science, Volume 124, Issue 7. Partitioning and analyzing temporal variability of wash and bed material loads in a forest watershed in Iran. Keywords: laser particle size distribution; sand and gravel mining; sediment dynamics; suspended sediment; watershed management.

  15. DEVELOPMENT ANALYZERS TRANSACTIONS IN MONITORING THE BUSINESS ACTIVITIES OF ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    L. E. Sovik

    2013-01-01

    The article identifies the features of, and prerequisites for, implementing real-time business activity monitoring in food production technologies. A methodical approach to developing transaction analyzers for an organization's business processes is offered, and a monitoring scheme is constructed for one of the basic types of business events in the procurement process.

  16. Analyzing Properties of Stochastic Business Processes By Model Checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2013-01-01

    This chapter presents an approach to precise formal analysis of business processes with stochastic properties. The method presented here allows for both qualitative and quantitative properties to be individually analyzed at design time without requiring a full specification. This provides an effe...

  17. A New Methodology to Analyze Instabilities in SEM Imaging

    NARCIS (Netherlands)

    Mansilla, Catalina; Ocelik, Vaclav; De Hosson, Jeff T. M.

    2014-01-01

    This paper presents a statistical method to analyze instabilities that can be introduced during imaging in scanning electron microscopy (SEM). The method is based on the correlation of digital images and it can be used at different length scales. It consists of the evaluation of three different

  18. Dealing with Piaget: Analyzing Card Games for Understanding Concepts.

    Science.gov (United States)

    Weisskirch, Robert S.

    Students who take developmental psychology courses have difficulty applying theoretical concepts to situations separate from the context of theory. When learning about Piagetian theory, students often confine their understanding to demonstrations of conservation tasks. Analyzing Card Games, an active learning activity, allows students to apply the…

  19. Analyzing Perceptions of Prospective Teachers about Their Media Literacy Competencies

    Science.gov (United States)

    Recepoglu, Ergun; Ergun, Muammer

    2013-01-01

    The purpose of this study is to analyze perceptions of prospective teachers about their media literacy competencies in terms of different variables. This is a descriptive research in the survey model which tries to detect the current situation. Study group includes 580 prospective teachers from Turkish, Primary School, Social Studies, Science,…

  20. Analyzing 5 years of EC-TEL proceedings

    NARCIS (Netherlands)

    Reinhardt, Wolfgang; Meier, Christian; Drachsler, Hendrik; Sloep, Peter

    2011-01-01

    Reinhardt, W., Meier, C., Drachsler, H., & Sloep, P. B. (2011). Analyzing 5 years of EC-TEL proceedings. In C. D. Kloos, D. Gillet, R. M. Crespo García, F. Wild, & M. Wolpers (Eds.), Towards Ubiquitous Learning: 6th European Conference of Technology Enhanced Learning, EC-TEL 2011 (pp. 531-536).

  1. CTG Analyzer: A graphical user interface for cardiotocography.

    Science.gov (United States)

    Sbrollini, Agnese; Agostinelli, Angela; Burattini, Luca; Morettini, Micaela; Di Nardo, Francesco; Fioretti, Sandro; Burattini, Laura

    2017-07-01

    Cardiotocography (CTG) is the most commonly used test for establishing the good health of the fetus during pregnancy and labor. CTG consists of the recording of fetal heart rate (FHR; bpm) and maternal uterine contractions (UC; mmHg). FHR is characterized by baseline, baseline variability, tachycardia, bradycardia, accelerations and decelerations. The UC signal, in turn, is characterized by the presence and period of contractions. Such parameters are usually evaluated by visual inspection. However, visual analysis of CTG recordings has a well-demonstrated poor reproducibility, due to the complexity of the physiological phenomena affecting fetal heart rhythm and to its dependence on the clinician's experience. Computerized tools in support of clinicians represent a possible solution for improving correctness in CTG interpretation. This paper proposes CTG Analyzer as a graphical tool for automatic and objective analysis of CTG tracings. CTG Analyzer was developed in MATLAB® as an intuitive and user-friendly graphical user interface. The FHR time series and the UC signal are represented one under the other, on a grid with reference lines, as usually done for CTG reports printed on paper. Colors help identification of FHR and UC features. Automatic analysis is based on fixed feature definitions provided by the FIGO guidelines, plus other settings whose default values can be changed by the user. Finally, CTG Analyzer provides a report file listing all the quantitative results of the analysis. Thus, CTG Analyzer represents a potentially useful graphical tool for automatic and objective analysis of CTG tracings.
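
    As a rough illustration of the kind of automatic feature extraction involved, the Python sketch below estimates an FHR baseline by discarding transient accelerations and decelerations. It is a deliberate simplification with an assumed 15 bpm threshold and sampling rate, not the FIGO-based logic implemented in CTG Analyzer.

      import numpy as np

      def fhr_baseline(fhr_bpm, fs_hz=4.0):
          """Median FHR over the last 10 minutes, ignoring samples deviating
          more than 15 bpm from the window median (rough acc/dec exclusion)."""
          window = fhr_bpm[-int(10 * 60 * fs_hz):]
          med = np.median(window)
          stable = window[np.abs(window - med) <= 15.0]
          return float(np.median(stable))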

  2. The Pitch: How To Analyze Ads. 2nd Edition.

    Science.gov (United States)

    Rank, Hugh

    This book probes the ways ads persuade people to purchase, and attempts to teach individuals to become more discerning consumers. Critical thinking, when applied to analyzing ads, benefits consumers by helping them recognize patterns of persuasion and sort incoming information in order to get to the hidden message. The book's basic premise is that…

  3. Virtual Environment of Real Sport Hall and Analyzing Rendering Quality

    Directory of Open Access Journals (Sweden)

    Filip Popovski

    2015-02-01

    This paper presents a virtual environment of a real sport hall created in Quest3D VR Edition. Analyses of the rendering quality, the interaction techniques, and the real-time performance of the system are presented. We critically analyzed all of these techniques on different machines, with excellent results.

  4. Watch out for superman: first visualize, then analyze.

    Science.gov (United States)

    Kozak, Marcin

    2012-01-01

    A visit from Superman shows why data visualization should come before data analysis. The Web extra is a dataset that comprises 100 observations of the quantitative variables y and x plus the qualitative variable group. When analyzed correctly, this dataset exhibits an interesting pattern.

  5. Analyzing the Hidden Curriculum of Screen Media Advertising

    Science.gov (United States)

    Mason, Lance E.

    2015-01-01

    This media literacy article introduces a questioning framework for analyzing screen media with students and provides an example analysis of two contemporary commercials. Investigating screen conventions can help students understand the persuasive functions of commercials, as well as how the unique sensory experience of screen viewing affects how…

  6. On the Rationality of Traditional Akan Religion: Analyzing the ...

    African Journals Online (AJOL)

    Hasskei M. Majeed

    DOI: http://dx.doi.org/10.4314/ljh.v25i1.7. On the Rationality of Traditional Akan Religion: Analyzing the Concept of God. Hasskei M. Majeed, Lecturer, Department of Philosophy and Classics, University of Ghana, Legon. Abstract: This paper is an attempt to show how logically acceptable (or rational) belief in.

  7. Analyzing the Change-Proneness of APIs and web APIs

    NARCIS (Netherlands)

    Romano, D.

    2015-01-01

    APIs and web APIs are used to expose existing business logic and, hence, to ease the reuse of functionalities across multiple software systems. Software systems can use the business logic of legacy systems by binding their APIs and web APIs. With

  8. Analyzing the security posture of South African websites

    CSIR Research Space (South Africa)

    Mtsweni, Jabu, S

    2015-08-12

    ...defense mechanisms employed by the chosen websites. This approach was chosen because the client-side security policies, which may give an indication of the security posture of a website, can be analyzed without actively scanning multiple websites...

  9. Modeling and analyzing autogenous shrinkage of hardening cement paste

    NARCIS (Netherlands)

    Lu, T.; Koenders, E.A.B.

    2014-01-01

    In this paper, a conceptual model for analyzing the plastic part of autogenous deformation of cement paste based on the Arrhenius rate theory will be presented. The autogenous deformation will be calculated from the elastic deformations with inclusion of creep. Different kinds of cement paste with a

  10. Analyzing discussions on twitter: Case study on HPV vaccinations

    NARCIS (Netherlands)

    Kaptein, R.; Boertjes, E.; Langley, D.

    2014-01-01

    In this work we analyze the discussions on Twitter around the Human papillomavirus (HPV) vaccinations. We collect a dataset consisting of tweets related to the HPV vaccinations by searching for relevant keywords, by retrieving the conversations on Twitter, and by retrieving tweets from our user

  11. Some Problems in Recording and Analyzing South African English ...

    African Journals Online (AJOL)

    1994-05-24

    outsider (however, since D.C. Hauptfleisch was kind enough to react to sample entries which I had sent him from an earlier draft of this article, his remarks, valuable as always, are given here for the benefit of others too).

  12. Besieged by burqas: Analyzing representations of the burqa

    NARCIS (Netherlands)

    Mazurski, L.E.

    2015-01-01

    In this thesis, I analyze the ways in which various discourses produce knowledge about the burqa. Particularly since the attacks on the twin towers and the London bombings, Orientalist and neo-Orientalist tropes have been revitalized and propagated by ideologies of Islamophobia at work to

  13. Purdue Plane Structures Analyzer II : a computerized wood engineering system

    Science.gov (United States)

    S. K. Suddarth; R. W. Wolfe

    1984-01-01

    The Purdue Plane Structures Analyzer (PPSA) is a computer program developed specifically for the analysis of wood structures. It uses recognized analysis procedures, in conjunction with recommendations of the 1982 National Design Specification for Wood Construction, to determine stresses and deflections of wood trusses and frames. The program offers several options for...

  14. Analyzing the drivers of green manufacturing with fuzzy approach

    DEFF Research Database (Denmark)

    Govindan, Kannan; Diabat, Ali; Madan Shankar, K.

    2015-01-01

    ...India, and aided by their replies, a pair-wise comparison was made among the drivers. The pair-wise comparison is used as input data, and the drivers were analyzed on its basis. The analysis resorted to the use of a fuzzy Multi Criteria Decision Making (MCDM) approach. The obtained results...

  15. Analyzing Digital Library Initiatives: 5S Theory Perspective

    Science.gov (United States)

    Isah, Abdulmumin; Mutshewa, Athulang; Serema, Batlang; Kenosi, Lekoko

    2015-01-01

    This article traces the historical development of Digital Libraries (DLs), examines some DL initiatives in developed and developing countries and uses 5S Theory as a lens for analyzing the focused DLs. The analysis shows that present-day systems, in both developed and developing nations, are essentially content and user centric, with low level…

  16. AnalyzeThis: An Analysis Workflow-Aware Storage System

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Hyogi [ORNL; Kim, Youngjae [ORNL; Vazhkudai, Sudharshan S [ORNL; Tiwari, Devesh [ORNL; Anwar, Ali [Virginia Tech, Blacksburg, VA; Butt, Ali R [Virginia Tech, Blacksburg, VA; Ramakrishnan, Lavanya [Lawrence Berkeley National Laboratory (LBNL)

    2015-01-01

    The need for novel data analysis is urgent in the face of a data deluge from modern applications. Traditional approaches to data analysis incur significant data movement costs, moving data back and forth between the storage system and the processor. Emerging Active Flash devices enable processing on the flash, where the data already resides. An array of such Active Flash devices allows us to revisit how analysis workflows interact with storage systems. By seamlessly blending together the flash storage and data analysis, we create an analysis workflow-aware storage system, AnalyzeThis. Our guiding principle is that analysis-awareness be deeply ingrained in each and every layer of the storage, elevating data analyses as first-class citizens, and transforming AnalyzeThis into a potent analytics-aware appliance. We implement the AnalyzeThis storage system atop an emulation platform of the Active Flash array. Our results indicate that AnalyzeThis is viable, expediting workflow execution and minimizing data movement.

  17. Measuring and Analyzing the Scholarly Impact of Experimental Evaluation Initiatives

    DEFF Research Database (Denmark)

    Angelini, Marco; Ferro, Nicola; Larsen, Birger

    2014-01-01

    Evaluation initiatives have been widely credited with contributing highly to the development and advancement of information access systems, by providing a sustainable platform for conducting the very demanding activity of comparable experimental evaluation in a large scale. Measuring the impact......, a methodology for measuring their scholarly impact, and tools exploiting visual analytics to analyze the outcomes....

  18. Analyzing the use of pins in safety bearings

    DEFF Research Database (Denmark)

    da Fonseca, Cesar A. L. L.; Weber, Hans I.; Fleischer, Philip F.

    2015-01-01

    A new concept for safety bearings is analyzed: useful in emergency situations, it shall protect the bearing from destruction by the use of pins which impact with a disc, both capable of good energy dissipation. Results of work in progress are presented by validating partial stages of the developm...

  19. Probability model for analyzing fire management alternatives: theory and structure

    Science.gov (United States)

    Frederick W. Bratten

    1982-01-01

    A theoretical probability model has been developed for analyzing program alternatives in fire management. It includes submodels or modules for predicting probabilities of fire behavior, fire occurrence, fire suppression, effects of fire on land resources, and financial effects of fire. Generalized "fire management situations" are used to represent actual fire...

  20. 40 CFR 86.1522 - Carbon monoxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES (CONTINUED... engineering practice for instrument start-up and operation. Adjust the analyzer to optimize performance on the... § 86.1514-84. (3) Adjust the electrical span network such that the electrical span point is correct...

  1. Analyzing Parental Involvement Dimensions in Early Childhood Education

    Science.gov (United States)

    Kurtulmus, Zeynep

    2016-01-01

    The importance of parental involvement in children's academic and social development has been widely accepted. For children's later school success, the first years are crucial. Majority of the research focuses on enhancing and supporting parental involvement in educational settings. The purpose of this study was to analyze dimensions of parental…

  2. Analyzing Single-Event Gate Ruptures In Power MOSFET's

    Science.gov (United States)

    Zoutendyk, John A.

    1993-01-01

    Susceptibilities of power metal-oxide/semiconductor field-effect transistors (MOSFET's) to single-event gate ruptures were analyzed by exposing devices to beams of energetic bromine ions while applying appropriate bias voltages to source, gate, and drain terminals and measuring the current flowing into or out of each terminal.

  3. Analyzing the influence of institutions on health policy development ...

    African Journals Online (AJOL)

    Analyzing the influence of institutions on health policy development in Uganda: A case study of the decision to abolish user fees. ... Multiple data sources were used including: academic literature, key government documents, grey literature, and a variety of print media. Results: According to the analytical frameworks ...

  4. Assessment of Measurement Error when Using the Laser Spectrum Analyzers

    Directory of Open Access Journals (Sweden)

    A. A. Titov

    2015-01-01

    The article assesses measurement errors arising when laser spectrum analyzers are used. The analysis shows that it is possible to carry out a spectral analysis of both the amplitudes and the phases of the frequency components of signals, and to analyze the changing phase of the frequency components of radio signals, using interferometric measurement methods. It is found that interferometers with the Mach-Zehnder arrangement are the most widely used for measuring signal phase. The combined method can increase resolution compared with the other methods considered, since spatial integration is performed over one coordinate while time integration is performed over the other, which is achieved by arranging the modulators orthogonally to each other. A drawback of this method is its complexity and low speed: because of the integrator, the spectral components of a radio pulse cannot be measured if the pulse width is less than the temporal aperture. An advanced variant of the spectrum analyzer is therefore proposed, in which the phase is determined through signal processing, and the resolution of such an analyzer is presented. The article also reviews possible options for devices that measure the phase components of a spectrum, depending on the phase-measurement method applied; the analysis shows the time-pulse method to be the most promising. The known digital phase-meter circuits using this method cannot be used directly in spectrum analyzers, as they are designed to measure the phase of only one signal frequency. A number of circuits were therefore developed to measure the amplitude and phase of the frequency components of the radio signal. It is shown that a promising option for creating a spectrum analyzer is a device in which the phase is determined through the signal

  5. Field intercomparison of four methane gas analyzers suitable for eddy covariance flux measurements

    Directory of Open Access Journals (Sweden)

    O. Peltola

    2013-06-01

    Performances of four methane gas analyzers suitable for eddy covariance measurements are assessed. The assessment and comparison was performed by analyzing eddy covariance data obtained during summer 2010 (1 April to 26 October) at a pristine fen, Siikaneva, Southern Finland. High methane fluxes with pronounced seasonality have been measured at this fen. The four participating methane gas analyzers are commercially available closed-path units TGA-100A (Campbell Scientific Inc., USA), RMT-200 (Los Gatos Research, USA), G1301-f (Picarro Inc., USA) and an early prototype open-path unit Prototype-7700 (LI-COR Biosciences, USA). The RMT-200 functioned most reliably throughout the measurement campaign, during low and high flux periods. Methane fluxes from RMT-200 and G1301-f had the smallest random errors and the fluxes agree remarkably well throughout the measurement campaign. Cospectra and power spectra calculated from RMT-200 and G1301-f data agree well with corresponding temperature spectra during a high flux period. None of the gas analyzers showed statistically significant diurnal variation for methane flux. Prototype-7700 functioned only for a short period of time, over one month, in the beginning of the measurement campaign during a low flux period, and thus its overall accuracy and season-long performance were not assessed. The open-path gas analyzer is a practical choice for measurement sites in remote locations due to its low power demand, whereas for G1301-f methane measurements interference from water vapor is straightforward to correct since the instrument measures both gases simultaneously. In any case, if only the performance in this intercomparison is considered, RMT-200 performed the best and is the recommended choice if a new fast response methane gas analyzer is needed.

  6. Field intercomparison of four methane gas analyzers suitable for eddy covariance flux measurements

    Science.gov (United States)

    Peltola, O.; Mammarella, I.; Haapanala, S.; Burba, G.; Vesala, T.

    2013-06-01

    Performances of four methane gas analyzers suitable for eddy covariance measurements are assessed. The assessment and comparison was performed by analyzing eddy covariance data obtained during summer 2010 (1 April to 26 October) at a pristine fen, Siikaneva, Southern Finland. High methane fluxes with pronounced seasonality have been measured at this fen. The four participating methane gas analyzers are commercially available closed-path units TGA-100A (Campbell Scientific Inc., USA), RMT-200 (Los Gatos Research, USA), G1301-f (Picarro Inc., USA) and an early prototype open-path unit Prototype-7700 (LI-COR Biosciences, USA). The RMT-200 functioned most reliably throughout the measurement campaign, during low and high flux periods. Methane fluxes from RMT-200 and G1301-f had the smallest random errors and the fluxes agree remarkably well throughout the measurement campaign. Cospectra and power spectra calculated from RMT-200 and G1301-f data agree well with corresponding temperature spectra during a high flux period. None of the gas analyzers showed statistically significant diurnal variation for methane flux. Prototype-7700 functioned only for a short period of time, over one month, in the beginning of the measurement campaign during low flux period, and thus, its overall accuracy and season-long performance were not assessed. The open-path gas analyzer is a practical choice for measurement sites in remote locations due to its low power demand, whereas for G1301-f methane measurements interference from water vapor is straightforward to correct since the instrument measures both gases simultaneously. In any case, if only the performance in this intercomparison is considered, RMT-200 performed the best and is the recommended choice if a new fast response methane gas analyzer is needed.
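
    For context, the eddy covariance flux that these analyzers feed into is, at its core, the covariance of vertical wind speed and gas concentration fluctuations over an averaging period. The Python sketch below shows only that textbook step; real processing chains add despiking, coordinate rotation, spectral and WPL corrections, and the variable names here are illustrative.

      import numpy as np

      def ec_flux(w_ms, ch4_density):
          """F = mean(w' * c'), primes denoting deviations from the period mean."""
          w_prime = w_ms - w_ms.mean()
          c_prime = ch4_density - ch4_density.mean()
          return float(np.mean(w_prime * c_prime))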

  7. Comparison of chemistry analytes between 2 portable, commercially available analyzers and a conventional laboratory analyzer in reptiles.

    Science.gov (United States)

    McCain, Stephanie L; Flatland, Bente; Schumacher, Juergen P; Clarke Iii, Elsburgh O; Fry, Michael M

    2010-12-01

    Advantages of handheld and small bench-top biochemical analyzers include requirements for smaller sample volume and practicality for use in the field or in practices, but little has been published on the performance of these instruments compared with standard reference methods in analysis of reptilian blood. The aim of this study was to compare reptilian blood biochemical values obtained using the Abaxis VetScan Classic bench-top analyzer and a Heska i-STAT handheld analyzer with values obtained using a Roche Hitachi 911 chemical analyzer. Reptiles, including 14 bearded dragons (Pogona vitticeps), 4 blue-tongued skinks (Tiliqua gigas), 8 Burmese star tortoises (Geochelone platynota), 10 Indian star tortoises (Geochelone elegans), 5 red-tailed boas (Boa constrictor), and 5 Northern pine snakes (Pituophis melanoleucus melanoleucus), were manually restrained, and a single blood sample was obtained and divided for analysis. Results for concentrations of albumin, bile acids, calcium, glucose, phosphates, potassium, sodium, total protein, and uric acid and activities of aspartate aminotransferase and creatine kinase obtained from the VetScan Classic and Hitachi 911 were compared. Results for concentrations of chloride, glucose, potassium, and sodium obtained from the i-STAT and Hitachi 911 were compared. Compared with results from the Hitachi 911, those from the VetScan Classic and i-STAT had variable correlations, and constant or proportional bias was found for many analytes. Bile acid data could not be evaluated because results for 44 of 45 samples fell below the lower linearity limit of the VetScan Classic. Although the 2 portable instruments might provide measurements with clinical utility, there were significant differences compared with the reference analyzer, and development of analyzer-specific reference intervals is recommended. ©2010 American Society for Veterinary Clinical Pathology.

  8. Chapter 6: Temperature

    Science.gov (United States)

    Jones, Leslie A.; Muhlfeld, Clint C.; Hauer, F. Richard; F. Richard Hauer,; Lamberti, G.A.

    2017-01-01

    Stream temperature has direct and indirect effects on stream ecology and is critical in determining both abiotic and biotic system responses across a hierarchy of spatial and temporal scales. Temperature variation is primarily driven by solar radiation, while landscape topography, geology, and stream reach scale ecosystem processes contribute to local variability. Spatiotemporal heterogeneity in freshwater ecosystems influences habitat distributions, physiological functions, and phenology of all aquatic organisms. In this chapter we provide an overview of methods for monitoring stream temperature, characterization of thermal profiles, and modeling approaches to stream temperature prediction. Recent advances in temperature monitoring allow for more comprehensive studies of the underlying processes influencing annual variation of temperatures and how thermal variability may impact aquatic organisms at individual, population, and community based scales. Likewise, the development of spatially explicit predictive models provide a framework for simulating natural and anthropogenic effects on thermal regimes which is integral for sustainable management of freshwater systems.

  9. Automatic temperature adjustment apparatus

    Science.gov (United States)

    Chaplin, James E.

    1985-01-01

    An apparatus for increasing the efficiency of a conventional central space heating system is disclosed. The temperature of a fluid heating medium is adjusted based on a measurement of the external temperature, and a system parameter. The system parameter is periodically modified based on a closed loop process that monitors the operation of the heating system. This closed loop process provides a heating medium temperature value that is very near the optimum for energy efficiency.

  10. Temperature measurement and control

    CERN Document Server

    Leigh, JR

    1988-01-01

    This book treats the theory and practice of temperature measurement and control and important related topics such as energy management and air pollution. There are no specific prerequisites for the book although a knowledge of elementary control theory could be useful. The first half of the book is an application oriented survey of temperature measurement techniques and devices. The second half is concerned mainly with temperature control in both simple and complex situations.

  11. Empirical Temperature Measurement in Protoplanetary Disks

    Science.gov (United States)

    Weaver, Erik; Isella, Andrea; Boehler, Yann

    2018-02-01

    The accurate measurement of temperature in protoplanetary disks is critical to understanding many key features of disk evolution and planet formation, from disk chemistry and dynamics, to planetesimal formation. This paper explores the techniques available to determine temperatures from observations of single, optically thick molecular emission lines. Specific attention is given to issues such as the inclusion of optically thin emission, problems resulting from continuum subtraction, and complications of real observations. Effort is also made to detail the exact nature and morphology of the region emitting a given line. To properly study and quantify these effects, this paper considers a range of disk models, from simple pedagogical models to very detailed models including full radiative transfer. Finally, we show how the use of the wrong methods can lead to potentially severe misinterpretations of data, leading to incorrect measurements of disk temperature profiles. We show that the best way to estimate the temperature of emitting gas is to analyze the line peak emission map without subtracting continuum emission. Continuum subtraction, which is commonly applied to observations of line emission, systematically leads to underestimation of the gas temperature. We further show that once observational effects such as beam dilution and noise are accounted for, the line brightness temperature derived from the peak emission is reliably within 10%–15% of the physical temperature of the emitting region, assuming optically thick emission. The methodology described in this paper will be applied in future works to constrain the temperature, and related physical quantities, in protoplanetary disks observed with ALMA.
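
    The brightness temperature invoked above follows from inverting the Planck function at the observing frequency. The Python sketch below applies that standard inversion in SI units; it is a generic formula rather than the paper's full radiative-transfer analysis, and the example intensity is invented.

      import numpy as np

      H = 6.62607015e-34   # Planck constant, J s
      K_B = 1.380649e-23   # Boltzmann constant, J/K
      C = 2.99792458e8     # speed of light, m/s

      def brightness_temperature(intensity, nu_hz):
          """Planck brightness temperature for intensity in W m^-2 Hz^-1 sr^-1."""
          return (H * nu_hz / K_B) / np.log(1.0 + 2.0 * H * nu_hz**3 / (C**2 * intensity))

      # Example: an assumed peak intensity at the CO J=2-1 frequency (230.538 GHz).
      print(brightness_temperature(5e-16, 230.538e9))   # ~36 K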

  12. Cardiac arrhythmogenesis and temperature.

    Science.gov (United States)

    Shah, Ujas; Bien, Harold; Entcheva, Emilia

    2006-01-01

    Fast processes in cardiac electrophysiology are often studied at temperatures lower than physiological. Extrapolation of values is based on the widely accepted Q10 (Arrhenius) model of temperature dependence (the ratio of a kinetic property's values for a 10°C change in temperature). In this study, we set out to quantify the temperature dependence of essential parameters that define the spatiotemporal behavior of cardiac excitation. Additionally, we examined temperature's effects on restitution dynamics. We employed fast fluorescence imaging with voltage- and calcium-sensitive dyes in neonatal rat cardiomyocyte sheets. Changes in conduction velocity (CV), calcium transient duration (CTD), action potential duration (APD) and wavelength (W = CV × duration) as functions of temperature were quantified. Using 24°C as a reference point, we found a strong temperature-driven increase of CV (Q10 = 2.3) with smaller CTD and APD changes (Q10 = 1.33 and 1.24, respectively). The spatial equivalents of voltage and calcium duration, the wavelengths, were slightly less sensitive to temperature, with Q10 = 2.05 and 1.78, respectively, due to the opposing influences of decreasing duration and increasing velocity. More importantly, we found that Q10 varies as a function of diastolic interval. Our results indicate the importance of examining temperature sensitivity across several frequencies. Armed with our results, experimentalists and modelers alike have a tool for reconciling different environmental conditions. In a broader sense, these data help better understand thermal influences on arrhythmia development or suppression, such as during hibernation or cardiac surgery.
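
    The Q10 scaling used throughout this record is a simple exponential rule: a process speeds up by a factor of Q10 for every 10°C increase. A minimal Python helper is shown below; the 20 cm/s conduction velocity in the example is an invented figure, while the Q10 of 2.3 and the 24°C reference come from the abstract above.

      def q10_scale(rate_t1, t1_c, t2_c, q10):
          """Rate at temperature t2 given the rate at t1 and the Q10 coefficient."""
          return rate_t1 * q10 ** ((t2_c - t1_c) / 10.0)

      # Example: CV of 20 cm/s at 24 degC extrapolated to 37 degC with Q10 = 2.3.
      print(q10_scale(20.0, 24.0, 37.0, 2.3))   # ~59 cm/s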

  13. [Dynamic Pulse Signal Processing and Analyzing in Mobile System].

    Science.gov (United States)

    Chou, Yongxin; Zhang, Aihua; Ou, Jiqing; Qi, Yusheng

    2015-09-01

    In order to derive a dynamic pulse rate variability (DPRV) signal from the dynamic pulse signal in real time, a method for extracting the DPRV signal was proposed and a portable mobile monitoring system was designed. The system consists of a front end for collecting and wirelessly sending the pulse signal, and a mobile terminal. The proposed method is employed to extract DPRV from the dynamic pulse signal in the mobile terminal, and the DPRV signal is analyzed in the time domain, in the frequency domain, and with non-linear methods in real time. The results show that the proposed method can accurately derive the DPRV signal in real time, and that the system can be used for processing and analyzing DPRV signals in real time.

  14. Methods for Analyzing Multivariate Phenotypes in Genetic Association Studies

    Directory of Open Access Journals (Sweden)

    Qiong Yang

    2012-01-01

    Multivariate phenotypes are frequently encountered in genetic association studies. The purposes of analyzing multivariate phenotypes usually include the discovery of novel genetic variants with pleiotropic effects, that is, affecting multiple phenotypes, with the ultimate goal of uncovering the underlying genetic mechanism. In recent years, there have been new method development and application of existing statistical methods to such phenotypes. In this paper, we provide a review of the available methods for analyzing association between a single marker and a multivariate phenotype consisting of the same type of components (e.g., all continuous or all categorical) or different types of components (e.g., some continuous and others categorical). We also review causal inference methods designed to test whether a detected association with the multivariate phenotype is truly pleiotropy or whether the genetic marker exerts its effects on some phenotypes through affecting the others.

  15. Analyzing Resiliency of the Smart Grid Communication Architectures

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2016-08-01

    Smart grids are susceptible to cyber-attack as a result of new communication, control and computation techniques employed in the grid. In this paper, we characterize and analyze the resiliency of smart grid communication architecture, specifically an RF mesh based architecture, under cyber attacks. We analyze the resiliency of the communication architecture by studying the performance of high-level smart grid functions such as metering, and demand response which depend on communication. Disrupting the operation of these functions impacts the operational resiliency of the smart grid. Our analysis shows that it takes an attacker only a small fraction of meters to compromise the communication resiliency of the smart grid. We discuss the implications of our result to critical smart grid functions and to the overall security of the smart grid.

  16. Analyzing the financial crisis using the entropy density function

    Science.gov (United States)

    Oh, Gabjin; Kim, Ho-yong; Ahn, Seok-Won; Kwak, Wooseop

    2015-02-01

    The risk created by nonlinear interactions among subjects in economic systems is assumed to increase during an abnormal state of a financial market. Nevertheless, investigation of systemic risk in financial markets following the global financial crisis has not been sufficient. In this paper, we analyze the entropy density function of the return time series for several financial markets, such as the S&P500, KOSPI, and DAX indices, from October 2002 to December 2011, and analyze the variability in the entropy value over time. We find that the entropy density function of the S&P500 index during the subprime crisis exhibits a significant decrease compared to that in other periods, whereas the other markets, such as those in Germany and Korea, exhibit no significant decrease during the market crisis. These findings demonstrate that the S&P500 index generated a regular pattern in the return time series during the financial crisis.
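
    A minimal stand-in for the entropy analysis described above is the Shannon entropy of the empirical return distribution, sketched here in Python. The histogram bin count is an arbitrary choice, and this is not the paper's exact entropy density estimator.

      import numpy as np

      def return_entropy(returns, bins=50):
          """Shannon entropy (in nats) of the empirical return distribution."""
          counts, _ = np.histogram(returns, bins=bins)
          p = counts[counts > 0] / counts.sum()   # drop empty bins, normalize
          return float(-(p * np.log(p)).sum())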

  17. Analyzing Bullwhip Effect in Supply Networks under Exogenous Uncertainty

    Directory of Open Access Journals (Sweden)

    Mitra Darvish

    2014-05-01

    This paper presents a model for analyzing and measuring the propagation of order amplification (i.e., the bullwhip effect) for a single-product supply network topology, considering exogenous uncertainty and linear, time-invariant inventory management policies for network entities. The stream of orders placed by each entity of the network is characterized assuming customer demand is ergodic. In fact, we propose an exact formula for measuring the bullwhip effect in the addressed supply network topology, treating the system in a Markov chain framework and presenting a matrix of network member relationships and the relevant order sequences. The formula is derived using a mathematical method called frequency domain analysis. The major contribution of this paper is analyzing the bullwhip effect considering exogenous uncertainty in supply networks and using the Fourier transform to simplify the relevant calculations. We present a number of numerical examples to assess the accuracy of the analytical results in quantifying the bullwhip effect.
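
    The paper derives an exact frequency-domain formula; for orientation, the textbook empirical measure of the bullwhip effect, shown below in Python, is simply the ratio of order variance to demand variance.

      import numpy as np

      def bullwhip_ratio(orders, demand):
          """Var(orders) / Var(demand); values above 1 indicate amplification."""
          return float(np.var(orders, ddof=1) / np.var(demand, ddof=1))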

  18. Constructing and Analyzing Uncertain Social Networks from Unstructured Textual Data

    Science.gov (United States)

    Johansson, Fredrik; Svenson, Pontus

    Social network analysis and link diagrams are popular tools among intelligence analysts for analyzing and understanding criminal and terrorist organizations. A bottleneck in the use of such techniques is the manual effort needed to create the network to analyze from available source information. We describe how text mining techniques can be used for extraction of named entities and the relations among them, in order to enable automatic construction of networks from unstructured text. Since the text mining techniques used, viz. algorithms for named entity recognition and relation extraction, are not perfect, we also describe a method for incorporating information about uncertainty when constructing the networks and when doing the social network analysis. The presented approach is applied on text documents describing terrorist activities in Indonesia.

  19. Analyzing land use change using grid-digitized method

    Directory of Open Access Journals (Sweden)

    Orawit Thinnukool

    2014-04-01

    This study aims to analyze land-use change by a digitized-grid method, a simple technique that can be used for such analysis. We describe a procedure for restructuring land-use data comprising polygonal "shape files" containing successive (x, y) boundary points of plots for geographic land-use categories as grid-digitized data, and illustrate this method using data from Thailand. The new data comprise a rectangular grid of geographical coordinates with land-use codes and plot identifiers as fields in database tables indexed by the grid coordinates. Having such a database overcomes difficulties land-use researchers face when querying, analyzing and forecasting land-use change.

  20. Analyzing complex networks evolution through Information Theory quantifiers

    Energy Technology Data Exchange (ETDEWEB)

    Carpi, Laura C., E-mail: Laura.Carpi@studentmail.newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Rosso, Osvaldo A., E-mail: rosso@fisica.ufmg.b [Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Chaos and Biology Group, Instituto de Calculo, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Pabellon II, Ciudad Universitaria, 1428 Ciudad de Buenos Aires (Argentina); Saco, Patricia M., E-mail: Patricia.Saco@newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Hidraulica, Facultad de Ciencias Exactas, Ingenieria y Agrimensura, Universidad Nacional de Rosario, Avenida Pellegrini 250, Rosario (Argentina); Ravetti, Martin Gomez, E-mail: martin.ravetti@dep.ufmg.b [Departamento de Engenharia de Producao, Universidade Federal de Minas Gerais, Av. Antonio Carlos, 6627, Belo Horizonte (31270-901), MG (Brazil)

    2011-01-24

    A methodology for analyzing dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed: the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease, and a climate network for the Tropical Pacific region used to study the El Nino/Southern Oscillation (ENSO) dynamics. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
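
    A sketch of one such quantifier, the square root of the Jensen-Shannon divergence, applied here to degree distributions of the Watts-Strogatz model; using the degree distribution as the probability distribution is this sketch's assumption, and the MPR Statistical Complexity is omitted.

        import numpy as np
        import networkx as nx

        def degree_pmf(G, kmax):
            """Degree distribution of G, padded to a common support 0..kmax."""
            p = np.bincount([d for _, d in G.degree()], minlength=kmax + 1)
            return p.astype(float) / p.sum()

        def js_distance(p, q, eps=1e-12):
            """Square root of the Jensen-Shannon divergence (a true metric)."""
            p, q = p + eps, q + eps
            m = 0.5 * (p + q)
            kl = lambda a, b: float(np.sum(a * np.log2(a / b)))
            return np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))

        # Follow the model away from the regular ring lattice (p = 0): the
        # dissimilarity to the initial state grows with rewiring probability.
        ring = nx.watts_strogatz_graph(1000, 6, 0.0, seed=1)
        for p_rew in (0.01, 0.1, 1.0):
            G = nx.watts_strogatz_graph(1000, 6, p_rew, seed=1)
            kmax = max(d for _, d in list(ring.degree()) + list(G.degree()))
            print(p_rew, js_distance(degree_pmf(ring, kmax), degree_pmf(G, kmax)))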

  1. BWR plant analyzer development at BNL (Brookhaven National Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Wulff, W.; Cheng, H.S.; Mallen, A.N.

    1986-01-01

    An engineering plant analyzer has been developed at BNL for realistically and accurately simulating transients and severe abnormal events in BWR power plants. Simulations are being carried out routinely with high fidelity, high simulation speed, at low cost and with unsurpassed user convenience. The BNL Plant Analyzer is the only operating facility which (a) simulates more than two orders-of-magnitude faster than the CDC-7600 mainframe computer, (b) is accessible and fully operational in on-line interactive mode, remotely from anywhere in the US, from Europe or the Far East (Korea), via widely available IBM-PC compatible personal computers, standard modems and telephone lines, (c) simulates both slow and rapid transients seven times faster than real-time in direct access, and four times faster in remote access modes, (d) achieves high simulation speed without compromising fidelity, and (e) is available to remote access users at the low cost of $160 per hour.

  2. Analyzing Influenza Virus Sequences using Binary Encoding Approach

    Directory of Open Access Journals (Sweden)

    Ham Ching Lam

    2012-01-01

    Full Text Available Capturing the mutation patterns of each individual influenza virus sequence is often challenging; in this paper, we demonstrated that using a binary encoding scheme coupled with a dimension reduction technique, we were able to capture the intrinsic mutation pattern of the virus. Our approach looks at the variance between sequences instead of the commonly used p-distance or Hamming distance. We first convert the influenza genetic sequences to binary strings to form a binary sequence alignment matrix, and then apply Principal Component Analysis (PCA) to this matrix. PCA also provides the power to identify reassortant viruses through data projection. Due to the sparsity of the binary strings, we were able to analyze large volumes of influenza sequence data in a very short time. For protein sequences, our scheme also allows the incorporation of the biophysical properties of each amino acid. Here, we present various encouraging results from analyzing influenza nucleotide, protein and genome sequences using the proposed approach.
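
    A toy Python sketch of the encoding-plus-PCA pipeline on made-up aligned nucleotide sequences (real inputs would be full influenza segments):

        import numpy as np

        seqs = ["ATGGCA", "ATGGTA", "ACGGTA", "ATGGCA"]
        alphabet = "ACGT"

        # Binary (one-hot) encoding: each alignment column expands into four
        # indicator columns, one per nucleotide.
        X = np.array([[1.0 if base == a else 0.0
                       for base in s for a in alphabet] for s in seqs])

        # PCA via SVD on the centered binary matrix: leading components capture
        # the dominant (co-)mutation patterns across sequences.
        Xc = X - X.mean(axis=0)
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        scores = U * S  # projection of each sequence onto the components

        print(scores[:, :2])  # 2-D view; outliers can flag e.g. reassortants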

  3. Efficiency of biparental crossing in sugarcane analyzed by SSR markers

    Directory of Open Access Journals (Sweden)

    João Messias dos Santos

    2014-07-01

    Full Text Available Sugarcane has hermaphrodite flowers; however, both selfing and cross-pollination may occur, resulting in selfed or hybrid progeny. The aim of this study was to analyze the paternity of progenies from biparental crosses, in order to identify true hybrids or progenies originating from pollen of unknown origin. Seventy-six progenies from four crosses were analyzed using three highly polymorphic microsatellite (SSR) markers. Progenies showed moderate genetic similarity and were grouped into four distinct groups, according to the crosses. Transmission of alleles from parents to offspring was clearly observed: no selfed individuals were found, only true hybrids or progeny resulting from fertilization with pollen foreign to both parents. The results showed that there was contamination with pollen from unknown parents in the sugarcane crosses, suggesting that errors in the pedigree may occur and that adjustment of the crossing procedure would reduce the number of progenies derived from pollen of unknown origin.

  4. A Novel Logic for Analyzing Electronic Payment Protocols

    Directory of Open Access Journals (Sweden)

    Liu Yi

    2016-01-01

    Full Text Available A novel formal method that can be used to analyze security properties such as accountability, fairness and timeliness in electronic payment protocols is proposed. The method extends the Qing-Zhou approach, which is based on logical reasoning, by adding a simple time expression and analysis method. It increases the ability to describe event timing and extends the time characteristics of the logical inference rules. An anonymous electronic cash payment protocol is analyzed with the novel logic, and the result shows that the fairness of the protocol is not satisfied, due to a timeliness problem in the protocol. The logic proposed in this paper has theoretical and practical significance for the design and formal analysis of electronic payment protocols, and its underlying idea offers guidance for improving the security of other security protocols.

  5. Second-harmonic patterned polarization-analyzed reflection confocal microscope

    Science.gov (United States)

    Okoro, Chukwuemeka; Toussaint, Kimani C.

    2017-08-01

    We introduce the second-harmonic patterned polarization-analyzed reflection confocal (SPPARC) microscope, a multimodal imaging platform that integrates Mueller matrix polarimetry with reflection confocal and second-harmonic generation (SHG) microscopy. SPPARC microscopy provides label-free three-dimensional (3-D), SHG-patterned confocal images that lend themselves to spatially dependent, linear polarimetric analysis for the extraction of rich polarization information based on the Mueller calculus. To demonstrate its capabilities, we use SPPARC microscopy to analyze both porcine tendon and ligament samples and find differences in both the circular degree-of-polarization and depolarization parameters. Moreover, using the collagen-generated SHG signal as an endogenous counterstain, we show that the technique can provide 3-D polarimetric information on the surrounding extrafibrillar matrix plus cells (EFMC) region. The unique characteristics of SPPARC microscopy hold strong potential for more accurate and quantitative description of microstructural changes in collagen-rich samples in three spatial dimensions.
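
    For orientation, the Mueller calculus underlying the polarimetric analysis maps Stokes vectors linearly, S_out = M S_in; the sketch below computes the total and circular degree of polarization from a Stokes vector. The identity matrix is a placeholder, since SPPARC recovers M pointwise from the images.

        import numpy as np

        def dop_parameters(S):
            """Total and circular degree of polarization of a Stokes vector."""
            s0, s1, s2, s3 = S
            total = np.sqrt(s1**2 + s2**2 + s3**2) / s0
            return total, abs(s3) / s0

        M = np.eye(4)  # placeholder Mueller matrix
        S_in = np.array([1.0, 1.0, 0.0, 0.0])  # horizontally polarized probe
        S_out = M @ S_in
        print(dop_parameters(S_out))  # (total DOP, circular DOP)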

  6. Ecoupling server: A tool to compute and analyze electronic couplings.

    Science.gov (United States)

    Cabeza de Vaca, Israel; Acebes, Sandra; Guallar, Victor

    2016-07-05

    Electron transfer processes are often studied through the evaluation and analysis of the electronic coupling (EC). Since most standard QM codes do not readily provide such a measure, additional, user-friendly tools to compute and analyze electronic couplings from external wave functions are of high value. This communication presents the first server to provide a friendly interface for the evaluation and analysis of electronic couplings under two different approximations (FDC and GMH). The Ecoupling server accepts inputs from common QM and QM/MM software and provides useful plots to understand and analyze the results easily. The web server has been implemented in CGI-python using Apache and is accessible at http://ecouplingserver.bsc.es. The Ecoupling server is free and open to all users without login. © 2016 Wiley Periodicals, Inc.

  7. A sequential method to analyze the kinetics of biomass pyrolysis.

    Science.gov (United States)

    Huang, Y F; Kuan, W H; Chiueh, P T; Lo, S L

    2011-10-01

    The kinetics of biomass pyrolysis was studied via a sequential method comprising two stages. Stage one analyzes the kinetics of biomass pyrolysis, starting with the determination of the unreacted fraction of the sample at the maximum reaction rate, (1-α)_m. Stage two provides a way to simulate the reaction rate profile and to verify the appropriateness of the kinetic parameters calculated in the previous stage. Filter paper, xylan, and alkali lignin were used as representatives of cellulose, hemicellulose, and lignin, whose pyrolysis was analyzed assuming reaction orders of 1, 2, and 3, respectively. For most of the biomass pyrolysis cases, kinetic parameters were properly determined and reaction rate profiles were adequately simulated by taking the order of reaction as 1. This new method should be applicable to most biomass pyrolysis and similar reactions for which (1-α)_m is acquirable, representative, and reliable. Copyright © 2011 Elsevier Ltd. All rights reserved.
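
    A numerical sketch of the underlying idea with assumed kinetic parameters (illustrative values, not those fitted in the paper): integrate a non-isothermal nth-order model, locate the rate maximum, and read off (1-α)_m, which for a first-order reaction lands near exp(-1) ≈ 0.37.

        import numpy as np

        # dα/dT = (A/β) · exp(-E/RT) · (1-α)^n, linear heating at rate β.
        A, E, R = 1e13, 180e3, 8.314      # 1/s, J/mol, J/(mol·K) — assumed
        beta, n_ord = 10.0 / 60.0, 1      # 10 K/min in K/s; reaction order

        T = np.linspace(400.0, 900.0, 50000)
        dT = T[1] - T[0]
        alpha = np.zeros_like(T)
        for i in range(1, T.size):        # simple explicit Euler integration
            r = (A / beta) * np.exp(-E / (R * T[i - 1])) \
                * (1.0 - alpha[i - 1]) ** n_ord
            alpha[i] = min(alpha[i - 1] + r * dT, 1.0)

        rate = np.gradient(alpha, dT)
        m = rate.argmax()
        print("T_max = %.1f K, (1-alpha)_m = %.3f" % (T[m], 1.0 - alpha[m]))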

  8. ANALYZER OF QUANTITY AND QUALITY OF THE ELECTRIC POWER

    Directory of Open Access Journals (Sweden)

    A. I. Semilyak

    2013-01-01

    Full Text Available One of the activities of the research center for “Energy Saving Technologies and Smart Metering in Electrical Power Engineering” is research work on the use of electronic devices and systems for intelligent power distribution, produced by Analog Devices and equipped with accurate energy consumption measurement features. The article focuses on the development of an analyzer of the quantity and quality of electric energy. The main part of the analyzer is a metering IC from Analog Devices, the ADE7878, designed for use in commercial and industrial smart electricity meters. Such meters measure the amount of consumed or produced electric energy with high accuracy and support remote meter reading.

  9. Vector network analyzer (VNA) measurements and uncertainty assessment

    CERN Document Server

    Shoaib, Nosherwan

    2017-01-01

    This book describes vector network analyzer measurements and uncertainty assessments, particularly in waveguide test-set environments, in order to establish their compatibility to the International System of Units (SI) for accurate and reliable characterization of communication networks. It proposes a fully analytical approach to measurement uncertainty evaluation, while also highlighting the interaction and the linear propagation of different uncertainty sources to compute the final uncertainties associated with the measurements. The book subsequently discusses the dimensional characterization of waveguide standards and the quality of the vector network analyzer (VNA) calibration techniques. The book concludes with an in-depth description of the novel verification artefacts used to assess the performance of the VNAs. It offers a comprehensive reference guide for beginners to experts, in both academia and industry, whose work involves the field of network analysis, instrumentation and measurements.

  10. Analyzing the BBOB results by means of benchmarking concepts.

    Science.gov (United States)

    Mersmann, O; Preuss, M; Trautmann, H; Bischl, B; Weihs, C

    2015-01-01

    We present methods to answer two basic questions that arise when benchmarking optimization algorithms. The first is: which algorithm is the "best" one? The second is: which algorithm should I use for my real-world problem? The two are connected, and neither is easy to answer. We present a theoretical framework for designing and analyzing the raw data of such benchmark experiments, a first step toward answering both questions. The 2009 and 2010 BBOB benchmark results are analyzed by means of this framework, and we derive insights regarding the answers to the two questions. Furthermore, we discuss how to properly aggregate rankings from algorithm evaluations on individual problems into a consensus, its theoretical background, and which common pitfalls should be avoided. Finally, we address the grouping of test problems into sets with similar optimizer rankings and investigate whether these are reflected by already proposed test problem characteristics, finding that this is not always the case.
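
    One simple consensus method of the kind discussed, a Borda count over per-problem rankings, sketched with made-up ranks for four optimizers on three problems:

        import numpy as np

        # Hypothetical per-problem rankings (rank 1 = best); the real study
        # aggregates many more problems, runs and budgets.
        ranks = np.array([
            [1, 2, 3, 4],  # problem 1
            [2, 1, 4, 3],  # problem 2
            [1, 3, 2, 4],  # problem 3
        ])
        algos = ["CMA-ES", "BFGS", "DE", "NM"]

        # Borda count: sum of ranks, lower is better. A pitfall the paper
        # warns about: the consensus depends on which problems are included.
        for a, b in sorted(zip(algos, ranks.sum(axis=0)), key=lambda t: t[1]):
            print(a, b)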

  11. Analyzing and Building Nucleic Acid Structures with 3DNA

    OpenAIRE

    Colasanti, Andrew V.; Lu, Xiang-Jun; Olson, Wilma K.

    2013-01-01

    The 3DNA software package is a popular and versatile bioinformatics tool with capabilities to analyze, construct, and visualize three-dimensional nucleic acid structures. This article presents detailed protocols for a subset of new and popular features available in 3DNA, applicable to both individual structures and ensembles of related structures. Protocol 1 lists the set of instructions needed to download and install the software. This is followed, in Protocol 2, by the analysis of a nucleic...

  12. Analyzing the Change-Proneness of APIs and web APIs

    OpenAIRE

    Romano, D.

    2015-01-01

    APIs and web APIs are used to expose existing business logic and, hence, to ease the reuse of functionality across multiple software systems. Software systems can use the business logic of legacy systems by binding to their APIs and web APIs. With the emergence of a new programming paradigm called service orientation, APIs are exposed as web APIs that hide the technologies used to implement legacy systems. As a consequence, web APIs establish contr...

  13. User Behavior Analysis from Web Log using Log Analyzer Tool

    OpenAIRE

    Brijesh Bakariya; Ghanshyam Singh Thakur

    2013-01-01

    Nowadays, the internet plays the role of a huge database in which many websites, information sources and search engines are available. However, due to the unstructured and semi-structured data in web pages, extracting relevant information has become a challenging task. The main reason is that traditional knowledge-based techniques cannot efficiently utilize this knowledge, because it consists of many discovered patterns and contains a great deal of noise and uncertainty. In this paper, analyzing of web usage minin...

  14. Features and applications of the integrated composites analyzer (ICAN) code

    Science.gov (United States)

    Ginty, Carol A.

    1988-01-01

    The thermal behavior of composite spacecraft antenna reflectors was investigated with the Integrated Composites Analyzer (ICAN) computer code. Parametric studies were conducted on the face sheets and honeycomb core which constitute the sandwich-type structures. Selected thermal and mechanical properties of the composite faces and sandwich structures are presented graphically as functions of varying fiber volume ratio, laminate configuration, fabrication factors, and environmental conditions.
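
    For intuition about such parametric studies, the classic micromechanics estimates below (not ICAN's full model) show how ply stiffness shifts with fiber volume ratio; the graphite/epoxy-like moduli are assumed values:

        def longitudinal_modulus(Vf, Ef, Em):
            """Voigt (rule-of-mixtures) estimate for a unidirectional ply."""
            return Vf * Ef + (1.0 - Vf) * Em

        def transverse_modulus(Vf, Ef, Em):
            """Reuss (inverse rule-of-mixtures) estimate."""
            return 1.0 / (Vf / Ef + (1.0 - Vf) / Em)

        Ef, Em = 230.0, 3.5  # fiber and matrix moduli in GPa, assumed
        for Vf in (0.4, 0.5, 0.6):
            print(Vf, longitudinal_modulus(Vf, Ef, Em),
                  transverse_modulus(Vf, Ef, Em))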

  15. EFFECTIVE MYANMAR KEY LAYOUT DESIGN ANALYZING FOR ANDROID SOFT KEYBOARD

    OpenAIRE

    NANDAR PWINT OO; NI LAR THEIN

    2012-01-01

    In mobile phone soft keyboard layouts, some Myanmar characters sit behind the main keyboard layout and must be accessed by switching with a control key. For mobile phone text entry systems, optimizing the fit between technology and the user is critical for realizing the potential benefits of assistive technology. It is a necessary task to find an effective key layout design that can enhance text entry speed. Moreover, existing work is also weak in analyzing key layout design for Myanmar Language...

  16. A fully integrated standalone portable cavity ringdown breath acetone analyzer.

    Science.gov (United States)

    Sun, Meixiu; Jiang, Chenyu; Gong, Zhiyong; Zhao, Xiaomeng; Chen, Zhuying; Wang, Zhennan; Kang, Meiling; Li, Yingxin; Wang, Chuji

    2015-09-01

    Breath analysis is a promising new technique for nonintrusive disease diagnosis and metabolic status monitoring. One challenging issue in using a breath biomarker for potential disease screening is to find a quantitative relationship between the concentration of the breath biomarker and the clinical diagnostic parameters of the specific disease. To address this issue, we need a new instrument capable of conducting real-time, online breath analysis with high data throughput, so that a large-scale clinical test (more subjects) can be achieved in a short period of time. In this work, we report a fully integrated, standalone, portable analyzer based on the cavity ringdown spectroscopy technique for near-real-time, online breath acetone measurements. The performance of the portable analyzer in measurements of breath acetone was interrogated and validated against certified gas chromatography-mass spectrometry. The results show that this new analyzer is useful for reliable online (introduction of a breath sample without pre-treatment) breath acetone analysis with high sensitivity (57 ppb) and high data throughput (one data point per second). Subsequently, the validated breath analyzer was employed for acetone measurements in 119 human subjects under various conditions. The instrument design, packaging, specifications, and future improvements are also described. In moving from an optical ringdown cavity operated by lab-built electronics, reported previously, to this fully integrated standalone instrument, we have enabled a new scientific tool suited for large-scale breath acetone analysis and created an instrument platform that can be adapted to study other breath biomarkers by using different lasers and ringdown mirrors covering the corresponding spectral fingerprints.
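
    The standard cavity ringdown relation behind such measurements converts the decay times with and without the absorber into a number density; the sketch below uses illustrative values only (the decay times and the acetone cross-section are assumptions, not the instrument's calibration):

        C_CM_S = 2.99792458e10  # speed of light, cm/s

        def number_density(tau, tau0, sigma):
            """Absorber density (cm^-3) from sample/empty ringdown times (s),
            via alpha = (1/c)(1/tau - 1/tau0), and cross-section sigma (cm^2)."""
            alpha = (1.0 / C_CM_S) * (1.0 / tau - 1.0 / tau0)
            return alpha / sigma

        # Assumed numbers: 10.0 us empty-cavity decay, 9.5 us with breath,
        # and an acetone absorption cross-section of 4.5e-20 cm^2.
        n = number_density(9.5e-6, 10.0e-6, 4.5e-20)
        print(n, "cm^-3 ~", n / 2.46e19 * 1e9, "ppb at 1 atm, 25 C")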

  17. A new role for businesses? Analyzing corporate social responsibility

    OpenAIRE

    Sucuoğlu, Gizem

    2002-01-01

    This study analyzes the concept of corporate social responsibility related to Transnational Corporations (TNCs); why businesses are undertaking new responsibilities related to the social realm. As the power and visibility of TNCs have increased and their influence on society has grown, public expectations concerning their operations also rose. The constructivist theory of International Relations is used in order to approach the issues i...

  18. R Package clickstream: Analyzing Clickstream Data with Markov Chains

    Directory of Open Access Journals (Sweden)

    Michael Scholz

    2016-10-01

    Full Text Available Clickstream analysis is a useful tool for investigating consumer behavior, market research and software testing. I present the clickstream package, which provides functionality for reading, clustering, analyzing and writing clickstreams in R. The package allows for modeling lists of clickstreams as zero-, first- and higher-order Markov chains. I illustrate the application of clickstream for a list of representative clickstreams from an online store.
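
    The package itself is in R; as a language-neutral illustration of the first-order case it supports, this Python sketch fits transition probabilities from toy clickstreams:

        from collections import defaultdict

        clickstreams = [
            ["home", "product", "cart", "buy"],
            ["home", "product", "product", "home"],
            ["home", "cart", "buy"],
        ]

        # First-order Markov chain: count transitions, then row-normalize.
        counts = defaultdict(lambda: defaultdict(int))
        for s in clickstreams:
            for a, b in zip(s, s[1:]):
                counts[a][b] += 1

        P = {a: {b: c / sum(nxt.values()) for b, c in nxt.items()}
             for a, nxt in counts.items()}
        print(P["home"])  # e.g. {'product': 0.667, 'cart': 0.333}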

  19. Resegmenting assimilation : analyzing second generation education from a binational perspective

    OpenAIRE

    Silva, Travis Scott

    2010-01-01

    This thesis analyzes the educational aspirations and outcomes among people associated with Tlacuitapa, a small Mexican town with a long history of immigration to the United States. Second generation Tlacuitapenses raised and educated in the United States are compared to similarly aged co-ethnics who grew up in the origin community. Results indicate that Tlacuitapenses set high educational goals for themselves regardless of where they live, though aspirations are slightly higher in the United ...

  20. Application of Finite Element Method to Analyze Inflatable Waveguide Structures

    Science.gov (United States)

    Deshpande, M. D.

    1998-01-01

    A Finite Element Method (FEM) is presented to determine the propagation characteristics of a deformed inflatable rectangular waveguide. Various deformations that might be present in an inflatable waveguide are analyzed using the FEM. The FEM procedure and the code developed here are general enough to be used for other deformations not considered in this report. The code is validated by applying it to a rectangular waveguide without any deformations and comparing the numerical results with earlier published results.
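
    The undeformed validation case has a closed-form benchmark: the TEmn cutoff frequency of an ideal rectangular waveguide, f_c = (c/2) * sqrt((m/a)^2 + (n/b)^2). The sketch below evaluates it for a standard WR-90 guide; the dimensions are an assumed example, and the report's actual test geometry may differ.

        import math

        def cutoff_freq_hz(a, b, m, n):
            """TEmn cutoff of an ideal a x b rectangular waveguide (meters)."""
            c = 299792458.0
            return (c / 2.0) * math.sqrt((m / a) ** 2 + (n / b) ** 2)

        # WR-90 (X-band): a = 22.86 mm, b = 10.16 mm; dominant TE10 mode.
        print(cutoff_freq_hz(0.02286, 0.01016, 1, 0) / 1e9, "GHz")  # ~6.557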