WorldWideScience

Sample records for workpiece temperature analyzer

  1. Development and evaluation of a workpiece temperature analyzer for industrial furnaces

    Energy Technology Data Exchange (ETDEWEB)

    1990-05-01

    An instrument capable of measuring the bulk temperature of a workpiece while it is being heated could have a variety of applications. If such an instrument were reasonably priced, it would have a tremendous impact upon national energy usage. The Department of Energy has recognized the importance of this type of instrument and has sponsored three concurrent programs to evaluate three different technologies for it. In one of these programs, Surface Combustion is the prime contractor to develop a pulsed-laser, polarizing-interferometer-based sensor to be used as a workpiece temperature analyzer (WPTA). The overall goal of the program is to develop a workpiece temperature analyzer for industrial furnaces to significantly improve product quality, productivity and energy efficiency. The workpiece temperature analyzer concept in this program uses a pulsed laser polarizing interferometer (PLPI) to measure sound velocity through a workpiece. This type of instrument has a high resolution and can detect surface motions as small as 10 picometers. The sound velocity measurement can be converted to an average workpiece temperature through a mathematical equation programmed into the control microprocessor. 76 refs., 12 figs., 14 tabs.
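    The velocity-to-temperature conversion described above can be sketched as a simple inversion of a velocity-temperature law. The linear law and its calibration constants below are hypothetical illustrations, not the program's actual equation.

```python
def average_temperature(thickness_m, transit_time_s, a=5900.0, b=-0.6):
    """Convert an ultrasonic pulse-echo time-of-flight measurement to an
    average workpiece temperature, assuming a linear velocity-temperature
    law v(T) = a + b*T. Here a (m/s at 0 degC) and b (m/s per degC, b < 0
    since sound slows on heating) are hypothetical calibration constants
    roughly in the range of steel."""
    v = 2.0 * thickness_m / transit_time_s  # pulse-echo path = 2 * thickness
    return (v - a) / b

# Hypothetical example: 50 mm plate, 17.5 microsecond round-trip time
t = average_temperature(0.050, 17.5e-6)
```

    In a real instrument the velocity-temperature relation would come from calibration data for the specific alloy rather than a fixed linear fit.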

  2. Development and evaluation of a workpiece temperature analyzer (WPTA) for industrial furnaces (Phase 1)

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-01

    This project is directed toward the research, development, and evaluation of a viable commercial product: a workpiece temperature analyzer (WPTA) for fired furnaces based on unique radiation properties of surfaces. This WPTA will provide for more uniform, higher quality products and reduce product rejects, as well as permit the optimum use of energy. The WPTA may also be utilized in control system applications including metal heat treating, forging furnaces, and ceramic firing furnaces. A large market also exists in the chemical process and refining industry. WPTA applications include the verification of product temperature/time cycles and use as a front-end sensor for automatic feedback control systems. This report summarizes the work performed in Phase 1 of this three-phase project. The work in Phase 1 included the application evaluation; the evaluation of present technologies and their limitations; and the development of a preliminary conceptual WPTA design, including identification of technical and economic benefits. Recommendations based on the findings of this report include near-term enhancement of the capabilities of the Pyrolaser and long-term development of an instrument based on Raman spectroscopy. Development of the Pyrofiber, a fiber-optic version of the Pyrolaser, will be key to solving present problems involving specularity, measurement angle, and the cost of multipoint measurement. Extending the instrument's measurement range to temperatures below 600 °C will make the product useful for a wider range of applications. The development of Raman spectroscopy would result in an instrument that could easily be adapted to incorporate a wealth of additional nondestructive analytical capabilities, including stress/strain indication, crystallography, species concentrations, corrosion studies, and catalysis studies, in addition to temperature measurement. 9 refs., 20 figs., 16 tabs.

  3. Workpiece Temperature Variations During Flat Peripheral Grinding

    Science.gov (United States)

    Smirnov, Vitalii A.; Repko, Aleksandr V.

    2018-06-01

    The paper presents the results of research on temperature variations during flat peripheral grinding. It is shown that the temperature variations of the workpiece can reach 25...30% of the average values, which can lead to thermal defects. A nonlinear two-dimensional thermophysical grinding model is suggested. It takes into account local changes in the cutting conditions: the fluctuation of the cut layer and the cutting force, the thermal impact of the cutting grains, and the presence of surface cavities in the intermittent wheel. For the numerical solution of the problem, the finite-difference method is adapted. The stability and convergence of the method are investigated, taking into account the specific nature of the problem. A high accuracy of the approximation of the boundary conditions and the nonlinear heat equation is provided. An experimental verification of the proposed thermophysical model was carried out using a setup for simultaneous measurement of the grinding force and temperature. It is shown that the discrepancy between the theoretical and experimental values of the grinding temperature does not exceed 5%. The proposed thermophysical model makes it possible to predict with high accuracy the temperature variations during grinding by the wheel periphery.
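    A minimal explicit finite-difference sketch of this kind of moving-heat-source grinding model is shown below. The material data, heat-flux value and grid are illustrative placeholders, not the paper's (nonlinear) model; the sketch only demonstrates the explicit scheme and its stability constraint.

```python
import numpy as np

def grind_temperature_field(nx=60, ny=20, dx=1e-4, dt=None, steps=200,
                            alpha=1.2e-5, q=5e7, k=40.0, v=0.05):
    """Explicit finite-difference solution of the 2D heat equation with a
    moving band of surface heat flux, as a rough stand-in for flat
    peripheral grinding. alpha: thermal diffusivity (m^2/s), q: surface
    heat flux (W/m^2), k: conductivity (W/m/K), v: wheel feed speed (m/s).
    All values are illustrative."""
    if dt is None:
        dt = 0.2 * dx * dx / alpha  # keep r = alpha*dt/dx^2 <= 1/4 for stability
    T = np.zeros((ny, nx))          # temperature rise above ambient, K
    r = alpha * dt / dx**2
    for n in range(steps):
        Tn = T.copy()
        # 5-point Laplacian update of the interior nodes
        T[1:-1, 1:-1] = Tn[1:-1, 1:-1] + r * (
            Tn[2:, 1:-1] + Tn[:-2, 1:-1] + Tn[1:-1, 2:] + Tn[1:-1, :-2]
            - 4.0 * Tn[1:-1, 1:-1])
        # moving heated band on the top surface (flux boundary condition)
        i = int(v * n * dt / dx) % nx
        T[0, i:i + 5] = T[1, i:i + 5] + q * dx / k
    return T
```

    The paper's model additionally handles nonlinear (temperature-dependent) properties and per-grain heat input, which a sketch like this omits.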

  4. Study of the influence of ultrasonic honing parameters on workpiece surface temperature

    Directory of Open Access Journals (Sweden)

    Zhang Xiaoqiang

    2016-01-01

    Full Text Available Ultrasonic vibration honing (UVH), a machining technology, has many advantages. A lower grinding temperature is a significant characteristic and benefits both the process and the workpiece surface. However, the high temperature caused by large honing pressure is the main factor producing workpiece heat damage in the grinding zone, and its magnitude varies across honing parameter combinations. Based on classical thermodynamic theory, we established the heat transfer equation for the grinding zone, simplified the model, obtained the two-dimensional temperature field expression for the workpiece, and then simulated the temperature trend under a variety of conditions. It is shown that the main temperature lies in the range of 700 K to 1200 K, and that the variation with each parameter is large. The study provides a theoretical basis for seeking reasonable machining parameters and obtaining better workpiece quality.

  5. Parameter study of temperature distribution in a work-piece during dry hyperbaric GTA-welding

    International Nuclear Information System (INIS)

    Fulfs, H.

    1989-01-01

    In a sensitivity study, the influence of initial and boundary welding parameters upon the spatial and temporal temperature distribution in a work-piece during dry hyperbaric GTA-welding is investigated. It is shown that at constant arc current a variation of pressure (1-60 bar), arc length (3-10 mm), welding speed (1-2.5 mm/s) or the initial temperature (20-200 °C) of the work-piece significantly influences the size of the melt and heat-affected zones as well as the maximum temperature and cooling behaviour of the work-piece; in comparison, no appreciable effect of shielding gas temperature (20-300 °C) or flow rate (10-500 standard dm³/min) on the thermal condition of the work-piece can be recognized. The discovered relationships have been approximated by simple correlations, which can be used for parameter optimization and process control. (orig.) With 33 figs., 4 tabs [de]

  6. Effects of high power ultrasonic vibration on temperature distribution of workpiece in dry creep feed up grinding.

    Science.gov (United States)

    Paknejad, Masih; Abdullah, Amir; Azarhoushang, Bahman

    2017-11-01

    The temperature history and distribution of a steel workpiece (X20Cr13) were measured by a high-tech infrared camera during ultrasonic-assisted dry creep feed up grinding. For this purpose, a special experimental setup was designed and fabricated to vibrate only the workpiece along two directions by a high power ultrasonic transducer. In this study, ultrasonic effects with respect to the grinding parameters, including depth of cut (a_e), feed speed (v_w), and cutting speed (v_s), have been investigated. The results indicate that the ultrasonic vibration considerably reduces the temperature, the depth of thermal damage of the workpiece, and the width of the temperature contours. A maximum temperature reduction of 25.91% was reported at v_s = 15 m/s, v_w = 500 mm/min, a_e = 0.4 mm in the presence of ultrasonic vibration. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Apparatus for guiding workpieces

    International Nuclear Information System (INIS)

    Misty, G.C.

    1984-01-01

    Workpieces are guided to the tool by a resiliently mounted support guide which accommodates irregularities in the profiles of the workpieces to maintain the axes of the workpieces in alignment with the centre line of the tool. (author)

  8. Temperature measurement of flat glass edge during grinding and effect of wheel and workpiece speeds

    International Nuclear Information System (INIS)

    Moussa, Tala; Garnier, Bertrand; Peerhossaini, Hassan

    2017-01-01

    The flat glass temperature in the vicinity of the grinding wheel can become very high during grinding and reach that of the glass transition (typically around 550–600 °C). In such cases, the mechanical strength of the glass is greatly affected and the grinding process cannot be carried out properly. Hence, thermal phenomena must be managed by adjusting the machining parameters to avoid overheating. For this purpose, it is very important to be able to measure the glass temperature, especially at the grinding interface. However, measuring the interfacial glass temperature is difficult, and none of the existing methods for metal grinding is adequate for glass grinding. This work presents a novel temperature measurement method that uses constantan and copper strips on both sides of the glass plates, with thermoelectric contact provided by the metallic binder of the diamond particles in the grinding wheel. This new technique allows the measurement of the glass edge temperature during the wheel displacement around the glass plate. The experimental results show an average glass edge temperature between 300 and 600 °C depending on the values of the machining parameters such as work speed, wheel speed, depth of cut and water coolant flow rate. As this new thermal instrumentation is rather intrusive, glass temperature biases were analysed using a 3D heat transfer model with a moving source. Model computations performed using finite elements show that the temperature biases are less than 70 °C, which is smaller than the standard deviation of the glass edge temperatures measured during grinding. (paper)

  9. Influence of the cutting parameters on the workpiece temperature during face milling

    Directory of Open Access Journals (Sweden)

    Nowakowski Lukasz

    2017-01-01

    Full Text Available This paper presents the outcome of experimental research on the impact of changes in cutting speed and in the volume of material removed during face milling on the temperature of the machined object, made of M1Ez4-grade copper. The temperature of the machined object was measured at six points with K-type thermocouples. The theoretical amount of heat released per unit of time for particular machining parameters was also calculated.
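    The heat released per unit of time mentioned above is commonly estimated from the cutting power. A minimal sketch, using the standard first-order assumption P = F_c · v_c (nearly all mechanical cutting work converts to heat); this is a generic estimate, not this paper's specific calculation:

```python
def cutting_power(cutting_force_n, cutting_speed_m_per_min):
    """Theoretical heat released per unit time during milling, estimated as
    the cutting power P = F_c * v_c, with the cutting speed converted from
    m/min to m/s. Returns watts."""
    return cutting_force_n * cutting_speed_m_per_min / 60.0

# Hypothetical example: 300 N cutting force at a cutting speed of 120 m/min
P = cutting_power(300.0, 120.0)  # -> 600.0 W
```

    Only a fraction of this power actually enters the workpiece; partition ratios are usually taken from the literature for the tool/workpiece pair.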

  10. Evaluation of Workpiece Temperature during Drilling of GLARE Fiber Metal Laminates Using Infrared Techniques: Effect of Cutting Parameters, Fiber Orientation and Spray Mist Application

    Science.gov (United States)

    Giasin, Khaled; Ayvar-Soberanis, Sabino

    2016-01-01

    The rise in cutting temperatures during the machining process can influence the final quality of the machined part. The impact of cutting temperatures is more critical when machining composite-metal stacks and fiber metal laminates due to the stacking nature of those hybrids, which subjects the composite to heat from direct contact with the metallic part of the stack and from the evacuated hot chips. In this paper, the workpiece surface temperature of two grades of fiber metal laminates commercially known as GLARE is investigated. An experimental study was carried out using thermocouples and infrared thermography to determine the emissivity of the upper, lower and side surfaces of GLARE laminates. In addition, infrared thermography was used to determine the maximum temperature of the bottom surface of machined holes during drilling GLARE under dry and minimum quantity lubrication (MQL) cooling conditions with different cutting parameters. The results showed that during the machining process the workpiece surface temperature increased with increasing feed rate, and that fiber orientation influenced the temperature developed in the laminate. PMID:28773757
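    The emissivity calibration step mentioned above matters because an IR camera reads a blackbody-equivalent temperature. A simplified total-radiance (Stefan-Boltzmann) correction can be sketched as follows; real cameras work band-limited via Planck's law, and the numbers here are illustrative, not the study's values.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def true_surface_temperature(T_apparent_K, emissivity, T_ambient_K=293.15):
    """Correct an apparent (blackbody-equivalent) IR reading for surface
    emissivity, assuming total-radiance detection and that the reflected
    background radiates at ambient temperature. A simplified sketch of what
    camera software does internally over its spectral band."""
    measured = SIGMA * T_apparent_K**4                       # detected radiance
    reflected = (1.0 - emissivity) * SIGMA * T_ambient_K**4  # background share
    return ((measured - reflected) / (emissivity * SIGMA)) ** 0.25

# A low-emissivity surface reads well below its true temperature:
T_true = true_surface_temperature(400.0, emissivity=0.35)
```

    This is why the study calibrates emissivity per surface before trusting the thermography readings.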

  11. Analyzing the effect of tool edge radius on cutting temperature in micro-milling process

    Science.gov (United States)

    Liang, Y. C.; Yang, K.; Zheng, K. N.; Bai, Q. S.; Chen, W. Q.; Sun, G. Y.

    2010-10-01

    Cutting heat is one of the important physical phenomena in the cutting process. The heat, together with the cutting temperature it produces, directly affects tool wear and tool life as well as workpiece machining precision and surface quality. The feature size of the workpiece is usually several microns, so tiny changes in cutting temperature will affect the surface quality and accuracy of the workpiece. Therefore, the heat and temperature generated in micro-milling have a significantly different effect than in conventional cutting. In this paper, a two-dimensional coupled thermal-mechanical finite element model is adopted to determine thermal fields and cutting temperature during the micro-milling process, using the software Deform-2D. The effects of tool edge radius on effective stress, effective strain, velocity field and cutting temperature distribution in micro-milling of aluminum alloy Al2024-T6 were investigated and analyzed. The transient cutting temperature distribution was also simulated dynamically. The simulation results show that the cutting temperature in micro-milling is lower than in conventional milling processes due to the small loads and low cutting velocity. With increasing tool edge radius, the maximum temperature region gradually shifts to the contact region between the finished surface and the flank face of the micro-cutter, instead of the rake face or the corner of the micro-cutter. This phenomenon shows an obvious size effect.

  12. Experimental and Numerical Investigations in Shallow Cut Grinding by Workpiece Integrated Infrared Thermopile Array.

    Science.gov (United States)

    Reimers, Marcel; Lang, Walter; Dumstorff, Gerrit

    2017-09-30

    The purpose of our study is to investigate the heat distribution and the occurring temperatures during grinding. Therefore, we performed both experimental and numerical investigations. In the first part, we present the integration of an infrared thermopile array in a steel workpiece. Experiments are done by acquiring data from the thermopile array during grinding of a groove in a workpiece made of steel. In the second part, we present numerical investigations of the grinding process to further understand its thermal characteristics. Finally, we conclude our work. Increasing the feed speed leads to two things: higher heat flux densities in the workpiece and higher temperature gradients in the material.

  13. Experimental and Numerical Investigations in Shallow Cut Grinding by Workpiece Integrated Infrared Thermopile Array

    Directory of Open Access Journals (Sweden)

    Marcel Reimers

    2017-09-01

    Full Text Available The purpose of our study is to investigate the heat distribution and the occurring temperatures during grinding. Therefore, we performed both experimental and numerical investigations. In the first part, we present the integration of an infrared thermopile array in a steel workpiece. Experiments are done by acquiring data from the thermopile array during grinding of a groove in a workpiece made of steel. In the second part, we present numerical investigations of the grinding process to further understand its thermal characteristics. Finally, we conclude our work. Increasing the feed speed leads to two things: higher heat flux densities in the workpiece and higher temperature gradients in the material.

  14. Virtual-reality displaying of workpiece by reverse modeling

    International Nuclear Information System (INIS)

    Wu Huimin; Zhang Li; Chen Zhiqiang; Zhao Ziran

    2006-01-01

    The authors first propose a CT data processing system for virtual-reality-based testing of workpieces by reverse modeling. For the reverse modeling module, the authors propose two solutions: integrating medical CT modeling software, or developing it independently using the VTK library. The authors then analyze the required functions and characteristics of the CT-based reverse modeling module, and the key technologies for its development. For the virtual-reality module, the authors study the characteristics of CT data and the needs of CT users, and describe the required functions and key techniques of the virtual-reality display module. The authors also analyze the problems and prospects of development. (authors)

  15. Motion characteristic between die and workpiece in spline rolling process with round dies

    Directory of Open Access Journals (Sweden)

    Da-Wei Zhang

    2016-06-01

    Full Text Available In the spline rolling process with round dies, additional kinematic compensation is an essential mechanism for improving the division of teeth and pitch accuracy as well as surface quality. The motion characteristic between the die and workpiece under varied center distance in the spline rolling process was investigated. Mathematical models of the instantaneous center of rotation, transmission ratio, and centrodes in the rolling process were established. The models were used to analyze the rolling process of an involute spline with circular dedendum, and the results indicated that (1) with the reduction in the center distance, the instantaneous center moves toward the workpiece, and the transmission ratio first increases and then decreases; (2) the variations in the instantaneous center and transmission ratio are discontinuous, presenting an interruption when the involute flank begins to be formed; (3) the change in transmission ratio at the forming stage of the workpiece with the involute flank is negligible; and (4) the centrode of the workpiece is an Archimedean spiral whose polar radius decreases, and the centrode of the rolling die is close to an Archimedean spiral once the workpiece has an involute flank.

  16. Clinical measuring system for the form and position errors of circular workpieces using optical fiber sensors

    Science.gov (United States)

    Tan, Jiubin; Qiang, Xifu; Ding, Xuemei

    1991-08-01

    Optical sensors have two notable advantages in modern precision measurement. One is that they can be used in nondestructive measurement, because the sensors need not touch the surfaces of workpieces during measuring. The other is that they strongly resist electromagnetic interference, vibrations, and noise, so they are suitable for use at machining sites. However, drift in light intensity and changes in the reflection coefficient at different measuring positions on a workpiece may greatly influence the measured results. To solve this problem, a spectroscopic differential characteristic compensating method is put forward. The method can effectively compensate for the measuring errors resulting from the drift of light intensity and also eliminate the influence on measured results caused by changes in the reflection coefficient. The article also analyzes the possibility and means of separating data errors in a clinical measuring system for the form and position errors of circular workpieces.

  17. Properties isotropy of magnesium alloy strip workpieces

    Directory of Open Access Journals (Sweden)

    R. Kawalla

    2016-12-01

    Full Text Available The paper discusses the issue of obtaining high-quality cast workpieces of magnesium alloys produced by strip roll-casting. Producing strips of magnesium alloys by combining the processes of casting and rolling, where liquid melt is fed continuously to fast rolls, is quite promising and economical. In the process of sheet stamping, considerable losses of metal occur on festoons formed due to the anisotropy of properties of the strip workpiece, as determined by the macro- and microstructure and the modes of rolling and annealing. The principal causes of anisotropic mechanical properties of metal strips produced by the combined casting and rolling technique are the character of the distribution of intermetallic compounds in the strip, the orientation of phases and metal defects, and the residual stresses. One of the tasks in increasing the yield of acceptable products during stamping operations consists in minimizing the amount of defects. To lower the level of anisotropy in mechanical properties, various ways of treating the melt during casting are suggested. Designing the technology of producing strips of magnesium alloys opens the possibility of using them in the automobile industry to manufacture light-weight body elements instead of those made of steel.

  18. Multi-part mask for implanting workpieces

    Science.gov (United States)

    Webb, Aaron P.; Carlson, Charles T.

    2016-05-10

    A multi-part mask has a pattern plate, which includes a planar portion that has the desired aperture pattern to be used during workpiece processing. The multi-part mask also has a mounting frame, which is used to hold the pattern plate. Prior to assembly, the pattern plate has an aligning portion, which has one or more holes through which reusable alignment pins are inserted. These alignment pins enter kinematic joints disposed on the mounting frame, which serve to precisely align the pattern plate to the mounting frame. After the pattern plate has been secured to the mounting frame, the aligning portion can be detached from the pattern plate. The alignment pins can be reused at a later time. In some embodiments, the pattern plate can later be removed from the mounting frame, so that the mounting frame may be reused.

  19. Traceability investigation in Computed Tomography using industry-inspired workpieces

    DEFF Research Database (Denmark)

    Kraemer, Alexandra; Stolfi, Alessandro; Schneider, Timm

    2017-01-01

    This paper concerns an investigation of the accuracy of Computed Tomography (CT) measurements using four industry-inspired workpieces. A total of 16 measurands were selected and calibrated using CMMs. CT measurements on industry-inspired workpieces were carried out using two CTs having different...

  20. Local total and radiative heat-transfer coefficients during the heat treatment of a workpiece in a fluidised bed

    International Nuclear Information System (INIS)

    Gao, W.M.; Kong, L.X.; Hodgson, P.D.

    2006-01-01

    The heat-transfer coefficients around a workpiece immersed in an electrically heated heat-treatment fluidised bed were studied. A suspension probe designed to simulate a workpiece of complex geometry was developed to measure local total and radiative heat-transfer coefficients at high bed temperature. The probe consisted of an energy-storage region separated by insulation from the fluidised bed, except for the measuring surface, and a multi-thermocouple measurement system. Experiments in the fluidised bed were performed for a fluidising medium of 120-mesh alumina, a wide temperature range of 110-1050 °C and a fluidising number range of 1.18-4.24. It was found that the workpiece surface temperature has a more significant effect on heat transfer than the bed temperature. The total heat-transfer coefficient at the upper surface of the workpiece sharply decreased at the start of heating and then steadily increased as heating progressed, while the radiative heat-transfer coefficient showed a sharp decrease followed first by a rapid and then by a slow increase. A great difference in the heat-transfer coefficients around the workpiece was observed.
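    The way such an energy-storage probe yields a local heat-transfer coefficient can be sketched with a lumped-capacitance energy balance. This is a generic sketch, not the authors' exact procedure; the property values and probe dimensions below are illustrative.

```python
def heat_transfer_coefficient(T_surface, dT_dt, T_bed,
                              rho=7850.0, c=490.0, volume=1e-6, area=1e-4):
    """Lumped-capacitance estimate of the local total heat-transfer
    coefficient from the measured heating rate of the probe's
    energy-storage region:
        h = rho * c * V * (dT/dt) / (A * (T_bed - T_surface))
    rho, c: density and specific heat (steel, illustrative); V, A: volume
    and exposed measuring area of the energy-storage region (illustrative).
    Returns h in W m^-2 K^-1."""
    return rho * c * volume * dT_dt / (area * (T_bed - T_surface))

# Hypothetical reading: surface at 400 degC heating at 2 K/s in a 700 degC bed
h = heat_transfer_coefficient(400.0, 2.0, 700.0)
```

    The resulting order of magnitude (a few hundred W m^-2 K^-1) is typical for fluidised-bed heat treatment.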

  1. Mass analyzer ``MASHA'' high temperature target and plasma ion source

    Science.gov (United States)

    Semchenkov, A. G.; Rassadov, D. N.; Bekhterev, V. V.; Bystrov, V. A.; Chizov, A. Yu.; Dmitriev, S. N.; Efremov, A. A.; Guljaev, A. V.; Kozulin, E. M.; Oganessian, Yu. Ts.; Starodub, G. Ya.; Voskresensky, V. M.; Bogomolov, S. L.; Paschenko, S. V.; Zelenak, A.; Tikhonov, V. I.

    2004-05-01

    A new separator and mass analyzer of super-heavy atoms (MASHA) has been created at the FLNR JINR Dubna to separate and measure masses of nuclei and molecules with a precision better than 10⁻³. First experiments with the FEBIAD plasma ion source have been performed and give an ionization efficiency of up to 20% for Kr with a low-flow test leak (6 particle μA). We propose optimizing the magnetic field, using an additional electrode (einzel lens type) in the extraction system, and improving the vacuum conditions in order to increase the ion source efficiency.

  2. Mass analyzer 'MASHA' high temperature target and plasma ion source

    International Nuclear Information System (INIS)

    Semchenkov, A.G.; Rassadov, D.N.; Bekhterev, V.V.; Bystrov, V.A.; Chizov, A.Yu.; Dmitriev, S.N.; Efremov, A.A.; Guljaev, A.V.; Kozulin, E.M.; Oganessian, Yu.Ts.; Starodub, G.Ya.; Voskresensky, V.M.; Bogomolov, S.L.; Paschenko, S.V.; Zelenak, A.; Tikhonov, V.I.

    2004-01-01

    A new separator and mass analyzer of super-heavy atoms (MASHA) has been created at the FLNR JINR Dubna to separate and measure masses of nuclei and molecules with a precision better than 10⁻³. First experiments with the FEBIAD plasma ion source have been performed and give an ionization efficiency of up to 20% for Kr with a low-flow test leak (6 particle μA). We propose optimizing the magnetic field, using an additional electrode (einzel lens type) in the extraction system, and improving the vacuum conditions in order to increase the ion source efficiency.

  3. Machining of high performance workpiece materials with CBN coated cutting tools

    International Nuclear Information System (INIS)

    Uhlmann, E.; Fuentes, J.A. Oyanedel; Keunecke, M.

    2009-01-01

    The machining of high performance workpiece materials requires significantly harder cutting materials. In hard machining, early tool wear occurs due to high process forces and temperatures. The hardest known material is diamond, but steel materials cannot be machined with diamond tools because of the reactivity of iron with carbon. Cubic boron nitride (cBN) is the second hardest of all known materials. PcBN indexable inserts are available only in geometrically simple forms, and their production requires several work steps and is cost-intensive. The development of a cBN coating for cutting tools combines the advantages of a thin-film system and of cBN: cemented carbide tools that are flexible with respect to geometry can be coated. The cBN films, with a thickness of up to 2 μm on cemented carbide substrates, show excellent mechanical and physical properties. This paper describes the results of the machining of various workpiece materials in turning and milling operations with regard to tool life, resultant cutting force components and workpiece surface roughness. In turning tests of Inconel 718 and milling tests of chrome steel, the high potential of cBN coatings for dry machining was proven. The results of the experiments were compared with commonly used tool coatings for hard machining. Additionally, the wear mechanisms of adhesion, abrasion, surface fatigue and tribo-oxidation were investigated in model wear experiments.

  4. Numerical Simulation of a Grinding Process for the Spatial Work-pieces: a Model of the Workpiece and Grinding Wheel

    Directory of Open Access Journals (Sweden)

    I. A. Kiselev

    2015-01-01

    Full Text Available The paper describes a mathematical model of spatial grinding dynamics. This model includes a grinding wheel dynamics model, a work-piece dynamics model, and a numerical algorithm for geometric modeling. The geometric modeling algorithm is based on the Z-buffer method with the authors' modifications. It allows us to simulate the formation of a new work-piece surface as material is removed, and to determine the cutting layer thickness for each abrasive grain of the grinding wheel. The use of bilinear approximation of the surface cells and the simultaneous use of multiple projection directions are the special features of the algorithm; these features improve the modeling quality of the machined surface. The grinding wheel model is represented as a set of cutting micro-edges (grains). Abrasive grains are randomly distributed on the wheel's outer surface, and grain size and shape, wheel structure and graininess are taken into account. To determine the uncut chip thickness cut off by each grain of the grinding wheel, an algorithm is used that finds the intersection point of the uncut work-piece surface with a radial ray passing through the grain's cutting edge. Grinding forces for each grain are defined based on the cutting layer thickness using the phenomenological models described in the literature. Using transformations described in the article, the grinding forces determined for each grain are reduced to the total grinding force, which acts on the tool and the machined work-piece in the appropriate coordinate systems. Work-piece dynamics is modeled with the finite element method using quadratic tetrahedral elements. The described model of spatial grinding dynamics makes it possible to evaluate the level of vibration and grinding forces, as well as the shape errors and surface quality of the machined work-piece.

  5. Computer-Aided Manufacturing of 3D Workpieces

    OpenAIRE

    Cornelia Victoria Anghel Drugarin; Mihaela Dorica Stroia

    2017-01-01

    Computer-Aided Manufacturing (CAM) involves the use of dedicated software for controlling machine tools and similar devices in the process of workpiece manufacturing. CAM is, in fact, an application technology that uses computer software and machinery to simplify and automate manufacturing processes. CAM is the successor of computer-aided engineering (CAE) and is often used in conjunction with computer-aided design (CAD). Advanced CAM solutions are forthcoming and have a large

  6. Resultant geometric variation of a fixtured workpiece Part I: a simulation

    Directory of Open Access Journals (Sweden)

    Supapan Sangnui Chaiprapat

    2006-01-01

    Full Text Available When a workpiece is fixtured for a machining or inspection operation, the accuracy of the operation is mainly determined by the efficiency of the fixturing method. Variability in manufactured workpieces is practically inevitable. When such variability is found at the contact areas between the workpiece and the fixture, errors in location are expected. These errors will affect the quality of the features to be produced. This paper develops an algorithm to determine the variant final locations of a displaced workpiece given normally distributed errors at the contact points. The resultant geometric variation of the workpiece location reveals interesting information which is beneficial in tolerance planning.
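    The idea of propagating normally distributed contact-point errors into workpiece location variation can be illustrated with a minimal Monte Carlo sketch for a three-point planar support. The locator layout, error magnitude and tilt metric below are hypothetical, not the paper's algorithm.

```python
import math
import random

def simulate_tilt(n_trials=2000, sigma=0.01, seed=42):
    """Monte Carlo sketch: a workpiece rests on three locators spanning
    100 mm (hypothetical layout); each contact height receives an
    independent N(0, sigma) error in mm. Returns the standard deviation of
    the resulting tilt angle (rad) of the support plane about one axis."""
    rng = random.Random(seed)  # seeded for reproducibility
    span = 100.0               # mm between the two locators defining the tilt
    tilts = []
    for _ in range(n_trials):
        z0 = rng.gauss(0.0, sigma)
        z1 = rng.gauss(0.0, sigma)
        # tilt of the plane through the contacts, from the height difference
        tilts.append(math.atan((z1 - z0) / span))
    mean = sum(tilts) / len(tilts)
    return (sum((t - mean) ** 2 for t in tilts) / len(tilts)) ** 0.5

tilt_sd = simulate_tilt()
```

    Analytically, the tilt standard deviation here is sqrt(2)·sigma/span, and the simulation converges to that value, illustrating how contact-point tolerances map to orientation tolerances.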

  7. Separating heat stress from moisture stress: analyzing yield response to high temperature in irrigated maize

    Science.gov (United States)

    Carter, Elizabeth K.; Melkonian, Jeff; Riha, Susan J.; Shaw, Stephen B.

    2016-09-01

    Several recent studies have indicated that high air temperatures are limiting maize (Zea mays L.) yields in the US Corn Belt and project significant yield losses with expected increases in growing season temperatures. Further work has suggested that high air temperatures are indicative of high evaporative demand, and that decreases in maize yields that correlate with high temperatures and vapor pressure deficits (VPD) likely reflect underlying soil moisture limitations. It remains unclear whether direct high temperature impacts on yields, independent of moisture stress, can be observed under current temperature regimes. Given that projected high temperature and moisture may not co-vary the same way as they have historically, quantitative analyses of direct temperature impacts are critical for accurate yield projections and targeted mitigation strategies under shifting temperature regimes. To evaluate yield response to above-optimum temperatures independent of soil moisture stress, we analyzed climate impacts on irrigated maize yields obtained from the National Corn Growers Association (NCGA) corn yield contests for Nebraska, Kansas and Missouri. In irrigated maize, we found no evidence of a direct negative impact on yield by daytime air temperature, calculated canopy temperature, or VPD when analyzed seasonally. Solar radiation was the primary yield-limiting climate variable. Our analyses suggested that elevated night temperature impacted yield by increasing rates of phenological development. High temperatures during grain fill significantly interacted with yields, but this effect was often beneficial and included evidence of acquired thermo-tolerance. Furthermore, genetics and management (information uniquely available in the NCGA contest data) explained more yield variability than climate, and significantly modified crop response to climate. Thermo-acclimation, improved genetics and changes to management practices have the potential to partially or completely

  8. Experiments for practical education in process parameter optimization for selective laser sintering to increase workpiece quality

    Science.gov (United States)

    Reutterer, Bernd; Traxler, Lukas; Bayer, Natascha; Drauschke, Andreas

    2016-04-01

    Selective Laser Sintering (SLS) is considered one of the most important additive manufacturing processes due to component stability and its broad range of usable materials. However, the influence of the different process parameters on mechanical workpiece properties is still poorly studied, so further optimization is necessary to increase workpiece quality. In order to investigate the impact of various process parameters, laboratory experiments are implemented to improve the understanding of the limitations and advantages of SLS at an educational level. The experiments are based on two different workstations used to teach students the fundamentals of SLS. First, a 50 W CO2 laser workstation is used to investigate the interaction of the laser beam with the material under varied process parameters by analyzing a single-layered test piece. Second, the FORMIGA P110 laser sintering system from EOS is used to print different 3D test pieces as a function of various process parameters. Finally, quality attributes are tested, including warpage, dimensional accuracy and tensile strength. For dimension measurements and evaluation of the surface structure, a telecentric lens in combination with a camera is used. A tensile test machine allows testing of the tensile strength and interpretation of stress-strain curves. The developed laboratory experiments are suitable for teaching students the influence of the processing parameters. In this context students will be able to optimize the input parameters depending on the component to be manufactured and to increase the overall quality of the final workpiece.
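    In SLS practice, the process parameters varied in such experiments are often condensed into a single volumetric energy density; a minimal sketch of that bookkeeping, with scan settings chosen purely for illustration (the abstract gives only the 50 W laser power):

```python
def energy_density(power_w, scan_speed_mm_s, hatch_mm, layer_mm):
    """Volumetric energy density E = P / (v * h * d) in J/mm^3, a common
    way to summarize SLS process parameters (not taken from the paper)."""
    return power_w / (scan_speed_mm_s * hatch_mm * layer_mm)

# 50 W laser power as in the first workstation; the other values are assumed
e = energy_density(50.0, 2000.0, 0.25, 0.1)  # -> 1.0 J/mm^3
```

    Comparing test pieces built at equal energy density but with different individual parameters is one simple way to structure such student experiments.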

  9. Simulation of Grasping Prismatic Workpieces by a Pneumatically Driven 3-Finger Robotic Gripper

    Directory of Open Access Journals (Sweden)

    Calin-Octavian Miclosina

    2017-12-01

    Full Text Available The paper presents the 3D model of a robotic gripper and a way to determine the prehension force by using the SolidWorks software. A set of prismatic workpieces is considered, the finger-workpiece contact force being determined in the SolidWorks Motion module for the most disadvantageous case, the heaviest workpiece, together with the von Mises stress that occurs in the gripper fingers.
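    The abstract does not state how the required prehension force relates to workpiece mass; for a friction-only grip, a common back-of-envelope check looks like the sketch below (mass, friction coefficient and safety factor are all hypothetical, not values from the paper):

```python
def required_grip_force(mass_kg, mu, n_fingers=3, safety=2.0, g=9.81):
    """Normal force per finger needed to hold a workpiece by friction alone:
    F = S * m * g / (n * mu).  The friction coefficient and safety factor
    are assumptions for illustration."""
    return safety * mass_kg * g / (n_fingers * mu)

# Hypothetical heaviest workpiece of 1.5 kg, assumed friction mu = 0.4
f_per_finger = required_grip_force(1.5, 0.4)  # about 24.5 N
```

    A motion simulation such as the one described refines this estimate with the actual contact geometry and dynamic loads.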

  10. Analyzing the LiF thin films deposited at different substrate temperatures using multifractal technique

    Energy Technology Data Exchange (ETDEWEB)

    Yadav, R.P. [Department of Physics, University of Allahabad, Allahabad, UP 211002 (India); Dwivedi, S., E-mail: suneetdwivedi@gmail.com [K Banerjee Centre of Atmospheric and Ocean Studies, University of Allahabad, Allahabad, UP 211002 (India); Mittal, A.K. [Department of Physics, University of Allahabad, Allahabad, UP 211002 (India); K Banerjee Centre of Atmospheric and Ocean Studies, University of Allahabad, Allahabad, UP 211002 (India); Kumar, Manvendra [Nanotechnology Application Centre, University of Allahabad, Allahabad, UP 211002 (India); Pandey, A.C. [K Banerjee Centre of Atmospheric and Ocean Studies, University of Allahabad, Allahabad, UP 211002 (India); Nanotechnology Application Centre, University of Allahabad, Allahabad, UP 211002 (India)

    2014-07-01

    The Atomic Force Microscopy technique is used to characterize the surface morphology of LiF thin films deposited at substrate temperatures of 77 K, 300 K and 500 K, respectively. It is found that the surface roughness of the thin films increases with substrate temperature. The multifractal nature of the LiF thin film at each substrate temperature is investigated using the backward two-dimensional multifractal detrended moving average analysis. The strength of multifractality and the non-uniformity of the height probabilities of the thin films increase as the substrate temperature increases. Both the width of the multifractal spectrum and the difference of fractal dimensions of the thin films increase sharply as the temperature reaches 500 K, indicating that the multifractality of the thin films becomes more pronounced at the higher substrate temperatures with greater cluster size. - Highlights: • Analyzing LiF thin films using the multifractal detrended moving average technique • Surface roughness of LiF thin films increases with substrate temperature. • LiF thin films at each substrate temperature exhibit multifractality. • Multifractality becomes more pronounced at the higher substrate temperatures.
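    The backward 2D MF-DMA used in the study involves moving-average detrending that is too long to reproduce here, but its central idea, a q-dependent scaling of local height fluctuations, can be illustrated with a simplified box-based variant. The function below is a didactic stand-in, not the paper's algorithm; the scales and q values are arbitrary choices:

```python
import numpy as np

def generalized_hurst(surface, scales=(4, 8, 16, 32), qs=(-2, -1, 1, 2)):
    """Box-based sketch of multifractal surface analysis.

    For each scale s, the height map is tiled into s-by-s boxes; the RMS of
    the mean-detrended heights in each box gives a local fluctuation F.  The
    q-order fluctuation function Fq(s) = (mean of F**q)**(1/q) then yields
    the generalized Hurst exponent h(q) as the log-log slope of Fq vs. s.
    A q-dependent h(q) signals multifractality.  (Avoid q = 0 here; the
    full MF-DMA treats that case with a logarithmic average.)
    """
    h = []
    for q in qs:
        log_fq = []
        for s in scales:
            ny, nx = surface.shape[0] // s, surface.shape[1] // s
            boxes = surface[:ny * s, :nx * s].reshape(ny, s, nx, s)
            f = np.sqrt(((boxes - boxes.mean(axis=(1, 3), keepdims=True)) ** 2)
                        .mean(axis=(1, 3)))          # per-box RMS fluctuation
            f = f[f > 0]                             # guard against flat boxes
            log_fq.append(np.log(np.mean(f ** q)) / q)
        h.append(np.polyfit(np.log(scales), log_fq, 1)[0])
    return np.array(h)
```

    For an uncorrelated (monofractal-like) surface, h(q) is nearly flat; a widening spread of h(q) across q corresponds to the stronger multifractality the authors report at 500 K.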

  11. Use of a fluidized bed combustor and thermogravimetric analyzer for the study of coal ignition temperature

    International Nuclear Information System (INIS)

    Ávila, Ivonete; Crnkovic, Paula M.; Luna, Carlos M.R.; Milioli, Fernando E.

    2017-01-01

    Highlights: • Coal ignition tests were conducted under fluidized bed and thermogravimetric conditions. • The use of two different ignition criteria gave similar coal ignition temperatures. • Coal ignition temperature was obtained from the changes of gas concentrations in the FBC. • Ignition temperatures were associated with the activation energy of coal combustion. - Abstract: Ignition experiments with two bituminous coals were carried out in an atmospheric bubbling fluidized bed combustor (FBC) and a thermogravimetric analyzer (TGA). In the FBC tests, a rapid increase in O_2, CO_2, and SO_2 concentrations is an indication of coal ignition. In the TGA technique, the ignition temperature was determined by evaluating the TGA curves of both the combustion and pyrolysis processes. Model-Free Kinetics was applied, and the coal ignition temperatures were associated with changes in the activation energy values during the combustion process. The results show that the coal with the lowest activation energy also had the lowest ignition temperature, the highest volatile content and a higher heating value. The application of the two different ignition criteria (TGA and FBC) resulted in similar ignition temperatures. The FBC curves indicated that the high-volatile coal ignites in the freeboard, i.e. during feeding into the reactor, whereas the low-volatile coal ignites in the bed. Finally, the physicochemical characteristics of the investigated coal types were correlated with their reactivities for the prediction of ignition temperature behavior under operating conditions such as those in the FBC.
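    A common way to implement the TGA criterion, comparing the combustion and pyrolysis TG curves, is to take the temperature at which the two curves first diverge; a numpy sketch with an assumed tolerance (a divergence criterion like this is one of several used in TGA ignition studies, not necessarily the authors' exact procedure):

```python
import numpy as np

def ignition_temperature(temp_c, tg_combustion, tg_pyrolysis, tol=0.5):
    """Estimate the ignition temperature as the temperature at which the
    combustion TG curve first deviates from the pyrolysis TG curve by more
    than `tol` mass-%.  The tolerance is an assumed value."""
    deviation = np.abs(tg_combustion - tg_pyrolysis)
    first = np.argmax(deviation > tol)   # index of first point past tolerance
    return float(temp_c[first])
```

    Note that np.argmax returns 0 when the curves never diverge, so a real implementation should check that case explicitly.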

  12. Ion Temperature Measurements in the Tore Supra Scrape-Off Layer Using a Retarding Field Analyzer

    International Nuclear Information System (INIS)

    Kocan, M.; Gunn, J.P.; Pascal, J.Y.; Gauthier, E.

    2010-01-01

    The retarding field analyzer (RFA) is one of the few widely accepted diagnostics for measuring the ion temperature (Ti) in the tokamak scrape-off layer. An overview of the outstanding RFA performance over ten years of operation in the Tore Supra tokamak is given and the validation of Ti measurements is addressed. The RFA measurements in Tore Supra are found to be well reproducible. The ion-to-electron temperature ratio is higher than one in the low-to-moderate ion-electron collisionality regime and converges to unity in the high-collisionality regime. (authors)
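    On the ion branch of an RFA characteristic, the collector current decays exponentially with retarding voltage, and Ti (in eV) is the reciprocal slope of ln(I) versus V; a minimal fit, assuming singly charged ions and a known sheath potential (a real analysis fits the sheath potential as well):

```python
import numpy as np

def fit_ti_ev(grid_volts, ion_current, v_sheath=0.0):
    """Ion temperature (eV) from the decaying branch of an RFA I-V
    characteristic, I = I0 * exp(-(V - Vs) / Ti), for singly charged ions
    and a known sheath potential Vs (both assumptions here).
    Ti is -1/slope of ln(I) versus V above Vs."""
    mask = grid_volts > v_sheath
    slope = np.polyfit(grid_volts[mask], np.log(ion_current[mask]), 1)[0]
    return -1.0 / slope
```

    With measured data, the fit would be restricted to the clean exponential region of the characteristic.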

  13. Finite grid radius and thickness effects on retarding potential analyzer measured suprathermal electron density and temperature

    International Nuclear Information System (INIS)

    Knudsen, W.C.

    1992-01-01

    The effect of finite grid radius and thickness on the electron current measured by planar retarding potential analyzers (RPAs) is analyzed numerically. Depending on the plasma environment, the current is significantly reduced below that which is calculated using a theoretical equation derived for an idealized RPA having grids with infinite radius and vanishingly small thickness. A correction factor to the idealized theoretical equation is derived for the Pioneer Venus (PV) orbiter RPA (ORPA) for electron gases consisting of one or more components obeying Maxwell statistics. The error in density and temperature of Maxwellian electron distributions previously derived from ORPA data using the theoretical expression for the idealized ORPA is evaluated by comparing the densities and temperatures derived from a sample of PV ORPA data using the theoretical expression with and without the correction factor.

  14. Ion temperature measurement by neutral energy analyzer in high-field tokamak TRIAM-1

    Energy Technology Data Exchange (ETDEWEB)

    Nakamura, K; Hiraki, N; Toi, K; Itoh, S [Kyushu Univ., Fukuoka (Japan). Research Inst. for Applied Mechanics

    1980-02-01

    The measurement of the ion temperature of the TRIAM-1 tokamak plasma is carried out by using a seven-channel neutral energy analyzer. The temporal and spatial variations of the ion temperature have been obtained with a spatial resolution of ±4.3 mm and a temporal resolution of 100 μs. The energy range of the analyzed neutral particles is from 0.2 to 8 keV. The energy spectrum in the TRIAM-1 plasma without strong gas puffing usually consists of a two-component Maxwellian distribution; one component represents the thermal part, which is a superposition of the contribution from a hot region (T_i = 100-300 eV) and that from an edge region (T_i ≈ 50 eV), and the other represents the superthermal part (T_i ≈ 1 keV). The neutral particle energy spectra at several vertical positions are obtained by scanning the analyzer in the vertical direction. From those spectra, the radial profile of the ion temperature is derived by means of a nonlinear optimization method.

  15. Analyzing the impact of ambient temperature indicators on transformer life in different regions of Chinese mainland.

    Science.gov (United States)

    Bai, Cui-fen; Gao, Wen-Sheng; Liu, Tong

    2013-01-01

    Regression analysis is applied to quantitatively analyze the impact of different ambient temperature characteristics on transformer life at different locations of the Chinese mainland. 200 typical locations in the Chinese mainland are selected for the study and divided into six regions so that the subsequent analysis can be done in a regional context. For each region, the local historical ambient temperature and load data are provided as input variables of the life consumption model in IEEE Std. C57.91-1995 to estimate the transformer life at every location. Five ambient temperature indicators related to transformer life are entered into a partial least squares regression to describe their impact on transformer life. According to a contribution measurement criterion of partial least squares regression, three indicators are found to be the most important factors influencing transformer life, and an explicit expression is provided to describe the relationship between the indicators and the transformer life for every region. The analysis is applicable to areas where the temperature characteristics are similar to those of the Chinese mainland, and the expressions obtained can be applied to other locations not included in this paper if these three indicators are known.
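    The life consumption model cited in the abstract rests on the per-hour aging acceleration factor defined in IEEE Std C57.91-1995; a direct transcription for 65 °C average winding rise insulation (the hot-spot temperatures fed to it would come from the load and ambient data described above):

```python
import math

def aging_acceleration_factor(hotspot_c):
    """Per-hour insulation aging acceleration factor from IEEE Std
    C57.91-1995 for 65 degC average winding rise insulation; it equals 1
    at the 110 degC reference hot-spot temperature."""
    return math.exp(15000.0 / 383.0 - 15000.0 / (hotspot_c + 273.0))

def equivalent_aging_hours(hourly_hotspot_c):
    """Equivalent aging (hours) of an hourly hot-spot temperature series."""
    return sum(aging_acceleration_factor(t) for t in hourly_hotspot_c)
```

    Summing the factor over an hourly hot-spot history is how ambient temperature characteristics ultimately enter the transformer-life estimates that the regression then explains.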

  17. Analyzes of students’ higher-order thinking skills of heat and temperature concept

    Science.gov (United States)

    Slamet Budiarti, Indah; Suparmi, A.; Sarwanto; Harjana

    2017-11-01

    High-order thinking skills refer to the three highest domains of the revised Bloom taxonomy. The aim of the research was to analyze students' higher-order thinking skills on the heat and temperature concept. The sample, taken by a purposive random sampling technique, consisted of 85 high school students from 3 senior high schools in Jayapura city. A descriptive qualitative method was employed in this study. The data were collected by using tests and interviews regarding the subject matter of heat and temperature. Based on the results of the data analysis, it was concluded that 68.24% of the students showed higher-order thinking skills in analyzing, 3.53% in evaluating, and 0% in creating.

  18. Influence of Workpiece Material on Tool Wear Performance and Tribofilm Formation in Machining Hardened Steel

    Directory of Open Access Journals (Sweden)

    Junfeng Yuan

    2016-04-01

    Full Text Available In addition to the bulk properties of a workpiece material, characteristics of the tribofilms formed as a result of workpiece material mass transfer to the friction surface play a significant role in friction control. This is especially true in cutting of hardened materials, where it is very difficult to use liquid based lubricants. To better understand wear performance and the formation of beneficial tribofilms, this study presents an assessment of uncoated mixed alumina ceramic tools (Al2O3+TiC) in the turning of two grades of steel, AISI T1 and AISI D2. Both workpiece materials were hardened to 59 HRC then machined under identical cutting conditions. Comprehensive characterization of the resulting wear patterns and the tribofilms formed at the tool/workpiece interface was made using X-ray Photoelectron Spectroscopy and Scanning Electron Microscopy. Metallographic studies of the workpiece material were performed before the machining process, and the surface integrity of the machined part was investigated after machining. Tool life was 23% higher when turning D2 than T1. This improvement in cutting tool life and wear behaviour was attributed to differences in: (1) tribofilm generation on the friction surface and (2) the amount and distribution of carbide phases in the workpiece materials. The results show that wear performance depends both on the properties of the workpiece material and on the characteristics of the tribofilms formed on the friction surface.

  19. Effect of feed rate, workpiece hardness and cutting edge on subsurface residual stress in the hard turning of bearing steel using chamfer + hone cutting edge geometry

    International Nuclear Information System (INIS)

    Hua Jiang; Shivpuri, Rajiv; Cheng Xiaomin; Bedekar, Vikram; Matsumoto, Yoichi; Hashimoto, Fukuo; Watkins, Thomas R.

    2005-01-01

    Residual stress on the machined surface and subsurface is known to influence the service quality of a component, such as fatigue life, tribological properties, and distortion. Therefore, it is essential to predict and control it for enhanced performance. In this paper, a newly proposed hardness-based flow stress model is incorporated into an elastic-viscoplastic finite element model of hard turning to analyze process variables that affect the residual stress profile of the machined surface. The effects of cutting edge geometry and workpiece hardness as well as cutting conditions, such as feed rate and cutting speed, are investigated. Numerical analysis shows that a hone-plus-chamfer cutting edge and an aggressive feed rate help to increase both the compressive residual stress and its penetration depth. These predictions are validated by face turning experiments conducted using a chamfer-with-hone cutting edge for different material hardnesses and cutting parameters. The residual stresses under the machined surface are measured by the X-ray diffraction/electropolishing method. A maximum circumferential residual stress of about 1700 MPa at a depth of 40 μm is reached for a hardness of 62 HRc and a feed rate of 0.56 mm/rev. This represents a significant increase over previously reported results in the literature. It is found from this analysis that using a medium hone radius (0.02-0.05 mm) plus chamfer is good for keeping tool temperature and cutting force low, while obtaining the desired residual stress profile.

  20. Mathematical simulation and optimization of cutting mode in turning of workpieces made of nickel-based heat-resistant alloy

    Science.gov (United States)

    Bogoljubova, M. N.; Afonasov, A. I.; Kozlov, B. N.; Shavdurov, D. E.

    2018-05-01

    A predictive simulation technique for determining optimal cutting modes in the turning of workpieces made of nickel-based heat-resistant alloys, distinct from well-known approaches, is proposed. The impact of various factors on the cutting process is analyzed with the purpose of determining optimal machining parameters according to given effectiveness criteria. A mathematical optimization model, algorithms and computer programmes, and visual graphical forms reflecting the dependences of the effectiveness criteria (productivity, net cost, and tool life) on the parameters of the technological process have been worked out. A nonlinear model for multidimensional functions, solution of equations with multiple unknowns, a coordinate descent method and heuristic algorithms are used to solve the problem of optimizing the cutting mode parameters. Research shows that in machining of workpieces made from heat-resistant alloy AISI N07263, the highest possible productivity is achieved with the following parameters: cutting speed v = 22.1 m/min, feed rate s = 0.26 mm/rev, tool life T = 18 min, and net cost of 2.45 per hour.
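    The structure of the optimization, maximizing productivity subject to a minimum tool life, can be illustrated with a simple grid search over speed and feed under an extended Taylor tool-life model; every constant below is hypothetical, not the paper's model:

```python
import numpy as np

def optimize_cutting(c=2.0e4, a=2.5, b=0.9, t_min=18.0,
                     v_range=(10.0, 40.0), s_range=(0.05, 0.40)):
    """Grid-search sketch of the cutting-mode problem: maximize the material
    removal proxy v*s subject to an extended Taylor tool-life constraint
    T = C / (v**a * s**b) >= t_min.  All constants are illustrative."""
    best = None
    for v in np.linspace(*v_range, 301):       # cutting speed, m/min
        for s in np.linspace(*s_range, 351):   # feed rate, mm/rev
            if c / (v ** a * s ** b) >= t_min and (best is None or v * s > best[0]):
                best = (v * s, v, s)
    return best  # (productivity proxy, v, s), or None if infeasible
```

    The authors' approach additionally trades off net cost and uses coordinate descent and heuristics rather than exhaustive search, but the feasible-region logic is the same.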

  1. Tool breakage detection from 2D workpiece profile using vision method

    International Nuclear Information System (INIS)

    Lee, W K; Ratnam, M M; Ahmad, Z A

    2016-01-01

    In-process tool breakage monitoring can significantly reduce cost and prevent damage to the machine tool. In this paper, a machine vision approach was employed to detect tool fracture in a commercial aluminium oxide ceramic cutting tool during turning of AISI 52100 hardened steel. The contour of the workpiece profile was captured with the aid of backlighting during turning using a high-resolution DSLR camera with a shutter speed of 1/4000 s. The surface profile of the workpiece was extracted to sub-pixel accuracy using the invariant moment method. The effect of fracture in ceramic cutting tools on the surface profile signature of the machined workpiece was studied using autocorrelation. Fracture in the aluminum oxide ceramic tool was found to cause the peaks of the autocorrelation function of the workpiece profile to decrease rapidly as the lag distance increased. The envelopes of the peaks of the autocorrelation function at different workpiece angles were observed to deviate significantly from one another when the tool had fractured, due to the continuous fracture of the ceramic cutting insert during machining. (paper)
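    The autocorrelation signature is straightforward to reproduce: a periodic feed-mark profile retains a strong autocorrelation peak at the feed period, which collapses when the profile is disturbed. A sketch with an assumed detection threshold (the paper compares peak envelopes across workpiece angles rather than a single lag):

```python
import numpy as np

def autocorr(profile):
    """Normalized autocorrelation of a surface-profile signature."""
    x = profile - profile.mean()
    r = np.correlate(x, x, mode='full')[x.size - 1:]
    return r / r[0]

def tool_fractured(profile, feed_period_px, threshold=0.5):
    """Flag a possible insert fracture when the autocorrelation peak at the
    feed period collapses (the 0.5 threshold is an assumption, not the
    paper's criterion)."""
    return autocorr(profile)[feed_period_px] < threshold
```

    A clean sinusoidal feed-mark profile is not flagged, while an uncorrelated profile is.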

  2. Comparative Study of White Layer Characteristics for Static and Rotating Workpiece during Electric Discharge Machining

    Directory of Open Access Journals (Sweden)

    SHAHID MEHMOOD

    2017-10-01

    Full Text Available EDMed (Electric Discharge Machined) surfaces are unique in their appearance and metallurgical characteristics, which depend on different parameters such as electric parameters, flushing method, and dielectric type. Conventionally, in the static-workpiece method the EDM (Electric Discharge Machining) is performed by submerging both the tool and the workpiece in dielectric liquid, and side flushing is provided by impinging pressurized dielectric liquid into the gap. Another flushing method has been investigated in this study, in which, instead of side flushing, rotational motion is provided to the workpiece. Surface characteristics for both flushing methods are determined and compared in this study. The investigated surface characteristics are: surface roughness, crater size, surface morphology, and white layer thickness and composition. These investigations are performed using optical and SEM (Scanning Electron Microscope) imaging. Statistical confidence limits are determined for the scattered surface roughness data. It is found that the white layer thickness and surface roughness are directly proportional to the discharge current for both flushing methods. The comparison has shown that side flushing of the static workpiece gives a thicker white layer and a poorer surface finish compared to the flushing caused by rotation of the workpiece.

  4. Cutting temperature measurement and material machinability

    Directory of Open Access Journals (Sweden)

    Nedić Bogdan P.

    2014-01-01

    Full Text Available Cutting temperature is a very important parameter of the cutting process. Around 90% of the heat generated during cutting is carried away by the chips, and the rest is transferred to the tool and workpiece. In this research the cutting temperature was measured with artificial thermocouples, and the machinability of metals was investigated from the standpoint of cutting temperature. For machinability investigations during turning, an artificial thermocouple was placed just below the cutting tip of the insert; for drilling, thermocouples were placed through screw holes on the face surface. In this way a simple, reliable, economical and accurate method for investigating machinability was obtained.
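    Turning an artificial-thermocouple reading into a temperature requires a calibration curve; a sketch using piecewise-linear interpolation over a made-up calibration table (the paper does not give one; a real setup would use the calibration measured for the actual wire pair):

```python
import numpy as np

# Hypothetical calibration of an artificial thermocouple embedded under the
# insert tip: EMF (mV) versus junction temperature (degC).
CAL_EMF_MV = np.array([0.0, 4.1, 8.3, 12.6, 17.0, 21.5])
CAL_TEMP_C = np.array([0.0, 100.0, 200.0, 300.0, 400.0, 500.0])

def emf_to_temperature(emf_mv):
    """Convert a measured EMF to cutting temperature by piecewise-linear
    interpolation of the calibration table."""
    return float(np.interp(emf_mv, CAL_EMF_MV, CAL_TEMP_C))
```

    Logging the converted temperature against cutting speed or feed then gives the machinability comparison the abstract describes.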

  5. Essentiality of Temperature Management while Modeling and Analyzing Tires Contact Forces

    OpenAIRE

    Corollaro, Alfredo

    2014-01-01

    The influence of temperature on tire performance has been a subject of research for many years. It is well known that temperature affects both the grip level of the tire and its cornering stiffness. However, while the influence of temperature on grip level has been deeply investigated in different activities, the influence on cornering stiffness seems not to have been sufficiently discussed yet. As shown in this work, the reason could be that the cornering stiffness is not influ...

  6. Crossed, Small-Deflection Energy Analyzer for Wind/Temperature Spectrometer

    Science.gov (United States)

    Herrero, Federico A.; Finne, Theodore T.

    2010-01-01

    Determination of neutral winds and ion drifts in low-Earth-orbit missions requires measurements of the angular and energy distributions of the flux of neutrals and ions entering the satellite from the ram direction. The magnitude and direction of the neutral-wind (or ion-drift) determine the location of the maximum in the angular distribution of the flux. Knowledge of the angle of maximum flux with respect to satellite coordinates (pointing) is essential to determine the wind (or ion-drift) vector. The crossed Small-Deflection Energy Analyzer (SDEA) spectrometer (see Figure 1) occupies minimal volume and consumes minimal power. Designed for upper atmosphere/ionosphere investigations at Earth altitudes above 100 km, the spectrometer operates by detecting the angular and energy distributions of neutral atoms/molecules and ions in two mutually perpendicular planes. In this configuration, the two detection planes actually cross at the spectrometer center. It is possible to merge two SDEAs so they share a common optical axis and alternate measurements between two perpendicular planes, and reduce the number of ion sources from two to one. This minimizes the volume and footprint significantly and reduces the ion source power by a factor of two. The area of the entrance aperture affects the number of ions detected/second and also determines the energy resolution. Thermionic emitters require heater power of about 100 mW to produce 1 mA of electron beam current. Typically, electron energy is about 100 eV and requires a 100-V supply for electron acceleration to supply an additional 100 mW of power. Thus, ion source power is at most 200 mW. If two ion sources were to be used, the ion source power would be, at most, 400 mW. Detector power, deflection voltage power, and microcontroller and other functions require less than 150 mW. 
A WTS (wind/temperature spectrometer) with two separate optical axes would consume about 650 mW, while the crossed SDEA described here consumes about

  7. Extracting and Analyzing the Warming Trend in Global and Hemispheric Temperatures

    NARCIS (Netherlands)

    Estrada, Francisco; Perron, Pierre

    2017-01-01

    This article offers an updated and extended attribution analysis based on recently published versions of temperature and forcing datasets. It shows that both temperature and radiative forcing variables can be best represented as trend stationary processes with structural changes occurring in the

  8. Stress analysis and deformation prediction of sheet metal workpieces based on finite element simulation

    Directory of Open Access Journals (Sweden)

    Ren Penghao

    2017-01-01

    Full Text Available After machining of aluminum alloy sheet metal parts, the release of residual stress causes large deformations. To address this problem, this paper takes an aluminum alloy sheet aerospace workpiece as an example, establishes a theoretical model of elastic deformation and a finite element model, places a quantitative initial stress in each element of the machining area, and simulates the stress release and resulting deformation. Simulations with different initial stresses reveal a linear relationship between initial stress and deformation; simulations of coupled-direction stress release show that the deformation caused by coupled-direction stress is the superposition of the deformations caused by the single-direction stresses. The research results provide important theoretical support for stress-threshold setting and deformation control of workpieces in production practice.

  9. Elevated temperature forming method and preheater apparatus

    Science.gov (United States)

    Krajewski, Paul E; Hammar, Richard Harry; Singh, Jugraj; Cedar, Dennis; Friedman, Peter A; Luo, Yingbing

    2013-06-11

    An elevated temperature forming system in which a sheet metal workpiece is provided in a first stage position of a multi-stage pre-heater, is heated to a first stage temperature lower than a desired pre-heat temperature, is moved to a final stage position where it is heated to a desired final stage temperature, is transferred to a forming press, and is formed by the forming press. The preheater includes upper and lower platens that transfer heat into workpieces disposed between the platens. A shim spaces the upper platen from the lower platen by a distance greater than a thickness of the workpieces to be heated by the platens and less than a distance at which the upper platen would require an undesirably high input of energy to effectively heat the workpiece without being pressed into contact with the workpiece.
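    The dwell time such a platen pre-heater needs can be roughed out with a lumped-capacitance model of a thin sheet heated from both faces; the sheet properties and gap conductance below are illustrative assumptions, not values from the patent:

```python
import math

def contact_heating_time(thickness_m, rho, cp, h_eff,
                         t_platen, t_init, t_target):
    """Lumped-capacitance estimate of the dwell time for a thin sheet heated
    from both faces between hot platens:
        t = (rho * cp * L / (2 * h_eff)) * ln((Tp - Ti) / (Tp - Tf))
    where h_eff is an assumed effective gap conductance in W/(m^2.K)."""
    tau = rho * cp * thickness_m / (2.0 * h_eff)
    return tau * math.log((t_platen - t_init) / (t_platen - t_target))
```

    The logarithmic term shows why closing the last few degrees toward the platen temperature is slow, one practical reason for heating in stages.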

  10. Method for analyzing passive silicon carbide thermometry with a continuous dilatometer to determine irradiation temperature

    Science.gov (United States)

    Campbell, Anne A.; Porter, Wallace D.; Katoh, Yutai; Snead, Lance L.

    2016-03-01

    Silicon carbide is used as a passive post-irradiation temperature monitor because the irradiation defects will anneal out above the irradiation temperature. The irradiation temperature is determined by measuring a property change after isochronal annealing, i.e., lattice spacing, dimensions, electrical resistivity, thermal diffusivity, or bulk density. However, such methods are time-consuming since the steps involved must be performed in a serial manner. This work presents the use of thermal expansion from continuous dilatometry to calculate the SiC irradiation temperature, which is an automated process requiring minimal setup time. Analysis software was written that performs the calculations to obtain the irradiation temperature and removes possible user-introduced error while standardizing the analysis. This method has been compared to an electrical resistivity and isochronal annealing investigation, and the results revealed agreement of the calculated temperatures. These results show that dilatometry is a reliable and less time-intensive process for determining irradiation temperature from passive SiC thermometry.

  11. Model-based chatter stability prediction and detection for the turning of a flexible workpiece

    Science.gov (United States)

    Lu, Kaibo; Lian, Zisheng; Gu, Fengshou; Liu, Hunju

    2018-02-01

    Machining long slender workpieces still presents a technical challenge on the shop floor due to their low stiffness and damping. Regenerative chatter is a major hindrance in machining processes, reducing the geometric accuracy and dynamic stability of the cutting system. This study has been motivated by the fact that chatter occurrence is generally related to the cutting position in straight turning of slender workpieces, which has seldom been investigated comprehensively in the literature. In the present paper, a predictive chatter model of turning a tailstock-supported slender workpiece that accounts for the change of cutting position during machining is explored. Based on linear stability analysis and the stiffness distribution at different cutting positions along the workpiece, the effect of the cutting tool movement along the length of the workpiece on chatter stability is studied. As a result, an entire stability chart for a single cutting pass is constructed. Through this stability chart the critical cutting condition and the chatter onset location along the workpiece in a turning operation can be estimated. The difference between the predicted tool locations and the experimental results was within 9% at high cutting speed. Also, on the basis of the predictive model, the dynamic behavior during chatter can be inferred: when chatter arises at some cutting location it continues for a period of time until another specific location is reached. The experimental observation is in good agreement with this theoretical inference. With respect to chatter detection, besides the delay strategy and overlap-processing technique, a relative-threshold algorithm is proposed to detect chatter by comparing the spectrum and variance of the acquired acceleration signals with references saved during stable cutting. The chatter monitoring method has shown reliability under various machining conditions.
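    The relative-threshold idea, comparing variance and spectral content of incoming acceleration frames against a reference recorded during stable cutting, can be sketched as follows (the threshold factor k is an assumed tuning value; the paper's delay strategy and overlap processing are not shown):

```python
import numpy as np

def frame_features(accel):
    """Variance and windowed spectral peak of one acceleration frame."""
    spectrum = np.abs(np.fft.rfft(accel * np.hanning(accel.size)))
    return accel.var(), spectrum.max()

def is_chatter(frame, stable_reference, k=3.0):
    """Relative-threshold test in the spirit of the paper: flag chatter when
    both the variance and the spectral peak exceed k times the values saved
    during stable cutting (k = 3 is an assumption)."""
    v, p = frame_features(frame)
    v0, p0 = frame_features(stable_reference)
    return bool(v > k * v0 and p > k * p0)
```

    A frame containing a strong narrow-band chatter component trips both criteria, while another stable-cutting frame does not.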

  12. Influence of Workpiece Surface Topography on the Mechanisms of Liquid Lubrication in Strip Drawing

    DEFF Research Database (Denmark)

    Shimizu, I; Andreasen, Jan Lasson; Bech, Jakob Ilsted

    2001-01-01

The workpiece surface topography is an important factor controlling the mechanisms of lubrication in metal forming processes. In the present work, the microscopic lubrication mechanisms induced by lubricant trapped in pockets of the surface in strip drawing are studied. The experiments are performed…

  13. Diagnostics of flexible workpiece using acoustic emission, acceleration and eddy current sensors in milling operation

    Science.gov (United States)

    Filippov, A. V.; Tarasov, S. Yu.; Filippova, E. O.; Chazov, P. A.; Shamarin, N. N.; Podgornykh, O. A.

    2016-11-01

Monitoring of the deflection of an edge-clamped workpiece during milling has been carried out using acoustic emission, accelerometer and eddy current sensors. Such monitoring is necessary in the precision machining of vital parts used in aerospace engineering, the majority of which are made by milling. The applicability of the AE, accelerometer and eddy current sensors is discussed together with an analysis of measurement errors. An appropriate sensor installation diagram is proposed for measuring the workpiece elastic deflection exerted by the cutting force.

  14. Temperature variation in metal ceramic technology analyzed using time domain optical coherence tomography

    Science.gov (United States)

    Sinescu, Cosmin; Topala, Florin I.; Negrutiu, Meda Lavinia; Duma, Virgil-Florin; Podoleanu, Adrian G.

    2014-01-01

The quality of dental prostheses is essential in providing good quality medical services. The metal ceramic technology applied in dentistry implies ceramic sintering inside a dental oven. Every ceramic material requires a specific sintering chart recommended by the producer. For a regular dental technician it is very difficult to evaluate whether the temperature inside the oven remains as programmed on the sintering chart. Maintaining the calibration over time is also an issue for practitioners. Metal ceramic crowns develop a very characteristic pattern in the ceramic layers depending on the temperature variation inside the oven where they are processed. Different patterns were identified in the present study for samples processed with a temperature variation of +30 °C to +50 °C and −30 °C to −50 °C, respectively. The OCT evaluations performed on the normal samples show a uniform spread of the ceramic granulation inside the ceramic materials. For the samples sintered at a higher temperature, an alternation of white and darker areas between the enamel and opaque layers appears. For the samples sintered at a lower temperature, a decrease in ceramic granulation from the enamel towards the opaque layer is observed. The TD-OCT method can therefore be used efficiently to detect temperature variations during ceramic sintering inside the ceramic oven.

  15. Using basic metrics to analyze high-resolution temperature data in the subsurface

    Science.gov (United States)

    Shanafield, Margaret; McCallum, James L.; Cook, Peter G.; Noorduijn, Saskia

    2017-08-01

    Time-series temperature data can be summarized to provide valuable information on spatial variation in subsurface flow, using simple metrics. Such computationally light analysis is often discounted in favor of more complex models. However, this study demonstrates the merits of summarizing high-resolution temperature data, obtained from a fiber optic cable installation at several depths within a water delivery channel, into daily amplitudes and mean temperatures. These results are compared to fluid flux estimates from a one-dimensional (1D) advection-conduction model and to the results of a previous study that used a full three-dimensional (3D) model. At a depth of 0.1 m below the channel, plots of amplitude suggested areas of advective water movement (as confirmed by the 1D and 3D models). Due to lack of diurnal signal at depths below 0.1 m, mean temperature was better able to identify probable areas of water movement at depths of 0.25-0.5 m below the channel. The high density of measurements provided a 3D picture of temperature change over time within the study reach, and would be suitable for long-term monitoring in man-made environments such as constructed wetlands, recharge basins, and water-delivery channels, where a firm understanding of spatial and temporal variation in infiltration is imperative for optimal functioning.
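Reducing a high-resolution record to the two metrics is indeed computationally light. A minimal sketch with pandas, assuming the record is a time-indexed Series (the sampling frequency and values below are illustrative, not the study's data):

```python
import numpy as np
import pandas as pd

def daily_metrics(temps: pd.Series) -> pd.DataFrame:
    """Collapse a high-resolution temperature record into the two screening
    metrics used in the study: daily mean and daily amplitude (max - min)."""
    daily = temps.resample("D")
    return pd.DataFrame({
        "mean": daily.mean(),
        "amplitude": daily.max() - daily.min(),
    })

# Synthetic two-day hourly record with a 10-degree diurnal swing around 20 degC.
idx = pd.date_range("2017-08-01", periods=48, freq="h")
temps = pd.Series(20.0 + 5.0 * np.sin(2 * np.pi * np.arange(48) / 24.0), index=idx)
metrics = daily_metrics(temps)
```

A damped amplitude with depth then suggests conduction-dominated transport, while elevated daily means at depth point to advective water movement, as in the study.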

  16. Analyzing the electrophysiological effects of local epicardial temperature in experimental studies with isolated hearts

    International Nuclear Information System (INIS)

    Tormos, Alvaro; Millet, José; Guill, Antonio; Chorro, Francisco J; Cánoves, Joaquín; Mainar, Luis; Such, Luis; Alberola, Antonio; Trapero, Isabel; Such-Miquel, Luis

    2008-01-01

    As a result of their modulating effects upon myocardial electrophysiology, both hypo- and hyperthermia can be used to study the mechanisms that generate or sustain cardiac arrhythmias. The present study describes an original electrode developed with thick-film technology and capable of controlling regional temperature variations in the epicardium while simultaneously registering its electrical activity. In this way, it is possible to measure electrophysiological parameters of the heart at different temperatures. The results obtained with this device in a study with isolated and perfused rabbit hearts are reported. An exploration has been made of the effects of local temperature changes upon the electrophysiological parameters implicated in myocardial conduction. Likewise, an analysis has been made of the influence of local temperature upon ventricular fibrillation activation frequency. It is concluded that both regional hypo- and hyperthermia exert reversible and opposite effects upon myocardial refractoriness and conduction velocity in the altered zone. The ventricular activation wavelength determined during constant pacing at 250 ms cycles is not significantly modified, however. During ventricular fibrillation, the changes in the fibrillatory frequency do not seem to be transmitted to normal temperature zones

  17. Specific Features of Chip Making and Work-piece Surface Layer Formation in Machining Thermal Coatings

    Directory of Open Access Journals (Sweden)

    V. M. Yaroslavtsev

    2016-01-01

Full Text Available Wear- and erosion-resistant high-temperature coatings made by thermal spraying methods are characterized by a wide range of unique structural and performance properties inherent in metallic composites. This allows their use both in manufacturing processes, to enhance the wear strength of products that have to operate under cyclic loading, high contact pressures, corrosion and high temperatures, and in product renewal. Thermal coatings contribute to a qualitative improvement of the technical level of production and of product restoration using ceramic composite materials. However, the possibility of significantly increased product performance and of reduced labour hours and materials/output ratio in manufacturing and restoration largely depends on the surface layer quality of products at the finishing stage, which is usually provided by various kinds of machining. When machining plasma-sprayed thermal coatings, the removal of the cut-off layer is determined by distinctive features of the material such as a layered structure, high internal stresses, low ductility, a strong tendency of the surface layer to strengthening and rehardening, porosity, high abrasive properties, etc. These coating properties result in specific characteristics of chip formation and in particular conditions for the formation of the billet surface layer. The chip formation of plasma-sprayed coatings was studied at micro-velocities using an experimental tool-setting microscope-based setup created at BMSTU. The setup allowed simultaneous recording of both the individual stages (phases) of the chip formation process and the operating force factors. It is found that the formation of individual chip elements is accompanied by multiple micro-cracks that cause chipping-off of small particles of material. The main crack emerging in the cut-off layer leads to separation of the largest chip element. Then all the stages…

  18. Effect of changing polarity of graphite tool/ Hadfield steel workpiece couple on machining performances in die sinking EDM

    Directory of Open Access Journals (Sweden)

    Özerkan Haci Bekir

    2017-01-01

Full Text Available In this study, machining performance output parameters such as machined surface roughness (SR), material removal rate (MRR) and tool wear rate (TWR) were experimentally examined and analyzed under varying machining parameters in electrical discharge machining (EDM). The input parameters of this research are tool material, peak current (I), pulse duration (ton) and pulse interval (toff). The experimental machining runs were carried out on Hadfield steel workpieces using prismatic and cylindrical graphite electrodes with a kerosene dielectric at different machining current, polarity and pulse time settings. The experiments have shown that the type of tool material, polarity (direct polarity gives higher MRR, SR and TWR), current (high current lowers TWR and enhances MRR) and pulse on time (ton = 48 μs is a critical threshold value for MRR and TWR) were influential on machining performance in electrical discharge machining.

  19. Abnormal Condition Monitoring of Workpieces Based on RFID for Wisdom Manufacturing Workshops

    Directory of Open Access Journals (Sweden)

    Cunji Zhang

    2015-12-01

Full Text Available Radio Frequency Identification (RFID) technology has been widely used in many fields. However, previous studies have mainly focused on product life cycle tracking, and there are few studies on real-time status monitoring of workpieces in manufacturing workshops. In this paper, a wisdom manufacturing model is introduced, a sensing-aware environment for a wisdom manufacturing workshop is constructed, and RFID event models are defined. A synthetic data cleaning method is applied to clean the raw RFID data. The Complex Event Processing (CEP) technology is adopted to monitor abnormal conditions of workpieces in real time. The RFID data cleaning method and data mining technology are examined by simulation and physical experiments. The results show that the synthetic data cleaning method preprocesses data well. The CEP based on the Rifidi® Edge Server technology completed abnormal condition monitoring of workpieces in real time. This paper reveals the importance of RFID spatial and temporal data analysis in real-time status monitoring of workpieces in wisdom manufacturing workshops.
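One of the simplest abnormal-condition rules a CEP engine evaluates over the cleaned read stream is a timeout: a workpiece that no antenna has seen recently is flagged. The sketch below is a minimal stand-in for such a rule, not the paper's Rifidi® Edge Server implementation; the tag names and timeout are assumptions.

```python
def stalled_workpieces(last_seen, now, timeout_s=300.0):
    """Return the tags of workpieces whose most recent RFID read is older
    than `timeout_s` seconds -- a minimal 'abnormal condition' event of the
    kind a CEP rule would raise from the cleaned read stream.

    last_seen: dict mapping tag id -> timestamp (seconds) of last read.
    now: current timestamp in the same clock.
    """
    return sorted(tag for tag, t in last_seen.items() if now - t > timeout_s)
```

A production rule set would layer further events on top (wrong-zone reads, out-of-order process steps), but they all reduce to predicates over the same timestamped read stream.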

  20. Abnormal Condition Monitoring of Workpieces Based on RFID for Wisdom Manufacturing Workshops

    Science.gov (United States)

    Zhang, Cunji; Yao, Xifan; Zhang, Jianming

    2015-01-01

    Radio Frequency Identification (RFID) technology has been widely used in many fields. However, previous studies have mainly focused on product life cycle tracking, and there are few studies on real-time status monitoring of workpieces in manufacturing workshops. In this paper, a wisdom manufacturing model is introduced, a sensing-aware environment for a wisdom manufacturing workshop is constructed, and RFID event models are defined. A synthetic data cleaning method is applied to clean the raw RFID data. The Complex Event Processing (CEP) technology is adopted to monitor abnormal conditions of workpieces in real time. The RFID data cleaning method and data mining technology are examined by simulation and physical experiments. The results show that the synthetic data cleaning method preprocesses data well. The CEP based on the Rifidi® Edge Server technology completed abnormal condition monitoring of workpieces in real time. This paper reveals the importance of RFID spatial and temporal data analysis in real-time status monitoring of workpieces in wisdom manufacturing workshops. PMID:26633418

  1. A case study of analyzing 11th graders’ problem solving ability on heat and temperature topic

    Science.gov (United States)

    Yulianawati, D.; Muslim; Hasanah, L.; Samsudin, A.

    2018-05-01

Problem solving ability must be acquired by students through the process of physics learning so that the concepts of physics become meaningful. Consequently, this research aims to describe their problem solving ability. Metacognition contributes to physics learning and to students' success in solving problems. The research was carried out with 37 science students (30 women and 7 men) of eleventh grade from one of the secondary schools in Bandung. The research method is a single case study with an embedded research design. The instrument is the Heat and Temperature Problem Solving Ability Test (HT-PSAT), which consists of twelve questions from three context problems. The result shows that the average score on the test is 8.27 out of a maximum total of 36. In conclusion, eleventh graders' problem-solving ability is still below expectations. The implication of the findings is the need to create learning situations that develop better problem solving ability in students.

  2. Analyzing the Radiation Properties of High-Z Impurities in High-Temperature Plasmas

    International Nuclear Information System (INIS)

    Reinke, M. L.; Ince-Cushman, A.; Podpaly, Y.; Rice, J. E.; Bitter, M.; Hill, K. W.; Fournier, K. B.; Gu, M. F.

    2009-01-01

Most tokamak-based reactor concepts require the use of noble gases to form either a radiative mantle or divertor to reduce conductive heat exhaust to tolerable levels for plasma facing components. Predicting the power loss necessary from impurity radiation is done using electron temperature-dependent 'cooling curves' derived from ab initio atomic physics models. We present here a technique to verify such modeling using highly radiative, argon-infused discharges on Alcator C-Mod. A novel x-ray crystal imaging spectrometer is used to measure spatially resolved profiles of line emissivity, constraining impurity transport simulations. Experimental data from soft x-ray diodes, bare AXUV diodes and foil bolometers are used to determine the local emissivity in three overlapping spectral bands, which are quantitatively compared to models. Comparison of broadband measurements shows agreement between experiment and modeling in the core, but not over the entire profile, with the differences likely due to errors in the assumed radial impurity transport outside of the core. Comparison of Ar 16+ x-ray line emission modeling to measurements suggests an additional problem with the collisional-radiative modeling of that charge state.

  3. Pregnant women models analyzed for RF exposure and temperature increase in 3T RF shimmed birdcages.

    Science.gov (United States)

    Murbach, Manuel; Neufeld, Esra; Samaras, Theodoros; Córcoles, Juan; Robb, Fraser J; Kainz, Wolfgang; Kuster, Niels

    2017-05-01

MRI is increasingly used to scan pregnant patients. We investigated the effect of 3 Tesla (T) two-port radiofrequency (RF) shimming in anatomical pregnant women models. RF shimming improves B1+ uniformity, but may at the same time significantly alter the induced current distribution and result in large changes in both the level and location of the absorbed RF energy. In this study, we evaluated the electrothermal exposure of pregnant women in the third, seventh, and ninth month of gestation at various imaging landmarks in RF body coils, including modes with RF shimming. Although RF shimmed configurations may lower the local RF exposure for the mother, they can increase the thermal load on the fetus. In worst-case configurations, whole-body exposure and local peak temperatures (up to 40.8°C) are equal in fetus and mother. Two-port RF shimming can significantly increase the fetal exposure in pregnant women, requiring further research to derive robust safety management. For the time being, restriction to the CP mode, which reduces fetal SAR exposure compared with linear-horizontal polarization modes, may be advisable. Results from this study do not support scanning pregnant patients above the normal operating mode. Magn Reson Med 77:2048-2056, 2017. © 2016 International Society for Magnetic Resonance in Medicine.

  4. A study on the nuclear fusion reactor - Development of the neutral particle analyzer for the measurement of plasma temperature

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Hee Dong [Kyungpook National University, Taegu (Korea, Republic of); Kim, Do Sung [Taegu University, Taegu (Korea, Republic of)

    1996-09-01

For measurements of the plasma ion temperature of the KT-1 tokamak, a charge exchange neutral particle analyzer (NPA) was built. The NPA contains a stripping cell, a cylindrical electrostatic-plate energy analyzer, and a detector. The stripping cell has three beam paths: one is empty, one is covered with Ni mesh, and the third is covered with Ni mesh and a carbon foil. The Ni mesh has 70 lines/inch and the carbon foil is 50 Å thick. The radii of the cylindrical plates of the energy analyzer are 112 mm and 95 mm, and the plate height is 50 mm. The plate voltage is 0-1 kV. The ion and neutral particle detectors are channeltrons (Galileo 4839). 36 refs., 1 tab., 43 figs. (author)

  5. Estimating the workpiece-backingplate heat transfer coefficient in friction stirwelding

    DEFF Research Database (Denmark)

    Larsen, Anders; Stolpe, Mathias; Hattel, Jesper Henri

    2012-01-01

Purpose - The purpose of this paper is to determine the magnitude and spatial distribution of the heat transfer coefficient between the workpiece and the backingplate in a friction stir welding process using inverse modelling. Design/methodology/approach - The magnitude and distribution of the heat transfer coefficient are determined in an inverse modelling approach… © Emerald Group Publishing Limited.

  6. Accuracy of Setting Work-pieces on Automatic Lines with Pallet-fixtures

    Directory of Open Access Journals (Sweden)

    L. A. Kolesnikov

    2015-01-01

Full Text Available The accuracy of positioning surfaces to be processed on automatic lines with pallet-fixtures essentially depends on the setting error of the pallet-fixtures with work-pieces in the ready-to-work position. The currently applied methods for calculating the setting error do not give a complete picture of the possible coordinates of a point when the pallet is displaced in different directions. The aim of this work was to determine, by a computational-analytical method, the accuracy of setting work-pieces on automatic lines with pallet-fixtures, in order to improve the manufacturing precision of parts. The paper offers a method of equivalent mechanism to capture the whole variety of displacements in the horizontal plane with diverse combinations of angular and plane-parallel displacements. Using a four-bar linkage as the equivalent mechanism allows us to define the zone of possible positions of any point of the work-piece pallet platform as the zone bounded by the coupler curve. When the gaps in the nodes of the two fixtures are equal, the zone of possible positions of a point is bounded by a circle for parallel displacement of the platform and by an ellipse for angular displacement. The obtained analytical dependences allow the error to be determined at the design stage for given gaps in the fixture nodes. The above method of calculation makes it possible to define a zone of appropriate placement of the work-piece on its platform, for the specified pallet parameters, that meets the conditions for ensuring the coordinate accuracy of the machined hole axes.

  7. The influence of mechanical properties of workpiece material on the main cutting force in face milling

    Directory of Open Access Journals (Sweden)

    M. Sekulić

    2010-10-01

Full Text Available The paper presents research into cutting forces in face milling of three different materials: steel Č 4732 (EN 42CrMo4), nodular cast iron NL500 (EN-GJS-500-7) and silumin AlSi10Mg (EN AC-AlSi10Mg). The obtained results show that the hardness and tensile strength of the workpiece material have a significant influence on the main cutting force, and thereby on the cutting energy in machining.

  8. Stress analysis and deformation prediction of sheet metal workpieces based on finite element simulation

    OpenAIRE

    Ren Penghao; Wang Aimin; Wang Xiaolong; Zhang Yanlin

    2017-01-01

After the machining of aluminum alloy sheet metal parts, residual stress release causes large deformations. To solve this problem, this paper takes an aluminum alloy sheet aerospace workpiece as an example, establishes a theoretical model of elastic deformation and a finite element model, places a quantitative initial stress in each element of the machining area, and simulates the stress release and the resulting deformation. Through simulative analyses of deformation under different initial stress releases …

  9. A Precise Visual Method for Narrow Butt Detection in Specular Reflection Workpiece Welding

    Directory of Open Access Journals (Sweden)

    Jinle Zeng

    2016-09-01

    Full Text Available During the complex path workpiece welding, it is important to keep the welding torch aligned with the groove center using a visual seam detection method, so that the deviation between the torch and the groove can be corrected automatically. However, when detecting the narrow butt of a specular reflection workpiece, the existing methods may fail because of the extremely small groove width and the poor imaging quality. This paper proposes a novel detection method to solve these issues. We design a uniform surface light source to get high signal-to-noise ratio images against the specular reflection effect, and a double-line laser light source is used to obtain the workpiece surface equation relative to the torch. Two light sources are switched on alternately and the camera is synchronized to capture images when each light is on; then the position and pose between the torch and the groove can be obtained nearly at the same time. Experimental results show that our method can detect the groove effectively and efficiently during the welding process. The image resolution is 12.5 μm and the processing time is less than 10 ms per frame. This indicates our method can be applied to real-time narrow butt detection during high-speed welding process.

  10. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Directory of Open Access Journals (Sweden)

    Yin Peili

    2017-08-01

Full Text Available Verification of the validity and correctness of measuring software has been a thorny issue hindering the development of the Gear Measuring Instrument (GMI). The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI) to independently validate the measuring software. A triangular patch model with accurately controlled precision is taken as the virtual workpiece and a universal collision detection model is established. The whole workpiece measurement process is simulated with the VGMI replacing the GMI, and the measuring software is tested in the proposed virtual environment. Taking the involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on an involute master in a GMI. The experimental results indicate consistency of the tooth profile deviation and calibration results, thus verifying the accuracy of the gear measuring system, including the measurement procedures. It is shown that the VGMI can be applied in the validation of measuring software, providing a new and ideal platform for testing complex workpiece-measuring software without calibrated artifacts.

  11. Effect of cutting parameters on workpiece and tool properties during drilling of Ti-6Al-4V

    International Nuclear Information System (INIS)

    Celik, Yahya Hisman; Yildiz, Hakan

    2016-01-01

The main aim of machining is to provide the dimensional preciseness together with the surface and geometric quality of the workpiece to be manufactured within the desired limits. Today, it is quite hard to drill the widely utilized Ti-6Al-4V alloys owing to their superior features. Therefore, in this study, the temperature, chip formation, thrust forces, surface roughness, burr heights, hole diameter deviations and tool wear in the drilling of Ti-6Al-4V were investigated under dry cutting conditions with different cutting speeds and feed rates, using tungsten carbide (WC) and high speed steel (HSS) drills. Moreover, mathematical models of thrust force, surface roughness, burr height and tool wear were formed using Matlab. It was found that the feed rate, cutting speed and type of drill have a major effect on the thrust forces, surface roughness, burr heights, hole diameter deviations and tool wear. Optimum results in the Ti-6Al-4V alloy drilling process were obtained using the WC drill.
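The abstract does not give the functional form of the Matlab models. A common empirical choice for drilling responses such as thrust force is a power law in cutting speed and feed rate, fitted by least squares in log space; the sketch below assumes that form for illustration only.

```python
import numpy as np

def fit_power_law(speeds, feeds, forces):
    """Fit the empirical model F = C * v^a * f^b by ordinary least squares
    in log space. The model form is an assumption; the paper's actual terms
    are not given in the abstract."""
    v, f, F = (np.asarray(x, dtype=float) for x in (speeds, feeds, forces))
    # log F = log C + a*log v + b*log f  -->  linear in the unknowns.
    A = np.column_stack([np.ones_like(v), np.log(v), np.log(f)])
    coef, *_ = np.linalg.lstsq(A, np.log(F), rcond=None)
    return np.exp(coef[0]), coef[1], coef[2]  # C, a, b

# Synthetic check data generated from known exponents.
v = np.array([10.0, 20.0, 30.0, 10.0, 20.0, 30.0])
f = np.array([0.05, 0.05, 0.05, 0.10, 0.10, 0.10])
F = 100.0 * v ** -0.2 * f ** 0.7
C, a, b = fit_power_law(v, f, F)
```

The same fitting routine applies unchanged to surface roughness or burr height as the response variable.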

  12. Effect of cutting parameters on workpiece and tool properties during drilling of Ti-6Al-4V

    Energy Technology Data Exchange (ETDEWEB)

    Celik, Yahya Hisman; Yildiz, Hakan [Batman Univ. (Turkey). Dept. of Mechanical Engineering; Oezek, Cebeli [Firat Univ., Elazig (Turkey)

    2016-08-01

The main aim of machining is to provide the dimensional preciseness together with the surface and geometric quality of the workpiece to be manufactured within the desired limits. Today, it is quite hard to drill the widely utilized Ti-6Al-4V alloys owing to their superior features. Therefore, in this study, the temperature, chip formation, thrust forces, surface roughness, burr heights, hole diameter deviations and tool wear in the drilling of Ti-6Al-4V were investigated under dry cutting conditions with different cutting speeds and feed rates, using tungsten carbide (WC) and high speed steel (HSS) drills. Moreover, mathematical models of thrust force, surface roughness, burr height and tool wear were formed using Matlab. It was found that the feed rate, cutting speed and type of drill have a major effect on the thrust forces, surface roughness, burr heights, hole diameter deviations and tool wear. Optimum results in the Ti-6Al-4V alloy drilling process were obtained using the WC drill.

  13. Deformation micro-mechanism for compression of magnesium alloys at room temperature analyzed by electron backscatter diffraction

    International Nuclear Information System (INIS)

    Song, G.S.; Chen, Q.Q.; Zhang, S.H.; Xu, Y.

    2015-01-01

Highlights: • In-situ tracking of the evolution of grain orientation in magnesium alloy was carried out by EBSD. • The distribution of twin bands was closely related to the activation of extension twin variants. • Activation of extension twins significantly changes the order of the Schmid factors of slip systems. • Pyramidal slips become the dominant deformation mode at the late stage of compression. - Abstract: In-situ tracking of the evolution of grain orientation in rolled magnesium alloy sheets compressed uniaxially at room temperature was carried out by electron backscatter diffraction (EBSD); meanwhile, the distribution of twin bands and the activation of twins and slips were also analyzed. The results show that the distribution of twin bands is closely related to the activation of extension twin variants. The activation of extension twins significantly changes the order of the Schmid factors of different slip systems, and accordingly affects the activation of slips during subsequent deformation

  14. Thermally joining and/or coating or thermally separating the workpieces having heat-sensitive coating, comprises restoring coating by thermally coating the coating material after thermally joining and/or coating or thermally separating

    OpenAIRE

    Riedel, Frank; Winkelmann, Ralf; Puschmann, Markus

    2011-01-01

    The method for thermally joining and/or coating or thermally separating the workpieces (1), which have a heat-sensitive coating (2), comprises restoring the coating by thermally coating a coating material (3) after thermally joining and/or coating or thermally separating the workpieces. A part of the thermal energy introduced in the workpiece for joining and/or coating or separating or in the workpieces is used for thermally coating the coating material. Two workpieces are welded or soldered ...

  15. A novel approach for analyzing glass-transition temperature vs. composition patterns: application to pharmaceutical compound+polymer systems.

    Science.gov (United States)

    Kalogeras, Ioannis M

    2011-04-18

In medicine, polymer-based materials are commonly used as excipients of poorly water-soluble drugs. The success of the encapsulation, as well as the physicochemical stability of the products, is often reflected in their glass transition temperature (T(g)) vs. composition (w) dependencies. The shape of the T(g)(w) patterns is critically influenced by the polymer's molecular mass, the drug molecule's shape and molecular volume, the type and degree of shielding of hydrogen-bonding-capable functional groups, as well as aspects of the preparation process. By altering the mixture's T(g), the amorphous solid form of the active ingredient may be retained at ambient or body temperatures, with concomitant improvements in handling, solubility, dissolution rate and oral bioavailability. Given the importance of the problem, the glass transitions observed in pharmaceutical mixtures have been extensively analyzed, aiming to appraise the state of mixing and intermolecular interactions. Here, accumulated experimental information on related systems is re-evaluated and comparably discussed in the light of a more effective and system-inclusive T(g)(w) equation. The present analysis indicates that free volume modifications and conformational changes of the macromolecular chains dominate, over enthalpic effects of mixing, in determining thermal characteristics and crystallization inhibition/retardation. Moreover, hydrogen-bonding and ion-dipole heterocontacts, although favoring a higher degree of mixing, appear less significant compared to the steric hindrances and the antiplasticization conferred by the higher-viscosity component. Copyright © 2011 Elsevier B.V. All rights reserved.
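The paper's "system-inclusive" T(g)(w) equation is not reproduced in the abstract. As a baseline for comparison, the classic Gordon-Taylor form, against which such extended equations are usually benchmarked, can be sketched as follows (the numerical T(g) values below are illustrative, not data from the study):

```python
def gordon_taylor_tg(w1, tg1, tg2, k):
    """Classic Gordon-Taylor estimate of a binary mixture's glass
    transition temperature:

        Tg(w1) = (w1*Tg1 + k*w2*Tg2) / (w1 + k*w2),  w2 = 1 - w1

    w1: weight fraction of component 1 (e.g. the drug);
    tg1, tg2: component glass transitions in kelvin;
    k: fitting constant reflecting the components' volume/expansivity ratio.
    """
    w2 = 1.0 - w1
    return (w1 * tg1 + k * w2 * tg2) / (w1 + k * w2)
```

Positive or negative deviations of measured T(g)(w) data from this curve are what signal the specific interactions and free-volume effects discussed in the abstract.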

  16. Optimization of control parameters for SR in EDM injection flushing type on stainless steel 304 workpiece

    International Nuclear Information System (INIS)

    Reza, M S; Yusoff, A R; Shaharun, M A

    2012-01-01

The operating control parameters of the injection flushing type of electrical discharge machining process on stainless steel 304 workpieces with copper tools are optimized according to an individual machining characteristic, i.e. surface roughness (SR). Higher SR during the EDM machining process results in poor surface integrity of the workpiece. Hence, the quality characteristic for SR is set to lower-the-better to achieve the optimum surface integrity. The Taguchi method has been used for the construction, layout and analysis of the experiment for this machining characteristic. The use of the Taguchi method saves a lot of time and cost in machining the experiment samples. An L18 orthogonal array, a fundamental component in the statistical design of experiments, has been used to plan the experiments, and Analysis of Variance (ANOVA) is used to determine the optimum machining parameters for this machining characteristic. The control parameters selected for these optimization experiments are polarity, pulse on duration, discharge current, discharge voltage, machining depth, machining diameter and dielectric liquid pressure. The results show that the smaller the machining diameter, the lower the SR.
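The lower-the-better quality characteristic maps onto a Taguchi signal-to-noise ratio that is maximized at the optimum parameter levels. A minimal sketch of that ratio (the roughness values below are illustrative):

```python
import numpy as np

def sn_lower_the_better(responses):
    """Taguchi lower-the-better signal-to-noise ratio,

        S/N = -10 * log10(mean(y^2)),

    computed over the replicate responses y of one experimental run.
    The optimum level of each control parameter is the one whose runs
    give the highest mean S/N."""
    y = np.asarray(responses, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))
```

In an L18 analysis, this ratio is computed per run, averaged per factor level, and the level means feed the ANOVA that ranks the parameters' contributions.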

  17. Thermodynamic study of residual heat from a high temperature nuclear reactor to analyze its viability in cogeneration processes

    International Nuclear Information System (INIS)

    Santillan R, A.; Valle H, J.; Escalante, J. A.

    2015-09-01

    This paper presents a thermodynamic study of a high-temperature gas-turbine nuclear power plant (GTHTR300) to estimate the waste heat that can be exploited in a seawater desalination process. Nuclear energy is one of the most studied and viable sustainable options for producing electricity without greenhouse gas emissions. Fourth-generation nuclear power plants have greater advantages than currently installed plants; these advantages concern safety, increased efficiency and the feasibility of coupling to cogeneration processes. The GTHTR300 plant type was selected for its higher efficiency and optimal conditions for cogeneration, owing to its high operating temperatures of between 700 and 950 degrees Celsius. The aim of the study is to determine the heat losses and the work done at each stage of the system, locating the greatest losses and analyzing in which processes they can be exploited. The study found that most of the energy losses occur as heat in the coolers, which is usually emitted into the atmosphere without being used. Based on the results, a seawater desalination process is proposed as the cogeneration process. The paper contains a brief description of the operation of the nuclear power plant, focusing on the operating conditions and thermodynamic characteristics relevant to implementing cogeneration, together with a thermodynamic analysis based on mass and energy balances. The results allow the thermal energy losses to be quantified and the optimal section for coupling the reactor to the desalination process to be determined, with the aim of achieving a high overall efficiency. (Author)

  18. Numerical simulation of the combination effect of external magnetic field and rotating workpiece on abrasive flow finishing

    Energy Technology Data Exchange (ETDEWEB)

    Kheradmand, Saeid; Esmailian, Mojtaba; Fatahy, A. [Malek-Ashtar University of Technology (MUT), Isfahan (Iran, Islamic Republic of)

    2017-04-15

    Workpiece finishing is a key production process that affects both quality and lifetime, and nanometer-scale finishing is now a central industrial demand. New finishing processes, such as abrasive flow finishing, have been introduced to meet this demand, and they may be aided by rotating the workpiece and imposing a magnetic field. Numerical simulation of this process can reduce cost and predict the result in minimal time. Accordingly, in this study magnetorheological fluid finishing is simulated numerically. The working medium contains magnetic and abrasive particles blended in a base fluid. Several hydrodynamic parameters and the variation of surface roughness are studied. It is found that combining workpiece rotation with an imposed magnetic field can improve the surface roughness by up to 15 percent.

  19. DIAGNOSTICS OF WORKPIECE SURFACE CONDITION BASED ON CUTTING TOOL VIBRATIONS DURING MACHINING

    Directory of Open Access Journals (Sweden)

    Jerzy Józwik

    2015-05-01

    Full Text Available The paper presents functional relationships between surface geometry parameters, feed and the vibration level in the radial direction of the workpiece. Time characteristics of the cutting tool vibration acceleration registered during machining of C45 steel and stainless steel for separate axes (X, Y, Z) are presented as a function of feed rate f. During the tests, surface geometric accuracy was assessed and 3D surface roughness parameters were determined. The Sz parameter was selected for the analysis and then collated with the RMS vibration acceleration and feed rate f. The Sz parameter indirectly provides information on the peak-to-valley height and is characterised by high generalising potential, i.e. it is highly correlated with other surface and volume parameters of surface roughness. The test results presented in this paper may constitute a valuable source of information concerning the influence of vibrations on the geometric accuracy of elements for engineers designing technological processes.
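    The RMS vibration acceleration that the authors collate with Sz is computed directly from the sampled time signal of each axis; a minimal sketch with made-up acceleration samples (not the paper's data):

```python
import math

def rms(signal):
    """Root-mean-square of a sampled acceleration signal."""
    return math.sqrt(sum(a * a for a in signal) / len(signal))

# Hypothetical single-axis acceleration samples (m/s^2)
samples = [0.5, -1.2, 0.8, -0.3, 1.0]
print(round(rms(samples), 4))
```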

  20. T-Spline Based Unifying Registration Procedure for Free-Form Surface Workpieces in Intelligent CMM

    Directory of Open Access Journals (Sweden)

    Zhenhua Han

    2017-10-01

    Full Text Available With the development of the modern manufacturing industry, free-form surfaces are widely used in various fields, and the automatic detection of a free-form surface is an important function of future intelligent three-coordinate measuring machines (CMMs). To improve the intelligence of CMMs, a new vision system is designed based on the characteristics of CMMs. A unified model of the free-form surface is proposed based on T-splines, together with a discretization method for the T-spline surface formula model. Under this discretization, the position and orientation of the workpiece can be recognized by point cloud registration. A high-accuracy method is proposed for evaluating the deviation between the measured point cloud and the T-spline surface formula. The experimental results demonstrate that the proposed method has the potential to realize the automatic detection of different free-form surfaces and improve the intelligence of CMMs.
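    The paper's registration procedure is T-spline based; as a generic illustration of the underlying rigid point-cloud registration step (not the authors' exact algorithm), the least-squares rotation and translation between two sets of corresponding points can be recovered with the SVD-based Kabsch method:

```python
import numpy as np

def rigid_register(P, Q):
    """Least-squares rigid transform (R, t) mapping point set P onto Q,
    via the SVD-based Kabsch method. Rows of P and Q are corresponding
    3D points; returns R (3x3 rotation) and t (3-vector)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                 # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

# Recover a known pose: rotate/translate a small point set, then register
P = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]], float)
th = 0.5
Rtrue = np.array([[np.cos(th), -np.sin(th), 0],
                  [np.sin(th),  np.cos(th), 0],
                  [0,           0,          1]])
ttrue = np.array([0.3, -0.2, 0.7])
Q = P @ Rtrue.T + ttrue
R, t = rigid_register(P, Q)
print(np.allclose(R, Rtrue) and np.allclose(t, ttrue))
```

    In practice the correspondences are unknown, so this solver is iterated inside an ICP-style loop with nearest-point matching against the discretized T-spline surface.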

  1. Characteristics of Speed Line Cutter and Fringe Analysis of Workpiece Surface

    Directory of Open Access Journals (Sweden)

    Shuai Wang

    2014-02-01

    Full Text Available Being easy to operate and offering high machining cost-performance, the speed line cutter is very popular among users. The precision of the guide rails, screws and nuts used in most of these machines is not high; the machine control cannot compensate for screw pitch error, transmission clearance or machining error due to electrode wear; and the control signal may also be lost during the control process. Development of the speed line cutter therefore focuses on the quality and machining stability of the CNC speed line cutter. This article analyses the impact of the machine's inherent characteristics on the machined workpiece surface and concludes that the irregular fringes should be analyzed in order to improve the machining precision.

  2. Reducing workpieces to their base geometry for multi-step incremental forming using manifold harmonics

    Science.gov (United States)

    Carette, Yannick; Vanhove, Hans; Duflou, Joost

    2018-05-01

    Single Point Incremental Forming is a flexible process that is well-suited for small batch production and rapid prototyping of complex sheet metal parts. The distributed nature of the deformation process and the unsupported sheet imply that controlling the final accuracy of the workpiece is challenging. To improve the process limits and the accuracy of SPIF, the use of multiple forming passes has been proposed and discussed by a number of authors. Most methods use multiple intermediate models, where the previous one is strictly smaller than the next one, while gradually increasing the workpieces' wall angles. Another method that can be used is the manufacture of a smoothed-out "base geometry" in the first pass, after which more detailed features can be added in subsequent passes. In both methods, the selection of these intermediate shapes is freely decided by the user. However, their practical implementation in the production of complex freeform parts is not straightforward. The original CAD model can be manually adjusted or completely new CAD models can be created. This paper discusses an automatic method that is able to extract the base geometry from a full STL-based CAD model in an analytical way. Harmonic decomposition is used to express the final geometry as the sum of individual surface harmonics. It is then possible to filter these harmonic contributions to obtain a new CAD model with a desired level of geometric detail. This paper explains the technique and its implementation, as well as its use in the automatic generation of multi-step geometries.
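    The idea of keeping only low-order harmonics to obtain a smoothed base geometry can be illustrated in one dimension, where ordinary Fourier harmonics play the role of the manifold harmonics used on the STL mesh; a sketch under that simplifying assumption, not the authors' 3D implementation:

```python
import numpy as np

def base_geometry(profile, keep):
    """Keep only the lowest `keep` harmonics of a sampled periodic profile,
    discarding high-frequency detail -- the 1D analogue of extracting a
    'base geometry' by filtering manifold-harmonic contributions."""
    coeffs = np.fft.rfft(profile)
    coeffs[keep:] = 0.0                      # zero out detail harmonics
    return np.fft.irfft(coeffs, n=len(profile))

x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
full = np.sin(x) + 0.1 * np.sin(20 * x)      # smooth shape + fine detail
base = base_geometry(full, keep=5)
print(np.allclose(base, np.sin(x), atol=1e-6))
```

    Raising `keep` restores progressively more detail, which mirrors how subsequent forming passes add detailed features on top of the base geometry.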

  3. Modeling of Principal Flank Wear: An Empirical Approach Combining the Effect of Tool, Environment and Workpiece Hardness

    Science.gov (United States)

    Mia, Mozammel; Al Bashir, Mahmood; Dhar, Nikhil Ranjan

    2016-10-01

    Hard turning is increasingly employed in machining to replace the time-consuming sequence of conventional turning followed by grinding. An excessive amount of tool wear in hard turning is one of the main hurdles to be overcome. Many researchers have developed tool wear models, but most apply only to a particular work-tool-environment combination; no aggregate model has been developed that can predict the amount of principal flank wear for a specific machining time. An empirical model of principal flank wear (VB) has been developed for different workpiece hardnesses (HRC40, HRC48 and HRC56) in turning by coated carbide inserts with different configurations (SNMM and SNMG) under both dry and high-pressure coolant conditions. Unlike other models, this model includes dummy variables along with the base empirical equation to capture the effect of any change in the input conditions on the response. The base empirical equation for principal flank wear is formulated by fitting an exponential association function to the experimental results. The coefficient of each dummy variable reflects the shift of the response from one set of machining conditions to another and is determined by simple linear regression. The independent cutting parameters (speed, feed rate, depth of cut) are kept constant while formulating and analyzing this model. The developed model is validated against different sets of machining responses in turning hardened medium-carbon steel with coated carbide inserts. For any particular set, the model can be used to predict the amount of principal flank wear for a specific machining time. Since the predicted results exhibit good agreement with the experimental data and the average percentage error is <10 %, this model can be used to predict the principal flank wear under the stated conditions.
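    The model structure described, an exponential-association base curve shifted by dummy-variable terms, can be sketched as follows; all coefficient values here are hypothetical placeholders, not the paper's fitted values:

```python
import math

def flank_wear(t, a, b, dummies=(), shifts=()):
    """Principal flank wear vs. machining time t (min):
    base curve VB = a * (1 - exp(-b * t)), an exponential association
    function, plus dummy-variable shifts for changed conditions
    (e.g. dummy = 1 when coolant is applied, 0 when dry)."""
    vb = a * (1.0 - math.exp(-b * t))
    return vb + sum(d * s for d, s in zip(dummies, shifts))

# Hypothetical coefficients: dry cut vs. high-pressure coolant at t = 20 min
print(round(flank_wear(20, a=0.30, b=0.12), 4))
print(round(flank_wear(20, a=0.30, b=0.12, dummies=(1,), shifts=(-0.05,)), 4))
```

    Each dummy coefficient (here the assumed -0.05 mm shift) would come from simple linear regression between the two condition sets, as the abstract describes.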

  4. Use of fugacity model to analyze temperature-dependent removal of micro-contaminants in sewage treatment plants.

    Science.gov (United States)

    Thompson, Kelly; Zhang, Jianying; Zhang, Chunlong

    2011-08-01

    Effluents from sewage treatment plants (STPs) are known to contain residual micro-contaminants, including endocrine disrupting chemicals (EDCs), despite the utilization of various removal processes. Temperature alters the efficacy of removal processes; however, experimental measurements of EDC removal at various temperatures are limited. Extrapolation of EDC behavior over a wide temperature range is possible using available physicochemical property data followed by correction for temperature dependency. A level II fugacity-based STP model was employed, with input parameters obtained from the literature and estimated by the US EPA's Estimation Programs Interface (EPI), including EPI's BIOWIN for temperature-dependent biodegradation half-lives. EDC removals in a three-stage activated sludge system were modeled under various temperatures and hydraulic retention times (HRTs) for representative compounds of various properties. Sensitivity analysis indicates that temperature plays a significant role in the model outcomes. Increasing temperature considerably enhances the removal of β-estradiol, ethinylestradiol, bisphenol, phenol, and tetrachloroethylene, but not testosterone, which has the highest biodegradation rate. The shortcomings of BIOWIN were mitigated by correcting the highly temperature-dependent biodegradation rates using the Arrhenius equation. The model predicts well the effects of operating temperature and HRTs on removal via volatilization, adsorption, and biodegradation. It also reveals that an impractically long HRT is needed to achieve high EDC removal. The STP model, together with temperature corrections, provides useful insight into the different patterns of STP performance and operational considerations relevant to EDC removal at low winter temperatures. Copyright © 2011 Elsevier Ltd. All rights reserved.
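    The Arrhenius correction mentioned above converts a rate constant measured at a reference temperature to another operating temperature. A sketch with an assumed activation energy and rate constant (the paper's compound-specific values are not reproduced here):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def rate_at_temperature(k_ref, T_ref, T, Ea):
    """Arrhenius temperature correction of a first-order rate constant:
    k(T) = k_ref * exp(-(Ea/R) * (1/T - 1/T_ref)), temperatures in kelvin."""
    return k_ref * math.exp(-Ea / R * (1.0 / T - 1.0 / T_ref))

# Hypothetical biodegradation rate of 0.10 /day at 25 C, corrected to a
# winter temperature of 10 C with an assumed Ea of 65 kJ/mol
k10 = rate_at_temperature(0.10, 298.15, 283.15, 65e3)
print(round(k10, 4))
```

    The roughly four-fold slowdown for a 15-degree drop illustrates why winter low temperatures dominate the model's removal predictions.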

  5. Seasonal variations in groundwater upwelling zones in a Danish lowland stream analyzed using Distributed Temperature Sensing (DTS)

    DEFF Research Database (Denmark)

    Matheswaran, Karthikeyan; Blemmer, Morten; Rosbjerg, Dan

    2014-01-01

    –night temperature difference were applied to three DTS datasets representing stream temperature responses to the variable meteorological and hydrological conditions prevailing in summer, winter and spring. The standard deviation criterion was useful to identify groundwater discharge zones in summer and spring......-term deployment covering variable meteorological and hydrological scenarios. Copyright © 2012 John Wiley & Sons, Ltd....

  6. A study of process parameters on workpiece anisotropy in the laser engineered net shaping (LENS(TM)) process

    Science.gov (United States)

    Chandra, Shubham; Rao, Balkrishna C.

    2017-06-01

    The process of laser engineered net shaping (LENS(TM)) is an additive manufacturing technique that employs the coaxial flow of metallic powders with a high-power laser to form a melt pool and subsequently deposit the specimen on a substrate. Although research over the past decade on the LENS(TM) processing of alloys of steel, titanium, nickel and other metallic materials typically reports superior mechanical properties in as-deposited specimens compared to the bulk material, there is anisotropy in the mechanical properties of the melt deposit. The current study involves the development of a numerical model of the LENS(TM) process using the principles of computational fluid dynamics (CFD), and the subsequent prediction of the volume fraction of equiaxed grains, in order to determine the process parameters required for depositing workpieces with isotropic properties. The numerical simulation is carried out in ANSYS Fluent, whose thermal-gradient data are used to determine the volume fraction of equiaxed grains present in the deposited specimen. The study has been validated against earlier experimental studies of LENS(TM) for nickel alloys. Besides being applicable to the wider family of metals and alloys, the results of this study will also facilitate effective process design to improve both product quality and productivity.

  7. Numerical study on the splitting of a vapor bubble in the ultrasonic assisted EDM process with the curved tool and workpiece.

    Science.gov (United States)

    Shervani-Tabar, M T; Seyed-Sadjadi, M H; Shabgard, M R

    2013-01-01

    Electrical discharge machining (EDM) is a powerful and modern method of machining. In the EDM process, a vapor bubble is generated between the tool and the workpiece in the dielectric liquid due to an electrical discharge, and the dynamic behavior of this vapor bubble affects the machining process. Vibration of the tool surface affects the bubble behavior and consequently the material removal rate (MRR). In this paper, the dynamic behavior of the vapor bubble in an ultrasonic-assisted EDM process after the appearance of the necking phenomenon is investigated; necking occurs when the bubble takes the shape of an hourglass. After the appearance of the necking phenomenon, the vapor bubble splits into two parts and two liquid jets develop on the boundaries of the upper and lower parts of the vapor bubble. The liquid jet developed on the upper part of the bubble impinges on the tool, and the liquid jet developed on the lower part impinges on the workpiece. These liquid jets evacuate debris from the gap between the tool and the workpiece and also cause erosion of the workpiece and the tool. Curved tool and workpiece surfaces affect the shape and velocity of the liquid jets during splitting of the vapor bubble. In this paper, the dynamics of the vapor bubble after its splitting near the curved tool and workpiece is investigated in three cases: in the first case the surfaces of the tool and the workpiece are flat, in the second case they are convex, and in the third case they are concave. Numerical results show that in the third case the liquid jets developed on the boundaries of the upper and lower parts of the vapor bubble after its splitting have the highest velocity, and their shapes are broader than in the other cases. Copyright © 2012 Elsevier B.V. All rights reserved.

  8. The influence of temperature calibration on the OC–EC results from a dual-optics thermal carbon analyzer

    Science.gov (United States)

    The Sunset Laboratory Dual-Optical Carbonaceous Analyzer that simultaneously measures transmission and reflectance signals is widely used in thermal-optical analysis of particulate matter samples. Most often this instrument is used to measure total carbon (TC), organic carbon (O...

  9. Technological capabilities of increasing surface quality of workpieces made of titanium alloy VT22 and stability of surface grinding

    Science.gov (United States)

    Soler, Ya I.; Salov, V. M.; Mai, D. S.

    2018-03-01

    Surface grinding of flat workpieces made of alloy VT22 was conducted with the periphery of a highly porous wheel (HPW) of cubic boron nitride, CBN30 B107 100 OV K27 КF40, using three processing techniques (ij): 10 - cross-feed per stroke, with the HPW cutting into the workpiece alternately up and down; 12 - cross-feed per double stroke, with up-cutting of the HPW at the working stroke; 22 - cross-feed per double stroke, with down-cutting of the HPW at the working stroke. Using artificial neural network models, it was revealed that, to improve the quality of the surfaces and the stability of their formation, grinding should be conducted with ij = 12.

  10. Analyzing land surface temperature variations during Fogo Island (Cape Verde) 2014-2015 eruption with Landsat 8 images

    Science.gov (United States)

    Vieira, D.; Teodoro, A.; Gomes, A.

    2016-10-01

    Land Surface Temperature (LST) is an important parameter related to land surface processes that changes continuously through time, and assessing its dynamics during a volcanic eruption is of both environmental and socio-economic interest. Lava flows and other volcanic materials produced and deposited throughout an eruption transform the landscape, contributing to its heterogeneity and altering LST measurements. This paper aims to assess variations of satellite-derived LST and to detect patterns during the latest Fogo Island (Cape Verde) eruption, which extended from November 2014 through February 2015. LST data were obtained from four processed Landsat 8 images, focused on the caldera where the Pico do Fogo volcano sits. The QGIS plugin Semi-Automatic Classification was used to apply atmospheric corrections and radiometric calibrations. The algorithm used to retrieve LST values is a single-channel method in which the emissivity values are known. The absence of in situ measurements is compensated by the use of MODIS-derived LST data, which are compared with the Landsat-retrieved measurements. The LST data analysis shows, as expected, that the highest LST values are located inside the caldera. High temperature values were also found on the south-facing flank of the caldera. Although the spatial patterns observed in the retrieved data remained roughly the same during the time period considered, temperature values changed throughout the area and over time, as was also expected. LST values followed the eruption dynamics, experiencing a growth followed by a decline. Moreover, it seems possible to recognize areas affected by lava flows of previous eruptions, due to well-defined LST spatial patterns.
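    A common single-channel emissivity correction, one of several variants and not necessarily the exact algorithm used in the paper, adjusts the at-sensor brightness temperature using the band-effective wavelength:

```python
import math

def lst_from_bt(bt_kelvin, emissivity, wavelength_um=10.895):
    """Single-channel emissivity correction of brightness temperature:
    LST = BT / (1 + (lambda * BT / rho) * ln(emissivity)),
    with rho = h*c/k_B = 1.438e-2 m K and lambda the band-effective
    wavelength (default: an assumed value for Landsat 8 TIRS band 10)."""
    rho = 1.438e-2                      # m K
    lam = wavelength_um * 1e-6          # m
    return bt_kelvin / (1.0 + (lam * bt_kelvin / rho) * math.log(emissivity))

# Hypothetical band-10 brightness temperature over fresh lava, emissivity 0.95
print(round(lst_from_bt(310.0, 0.95), 2))
```

    Since emissivity is below one, the corrected LST is always slightly above the brightness temperature, which matters most over the hot, low-emissivity lava surfaces inside the caldera.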

  11. Analyzing the effects of urban expansion on land surface temperature patterns by landscape metrics: a case study of Isfahan city, Iran.

    Science.gov (United States)

    Madanian, Maliheh; Soffianian, Ali Reza; Koupai, Saeid Soltani; Pourmanafi, Saeid; Momeni, Mehdi

    2018-03-03

    Urban expansion can cause extensive changes in land use and land cover (LULC), leading to changes in temperature conditions. Land surface temperature (LST) is one of the key parameters that should be considered in the study of urban temperature conditions. The purpose of this study was, therefore, to investigate the effects of changes in LULC due to the expansion of the city of Isfahan on LST using landscape metrics. To this aim, two images, from Landsat 5 and Landsat 8, acquired on August 2, 1985, and July 4, 2015, respectively, were used, and the support vector machine method was applied to classify them. The results showed that Isfahan had experienced an increase in impervious surfaces: this class covered 15% of the total area in 1985, rising to 30% in 2015. LST zoning maps were then created, indicating that the bare land and impervious surface categories were dominant in high-temperature zones, while in the zones where water was present or NDVI was high, LST was low. The landscape metrics in each of the LST zones were then analyzed in relation to the LULC changes, showing that LULC changes due to urban expansion altered such landscape properties as the percentage of landscape, patch density, largest patch index, and aggregation index. This information could be beneficial for urban planners in monitoring and managing changes in LULC patterns.

  12. New methods to get valid signals at high temperature conditions by using DSP tools of the ASSA (Abnormal Signal Simulation Analyzer)

    International Nuclear Information System (INIS)

    Koo, Kil-Mo; Hong, Seong-Wan; Song, Jin-Ho; Baek, Won-Pil; Jung, Myung-Kwan

    2012-01-01

    A new method is suggested for obtaining valid signals under high-temperature conditions using DSP (digital signal processing) tools of an ASSA (Abnormal Signal Simulation Analyzer) module, through signal analysis of important circuit models under severe accident conditions. Such DSP techniques, operated by LabVIEW or MATLAB code linked with PSpice code, already exist and provide convenient tools as a special function of the ASSA module, including a signal reconstruction method. If shift data can be obtained for transient parameters, such as the time constant of an R-L-C circuit affected by high temperature under severe accident conditions, it will be possible to reconstruct an abnormal signal using a trained deconvolution algorithm as a form of DSP technique. (author)

  13. Numerical Simulation of a Grinding Process Model for the Spatial Work-pieces: Development of Modeling Techniques

    Directory of Open Access Journals (Sweden)

    S. A. Voronov

    2015-01-01

    Full Text Available The article presents a literature review on the simulation of grinding processes, taking into consideration statistical, energy-based, and simulation approaches to modeling grinding forces. The main stages of interaction between abrasive grains and the machined surface are shown. The article describes the main approaches to geometric modeling of the formation of new surfaces in grinding, and reviews approaches to numerical modeling of chip formation and pile-up effects. Advantages and disadvantages of modeling grain-to-surface interaction by means of the finite element method and the molecular dynamics method are considered. The article points out that it is necessary to take the system dynamics and its effect on the finished surface into consideration. From the literature review, the structure of a complex simulation model of grinding process dynamics for flexible work-pieces with spatial surface geometry is proposed. The proposed model of spatial grinding includes a model of work-piece dynamics, a model of grinding wheel dynamics, and a phenomenological model of grinding forces based on a 3D geometry modeling algorithm. The model gives the following results for the spatial grinding process: vibration of the machined part and the grinding wheel, machined surface geometry, static deflection of the surface, and grinding forces under various cutting conditions.

  14. Molecular dynamics simulation of subnanometric tool-workpiece contact on a force sensor-integrated fast tool servo for ultra-precision microcutting

    International Nuclear Information System (INIS)

    Cai, Yindi; Chen, Yuan-Liu; Shimizu, Yuki; Ito, So; Gao, Wei; Zhang, Liangchi

    2016-01-01

    Highlights: • Subnanometric contact between a diamond tool and a copper workpiece surface is investigated by MD simulation. • A multi-relaxation time technique is proposed to eliminate the influence of the atom vibrations. • The accuracy of the elastic-plastic transition contact depth estimation is improved by observing the residual defects. • The simulation results are beneficial for optimization of the next-generation microcutting instruments. - Abstract: This paper investigates the contact characteristics between a copper workpiece and a diamond tool in a force sensor-integrated fast tool servo (FS-FTS) for single point diamond microcutting and in-process measurement of ultra-precision surface forms of the workpiece. Molecular dynamics (MD) simulations are carried out to identify the subnanometric elastic-plastic transition contact depth, at which the plastic deformation in the workpiece is initiated. This critical depth can be used to optimize the FS-FTS as well as the cutting/measurement process. It is clarified that the vibrations of the copper atoms in the MD model have a great influence on the subnanometric MD simulation results. A multi-relaxation time method is then proposed to reduce the influence of the atom vibrations based on the fact that the dominant vibration component has a certain period determined by the size of the MD model. It is also identified that for a subnanometric contact depth, the position of the tool tip for the contact force to be zero during the retracting operation of the tool does not correspond to the final depth of the permanent contact impression on the workpiece surface. The accuracy for identification of the transition contact depth is then improved by observing the residual defects on the workpiece surface after the tool retracting.

  15. OPTIMIZING THE PLACEMENT OF A WORK-PIECE AT A MULTI-POSITION ROTARY TABLE OF TRANSFER MACHINE WITH VERTICAL MULTI-SPINDLE HEAD

    Directory of Open Access Journals (Sweden)

    N. N. Guschinski

    2015-01-01

    Full Text Available The problem of minimizing the weight of a transfer machine with a multi-position rotary table, by optimizing the placement of a work-piece on the table for processing a homogeneous batch of work-pieces, is considered. To solve this problem, a mathematical model and a heuristic particle swarm optimization algorithm are proposed. The results of numerical experiments for two real problems of this type are given. The experiments revealed that the particle swarm optimization algorithm is more effective for solving the problem than random search and LP-search methods.
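    A particle swarm optimizer of the kind the authors employ can be sketched generically; the toy objective below is a stand-in for the machine-weight model, which is not reproduced in the abstract:

```python
import random

def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal particle swarm optimizer for a box-constrained minimization.
    Each particle tracks its own best position; all are attracted toward
    the swarm's global best."""
    rnd = random.Random(seed)
    dim = len(bounds)
    pos = [[rnd.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rnd.random(), rnd.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # keep the particle inside the feasible box
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f

# Toy stand-in objective: a shifted sphere with minimum at (1, -2)
best, val = pso(lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2,
                [(-5.0, 5.0), (-5.0, 5.0)])
print(val < 1e-3)
```

    The real objective would score a candidate work-piece placement by the resulting machine weight, with the same swarm mechanics unchanged.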

  16. PARAMETER DETERMINATION FOR ADDITIONAL OPERATING FORCE MECHANISM IN DEVICE FOR PNEUMO-CENTRIFUGAL MACHINING OF BALL-SHAPED WORKPIECES

    Directory of Open Access Journals (Sweden)

    A. A. Sukhotsky

    2014-01-01

    Full Text Available The paper describes the development of a methodology for optimizing the parameters of an additional operating force mechanism in a device for pneumo-centrifugal machining of glass balls. A specific feature in manufacturing glass balls for micro-optics, in accordance with the technological process for obtaining ball-shaped workpieces, is the grinding and polishing of the spherical surface in a free state. In this case, the component billets of the future balls are made in the form of cubes, and the billets are given a preliminary ball shape by rough grinding. An advanced method for obtaining ball-shaped work-pieces from brittle materials is pneumo-centrifugal machining. This method presupposes the use of two conical rings with abrasive working surfaces, set coaxially with their large diameters facing each other, along which the billets are rolled; rotation of the billets is conveyed by means of the pressure medium. Present devices for pneumo-centrifugal machining are suitable for obtaining balls up to 6 mm. Machining of work-pieces with full spherical surfaces and larger diameters is unproductive, owing to the impossibility of ensuring a sufficient force on the billet in the working zone. For this reason, the paper proposes a modified device in which an additional force on the machined billet is created by an upper working disc making a reciprocating motion along the axis of the abrasive conical rings. The motion is realized with the help of a cylindrical camshaft mechanism in the form of a ring with a profiled working end face, and the purpose of the present paper is to optimize the parameters of the proposed device. The paper presents expressions for calculating the constitutive parameters of the additional operating force mechanism, including the parameters of the loading element motion, the main dimensions of the mechanism, and the parameters of its profile element. The investigation method is a mathematical

  17. Determining the ion temperature and energy distribution in a lithium-plasma interaction test stand with a retarding field energy analyzer

    Science.gov (United States)

    Christenson, M.; Stemmley, S.; Jung, S.; Mettler, J.; Sang, X.; Martin, D.; Kalathiparambil, K.; Ruzic, D. N.

    2017-08-01

    The ThermoElectric-driven Liquid-metal plasma-facing Structures (TELS) experiment at the University of Illinois is a gas-puff driven, theta-pinch plasma source that is used as a test stand for off-normal plasma events incident on materials in the edge and divertor regions of a tokamak. The ion temperatures and resulting energy distributions are crucial for understanding how well a TELS pulse can simulate an extreme event in a larger magnetic confinement device. A retarding field energy analyzer (RFEA) has been constructed for use with such a transient plasma because of its inexpensive and robust nature. The innovation of using a control analyzer in conjunction with an actively sampling analyzer is presented, and the conditions of RFEA operation are discussed, with results demonstrating successful performance under extreme conditions, defined here by heat fluxes on the order of 0.8 GW m-2 on time scales of nearly 200 μs. Measurements from the RFEA indicate two primary features for a typical TELS discharge, following closely the pre-ionizing coaxial gun discharge characteristics. For the case using both the pre-ionization pulse (PiP) and the theta pinch, the measured ion signal showed an ion temperature of 23.3 ± 6.6 eV for the first peak and 17.6 ± 1.9 eV for the second peak. For the case using only the PiP, the measured signal showed an ion temperature of 7.9 ± 1.1 eV for the first peak and 6.6 ± 0.8 eV for the second peak. These differences illustrate the effectiveness of the theta pinch in imparting energy to the ions. This information also highlights the importance of TELS as one of the few linear pulsed plasma sources in which moderately energetic ions strike targets without the need for sample biasing.
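    Ion temperatures like those quoted are conventionally extracted from the exponential tail of the RFEA current-voltage characteristic, where for a Maxwellian population the collected current falls as exp(-(V - Vp)/Ti[eV]); a sketch on synthetic data, not the TELS measurements:

```python
import math

def ion_temperature_eV(voltages, currents):
    """Fit the slope of ln(I) vs. discriminator voltage V by least squares;
    for I = I0 * exp(-(V - Vp) / Ti[eV]) the slope is -1/Ti, so
    Ti = -1/slope (in eV)."""
    n = len(voltages)
    xm = sum(voltages) / n
    ym = sum(math.log(i) for i in currents) / n
    num = sum((v - xm) * (math.log(i) - ym) for v, i in zip(voltages, currents))
    den = sum((v - xm) ** 2 for v in voltages)
    return -1.0 / (num / den)

# Synthetic exponential tail with Ti = 20 eV above an assumed Vp = 30 V
V = [30, 40, 50, 60, 70]
I = [math.exp(-(v - 30) / 20.0) for v in V]
print(round(ion_temperature_eV(V, I), 2))
```

    On real traces only the portion of the characteristic above the plasma potential is fitted, and the two-peak TELS signal would be fitted separately for each peak.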

  18. Method for optimization of the orientation and fixing system of workpiece for the construction of control devices

    Directory of Open Access Journals (Sweden)

    Iordache Daniela-Monica

    2017-01-01

The development and evolution of technological equipment for machining, assembly and control ensure the modernization of manufacturing processes. Devices, as subsystems of the technological system, are made in a wide variety of sizes and constructive variants amid the general development and diversification of machinery, tools, workpieces and drives, which creates difficulties in structuring and improving them. The research of recent years presented in this paper has as its major objectives increasing the accuracy, productivity and flexibility of orientation and fixing devices for control operations. To this end, a mathematical model, a new working method and an algorithm for optimizing the construction of the orientation and fixing system of a new type of control device have been developed.

  19. Comparative analysis between the SPIF and DPIF variants for die-less forming process for an automotive workpiece

    Directory of Open Access Journals (Sweden)

    Adrian José Benitez Lozano

    2015-07-01

Over time, the die-less incremental forming process has been developed in many ways to meet the need for flexible production with no tooling investment and low production costs. Two of its configurations are the SPIF (single point incremental forming) and DPIF (double point incremental forming) techniques. The aim of this study is to compare both techniques, exposing their advantages and disadvantages in the production of industrial parts, and to present die-less forming as an alternative manufacturing process. Experiments with the exhaust pipe cover of a vehicle are performed, the main process parameters are described, and formed workpieces without evidence of defects are achieved. Significant differences between the two techniques in terms of production times and accuracy relative to the original model are also detected. Finally, it is suggested when it is more convenient to use each technique.

  20. Applying Petroleum the Pressure Buildup Well Test Procedure on Thermal Response Test—A Novel Method for Analyzing Temperature Recovery Period

    Directory of Open Access Journals (Sweden)

    Tomislav Kurevija

    2018-02-01

The theory of thermal response testing (TRT) is a well-known part of the sizing process for geothermal exchange systems. Multiple parameters influence the accuracy of the effective ground thermal conductivity measurement, such as testing time, power variability, climatic interference and groundwater effects. To improve the accuracy of the TRT, we introduced a procedure to additionally analyze the falloff temperature decline after the power test. The method is based on the premise of an analogy between TRT and petroleum well testing, since both procedures originate in the diffusivity equation, with solutions for heat conduction or for pressure analysis during radial flow. By applying pressure build-up test interpretation techniques to borehole heat exchanger testing, greater accuracy could be achieved, since ground conductivity can also be obtained from this period. The analysis was conducted on a coaxial exchanger with five different power steps, with both direct and reverse flow regimes. Each test comprised 96 h of classical TRT followed by 96 h of temperature decline, making for almost 2000 h of cumulative borehole testing. Results showed that the ground conductivity value could vary by as much as 25%, depending on test time, seasonal period and power fluctuations, while the thermal conductivity obtained from the falloff period provided more stable values, with only a 10% variation.
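For context, the classical TRT evaluation against which the falloff method is benchmarked is the infinite-line-source fit; the sketch below uses assumed borehole numbers, not the paper's measurements:

```python
import numpy as np

# Classical infinite-line-source TRT evaluation (assumed borehole numbers):
# at late times the mean fluid temperature grows as
#     T(t) = T0 + [Q / (4*pi*k*H)] * ln(t),
# so the effective ground thermal conductivity follows from the slope m of
# T versus ln(t):  k = Q / (4*pi*H*m).

def conductivity_from_trt(t_hours, temp_c, q_watts, depth_m):
    """Effective ground conductivity in W/(m K) from a TRT heating curve."""
    slope, _ = np.polyfit(np.log(t_hours), temp_c, 1)
    return q_watts / (4.0 * np.pi * depth_m * slope)

# Synthetic 96 h test: k = 2.0 W/(m K), Q = 5 kW, 100 m deep exchanger
k_true, q, h = 2.0, 5000.0, 100.0
t = np.linspace(10.0, 96.0, 200)                        # hours into the test
temp = 12.0 + q / (4.0 * np.pi * k_true * h) * np.log(t)
k_est = conductivity_from_trt(t, temp, q, h)
```

The paper's contribution is that the same slope can be read from the falloff (temperature decline) period, by analogy with a pressure build-up test.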

  1. Transient analyzer

    International Nuclear Information System (INIS)

    Muir, M.D.

    1975-01-01

The design and design philosophy of a high-performance, extremely versatile transient analyzer are described. This sub-system was designed to be controlled through the data acquisition computer system, which allows hands-off operation. Thus it may be placed on the experiment side of the high-voltage safety break between the experimental device and the control room. This analyzer provides control features which are extremely useful for data acquisition from PPPL diagnostics. These include dynamic sample-rate changing, which may be intermixed with multiple post-trigger operations with variable-length blocks using normal, peak-to-peak or integrate modes. The discussion includes general remarks on the advantages of adding intelligence to transient analyzers, a detailed description of the characteristics of the PPPL transient analyzer, a description of its hardware, firmware, control language and operation, and general remarks on future trends in this type of instrumentation, both at PPPL and elsewhere.

  2. High-throughput simultaneous determination of plasma water deuterium and 18-oxygen enrichment using a high-temperature conversion elemental analyzer with isotope ratio mass spectrometry.

    Science.gov (United States)

    Richelle, M; Darimont, C; Piguet-Welsch, C; Fay, L B

    2004-01-01

This paper presents a high-throughput method for the simultaneous determination of the deuterium and oxygen-18 (18O) enrichment of water samples isolated from blood. This analytical method enables rapid and simple determination of these enrichments in microgram quantities of water. Water is converted into hydrogen and carbon monoxide gases by a high-temperature conversion elemental analyzer (TC-EA); these gases are then transferred on-line into the isotope ratio mass spectrometer. Accuracy determined with Standard Light Antarctic Precipitation (SLAP) and Greenland Ice Sheet Precipitation (GISP) is reliable for deuterium and 18O enrichments. The range of linearity is from 0 up to 0.09 atom percent excess (APE, i.e. -78 up to 5725 delta per mil (dpm)) for deuterium enrichment and from 0 up to 0.17 APE (-11 up to 890 dpm) for 18O enrichment. Memory effects do exist but can be avoided by analyzing the biological samples in quintuplicate. This method allows the determination of 1440 samples per week, i.e. 288 biological samples per week. Copyright 2004 John Wiley & Sons, Ltd.
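The delta-per-mil and atom-percent-excess figures quoted above are related by standard isotope arithmetic; a minimal sketch, assuming the VSMOW ²H/¹H ratio of 155.76 × 10⁻⁶ and taking the −78 dpm end of the stated range as the baseline:

```python
# Standard isotope arithmetic relating the quoted dpm and APE figures
# (assumed VSMOW 2H/1H ratio; baseline taken at the -78 dpm end of the
# stated linearity range): delta (per mil) -> isotope ratio -> atom
# percent, with APE being the atom-percent difference from the baseline.

R_VSMOW = 155.76e-6  # 2H/1H isotope ratio of VSMOW

def delta_to_atom_percent(delta_permil, r_std=R_VSMOW):
    """Convert a delta value (per mil vs. the standard) to atom percent."""
    r = r_std * (1.0 + delta_permil / 1000.0)
    return 100.0 * r / (1.0 + r)

baseline = delta_to_atom_percent(-78.0)
ape_top = delta_to_atom_percent(5725.0) - baseline  # ~0.09, matching the range
```

Running the numbers confirms internal consistency of the abstract: the −78 to 5725 dpm interval does correspond to roughly 0 to 0.09 APE for deuterium.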

  3. Analyzing Snowpack Metrics Over Large Spatial Extents Using Calibrated, Enhanced-Resolution Brightness Temperature Data and Long Short Term Memory Artificial Neural Networks

    Science.gov (United States)

    Norris, W.; J Q Farmer, C.

    2017-12-01

Snow water equivalence (SWE) is a difficult metric to measure accurately over large spatial extents; SNOTEL sites are too localized, and traditional remotely sensed brightness temperature data are at too coarse a resolution to capture variation. The new Calibrated Enhanced-Resolution Brightness Temperature (CETB) data from the National Snow and Ice Data Center (NSIDC) offer remotely sensed brightness temperature data at an enhanced resolution of 3.125 km versus the original 25 km, which allows large spatial extents to be analyzed with reduced uncertainty compared to the 25 km product. While the 25 km brightness temperature data have proved useful in past research — one group found decreasing trends in SWE outweighed increasing trends three to one in North America; other researchers used the data to incorporate winter conditions, like snow cover, into ecological zoning criteria — with the new 3.125 km data it is possible to derive more accurate metrics for SWE, since we have far more spatial variability in the measurements. Even with higher-resolution data, using the 37 - 19 GHz frequencies to estimate SWE distorts the data during times of melt onset and accumulation onset. Past researchers employed statistical splines, while other successful attempts utilized non-parametric curve fitting to smooth out the spikes distorting the metrics. In this work, rather than using legacy curve-fitting techniques, a Long Short-Term Memory (LSTM) Artificial Neural Network (ANN) was trained to perform curve fitting on the data. LSTM ANNs have shown great promise in modeling time series data, and with almost 40 years of data available — 14,235 days — there is plenty of training data for the ANN. LSTMs are ideal for this type of time series analysis because they allow important trends to persist for long periods of time but ignore short-term fluctuations; since LSTMs have poor mid- to short-term memory, they are ideal for smoothing out the large spikes generated in the melt

  4. A probabilistic-based approach to monitoring tool wear state and assessing its effect on workpiece quality in nickel-based alloys

    Science.gov (United States)

    Akhavan Niaki, Farbod

    The objective of this research is first to investigate the applicability and advantage of statistical state estimation methods for predicting tool wear in machining nickel-based superalloys over deterministic methods, and second to study the effects of cutting tool wear on the quality of the part. Nickel-based superalloys are among those classes of materials that are known as hard-to-machine alloys. These materials exhibit a unique combination of maintaining their strength at high temperature and have high resistance to corrosion and creep. These unique characteristics make them an ideal candidate for harsh environments like combustion chambers of gas turbines. However, the same characteristics that make nickel-based alloys suitable for aggressive conditions introduce difficulties when machining them. High strength and low thermal conductivity accelerate the cutting tool wear and increase the possibility of the in-process tool breakage. A blunt tool nominally deteriorates the surface integrity and damages quality of the machined part by inducing high tensile residual stresses, generating micro-cracks, altering the microstructure or leaving a poor roughness profile behind. As a consequence in this case, the expensive superalloy would have to be scrapped. The current dominant solution for industry is to sacrifice the productivity rate by replacing the tool in the early stages of its life or to choose conservative cutting conditions in order to lower the wear rate and preserve workpiece quality. Thus, monitoring the state of the cutting tool and estimating its effects on part quality is a critical task for increasing productivity and profitability in machining superalloys. This work aims to first introduce a probabilistic-based framework for estimating tool wear in milling and turning of superalloys and second to study the detrimental effects of functional state of the cutting tool in terms of wear and wear rate on part quality. In the milling operation, the

  5. On-line hydrogen-isotope measurements of organic samples using elemental chromium: An extension for high temperature elemental-analyzer techniques

    Science.gov (United States)

    Gehre, Matthias; Renpenning, Julian; Gilevska, Tetyana; Qi, Haiping; Coplen, Tyler B.; Meijer, Harro A.J.; Brand, Willi A.; Schimmelmann, Arndt

    2015-01-01

The high temperature conversion (HTC) technique using an elemental analyzer with a glassy carbon tube and filling (temperature conversion/elemental analysis, TC/EA) is a widely used method for hydrogen isotopic analysis of water and many solid and liquid organic samples with analysis by isotope-ratio mass spectrometry (IRMS). However, the TC/EA IRMS method may produce inaccurate δ²H results, with values deviating by more than 20 mUr (milliurey = 0.001 = 1‰) from the true value for some materials. We show that a single-oven, chromium-filled elemental analyzer coupled to an IRMS substantially improves the measurement quality and reliability for hydrogen isotopic compositions of organic substances (Cr-EA method). Hot chromium maximizes the yield of molecular hydrogen in a helium carrier gas by irreversibly and quantitatively scavenging all reactive elements except hydrogen. In contrast, under TC/EA conditions, heteroelements like nitrogen or chlorine (and other halogens) can form hydrogen cyanide (HCN) or hydrogen chloride (HCl) and this can cause isotopic fractionation. The Cr-EA technique thus expands the analytical possibilities for on-line hydrogen-isotope measurements of organic samples significantly. This method yielded reproducibility values (1-sigma) for δ²H measurements on water and caffeine samples of better than 1.0 and 0.5 mUr, respectively. To overcome handling problems with water as the principal calibration anchor for hydrogen isotopic measurements, we have employed an effective and simple strategy using reference waters or other liquids sealed in silver-tube segments. These crimped silver tubes can be employed in both the Cr-EA and TC/EA techniques. They simplify considerably the normalization of hydrogen-isotope measurement data to the VSMOW-SLAP (Vienna Standard Mean Ocean Water-Standard Light Antarctic Precipitation) scale, and their use improves accuracy of the data by eliminating evaporative loss and associated isotopic fractionation while

  6. On-line hydrogen-isotope measurements of organic samples using elemental chromium: an extension for high temperature elemental-analyzer techniques.

    Science.gov (United States)

    Gehre, Matthias; Renpenning, Julian; Gilevska, Tetyana; Qi, Haiping; Coplen, Tyler B; Meijer, Harro A J; Brand, Willi A; Schimmelmann, Arndt

    2015-01-01

    The high temperature conversion (HTC) technique using an elemental analyzer with a glassy carbon tube and filling (temperature conversion/elemental analysis, TC/EA) is a widely used method for hydrogen isotopic analysis of water and many solid and liquid organic samples with analysis by isotope-ratio mass spectrometry (IRMS). However, the TC/EA IRMS method may produce inaccurate δ(2)H results, with values deviating by more than 20 mUr (milliurey = 0.001 = 1‰) from the true value for some materials. We show that a single-oven, chromium-filled elemental analyzer coupled to an IRMS substantially improves the measurement quality and reliability for hydrogen isotopic compositions of organic substances (Cr-EA method). Hot chromium maximizes the yield of molecular hydrogen in a helium carrier gas by irreversibly and quantitatively scavenging all reactive elements except hydrogen. In contrast, under TC/EA conditions, heteroelements like nitrogen or chlorine (and other halogens) can form hydrogen cyanide (HCN) or hydrogen chloride (HCl) and this can cause isotopic fractionation. The Cr-EA technique thus expands the analytical possibilities for on-line hydrogen-isotope measurements of organic samples significantly. This method yielded reproducibility values (1-sigma) for δ(2)H measurements on water and caffeine samples of better than 1.0 and 0.5 mUr, respectively. To overcome handling problems with water as the principal calibration anchor for hydrogen isotopic measurements, we have employed an effective and simple strategy using reference waters or other liquids sealed in silver-tube segments. These crimped silver tubes can be employed in both the Cr-EA and TC/EA techniques. They simplify considerably the normalization of hydrogen-isotope measurement data to the VSMOW-SLAP (Vienna Standard Mean Ocean Water-Standard Light Antarctic Precipitation) scale, and their use improves accuracy of the data by eliminating evaporative loss and associated isotopic fractionation while

  7. Study of thermal and electrical parameters of workpieces during spray coating by electrolytic plasma jet

    International Nuclear Information System (INIS)

    Khafizov, A A; Shakirov, Yu I; Valiev, R A; Valiev, R I; Khafizova, G M

    2016-01-01

This paper presents results on the thermal and electrical parameters of products in the base layer - intermediate layer system when protective coatings of ferromagnetic powder are applied, by a plasma spray produced in an electric discharge with a liquid cathode, to steel samples. Temperature distributions and gradients in the coating and the intermediate coating were examined. A detailed description of spray coating with ferromagnetic powder by a plasma jet obtained in an electrical discharge with a liquid cathode, and of the apparatus for producing it, is provided. The problem was solved using Fourier analysis, and the initial data for the calculations are provided. The results of the numerical analysis are given as temporal functions of the temperature at the contact between the coating and the intermediate coating, as well as a temporal function of the value Q = q - φ, where q is the density of the heat flux directed to the free surface of the intermediate coating and φ is the density of the heat flux at the contact between the coating and the intermediate coating. The analysis of these data shows that in contact heat-exchange systems of the base layer - intermediate layer type, where the constituent materials have close thermophysical characteristics, a slow increase of the contact temperature as a function of time is observed. (paper)

  8. Welding Current Distribution in the Work-piece and Pool in Arc Welding

    Directory of Open Access Journals (Sweden)

    A. M. Rybachuk

    2015-01-01

In order to select the optimal configuration of controlling magnetic fields and build a rational construction of magnetic systems, we need to know the distribution of the welding current in the molten metal of the weld pool. The objective of this work is therefore to establish calculation methods for determining the current density in the weld pool during arc welding. The distribution of the welding current in the pool depends on the field of electrical resistance, which is determined by the temperature field deformed as the arc moves at the welding speed. Previous work has shown, experimentally and by simulation on conductive paper, that deformation of the temperature field defines the deformation of the electric field. On that basis, under certain boundary conditions, the problem has been solved to give a general solution of the differential equation that relates the potential distribution to the temperature in the product during arc welding. This solution is obtained under the following boundary conditions: (1) the metal is homogeneous; (2) the input and output surfaces of the heat flux and the electric current coincide; (3) the input and output surfaces of the heat flux and the electric current are insulated and equipotential; (4) the other (lateral) surfaces are adiabatic boundaries. Therefore, this paper pays particular attention to obtaining the analytical solution of a general differential equation relating the distribution of the potential to the temperature in the product. It considers the temperature field of a heat source that moves at the welding speed with a normal-circular distribution of the heat flow at a certain concentration factor. The distribution of the current density is calculated on the assumption that the welding current is introduced through the same surface as the heat flux and that the distribution of the current density corresponds to the normal-circular one at a certain concentration factor. As a result, we get an expression that allows us to calculate the current density from the known
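The "normal-circular" source distribution named in the abstract is commonly written as q(r) = (kQ/π)·exp(−kr²), with k the concentration factor and Q the total power; a minimal sketch with assumed values, not the paper's parameters:

```python
import math

# The "normal-circular" heat-flow distribution in its standard form
# q(r) = (k*Q/pi) * exp(-k*r^2), where k is the concentration factor
# and Q is the total source power; the values below are assumed.

def heat_flux(r_m, q_total_w, k_conc):
    """Surface heat flux in W/m^2 at radius r from the source axis."""
    return k_conc * q_total_w / math.pi * math.exp(-k_conc * r_m**2)

# Integrating q(r) over the whole plane recovers Q, which is the property
# that lets the same distribution be reused for the current density.
q_axis = heat_flux(0.0, 1000.0, 1.0e6)   # peak flux for Q = 1 kW, k = 1e6 m^-2
```

The normalization ∫₀^∞ q(r)·2πr dr = Q is what allows the abstract's assumption that the current density follows the same normal-circular form through the same surface.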

  9. Radiometric analyzer

    International Nuclear Information System (INIS)

    Arima, S.; Oda, M.; Miyashita, K.; Takada, M.

    1977-01-01

A radiometric analyzer for measuring the characteristic values of a sample by radiation includes a number of radiation-measuring subsystems having different ratios of sensitivities to the elements of the sample, and linearizing circuits having inverse-function characteristics of the calibration functions corresponding to the radiation-measuring subsystems. A weighting adder computes a desired linear combination of the outputs of the linearizing circuits. Operators for operating between two or more different linear combinations are included.

  10. Contamination Analyzer

    Science.gov (United States)

    1994-01-01

    Measurement of the total organic carbon content in water is important in assessing contamination levels in high purity water for power generation, pharmaceutical production and electronics manufacture. Even trace levels of organic compounds can cause defects in manufactured products. The Sievers Model 800 Total Organic Carbon (TOC) Analyzer, based on technology developed for the Space Station, uses a strong chemical oxidizing agent and ultraviolet light to convert organic compounds in water to carbon dioxide. After ionizing the carbon dioxide, the amount of ions is determined by measuring the conductivity of the deionized water. The new technique is highly sensitive, does not require compressed gas, and maintenance is minimal.

  11. Electron attachment analyzer

    International Nuclear Information System (INIS)

    Popp, P.; Grosse, H.J.; Leonhardt, J.; Mothes, S.; Oppermann, G.

    1984-01-01

The invention concerns an electron attachment analyzer for detecting traces of electroaffine substances in electronegative gases, especially in air. The analyzer can be used for monitoring working places, e.g., in operating theatres. The analyzer consists of two electrodes inserted in a base frame of insulating material (quartz or ceramics) and a high-temperature-resistant radiation source (⁸⁵Kr, ³H, or ⁶³Ni).

  12. Finite element analysis for temperature distributions in a cold forging

    International Nuclear Information System (INIS)

    Kim, Dong Bum; Lee, In Hwan; Cho, Hae Yong; Kim, Sung Wook; Song, In Chul; Jeon, Byung Cheol

    2013-01-01

    In this research, the finite element method is utilized to predict the temperature distributions in a cold-forging process for a cambolt. The cambolt is mainly used as a part of a suspension system of a vehicle. The cambolt has an off-centered lobe that manipulates the vertical position of the knuckle and wheel to a slight degree. The cambolt requires certain mechanical properties, such as strength and endurance limits. Moreover, temperature is also an important factor to realize mass production and improve efficiency. However, direct measurement of temperature in a forging process is infeasible with existing technology; therefore, there is a critical need for a new technique. Accordingly, in this study, a thermo-coupled finite element method is developed for predicting the temperature distribution. The rate of energy conversion to heat for the workpiece material is determined, and the temperature distribution is analyzed throughout the forging process for a cambolt. The temperatures associated with different punch speeds are also studied, as well as the relationships between load, temperature, and punch speed. Experimental verification of the technique is presented.

  13. Finite element analysis for temperature distributions in a cold forging

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong Bum; Lee, In Hwan; Cho, Hae Yong [Chungbuk National University, Cheongju (Korea, Republic of); Kim, Sung Wook [Yanbian National University, Yanbian (China); Song, In Chul; Jeon, Byung Cheol [Sunil dyfas, Jincheon (Korea, Republic of)

    2013-10-15

    In this research, the finite element method is utilized to predict the temperature distributions in a cold-forging process for a cambolt. The cambolt is mainly used as a part of a suspension system of a vehicle. The cambolt has an off-centered lobe that manipulates the vertical position of the knuckle and wheel to a slight degree. The cambolt requires certain mechanical properties, such as strength and endurance limits. Moreover, temperature is also an important factor to realize mass production and improve efficiency. However, direct measurement of temperature in a forging process is infeasible with existing technology; therefore, there is a critical need for a new technique. Accordingly, in this study, a thermo-coupled finite element method is developed for predicting the temperature distribution. The rate of energy conversion to heat for the workpiece material is determined, and the temperature distribution is analyzed throughout the forging process for a cambolt. The temperatures associated with different punch speeds are also studied, as well as the relationships between load, temperature, and punch speed. Experimental verification of the technique is presented.
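The "rate of energy conversion to heat" step described in the two records above can be approximated in closed form in the adiabatic limit; the sketch below assumes a Taylor-Quinney coefficient of about 0.9 and generic steel properties, and is not the paper's thermo-coupled finite element model:

```python
# Adiabatic estimate of the heating from plastic work (assumed
# Taylor-Quinney coefficient and generic steel properties):
#     dT = beta * sigma * strain / (rho * c)

def adiabatic_temp_rise(flow_stress_pa, plastic_strain,
                        beta=0.9, density=7850.0, specific_heat=460.0):
    """Temperature rise in kelvin when a fraction beta of the plastic
    work sigma*strain becomes heat, with no conduction losses."""
    return beta * flow_stress_pa * plastic_strain / (density * specific_heat)

# 600 MPa flow stress at an effective strain of 1.0
dt = adiabatic_temp_rise(600e6, 1.0)
```

This bound (on the order of 150 K for the assumed values) illustrates why even "cold" forging requires thermal analysis; the FE model in the papers additionally resolves conduction and punch-speed effects.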

  14. Experimental Research and Method for Calculation of 'Upsetting-with-Buckling' Load at the Impression-Free (Dieless) Preforming of Workpiece

    Directory of Open Access Journals (Sweden)

    Kukhar Volodymir

    2018-01-01

This paper presents the results of experimental studies of load characteristic changes during the upsetting of high billets with an upsetting ratio (height-to-diameter ratio) from 3.0 to 6.0, which is followed by buckling. Such a pass is an effective way of preforming the workpiece for the production of forgings with a bent axis or for dual forming, and belongs to the impression-free (dieless) operations of bulk forming. Based on the analysis of the experimental data, an engineering method for calculating the workpiece pre-forming load as a maximum buckling force has been developed. The analysis of the obtained data confirmed the possibility of performing this pre-forming operation on the main forging equipment, since the load of shaping by buckling does not exceed the die-forging load.

  15. Experimental Research and Method for Calculation of 'Upsetting-with-Buckling' Load at the Impression-Free (Dieless) Preforming of Workpiece

    Science.gov (United States)

    Kukhar, Volodymir; Artiukh, Victor; Prysiazhnyi, Andrii; Pustovgar, Andrey

    2018-03-01

This paper presents the results of experimental studies of load characteristic changes during the upsetting of high billets with an upsetting ratio (height-to-diameter ratio) from 3.0 to 6.0, which is followed by buckling. Such a pass is an effective way of preforming the workpiece for the production of forgings with a bent axis or for dual forming, and belongs to the impression-free (dieless) operations of bulk forming. Based on the analysis of the experimental data, an engineering method for calculating the workpiece pre-forming load as a maximum buckling force has been developed. The analysis of the obtained data confirmed the possibility of performing this pre-forming operation on the main forging equipment, since the load of shaping by buckling does not exceed the die-forging load.
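As a rough companion to the engineering method described above, the classical elastic (Euler) estimate of the buckling force for a slender upset billet can be sketched as follows; the end-condition factor, material and dimensions are assumed example values, not the authors' data, and real upsetting involves plastic buckling at lower loads:

```python
import math

# Classical Euler buckling estimate for an upset cylindrical billet
# (illustrative only; the authors' engineering method is not reproduced).
# k_factor = 0.5 corresponds to fixed-fixed ends between flat dies.

def euler_buckling_force(e_mod_pa, diameter_m, height_m, k_factor=0.5):
    """Critical axial force for a circular-section column."""
    i_area = math.pi * diameter_m**4 / 64.0        # second moment of area
    return math.pi**2 * e_mod_pa * i_area / (k_factor * height_m) ** 2

# Steel billet, d = 20 mm, upsetting ratio h/d = 5 (within the 3.0-6.0 range)
p_cr = euler_buckling_force(210e9, 0.020, 0.100)
```

For the assumed billet this elastic bound is a few meganewtons, consistent with the paper's observation that buckling loads stay within the capacity of standard forging equipment.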

  16. Fire simulation in large compartments with a fire model 'CFAST'. Part 1. Survey of applicability for analyzing air-temperature profile in compartments

    International Nuclear Information System (INIS)

    Hattori, Yasuo; Suto, Hitoshi; Shirai, Koji; Eguchi, Yuzuru; Sano, Tadashi

    2012-01-01

The basic performance of numerical analysis of air-temperature profiles in large-scale compartments using a zone model, CFAST (Consolidated model of Fire growth And Smoke Transport), which has been widely applied in the fire protection design of buildings, is examined. Special attention is paid to the dependence on the boundary-condition settings and the choice of model parameters. Simulations carried out under the denkyoken-test conditions, in which the air-temperature profiles in the compartments and the heat-release rate of a fire were precisely measured, indicate that CFAST can appropriately represent the time histories of the air temperature in the high-temperature layer generated in the vicinity of the ceiling of the compartment containing the fire source when proper boundary conditions are applied; i.e., the time histories of the air temperature in the upper (high-temperature) layer given by CFAST agree well with the observations. The sensitivity analysis also reveals that appropriate setting of the boundary conditions, especially the heat-release rate of the fire and the heat-transfer rate from the compartment walls to the ambient air, is vital. In contrast, the impact of the choice of numerical parameters on the air-temperature analysis is quite small. (author)

  17. Electrodynamic thermogravimetric analyzer

    International Nuclear Information System (INIS)

    Spjut, R.E.; Bar-Ziv, E.; Sarofim, A.F.; Longwell, J.P.

    1986-01-01

The design and operation of a new device for studying single-aerosol-particle kinetics at elevated temperatures, the electrodynamic thermogravimetric analyzer (EDTGA), were examined theoretically and experimentally. The completed device consists of an electrodynamic balance modified to permit particle heating by a CO₂ laser, temperature measurement by a three-color infrared-pyrometry system, and continuous weighing by a position-control system. In this paper, the position-control, particle-weight-measurement, heating, and temperature-measurement systems are described and their limitations examined.

  18. Temperature dependent I-V characteristics of an Au/n-GaAs Schottky diode analyzed using Tung’s model

    Science.gov (United States)

    Korucu, Demet; Turut, Abdulmecit; Efeoglu, Hasan

    2013-04-01

The current-voltage (I-V) characteristics of Au/n-GaAs contacts prepared by a photolithography technique have been measured in the temperature range of 80-320 K. At temperatures above 200 K the ideality factor remained almost unchanged, between 1.04 and 1.10, and the barrier height (BH) stayed at a value of about 0.79 eV. The ideality factor values near unity indicate that the experimental I-V data are almost independent of the sample temperature; that is, the contacts show excellent Schottky diode behavior above 200 K. An abnormal decrease in the experimental BH Φb and an increase in the ideality factor with decreasing temperature were observed below 200 K. This behavior has been attributed to barrier inhomogeneity, assuming a Gaussian distribution of nanometer-sized patches with low BH at the metal-semiconductor interface. The barrier inhomogeneity assumption is also confirmed by the linear relationship between the BH and the ideality factor. According to Tung's barrier inhomogeneity model, the value σT = 7.41×10^-5 cm^(2/3) V^(1/3) from the ideality factor versus (kT)^-1 curve is in close agreement with the value σT = 7.95×10^-5 cm^(2/3) V^(1/3) from the Φeff versus (2kT)^-1 curve in the range of 80-200 K. The modified Richardson plot from Tung's model, ln(J0/T^2) - (qσT)^2(Vb/η)^(2/3)/[2(kT)^2] versus (kT)^-1, gave a Richardson constant of 8.47 A cm^-2 K^-2, which is in very close agreement with the known value of 8.16 A cm^-2 K^-2 for n-type GaAs when the effective patch area, significantly smaller than the entire geometric area of the Schottky contact, is considered in the temperature range of 80-200 K. Thus, it is concluded that Tung's lateral inhomogeneity model is the more appropriate way to interpret the temperature-dependent I-V characteristics of Schottky contacts.
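The Richardson-plot arithmetic underlying the quoted constants can be illustrated with a conventional (non-modified) plot; the diode parameters below are synthetic, chosen only to match the magnitudes quoted in the abstract, and Tung's patch correction is not included:

```python
import numpy as np

# Conventional Richardson-plot arithmetic (synthetic diode, not the
# measured Au/n-GaAs data): thermionic emission gives
#     J0 = A* T^2 exp(-Phi_b / kT),
# so ln(J0/T^2) versus 1/(kT) is a straight line with slope -Phi_b (eV)
# and intercept ln(A*).

K_B = 8.617e-5  # Boltzmann constant in eV/K

def richardson_fit(temps_k, j0):
    """Return (barrier height in eV, Richardson constant) from J0(T)."""
    x = 1.0 / (K_B * temps_k)
    slope, intercept = np.polyfit(x, np.log(j0 / temps_k**2), 1)
    return -slope, np.exp(intercept)

# Synthetic diode with Phi_b = 0.79 eV and A* = 8.16 A cm^-2 K^-2
T = np.linspace(200.0, 320.0, 7)
j0 = 8.16 * T**2 * np.exp(-0.79 / (K_B * T))
phi_b, a_star = richardson_fit(T, j0)
```

Tung's modified plot adds the (qσT)²(Vb/η)^(2/3)/[2(kT)²] correction term to the ordinate before the same linear fit, which is what restores the physically expected A* below 200 K.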

  19. Using the Signal Tools and Statistical Tools to Redefine the 24 Solar Terms in Peasant Calendar by Analyzing Surface Temperature and Precipitation

    Science.gov (United States)

    Huang, J. Y.; Tung, C. P.

    2017-12-01

There is an important book called the "Peasant Calendar" in Chinese society. The Peasant Calendar is originally based on the orbit of the Sun, and each year is divided into 24 solar terms. Each term has its own special meaning and conception. For example, "Spring Begins" means the end of winter and the beginning of spring. In Taiwan, the 24 solar terms play an important role in agriculture because farmers always use the Peasant Calendar to decide when to sow. However, each solar term in Taiwan is currently fixed at about 15 days. This does not capture the temporal variability of the climate, nor can it truly reflect the regional climate characteristics of different areas. The number of days in each solar term should be more flexible. Since weather is associated with climate, all weather phenomena can be regarded as a multiple-fluctuation signal. In this research, 30 years of observation data of surface temperature and precipitation from 1976 to 2016 are used. The data are cut into different time series, such as a week, a month, and six months to one year. Signal analysis tools such as wavelets, change-point analysis, and the Fourier transform are used to determine the length of each solar term. After determining the days of each solar term, statistical tests are used to find the relationships between the length of the solar terms and modes of climate variability (e.g., ENSO and PDO). For example, one of the solar terms, called "Major Heat", should typically last more than 20 days in Taiwan due to global warming and the heat island effect. The improved Peasant Calendar can help farmers make better decisions, control crop schedules, and use farmland more efficiently. For instance, warmer conditions can accelerate the accumulation of accumulated temperature, which is key to a crop's growth stage. The results can also be used for disaster reduction (e.g., preventing agricultural damage) and water resources projects.

  20. A Demonstration of an Improved Filtering Technique for Analyzing Climate Records via Comparisons of Satellite MSU/AMSU Instrument Temperature Products from Three Research Groups

    Science.gov (United States)

    Swanson, R. E.

    2017-12-01

    Climate data records typically exhibit considerable variation over short time scales, both from natural variability and from instrumentation issues. Linear least squares regression can provide overall trend information from noisy data; however, assessing intermediate time periods can also provide useful information unavailable from basic trend calculations. Extracting the short-term information in these data, for assessing changes to climate or for comparison of data series from different sources, requires the application of filters to separate short-period variations from longer-period trends. A common method used to smooth data is the moving average, a simple digital filter that can distort the resulting series due to aliasing of the sampling period into the output series. We utilized Hamming filters to compare MSU/AMSU satellite time series developed by three research groups (UAH, RSS and NOAA STAR), with the results published in January 2017 [http://journals.ametsoc.org/doi/abs/10.1175/JTECH-D-16-0121.1]. Since the last release date (July 2016) for the data analyzed in that paper, some of these groups have updated their analytical procedures, and additional months of data are available to extend the series. An updated analysis using the latest data releases available from each group is to be presented. Improved graphics will be employed to provide a clearer visualization of the differences between each group's results. As in the previous paper, the greatest difference between the UAH TMT series and those from the RSS and NOAA data appears during the early period of data from the MSU instruments, before about 2003, as shown in the attached figure, and preliminary results indicate this pattern continues. Also to be presented are other findings regarding seasonal changes which were not included in the previous study.
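The claim about moving-average distortion can be checked directly from the filters' frequency responses. A minimal numpy sketch (the window length M = 49 is an arbitrary choice, not taken from the paper) compares the worst-case stopband leakage of a normalized Hamming window with that of a flat moving average of the same length:

```python
import numpy as np

M = 49                                        # filter length in samples (assumed)
w_ham = np.hamming(M)
w_ham /= w_ham.sum()                          # tapered window, unity DC gain
w_ma = np.ones(M) / M                         # flat moving average

# Zero-padded FFT gives each filter's amplitude response on a fine grid.
freqs = np.fft.rfftfreq(1024)
H_ham = np.abs(np.fft.rfft(w_ham, 1024))
H_ma = np.abs(np.fft.rfft(w_ma, 1024))

# Worst-case leakage of short-period variation (f > 0.1 cycles/sample)
print(H_ham[freqs > 0.1].max(), H_ma[freqs > 0.1].max())
```

The flat window's sinc-shaped sidelobes let short-period variation leak into the smoothed series at several times the level of the tapered Hamming window, which is the distortion the abstract attributes to the moving average.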

  1. Analyses of Effects of Cutting Parameters on Cutting Edge Temperature Using Inverse Heat Conduction Technique

    Directory of Open Access Journals (Sweden)

    Marcelo Ribeiro dos Santos

    2014-01-01

    During machining, energy is transformed into heat due to plastic deformation of the workpiece surface and friction between tool and workpiece. High temperatures are generated in the region of the cutting edge, which have a very important influence on the wear rate of the cutting tool and on tool life. This work proposes the estimation of heat flux at the chip-tool interface using inverse techniques. Factors which influence the temperature distribution at the rake face of an AISI M32C high speed steel tool during machining of an ABNT 12L14 steel workpiece were also investigated. The temperature distribution was predicted using finite volumes. A transient 3D numerical code using an irregular and nonstaggered mesh was developed to solve the nonlinear heat diffusion equation. To validate the software, experimental tests were made. The inverse problem was solved using the function specification method. Heat fluxes at the tool-workpiece interface were estimated using inverse problem techniques and experimental temperatures. Tests were performed to study the effect of cutting parameters on cutting edge temperature. The results were compared with those of the tool-work thermocouple technique and fair agreement was obtained.
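The function specification method itself is more involved, but the core idea, estimating an unknown boundary heat flux from interior temperature measurements through a forward conduction model, can be sketched in one dimension. Everything below (geometry, material properties, sensor position, noise level) is invented for illustration and is not the authors' 3D finite-volume code:

```python
import numpy as np

def forward(q, n_steps=200, n=20, alpha=1e-5, dx=1e-3, dt=1e-2, k=50.0):
    """Explicit 1D conduction in a short rod: surface heat flux q (W/m^2)
    enters at x = 0, the far end is insulated.  Returns the temperature
    history at a sensor node 1 mm below the heated surface."""
    T = np.zeros(n)
    r = alpha * dt / dx**2            # 0.1 here; stability needs r <= 0.5
    hist = []
    for _ in range(n_steps):
        Tn = T.copy()
        Tn[1:-1] = T[1:-1] + r * (T[2:] - 2 * T[1:-1] + T[:-2])
        # Flux boundary via a ghost node: T[-1] = T[1] + 2*dx*q/k
        Tn[0] = T[0] + 2 * r * (T[1] - T[0]) + 2 * alpha * dt * q / (k * dx)
        Tn[-1] = T[-1] + 2 * r * (T[-2] - T[-1])   # insulated far end
        T = Tn
        hist.append(T[1])
    return np.array(hist)

# The model is linear in q, so one unit-flux run calibrates it and the
# least-squares flux estimate reduces to a projection onto that response.
true_q = 5000.0
rng = np.random.default_rng(1)
measured = forward(true_q) + rng.normal(0.0, 0.005, 200)  # noisy "thermocouple"
unit = forward(1.0)
q_hat = (unit @ measured) / (unit @ unit)
print(q_hat)  # recovers approximately 5000 W/m^2
```

The function specification method generalizes this to a time-varying flux estimated sequentially over short future windows; the linear-projection structure per window is the same.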

  2. Trace impurity analyzer

    International Nuclear Information System (INIS)

    Schneider, W.J.; Edwards, D. Jr.

    1979-01-01

    The desirability for long-term reliability of large-scale helium refrigerator systems used on superconducting accelerator magnets has necessitated detection of impurities at levels of a few ppm. An analyzer that measures trace impurity levels of condensable contaminants in concentrations of less than a ppm in 15 atm of He is described. The instrument makes use of the desorption temperature at an indicated pressure of the various impurities to determine the type of contaminant. The pressure rise at that temperature yields a measure of the contaminant level of the impurity. An LN2 cryogenic charcoal trap is also employed to measure air impurities (nitrogen and oxygen) to obtain the full range of contaminant possibilities. The results of this detector, which will be in use on the research and development helium refrigerator of the ISABELLE First-Cell, are described

  3. Properties of bioadhesive ketoprofen liquid suppositories: preparation, determination of gelation temperature, viscosity studies and evaluation of mechanical properties using texture analyzer by 4 × 4 factorial design.

    Science.gov (United States)

    Ozgüney, Işık; Kardhiqi, Anita

    2014-12-01

    Development and evaluation of thermosensitive and bioadhesive liquid suppositories containing ketoprofen (KP). This study was conducted to develop thermosensitive and bioadhesive liquid suppositories containing KP using poloxamer and different bioadhesive polymers, and to investigate their gelation temperature, viscosity and mechanical properties. Bioadhesive liquid suppositories were prepared by the cold method using poloxamer 407 (P 407), poloxamer 188 (P 188) and various amounts of different bioadhesive polymers. Their gelation temperatures, viscosity values and mechanical properties were determined using a texture analyzer by a 4 × 4 factorial design. It was seen that in the presence of KP, the gelation temperature of formulation P 407/P 188 (4/20%) significantly decreased from 64 to 37.1 °C. It is to be noted that the addition of increasing concentrations of bioadhesive polymers lowered the gelation temperature, and the decrease was greatest with the addition of Carbopol 934 P (C). Results of texture profile analysis (TPA) showed that formulations containing C have significantly higher hardness and adhesiveness values than other bioadhesive formulations. According to TPA, the gel structure of liquid suppository formulation F5, containing P 407/P 188/KP/C (4/20/2.5/0.8%), exhibited the greatest hardness, compressibility and adhesiveness, as well as the greatest viscosity. According to the mechanical properties and viscosity values, it was concluded that F5 could be a promising formulation.
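For readers unfamiliar with the design, a 4 × 4 full factorial simply crosses four levels of each factor and reads the main effects off the row and column means. The sketch below uses invented response values (not the paper's data) purely to show the bookkeeping:

```python
import numpy as np

# 4x4 full factorial: 4 levels of factor A (e.g. poloxamer %) crossed with
# 4 levels of factor B (e.g. bioadhesive polymer %).  The gelation-
# temperature responses below are invented placeholders.
response = np.array([[36.1, 35.2, 34.0, 33.1],
                     [35.4, 34.6, 33.5, 32.4],
                     [34.8, 33.9, 32.8, 31.9],
                     [34.1, 33.2, 32.1, 31.0]])   # deg C, hypothetical

# Main effect of each factor level = marginal mean minus grand mean.
main_a = response.mean(axis=1) - response.mean()   # rows: factor A levels
main_b = response.mean(axis=0) - response.mean()   # columns: factor B levels
print(main_a, main_b)
```

With real data one would also test the A×B interaction; in this toy table both main effects are monotonically negative, mirroring the abstract's finding that more bioadhesive polymer lowers the gelation temperature.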

  4. Error estimation and parameter dependence of the calculation of the fast ion distribution function, temperature, and density using data from the KF1 high energy neutral particle analyzer on Joint European Torus

    International Nuclear Information System (INIS)

    Schlatter, Christian; Testa, Duccio; Cecconello, Marco; Murari, Andrea; Santala, Marko

    2004-01-01

    The Joint European Torus high energy neutral particle analyzer measures the flux of fast neutrals originating from the plasma core. From this data, the fast ion distribution function f_i^fast, perpendicular temperature T_i,perp^fast, and density n_i^fast are derived using knowledge of various plasma parameters and of the cross sections for the relevant atomic processes. In this article, a systematic sensitivity study of the effect of uncertainties in these quantities on the evaluation of the neutral particle analyzer f_i^fast, T_i,perp^fast, and n_i^fast is reported. The dominant parameter affecting n_i^fast is the impurity confinement time, and therefore a reasonable estimate of this quantity is necessary to reduce the uncertainties in n_i^fast below 50%. On the other hand, T_i,perp^fast is much less sensitive and can certainly be provided with an accuracy of better than 10%

  5. Finite Element Modelling of a Pattern of Temperature Distribution during Travelling Heat Source from Oxyacetylene Flame

    Directory of Open Access Journals (Sweden)

    Alkali Adam Umar

    2014-07-01

    A 3D finite element model was developed to analyse the conduction temperature distribution in a type 304 stainless steel workpiece. An experimental heating-only test was conducted using the input parameters from the FEM model, which predicted the temperature field in the 304 stainless steel workpieces. Similar temperature patterns were observed in both the FEM model and the experiment. Conduction was observed to be the dominant heat transfer mode. Maximum temperatures occurred at the regions of contact between the flame heat and the workpieces. The maximum temperature attained during the two investigated runs was 355°C. Even so, the austenite crystal morphology was retained in the preheated workpiece.

  6. Thermodynamic study of residual heat from a high temperature nuclear reactor to analyze its viability in cogeneration processes

    Energy Technology Data Exchange (ETDEWEB)

    Santillan R, A.; Valle H, J.; Escalante, J. A., E-mail: santillanaura@gmail.com [Universidad Politecnica Metropolitana de Hidalgo, Boulevard acceso a Tolcayuca 1009, Ex-Hacienda San Javier, 43860 Tolcayuca, Hidalgo (Mexico)

    2015-09-15

    In this paper, a thermodynamic study of a high-temperature gas-turbine nuclear power plant (GTHTR300) is presented, estimating the exploitable waste heat for a seawater desalination process. Nuclear energy is one of the most studied and viable sustainable energy sources for the production of electricity without the emission of greenhouse gases. Fourth-generation nuclear power plants have greater advantages than currently installed plants; these advantages concern safety, increased efficiency, and the feasibility of coupling to cogeneration processes. The GTHTR300 plant type was selected for its higher efficiency and its optimal conditions for use in cogeneration, owing to high operating temperatures between 700 and 950 degrees Celsius. The aim of the study is to determine the heat losses and the work done at each stage of the system, locating where the greatest losses occur and analyzing in which processes they can be exploited. The study showed that most of the energy losses occur as heat in the coolers, which is usually emitted into the atmosphere without being used. From the results, a seawater desalination process is proposed as the cogeneration process. This paper contains a brief description of the operation of the nuclear power plant, focusing on operating conditions and thermodynamic characteristics relevant to implementing the cogeneration process; a thermodynamic analysis based on mass and energy balances was developed. The results allow quantifying the thermal energy losses and determining the optimal section for coupling the reactor to the desalination process, seeking a high overall efficiency. (Author)

  7. Project ATLANTA (Atlanta Land use Analysis: Temperature and Air Quality): Use of Remote Sensing and Modeling to Analyze How Urban Land Use Change Affects Meteorology and Air Quality Through Time

    Science.gov (United States)

    Quattrochi, Dale A.; Luvall, Jeffrey C.; Estes, Maurice G., Jr.

    1999-01-01

    This paper presents an overview of Project ATLANTA (ATlanta Land use ANalysis: Temperature and Air-quality), an investigation that seeks to observe, measure, model, and analyze how the rapid growth of the Atlanta, Georgia metropolitan area since the early 1970s has impacted the region's climate and air quality. The primary objectives for this research effort are: (1) To investigate and model the relationships between land cover change in the Atlanta metropolitan area and the development of the urban heat island phenomenon through time; (2) To investigate and model the temporal relationships between Atlanta urban growth and land cover change on air quality; and (3) To model the overall effects of urban development on surface energy budget characteristics across the Atlanta urban landscape through time. Our key goal is to derive a better scientific understanding of how land cover changes associated with urbanization in the Atlanta area, principally the transformation of forest lands to urban land covers through time, have affected, and will affect, local and regional climate, surface energy flux, and air quality characteristics. Allied with this goal is the prospect that the results from this research can be applied by urban planners, environmental managers and other decision-makers to determine how urbanization has impacted the climate and overall environment of the Atlanta area. Multiscaled remote sensing data, particularly high resolution thermal infrared data, are integral to this study for the analysis of thermal energy fluxes across the Atlanta urban landscape.

  8. System and method of adjusting the equilibrium temperature of an inductively-heated susceptor

    Science.gov (United States)

    Matsen, Marc R; Negley, Mark A; Geren, William Preston

    2015-02-24

    A system for inductively heating a workpiece may include an induction coil, at least one susceptor face sheet, and a current controller. The induction coil may be configured to conduct an alternating current and generate a magnetic field in response to the alternating current. The susceptor face sheet may be configured to have a workpiece positioned therewith. The susceptor face sheet may be formed of a ferromagnetic alloy having a Curie temperature and being inductively heatable to an equilibrium temperature approaching the Curie temperature in response to the magnetic field. The current controller may be coupled to the induction coil and may be configured to adjust the alternating current in a manner causing a change in at least one heating parameter of the susceptor face sheet.

  9. Optimization of on-line hydrogen stable isotope ratio measurements of halogen- and sulfur-bearing organic compounds using elemental analyzer-chromium/high-temperature conversion isotope ratio mass spectrometry (EA-Cr/HTC-IRMS).

    Science.gov (United States)

    Gehre, Matthias; Renpenning, Julian; Geilmann, Heike; Qi, Haiping; Coplen, Tyler B; Kümmel, Steffen; Ivdra, Natalija; Brand, Willi A; Schimmelmann, Arndt

    2017-03-30

    Accurate hydrogen isotopic analysis of halogen- and sulfur-bearing organics has not been possible with traditional high-temperature conversion (HTC) because the formation of hydrogen-bearing reaction products other than molecular hydrogen (H2) is responsible for non-quantitative H2 yields and possible hydrogen isotopic fractionation. Our previously introduced, new chromium-based EA-Cr/HTC-IRMS (Elemental Analyzer-Chromium/High-Temperature Conversion Isotope Ratio Mass Spectrometry) technique focused primarily on nitrogen-bearing compounds. Several technical and analytical issues concerning halogen- and sulfur-bearing samples, however, remained unresolved and required further refinement of the reactor systems. The EA-Cr/HTC reactor was substantially modified for the conversion of halogen- and sulfur-bearing samples. The performance of the novel conversion setup for solid and liquid samples was monitored and optimized using a simultaneously operating dual-detection system of IRMS and ion trap MS. The method with several variants in the reactor, including the addition of manganese metal chips, was evaluated in three laboratories using EA-Cr/HTC-IRMS (on-line method) and compared with traditional uranium-reduction-based conversion combined with manual dual-inlet IRMS analysis (off-line method) in one laboratory. The modified EA-Cr/HTC reactor setup showed an overall H2 recovery of more than 96% for all halogen- and sulfur-bearing organic compounds. All results were successfully normalized via two-point calibration with VSMOW-SLAP reference waters. Precise and accurate hydrogen isotopic analysis was achieved for a variety of organics containing F-, Cl-, Br-, I-, and S-bearing heteroelements. The robust nature of the on-line EA-Cr/HTC technique was demonstrated by a series of 196 consecutive measurements with a single reactor filling. The optimized EA-Cr/HTC reactor design can be implemented in existing analytical equipment using commercially available material and
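The two-point VSMOW-SLAP calibration mentioned in the abstract is a simple linear mapping onto the delta scale. A minimal sketch (the raw instrument readings are invented; the reference values are the defined delta-2H scale values):

```python
# Two-point normalization of measured delta-2H values to the VSMOW-SLAP
# scale.  VSMOW and SLAP carry defined values of 0 and -427.5 per mil.
VSMOW, SLAP = 0.0, -427.5

def normalize(delta_raw, raw_vsmow, raw_slap):
    """Map a raw instrument delta value onto the VSMOW-SLAP scale using
    the straight line through the two measured reference waters."""
    slope = (SLAP - VSMOW) / (raw_slap - raw_vsmow)
    return VSMOW + slope * (delta_raw - raw_vsmow)

# e.g. an instrument that read the references at +2.0 and -420.0 per mil:
print(normalize(-100.0, 2.0, -420.0))
```

By construction the two reference waters map exactly onto their defined values, and all sample measurements are stretched onto the same scale.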

  10. Web server attack analyzer

    OpenAIRE

    Mižišin, Michal

    2013-01-01

    Web server attack analyzer - Abstract The goal of this work was to create a prototype analyzer of injection-flaw attacks on a web server. The proposed solution combines the capabilities of a web application firewall and a web server log analyzer. Analysis is based on configurable signatures defined by regular expressions. This paper begins with a summary of web attacks, followed by an analysis of detection techniques on web servers, and a description and justification of the selected implementation. In the end are charact...

  11. Nuclear power plant analyzer

    International Nuclear Information System (INIS)

    Stritar, A.

    1986-01-01

    The development of Nuclear Power Plant Analyzers in the USA is described. There are two different types of Analyzers under development in the USA: the first at the Idaho and Los Alamos National Laboratories, the second at Brookhaven National Laboratory. The latter is described in detail. The computer hardware and the mathematical models of the reactor vessel thermal hydraulics are described. (author)

  12. Analyzing Peace Pedagogies

    Science.gov (United States)

    Haavelsrud, Magnus; Stenberg, Oddbjorn

    2012-01-01

    Eleven articles on peace education published in the first volume of the Journal of Peace Education are analyzed. This selection comprises peace education programs that have been planned or carried out in different contexts. In analyzing peace pedagogies as proposed in the 11 contributions, we have chosen network analysis as our method--enabling…

  13. Analyzing in the present

    DEFF Research Database (Denmark)

    Revsbæk, Line; Tanggaard, Lene

    2015-01-01

    The article presents a notion of “analyzing in the present” as a source of inspiration in analyzing qualitative research materials. The term emerged from extensive listening to interview recordings during everyday commuting to university campus. Paying attention to the way different parts of vari...

  14. Gearbox vibration diagnostic analyzer

    Science.gov (United States)

    1992-01-01

    This report describes the Gearbox Vibration Diagnostic Analyzer installed in the NASA Lewis Research Center's 500 HP Helicopter Transmission Test Stand to monitor gearbox testing. The vibration of the gearbox is analyzed using diagnostic algorithms to calculate a parameter indicating damaged components.

  15. Investigations on the micro-scale surface interactions at the tool and workpiece interface in micro-manufacturing of bipolar plates for proton exchange membrane fuel cells

    Science.gov (United States)

    Peker, Mevlut Fatih

    Micro-forming studies have become more attractive in recent years because of the miniaturization trend. One of the promising metal forming processes, micro-stamping, provides durability, strength, surface finish, and low cost for metal products. Hence, it is considered a prominent method for fabricating bipolar plates (BPP) with micro-channel arrays on large metallic surfaces to be used in Proton Exchange Membrane Fuel Cells (PEMFC). Major concerns in micro-stamping of high volume BPPs are the surface interactions between micro-stamping dies and blank metal plates, and tribological changes. These concerns play a critical role in determining the surface quality, channel formation, and dimensional precision of bipolar plates. The surface quality of a BPP is highly dependent on the micro-stamping die surface and process conditions, due to large ratios of surface area to volume (size effect) that cause an increased level of friction and wear at the contact interface. Due to the high volume and fast production rates, BPP surface characteristics such as surface roughness, hardness, and stiffness may change because of repeated interactions between tool (micro-forming die) and workpiece (sheet blank of interest). Since the surface characteristics of BPPs have a strong effect on corrosion and contact resistance of bipolar plates, and consequently overall fuel cell performance, the evolution of surface characteristics at the tool and workpiece should be monitored, controlled, and kept in acceptable ranges throughout long production cycles to maintain surface quality. Compared to macro-forming operations, tribological changes in micro-forming processes are greater challenges due to their dominance and criticality. Therefore, the tribological size effect should be considered for a better understanding of tribological changes at the micro-scale. The agreement of process simulation with the experiments, on the other hand, is essential. 
This study describes an approach that aims to investigate

  16. Miniature mass analyzer

    CERN Document Server

    Cuna, C; Lupsa, N; Cuna, S; Tuzson, B

    2003-01-01

    The paper presents the concept of different mass analyzers that were specifically designed as small-dimension instruments able to detect the main environmental pollutants with great sensitivity and accuracy. Mass spectrometers are well-suited instruments for the chemical and isotopic analysis needed in environmental surveillance. Usually, this is done by sampling the soil, air or water followed by laboratory analysis. To avoid drawbacks caused by sample alteration during the sampling process and transport, 'in situ' analysis is preferred. Theoretically, any type of mass analyzer can be miniaturized, but some are more appropriate than others. Quadrupole mass filter and trap, magnetic sector, time-of-flight and ion cyclotron mass analyzers can be successfully shrunk; for each of them some performance is sacrificed, but we must know which parameters are necessary to keep unchanged. To satisfy the miniaturization criteria of the analyzer, it is necessary to use asymmetrical geometries, with ion beam obl...

  17. Extraction spectrophotometric analyzer

    International Nuclear Information System (INIS)

    Batik, J.; Vitha, F.

    1985-01-01

    Automation is discussed of extraction spectrophotometric determination of uranium in a solution. Uranium is extracted from accompanying elements in an HCl medium with a solution of tributyl phosphate in benzene. The determination is performed by measuring absorbance at 655 nm in a single-phase ethanol-water-benzene-tributyl phosphate medium. The design is described of an analyzer consisting of an analytical unit and a control unit. The analyzer performance promises increased productivity of labour, improved operating and hygiene conditions, and mainly more accurate results of analyses. (J.C.)

  18. American options analyzed differently

    NARCIS (Netherlands)

    Nieuwenhuis, J.W.

    2003-01-01

    In this note we analyze in a discrete-time context and with a finite outcome space American options starting with the idea that every tradable should be a martingale under a certain measure. We believe that in this way American options become more understandable to people with a good working

  19. Analyzing Political Television Advertisements.

    Science.gov (United States)

    Burson, George

    1992-01-01

    Presents a lesson plan to help students understand that political advertisements often mislead, lie, or appeal to emotion. Suggests that the lesson will enable students to examine political advertisements analytically. Includes a worksheet to be used by students to analyze individual political advertisements. (DK)

  20. Centrifugal analyzer development

    International Nuclear Information System (INIS)

    Burtis, C.A.; Bauer, M.L.; Bostick, W.D.

    1976-01-01

    The development of the centrifuge fast analyzer (CFA) is reviewed. The development of a miniature CFA with computer data analysis is reported and applications for automated diagnostic chemical and hematological assays are discussed. A portable CFA system with microprocessor was adapted for field assays of air and water samples for environmental pollutants, including ammonia, nitrates, nitrites, phosphates, sulfates, and silica. 83 references

  1. Soft Decision Analyzer

    Science.gov (United States)

    Lansdowne, Chatwin; Steele, Glen; Zucha, Joan; Schlesinger, Adam

    2013-01-01

    We describe the benefit of using closed-loop measurements for a radio receiver paired with a counterpart transmitter. We show that real-time analysis of the soft decision output of a receiver can provide rich and relevant insight far beyond the traditional hard-decision bit error rate (BER) test statistic. We describe a Soft Decision Analyzer (SDA) implementation for closed-loop measurements on single- or dual- (orthogonal) channel serial data communication links. The analyzer has been used to identify, quantify, and prioritize contributors to implementation loss in live-time during the development of software defined radios. This test technique gains importance as modern receivers are providing soft decision symbol synchronization as radio links are challenged to push more data and more protocol overhead through noisier channels, and software-defined radios (SDRs) use error-correction codes that approach Shannon's theoretical limit of performance.
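Why soft decisions carry more information than a hard-decision BER alone can be illustrated with a toy BPSK-over-AWGN simulation (this is an illustration of the principle, not the SDA itself; the noise level and sample count are arbitrary). The soft outputs let one estimate the operating margin directly, while the BER is only an error count:

```python
import numpy as np

rng = np.random.default_rng(42)
bits = rng.integers(0, 2, 100_000)
symbols = 2.0 * bits - 1.0                  # BPSK mapping: 0 -> -1, 1 -> +1
noise_sigma = 0.5                            # channel noise level (assumed)
soft = symbols + rng.normal(0.0, noise_sigma, bits.size)   # soft decisions

# Hard-decision statistic: just a bit error rate.
hard = (soft > 0).astype(int)
ber = np.mean(hard != bits)

# Soft-decision statistic: separation of the two decision clouds in units
# of their spread, an SNR-like margin the BER alone does not reveal.
margin = abs(soft[bits == 1].mean() - soft[bits == 0].mean()) / (2 * soft[bits == 1].std())
print(ber, margin)
```

At low error counts the BER becomes statistically useless long before the soft-decision cloud statistics do, which is the practical motivation the abstract gives for closed-loop soft-decision analysis.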

  2. KWU Nuclear Plant Analyzer

    International Nuclear Information System (INIS)

    Bennewitz, F.; Hummel, R.; Oelmann, K.

    1986-01-01

    The KWU Nuclear Plant Analyzer is a real-time engineering simulator based on the KWU computer programs used in plant transient analysis and licensing. The primary goal is to promote the understanding of the technical and physical processes of a nuclear power plant at an on-site training facility. Thus the KWU Nuclear Plant Analyzer is available at comparably low cost right at the time when technical questions or training needs arise. This has been achieved by (1) application of the transient code NLOOP; (2) unrestricted operator interaction including all simulator functions; (3) using the mainframe computer Control Data Cyber 176 in the KWU computing center; (4) four color graphic displays controlled by a dedicated graphic computer, with no control room equipment; and (5) coupling of computers by telecommunication via telephone

  3. Analyzed Using Statistical Moments

    International Nuclear Information System (INIS)

    Oltulu, O.

    2004-01-01

    Diffraction enhanced imaging (DEI) is a new x-ray imaging method derived from radiography. The method uses a monochromatic x-ray beam and introduces an analyzer crystal between the object and the detector. The narrow angular acceptance of the analyzer crystal generates improved contrast over conventional radiography. While standard radiography can produce an 'absorption image', DEI produces 'apparent absorption' and 'apparent refraction' images of superior quality. Objects with similar absorption properties may not be distinguished with conventional techniques due to close absorption coefficients. This problem becomes more dominant when an object has scattering properties. A simple approach is introduced to utilize scattered radiation to obtain 'pure absorption' and 'pure refraction' images

  4. Emission spectrometric isotope analyzer

    International Nuclear Information System (INIS)

    Mauersberger, K.; Meier, G.; Nitschke, W.; Rose, W.; Schmidt, G.; Rahm, N.; Andrae, G.; Krieg, D.; Kuefner, W.; Tamme, G.; Wichlacz, D.

    1982-01-01

    An emission spectrometric isotope analyzer has been designed for determining relative abundances of stable isotopes in gaseous samples in discharge tubes, in liquid samples, and in flowing gaseous samples. It consists of a high-frequency generator, a device for defined positioning of discharge tubes, a grating monochromator with oscillating slit and signal converter, signal generator, window discriminator, AND connection, read-out display, oscillograph, gas dosing device and chemical conversion system with carrier gas source and vacuum pump

  5. PhosphoSiteAnalyzer

    DEFF Research Database (Denmark)

    Bennetzen, Martin V; Cox, Jürgen; Mann, Matthias

    2012-01-01

    Phosphoproteomic experiments are routinely conducted in laboratories worldwide, and because of the fast development of mass spectrometric techniques and efficient phosphopeptide enrichment methods, researchers frequently end up having lists with tens of thousands of phosphorylation sites. PhosphoSiteAnalyzer provides an algorithm to retrieve kinase predictions from the public NetworKIN webpage in a semiautomated way and hereafter applies advanced statistics to facilitate a user-tailored in-depth analysis of the phosphoproteomic data sets. The interface of the software provides a high degree of analytical flexibility and is designed to be intuitive for most users. PhosphoSiteAnalyzer is a freeware program available at http://phosphosite.sourceforge.net .

  6. Analyzing Chinese Financial Reporting

    Institute of Scientific and Technical Information of China (English)

    SABRINA; ZHANG

    2008-01-01

    If the world's capital markets could use a harmonized accounting framework, it would not be necessary to compare two or more sets of accounting standards. However, there is much to do before this becomes reality. This article aims to present a general overview of China's Generally Accepted Accounting Principles (GAAP), U.S. Generally Accepted Accounting Principles and International Financial Reporting Standards (IFRS), and to analyze the differences among IFRS, U.S. GAAP and China GAAP using fixed assets as an example.

  7. Inductive dielectric analyzer

    International Nuclear Information System (INIS)

    Agranovich, Daniel; Popov, Ivan; Ben Ishai, Paul; Feldman, Yuri; Polygalov, Eugene

    2017-01-01

    One of the approaches to bypass the problem of electrode polarization in dielectric measurements is the free electrode method. The advantage of this technique is that the probing electric field in the material is not supplied by contact electrodes, but rather by electromagnetic induction. We have designed an inductive dielectric analyzer based on a sensor comprising two concentric toroidal coils. In this work, we present an analytic derivation of the relationship between the impedance measured by the sensor and the complex dielectric permittivity of the sample. The obtained relationship was successfully employed to measure the dielectric permittivity and conductivity of various alcohols and aqueous salt solutions. (paper)

  8. Plutonium solution analyzer

    International Nuclear Information System (INIS)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly, and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded)

  9. Multiple capillary biochemical analyzer

    Science.gov (United States)

    Dovichi, N.J.; Zhang, J.Z.

    1995-08-08

    A multiple capillary analyzer allows detection of light from multiple capillaries with a reduced number of interfaces through which light must pass in detecting light emitted from a sample being analyzed, using a modified sheath flow cuvette. A linear or rectangular array of capillaries is introduced into a rectangular flow chamber. Sheath fluid draws individual sample streams through the cuvette. The capillaries are closely and evenly spaced and held by a transparent retainer in a fixed position in relation to an optical detection system. Collimated sample excitation radiation is applied simultaneously across the ends of the capillaries in the retainer. Light emitted from the excited sample is detected by the optical detection system. The retainer is provided by a transparent chamber having inward-slanting end walls. The capillaries are wedged into the chamber. One sideways dimension of the chamber is equal to the diameter of the capillaries, and one end-to-end dimension varies from, at the top of the chamber, slightly greater than the sum of the diameters of the capillaries to, at the bottom of the chamber, slightly smaller than the sum of the diameters of the capillaries. The optical system utilizes optic fibers to deliver light to individual photodetectors, one for each capillary tube. A filter or wavelength division demultiplexer may be used for isolating fluorescence at particular bands. 21 figs.

  10. Plutonium solution analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated); it is instead handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is user-friendly, and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded).

  11. Analyzing Water's Optical Absorption

    Science.gov (United States)

    2002-01-01

    A cooperative agreement between World Precision Instruments (WPI), Inc., and Stennis Space Center has led to the UltraPath(TM) device, which provides a more efficient method for analyzing the optical absorption of water samples at sea. UltraPath is a unique, high-performance absorbance spectrophotometer with user-selectable light path lengths. It is an ideal tool for any study requiring precise and highly sensitive spectroscopic determination of analytes, either in the laboratory or the field. As a low-cost, rugged, and portable system capable of high-sensitivity measurements in widely divergent waters, UltraPath will help scientists examine the role that coastal ocean environments play in the global carbon cycle. UltraPath(TM) is a trademark of World Precision Instruments, Inc. LWCC(TM) is a trademark of World Precision Instruments, Inc.

  12. PDA: Pooled DNA analyzer

    Directory of Open Access Journals (Sweden)

    Lin Chin-Yu

    2006-04-01

    Full Text Available Abstract Background Association mapping using abundant single nucleotide polymorphisms (SNPs) is a powerful tool for identifying disease susceptibility genes for complex traits and exploring possible genetic diversity. Genotyping large numbers of SNPs individually is performed routinely but is cost-prohibitive for large-scale genetic studies. DNA pooling is a reliable and cost-saving alternative genotyping method. However, no software has been developed for complete pooled-DNA analyses, including data standardization, allele frequency estimation, and single/multipoint DNA pooling association tests. This motivated the development of the software 'PDA' (Pooled DNA Analyzer), to analyze pooled DNA data. Results We develop the software, PDA, for the analysis of pooled-DNA data. PDA is originally implemented in the MATLAB® language, but it can also be executed on a Windows system without installing MATLAB®. PDA provides estimates of the coefficient of preferential amplification and allele frequency. PDA considers an extended single-point association test, which can compare allele frequencies between two DNA pools constructed under different experimental conditions. Moreover, PDA also provides novel chromosome-wide multipoint association tests based on p-value combinations and a sliding-window concept. This new multipoint testing procedure overcomes a computational bottleneck of conventional haplotype-oriented multipoint methods in DNA pooling analyses and can handle data sets having a large pool size and/or large numbers of polymorphic markers. All of the PDA functions are illustrated in four bona fide examples. Conclusion PDA is simple to operate and does not require that users have a strong statistical background. The software is available at http://www.ibms.sinica.edu.tw/%7Ecsjfann/first%20flow/pda.htm.
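    The p-value-combination, sliding-window idea can be sketched with Fisher's method combined over windows of adjacent markers. This is a minimal illustration, not PDA's actual statistics, and `fisher_combine`/`sliding_window_scan` are hypothetical names:

```python
import math

def fisher_combine(pvals):
    """Fisher's method: X = -2 * sum(ln p) ~ chi-square with 2k dof under H0."""
    k = len(pvals)
    x = -2.0 * sum(math.log(p) for p in pvals)
    # Chi-square survival function for even dof 2k has the closed form
    # P(X > x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!
    half = x / 2.0
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= half / i
        total += term
    return math.exp(-half) * total

def sliding_window_scan(pvals, window):
    """Combined p-value for each window of adjacent markers along a chromosome."""
    return [fisher_combine(pvals[i:i + window])
            for i in range(len(pvals) - window + 1)]
```

Because the scan works on single-marker p-values rather than haplotypes, its cost grows linearly with the number of markers, which is the bottleneck-avoidance property the abstract describes.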

  13. A neutron activation analyzer

    International Nuclear Information System (INIS)

    Westphal, G.P.; Lemmel, H.; Grass, F.; De Regge, P.P.; Burns, K.; Markowicz, A.

    2005-01-01

    Dubbed 'Analyzer' because of its simplicity, a neutron activation analysis facility for short-lived isomeric transitions is based on a low-cost rabbit system and an adaptive digital filter, which are controlled by software performing irradiation control, loss-free gamma-spectrometry, spectrum evaluation, nuclide identification and calculation of concentrations in a fully automatic flow of operations. Designed for TRIGA reactors and constructed from inexpensive plastic tubing and an aluminum in-core part, the rabbit system features samples of 5 ml and 10 ml with sample separation at 150 ms and 200 ms transport time, or 25 ml samples without separation at a transport time of 300 ms. By automatically adapting shaping times to pulse intervals, the preloaded digital filter gives best throughput at best resolution up to input counting rates of 10^6 cps. Loss-free counting enables quantitative correction of counting losses of up to 99%. As a test of system reproducibility in sample separation geometry, K, Cl, Mn, Mg, Ca, Sc, and V have been determined in various reference materials in excellent agreement with consensus values. (author)
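    Real loss-free counting applies virtual-pulse weighting during acquisition; the first-order effect of the correction can nevertheless be sketched as a simple rescaling. The function below is a hypothetical illustration, not the Analyzer's algorithm:

```python
def correct_losses(observed_counts, loss_fraction):
    """Scale observed counts to compensate for dead-time losses.

    A loss fraction of 0.99 means only 1% of true events were recorded,
    so the observed count is scaled up by 1 / (1 - loss_fraction).
    """
    if not 0.0 <= loss_fraction < 1.0:
        raise ValueError("loss fraction must be in [0, 1)")
    return observed_counts / (1.0 - loss_fraction)
```

At the quoted 99% loss limit, 100 recorded counts correspond to roughly 10,000 true events, which shows why a quantitative (rather than approximate) correction is essential at such rates.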

  14. Downhole Fluid Analyzer Development

    Energy Technology Data Exchange (ETDEWEB)

    Bill Turner

    2006-11-28

    A novel fiber optic downhole fluid analyzer has been developed for operation in production wells. This device will allow real-time determination of the oil, gas and water fractions of fluids from different zones in a multizone or multilateral completion environment. The device uses near infrared spectroscopy and induced fluorescence measurement to unambiguously determine the oil, water and gas concentrations at all but the highest water cuts. The only downhole components of the system are the fiber optic cable and windows. All of the active components--light sources, sensors, detection electronics and software--will be located at the surface, and will be able to operate multiple downhole probes. Laboratory testing has demonstrated that the sensor can accurately determine oil, water and gas fractions with a less than 5 percent standard error. Once installed in an intelligent completion, this sensor will give the operating company timely information about the fluids arising from various zones or multilaterals in a complex completion pattern, allowing informed decisions to be made on controlling production. The research and development tasks are discussed along with a market analysis.

  15. Climate Model Diagnostic Analyzer

    Science.gov (United States)

    Lee, Seungwon; Pan, Lei; Zhai, Chengxing; Tang, Benyang; Kubar, Terry; Zhang, Zia; Wang, Wei

    2015-01-01

    The comprehensive and innovative evaluation of climate models with newly available global observations is critically needed for the improvement of climate model current-state representation and future-state predictability. A climate model diagnostic evaluation process requires physics-based multi-variable analyses that typically involve large-volume and heterogeneous datasets, making them both computation- and data-intensive. With an exploratory nature of climate data analyses and an explosive growth of datasets and service tools, scientists are struggling to keep track of their datasets, tools, and execution/study history, let alone sharing them with others. In response, we have developed a cloud-enabled, provenance-supported, web-service system called Climate Model Diagnostic Analyzer (CMDA). CMDA enables the physics-based, multivariable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. At the same time, CMDA provides a crowd-sourcing space where scientists can organize their work efficiently and share their work with others. CMDA is empowered by many current state-of-the-art software packages in web service, provenance, and semantic search.

  16. Analyzing Visibility Configurations.

    Science.gov (United States)

    Dachsbacher, C

    2011-04-01

    Many algorithms, such as level-of-detail rendering and occlusion culling methods, make decisions based on the degree of visibility of an object, but do not analyze the distribution, or structure, of the visible and occluded regions across surfaces. We present an efficient method to classify different visibility configurations and show how this can be used on top of existing methods based on visibility determination. We adapt co-occurrence matrices for visibility analysis and generalize them to operate on clusters of triangular surfaces instead of pixels. We employ machine learning techniques to reliably classify the extracted feature vectors. Our method allows perceptually motivated level-of-detail methods for real-time rendering applications by detecting configurations with expected visual masking. We exemplify the versatility of our method with an analysis of area light visibility configurations in ray tracing and an area-to-area visibility analysis suitable for hierarchical radiosity refinement. Initial results demonstrate the robustness, simplicity, and performance of our method in synthetic scenes, as well as real applications.
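    Co-occurrence matrices are classically built over pairs of pixels; the paper generalizes them to clusters of triangles. A minimal pixel-style sketch over a binary visibility grid conveys the construction (the `cooccurrence` helper is hypothetical, not the authors' code):

```python
from collections import Counter

def cooccurrence(vis_grid):
    """2x2 co-occurrence matrix of occluded(0)/visible(1) labels
    for horizontally adjacent cells of a visibility grid.

    Entry [a][b] counts how often label a is immediately followed by
    label b; the distribution of entries characterizes the *structure*
    of visibility, not just its overall degree.
    """
    counts = Counter()
    for row in vis_grid:
        for a, b in zip(row, row[1:]):
            counts[(a, b)] += 1
    return [[counts[(0, 0)], counts[(0, 1)]],
            [counts[(1, 0)], counts[(1, 1)]]]
```

Two surfaces with the same fraction of visible cells but different spatial patterns (one large occluded region versus many scattered ones) yield different matrices, which is exactly the distinction degree-of-visibility methods miss.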

  17. Finite Element Simulation of Temperature and Strain Distribution during Friction Stir Welding of AA2024 Aluminum Alloy

    Science.gov (United States)

    Jain, Rahul; Pal, Surjya Kanta; Singh, Shiv Brat

    2017-02-01

    Friction Stir Welding (FSW) is a solid-state joining process and is handy for welding aluminum alloys. The Finite Element Method (FEM) is an important tool for predicting the state variables of the process, but numerical simulation of FSW is highly complex due to non-linear contact interactions between tool and workpiece and the interdependency of displacement and temperature. In the present work, a three-dimensional coupled thermo-mechanical method based on a Lagrangian implicit formulation is proposed to study the thermal history, strain distribution and thermo-mechanical behavior in butt welding of aluminum alloy 2024 using DEFORM-3D software. The workpiece is defined as a rigid-viscoplastic material, and a sticking condition between tool and workpiece is assumed. Adaptive re-meshing is used to tackle high mesh distortion. The effect of tool rotational and welding speed on plastic strain is studied, and insight is given into the asymmetric nature of the FSW process. Temperature distribution on the workpiece and tool is predicted, and the maximum temperature is found on the workpiece top surface.

  18. Digital Microfluidics Sample Analyzer

    Science.gov (United States)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs for patients such as premature infants, for whom drawing of blood for continuous tests can be life-threatening in its own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a., lab-on-a-printed-circuit-board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system including the chip that performs many of the routine operations of a central lab-based chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  19. Thermocouple and Infrared Sensor-Based Measurement of Temperature Distribution in Metal Cutting

    Directory of Open Access Journals (Sweden)

    Abdil Kus

    2015-01-01

    Full Text Available In metal cutting, the magnitude of the temperature at the tool-chip interface is a function of the cutting parameters. This temperature directly affects production; therefore, increased research on the role of cutting temperatures can lead to improved machining operations. In this study, tool temperature was estimated by simultaneous temperature measurement employing both a K-type thermocouple and an infrared radiation (IR) pyrometer to measure the tool-chip interface temperature. Due to the complexity of the machining processes, the integration of different measuring techniques was necessary in order to obtain consistent temperature data. The thermal analysis results were compared via the ANSYS finite element method. Experiments were carried out in dry machining using workpiece material of AISI 4140 alloy steel that was heat treated by an induction process to a hardness of 50 HRC. A PVD TiAlN-TiN-coated WNVG 080404-IC907 carbide insert was used during the turning process. The results showed that with increasing cutting speed, feed rate and depth of cut, the tool temperature increased; the cutting speed was found to be the most effective parameter in assessing the temperature rise. The heat distribution of the cutting tool, tool-chip interface and workpiece provided effective and useful data for the optimization of selected cutting parameters during orthogonal machining.

  20. Thermocouple and infrared sensor-based measurement of temperature distribution in metal cutting.

    Science.gov (United States)

    Kus, Abdil; Isik, Yahya; Cakir, M Cemal; Coşkun, Salih; Özdemir, Kadir

    2015-01-12

    In metal cutting, the magnitude of the temperature at the tool-chip interface is a function of the cutting parameters. This temperature directly affects production; therefore, increased research on the role of cutting temperatures can lead to improved machining operations. In this study, tool temperature was estimated by simultaneous temperature measurement employing both a K-type thermocouple and an infrared radiation (IR) pyrometer to measure the tool-chip interface temperature. Due to the complexity of the machining processes, the integration of different measuring techniques was necessary in order to obtain consistent temperature data. The thermal analysis results were compared via the ANSYS finite element method. Experiments were carried out in dry machining using workpiece material of AISI 4140 alloy steel that was heat treated by an induction process to a hardness of 50 HRC. A PVD TiAlN-TiN-coated WNVG 080404-IC907 carbide insert was used during the turning process. The results showed that with increasing cutting speed, feed rate and depth of cut, the tool temperature increased; the cutting speed was found to be the most effective parameter in assessing the temperature rise. The heat distribution of the cutting tool, tool-chip interface and workpiece provided effective and useful data for the optimization of selected cutting parameters during orthogonal machining.

  1. Thermocouple and Infrared Sensor-Based Measurement of Temperature Distribution in Metal Cutting

    Science.gov (United States)

    Kus, Abdil; Isik, Yahya; Cakir, M. Cemal; Coşkun, Salih; Özdemir, Kadir

    2015-01-01

    In metal cutting, the magnitude of the temperature at the tool-chip interface is a function of the cutting parameters. This temperature directly affects production; therefore, increased research on the role of cutting temperatures can lead to improved machining operations. In this study, tool temperature was estimated by simultaneous temperature measurement employing both a K-type thermocouple and an infrared radiation (IR) pyrometer to measure the tool-chip interface temperature. Due to the complexity of the machining processes, the integration of different measuring techniques was necessary in order to obtain consistent temperature data. The thermal analysis results were compared via the ANSYS finite element method. Experiments were carried out in dry machining using workpiece material of AISI 4140 alloy steel that was heat treated by an induction process to a hardness of 50 HRC. A PVD TiAlN-TiN-coated WNVG 080404-IC907 carbide insert was used during the turning process. The results showed that with increasing cutting speed, feed rate and depth of cut, the tool temperature increased; the cutting speed was found to be the most effective parameter in assessing the temperature rise. The heat distribution of the cutting tool, tool-chip interface and workpiece provided effective and useful data for the optimization of selected cutting parameters during orthogonal machining. PMID:25587976

  2. SINDA, Systems Improved Numerical Differencing Analyzer

    Science.gov (United States)

    Fink, L. C.; Pan, H. M. Y.; Ishimoto, T.

    1972-01-01

    A computer program has been written to analyze a group of 100-node areas and then provide for summation of any number of 100-node areas to obtain a temperature profile. SINDA program options offer the user a variety of methods for solution of thermal analog models presented in network format.
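    SINDA solves thermal analog models given in network format: nodes with heat capacities connected by conductors. A minimal explicit-update sketch of such a lumped network conveys the idea (this is not SINDA's actual solver, and `step_network` is a hypothetical name):

```python
def step_network(temps, conductances, capacitances, dt):
    """One explicit time step of a lumped thermal network.

    temps: temperature per node [K or degC]
    conductances: dict mapping node pair (i, j) -> conductance G [W/K]
    capacitances: heat capacity per node [J/K]
    dt: time step [s]
    """
    heat = [0.0] * len(temps)
    for (i, j), g in conductances.items():
        q = g * (temps[j] - temps[i])   # heat flow from node j into node i
        heat[i] += q
        heat[j] -= q
    # Temperature change of each node: dT = Q * dt / C
    return [t + dt * q / c for t, q, c in zip(temps, heat, capacitances)]
```

Summing sub-networks, as the abstract describes for 100-node areas, amounts to merging the conductance and capacitance tables of the pieces before time-stepping the combined network.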

  3. Effect of the Preheating Temperature on Process Time in Friction Stir Welding of Al 6061-T6

    DEFF Research Database (Denmark)

    Jabbari, Masoud

    2013-01-01

    This paper presents the results obtained and the deductions made from an analytical modeling involving friction stir welding of Al 6061-T6. A new database was developed to simulate the contact temperature between the tool and the workpiece. A second-order equation is proposed for simulating...... the temperature in the contact boundary and the thermal history during the plunge phase. The effect of the preheating temperature on the process time was investigated with the proposed model. The results show that an increase of the preheating time leads to a decrease in the process time up to the plunge...

  4. Multichannel analyzer development in CAMAC

    International Nuclear Information System (INIS)

    Nagy, J.Z.; Zarandy, A.

    1988-01-01

    For data acquisition in TOKAMAK experiments, some CAMAC modules have been developed. The modules are the following: a 64 K analyzer memory, a 32 K analyzer memory, and a 6-channel pulse peak analyzer memory which contains the 32 K analyzer memory and eight AD converters

  5. Grinding temperature and energy ratio coefficient in MQL grinding of high-temperature nickel-base alloy by using different vegetable oils as base oil

    OpenAIRE

    Li Benkai; Li Changhe; Zhang Yanbin; Wang Yaogang; Jia Dongzhou; Yang Min

    2016-01-01

    Vegetable oil can be used as a base oil in minimum quantity lubrication (MQL). This study compared the performances of MQL grinding using castor oil, soybean oil, rapeseed oil, corn oil, sunflower oil, peanut oil, and palm oil as base oils. A K-P36 numerical-control precision surface grinder was used to perform plain grinding on a workpiece of a high-temperature nickel-base alloy. A YDM–III 99 three-dimensional dynamometer was used to measure grinding force, and a clip-type t...

  6. A tandem parallel plate analyzer

    International Nuclear Information System (INIS)

    Hamada, Y.; Fujisawa, A.; Iguchi, H.; Nishizawa, A.; Kawasumi, Y.

    1996-11-01

    By a new modification of a parallel plate analyzer, second-order focus is obtained at an arbitrary injection angle. This kind of analyzer with a small injection angle will have the advantage of a small operational voltage, compared to the Proca and Green analyzer where the injection angle is 30 degrees. Thus, the newly proposed analyzer will be very useful for the precise energy measurement of high-energy particles in the MeV range. (author)

  7. Digital Multi Channel Analyzer Enhancement

    International Nuclear Information System (INIS)

    Gonen, E.; Marcus, E.; Wengrowicz, U.; Beck, A.; Nir, J.; Sheinfeld, M.; Broide, A.; Tirosh, D.

    2002-01-01

    A cement analyzing system based on radiation spectroscopy had been developed [1], using a novel digital approach for a real-time, high-throughput and low-cost multichannel analyzer. The performance of the developed system had a severe problem: the resulting spectrum suffered from a lack of smoothness; it was very noisy and full of spikes and surges, making it impossible to use the spectrum for analyzing the cement substance. This paper describes the work carried out to improve the system performance
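    One conventional way to suppress isolated spikes and surges in a pulse-height spectrum, not necessarily the remedy applied in the cited work, is a running median filter, which removes outliers while preserving broad peaks better than a mean filter:

```python
def median_filter(spectrum, half_width=2):
    """Replace each channel with the median of its neighborhood.

    Isolated spikes narrower than the window are removed entirely,
    while genuine peaks spanning many channels are largely preserved.
    """
    n = len(spectrum)
    out = []
    for i in range(n):
        lo, hi = max(0, i - half_width), min(n, i + half_width + 1)
        window = sorted(spectrum[lo:hi])
        out.append(window[len(window) // 2])
    return out
```

A single-channel spike of height 50 on a flat background of 1 is flattened completely by a window of three channels, whereas simple averaging would smear it into its neighbors.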

  8. PM 3655 PHILIPS Logic analyzer

    CERN Multimedia

    A logic analyzer is an electronic instrument that captures and displays multiple signals from a digital system or digital circuit. A logic analyzer may convert the captured data into timing diagrams, protocol decodes, state machine traces, assembly language, or may correlate assembly with source-level software. Logic Analyzers have advanced triggering capabilities, and are useful when a user needs to see the timing relationships between many signals in a digital system.

  9. Multichannel analyzer type CMA-3

    International Nuclear Information System (INIS)

    Czermak, A.; Jablonski, J.; Ostrowicz, A.

    1978-01-01

    Multichannel analyzer CMA-3 is designed for two-parametric analysis with operator controlled logical windows. It is implemented in CAMAC standard. A single crate contains all required modules and is controlled by the PDP-11/10 minicomputer. Configuration of CMA-3 is shown. CMA-3 is the next version of the multichannel analyzer described in report No 958/E-8. (author)

  10. Analyzing data files in SWAN

    CERN Document Server

    Gajam, Niharika

    2016-01-01

    Traditionally, analyzing data happens via batch processing and interactive work on the terminal. The project aims to provide another way of analyzing data files: a cloud-based approach. It aims to offer a productive and interactive environment through the combination of FCC and SWAN software.

  11. [Automated analyzer of enzyme immunoassay].

    Science.gov (United States)

    Osawa, S

    1995-09-01

    Automated analyzers for enzyme immunoassay can be classified from several points of view: the kind of labeled antibodies or enzymes, detection methods, the number of tests per unit time, and analytical time and speed per run. In practice, it is important to consider several points such as detection limits, the number of tests per unit time, analytical range, and precision. Most of the automated analyzers on the market can randomly access and measure samples. I describe recent advances in automated analyzers, reviewing their labeling antibodies and enzymes, detection methods, the number of tests per unit time, and analytical time and speed per test.

  12. DEMorphy, German Language Morphological Analyzer

    OpenAIRE

    Altinok, Duygu

    2018-01-01

    DEMorphy is a morphological analyzer for German. It is built on large, compactified lexicons from the German Morphological Dictionary. A guesser based on German declension suffixes is also provided. For German, we provide a state-of-the-art morphological analyzer. DEMorphy is implemented in Python with ease of usability and accompanying documentation. The package is suitable for both academic and commercial purposes with a permissive license.

  13. A Categorization of Dynamic Analyzers

    Science.gov (United States)

    Lujan, Michelle R.

    1997-01-01

    Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on its syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) Metrics; 2) Models; and 3) Monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model and monitor programs provides a framework for understanding the program's dynamic behavior from different perspectives through analysis of the input.

  14. CSTT Update: Fuel Quality Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Brosha, Eric L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lujan, Roger W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mukundan, Rangachary [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockward, Tommy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Romero, Christopher J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Stefan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wilson, Mahlon S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-06

    These are slides from a presentation. The following topics are covered: project background (scope and approach), developing the prototype (timeline), update on intellectual property, analyzer comparisons (improving humidification, stabilizing the baseline, applying clean-up strategy, impact of ionomer content and improving clean-up), proposed operating mode, considerations for testing in real-world conditions (Gen 1 analyzer electronics development, testing partner identified, field trial planning), summary, and future work.

  15. Study of Cutting Edge Temperature and Cutting Force of End Mill Tool in High Speed Machining

    Directory of Open Access Journals (Sweden)

    Kiprawi Mohammad Ashaari

    2017-01-01

    Full Text Available Wear of cutting tools during the machining process is unavoidable due to the presence of frictional forces during the removal of unwanted workpiece material. It is unavoidable but can be controlled at a slower rate if the cutting speed is fixed at a certain point in order to achieve optimum cutting conditions. The wear of cutting tools is closely related to the thermal deformations that occur at the frictional contact point between the cutting edge of the tool and the workpiece. This research paper is focused on determining the relationships among cutting temperature, cutting speed, cutting force and radial depth of cut. The cutting temperature is determined by using Indium Arsenide (InAs) and Indium Antimonide (InSb) photocells to measure the infrared radiation emitted from the cutting tool, and the cutting forces are determined by using a dynamometer. The high-speed machining process is carried out by end milling the outer surface of carbon steel. The signal from the photocell is digitally visualized on a digital oscilloscope. Based on the results, the cutting temperature increased as the radial depth and cutting speed increased. The cutting forces increased when radial depth increased but decreased when cutting speed was increased. The setup for calibration and discussion of the experiment are explained in this paper.

  16. Plant for treating workpieces with powerful radiation

    International Nuclear Information System (INIS)

    Messerschmied, H.; Martin, W.

    1983-01-01

    The plant for curing paint with electron beams has a series of chambers along a conveyor belt for accepting painted articles. In order to achieve a continuous process and to save the nitrogen introduced into the chambers, the chambers are formed by containers open at the top, which are closed off from the irradiation station by an endless belt or by a roller bed running synchronously with the containers. (orig./HP) [de

  17. On-Demand Urine Analyzer

    Science.gov (United States)

    Farquharson, Stuart; Inscore, Frank; Shende, Chetan

    2010-01-01

    A lab-on-a-chip was developed that is capable of extracting biochemical indicators from urine samples and generating their surface-enhanced Raman spectra (SERS) so that the indicators can be quantified and identified. The development was motivated by the need to monitor and assess the effects of extended weightlessness, which include space motion sickness and loss of bone and muscle mass. The results may lead to the development of effective exercise programs and drug regimens that would maintain astronaut health. The analyzer containing the lab-on-a-chip includes materials to extract 3-methylhistidine (a muscle-loss indicator) and Risedronate (a bone-loss indicator) from the urine sample and detect them at the required concentrations using a Raman analyzer. The lab-on-a-chip has both an extractive material and a SERS-active material. The analyzer could be used to monitor the onset of diseases, such as osteoporosis.

  18. Device for analyzing a solution

    International Nuclear Information System (INIS)

    Marchand, Joseph.

    1978-01-01

The device enables a solution containing an antigen to be analyzed by the radio-immunology technique without running into the problems of separating the antigen-antibody complex from the free antigen. This device, for analyzing a solution containing a biological compound capable of reacting with an antagonistic compound specific to that biological compound, features a tube closed at its bottom end and a component set and immobilized in the bottom of the tube so as to leave a capacity between the bottom of the tube and the component's lower end. The component has a large developed surface and is shaped so that the solution to be analyzed has access to the bottom of the tube; it is made of a material having some elastic deformation and able to take up a given quantity of the biological compound or of the antagonistic compound specific to it [fr

  19. Multichannel analyzer embedded in FPGA

    International Nuclear Information System (INIS)

    Garcia D, A.; Hernandez D, V. M.; Vega C, H. R.; Ordaz G, O. O.; Bravo M, I.

    2017-10-01

Ionizing radiation has many different applications, so it is a very significant and useful tool, but it can also be dangerous for living beings exposed to uncontrolled doses. Because it cannot be perceived by any of the human senses, radiation detectors and additional devices are required to detect, quantify and classify it. A multichannel analyzer is responsible for sorting the different pulse heights generated in the detectors into a certain number of channels, according to the number of bits of the analog-to-digital converter. The objective of this work was to design and implement a multichannel analyzer and its associated virtual instrument for nuclear spectrometry. The components of the multichannel analyzer were created in the VHDL hardware description language and packaged in the Xilinx Vivado design suite, making use of resources such as the ARM processing core contained in the Zynq System on Chip; the virtual instrument was developed on the LabView graphical programming platform. The first phase was to design the hardware architecture to be embedded in the FPGA; for the internal control of the multichannel analyzer, the application for the ARM processor was written in C. In the second phase, the virtual instrument was developed for management, control and visualization of the results. The data obtained were displayed graphically in a histogram showing the measured spectrum. The design of the multichannel analyzer embedded in the FPGA was tested with two different radiation detection systems (hyper-pure germanium and scintillation), which showed that the spectra obtained are comparable to those of commercial multichannel analyzers. (Author)
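The channel-sorting step described above can be sketched in a few lines of Python. This is a hypothetical illustration only (the actual analyzer implements it in VHDL on the Zynq FPGA); the ADC bit depth, input range, and simulated source are invented for the example.

```python
import random

# With an N-bit ADC the MCA has 2**N channels; N = 12 gives 4096 channels.
ADC_BITS = 12
N_CHANNELS = 2 ** ADC_BITS
FULL_SCALE_V = 5.0  # assumed ADC input range, volts

def pulse_to_channel(pulse_height_v):
    """Map an analog pulse height (volts) to an MCA channel number."""
    ch = int(pulse_height_v / FULL_SCALE_V * N_CHANNELS)
    return min(max(ch, 0), N_CHANNELS - 1)  # clamp to the valid range

def build_spectrum(pulses):
    """Accumulate a histogram (the spectrum) from a sequence of pulse heights."""
    spectrum = [0] * N_CHANNELS
    for p in pulses:
        spectrum[pulse_to_channel(p)] += 1
    return spectrum

random.seed(1)
# Simulated mono-energetic source: pulses cluster around 2.5 V with noise.
pulses = [random.gauss(2.5, 0.05) for _ in range(10000)]
spectrum = build_spectrum(pulses)
peak_channel = max(range(N_CHANNELS), key=lambda c: spectrum[c])
```

A 2.5 V photopeak should land near channel 2048, the midpoint of the 4096-channel range.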

  20. Loviisa nuclear power plant analyzer

    International Nuclear Information System (INIS)

    Porkholm, K.; Nurmilaukas, P.; Tiihonen, O.; Haenninen, M.; Puska, E.

    1992-12-01

    The APROS Simulation Environment has been developed since 1986 by Imatran Voima Oy (IVO) and the Technical Research Centre of Finland (VTT). It provides tools, solution algorithms and process components for use in different simulation systems for design, analysis and training purposes. One of its main nuclear applications is the Loviisa Nuclear Power Plant Analyzer (LPA). The Loviisa Plant Analyzer includes all the important plant components both in the primary and in the secondary circuits. In addition, all the main control systems, the protection system and the high voltage electrical systems are included. (orig.)

  1. The security analyzer: A security analyzer program written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.; Densley, P.J.

    1986-09-01

The Security Analyzer is a software tool capable of analyzing the effectiveness of a facility's security system. It is written in the Prolog logic programming language, using entity-relationship data modeling techniques. The program performs the following functions: (1) it provides descriptive, locational and operational status information about intrusion detectors and assessment devices (i.e., ''sensors'' and ''cameras'') upon request; (2) it provides for storage and retrieval of maintenance history information for various components of the security system (including intrusion detectors), and allows that information to be changed as desired; (3) it provides a ''search'' mode, wherein all paths are found from any specified physical location to another specified location that satisfy user-chosen ''intruder detection'' probability and elapsed-time criteria (i.e., the program finds the ''weakest paths'' from a security point of view). The first two of these functions can be provided fairly easily with a conventional database program; the third function could be provided using Fortran or some similar language, though with substantial difficulty. In the Security Analyzer program, all these functions are provided in a simple and straightforward manner. This simplicity is possible because the program is written in the symbolic (as opposed to numeric) processing language Prolog, and because the knowledge base is structured according to entity-relationship modeling principles. The use of Prolog and the entity-relationship modeling technique also allows the capabilities of the Security Analyzer program, both for knowledge-base interrogation and for searching-type operations, to be easily expanded in ways that would be very difficult to duplicate in a numeric and more algorithmically deterministic language such as Fortran. 4 refs
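The ''search'' mode described in function (3) can be sketched as a depth-first path search with detection-probability and elapsed-time criteria. The sketch below is in Python rather than Prolog, and the facility graph, segment detection probabilities, and traversal times are all invented for illustration.

```python
# node: [(neighbor, detection_prob_on_segment, traversal_time_s)] - invented data
GRAPH = {
    "fence": [("yard", 0.6, 30), ("gate", 0.9, 10)],
    "gate":  [("yard", 0.2, 15)],
    "yard":  [("door", 0.7, 20)],
    "door":  [("vault", 0.95, 40)],
}

def find_paths(start, goal, max_detect_prob, max_time):
    """Return all start->goal paths meeting the detection/time criteria."""
    results = []
    def dfs(node, path, p_no_detect, elapsed):
        if node == goal:
            detect = 1.0 - p_no_detect  # cumulative detection probability
            if detect <= max_detect_prob and elapsed <= max_time:
                results.append((path, round(detect, 4), elapsed))
            return
        for nxt, p, t in GRAPH.get(node, []):
            if nxt not in path:  # simple cycle guard
                dfs(nxt, path + [nxt], p_no_detect * (1.0 - p), elapsed + t)
    dfs(start, [start], 1.0, 0)
    return results

paths = find_paths("fence", "vault", max_detect_prob=0.999, max_time=120)
weakest = min(paths, key=lambda r: r[1])  # lowest detection probability
```

The ''weakest path'' is simply the qualifying route with the lowest cumulative detection probability; a protection planner would harden the segments along it first.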

  2. The SPAR thermal analyzer: Present and future

    Science.gov (United States)

    Marlowe, M. B.; Whetstone, W. D.; Robinson, J. C.

    The SPAR thermal analyzer, a system of finite-element processors for performing steady-state and transient thermal analyses, is described. The processors communicate with each other through the SPAR random access data base. As each processor is executed, all pertinent source data is extracted from the data base and results are stored in the data base. Steady state temperature distributions are determined by a direct solution method for linear problems and a modified Newton-Raphson method for nonlinear problems. An explicit and several implicit methods are available for the solution of transient heat transfer problems. Finite element plotting capability is available for model checkout and verification.
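The Newton-Raphson approach to nonlinear steady-state problems mentioned above can be illustrated on a single-node energy balance with conduction plus radiation. This is a hedged, minimal stand-in: SPAR applies a modified Newton-Raphson scheme to the full finite-element system, and every coefficient below is invented.

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
H_K   = 15.0       # linear (conductive/convective) coefficient, W/(m^2 K), assumed
EPS   = 0.8        # surface emissivity, assumed
T_ENV = 300.0      # environment temperature, K
Q_IN  = 2000.0     # absorbed heat flux, W/m^2, assumed

def residual(T):
    """Energy balance: linear loss + radiative loss - input heat flux."""
    return H_K * (T - T_ENV) + EPS * SIGMA * (T**4 - T_ENV**4) - Q_IN

def d_residual(T):
    """Analytical Jacobian (derivative) of the residual."""
    return H_K + 4.0 * EPS * SIGMA * T**3

def solve_newton(T0, tol=1e-8, max_iter=50):
    """Plain Newton-Raphson iteration on the scalar balance."""
    T = T0
    for _ in range(max_iter):
        dT = residual(T) / d_residual(T)
        T -= dT
        if abs(dT) < tol:
            break
    return T

T_ss = solve_newton(400.0)  # steady-state temperature, K
```

The T**4 radiation term is what makes the problem nonlinear; with a radiation-free balance the direct (linear) solution path would suffice.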

  3. Thermo Scientific Ozone Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The primary measurement output from the Thermo Scientific Ozone Analyzer is the concentration of the analyte (O3) reported at 1-s resolution in units of ppbv in ambient air. Note that because of internal pneumatic switching limitations the instrument only makes an independent measurement every 4 seconds. Thus, the same concentration number is repeated roughly 4 times at the uniform, monotonic 1-s time base used in the AOS systems. Accompanying instrument outputs include sample temperatures, flows, chamber pressure, lamp intensities and a multiplicity of housekeeping information. There is also a field for operator comments made at any time while data is being collected.
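The sampling quirk described above (independent readings every 4 s repeated on a 1-s time base) can be handled in post-processing. The sketch below is a hypothetical illustration with invented values; it keeps only points where the reported value changes, so in the unlikely case of two identical consecutive independent readings it would merge them, and a real pipeline would rely on the instrument's own status fields instead.

```python
# (time_s, o3_ppbv) pairs as reported on the 1-s time base - invented data
stream = [(0, 31.2), (1, 31.2), (2, 31.2), (3, 31.2),
          (4, 30.8), (5, 30.8), (6, 30.8), (7, 30.8),
          (8, 31.5), (9, 31.5)]

def independent_readings(series):
    """Collapse the repeated 1-s stream to the independent ~4-s measurements."""
    out = []
    last = None
    for t, o3 in series:
        if o3 != last:          # value changed: a new independent measurement
            out.append((t, o3))
            last = o3
    return out

indep = independent_readings(stream)
```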

  4. Methods of analyzing crude oil

    Science.gov (United States)

    Cooks, Robert Graham; Jjunju, Fred Paul Mark; Li, Anyin; Rogan, Iman S.

    2017-08-15

    The invention generally relates to methods of analyzing crude oil. In certain embodiments, methods of the invention involve obtaining a crude oil sample, and subjecting the crude oil sample to mass spectrometry analysis. In certain embodiments, the method is performed without any sample pre-purification steps.

  5. Therapy Talk: Analyzing Therapeutic Discourse

    Science.gov (United States)

    Leahy, Margaret M.

    2004-01-01

    Therapeutic discourse is the talk-in-interaction that represents the social practice between clinician and client. This article invites speech-language pathologists to apply their knowledge of language to analyzing therapy talk and to learn how talking practices shape clinical roles and identities. A range of qualitative research approaches,…

  6. The Convertible Arbitrage Strategy Analyzed

    NARCIS (Netherlands)

    Loncarski, I.; Ter Horst, J.R.; Veld, C.H.

    2006-01-01

    This paper analyzes convertible bond arbitrage on the Canadian market for the period 1998 to 2004.Convertible bond arbitrage is the combination of a long position in convertible bonds and a short position in the underlying stocks. Convertible arbitrage has been one of the most successful strategies

  7. Analyzing the complexity of nanotechnology

    NARCIS (Netherlands)

    Vries, de M.J.; Schummer, J.; Baird, D.

    2006-01-01

    Nanotechnology is a highly complex technological development due to many uncertainties in our knowledge about it. The Dutch philosopher Herman Dooyeweerd has developed a conceptual framework that can be used (1) to analyze the complexity of technological developments and (2) to see how priorities

  8. Proton-beam energy analyzer

    International Nuclear Information System (INIS)

    Belan, V.N.; Bolotin, L.I.; Kiselev, V.A.; Linnik, A.F.; Uskov, V.V.

    1989-01-01

    The authors describe a magnetic analyzer for measurement of proton-beam energy in the range from 100 keV to 25 MeV. The beam is deflected in a uniform transverse magnetic field and is registered by photographing a scintillation screen. The energy spectrum of the beam is constructed by microphotometry of the photographic film
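The measurement principle above reduces to the gyroradius relation r = p/(qB): the deflection radius read off the scintillation screen gives the proton momentum and hence the kinetic energy. The sketch below is illustrative only; the field strength and radius are invented, not taken from the instrument.

```python
import math

Q_OVER = 1.602176634e-19   # elementary charge, C
M_P_C2_MEV = 938.272       # proton rest energy, MeV
C = 2.99792458e8           # speed of light, m/s

def kinetic_energy_mev(b_tesla, r_m):
    """Relativistic kinetic energy of a proton with gyroradius r in field B."""
    # p = qBr, so pc in MeV is qBrc / (1.602e-13 J/MeV)
    pc_mev = Q_OVER * b_tesla * r_m * C / 1.602176634e-13
    e_total = math.sqrt(pc_mev**2 + M_P_C2_MEV**2)
    return e_total - M_P_C2_MEV

# Example (invented): B = 0.5 T, deflection radius r = 0.30 m
ek = kinetic_energy_mev(0.5, 0.30)
```

For these example values the energy comes out near 1 MeV, i.e. within the analyzer's stated 100 keV to 25 MeV range; the relativistic form matters because 25 MeV protons are no longer well described by the classical p²/2m.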

  9. Analyzer for gamma cameras diagnostic

    International Nuclear Information System (INIS)

    Oramas Polo, I.; Osorio Deliz, J. F.; Diaz Garcia, A.

    2013-01-01

This research work was carried out to develop an analyzer for gamma camera diagnostics. It is composed of an electronic system that includes hardware and software capabilities and operates on the four head-position signals acquired from a gamma camera detector. The result is the spectrum of the energy delivered by the nuclear radiation reaching the camera detector head. The system includes analog processing of the position signals from the camera, digitization, subsequent processing of the energy signal in a multichannel analyzer, transfer of the data to a computer via a standard USB port, and processing of the data on a personal computer to obtain the final histogram. The circuits consist of an analog processing board and a universal kit with a microcontroller and a programmable gate array. (Author)
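The way four position signals yield an energy spectrum can be sketched with the classic Anger-camera arithmetic: the four signals sum to the deposited energy, and their normalized differences give the interaction position. This is a hedged illustration of the general principle, not the analyzer's actual signal chain; the signal names and values are invented.

```python
def energy_and_position(xp, xm, yp, ym):
    """Combine 4 head position signals (X+, X-, Y+, Y-) into energy and (x, y)."""
    e = xp + xm + yp + ym    # total energy signal, fed to the MCA for histogramming
    x = (xp - xm) / e        # normalized x coordinate of the scintillation event
    y = (yp - ym) / e        # normalized y coordinate
    return e, x, y

# One example event (arbitrary units)
e, x, y = energy_and_position(0.30, 0.20, 0.27, 0.23)
```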

  10. Methods for Analyzing Social Media

    DEFF Research Database (Denmark)

    Jensen, Jakob Linaa

    2013-01-01

Social media is becoming increasingly attractive for users. It is a fast way to communicate ideas and a key source of information. It is therefore one of the most influential mediums of communication of our time and an important area for audience research. The growth of social media invites many new questions such as: How can we analyze social media? Can we use traditional audience research methods and apply them to online content? Which new research strategies have been developed? Which ethical research issues and controversies do we have to pay attention to? This book focuses on research strategies and methods for analyzing social media and will be of interest to researchers and practitioners using social media, as well as those wanting to keep up to date with the subject.

  11. New approach to analyzing vulnerability

    International Nuclear Information System (INIS)

    O'Callaghan, P.B.; Carlson, R.L.; Riedeman, G.W.

    1986-01-01

    The Westinghouse Hanford Company (WHC) has recently completed construction of the Fuel Cycle Plant (FCP) at Richland, Washington. At start-up the facility will fabricate driver fuel for the Fast Flux Test Facility in the Secure Automated Fabrication line. After construction completion, but before facility certification, the Department of Energy (DOE) Richland Operation Office requested that a vulnerability analysis be performed which assumed multiple insiders as a threat to the security system. A unique method of analyzing facility vulnerabilities was developed at the Security Applications Center (SAC), which is managed by WHC for DOE. The method that was developed verifies a previous vulnerability assessment, as well as introducing a modeling technique which analyzes security alarms in relation to delaying factors and possible insider activities. With this information it is possible to assess the relative strength or weakness of various possible routes to and from a target within a facility

  12. Analyzing the Facebook Friendship Graph

    OpenAIRE

    Catanese, Salvatore; De Meo, Pasquale; Ferrara, Emilio; Fiumara, Giacomo

    2010-01-01

Online Social Networks (OSNs) have in recent years acquired a huge and increasing popularity as one of the most important emerging Web phenomena, deeply modifying the behavior of users and contributing to building a solid substrate of connections and relationships among people using the Web. In this preliminary paper, our purpose is to analyze Facebook, considering a significant sample of data reflecting relationships among subscribed users. Our goal is to extract, from this platform, relevant ...

  13. Sulfur Dioxide Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, Stephen R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-05-01

The Sulfur Dioxide Analyzer measures sulfur dioxide based on absorbance of UV light at one wavelength by SO2 molecules, which then decay to a lower energy state by emitting UV light at a longer wavelength. Specifically, SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort Software. However, this software is somewhat cumbersome and inflexible. Brookhaven National Laboratory (BNL) has written an interface program in National Instruments LabView that both controls the Model 43i-TLE Analyzer and queries the unit for all measurement and housekeeping data. The LabView vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS, described more fully in Section 6.0 below.

  14. Remote Laser Diffraction PSD Analyzer

    International Nuclear Information System (INIS)

    Batcheller, T.A.; Huestis, G.M.; Bolton, S.M.

    2000-01-01

Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified off-the-shelf classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model La-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a hot cell (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not previously achievable, making this technology far superior to the traditional methods used before. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the gain of tremendously useful fundamental engineering data. Successful acquisition of these data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives

  15. A new uranium automatic analyzer

    International Nuclear Information System (INIS)

    Xia Buyun; Zhu Yaokun; Wang Bin; Cong Peiyuan; Zhang Lan

    1993-01-01

A new uranium automatic analyzer based on the flow injection analysis (FIA) principle has been developed. It consists of a multichannel peristaltic pump, an injection valve, a photometric detector, a single-chip microprocessor system and electronic circuitry. The newly designed multifunctional auto-injection valve can automatically change the injection volume of the sample and the channels, so that the determination ranges and items can easily be changed. It can also vary the FIA operation modes, giving the analyzer the functions of a universal instrument. A chromatographic column with extractant-containing resin was installed in the manifold of the analyzer for the concentration and separation of trace uranium. 2-(5-bromo-2-pyridylazo)-5-diethyl-aminophenol (Br-PADAP) was used as the colour reagent. Uranium was determined in the aqueous solution by adding cetyl-pyridium bromide (CPB). Uranium in solution in the range 0.02-500 mg·L⁻¹ can be directly determined without any pretreatment. A sample throughput rate of 30-90 h⁻¹ and a reproducibility of 1-2% were obtained. The analyzer has been satisfactorily applied in the laboratory and the plant
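The photometric determination behind such an analyzer ultimately rests on a linear calibration of detector absorbance against uranium standards. The sketch below is a hypothetical illustration of that step only; the standard concentrations and absorbances are invented, and the real Br-PADAP chemistry and FIA plumbing are far richer.

```python
# (concentration mg/L, absorbance) calibration standards - invented values
standards = [(0.5, 0.021), (5.0, 0.208), (50.0, 2.05), (100.0, 4.11)]

def fit_line(points):
    """Ordinary least-squares slope and intercept."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

slope, intercept = fit_line(standards)

# Invert the calibration for an unknown sample's absorbance
unknown_abs = 1.03
conc = (unknown_abs - intercept) / slope   # estimated concentration, mg/L
```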

  16. Remote Laser Diffraction PSD Analyzer

    International Nuclear Information System (INIS)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2000-01-01

    Particle size distribution (PSD) analysis of radioactive slurry samples were obtained using a modified ''off-the-shelf'' classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model La-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a ''hot cell'' (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially down in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries down in this smaller range was not achievable--making this technology far superior than the traditional methods used previously. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by gaining of this tremendously useful fundamental engineering data. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives

  17. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Directory of Open Access Journals (Sweden)

    Mike W.-L. Cheung

    2016-05-01

Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists – and probably the most crucial one – is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.
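The split/analyze/meta-analyze idea can be illustrated compactly: split a large dataset into manageable chunks, run the analysis on each chunk, then pool the chunk estimates with a fixed-effect (inverse-variance) meta-analysis. The paper demonstrates this in R on real data; the Python sketch below uses synthetic data and a simple mean estimate purely to show the mechanics.

```python
import random
import statistics

random.seed(42)
# Synthetic "big" dataset: 100,000 observations from N(100, 15)
big_data = [random.gauss(100.0, 15.0) for _ in range(100_000)]

def split(data, k):
    """Split the data into k equal-sized chunks."""
    n = len(data) // k
    return [data[i * n:(i + 1) * n] for i in range(k)]

def analyze(chunk):
    """Per-split analysis: the mean and its squared standard error."""
    m = statistics.fmean(chunk)
    se2 = statistics.variance(chunk) / len(chunk)
    return m, se2

def meta_analyze(results):
    """Fixed-effect pooling: weight each split estimate by inverse variance."""
    weights = [1.0 / se2 for _, se2 in results]
    return sum(w * m for w, (m, _) in zip(weights, results)) / sum(weights)

results = [analyze(chunk) for chunk in split(big_data, 10)]
pooled_mean = meta_analyze(results)
```

Because every split here has the same size and comes from one population, the pooled estimate essentially reproduces the full-sample mean; the payoff of the approach comes when each chunk is analyzed on its own, e.g. in parallel or under memory limits.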

  18. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    Science.gov (United States)

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists, and probably the most crucial one, is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  19. The security analyzer, a security analyzer program written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.; Densley, P.J.; Carlson, R.L.

    1987-01-01

    A technique has been developed to characterize a nuclear facility and measure the strengths and weaknesses of the physical protection system. It utilizes the artificial intelligence capabilities available in the prolog programming language to probe a facility's defenses and find potential attack paths that meet designated search criteria. As sensors or barriers become inactive due to maintenance, failure, or inclement weather conditions, the protection system can rapidly be reanalyzed to discover weaknesses that would need to be strengthened by alternative means. Conversely, proposed upgrades and enhancements can be easily entered into the database and their effect measured against a variety of potential adversary attacks. Thus the security analyzer is a tool that aids the protection planner as well as the protection operations staff

  20. The Aqueduct Global Flood Analyzer

    Science.gov (United States)

    Iceland, Charles

    2015-04-01

As population growth and economic growth take place, and as climate change accelerates, many regions across the globe are finding themselves increasingly vulnerable to flooding. A recent OECD study of the exposure of the world's large port cities to coastal flooding found that 40 million people were exposed to a 1-in-100-year coastal flood event in 2005, and the total value of exposed assets was about US$3,000 billion, or 5% of global GDP. By the 2070s, those numbers were estimated to increase to 150 million people and US$35,000 billion, or roughly 9% of projected global GDP. Impoverished people in developing countries are particularly at risk because they often live in flood-prone areas and lack the resources to respond. WRI and its Dutch partners - Deltares, IVM-VU University Amsterdam, Utrecht University, and PBL Netherlands Environmental Assessment Agency - are in the initial stages of developing a robust set of river flood and coastal storm surge risk measures that show the extent of flooding under a variety of scenarios (both current and future), together with the projected human and economic impacts of these flood scenarios. These flood risk data and information will be accessible via an online, easy-to-use Aqueduct Global Flood Analyzer. We will also investigate the viability, benefits, and costs of a wide array of flood risk reduction measures that could be implemented in a variety of geographic and socio-economic settings. Together, the activities we propose have the potential for saving hundreds of thousands of lives and strengthening the resiliency and security of many millions more, especially those who are most vulnerable. Mr. Iceland will present Version 1.0 of the Aqueduct Global Flood Analyzer and provide a preview of additional elements of the Analyzer to be released in the coming years.

  1. Fuel analyzer; Analisador de combustiveis

    Energy Technology Data Exchange (ETDEWEB)

    Cozzolino, Roberval [RS Motors, Indaiatuba, SP (Brazil)

    2008-07-01

The 'COMBUSTIMETRO' technology aims to evaluate fuel through the performance of an engine, since the role of the fuel is to produce energy for the combustion engine, and the energy produced is directly proportional to the quality and type of fuel. The 'COMBUSTIMETRO' has an engine that always keeps the same air intake, fuel intake and fixed ignition point. Its operation is monitored by sensors (lambda probe, RPM and gas analyzer) connected to a processor that performs calculations, records the information, and generates reports and graphs. (author)

  2. A Raman-Based Portable Fuel Analyzer

    Science.gov (United States)

    Farquharson, Stuart

    2010-08-01

Fuel is the single most important supply during war. Consider that the US military is employing over 25,000 vehicles in Iraq and Afghanistan. Most fuel is obtained locally and must be characterized to ensure proper operation of these vehicles. Fuel properties are currently determined using a deployed chemical laboratory. Unfortunately, each sample requires in excess of 6 hours to characterize. To overcome this limitation, we have developed a portable fuel analyzer capable of determining 7 fuel properties that establish a fuel's suitability for use. The analyzer uses Raman spectroscopy to measure fuel samples without preparation in 2 minutes. The challenge, however, is that as distilled fractions of crude oil, all fuels are composed of hundreds of hydrocarbon components that boil at similar temperatures, and performance properties cannot be simply correlated to a single component, and certainly not to specific Raman peaks. To meet this challenge, we measured over 800 diesel and jet fuels from around the world and used chemometrics to correlate the Raman spectra to fuel properties. Critical to the success of this approach are laser excitation at 1064 nm to avoid fluorescence interference (many fuels fluoresce) and a rugged interferometer that provides 0.1 cm⁻¹ wavenumber (x-axis) accuracy to guarantee accurate correlations. Here we describe the portable fuel analyzer, the chemometric models, and the successful determination of these 7 fuel properties for over 100 unknown samples provided by the US Marine Corps, US Navy, and US Army.
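The chemometric step, correlating spectra to a property no single peak predicts, can be caricatured with a tiny linear model. The sketch below is an illustrative stand-in only: it regresses an invented fuel property onto two synthetic "peak intensities" by solving the 2x2 normal equations, whereas real chemometric models (e.g. partial least squares) operate on full spectra of hundreds of fuels.

```python
import random

random.seed(7)
TRUE_W = (0.8, -0.3)  # hidden coefficients used to generate the synthetic data

def make_sample():
    """One synthetic training sample: two peak intensities and a fuel property."""
    p1, p2 = random.uniform(0, 10), random.uniform(0, 10)
    prop = TRUE_W[0] * p1 + TRUE_W[1] * p2 + random.gauss(0, 0.01)
    return (p1, p2, prop)

train = [make_sample() for _ in range(200)]

# Solve the 2x2 normal equations (X^T X) w = X^T y by hand (no intercept).
s11 = sum(p1 * p1 for p1, _, _ in train)
s12 = sum(p1 * p2 for p1, p2, _ in train)
s22 = sum(p2 * p2 for _, p2, _ in train)
t1 = sum(p1 * y for p1, _, y in train)
t2 = sum(p2 * y for _, p2, y in train)
det = s11 * s22 - s12 * s12
w1 = (s22 * t1 - s12 * t2) / det
w2 = (s11 * t2 - s12 * t1) / det

# Predict the property of a new "spectrum" with peak intensities (5.0, 2.0)
predicted = w1 * 5.0 + w2 * 2.0
```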

  3. Faraday cup for analyzing multi-ion plasma

    International Nuclear Information System (INIS)

    Fujita, Takao

    1987-01-01

    A compact and convenient ion analyzer (a kind of a Faraday cup) is developed in order to analyze weakly ionized multi-ion plasmas. This Faraday cup consists of three mesh electrodes and a movable ion collector. With a negative gate pulse superimposed on the ion retarding bias, ions are analyzed by means of time-of-flight. The identification of ion species and measurements of ion density and ion temperature are studied. (author)
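The time-of-flight identification above follows from the drift relation t = L·sqrt(m / (2qV)): once the gate pulse admits ions, heavier species arrive at the collector later. The Python sketch below is a hedged illustration; the drift length and accelerating potential are invented, not taken from the instrument.

```python
import math

Q = 1.602176634e-19    # elementary charge, C
AMU = 1.66053907e-27   # atomic mass unit, kg
L = 0.10               # drift length to the collector, m (assumed)
V = 20.0               # accelerating potential, V (assumed)

def flight_time_us(mass_amu, charge_states=1):
    """Time of flight in microseconds for an ion of the given mass and charge."""
    m = mass_amu * AMU
    v = math.sqrt(2.0 * charge_states * Q * V / m)  # drift velocity
    return L / v * 1e6

t_h = flight_time_us(1.0)     # H+ arrives first
t_ar = flight_time_us(40.0)   # Ar+ arrives sqrt(40) times later
```

Since t scales with sqrt(m), the separation between arrival peaks directly identifies the ion species in a multi-ion plasma, and the peak areas give relative densities.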

  4. Compact Microwave Fourier Spectrum Analyzer

    Science.gov (United States)

    Savchenkov, Anatoliy; Matsko, Andrey; Strekalov, Dmitry

    2009-01-01

A compact photonic microwave Fourier spectrum analyzer [a Fourier-transform microwave spectrometer (FTMWS)] with no moving parts has been proposed for use in remote sensing of weak, natural microwave emissions from the surfaces and atmospheres of planets, to enable remote analysis and determination of the chemical composition and abundances of critical molecular constituents in space. The instrument is based on Bessel-beam (light modes with non-zero angular momenta) fiber-optic elements. It features low power consumption, low mass, and resolution beyond what is achievable by the current state of the art in space instruments, without a need for any cryogenics. The instrument can also be used in a wide-band scatterometer mode in active radar systems.

  5. Analyzing the effect of cutting parameters on surface roughness and tool wear when machining nickel based hastelloy - 276

    International Nuclear Information System (INIS)

    Khidhir, Basim A; Mohamed, Bashir

    2011-01-01

Machining parameters have an important effect on tool wear and surface finish, so manufacturers need to obtain optimal operating parameters with a minimum number of experiments, as well as a minimum of simulations, in order to reduce machining set-up costs. The cutting speed is one of the most important cutting parameters to evaluate: on one hand it most clearly influences tool life, tool stability and cutting process quality, and on the other hand it controls production flow. Because manufacturing systems are becoming more demanding, the requirements for reliable technological information have increased. A reliable analysis must consider the cutting zone (the tip insert-workpiece-chip system), as the mechanics of cutting in this area are very complicated: the chip is formed in the shear plane (entrance to the shear zone) and is shaped in the sliding plane. The temperature contributions in the primary shear, chamfer, sticking and sliding zones are expressed as functions of the unknown shear angle on the rake face and the temperature-modified flow stress in each zone. The machining experiments were carried out on a CNC lathe, with surface finish and tool tip wear measured in process. Reasonable agreement is observed when turning with a high depth of cut. The results of this research help to guide the design of new cutting tool materials and studies on the evaluation of machining parameters, to further advance the productivity of machining the nickel-based alloy Hastelloy-276.

  6. Charge Analyzer Responsive Local Oscillations

    Science.gov (United States)

    Krause, Linda Habash; Thornton, Gary

    2015-01-01

The first transatlantic radio transmission, demonstrated by Marconi in December of 1901, revealed the essential role of the ionosphere for radio communications. This ionized layer of the upper atmosphere controls the amount of radio power transmitted through, reflected off of, and absorbed by the atmospheric medium. Low-frequency radio signals can propagate long distances around the globe via repeated reflections off of the ionosphere and the Earth's surface. Higher-frequency radio signals can punch through the ionosphere to be received at orbiting satellites. However, any turbulence in the ionosphere can distort these signals, compromising the performance or even availability of space-based communication and navigation systems. The physics associated with this distortion effect is analogous to the situation in which underwater images are distorted by convecting air bubbles. In fact, these ionospheric features are often called 'plasma bubbles' since they exhibit behavior similar to that of underwater air bubbles. These events, instigated by solar and geomagnetic storms, can cause communication and navigation outages that last for hours. To help understand and predict these outages, a worldwide community of space scientists and technologists is devoted to researching this topic. One aspect of this research is to develop instruments capable of measuring ionospheric plasma bubbles. Figure 1 shows a photo of the Charge Analyzer Responsive to Local Oscillations (CARLO), a new instrument under development at NASA Marshall Space Flight Center (MSFC). It is a frequency-domain ion spectrum analyzer designed to measure the distributions of ionospheric turbulence from 1 Hz to 10 kHz (i.e., spatial scales from a few kilometers down to a few centimeters). This frequency range is important since it focuses on turbulence scales that affect VHF/UHF satellite communications, GPS systems, and over-the-horizon radar systems. CARLO is based on the flight-proven Plasma Local

  7. Radiation energy detector and analyzer

    International Nuclear Information System (INIS)

    Roberts, T.G.

    1981-01-01

    A radiation detector array and a method for measuring the spectral content of radiation. The radiation sensor or detector is an array or stack of thin solid-electrolyte batteries. The batteries, arranged in a stack, may be composed of independent battery cells or may be arranged so that adjacent cells share a common terminal surface. This common surface is possible since the polarity of each battery with respect to an adjacent battery is unrestricted, allowing a reduction in the component parts of the assembly and a reduction in the overall stack length. Additionally, a test jig or chamber allowing rapid measurement of the voltage across each battery is disclosed. A multichannel recorder and display may be used to indicate the voltage gradient change across the cells, or a small computer may be used to rapidly convert these voltage readings to a graph of radiation intensity versus wavelength or energy. The behavior of the batteries when used as a radiation detector and analyzer is such that the voltage measurements can be made at leisure after the detector array has been exposed to the radiation; it is not necessary to make rapid measurements as is now done

  8. Nuclear plant analyzer desktop workstation

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1990-01-01

    In 1983 the U.S. Nuclear Regulatory Commission (USNRC) commissioned the Idaho National Engineering Laboratory (INEL) to develop a Nuclear Plant Analyzer (NPA). The NPA was envisioned as a graphical aid to assist reactor safety analysts in comprehending the results of thermal-hydraulic code calculations. The development was to proceed in three distinct phases culminating in a desktop reactor safety workstation. The desktop NPA is now complete. The desktop NPA is a microcomputer-based reactor transient simulation, visualization and analysis tool developed at INEL to assist an analyst in evaluating the transient behavior of nuclear power plants by means of graphic displays. The NPA desktop workstation integrates advanced reactor simulation codes with online computer graphics, allowing reactor plant transient simulation and graphical presentation of results. The graphics software, written exclusively in ANSI standard C and FORTRAN 77 and implemented in the UNIX/X-windows operating environment, is modular and is designed to interface to the NRC's suite of advanced thermal-hydraulic codes to the extent allowed by each code. Currently, full, interactive, desktop NPA capabilities are realized only with RELAP5

  9. Nuclear plant analyzer program for Bulgaria

    International Nuclear Information System (INIS)

    Shier, W.; Kennett, R.

    1993-01-01

    An interactive nuclear plant analyzer (NPA) has been developed for use by the Bulgarian technical community in the training of plant personnel, the development and verification of plant operating procedures, and in the analysis of various anticipated operational occurrences and accident scenarios. The current NPA includes models for a VVER-440 Model 230 and a VVER-1000 Model 320 and is operational on an IBM RISC6000 workstation. The RELAP5/MOD2 computer code has been used for the calculation of the reactor responses to the interactive commands initiated by the NPA operator. The interactive capabilities of the NPA have been developed to provide considerable flexibility in the plant actions that can be initiated by the operator. The current capabilities for both the VVER-440 and VVER-1000 models include: (1) scram initiation; (2) reactor coolant pump trip; (3) high pressure safety injection system initiation; (4) low pressure safety injection system initiation; (5) pressurizer safety valve opening; (6) steam generator relief/safety valve opening; (7) feedwater system initiation and trip; (8) turbine trip; and (9) emergency feedwater initiation. The NPA has the capability to display the results of the simulations in various forms that are determined by the model developer. Results displayed on the reactor mask are shown through the user-defined digital display of various plant parameters and through color changes that reflect changes in primary system fluid temperatures, fuel and clad temperatures, and the temperature of other metal structures. In addition, changes in the status of various components and systems can be initiated and/or displayed both numerically and graphically on the mask. This paper provides a description of the structure of the NPA, a discussion of the simulation models used for the VVER-440 and the VVER-1000, and an overview of the NPA capabilities. Typical results obtained using both simulation models will be discussed

  10. Properties of Free-Machining Aluminum Alloys at Elevated Temperatures

    Science.gov (United States)

    Faltus, Jiří; Karlík, Miroslav; Haušild, Petr

    In areas close to the cutting tool, workpieces being dry machined can be heated up to 350°C and may be impact loaded. It is therefore of interest to study the mechanical properties of the corresponding materials at elevated temperatures. Free-machining alloys of the Al-Cu and Al-Mg-Si systems containing Pb, Bi and Sn additions (AA2011, AA2111B, AA6262, and AA6023) were subjected to Charpy U-notch impact tests at temperatures ranging from 20 to 350°C. The tested alloys show a sharp drop in notch impact strength KU at different temperatures. This drop of KU is caused by liquid metal embrittlement due to the melting of low-melting-point dispersed phases, which is documented by differential scanning calorimetry. Fracture surfaces of the specimens were observed using a scanning electron microscope. At room temperature, the fractures of all studied alloys exhibited similar ductile dimple fracture micromorphology; at elevated temperatures, numerous secondary intergranular cracks were observed.

  11. Finite element analysis of temperature history in spot laser welding of steel

    Directory of Open Access Journals (Sweden)

    Shibib Khalid S.

    2009-01-01

    Full Text Available Laser welding reduces the heat input to the workpiece, which is the main goal in the aerospace and electronics industries. A finite element model for axisymmetric transient heat conduction has been used to predict the temperature distribution through a steel cylinder subjected to a CW laser beam with a rectangular beam profile. Many numerical improvements have been used to reduce the calculation time and the size of the program so as to achieve the task in the minimum time required. An experimentally determined absorptivity has been used to determine the heat induced when the laser interacts with the material. The heat affected zone and welding zone have been estimated to determine the effect of welding on the material. The ratio of depth to width of the welding zone can be changed by proper selection of beam power to meet specific production requirements. The temperature history obtained numerically has been compared with experimental data, indicating good agreement.

  12. Effects of heat production on the temperature pattern and stresses on frictional hardening of cylindrical components

    International Nuclear Information System (INIS)

    Maksimovich, V.M.; Kratyuk, P.B.; Babei, Yu.I.; Maksimishin, M.D.

    1992-01-01

    Metal heating occurs during pulse hardening, which influences the structure, state of strain, and physicomechanical properties, which in turn affect service life. Difficulties exist in measuring the resulting temperature distributions because of the lag of existing methods. More accurate estimates of temperature distributions may often be obtained using theoretical methods, which involve solving coupled problems in the theory of elasticity and thermal conductivity. In this work, a planar contact case in thermoelasticity is considered for frictional hardening, in which the friction disk and the workpiece are represented as an elastic plunger and an elastic body. It is assumed that the contact normal and tangential stresses are related by Coulomb's law. A solution method is also given which enables the thermoelastic state to be determined with a given accuracy in the contact region for high disk speeds. 5 refs., 2 figs., 1 tab

  13. 40 CFR 91.313 - Analyzers required.

    Science.gov (United States)

    2010-07-01

    ... the heated flame ionization (HFID) type. (ii) For the HFID system, if the temperature of the exhaust gas at the sample probe is below 190 °C, the temperature of the valves, pipe work, and so forth, must be controlled so as to maintain a wall temperature of 190 ±11 °C. If the temperature of the exhaust...

  14. Temperature fluctuations superimposed on background temperature change

    International Nuclear Information System (INIS)

    Otto, James; Roberts, J.A.

    2016-01-01

    Proxy data allow the temperature of the Earth to be mapped over long periods of time. In this work the temperature fluctuations in over 200 proxy data sets were examined, and from this set 50 sets were analyzed to test for periodic and quasi-periodic fluctuations. Temperature reconstructions over 4 different time scales were analyzed to see if patterns emerged. Data were put into four time intervals; 4,000 years, 14,000 years, 1,000,000 years, and 3,000,000 years, and analyzed with the goal of understanding periodic and quasi-periodic patterns in global temperature change superimposed on a “background” average temperature change. Quasi-periodic signatures were identified that predate the Industrial Revolution, during much of which direct data on temperature are not available. These data indicate that Earth temperatures have undergone a number of periodic and quasi-periodic intervals that contain both global warming and global cooling cycles. The fluctuations are superimposed on a background of temperature change that has a declining slope during the two periods, pre-ice age and post-ice age, with a transition about 12,000 BCE. The data are divided into “events” that span the time periods 3,000,000 BCE to “0” CE, 1,000,000 BCE to “0” CE, 12,000 BCE to 2,000 CE and 2,000 BCE to 2,000 CE. An equation using quasi-periodic patterns (frequency-modulated sine waves) was developed to analyze the data sets for quasi-periodic patterns. “Periodicities” which show reasonable agreement with the predictions of Milankovitch and other investigators were found in the data sets.
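    The frequency-modulated sine-wave analysis mentioned above can be illustrated with a short sketch. The model form, the parameter values, and the coarse grid search below are illustrative assumptions, not the authors' actual equation or fitting procedure:

```python
import math

def fm_sine(t, amplitude, f0, mod_depth, f_mod, phase=0.0):
    """Quasi-periodic signal: a sine wave whose instantaneous phase is
    itself modulated sinusoidally (a frequency-modulated sine wave)."""
    return amplitude * math.sin(2 * math.pi * f0 * t
                                + mod_depth * math.sin(2 * math.pi * f_mod * t)
                                + phase)

def sum_squared_error(times, values, params):
    """Misfit between a proxy temperature series and the model."""
    return sum((v - fm_sine(t, *params)) ** 2 for t, v in zip(times, values))

# Synthetic 'proxy' series generated from known parameters ...
true_params = (1.0, 0.01, 0.5, 0.001)
times = [float(t) for t in range(0, 1000, 10)]
values = [fm_sine(t, *true_params) for t in times]

# ... recovered by a coarse grid search over the modulation depth alone.
best = min(((sum_squared_error(times, values, (1.0, 0.01, d, 0.001)), d)
            for d in (0.0, 0.25, 0.5, 0.75, 1.0)), key=lambda x: x[0])
print(best[1])  # 0.5, the depth that reproduces the series exactly
```

A real analysis would fit all parameters simultaneously with a nonlinear least-squares routine; the grid search here only shows the shape of the problem.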

  15. Plasma diagnostics with a retarding potential analyzer

    International Nuclear Information System (INIS)

    Jack, T.M.

    1996-01-01

    The plasma rocket is located at NASA Johnson Space Center. To produce a thrust in space, an inert gas is ionized into a plasma and heated in the linear section of a tokamak fusion device. The magnetic field used to contain the plasma has a magnitude of 2--10 kGauss. The plasma plume has a variable thrust and specific impulse. A high temperature retarding potential analyzer (RPA) is being developed to characterize the plasma in the plume and at the edge of the magnetically contained plasma. The RPA measures the energy and density of ions or electrons entering into its solid angle of collection. An oscilloscope displays the ion flux versus the collected current. All measurements are made relative to the facility ground. Testing of this device involves the determination of its output parameters, sensitivity, and responses to a wide range of energies and densities. Each grid will be tested individually by changing only its voltage and observing the output from the RPA. To verify that the RPA is providing proper output, it is compared to the output from a Langmuir or Faraday probe

  16. PLT and PDX perpendicular charge-exchange analyzers

    International Nuclear Information System (INIS)

    Mueller, D.; Hammett, G.W.; McCune, D.C.

    1986-01-01

    The perpendicular charge-exchange systems used on the poloidal divertor experiment and the Princeton large torus comprise ten-channel, mass-resolved, charge-exchange analyzers. Results from these systems indicate that instrumental effects can lead to erroneous temperature measurements during deuterium neutral beam injection or at low hydrogen concentrations

  17. 40 CFR 90.313 - Analyzers required.

    Science.gov (United States)

    2010-07-01

    ... of the exhaust gas at the sample probe is below 190 °C, the temperature of the valves, pipe work, and... temperature of the exhaust gas at the sample probe is above 190 °C, the temperature of the valves, pipe work... and carbon dioxide measurements must be made on a dry basis (for raw exhaust measurement only...

  18. An Experimental Investigation of Cutting Temperature and Tool Wear in 2 Dimensional Ultrasonic Vibrations Assisted Micro-Milling

    Directory of Open Access Journals (Sweden)

    Ibrahim Mohd Rasidi

    2017-01-01

    Full Text Available Two-dimensional ultrasonic vibration assisted milling (2D UVAM) is a well-known process that uses a high-tech system to apply ultrasonic-range frequencies to the milling process. More industries are now taking this opportunity to improve their productivity without decreasing product accuracy. This paper investigates a comparison between UVAM and conventional machining (CM) with respect to tool wear and cutting temperature in the milling process. A micro-amplitude, sine-wave-frequency vibration is generated in the workpiece jig by a piezo-actuator, creating a micro gap that allows heat to be removed effectively with the chips produced. A more complex tool trajectory for 2D UVAM was found during this research. The approach of the tool tip to the workpiece surface is affected by the amplitude displacement and the applied frequency. It is found that tool wear was reduced and surface roughness improved by applying 2D UVAM compared to CM when choosing the optimum amplitude and an appropriate frequency.

  19. Methods for Analyzing Electric Load Shape and its Variability

    Energy Technology Data Exchange (ETDEWEB)

    Price, Philip

    2010-05-12

    Current methods of summarizing and analyzing electric load shape are discussed briefly and compared. Simple rules of thumb for graphical display of load shapes are suggested. We propose a set of parameters that quantitatively describe the load shape in many buildings. Using the example of a linear regression model to predict load shape from time and temperature, we show how quantities such as the load's sensitivity to outdoor temperature, and the effectiveness of demand response (DR), can be quantified. Examples are presented using real building data.
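    As a sketch of the regression idea described above, load can be modelled as a linear function of hour-of-day and outdoor temperature, with the temperature coefficient quantifying the load's sensitivity to outdoor temperature. The data and the simple functional form are invented for illustration; the report's actual model is richer:

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(3):
            if r != i:
                f = M[r][i] / M[i][i]
                M[r] = [a - f * c for a, c in zip(M[r], M[i])]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_load_model(hours, temps, loads):
    """Least-squares fit of load = b0 + b1*hour + b2*temperature via the
    normal equations; b2 is the temperature sensitivity."""
    X = [[1.0, h, T] for h, T in zip(hours, temps)]
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    Xty = [sum(r[i] * y for r, y in zip(X, loads)) for i in range(3)]
    return solve3(XtX, Xty)

# Synthetic building data: load (kW) rises 2 kW per degree C, 0.5 kW per hour.
hours = [0, 6, 12, 18, 0, 6, 12, 18]
temps = [15, 18, 25, 22, 14, 17, 26, 21]
loads = [50 + 0.5 * h + 2.0 * T for h, T in zip(hours, temps)]
b0, b1, b2 = fit_load_model(hours, temps, loads)
print(round(b2, 3))  # 2.0, the recovered temperature sensitivity
```
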

  20. Modelling of Strains During SAW Surfacing Taking into Account the Heat of the Weld in the Temperature Field Description and Phase Transformations

    Science.gov (United States)

    Winczek, J.; Makles, K.; Gucwa, M.; Gnatowska, R.; Hatala, M.

    2017-08-01

    In the paper, the model of the thermal and structural strain calculation in a steel element during single-pass SAW surfacing is presented. The temperature field is described analytically assuming a bimodal volumetric model of the heat source and a semi-infinite body model of the surfaced (rebuilt) workpiece. The electric arc is treated physically as one heat source. Part of the heat is transferred by the direct impact of the electric arc, while another part of the heat is transferred to the weld by the melted material of the electrode. Kinetics of phase transformations during heating is limited by temperature values at the beginning and at the end of austenitic transformation, while the progress of phase transformations during cooling is determined on the basis of the TTT-welding diagram and the JMA-K law for diffusive transformations, and the K-M law for martensitic transformation. Total strains are equal to the sum of thermal and structural strains induced by phase transformations in the welding cycle.

  1. Using finite element modelling to examine the flow process and temperature evolution in HPT under different constraining conditions

    International Nuclear Information System (INIS)

    Pereira, P H R; Langdon, T G; Figueiredo, R B; Cetlin, P R

    2014-01-01

    High-pressure torsion (HPT) is a metal-working technique used to impose severe plastic deformation into disc-shaped samples under high hydrostatic pressures. Different HPT facilities have been developed and they may be divided into three distinct categories depending upon the configuration of the anvils and the restriction imposed on the lateral flow of the samples. In the present paper, finite element simulations were performed to compare the flow process, temperature, strain and hydrostatic stress distributions under unconstrained, quasi-constrained and constrained conditions. It is shown there are distinct strain distributions in the samples depending on the facility configurations and a similar trend in the temperature rise of the HPT workpieces

  2. A mathematical approach based on finite differences method for analyzing the temperature field in arc welding of stainless steel thin sheets; Desarrollo de un modelo matematico de diferencias finitas para el analisis del campo de temperaturas en la soldadura por arco de chapas finas de acero inoxidable

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Conesa, E.J.; Estrems, M.; Miguel, V.

    2010-07-01

    This work develops a finite difference method to evaluate the temperature field in the heat affected zone in butt welding of AISI 304 stainless steel thin sheet by the GTAW process. A computer program has been developed and implemented in Visual Basic for Applications (VBA) in an MS-Excel spreadsheet. The results obtained using the numerical application predict the thermal behaviour of arc welding processes. An experimental methodology has been developed to validate the mathematical model that allows the temperature to be measured at several points close to the weld bead. The methodology is applied to a stainless steel sheet with a thickness lower than 3 mm, although it may be used for other steels and welding processes such as MIG/MAG and SMAW. The data obtained from the experimental procedure have been used to validate the results calculated by the finite difference numerical method. The mathematical model adjustment has been carried out taking into account the experimental results. The differences found between the experimental and theoretical approaches are due to the convection and radiation heat losses, which have not been considered in the simulation model. With this simple model, the designer will be able to calculate the thermal cycles that take place in the process as well as predict the temperature field in the proximity of the weld bead. (Author). 18 refs.
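    The finite difference approach can be sketched in miniature with a generic explicit 2D conduction update: each interior node moves toward the average of its neighbours, plus any local heat input. The grid size, material constants and point source below are simplifying assumptions for illustration, not the authors' VBA implementation; like their model, it ignores convection and radiation losses:

```python
def step_temperature(T, alpha, dx, dt, source):
    """One explicit finite-difference step of 2D heat conduction.
    Edge nodes are never updated, acting as a fixed-temperature boundary;
    convection and radiation losses are omitted."""
    ny, nx = len(T), len(T[0])
    r = alpha * dt / dx ** 2          # stability requires r <= 0.25 in 2D
    new = [row[:] for row in T]
    for i in range(1, ny - 1):
        for j in range(1, nx - 1):
            lap = T[i+1][j] + T[i-1][j] + T[i][j+1] + T[i][j-1] - 4 * T[i][j]
            new[i][j] = T[i][j] + r * lap + dt * source(i, j)
    return new

# Illustrative values only: a 21x21 patch of thin stainless sheet,
# alpha ~ 4e-6 m^2/s, 1 mm node spacing, heat input at the weld spot.
alpha, dx, dt = 4e-6, 1e-3, 0.05
T = [[20.0] * 21 for _ in range(21)]
heat = lambda i, j: 400.0 if (i, j) == (10, 10) else 0.0
for _ in range(100):
    T = step_temperature(T, alpha, dx, dt, heat)
print(T[10][10] > T[10][0])  # True: hottest at the heat source
```
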

  3. ADAM: Analyzer for Dialectal Arabic Morphology

    Directory of Open Access Journals (Sweden)

    Wael Salloum

    2014-12-01

    Full Text Available While Modern Standard Arabic (MSA) has many resources, Arabic Dialects, the primarily spoken local varieties of Arabic, are quite impoverished in this regard. In this article, we present ADAM (Analyzer for Dialectal Arabic Morphology). ADAM is a poor man's solution to quickly develop morphological analyzers for dialectal Arabic. ADAM has roughly half the out-of-vocabulary rate of a state-of-the-art MSA analyzer and is comparable in its recall performance to an Egyptian dialectal morphological analyzer that took years and expensive resources to build.

  4. Interactive nuclear plant analyzer for the VVER-440 reactor

    International Nuclear Information System (INIS)

    Shier, W.; Kennett, R.

    1993-01-01

    An interactive nuclear plant analyzer (NPA) has been developed for a VVER-440 model 213 reactor for use in the training of plant personnel, the development and verification of plant operating procedures, and in the analysis of various anticipated operational occurrences and accident scenarios. This NPA is operational on an IBM RISC-6000 workstation and utilizes the RELAP5/MOD2 computer code for the calculation of the VVER-440 reactor response to the interactive commands initiated by the NPA operator. Results of the interactive calculation can be displayed through the user-defined, digital display of various plant parameters and through color changes that reflect changes in primary system fluid temperatures, fuel and clad temperatures, and the temperatures of other metal structures. In addition, changes in the status of various components and systems can be initiated and/or displayed both numerically and graphically on the mask

  5. Time-delay analyzer with continuous discretization

    International Nuclear Information System (INIS)

    Bayatyan, G.L.; Darbinyan, K.T.; Mkrtchyan, K.K.; Stepanyan, S.S.

    1988-01-01

    A time-delay analyzer is described which, when triggered by a start pulse of adjustable duration, performs continuous discretization of the analyzed signal within nearly 22 ns time intervals, recording it in a memory unit with subsequent slow read-out of the information to the computer for processing. The time-delay analyzer consists of four CAMAC-VECTOR systems of unit width. With its help one can separate comparatively short, small-amplitude, rare signals against the background of quasistationary noise processes. 4 refs.; 3 figs

  6. Systems Analyze Water Quality in Real Time

    Science.gov (United States)

    2010-01-01

    A water analyzer developed under Small Business Innovation Research (SBIR) contracts with Kennedy Space Center now monitors treatment processes at water and wastewater facilities around the world. Originally designed to provide real-time detection of nutrient levels in hydroponic solutions for growing plants in space, the ChemScan analyzer, produced by ASA Analytics Inc., of Waukesha, Wisconsin, utilizes spectrometry and chemometric algorithms to automatically analyze multiple parameters in the water treatment process with little need for maintenance, calibration, or operator intervention. The company has experienced a compound annual growth rate of 40 percent over its 15-year history as a direct result of the technology's success.

  7. Temperature dependent anomalous statistics

    International Nuclear Information System (INIS)

    Das, A.; Panda, S.

    1991-07-01

    We show that the anomalous statistics which arises in 2 + 1 dimensional Chern-Simons gauge theories can become temperature dependent in the most natural way. We analyze and show that a statistics-changing phase transition can happen in these theories only as T → ∞. (author). 14 refs

  8. On-Demand Urine Analyzer, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an analyzer that can be integrated into International Space Station (ISS) toilets to measure key...

  9. Low Gravity Drug Stability Analyzer, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this proposed program (through Phase III) is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation....

  10. New high voltage parallel plate analyzer

    International Nuclear Information System (INIS)

    Hamada, Y.; Kawasumi, Y.; Masai, K.; Iguchi, H.; Fujisawa, A.; Abe, Y.

    1992-01-01

    A new modification on the parallel plate analyzer for 500 keV heavy ions to eliminate the effect of the intense UV and visible radiations, is successfully conducted. Its principle and results are discussed. (author)

  11. Analyzing the economic impacts of transportation projects.

    Science.gov (United States)

    2013-09-01

    The main goal of the study is to explore methods, approaches and analytical software tools for analyzing economic activity that results from large-scale transportation investments in Connecticut. The primary conclusion is that the transportation...

  12. Digital dynamic amplitude-frequency spectra analyzer

    International Nuclear Information System (INIS)

    Kalinnikov, V.A.; )

    2006-01-01

    The spectrum analyzer is intended for the dynamic spectral analysis of signals from physical installations and for noise filtering. A recurrent Fourier transformation algorithm is used in the digital dynamic analyzer. It is realized on the basis of a fast-logic FPGA matrix and a special ADSP signal microprocessor. The discretization frequency is 2 kHz-10 MHz. The number of calculated spectral coefficients is not less than 512. The functional speed is 20 ns

  13. FST Based Morphological Analyzer for Hindi Language

    OpenAIRE

    Deepak Kumar; Manjeet Singh; Seema Shukla

    2012-01-01

    Hindi being a highly inflectional language, FST (Finite State Transducer) based approach is most efficient for developing a morphological analyzer for this language. The work presented in this paper uses the SFST (Stuttgart Finite State Transducer) tool for generating the FST. A lexicon of root words is created. Rules are then added for generating inflectional and derivational words from these root words. The Morph Analyzer developed was used in a Part Of Speech (POS) Tagger based on Stanford...

  14. Framework for analyzing hyper-viscoelastic polymers

    Science.gov (United States)

    Trivedi, Akash; Siviour, Clive

    2017-06-01

    Hyper-viscoelastic polymers have multiple areas of application including aerospace, biomedicine, and automotive. Their mechanical responses are therefore extremely important to understand, particularly because they exhibit strong rate and temperature dependence, including a low temperature brittle transition. Relationships between the response at various strain rates and temperatures are investigated and a framework developed to predict response at rates where experiments are unfeasible. A master curve of the storage modulus's rate dependence at a reference temperature is constructed using a DMA test of the polymer. A frequency sweep spanning two decades and a temperature range from pre-glass transition to pre-melt is used. A fractional derivative model is fitted to the experimental data, and this model's parameters are used to derive stress-strain relationships at a desired strain rate. Finite element simulations with this constitutive model are used for verification with experimental data. This material is based upon work supported by the Air Force Office of Scientific Research, Air Force Materiel Command, USAF under Award No. FA9550-15-1-0448.

  15. Temperature Pill

    Science.gov (United States)

    1988-01-01

    The Ingestible Thermal Monitoring System was developed at Johns Hopkins University as a means of getting internal temperature readings for treatment of such emergency conditions as dangerously low (hypothermia) and dangerously high (hyperthermia) body temperatures. The ITMS's accuracy is off by no more than one hundredth of a degree, and it provides the only means of obtaining deep body temperature. The system has additional applicability in fertility monitoring and some aspects of surgery, critical care obstetrics, metabolic disease treatment, gerontology (aging) and food processing research. The three-quarter-inch silicone capsule, which contains a telemetry system, micro battery, and a quartz crystal temperature sensor, is inserted vaginally or rectally, or swallowed.

  16. The Effect of Process and Model Parameters in Temperature Prediction for Hot Stamping of Boron Steel

    Directory of Open Access Journals (Sweden)

    Chaoyang Sun

    2013-01-01

    Full Text Available Finite element models of the hot stamping and cold die quenching process for boron steel sheet were developed using either rigid or elastic tools. The effect of tool elasticity and process parameters on workpiece temperature was investigated. The heat transfer coefficient between blank and tools was modelled as a function of gap and contact pressure. Temperature distribution and thermal history in the blank were predicted, and the thickness distribution of the blank was obtained. Tests were carried out and the test results were used for the validation of the numerical predictions. The effect of holding load and the size of cooling ducts on temperature distribution during the forming and cold die quenching process was also studied using the two models. The results show that higher accuracy predictions of blank thickness and temperature distribution during deformation were obtained using the elastic tool model. However, temperature results obtained using the rigid tool model were close to those using the elastic tool model for a range of holding loads.
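    The gap- and pressure-dependent heat transfer coefficient can be sketched as a simple piecewise function. The functional form and every number below are illustrative guesses, not the published model:

```python
def heat_transfer_coefficient(gap_mm, pressure_mpa):
    """Interfacial heat transfer coefficient (W/m^2.K) between blank and tool,
    modelled as a function of gap and contact pressure. Illustrative only."""
    if gap_mm > 0.0:
        # Out of contact: conduction across the air gap decays with distance.
        return 1300.0 / (1.0 + 10.0 * gap_mm)
    # In contact: the coefficient grows with pressure and saturates.
    return 1300.0 + 2700.0 * pressure_mpa / (pressure_mpa + 10.0)

# Full contact under pressure conducts far better than across a gap.
print(heat_transfer_coefficient(0.5, 0.0) < heat_transfer_coefficient(0.0, 30.0))  # True
```
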

  17. The use of cutting temperature to evaluate the machinability of titanium alloys.

    Science.gov (United States)

    Kikuchi, Masafumi

    2009-02-01

    This study investigated the machinability of titanium, two commercial titanium alloys (Ti-6Al-4V and Ti-6Al-7Nb) and free-cutting brass using the cutting temperature. The cutting temperature was estimated by measuring the thermal electromotive force of the tool-workpiece thermocouple during cutting. The thermoelectric power of each metal relative to the tool had previously been determined. The metals were slotted using a milling machine and carbide square end mills under four cutting conditions. The cutting temperatures of Ti-6Al-4V and Ti-6Al-7Nb were significantly higher than that of titanium, while that of the free-cutting brass was lower. This result coincided with the relative magnitudes of the cutting forces measured in a previous study. For each metal, the cutting temperature became higher when the depth of cut or the cutting speed and feed increased. An increase in the cutting speed and feed was more influential on the value than an increase in the depth of cut when two cutting conditions with the same removal rates were compared. The results demonstrated that cutting temperature measurement can be utilized to develop new materials for dental CAD/CAM applications and to optimize cutting conditions.
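    The tool-workpiece thermocouple infers the interface temperature from the thermal electromotive force generated at the hot junction between tool and workpiece. A minimal sketch, assuming a constant relative thermoelectric power; in practice the calibration is nonlinear and is determined for each metal, as in the study:

```python
def emf_to_temperature(emf_mv, seebeck_uv_per_c, reference_c=20.0):
    """Convert a measured tool-workpiece thermocouple EMF into an interface
    temperature, assuming a constant thermoelectric power for the pair.
    seebeck_uv_per_c is the workpiece-vs-tool thermoelectric power (uV/C);
    all numbers here are invented for illustration."""
    delta_t = (emf_mv * 1000.0) / seebeck_uv_per_c   # mV -> uV, then /(uV/C)
    return reference_c + delta_t

# Hypothetical calibration of ~10 uV/C for a titanium/carbide pair:
print(emf_to_temperature(5.0, 10.0))  # 520.0 (a 500 C rise above 20 C)
```
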

  18. Cutting Temperature Investigation of AISI H13 in High Speed End Milling

    Directory of Open Access Journals (Sweden)

    Muhammad Riza

    2016-10-01

    Full Text Available Heat produced at the tool-chip interface during high speed milling operations has been known as a significant factor that affects tool life and workpiece geometry or properties. This paper aims to investigate the cutting temperature behaviour of AISI H13 (48 HRC) under high speed machining conditions during pocketing. The experiments were conducted on a CNC vertical machining centre using a PVD coated carbide insert. Milling processes were performed at cutting speeds of 150, 200 and 250 m/min, and feed rates of 0.05, 0.1 and 0.15 mm/tooth. The depths of cut applied were 0.1, 0.15 and 0.2 mm. The tool path method applied in this experiment was contour-in. The results presented in this paper indicate that at higher cutting speeds the cutting temperature is lower than at low cutting speeds, and that decreasing the feed rate also leads to a lower cutting temperature. Cutting temperature phenomena at the corner of the pocket were also investigated. The phenomena showed that the cutting temperature tends to decrease momentarily as the cutter reaches the corner of the pocket or a turning point of the tool path, and to increase sharply just before leaving the corner or turning point.

  19. Response surface and neural network based predictive models of cutting temperature in hard turning

    Directory of Open Access Journals (Sweden)

    Mozammel Mia

    2016-11-01

    Full Text Available The present study aimed to develop predictive models of the average tool-workpiece interface temperature in hard turning of AISI 1060 steel by a coated carbide insert. Response Surface Methodology (RSM) and an Artificial Neural Network (ANN) were employed to predict the temperature with respect to cutting speed, feed rate and material hardness. The number and orientation of the experimental trials, conducted in both dry and high pressure coolant (HPC) environments, were planned using a full factorial design. The temperature was measured using the tool-work thermocouple. In the RSM model, two quadratic equations for temperature were derived from the experimental data. Analysis of variance (ANOVA) and the mean absolute percentage error (MAPE) were used to confirm the adequacy of the models. In the ANN model, 80% of the data were used for training and 20% for testing. As with the RSM model, an error analysis was conducted. The accuracy of the RSM and ANN models was found to be ⩾99%. The ANN models exhibit an error of ∼5% MAE for the testing data. The regression coefficient was found to be greater than 99.9% for both dry and HPC conditions. Both models are acceptable, although the ANN model demonstrated higher accuracy. These models, if employed, are expected to provide better control of the cutting temperature in turning of hardened steel.
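    The RSM half of the study fits quadratic equations to the measured temperatures and judges them with ANOVA and MAPE. A one-factor sketch of that idea with invented data (the actual model is quadratic in cutting speed, feed rate and hardness jointly):

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(3):
            if r != i:
                f = M[r][i] / M[i][i]
                M[r] = [x - f * y for x, y in zip(M[r], M[i])]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_quadratic(x, y):
    """Least-squares fit of temperature = a + b*speed + c*speed^2,
    a one-factor slice of a quadratic response surface."""
    S = lambda p: sum(xi ** p for xi in x)
    A = [[float(len(x)), S(1), S(2)], [S(1), S(2), S(3)], [S(2), S(3), S(4)]]
    rhs = [sum(y),
           sum(xi * yi for xi, yi in zip(x, y)),
           sum(xi * xi * yi for xi, yi in zip(x, y))]
    return solve3(A, rhs)

def mape(actual, predicted):
    """Mean absolute percentage error, used in the study to judge adequacy."""
    return 100.0 * sum(abs((a - p) / a)
                       for a, p in zip(actual, predicted)) / len(actual)

# Invented data: interface temperature (C) versus cutting speed (m/min).
speeds = [60.0, 90.0, 120.0, 150.0, 180.0]
temps = [200.0 + 2.0 * v + 0.01 * v * v for v in speeds]
a, b, c = fit_quadratic(speeds, temps)
preds = [a + b * v + c * v * v for v in speeds]
print(mape(temps, preds) < 0.01)  # True: the surface reproduces the data
```
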

  20. A new automatic analyzer for uranium determination

    International Nuclear Information System (INIS)

    Xia Buyun; Zhu Yaokun; Wang Bin; Cong Peiyan; Zhang Lan

    1992-08-01

    An intelligent automatic analyzer for uranium based on the principle of flow injection analysis (FIA) has been developed. It can directly determine uranium in solution in the range of 0.02 to 500 mg/L without any pre-processing. A chromatographic column loaded with extractant, in which trace uranium is concentrated and separated, gives the analyzer a special ability to enrich uranium and is connected to the manifold. The analyzer is suited for trace uranium determination in various samples. 2-(5-bromo-2-pyridylazo)-5-diethylaminophenol (Br-PADAP) is used as the color reagent. Uranium is determined in aqueous solution by adding a cationic surfactant, cetylpyridinium bromide (CPB). The analysis rate is 30 to 90 samples per hour, and the relative standard deviation of determination is 1% ∼ 2%. The analyzer has been used in factories and laboratories with satisfactory results. The determination range can easily be changed using a multi-function auto-injection valve that changes the injection volume of the sample and the channels, so the analyzer can adopt various FIA operation modes to meet the needs of FIA determination of other substances. The analyzer has universal functions

  1. Scintiscans data analyzer model AS-10

    International Nuclear Information System (INIS)

    Malesa, J.; Wierzbicki, W.

    1975-01-01

    The principle of operation and the construction of a device for analyzing scintiscan data by ''square root scaling'' is presented. The device is equipped with a cassette tape recorder, type MK-125, made in Poland, which serves as a scintiscan data bank, and with three programs for scintiscan data analysis. Cassettes of two types, C-60 and C-90, are used, with working times of 2 x 30 min and 2 x 45 min respectively. The results of the scintiscan data analysis are printed by an electric typewriter as figures in the form of a digital scintigram. (author)
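The ''square root scaling'' the device implements can be illustrated with a short sketch: taking the square root of scintiscan counts stabilizes the Poisson noise, and the scaled values can then be quantized into the digit levels of a printed digital scintigram. The count matrix and the 0-9 level mapping below are assumptions for illustration.

```python
import numpy as np

# Invented 2-D grid of scintiscan counts.
counts = np.array([[100, 400, 900],
                   [1600, 2500, 3600]])

# Square root scaling: Poisson noise grows as sqrt(count), so the
# transformed data have roughly uniform noise across the image.
scaled = np.sqrt(counts)

# Quantize to single digits, as a printed digital scintigram would show.
levels = np.floor(9 * scaled / scaled.max()).astype(int)
print(levels)
```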

  2. Analyzing Engineered Nanoparticles using Photothermal Infrared Spectroscopy

    DEFF Research Database (Denmark)

    Yamada, Shoko

    . To facilitate occupational safety and health there is a need to develop instruments to monitor and analyze nanoparticles in industrial, research and urban environments. The aim of this Ph.D. project was to develop new sensors that can analyze engineered nanoparticles. Two sensors were studied: (i) a miniaturized toxicity sensor based on electrochemistry and (ii) a photothermal spectrometer based on tensile-stressed mechanical resonators (string resonators). Miniaturization of a toxicity sensor targeting engineered nanoparticles was explored. This concept was based on the results of the biodurability test...

  3. Analyzing Web Behavior in Indoor Retail Spaces

    OpenAIRE

    Ren, Yongli; Tomko, Martin; Salim, Flora; Ong, Kevin; Sanderson, Mark

    2015-01-01

    We analyze 18 million rows of Wi-Fi access logs collected over a one year period from over 120,000 anonymized users at an inner-city shopping mall. The anonymized dataset gathered from an opt-in system provides users' approximate physical location, as well as Web browsing and some search history. Such data provides a unique opportunity to analyze the interaction between people's behavior in physical retail spaces and their Web behavior, serving as a proxy to their information needs. We find: ...

  4. Analyzing Log Files using Data-Mining

    Directory of Open Access Journals (Sweden)

    Marius Mihut

    2008-01-01

    Full Text Available Information systems (i.e. servers, applications and communication devices) create a large amount of monitoring data that are saved as log files. For analyzing them, a data-mining approach is helpful. This article presents the steps necessary for creating an 'analyzing instrument' based on the open source software Waikato Environment for Knowledge Analysis (Weka) [1]. For exemplification, a system log file created by a Windows-based operating system is used as the input file.

  5. BWR plant analyzer development at BNL

    International Nuclear Information System (INIS)

    Cheng, H.S.; Wulff, W.; Mallen, A.N.; Lekach, S.V.; Stritar, A.; Cerbone, R.J.

    1985-01-01

    Advanced technology for high-speed interactive nuclear power plant simulations is of great value for timely resolution of safety issues, for plant monitoring, and for computer-aided emergency responses to an accident. Presented is the methodology employed at BNL to develop a BWR plant analyzer capable of simulating severe plant transients at much faster than real-time process speeds. Five modeling principles are established and a criterion is given for selecting numerical procedures and efficient computers to achieve the very high simulation speeds. Typical results are shown to demonstrate the modeling fidelity of the BWR plant analyzer

  6. X-ray fluorescence analyzer arrangement

    International Nuclear Information System (INIS)

    Vatai, Endre; Ando, Laszlo; Gal, Janos.

    1981-01-01

    An x-ray fluorescence analyzer for the quantitative determination of one or more elements of complex samples is reported. The novelties of the invention are the excitation of the samples by x-rays or γ-radiation, the application of a balanced filter pair as energy selector, and the measurement of the current or ion charge of ionization detectors used as sensors. Due to the increased sensitivity and accuracy, the novel design can extend the application fields of x-ray fluorescence analyzers. (A.L.)

  7. A Novel Architecture For Multichannel Analyzer

    International Nuclear Information System (INIS)

    Marcus, E.; Elhanani, I.; Nir, J.; Ellenbogen, M.; Kadmon, Y.; Tirosh, D.

    1999-01-01

    A novel digital approach to a real-time, high-throughput, low-cost Multichannel Analyzer (MCA) for radiation spectroscopy is presented. The MCA input is a shaped nuclear pulse sampled at a high rate, using an Analog-to-Digital Converter (ADC) chip. The digital samples are analyzed by a state-of-the-art Field Programmable Gate Array (FPGA). A customized algorithm is utilized to estimate the peak of the pulse, to reject pile-up and to eliminate processing dead time. The estimated peaks of valid pulses are transferred to a microcontroller system that creates the histogram and controls the Human Machine Interface (HMI)
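The digital pipeline described, sampling a shaped pulse, estimating its peak, and histogramming the result into channels, can be sketched roughly as below. The pulse shape, the parabolic peak interpolation, and the channel count are illustrative assumptions, not the analyzer's actual algorithm.

```python
import numpy as np

def gaussian_pulse(amp, n=64, t0=32, sigma=6.0):
    # Stand-in for a shaped nuclear pulse sampled by the ADC.
    t = np.arange(n)
    return amp * np.exp(-0.5 * ((t - t0) / sigma) ** 2)

def estimate_peak(samples):
    # Parabolic interpolation around the largest sample refines the
    # peak estimate beyond the raw sample grid.
    i = int(np.argmax(samples))
    y0, y1, y2 = samples[i - 1], samples[i], samples[i + 1]
    denom = y0 - 2 * y1 + y2
    if denom == 0:
        return y1
    offset = 0.5 * (y0 - y2) / denom
    return y1 - 0.25 * (y0 - y2) * offset

# Histogram peaks into a 4096-channel spectrum, as an MCA would.
histogram = np.zeros(4096, dtype=int)
for amp in (500.2, 1200.7, 1200.9, 3000.5):
    channel = min(int(estimate_peak(gaussian_pulse(amp))), 4095)
    histogram[channel] += 1

print(np.flatnonzero(histogram))  # channels that received counts
```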

  8. World Ocean Atlas 2005, Temperature

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — World Ocean Atlas 2005 (WOA05) is a set of objectively analyzed (1° grid) climatological fields of in situ temperature, salinity, dissolved oxygen, Apparent Oxygen...

  9. Temperature trends with reduced impact of ocean air temperature

    DEFF Research Database (Denmark)

    Lansner, Frank; Pedersen, Jens Olaf Pepke

    2018-01-01

    Temperature data 1900–2010 from meteorological stations across the world have been analyzed and it has been found that all land areas generally have two different valid temperature trends. Coastal stations and hill stations facing ocean winds are normally more warm-trended than the valley station...

  10. Fluidization quality analyzer for fluidized beds

    Science.gov (United States)

    Daw, C.S.; Hawk, J.A.

    1995-07-25

    A control loop and fluidization quality analyzer for a fluidized bed utilizes time varying pressure drop measurements. A fast-response pressure transducer measures the overall bed pressure drop, or over some segment of the bed, and the pressure drop signal is processed to produce an output voltage which changes with the degree of fluidization turbulence. 9 figs.
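The idea of reducing the fluctuating pressure-drop signal to a single fluidization-quality output can be sketched as follows. The synthetic signals and the use of the standard deviation as a turbulence index are assumptions for illustration, not the patented processing.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000

# Invented pressure-drop signals (kPa): same mean static drop, different
# fluctuation levels for smooth vs turbulent fluidization.
smooth_bed = 5.0 + 0.05 * rng.normal(size=n)
turbulent_bed = 5.0 + 0.60 * rng.normal(size=n)

def turbulence_index(signal):
    # Remove the mean (static bed pressure drop); the spread of what is
    # left tracks the degree of fluidization turbulence.
    return float(np.std(signal - signal.mean()))

print(turbulence_index(smooth_bed) < turbulence_index(turbulent_bed))  # True
```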

  11. Analyzing the Biology on the System Level

    OpenAIRE

    Tong, Wei

    2016-01-01

    Although various genome projects have provided us with enormous amounts of static sequence information, understanding sophisticated biology still requires integrating computational modeling, system analysis, technology development for experiments, and quantitative experiments to analyze the biological architecture on various levels, which is the origin of the systems biology field. This review discusses the object, its characteristics, and research attentions in systems biology,...

  12. Analyzing the Acoustic Beat with Mobile Devices

    Science.gov (United States)

    Kuhn, Jochen; Vogt, Patrik; Hirth, Michael

    2014-01-01

    In this column, we have previously presented various examples of how physical relationships can be examined by analyzing acoustic signals using smartphones or tablet PCs. In this example, we will be exploring the acoustic phenomenon of beats, which is produced by the overlapping of two tones with a small difference in frequency Δf. The…

  13. Automatic radioxenon analyzer for CTBT monitoring

    International Nuclear Information System (INIS)

    Bowyer, T.W.; Abel, K.H.; Hensley, W.K.

    1996-12-01

    Over the past 3 years, with support from the US DOE's NN-20 Comprehensive Test Ban Treaty (CTBT) R&D program, PNNL has developed and demonstrated a fully automatic analyzer for collecting and measuring the four Xe radionuclides, 131mXe (11.9 d), 133mXe (2.19 d), 133Xe (5.24 d), and 135Xe (9.10 h), in the atmosphere. These radionuclides are important signatures in monitoring for compliance with a CTBT. Activity ratios permit discriminating radioxenon from a nuclear detonation from that due to nuclear reactor operations, nuclear fuel reprocessing, or medical isotope production and usage. In the analyzer, Xe is continuously and automatically separated from the atmosphere at flow rates of about 7 m³/h on a sorption bed. Aliquots collected for 6-12 h are automatically analyzed by electron-photon coincidence spectrometry to produce sensitivities in the range of 20-100 μBq/m³ of air, about 100-fold better than reported laboratory-based procedures for short collection intervals. Spectral data are automatically analyzed, and the calculated radioxenon concentrations and raw gamma-ray spectra are automatically transmitted to data centers
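The activity-ratio discrimination mentioned in the abstract rests on simple decay arithmetic: because 135Xe is much shorter-lived than 133Xe, their activity ratio falls rapidly with time since release. The half-lives are taken from the abstract; the initial ratio of 1.0 is an illustrative assumption.

```python
import math

T_HALF_XE133 = 5.24 * 24   # hours (5.24 d, from the abstract)
T_HALF_XE135 = 9.10        # hours (from the abstract)

def decay_factor(t_half, t):
    # Fraction of activity remaining after t hours.
    return math.exp(-math.log(2) * t / t_half)

ratio_at_release = 1.0     # assumed, for illustration
for t in (0, 12, 24, 48):
    ratio = (ratio_at_release * decay_factor(T_HALF_XE135, t)
             / decay_factor(T_HALF_XE133, t))
    print(f"t = {t:2d} h: 135Xe/133Xe = {ratio:.3f}")
```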

  14. How to Analyze Company Using Social Network?

    Science.gov (United States)

    Palus, Sebastian; Bródka, Piotr; Kazienko, Przemysław

    Every single company or institution wants to utilize its resources in the most efficient way. In order to do so, they have to have a good structure. A new way to analyze company structure by utilizing the natural social network existing within the company, together with an example of its use on the Enron company, is presented in this paper.

  15. Environmental applications of the centrifugal fast analyzer

    International Nuclear Information System (INIS)

    Goldstein, G.; Strain, J.E.; Bowling, J.L.

    1975-12-01

    The centrifugal fast analyzer (GeMSAEC Fast Analyzer) was applied to the analysis of pollutants in air and water. Since data acquisition and processing are computer controlled, considerable effort went into devising appropriate software. A modified version of the standard FOCAL interpreter was developed which includes special machine language functions for data timing, acquisition, and storage, and also permits chaining together of programs stored on a disk. Programs were written and experimental procedures developed to implement spectrophotometric, turbidimetric, kinetic (including initial-rate, fixed-time, and variable-time techniques), and chemiluminescence methods of analysis. Analytical methods were developed for the following elements and compounds: SO₂, O₃, Ca, Cr, Cu, Fe, Mg, Se(IV), Zn, Cl⁻, I⁻, NO₂⁻, PO₄³⁻, S²⁻, and SO₄²⁻. In many cases, standard methods could be adapted to the centrifugal analyzer; in others, new methods were employed. In general, analyses performed with the centrifugal fast analyzer were faster, more precise, and more accurate than with conventional instrumentation
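Of the kinetic techniques listed, the initial-rate method is the simplest to illustrate: absorbance is sampled over time and the slope of the early, linear portion is the measured rate. The synthetic absorbance curve and the 3 s fitting window below are assumptions for illustration.

```python
import numpy as np

t = np.linspace(0.0, 10.0, 50)          # s
absorbance = 0.02 * t + 0.001 * t**2    # invented curve, bends at later times

# Initial-rate method: fit a line only to the early portion, where the
# response is still approximately linear in time.
early = t < 3.0
rate, intercept = np.polyfit(t[early], absorbance[early], 1)
print(f"initial rate = {rate:.4f} absorbance units/s")
```

The fitted rate is then read against a calibration curve to give the analyte concentration.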

  16. Analyzing Vessel Behavior Using Process Mining

    NARCIS (Netherlands)

    Maggi, F.M.; Mooij, A.J.; Aalst, W.M.P. van der

    2013-01-01

    In the maritime domain, electronic sensors such as AIS receivers and radars collect large amounts of data about the vessels in a certain geographical area. We investigate the use of process mining techniques for analyzing the behavior of the vessels based on these data. In the context of maritime

  17. Strengthening 4-H by Analyzing Enrollment Data

    Science.gov (United States)

    Hamilton, Stephen F.; Northern, Angela; Neff, Robert

    2014-01-01

    The study reported here used data from the ACCESS 4-H Enrollment System to gain insight into strengthening New York State's 4-H programming. Member enrollment lists from 2009 to 2012 were analyzed using Microsoft Excel to determine trends and dropout rates. The descriptive data indicate declining 4-H enrollment in recent years and peak enrollment…

  18. Methodology for analyzing risk at nuclear facilities

    International Nuclear Information System (INIS)

    Yoo, Hosik; Lee, Nayoung; Ham, Taekyu; Seo, Janghoon

    2015-01-01

    Highlights: • A new methodology for evaluating risk at nuclear facilities was developed. • Five measures reflecting all factors that should be considered in assessing risk were developed. • Attributes covering NMAC and nuclear security culture are included in the analysis. • The newly developed methodology can be used to evaluate the risk of both existing facilities and future nuclear systems. - Abstract: A methodology for evaluating risks at nuclear facilities is developed in this work. A series of measures is drawn from the analysis of the factors that determine risk. Five measures are created to evaluate risks at nuclear facilities: the legal and institutional framework, material control, physical protection system effectiveness, human resources, and consequences. Evaluation attributes are developed for each measure, and specific values are assigned so that the risk value can be calculated quantitatively. Questionnaires are drawn up on whether a state has properly established a legal and regulatory framework (based on international standards); these questionnaires can be a useful measure for comparing the status of the physical protection regime between two countries. Analyzing an insider threat is not an easy task, and no methodology had previously been developed for this purpose. In this study, attributes that can quantitatively evaluate an insider threat, in the case of an unauthorized removal of nuclear materials, are developed by adopting the Nuclear Material Accounting & Control (NMAC) system. The effectiveness of a physical protection system, P(E), can be analyzed by calculating the probability of interruption, P(I), and the probability of neutralization, P(N). In this study, the Tool for Evaluating Security System (TESS) code developed by KINAC is used to calculate P(I) and P(N). Consequence is an important measure used to analyze risks at nuclear facilities. This measure comprises radiological, economic, and social damage. Social and
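The effectiveness relation quoted in the abstract, P(E) from the probability of interruption P(I) and of neutralization P(N), reduces to a product for a single adversary path. The sketch below uses that simple product with illustrative probabilities; the actual TESS calculation is more involved.

```python
def effectiveness(p_interruption, p_neutralization):
    # Single-path form: the protection system succeeds only if the
    # adversary is both interrupted and then neutralized.
    return p_interruption * p_neutralization

p_e = effectiveness(0.90, 0.80)   # illustrative values, not TESS output
print(f"P(E) = {p_e:.2f}")
```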

  19. Development of pulse neutron coal analyzer

    International Nuclear Information System (INIS)

    Jing Shiwie; Gu Deshan; Qiao Shuang; Liu Yuren; Liu Linmao; Jing Shiwei

    2005-01-01

    This article introduces the development of a pulsed neutron coal analyzer using pulsed fast-thermal neutron analysis technology at the Radiation Technology Institute of Northeast Normal University. A 14 MeV pulsed neutron generator, a bismuth germanate detector, and a 4096-channel analyzer are used in the system. The multiple linear regression method employed to process the data solves the problem of interference among multiple elements. The prototype (model MZ-MKFY) has been applied in the Changshan and Jilin power plants for about a year. The results of measuring the main parameters of coal, such as lower calorific value, total moisture, ash content, volatile matter, and sulfur content, with precision acceptable to the coal industry, are presented
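The multiple-linear-regression step that untangles interfering element signals can be sketched as ordinary least squares on gamma peak areas. The peak areas, coefficients, and the choice of ash content as the target are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 30

# Invented peak areas for four element signals measured by the analyzer.
peak_areas = rng.uniform(0.5, 2.0, size=(n, 4))
true_coef = np.array([3.0, -1.2, 0.8, 2.5])

# Synthetic ash content: linear in the peak areas plus small noise.
ash = 10.0 + peak_areas @ true_coef + rng.normal(0, 0.05, n)

# Ordinary least squares with an intercept column; the fitted coefficients
# jointly compensate for interference among the element signals.
X = np.column_stack([np.ones(n), peak_areas])
coef, *_ = np.linalg.lstsq(X, ash, rcond=None)
print(np.round(coef, 2))  # close to [10, 3, -1.2, 0.8, 2.5]
```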

  20. Real time speech formant analyzer and display

    Science.gov (United States)

    Holland, George E.; Struve, Walter S.; Homer, John F.

    1987-01-01

    A speech analyzer for interpretation of sound includes a sound input which converts the sound into a signal representing the sound. The signal is passed through a plurality of frequency-pass filters to derive a plurality of frequency formants. These formants are converted to voltage signals by frequency-to-voltage converters and then prepared for visual display in continuous real time. Parameters of the input sound are also derived and displayed. The display may then be interpreted by the user. The preferred embodiment includes a microprocessor which is interfaced with a television set for display of the sound formants. The microprocessor software enables the sound analyzer to present a variety of display modes for interpretive and therapeutic use by the user.
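A digital stand-in for the analyzer's bank of frequency-pass filters is to split the signal's spectrum into bands and report the energy in each. The test-tone frequencies and band edges below are illustrative assumptions, not values from the patent.

```python
import numpy as np

fs = 8000                                  # Hz, assumed sample rate
t = np.arange(0, 0.1, 1 / fs)
# Invented two-tone signal; the stronger tone sits in the first band.
signal = np.sin(2 * np.pi * 700 * t) + 0.5 * np.sin(2 * np.pi * 1200 * t)

# One-sided power spectrum and its bin frequencies.
spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(signal.size, 1 / fs)

# Rough formant-like bands (Hz); each sum mimics one filter's output.
bands = [(200, 900), (900, 2500), (2500, 3500)]
energies = [spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]
dominant = bands[int(np.argmax(energies))]
print(dominant)
```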

  1. Analyzing public health policy: three approaches.

    Science.gov (United States)

    Coveney, John

    2010-07-01

    Policy is an important feature of public and private organizations. Within the field of health as a policy arena, public health has emerged in which policy is vital to decision making and the deployment of resources. Public health practitioners and students need to be able to analyze public health policy, yet many feel daunted by the subject's complexity. This article discusses three approaches that simplify policy analysis: Bacchi's "What's the problem?" approach examines the way that policy represents problems. Colebatch's governmentality approach provides a way of analyzing the implementation of policy. Bridgman and Davis's policy cycle allows for an appraisal of public policy development. Each approach provides an analytical framework from which to rigorously study policy. Practitioners and students of public health gain much in engaging with the politicized nature of policy, and a simple approach to policy analysis can greatly assist one's understanding and involvement in policy work.

  2. Miniature multichannel analyzer for process monitoring

    International Nuclear Information System (INIS)

    Halbig, J.K.; Klosterbuer, S.F.; Russo, P.A.; Sprinkle, J.K. Jr.; Stephens, M.M.; Wiig, L.G.; Ianakiev, K.D.

    1993-01-01

    A new, 4,000-channel analyzer has been developed for gamma-ray spectroscopy applications. A design philosophy of hardware and software building blocks has been combined with design goals of simplicity, compactness, portability, and reliability. The result is a miniature, modular multichannel analyzer (MMMCA), which offers a solution to a variety of nondestructive assay (NDA) needs in many areas of general application, independent of computer platform or operating system. Detector-signal analog electronics, the bias supply, and batteries are included in the virtually pocket-sized, low-power MMMCA unit. The MMMCA features digital setup and control, automated data reduction, and automated quality assurance. Areas of current NDA application include on-line continuous (process) monitoring, process material holdup measurements, and field inspections

  3. Testing the Application for Analyzing Structured Entities

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2011-01-01

    Full Text Available The paper presents the testing process of the application for the analysis of structured text entities. The structured entities are presented. Quality characteristics of structured entities are identified and analyzed. The design and building processes are presented. Rules for building structured entities are described. The steps of building the application for the analysis of structured text entities are presented. The objective of the testing process is defined. Ways of testing the application, both by components and as a whole, are established. A testing strategy for different objectives is proposed. The behavior of users during the testing period is analyzed. Statistical analyses of user behavior in processes of infinite resource access are carried out.

  4. A new approach to analyzing vulnerability

    International Nuclear Information System (INIS)

    O'Callaghan, P.B.; Carlson, R.L.; Riedeman, G.W.

    1986-01-01

    The Westinghouse Hanford Company (WHC) has recently completed construction of the Fuel Cycle Plant (FCP) at Richland, Washington. At start-up, the facility will fabricate driver fuel for the Fast Flux Test Facility on the Secure Automated Fabrication line. After construction completion, but before facility certification, the Department of Energy (DOE) Richland Operations Office requested that a vulnerability analysis be performed which assumed multiple insiders as a threat to the security system. A unique method of analyzing facility vulnerabilities was developed at the Security Applications Center (SAC), which is managed by WHC for DOE. The method that was developed verifies a previous vulnerability assessment, as well as introducing a modeling technique which analyzes security alarms in relation to delaying factors and possible insider activities. With this information it is possible to assess the relative strength or weakness of various possible routes to and from a target within a facility.

  5. Real-time airborne particle analyzer

    Science.gov (United States)

    Reilly, Peter T.A.

    2012-10-16

    An aerosol particle analyzer includes a laser ablation chamber, a gas-filled conduit, and a mass spectrometer. The laser ablation chamber can be operated at a low pressure, which can be from 0.1 mTorr to 30 mTorr. The ablated ions are transferred into a gas-filled conduit. The gas-filled conduit reduces the electrical charge and the speed of ablated ions as they collide and mix with buffer gases in the gas-filled conduit. Preferably, the gas filled-conduit includes an electromagnetic multipole structure that collimates the nascent ions into a beam, which is guided into the mass spectrometer. Because the gas-filled conduit allows storage of vast quantities of the ions from the ablated particles, the ions from a single ablated particle can be analyzed multiple times and by a variety of techniques to supply statistically meaningful analysis of composition and isotope ratios.

  6. Development of a nuclear plant analyzer (NPA)

    International Nuclear Information System (INIS)

    De Vlaminck, M.; Mampaey, L.; Vanhoenacker, L.; Bastenaire, F.

    1990-01-01

    A Nuclear Plant Analyzer has been developed by TRACTABEL. Three distinct functional units make up the Nuclear Plant Analyzer: a model builder, a run-time unit, and an analysis unit. The model builder is intended to build simulation models which describe, on the one hand, the geometric structure and initial conditions of a given plant and, on the other hand, command-and-control logic and reactor protection systems. The run-time unit carries out the dialog between the user and the thermal-hydraulic code. The analysis unit is aimed at in-depth analysis of the transient results. The model builder is being tested in the framework of International Standard Problem ISP-26, which is the simulation of a LOCA on the Japanese ROSA facility

  7. Computer-based radionuclide analyzer system

    International Nuclear Information System (INIS)

    Ohba, Kengo; Ishizuka, Akira; Kobayashi, Akira; Ohhashi, Hideaki; Tsuruoka, Kimitoshi.

    1978-01-01

    The radionuclide analysis in nuclear power plants, practiced for the purpose of monitoring the quality of the primary loop water, confirming the performance of the reactor cleanup system, and monitoring radioactive waste effluent, is an important job. Important as it is, it requires considerable expert labor, because the samples to be analyzed are multifarious and very large in number, and the job depends heavily on manual work. With a view to saving labor, simplifying and standardizing the work, reducing radiation exposure, and automating the analysis, a computerized analyzer system has been worked out. The results of its performance test at an operating power plant have proved that the development has fairly accomplished its objectives and that the system is quite useful. The developmental work was carried out through cooperation between The Tokyo Electric Power Co. and Toshiba over about 4 years, from 1974 to this year. (auth.)

  8. Neutral Particle Analyzer Diagnostic on NSTX

    International Nuclear Information System (INIS)

    Medley, S.S.; Roquemore, A.L.

    2004-01-01

    The Neutral Particle Analyzer (NPA) diagnostic on the National Spherical Torus Experiment (NSTX) utilizes a PPPL-designed E||B spectrometer that measures the energy spectra of minority hydrogen and bulk deuterium species simultaneously with 39 energy channels per mass species and a time resolution of 1 ms. The calibrated energy range is E = 0.5-150 keV and the energy resolution varies from ΔE/E = 3-7% over the surface of the microchannel plate detector

  9. Analyzing Gender Stereotyping in Bollywood Movies

    OpenAIRE

    Madaan, Nishtha; Mehta, Sameep; Agrawaal, Taneea S; Malhotra, Vrinda; Aggarwal, Aditi; Saxena, Mayank

    2017-01-01

    The presence of gender stereotypes in many aspects of society is a well-known phenomenon. In this paper, we focus on studying such stereotypes and bias in Hindi movie industry (Bollywood). We analyze movie plots and posters for all movies released since 1970. The gender bias is detected by semantic modeling of plots at inter-sentence and intra-sentence level. Different features like occupation, introduction of cast in text, associated actions and descriptions are captured to show the pervasiv...

  10. Neutral Particle Analyzer Diagnostic on NSTX

    Energy Technology Data Exchange (ETDEWEB)

    S.S. Medley; A.L. Roquemore

    2004-03-16

    The Neutral Particle Analyzer (NPA) diagnostic on the National Spherical Torus Experiment (NSTX) utilizes a PPPL-designed E||B spectrometer that measures the energy spectra of minority hydrogen and bulk deuterium species simultaneously with 39 energy channels per mass species and a time resolution of 1 ms. The calibrated energy range is E = 0.5-150 keV and the energy resolution varies from ΔE/E = 3-7% over the surface of the microchannel plate detector.

  11. A seal analyzer for testing container integrity

    International Nuclear Information System (INIS)

    McDaniel, P.; Jenkins, C.

    1988-01-01

    This paper reports on the development of a laboratory and production seal analyzer that offers a rapid, nondestructive method of assuring the seal integrity of virtually any type of single- or double-sealed container. The system can test a broad range of metal cans, drums and trays, membrane-lidded vessels, flexible pouches, aerosol containers, and glass or metal containers with twist-top lids that are used in the chemical/pesticide (hazardous materials/waste), beverage, food, medical and pharmaceutical industries

  12. Information decomposition method to analyze symbolical sequences

    International Nuclear Information System (INIS)

    Korotkov, E.V.; Korotkova, M.A.; Kudryashov, N.A.

    2003-01-01

    The information decomposition (ID) method to analyze symbolical sequences is presented. This method allows us to reveal a latent periodicity of any symbolical sequence. The ID method is shown to have advantages in comparison with application of the Fourier transformation, the wavelet transform and the dynamic programming method to look for latent periodicity. Examples of the latent periods for poetic texts, DNA sequences and amino acids are presented. Possible origin of a latent periodicity for different symbolical sequences is discussed
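The latent-periodicity idea can be illustrated with a much simpler score than the authors' information decomposition method: for each candidate period p, count how often symbols repeat at lag p. The sequence and the scoring rule below are assumptions for illustration only.

```python
def period_score(seq, p):
    # Fraction of positions whose symbol recurs p places later; a latent
    # period shows up as a high score at that lag.
    matches = sum(1 for i in range(len(seq) - p) if seq[i] == seq[i + p])
    return matches / (len(seq) - p)

# Invented DNA-like sequence with an obvious period of 3.
seq = "ATGATGATGATGATGATG"
scores = {p: period_score(seq, p) for p in range(1, 7)}
best = max(scores, key=scores.get)
print(best)  # 3
```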

  13. Analyzing the Existing Undergraduate Engineering Leadership Skills

    OpenAIRE

    Hamed M. Almalki; Luis Rabelo; Charles Davis; Hammad Usmani; Debra Hollister; Alfonso Sarmiento

    2016-01-01

    Purpose: Studying and analyzing undergraduate engineering students' leadership skills to discover their potential leadership strengths and weaknesses. This study will unveil potential ways to enhance how we teach engineering leadership. The research offers insights that might assist engineering programs in improving curricula for the purpose of better engineering preparation to meet industry demands. Methodology and Findings: 441 undergraduate engineering students have been s...

  14. General methods for analyzing bounded proportion data

    OpenAIRE

    Hossain, Abu

    2017-01-01

    This thesis introduces two general classes of models for analyzing a proportion response variable when the response variable Y can take values between zero and one, inclusive of zero and/or one. The models are the inflated GAMLSS model and the generalized Tobit GAMLSS model. The inflated GAMLSS model extends the flexibility of beta inflated models by allowing the distribution on (0,1) of the continuous component of the dependent variable to come from any explicit or transformed (i.e. logit or truncated...

  15. The analyzing of Dove marketing strategy

    Institute of Scientific and Technical Information of China (English)

    Guo; Yaohui

    2015-01-01

    1. Introduction. In this report, I try to analyze the related information about DOVE chocolate. Firstly, I would like to introduce this product. Dove chocolate is one of a series of products launched by the world's largest pet food and snack food manufacturer, the U.S. multinational food company Mars (Mars). Having entered China in 1989, it has become China's leading brand of chocolate in

  16. Analyzing negative ties in social networks

    Directory of Open Access Journals (Sweden)

    Mankirat Kaur

    2016-03-01

    Full Text Available Online social networks are a source of sharing information and maintaining personal contacts with other people through social interactions, thus forming virtual communities online. Social networks are crowded with positive and negative relations. Positive relations are formed by support, endorsement and friendship, creating a network of well-connected users, whereas negative relations result from opposition, distrust and avoidance, creating disconnected networks. Due to the increase in illegal activities such as masquerading, conspiring and creating fake profiles on online social networks, exploring and analyzing these negative activities has become the need of the hour. Negative ties are usually treated in the same way as positive ties in many theories, such as balance theory and blockmodeling analysis, but the standard concepts of social network analysis do not yield the same results for each type of tie. This paper presents a survey on analyzing negative ties in social networks through the various types of network analysis techniques used for examining ties, such as status, centrality and power measures. Because the characteristics of flow differ between positive-tie and negative-tie networks, some of these measures are not applicable to negative ties. The paper also discusses new methods developed specifically for analyzing negative ties, such as negative degree and the h∗ measure, along with measures based on a mixture of positive and negative ties. The different types of social network analysis approaches are reviewed and compared to determine the best approach for appropriately identifying negative ties in online networks. The analysis shows that only a few measures, such as degree and PN centrality, are applicable for identifying outsiders in a network. For applicability in online networks, the performance of the PN measure needs to be verified and, further, new measures should be developed based upon the negative clique concept.
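The negative-degree measure the survey discusses can be computed directly from a signed edge list; the toy network below, where node d opposes everyone, is invented for illustration.

```python
from collections import Counter

# Invented signed network: +1 = positive tie, -1 = negative tie.
edges = [
    ("a", "b", +1), ("a", "c", +1), ("b", "c", +1),
    ("d", "a", -1), ("d", "b", -1), ("d", "c", -1),
]

pos, neg = Counter(), Counter()
for u, v, sign in edges:
    for node in (u, v):
        (pos if sign > 0 else neg)[node] += 1

# Net degree (positive minus negative ties); the most negative node is a
# candidate "outsider" in the sense used by the survey.
net = {n: pos[n] - neg[n] for n in set(pos) | set(neg)}
outsider = min(net, key=net.get)
print(outsider)  # d
```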

  17. Testing the Application for Analyzing Structured Entities

    OpenAIRE

    Ion IVAN; Bogdan VINTILA

    2011-01-01

    The paper presents the testing process of the application for the analysis of structured text entities. The structured entities are presented. Quality characteristics of structured entities are identified and analyzed. The design and building processes are presented. Rules for building structured entities are described. The steps of building the application for the analysis of structured text entities are presented. The objective of the testing process is defined. Ways of testing the applicat...

  18. Evaluation of the Air Void Analyzer

    Science.gov (United States)

    2013-07-01

    [Indexed excerpt consists of reference fragments only:] concrete using image analysis: Petrography of cementitious materials, ASTM STP 1215, S.M. DeHayes and D. Stark, eds., Philadelphia, PA ... Federal Highway Administration (FHWA), 2006, Priority, market-ready technologies and innovations: Air Void Analyzer, Washington, D.C. ... Germann Instruments (GI), 2011 ... Significance of tests and properties of concrete and concrete-making materials, STP 169D, West Conshohocken, PA: ASTM International ... Magura, D.D., 1996, Air void

  19. Semantic analyzability in children's understanding of idioms.

    Science.gov (United States)

    Gibbs, R W

    1991-06-01

    This study investigated the role of semantic analyzability in children's understanding of idioms. Kindergartners and first, third, and fourth graders listened to idiomatic expressions either alone or at the end of short story contexts. Their task was to explain verbally the intended meanings of these phrases and then to choose their correct idiomatic interpretations. The idioms presented to the children differed in their degree of analyzability. Some idioms were highly analyzable or decomposable, with the meanings of their parts contributing independently to their overall figurative meanings. Other idioms were nondecomposable because it was difficult to see any relation between a phrase's individual components and the idiom's figurative meaning. The results showed that younger children (kindergartners and first graders) understood decomposable idioms better than they did nondecomposable phrases. Older children (third and fourth graders) understood both kinds of idioms equally well in supporting contexts, but were better at interpreting decomposable idioms than they were at understanding nondecomposable idioms without contextual information. These findings demonstrate that young children better understand idiomatic phrases whose individual parts independently contribute to their overall figurative meanings.

  20. Handheld Fluorescence Microscopy based Flow Analyzer.

    Science.gov (United States)

    Saxena, Manish; Jayakumar, Nitin; Gorthi, Sai Siva

    2016-03-01

    Fluorescence microscopy has the intrinsic advantages of favourable contrast characteristics and a high degree of specificity. Consequently, it has been a mainstay in modern biological inquiry and clinical diagnostics. Despite its reliable nature, fluorescence-based clinical microscopy and diagnostics is a manual, labour-intensive and time-consuming procedure. The article outlines a cost-effective, high-throughput alternative to conventional fluorescence imaging techniques. With system-level integration of custom-designed microfluidics and optics, we demonstrate a fluorescence microscopy based imaging flow analyzer. Using this system we have imaged more than 2900 FITC-labeled fluorescent beads per minute, demonstrating its high-throughput characteristics in comparison to conventional fluorescence microscopy. The issue of motion blur at high flow rates limits the achievable throughput in image-based flow analyzers. Here we address the issue by computationally deblurring the images and show that this restores the morphological features otherwise affected by motion blur. By further optimizing the concentration of the sample solution and the flow speeds, along with imaging multiple channels simultaneously, the system is capable of providing a throughput of about 480 beads per second.
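The motion-blur constraint described above reduces to simple arithmetic: the blur length is the distance a bead travels during one exposure, expressed in pixels. A minimal sketch (the flow speed, exposure time and pixel size below are illustrative numbers, not values from the article):

```python
def motion_blur_px(flow_speed_um_s, exposure_s, um_per_px):
    """Blur length in pixels: distance travelled during one exposure,
    divided by the effective pixel size at the sample plane."""
    return flow_speed_um_s * exposure_s / um_per_px

# A bead moving at 10 mm/s imaged with a 1 ms exposure and a
# 0.5 um effective pixel size smears across 20 pixels.
blur = motion_blur_px(10_000, 1e-3, 0.5)
```

When the smear spans many pixels, deconvolving with the (known, linear) motion kernel can restore the bead morphology, consistent with the computational deblurring the article reports.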

  1. Method of stabilizing single channel analyzers

    International Nuclear Information System (INIS)

    Fasching, G.E.; Patton, G.H.

    1975-01-01

    A method and apparatus for reducing the drift of single channel analyzers are described. Essentially, this invention employs a time-sharing (multiplexing) technique to ensure that the outputs from two single channel analyzers (SCAs) maintain the same count ratio regardless of variations in the threshold voltage source or other voltage changes. The multiplexing is accomplished when a flip-flop, actuated by a clock, changes state to switch the outputs from the individual SCAs before these outputs are sent to a ratio counting scaler. In the particular embodiment disclosed to illustrate this invention, the sulfur content of coal is determined by subjecting the coal to radiation from a neutron-producing source. A photomultiplier and detector system converts the transmitted gamma radiation to an analog voltage signal and sends that signal, after amplification, to an SCA system that contains the invention. Therein, at least two single channel analyzers scan the analog signal over different parts of a spectral region. The two outputs may then be sent to a digital multiplexer so that the multiplexer output contains counts falling within two distinct segments of the region. By dividing the counts from the multiplexer by each other, the percentage of sulfur within the coal sample under observation may be determined. (U.S.)
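The time-shared ratio scheme can be sketched in a few lines: a clock alternately routes the two SCA outputs onto one shared counting chain, and the sulfur estimate depends only on the ratio of counts in the two windows, so drift common to both channels cancels. A toy model (hypothetical count values; the real instrument operates on hardware pulse streams):

```python
from itertools import cycle

def multiplex(sca_a, sca_b):
    """Flip-flop time sharing: on alternate clock ticks, pass the count
    from SCA A or SCA B onto the single shared output stream."""
    out = []
    for take_a, a, b in zip(cycle([True, False]), sca_a, sca_b):
        out.append(a if take_a else b)
    return out

def window_ratio(stream):
    """De-interleave the shared stream and form the count ratio from
    which a sulfur percentage would be derived via a calibration curve."""
    counts_a = sum(stream[0::2])
    counts_b = sum(stream[1::2])
    return counts_a / counts_b
```

Because both windows share the same threshold source and counter, a slow gain or threshold drift scales both counts together and drops out of the ratio.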

  2. In temperature forming of friction stir lap welds in aluminium alloys

    Science.gov (United States)

    Bruni, Carlo; Cabibbo, Marcello; Greco, Luciano; Pieralisi, Massimiliano

    2018-05-01

    The objective of this investigation is an in-depth study of the forming phase of welds produced on three aluminium-alloy sheet metal blanks by friction stir lap welding. The forming phase was performed by upsetting at constant forming temperatures ranging from 200°C to 350°C, with constant ram velocities of 0.01 and 0.1 mm/s. The temperatures were obtained using heating strips applied to the upper and lower tools. An increase in the friction factor acting at the upsetting tool-workpiece interface was observed with increasing temperature; this is very useful for producing the localized deformation required to improve the weld. It was also confirmed that the forming phase achieves the required thickness in the weld area, making it possible to neglect the surface perturbation produced by the shoulder of the friction stir welding tool. The obtained thickness may be subject to springback when the forming temperature is too low.

  3. Temperature metrology

    Science.gov (United States)

    Fischer, J.; Fellmuth, B.

    2005-05-01

    The majority of the processes used by the manufacturing industry depend upon the accurate measurement and control of temperature. Thermal metrology is also a key factor affecting the efficiency and environmental impact of many high-energy industrial processes, the development of innovative products and the health and safety of the general population. Applications range from the processing, storage and shipment of perishable foodstuffs and biological materials to the development of more efficient and less environmentally polluting combustion processes for steel-making. Accurate measurement and control of temperature is, for instance, also important in areas such as the characterization of new materials used in the automotive, aerospace and semiconductor industries. This paper reviews the current status of temperature metrology. It starts with the determination of thermodynamic temperatures required on principle because temperature is an intensive quantity. Methods to determine thermodynamic temperatures are reviewed in detail to introduce the underlying physical basis. As these methods cannot usually be applied for practical measurements the need for a practical temperature scale for day-to-day work is motivated. The International Temperature Scale of 1990 and the Provisional Low Temperature Scale PLTS-2000 are described as important parts of the International System of Units to support science and technology. Its main importance becomes obvious in connection with industrial development and international markets. Every country is strongly interested in unique measures, in order to guarantee quality, reproducibility and functionability of products. The eventual realization of an international system, however, is only possible within the well-functioning organization of metrological laboratories. In developed countries the government established scientific institutes have certain metrological duties, as, for instance, the maintenance and dissemination of national

  4. Temperature metrology

    International Nuclear Information System (INIS)

    Fischer, J; Fellmuth, B

    2005-01-01

    The majority of the processes used by the manufacturing industry depend upon the accurate measurement and control of temperature. Thermal metrology is also a key factor affecting the efficiency and environmental impact of many high-energy industrial processes, the development of innovative products and the health and safety of the general population. Applications range from the processing, storage and shipment of perishable foodstuffs and biological materials to the development of more efficient and less environmentally polluting combustion processes for steel-making. Accurate measurement and control of temperature is, for instance, also important in areas such as the characterization of new materials used in the automotive, aerospace and semiconductor industries. This paper reviews the current status of temperature metrology. It starts with the determination of thermodynamic temperatures required on principle because temperature is an intensive quantity. Methods to determine thermodynamic temperatures are reviewed in detail to introduce the underlying physical basis. As these methods cannot usually be applied for practical measurements the need for a practical temperature scale for day-to-day work is motivated. The International Temperature Scale of 1990 and the Provisional Low Temperature Scale PLTS-2000 are described as important parts of the International System of Units to support science and technology. Its main importance becomes obvious in connection with industrial development and international markets. Every country is strongly interested in unique measures, in order to guarantee quality, reproducibility and functionability of products. The eventual realization of an international system, however, is only possible within the well-functioning organization of metrological laboratories. In developed countries the government established scientific institutes have certain metrological duties, as, for instance, the maintenance and dissemination of national

  5. CONTINUOUS ANALYZER UTILIZING BOILING POINT DETERMINATION

    Science.gov (United States)

    Pappas, W.S.

    1963-03-19

    A device is designed for continuously determining the boiling point of a mixture of liquids. The device comprises a distillation chamber for boiling a liquid; outlet conduit means for maintaining the liquid contents of said chamber at a constant level; a reflux condenser mounted above said distillation chamber; means for continuously introducing an incoming liquid sample into said reflux condenser and into intimate contact with vapors refluxing within said condenser; and means for measuring the temperature of the liquid flowing through said distillation chamber. (AEC)

  6. Development of a test facility for analyzing supercritical fluid blowdown

    International Nuclear Information System (INIS)

    Roberto, Thiago D.; Alvim, Antonio C.M.

    2015-01-01

    The Generation IV nuclear reactors under development mostly use supercritical fluids as the working fluid because higher temperatures improve the thermal efficiency. Supercritical fluids are used by modern nuclear power plants to achieve thermal efficiencies of around 45%. With water as the supercritical working fluid, these plants operate at high temperature and pressure. However, experiments on supercritical water are limited by technical and financial difficulties. These difficulties can be overcome by using model fluids, which have more feasible supercritical conditions, exhibiting a lower critical pressure and temperature. Experimental research is normally used to determine the conditions under which model fluids represent supercritical fluids under steady-state conditions. A fluid-to-fluid scaling approach has been proposed to determine model fluids that can represent supercritical fluids in a transient state. This paper presents an application of fractional scale analysis to determine the simulation parameters for a depressurization test facility. Carbon dioxide (CO2) and R134a were considered as the model fluids because their critical point conditions are more feasible than those of water. The similarities of water (prototype), CO2 (model) and R134a (model) for depressurization in a pressure vessel were analyzed. (author)

  7. Mass spectrometer calibration of Cosmic Dust Analyzer

    Science.gov (United States)

    Ahrens, Thomas J.; Gupta, Satish C.; Jyoti, G.; Beauchamp, J. L.

    2003-02-01

    The time-of-flight (TOF) mass spectrometer (MS) of the Cosmic Dust Analyzer (CDA) instrument aboard the Cassini spacecraft is expected to be placed in orbit about Saturn to sample submicrometer-diameter ring particles and impact ejecta from Saturn's satellites. The CDA measures a mass spectrum of each particle that impacts the chemical analyzer sector of the instrument. Particles impact a Rh target plate at velocities of 1-100 km/s and produce some 10^-8 to 10^-5 times the particle mass of positive-valence, singly charged ions. These are analyzed via a TOF MS. Initial tests employed a pulsed N2 laser acting on samples of kamacite, pyrrhotite, serpentine, olivine, and Murchison meteorite; the induced bursts of ions were detected with a microchannel plate and a charge-sensitive amplifier (CSA). Pulses from the N2 laser (10^11 W/cm^2) are assumed to simulate particle impact. Using aluminum alloy as a test sample, each pulse produces a charge of ~4.6 pC (mostly Al+1), whereas irradiation of a stainless steel target produces a ~2.8 pC (Fe+1) charge. Thus the present system yields ~10^-5% of the laser energy in resulting ions. A CSA signal indicates that, at the position of the microchannel plate, the ion detector geometry is such that some 5% of the laser-induced ions are collected in the CDA geometry. Employing a multichannel plate detector in this MS yields, for Al-Mg-Cu alloy and kamacite targets, well-defined peaks at 24 (Mg+1), 27 (Al+1), and 64 (Cu+1) and at 56 (Fe+1), 58 (Ni+1), and 60 (Ni+1) dalton, respectively.
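The TOF separation behind the listed peaks follows from energy conservation: an ion of mass m and charge q accelerated through a potential V then drifting over a field-free path L arrives after t = L*sqrt(m/(2qV)), so arrival time scales with sqrt(m/q). A minimal sketch (the voltage and drift length below are made-up round numbers, not CDA instrument values):

```python
import math

AMU = 1.660539e-27  # kg per atomic mass unit
QE = 1.602177e-19   # C per elementary charge

def flight_time(m_amu, q_e, accel_volts, drift_m):
    """Ideal field-free drift time after electrostatic acceleration:
    t = L * sqrt(m / (2 q V))."""
    m = m_amu * AMU
    q = q_e * QE
    return drift_m * math.sqrt(m / (2.0 * q * accel_volts))

# Singly charged Mg (24 u), Al (27 u) and Cu (64 u) arrive in that order,
# which is why the alloy target yields three cleanly separated peaks.
t_mg = flight_time(24, 1, 1000.0, 0.23)
t_al = flight_time(27, 1, 1000.0, 0.23)
t_cu = flight_time(64, 1, 1000.0, 0.23)
```

Quadrupling m/q doubles the flight time, so peak spacing compresses toward higher masses.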

  8. Remote Laser Diffraction Particle Size Distribution Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2001-03-01

    In support of a radioactive slurry sampling and physical characterization task, an “off-the-shelf” laser diffraction (classical light scattering) particle size analyzer was utilized for remote particle size distribution (PSD) analysis. Spent nuclear fuel was previously reprocessed at the Idaho Nuclear Technology and Engineering Center (INTEC—formerly known as the Idaho Chemical Processing Plant), which is on DOE’s INEEL site. The acidic, radioactive aqueous raffinate streams from these processes were transferred to 300,000-gallon stainless steel storage vessels located in the INTEC Tank Farm area. Due to the transfer piping configuration in these vessels, complete removal of the liquid cannot be achieved. Consequently, a “heel” slurry remains at the bottom of an “emptied” vessel. Particle size distribution characterization of the settled solids in this remaining heel slurry, as well as of suspended solids in the tank liquid, is the goal of this remote PSD analyzer task. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a “hot cell” (gamma radiation) environment. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not previously achievable—making this technology far superior to the traditional methods used. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  9. IRISpy: Analyzing IRIS Data in Python

    Science.gov (United States)

    Ryan, Daniel; Christe, Steven; Mumford, Stuart; Baruah, Ankit; Timothy, Shelbe; Pereira, Tiago; De Pontieu, Bart

    2017-08-01

    IRISpy is a new community-developed open-source software library for analyzing IRIS level 2 data. It is written in Python, a free, cross-platform, general-purpose, high-level programming language. A wide array of scientific computing software packages have already been developed in Python, from numerical computation (NumPy, SciPy, etc.), to visualization and plotting (matplotlib), to solar-physics-specific data analysis (SunPy). IRISpy is currently under development as a SunPy-affiliated package, which means it depends on the SunPy library, follows similar standards and conventions, and is developed with the support of the SunPy development team. IRISpy has two primary data objects, one for analyzing slit-jaw imager data and another for analyzing spectrograph data. Both objects contain basic slicing, indexing, plotting, and animating functionality to allow users to easily inspect, reduce and analyze the data. As part of this functionality the objects can output SunPy Maps, TimeSeries, Spectra, etc. of relevant data slices for easier inspection and analysis. Work is also ongoing to provide additional data analysis functionality, including derivation of systematic measurement errors (e.g. readout noise), exposure time correction, residual wavelength calibration, radiometric calibration, and fine-scale pointing corrections. IRISpy's code base is publicly available through github.com and can be contributed to by anyone. In this poster we demonstrate IRISpy's functionality and future goals of the project. We also encourage interested users to become involved in further developing IRISpy.

  10. Grid and Data Analyzing and Security

    Directory of Open Access Journals (Sweden)

    Fatemeh SHOKRI

    2012-12-01

    This paper examines the importance of secure structures in the process of analyzing and distributing information with the aid of Grid-based technologies. The advent of distributed networks has provided many practical opportunities for detecting and recording the times of events, and has driven efforts to identify events and solve problems of storing information, such as keeping it up-to-date and documented. In this regard, data distribution systems in a network environment must be accurate, and a continuous stream of updated data must be at hand. Here, the Grid is the best answer for using organizations' data and resources through shared processing.

  11. The kpx, a program analyzer for parallelization

    International Nuclear Information System (INIS)

    Matsuyama, Yuji; Orii, Shigeo; Ota, Toshiro; Kume, Etsuo; Aikawa, Hiroshi.

    1997-03-01

    The kpx is a program analyzer, developed as a common technological basis for promoting parallel processing. The kpx consists of three tools. The first is ktool, which shows how much execution time is spent in program segments. The second is ptool, which shows parallelization overhead on the Paragon system. The last is xtool, which shows parallelization overhead on the VPP system. The kpx, designed to work for any FORTRAN code on any UNIX computer, was confirmed to work well after testing on Paragon, SP2, SR2201, VPP500, VPP300, Monte-4, SX-4 and T90. (author)

  12. A low power Multi-Channel Analyzer

    International Nuclear Information System (INIS)

    Anderson, G.A.; Brackenbush, L.W.

    1993-06-01

    The instrumentation used in nuclear spectroscopy is generally large, is not portable, and requires a lot of power. Key components of these counting systems are the computer and the Multi-Channel Analyzer (MCA). To assist in performing measurements requiring portable systems, a small, very low power MCA has been developed at Pacific Northwest Laboratory (PNL). This MCA is interfaced with a Hewlett-Packard palmtop computer for portable applications. The MCA can also be connected to an IBM/PC for data storage and analysis. In addition, a real-time display mode allows the user to view the spectra as they are collected.

  13. Light-weight analyzer for odor recognition

    Energy Technology Data Exchange (ETDEWEB)

    Vass, Arpad A; Wise, Marcus B

    2014-05-20

    The invention provides a lightweight analyzer, i.e., a detector, capable of locating clandestine graves. The detector utilizes the very specific and unique chemicals identified in the database of human decompositional odor. This detector, based on specific chemical compounds found relevant to human decomposition, is the next step forward in clandestine grave detection and will take the guesswork out of current methods using canines and ground-penetrating radar, which have historically been unreliable. The detector is self-contained, portable and built for field use. Both visual and auditory cues are provided to the operator.

  14. Analyzing water/wastewater infrastructure interdependencies

    International Nuclear Information System (INIS)

    Gillette, J. L.; Fisher, R. E.; Peerenboom, J. P.; Whitfield, R. G.

    2002-01-01

    This paper describes four general categories of infrastructure interdependencies (physical, cyber, geographic, and logical) as they apply to the water/wastewater infrastructure, and provides an overview of one of the analytic approaches and tools used by Argonne National Laboratory to evaluate interdependencies. Also discussed are the dimensions of infrastructure interdependency that create spatial, temporal, and system representation complexities that make analyzing the water/wastewater infrastructure particularly challenging. An analytical model developed to incorporate the impacts of interdependencies on infrastructure repair times is briefly addressed

  15. Analyzing Argumentation In Rich, Natural Contexts

    Directory of Open Access Journals (Sweden)

    Anita Reznitskaya

    2008-02-01

    The paper presents the theoretical and methodological aspects of research on the development of argumentation in elementary school children. It presents a theoretical framework detailing psychological mechanisms responsible for the acquisition and transfer of argumentative discourse and demonstrates several applications of the framework, described in sufficient detail to guide future empirical investigations of oral, written, individual, or group argumentation performance. Software programs capable of facilitating data analysis are identified and their uses illustrated. The analytic schemes can be used to analyze large amounts of verbal data with reasonable precision and efficiency. The conclusion addresses more generally the challenges for and possibilities of empirical study of the development of argumentation.

  16. Nonlinear single-spin spectrum analyzer.

    Science.gov (United States)

    Kotler, Shlomi; Akerman, Nitzan; Glickman, Yinnon; Ozeri, Roee

    2013-03-15

    Qubits have been used as linear spectrum analyzers of their environments. Here we solve the problem of nonlinear spectral analysis, required for discrete noise induced by a strongly coupled environment. Our nonperturbative analytical model shows a nonlinear signal dependence on noise power, resulting in a spectral resolution beyond the Fourier limit as well as frequency mixing. We develop a noise characterization scheme adapted to this nonlinearity. We then apply it using a single trapped ion as a sensitive probe of strong, non-Gaussian, discrete magnetic field noise. Finally, we experimentally compared the performance of equidistant vs Uhrig modulation schemes for spectral analysis.

  17. ASDA - Advanced Suit Design Analyzer computer program

    Science.gov (United States)

    Bue, Grant C.; Conger, Bruce C.; Iovine, John V.; Chang, Chi-Min

    1992-01-01

    An ASDA model developed to evaluate the heat and mass transfer characteristics of advanced pressurized suit design concepts for low pressure or vacuum planetary applications is presented. The model is based on a generalized 3-layer suit that uses the Systems Integrated Numerical Differencing Analyzer '85 in conjunction with a 41-node FORTRAN routine. The latter simulates the transient heat transfer and respiratory processes of a human body in a suited environment. The user options for the suit encompass a liquid cooled garment, a removable jacket, a CO2/H2O permeable layer, and a phase change layer.

  18. Development of a Portable Water Quality Analyzer

    Directory of Open Access Journals (Sweden)

    Germán COMINA

    2010-08-01

    A portable water analyzer based on a voltammetric electronic tongue has been developed. The system uses an electrochemical cell with two working electrodes as sensors, a computer-controlled potentiostat, and software based on multivariate data analysis for pattern recognition. The system is able to differentiate laboratory-made and real in-situ river water samples contaminated with different amounts of Escherichia coli. This bacterium is not only one of the main indicators of water quality, but also a major public health concern, especially affecting people living in high-burden, resource-limited settings.

  19. CRISP90 - SOFTWARE DESIGN ANALYZER SYSTEM

    Science.gov (United States)

    Tausworthe, R. C.

    1994-01-01

    The CRISP90 Software Design Analyzer System, an update of CRISP-80, is a set of programs forming a software design and documentation tool which supports top-down, hierarchic, modular, structured design and programming methodologies. The quality of a computer program can often be significantly influenced by the design medium in which the program is developed. The medium must foster the expression of the programmer's ideas easily and quickly, and it must permit flexible and facile alterations, additions, and deletions to these ideas as the design evolves. The CRISP90 software design analyzer system was developed to provide the PDL (Programmer Design Language) programmer with such a design medium. A program design using CRISP90 consists of short, English-like textual descriptions of data, interfaces, and procedures that are embedded in a simple, structured, modular syntax. The display is formatted into two-dimensional, flowchart-like segments for a graphic presentation of the design. Together with a good interactive full-screen editor or word processor, the CRISP90 design analyzer becomes a powerful tool for the programmer. In addition to being a text formatter, the CRISP90 system prepares material that would be tedious and error-prone to extract manually, such as a table of contents, module directory, structure (tier) chart, cross-references, and a statistics report on the characteristics of the design. Referenced modules are marked by schematic logic symbols to show conditional, iterative, and/or concurrent invocation in the program. A keyword usage profile can be generated automatically and glossary definitions inserted into the output documentation. Another feature is the capability to detect changes that were made between versions. Thus, "change-bars" can be placed in the output document along with a list of changed pages and a version history report. Also, items may be marked as "to be determined" and each will appear on a special table until the item is

  20. Analyzing endocrine system conservation and evolution.

    Science.gov (United States)

    Bonett, Ronald M

    2016-08-01

    Analyzing variation in rates of evolution can provide important insights into the factors that constrain trait evolution, as well as those that promote diversification. Metazoan endocrine systems exhibit apparent variation in evolutionary rates of their constituent components at multiple levels, yet relatively few studies have quantified these patterns and analyzed them in a phylogenetic context. This may be in part due to historical and current data limitations for many endocrine components and taxonomic groups. However, recent technological advancements such as high-throughput sequencing provide the opportunity to collect large-scale comparative data sets for even non-model species. Such ventures will produce a fertile data landscape for evolutionary analyses of nucleic acid and amino acid based endocrine components. Here I summarize evolutionary rate analyses that can be applied to categorical and continuous endocrine traits, and also those for nucleic acid and protein-based components. I emphasize analyses that could be used to test whether other variables (e.g., ecology, ontogenetic timing of expression, etc.) are related to patterns of rate variation and endocrine component diversification. The application of phylogenetic-based rate analyses to comparative endocrine data will greatly enhance our understanding of the factors that have shaped endocrine system evolution. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. Improving respiration measurements with gas exchange analyzers.

    Science.gov (United States)

    Montero, R; Ribas-Carbó, M; Del Saz, N F; El Aou-Ouad, H; Berry, J A; Flexas, J; Bota, J

    2016-12-01

    Dark respiration measurements with open-flow gas exchange analyzers are often questioned for their low accuracy, as their low values often approach the precision limit of the instrument. Respiration was measured in five species: two hypostomatous (Vitis vinifera L. and Acanthus mollis) and three amphistomatous, one with a similar number of stomata on both sides (Eucalyptus citriodora) and two with different stomatal densities (Brassica oleracea and Vicia faba). The CO2 differential (ΔCO2) increased two-fold, with no change in apparent Rd, when the two leaves with higher stomatal density faced outside. These results showed a clear effect of stomatal position on ΔCO2. Therefore, it can be concluded that leaf position is important for improving respiration measurements, increasing ΔCO2 without affecting the respiration results per leaf or mass unit. This method will help to increase the accuracy of leaf respiration measurements using gas exchange analyzers. Copyright © 2016 Elsevier GmbH. All rights reserved.
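In an open-flow system, the apparent respiration rate is simply the CO2 the leaf adds to the air stream, scaled by the molar flow and normalized to leaf area; raising ΔCO2 at a fixed true flux therefore improves signal-to-noise without changing Rd. A simplified sketch (ignores water-vapour dilution corrections; the numbers are illustrative, not from the study):

```python
def dark_respiration(flow_mol_s, delta_co2_umol_mol, leaf_area_m2):
    """Apparent Rd (umol CO2 m^-2 s^-1) for an open-flow gas exchange
    system: molar air flow times the CO2 differential, per unit leaf area."""
    return flow_mol_s * delta_co2_umol_mol / leaf_area_m2

# 300 umol s^-1 of air, a 2 umol mol^-1 differential and 6 cm^2 of leaf
# give an apparent Rd of 1.0 umol m^-2 s^-1.
rd = dark_respiration(300e-6, 2.0, 6e-4)
```

Doubling ΔCO2 at half the flow leaves the computed Rd unchanged, which is why enlarging the differential is a precision gain rather than a bias.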

  2. Solar Probe ANalyzer for Ions - Laboratory Performance

    Science.gov (United States)

    Livi, R.; Larson, D. E.; Kasper, J. C.; Korreck, K. E.; Whittlesey, P. L.

    2017-12-01

    The Parker Solar Probe (PSP) mission is a heliospheric satellite that will orbit the Sun closer than any prior mission to date, with perihelia shrinking from 35 solar radii (RS) down to about 10 RS over the mission. PSP includes the Solar Wind Electrons Alphas and Protons (SWEAP) instrument suite, which in turn consists of four instruments: the Solar Probe Cup (SPC) and three Solar Probe ANalyzers (SPAN) for ions and electrons. Together, this suite will take local measurements of particles and electromagnetic fields within the Sun's corona. SPAN-Ai has completed flight calibration and spacecraft integration and is set to be launched in July of 2018. The main mode of operation consists of an electrostatic analyzer (ESA) at its aperture followed by a time-of-flight section to measure the energy and mass per charge (m/q) of the ambient ions. SPAN-Ai's main objective is to measure solar wind ions within an energy range of 5 eV - 20 keV, a mass/q between 1 and 60 amu/q, and a field of view of 240° x 120°. Here we will show flight calibration results and performance.

  3. Analyzing delay causes in Egyptian construction projects

    Directory of Open Access Journals (Sweden)

    Mohamed M. Marzouk

    2014-01-01

    Construction delays are common problems in civil engineering projects in Egypt. These problems occur frequently during the project lifetime, leading to disputes and litigation. Therefore, it is essential to study and analyze the causes of construction delays. This research presents a list of construction delay causes retrieved from the literature. The feedback of construction experts was obtained through interviews, and subsequently a questionnaire survey was prepared and distributed to thirty-three construction experts representing owners, consultants, and contractors' organizations. A Frequency Index, Severity Index, and Importance Index are calculated, and according to their highest values the top ten delay causes of construction projects in Egypt are determined. A case study is analyzed and compared against the most important delay causes identified in the research. Statistical analysis is carried out using the analysis of variance (ANOVA) method to test the delay causes obtained from the survey. The test results reveal good correlation between groups, while there are significant differences between them for some delay causes. Finally, a roadmap for prioritizing delay cause groups is presented.
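The three indices follow a standard relative-importance formulation, shown here in one common form (the paper's exact weighting scheme may differ): each index is the summed ratings as a fraction of the maximum possible, and the Importance Index combines frequency and severity.

```python
def index_pct(ratings, scale_max=4):
    """Frequency or Severity Index (%) from Likert-scale survey ratings:
    100 * sum(ratings) / (scale_max * number_of_respondents)."""
    return 100.0 * sum(ratings) / (scale_max * len(ratings))

def importance_index(freq_ratings, sev_ratings, scale_max=4):
    """Importance Index (%) = Frequency Index * Severity Index / 100."""
    fi = index_pct(freq_ratings, scale_max)
    si = index_pct(sev_ratings, scale_max)
    return fi * si / 100.0
```

Ranking every cause by its Importance Index across all respondents yields a "top ten" list of the kind the survey reports.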

  4. Analyzing rare diseases terms in biomedical terminologies

    Directory of Open Access Journals (Sweden)

    Erika Pasceri

    2012-03-01

    Full Text Available Rare disease patients too often face common problems, including the lack of access to correct diagnosis, lack of quality information on the disease, lack of scientific knowledge of the disease, inequities and difficulties in access to treatment and care. These things could be changed by implementing a comprehensive approach to rare diseases, increasing international cooperation in scientific research, by gaining and sharing scientific knowledge about and by developing tools for extracting and sharing knowledge. A significant aspect to analyze is the organization of knowledge in the biomedical field for the proper management and recovery of health information. For these purposes, the sources needed have been acquired from the Office of Rare Diseases Research, the National Organization of Rare Disorders and Orphanet, organizations that provide information to patients and physicians and facilitate the exchange of information among different actors involved in this field. The present paper shows the representation of rare diseases terms in biomedical terminologies such as MeSH, ICD-10, SNOMED CT and OMIM, leveraging the fact that these terminologies are integrated in the UMLS. At the first level, it was analyzed the overlap among sources and at a second level, the presence of rare diseases terms in target sources included in UMLS, working at the term and concept level. We found that MeSH has the best representation of rare diseases terms.

  5. Analyzing Virtual Physics Simulations with Tracker

    Science.gov (United States)

    Claessens, Tom

    2017-12-01

    In the physics teaching community, Tracker is well known as a user-friendly open source video analysis software, authored by Douglas Brown. With this tool, the user can trace markers indicated on a video or on stroboscopic photos and perform kinematic analyses. Tracker also includes a data modeling tool that allows one to fit theoretical equations of motion onto experimentally obtained data. In the field of particle mechanics, Tracker has been effectively used for learning and teaching about projectile motion, "toss up" and free-fall vertical motion, and to explain the principle of mechanical energy conservation. Also, Tracker has been successfully used in rigid body mechanics to interpret the results of experiments with rolling/slipping cylinders and moving rods. In this work, I propose an original method in which Tracker is used to analyze virtual computer simulations created with a physics-based motion solver, instead of analyzing video recordings or stroboscopic photos. This could be an interesting approach to study kinematics and dynamics problems in physics education, in particular when there is no or limited access to physical labs. I demonstrate the working method with a typical (but quite challenging) problem in classical mechanics: a slipping/rolling cylinder on a rough surface.
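    Tracker's data-modeling step amounts to least-squares fitting of an equation of motion to tracked positions. A self-contained sketch of that idea (plain normal equations, not Tracker's actual code) for the free-fall parabola y = c + v0*t - (g/2)*t^2:

```python
def fit_quadratic(ts, ys):
    """Least-squares fit of y = a*t**2 + b*t + c via the 3x3 normal
    equations; for free fall the leading coefficient a recovers -g/2.
    Returns [a, b, c]."""
    cols = [[t * t for t in ts], list(ts), [1.0] * len(ts)]
    a = [[sum(u * v for u, v in zip(ci, cj)) for cj in cols] for ci in cols]
    b = [sum(u * y for u, y in zip(ci, ys)) for ci in cols]
    for i in range(3):                    # forward elimination
        for j in range(i + 1, 3):
            f = a[j][i] / a[i][i]
            a[j] = [u - f * v for u, v in zip(a[j], a[i])]
            b[j] -= f * b[i]
    x = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                   # back substitution
        x[i] = (b[i] - sum(a[i][j] * x[j] for j in range(i + 1, 3))) / a[i][i]
    return x
```

Feeding it positions exported from a simulation (or from tracked video markers) recovers the model parameters the same way Tracker's fit tool does.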

  6. Automatic analyzing device for chlorine ion

    International Nuclear Information System (INIS)

    Sugibayashi, Shinji; Morikawa, Yoshitake; Fukase, Kazuo; Kashima, Hiromasa.

    1997-01-01

    The present invention provides a device for automatically analyzing trace amounts of chlorine ions contained in the feedwater, condensate and reactor water of a BWR type power plant. Namely, zero-adjustment or span calibration in this device is conducted as follows. (1) A standard chlorine ion liquid is supplied from a tank to a mixer by a constant volume pump, and the liquid is diluted and mixed with purified water to form a standard liquid. (2) The pH of the standard liquid is adjusted by a pH adjuster. (3) The standard liquid is supplied to an electrode cell to conduct zero adjustment or span calibration. Chlorine ions in a specimen are measured by the device of the present invention as follows. (1) The specimen is supplied to a head tank through a line filter. (2) The pH of the specimen is adjusted by a pH adjuster. (3) The specimen is supplied to an electrode cell to electrically measure the concentration of the chlorine ions in the specimen. The device of the present invention can automatically analyze trace amounts of chlorine ions with high accuracy, improving sensitivity while reducing operator burden and radiation exposure. (I.S.)

  7. Plutonium solution analyzer. Revised February 1995

    International Nuclear Information System (INIS)

    Burns, D.A.

    1995-02-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%--0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40--240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4--4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 ml of each sample and standard, and generates waste at the rate of about 1.5 ml per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly, and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded).

  8. Optoacoustic 13C-breath test analyzer

    Science.gov (United States)

    Harde, Hermann; Helmrich, Günther; Wolff, Marcus

    2010-02-01

    The composition and concentration of exhaled volatile gases reflect the physical condition of a patient. Therefore, breath analysis makes it possible to recognize an infectious disease in an organ or even to identify a tumor. One of the most prominent breath tests is the 13C-urea breath test, applied to ascertain the presence of the bacterium Helicobacter pylori in the stomach wall as an indication of a gastric ulcer. In this contribution we present a new optical analyzer that employs a compact and simple set-up based on photoacoustic spectroscopy. It consists of two identical photoacoustic cells containing two breath samples, one taken before and one after administration of an isotope-marked substrate in which the most common isotope 12C is replaced to a large extent by 13C. The analyzer measures simultaneously the relative CO2 isotopologue concentrations in both samples by exciting the molecules on specially selected absorption lines with a semiconductor laser operating at a wavelength of 2.744 μm. For a reliable diagnosis, changes of the 13CO2 concentration of 1% in the exhaled breath have to be detected at a concentration level of this isotope in the breath of about 500 ppm.
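    The measured quantity in such breath tests is usually expressed as delta-13C relative to the VPDB standard, and the diagnostic signal as the delta over baseline (DOB) between the two samples. These are standard isotope-ratio conventions assumed here, not details taken from the paper:

```python
VPDB = 0.0112372  # 13C/12C isotope ratio of the VPDB reference standard

def delta13c(c13, c12):
    """delta-13C in per mil relative to VPDB, from the two isotopologue
    concentrations (any common unit, e.g. ppm)."""
    return ((c13 / c12) / VPDB - 1.0) * 1000.0

def delta_over_baseline(pre, post):
    """DOB (per mil): change in delta-13C from the pre-substrate to the
    post-substrate breath sample; a rise signals 13C-urea hydrolysis."""
    return delta13c(*post) - delta13c(*pre)
```

A 1% relative rise of 13CO2 at a ~500 ppm level corresponds to a DOB of roughly 10 per mil, which sets the resolution the photoacoustic cells must deliver.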

  9. Performance evaluation of Samsung LABGEO(HC10) Hematology Analyzer.

    Science.gov (United States)

    Park, Il Joong; Ahn, Sunhyun; Kim, Young In; Kang, Seon Joo; Cho, Sung Ran

    2014-08-01

    The Samsung LABGEO(HC10) Hematology Analyzer (LABGEO(HC10)) is a recently developed automated hematology analyzer that uses impedance technologies. The analyzer provides 18 parameters including a 3-part differential at a maximum rate of 80 samples per hour. The aim of this study was to evaluate the performance of the LABGEO(HC10). We evaluated precision, linearity, carryover, and the relationship for complete blood cell count parameters between the LABGEO(HC10) and the LH780 (Beckman Coulter Inc) in a university hospital in Korea according to the Clinical and Laboratory Standards Institute guidelines. Sample stability and differences due to the anticoagulant used (K₂EDTA versus K₃EDTA) were also evaluated. The LABGEO(HC10) showed linearity over a wide range, minimal carryover, and good correlation with the LH780 (r > 0.92) except for mean corpuscular hemoglobin concentration. The bias estimated was acceptable for all parameters investigated except for monocyte count. Most parameters were stable for up to 24 hours both at room temperature and at 4°C. The difference by anticoagulant type was statistically insignificant for all parameters except for a few red cell parameters. The accurate results achievable and simplicity of operation make the unit recommendable for small to medium-sized laboratories.
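    Carryover in such evaluations is commonly quantified with a high-low triplicate protocol; the formula below is the conventional calculation and is an assumption here, not quoted from the study:

```python
def carryover_percent(high_triplicate, low_triplicate):
    """CLSI-style carryover: run a high sample three times (h1..h3), then
    a low sample three times (l1..l3). Carryover (%) is the excess in the
    first low result relative to the high-low span:
    100 * (l1 - l3) / (h3 - l3)."""
    h3 = high_triplicate[2]
    l1, l3 = low_triplicate[0], low_triplicate[2]
    return 100.0 * (l1 - l3) / (h3 - l3)
```

"Minimal carryover" in an evaluation like this one means the computed percentage stays below the acceptance limit for each blood-count parameter.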

  10. Automated Root Tracking with "Root System Analyzer"

    Science.gov (United States)

    Schnepf, Andrea; Jin, Meina; Ockert, Charlotte; Bol, Roland; Leitner, Daniel

    2015-04-01

    Crucial factors for plant development are water and nutrient availability in soils. Thus, root architecture is a main aspect of plant productivity and needs to be accurately considered when describing root processes. Images of root architecture contain a huge amount of information, and image analysis helps to recover parameters describing certain root architectural and morphological traits. The majority of imaging systems for root systems are designed for two-dimensional images, such as RootReader2, GiA Roots, SmartRoot, EZ-Rhizo, and Growscreen, but most of them are semi-automated, requiring mouse-clicks on each root by the user. "Root System Analyzer" is a new, fully automated approach for recovering root architectural parameters from two-dimensional images of root systems. Individual roots can still be corrected manually in a user interface if required. The algorithm starts with a sequence of segmented two-dimensional images showing the dynamic development of a root system. For each image, morphological operators are used for skeletonization. Based on this, a graph representation of the root system is created. A dynamic root architecture model helps to determine which edges of the graph belong to an individual root. The algorithm elongates each root at the root tip and simulates growth confined within the already existing graph representation. The increment of root elongation is calculated assuming constant growth. For each root, the algorithm finds all possible paths and elongates the root in the direction of the optimal path. In this way, each edge of the graph is assigned to one or more coherent roots. Image sequences of root systems are handled in such a way that the previous image is used as a starting point for the current image. The algorithm is implemented in a set of Matlab m-files. Output of Root System Analyzer is a data structure that includes for each root an identification number, the branching order, the time of emergence, the parent
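    The "optimal path" step can be illustrated with a toy criterion: among candidate continuations in the graph, elongate the root along the path that keeps growth straightest. The cost function here is a stand-in assumption; the paper's dynamic root architecture model is more elaborate:

```python
import math

def straightness_cost(path):
    """Sum of turning angles along a polyline path [(x, y), ...]; a
    straight continuation costs 0."""
    cost = 0.0
    for (x0, y0), (x1, y1), (x2, y2) in zip(path, path[1:], path[2:]):
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        d = abs(a2 - a1)
        cost += min(d, 2 * math.pi - d)
    return cost

def pick_optimal_path(candidate_paths):
    """Elongate the root along the candidate path with the smallest
    turning cost ("straightest path wins" stands in for the paper's
    optimality criterion)."""
    return min(candidate_paths, key=straightness_cost)
```

Run once per root tip and per image, this is how each skeleton edge ends up assigned to a coherent root.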

  11. Buccal microbiology analyzed by infrared spectroscopy

    Science.gov (United States)

    de Abreu, Geraldo Magno Alves; da Silva, Gislene Rodrigues; Khouri, Sônia; Favero, Priscila Pereira; Raniero, Leandro; Martin, Airton Abrahão

    2012-01-01

    Rapid microbiological identification and characterization are very important in dentistry and medicine. In addition to dental diseases, pathogens are directly linked to cases of endocarditis, premature delivery, low birth weight, and loss of organ transplants. Fourier Transform Infrared Spectroscopy (FTIR) was used to analyze oral pathogens Aggregatibacter actinomycetemcomitans ATCC 29523, Aggregatibacter actinomycetemcomitans-JP2, and Aggregatibacter actinomycetemcomitans which was clinically isolated from the human blood-CI. Significant spectra differences were found among each organism allowing the identification and characterization of each bacterial species. Vibrational modes in the regions of 3500-2800 cm-1, the 1484-1420 cm-1, and 1000-750 cm-1 were used in this differentiation. The identification and classification of each strain were performed by cluster analysis achieving 100% separation of strains. This study demonstrated that FTIR can be used to decrease the identification time, compared to the traditional methods, of fastidious buccal microorganisms associated with the etiology of the manifestation of periodontitis.
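    The cluster analysis that separated the strains can be sketched as a naive single-linkage agglomerative clustering of absorbance vectors; Euclidean distance and the stopping rule are assumptions, not the study's exact settings:

```python
def distance(a, b):
    """Euclidean distance between two spectra given as value lists."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def single_linkage(spectra, n_clusters):
    """Naive agglomerative clustering (single linkage): repeatedly merge
    the two closest clusters until n_clusters remain. Returns a list of
    clusters, each a list of spectrum indices."""
    clusters = [[i] for i in range(len(spectra))]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(distance(spectra[a], spectra[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)
    return clusters
```

With spectra restricted to the discriminating bands (3500-2800, 1484-1420, 1000-750 cm-1), 100% separation of strains means each final cluster contains replicates of a single strain only.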

  12. Sensors and Automated Analyzers for Radionuclides

    International Nuclear Information System (INIS)

    Grate, Jay W.; Egorov, Oleg B.

    2003-01-01

    The production of nuclear weapons materials has generated large quantities of nuclear waste and significant environmental contamination. We have developed new, rapid, automated methods for determination of radionuclides using sequential injection methodologies to automate extraction chromatographic separations, with on-line flow-through scintillation counting for real time detection. This work has progressed in two main areas: radionuclide sensors for water monitoring and automated radiochemical analyzers for monitoring nuclear waste processing operations. Radionuclide sensors have been developed that collect and concentrate radionuclides in preconcentrating minicolumns with dual functionality: chemical selectivity for radionuclide capture and scintillation for signal output. These sensors can detect pertechnetate to below regulatory levels and have been engineered into a prototype for field testing. A fully automated process monitor has been developed for total technetium in nuclear waste streams. This instrument performs sample acidification, speciation adjustment, separation and detection in fifteen minutes or less

  13. Nuclear Plant Analyzer: Installation manual. Volume 1

    International Nuclear Information System (INIS)

    Snider, D.M.; Wagner, K.L.; Grush, W.H.; Jones, K.R.

    1995-01-01

    This report contains the installation instructions for the Nuclear Plant Analyzer (NPA) System. The NPA System consists of the Computer Visual System (CVS) program, the NPA libraries, and the associated utility programs. The NPA was developed at the Idaho National Engineering Laboratory under the sponsorship of the US Nuclear Regulatory Commission to provide a highly flexible graphical user interface for displaying the results of analysis codes. The NPA also provides the user with a convenient means of interactively controlling the host program through user-defined pop-up menus. The NPA was designed to serve primarily as an analysis tool. After a brief introduction to the Computer Visual System and the NPA, an analyst can quickly create a simple picture or set of pictures to aid in the study of a particular phenomenon. These pictures can range from simple collections of square boxes and straight lines to complex representations of emergency response information displays.

  14. Method and apparatus for analyzing ionizable materials

    International Nuclear Information System (INIS)

    Ehrlich, B.J.; Hall, R.C.; Thiede, P.W.

    1979-01-01

    An apparatus and method are described for analyzing a solution of ionizable compounds in a liquid. The solution is irradiated with electromagnetic radiation to ionize the compounds and the electrical conductivity of the solution is measured. The radiation may be X-rays, ultra-violet, infra-red or microwaves. The solution may be split into two streams, only one of which is irradiated, the other being used as a reference by comparing conductivities of the two streams. The liquid must be nonionizable and is preferably a polar solvent. The invention provides an analysis technique useful in liquid chromatography and in gas chromatography after dissolving the eluted gases in a suitable solvent. Electrical conductivity measurements performed on the irradiated eluent provide a quantitative indication of the ionizable materials existing within the eluent stream and a qualitative indication of the purity of the eluent stream. (author)

  15. Analyzing Options for Airborne Emergency Wireless Communications

    Energy Technology Data Exchange (ETDEWEB)

    Michael Schmitt; Juan Deaton; Curt Papke; Shane Cherry

    2008-03-01

    In the event of large-scale natural or manmade catastrophic events, access to reliable and enduring commercial communication systems is critical. Hurricane Katrina provided a recent example of the need to ensure communications during a national emergency. To ensure that communication demands are met during these critical times, Idaho National Laboratory (INL), under the guidance of United States Strategic Command, has studied infrastructure issues, concerns, and vulnerabilities associated with an airborne wireless communications capability. Such a capability could provide emergency wireless communications until public/commercial nodes can be systematically restored. This report focuses on the airborne cellular restoration concept, analyzing basic infrastructure requirements, identifying related infrastructure issues, concerns, and vulnerabilities, and offering recommended solutions.

  16. Analyzing and forecasting the European social climate

    Directory of Open Access Journals (Sweden)

    Liliana DUGULEANĂ

    2015-06-01

    The paper uses the results of the Eurobarometer sample survey, which has been requested by the European Commission. The social climate index is used to measure the level of perceptions of the population, taking into account their personal situation and their perspective at national level. The paper analyzes the evolution of the social climate indices for the countries of the European Union and offers information about the expectations of the population of the analyzed countries. The obtained results can be compared with the Eurobarometer forecasts, over a short term of one year and a medium term of five years. Modelling the social climate index and its influence factors offers useful information about the efficiency of social protection and inclusion policies.

  17. Analyzing Demand: Hegemonic Masculinity and Feminine Prostitution

    Directory of Open Access Journals (Sweden)

    Beatriz Ranea Triviño

    2016-12-01

    In this article we present an exploratory study analyzing the relationship between the construction of hegemonic masculinity and the consumption of female prostitution. We focused our attention on the experiences, attitudes and perceptions of young heterosexual men who have ever paid for sex. Using a qualitative method of analysis, we conducted six semi-structured interviews with men between 18 and 35 years old. The analysis of the interviews covers demographic characteristics, frequency of payment for sexual services, diversity of motivations, the spaces where prostitutes are sought, and opinions on prostitution and prostitutes. The main conclusion of this study is that the discourses of the interviewees reproduce gender stereotypes and sexual gender roles, and it is suggested that prostitution can be interpreted as a scenario where these men perform their hegemonic masculinity.

  18. Using wavelet features for analyzing gamma lines

    International Nuclear Information System (INIS)

    Medhat, M.E.; Abdel-hafiez, A.; Hassan, M.F.; Ali, M.A.; Uzhinskii, V.V.

    2004-01-01

    Data processing methods for analyzing gamma-ray spectra with symmetric bell-shaped peaks are considered. In many cases the peak form is a symmetric bell shape; the Gaussian case in particular is the most often used, for many physical reasons. The problem is how to evaluate the parameters of such peaks, i.e. their positions, amplitudes and half-widths, both for a single peak and for overlapped peaks. Using the Marr wavelet (Mexican hat) in a correlation method, it is possible to estimate the optimal wavelet parameters and to locate peaks in the spectrum. Comparison of the proposed method with others shows the better quality of the wavelet transform method.
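    A minimal version of the Marr-wavelet correlation idea: sample the Mexican-hat wavelet and correlate it with the spectrum, so that response maxima mark Gaussian-like peaks of comparable half-width. Widths and normalization here are illustrative assumptions:

```python
import math

def ricker(width, length):
    """Sampled Mexican-hat (Marr/Ricker) wavelet of the given width."""
    centre = (length - 1) / 2.0
    return [(1.0 - ((i - centre) / width) ** 2)
            * math.exp(-((i - centre) / width) ** 2 / 2.0)
            for i in range(length)]

def wavelet_response(spectrum, width):
    """Correlate the spectrum with a Ricker wavelet (edges clipped);
    maxima of the response locate bell-shaped lines even when they sit
    on a smooth background, since the wavelet has zero mean."""
    w = ricker(width, 6 * int(width) + 1)
    half = len(w) // 2
    n = len(spectrum)
    return [sum(w[k] * spectrum[i + k - half]
                for k in range(len(w)) if 0 <= i + k - half < n)
            for i in range(n)]
```

Scanning `width` over a range and keeping responses that persist across scales is the usual way to separate true lines from statistical noise.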

  19. Analyzing petabytes of data with Hadoop

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    Abstract The open source Apache Hadoop project provides a powerful suite of tools for storing and analyzing petabytes of data using commodity hardware. After several years of production use inside of web companies like Yahoo! and Facebook and nearly a year of commercial support and development by Cloudera, the technology is spreading rapidly through other disciplines, from financial services and government to life sciences and high energy physics. The talk will motivate the design of Hadoop and discuss some key implementation details in depth. It will also cover the major subprojects in the Hadoop ecosystem, go over some example applications, highlight best practices for deploying Hadoop in your environment, discuss plans for the future of the technology, and provide pointers to the many resources available for learning more. In addition to providing more information about the Hadoop platform, a major goal of this talk is to begin a dialogue with the ATLAS research team on how the tools commonly used in t...

  20. Analyzer of neutron flux in real time

    International Nuclear Information System (INIS)

    Rojas S, A.S.; Carrillo M, R.A.; Balderas, E.G.

    1999-01-01

    Based on the study of real neutron flux signals from instability events that occurred at the Laguna Verde nuclear power plant, where the core oscillation phenomena of the reactor lie in the 0 to 2.5 Hz range, the possibility has been identified of developing surveillance and diagnostic equipment capable of analyzing the behavior of the core in this frequency range in real time. An important method for surveillance of reactor core stability is the use of the power spectral density, which makes it possible to determine the frequencies and amplitudes contained in the signals. The instrument is implemented in LabVIEW graphical programming with a 16-channel data acquisition card and runs in a Windows 95/98 environment. (Author)
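    The core of such a monitor is the power spectral density estimate. A plain-Python periodogram (a LabVIEW instrument would use optimized FFT routines) showing how the dominant frequency in a band like 0-2.5 Hz is extracted:

```python
import cmath

def power_spectrum(samples, fs):
    """Single-sided periodogram of a real signal sampled at fs Hz:
    returns a list of (frequency_hz, power) pairs via a direct DFT."""
    n = len(samples)
    return [(k * fs / n,
             abs(sum(samples[j] * cmath.exp(-2j * cmath.pi * k * j / n)
                     for j in range(n))) ** 2 / n)
            for k in range(n // 2 + 1)]

def dominant_frequency(samples, fs):
    """Frequency of the strongest non-DC spectral line, e.g. a core
    oscillation mode expected below 2.5 Hz."""
    return max(power_spectrum(samples, fs)[1:], key=lambda p: p[1])[0]
```

Tracking the dominant line's frequency and amplitude over successive windows is the basis of a stability surveillance display.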

  1. Nuclear plant analyzer development and analysis applications

    International Nuclear Information System (INIS)

    Laats, E.T.

    1984-10-01

    The Nuclear Plant Analyzer (NPA) is being developed as the US Nuclear Regulatory Commission's (NRC's) state-of-the-art safety analysis and engineering tool to address key nuclear plant safety issues. This paper describes four applications of the NPA in assisting reactor safety analyses. Two analyses evaluated reactor operating procedures during off-normal operation for a pressurized water reactor (PWR) and a boiling water reactor (BWR), respectively. The third analysis was performed in support of a reactor safety experiment conducted in the Semiscale facility. The final application demonstrated the usefulness of atmospheric dispersion computer codes for site emergency planning purposes. An overview of the NPA and how it supported these analyses are the topics of this paper.

  2. Analyzing Strategic Business Rules through Simulation Modeling

    Science.gov (United States)

    Orta, Elena; Ruiz, Mercedes; Toro, Miguel

    Service Oriented Architecture (SOA) holds promise for business agility since it allows business processes to change to meet new customer demands or market needs without causing a cascade effect of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT collaborate. In this paper, we propose the utilization of simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal aims to help find a good configuration for strategic business objectives and IT parameters. The paper includes a case study where a simulation model is built to help business decision-making in a context where finding a good configuration for different business parameters and performance is too complex to analyze by trial and error.

  3. Nuclear plant analyzer development at INEL

    International Nuclear Information System (INIS)

    Laats, E.T.; Russell, K.D.; Stewart, H.D.

    1983-01-01

    The Office of Nuclear Regulatory Research of the US Nuclear Regulatory Commission (NRC) has sponsored development of a software-hardware system called the Nuclear Plant Analyzer (NPA). This paper describes the status of the NPA project at the INEL after one year of development. When completed, the NPA will be an integrated network of analytical tools for performing reactor plant analyses. Development of the NPA in FY-1983 progressed along two parallel pathways, namely conceptual planning and software development. Regarding NPA planning, an extensive effort was conducted to define the functional requirements of the NPA, the conceptual design, and hardware needs. Regarding software development conducted in FY-1983, all development was aimed toward demonstrating the basic concept and feasibility of the NPA. Nearly all software was developed and resides on the INEL twin Control Data Corporation 176 mainframe computers.

  4. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    as the software development and distribution by a set of actors dependent on each other and the ecosystem. We commence on the hypothesis that the establishment of a software ecosystem on the telemedicine services of Denmark would address these issues and investigate how a software ecosystem can foster the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) strong social network of actors, (iv) robust business

  5. Structural factoring approach for analyzing stochastic networks

    Science.gov (United States)

    Hayhurst, Kelly J.; Shier, Douglas R.

    1991-01-01

    The problem of finding the distribution of the shortest path length through a stochastic network is investigated. A general algorithm for determining the exact distribution of the shortest path length is developed based on the concept of conditional factoring, in which a directed, stochastic network is decomposed into an equivalent set of smaller, generally less complex subnetworks. Several network constructs are identified and exploited to reduce significantly the computational effort required to solve a network problem relative to complete enumeration. This algorithm can be applied to two important classes of stochastic path problems: determining the critical path distribution for acyclic networks and the exact two-terminal reliability for probabilistic networks. Computational experience with the algorithm was encouraging and allowed the exact solution of networks that have been previously analyzed only by approximation techniques.
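    For a small network, the exact shortest-path-length distribution can be obtained by the complete enumeration that conditional factoring is designed to beat; this brute-force baseline makes the target quantity concrete (independent discrete edge lengths are assumed):

```python
from itertools import product

def shortest_path_distribution(edges, paths, edge_dists):
    """Exact distribution of the shortest source-sink path length by
    complete enumeration of all edge-length realizations.

    edges: list of edge names; paths: source-sink paths, each a list of
    edge names; edge_dists: {edge: [(length, probability), ...]}.
    Returns {path_length: probability}."""
    dist = {}
    for combo in product(*(edge_dists[e] for e in edges)):
        lengths = {e: c[0] for e, c in zip(edges, combo)}
        prob = 1.0
        for c in combo:
            prob *= c[1]
        sp = min(sum(lengths[e] for e in p) for p in paths)
        dist[sp] = dist.get(sp, 0.0) + prob
    return dist
```

The enumeration grows exponentially with the number of edges, which is exactly why the factoring approach decomposes the network into smaller subnetworks instead.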

  6. RELAP5 nuclear plant analyzer capabilities

    International Nuclear Information System (INIS)

    Wagner, R.J.; Ransom, V.H.

    1982-01-01

    An interactive execution capability has been developed for the RELAP5 code which permits it to be used as a Nuclear Plant Analyzer. This capability has been demonstrated using a simplified primary and secondary loop model of a PWR. A variety of loss-of-feedwater accidents have been simulated using this model. The computer execution time on a CDC Cyber 176 is one half of the transient simulation time, so that the results can be displayed in real time. The results of the demonstration problems are displayed in digital form on a color schematic of the plant model using a Tektronix 4027 CRT terminal. The interactive feature allows the user to enter commands in much the same manner as a reactor operator.

  7. Statistical network analysis for analyzing policy networks

    DEFF Research Database (Denmark)

    Robins, Garry; Lewis, Jenny; Wang, Peng

    2012-01-01

    To analyze social network data using standard statistical approaches is to risk incorrect inference. The dependencies among observations implied in a network conceptualization undermine standard assumptions of the usual general linear models. One of the most quickly expanding areas of social and policy network methodology is the development of statistical modeling approaches that can accommodate such dependent data. In this article, we review three network statistical methods commonly used in the current literature: quadratic assignment procedures, exponential random graph models (ERGMs), and stochastic actor-oriented models. We focus most attention on ERGMs by providing an illustrative example of a model for a strategic information network within a local government. We draw inferences about the structural role played by individuals recognized as key innovators and conclude that such an approach

  8. Diffractive interference optical analyzer (DiOPTER)

    Science.gov (United States)

    Sasikumar, Harish; Prasad, Vishnu; Pal, Parama; Varma, Manoj M.

    2016-03-01

    This report demonstrates a method for high-resolution refractometric measurements using what we have termed a Diffractive Interference Optical Analyzer (DiOPTER). The setup consists of a laser, a polarizer, a transparent diffraction grating and Si-photodetectors. The sensor is based on the differential response of diffracted orders to bulk refractive index changes. In this setup, the differential read-out of the diffracted orders suppresses signal drifts and enables time-resolved determination of refractive index changes in the sample cell. A remarkable feature of this device is that under appropriate conditions, the measurement sensitivity of the sensor can be enhanced by more than two orders of magnitude due to interference between multiply reflected diffracted orders. A noise-equivalent limit of detection (LoD) of 6x10^-7 RIU was achieved in glass. This work focuses on devices with an integrated sample well, made of low-cost PDMS. As the detection methodology is experimentally straightforward, it can be used across a wide array of applications, ranging from detecting changes in surface adsorbates via binding reactions to estimating refractive index (and hence concentration) variations in bulk samples. An exciting prospect of this technique is the potential integration of this device with smartphones using a simple interface based on a transmission mode configuration. In a transmission configuration, we were able to achieve an LoD of 4x10^-4 RIU, which is sufficient to explore several applications in food quality testing and related fields. We are envisioning the future of this platform as a personal handheld optical analyzer for applications ranging from environmental sensing to healthcare and quality testing of food products.
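    The drift-suppressing differential read-out can be stated in two lines: normalize the difference of two diffracted-order intensities by their sum, so that multiplicative common-mode fluctuations cancel. A minimal sketch (signal names are illustrative, not from the paper):

```python
def differential_signal(order_a, order_b):
    """Normalized difference (A - B) / (A + B) of two diffracted-order
    intensity traces. Multiplicative common-mode drifts (laser power,
    detector temperature) scale A and B equally and so cancel, leaving
    the differential refractive-index response."""
    return [(a - b) / (a + b) for a, b in zip(order_a, order_b)]
```

Scaling both channels by the same drift factor leaves the output unchanged, which is the property that enables time-resolved refractometry without a stabilized source.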

  9. Relativistic effects in the calibration of electrostatic electron analyzers. I. Toroidal analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Keski Rahkonen, O [Helsinki University of Technology, Espoo (Finland). Laboratory of Physics; Krause, M O [Oak Ridge National Lab., Tenn. (USA)

    1978-02-01

    Relativistic correction terms up to the second order are derived for the kinetic energy of an electron travelling along the circular central trajectory of a toroidal analyzer. Furthermore, a practical energy calibration equation of the spherical sector plate analyzer is written for the variable-plate-voltage recording mode. Accurate measurements with a spherical analyzer performed using kinetic energies from 600 to 2100 eV are in good agreement with this theory showing our approximation (neglect of fringing fields, and source and detector geometry) is realistic enough for actual calibration purposes.
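    The flavor of the correction can be reproduced for a circular electrostatic deflector, where the orbit condition q*E_field*r = gamma*m*v^2 ties the plate voltage to the kinetic energy T. The second-order series below is this generic result, not the paper's exact toroidal-geometry formula:

```python
MC2_EV = 510998.95  # electron rest energy, eV

def exact_factor(t_ev):
    """Exact ratio (gamma*m*v**2) / T for kinetic energy T (eV): the
    quantity the deflection condition relates to the plate voltage.
    Nonrelativistically it equals 2; relativistically it is (gamma+1)/gamma
    with gamma = 1 + T/(m*c**2)."""
    gamma = 1.0 + t_ev / MC2_EV
    return (gamma + 1.0) / gamma

def series_factor(t_ev):
    """Second-order expansion 2 - tau + tau**2, tau = T/(m*c**2)."""
    tau = t_ev / MC2_EV
    return 2.0 - tau + tau * tau
```

At 2 keV the factor falls about 0.2% below the nonrelativistic value of 2, i.e. the deflecting voltage for a given kinetic energy is measurably smaller, consistent with the need for relativistic calibration at these energies.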

  10. Automatic Compensation of Workpiece Positioning Tolerances for Precise Laser

    Directory of Open Access Journals (Sweden)

    N. C. Stache

    2008-01-01

    Full Text Available Precise laser welding plays a fundamental role in the production of high-tech goods, particularly in precision engineering. In this field, precise adjustment and compensation of the positioning tolerances of the parts to be welded, with respect to the laser beam, are of paramount importance. This procedure mostly requires tedious and error-prone manual adjustment, which additionally results in a sharp increase in production costs. We therefore developed a system which automates and thus significantly accelerates this procedure. To this end, the welding machine is equipped with a camera to acquire high-resolution images of the parts to be welded. In addition, a software framework was developed which enables precise automatic position detection of these parts and adjusts the position of the welding contour correspondingly. As a result, the machine is rapidly prepared for welding and is much more flexible in adapting to unknown parts. This paper describes the entire concept of extending a conventional welding machine with means for image acquisition and position estimation. In addition to this description, the algorithms, the results of an evaluation of position estimation, and a final welding result are presented.
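    The abstract does not specify the position-detection algorithm; as one plausible sketch of the idea, normalized cross-correlation template matching recovers the translation of a part in the camera image (all data below are synthetic, and the brute-force search is for illustration only).

```python
import numpy as np

def find_offset(image, template):
    """Brute-force normalized cross-correlation; returns the (row, col)
    position of the best match of `template` inside `image`."""
    th, tw = template.shape
    ih, iw = image.shape
    t = template - template.mean()
    tnorm = np.sqrt((t * t).sum())
    best, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            w = w - w.mean()
            denom = np.sqrt((w * w).sum()) * tnorm
            if denom == 0:
                continue
            score = (w * t).sum() / denom
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

# Synthetic camera image: faint sensor noise plus a cross-shaped part
# fiducial placed at a known offset.
rng = np.random.default_rng(1)
img = 0.1 * rng.random((40, 40))
part = np.zeros((8, 8))
part[3:5, :] = 1.0
part[:, 3:5] = 1.0
img[12:20, 25:33] += part

offset = find_offset(img, part)
```

The detected offset can then be used to shift the programmed welding contour, which is the compensation step the paper describes.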

  11. Video control system for a drilling in furniture workpiece

    Science.gov (United States)

    Khmelev, V. L.; Satarov, R. N.; Zavyalova, K. V.

    2018-05-01

    Over the last five years, Russian industry has been undergoing robotization, which has presented scientific groups with new tasks. One of these tasks is machine vision for automatic quality control. Commercial systems of this type cost several thousand dollars each, a price that is prohibitive for regional small businesses. In this article, we describe the principle and algorithm of an inexpensive video control system that uses web cameras and a notebook or desktop computer as the computing unit.

  12. Analyzing Design Heating Loads in Superinsulated Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Arena, Lois [Consortium for Advanced Residential Buildings, Norwalk, CT (United States)

    2015-06-01

    Super-insulated homes offer many benefits, including improved comfort, reduced exterior noise penetration, lower energy bills, and the ability to withstand power and fuel outages under much more comfortable conditions than a typical home. While these homes aren't necessarily constructed with excessive mass in the form of concrete floors and walls, the amount of insulation and the increased thickness of the building envelope can lead to a mass effect, enabling the structure to store much more heat than a code-built home. The resulting high thermal inertia makes the building much less sensitive to drastic temperature swings, thereby decreasing the peak heating load demand. During the winter of 2013/2014, CARB monitored the energy use of three homes in climate zone 6 in an attempt to evaluate the accuracy of two different mechanical system sizing methods for low-load homes. Based on the results, it is recommended that internal and solar gains be included, and some credit for thermal inertia be applied, in sizing calculations for super-insulated homes.
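    A minimal sizing sketch (all numbers hypothetical, not from CARB's monitoring) shows why crediting internal and solar gains matters for a superinsulated envelope: when the envelope loss is small, a few hundred watts of gains is a meaningful fraction of the design load.

```python
# Illustrative design heating-load calculation (all numbers hypothetical).
ua_envelope = 120.0        # overall envelope conductance UA, W/K
t_in, t_out = 21.0, -20.0  # indoor setpoint and outdoor design temp, degC
internal_gains = 500.0     # occupants and appliances, W
solar_gains = 300.0        # design-day solar gain, W
credit = 0.5               # conservative fraction of gains credited at design

envelope_loss = ua_envelope * (t_in - t_out)        # steady-state loss, W
load_no_gains = envelope_loss                       # sizing with no credit
load_with_gains = envelope_loss - credit * (internal_gains + solar_gains)

# How much a no-credit calculation oversizes relative to the credited one.
oversize_pct = 100.0 * (load_no_gains / load_with_gains - 1.0)
```

Even with only half the gains credited, ignoring them inflates the design load by nearly 9% in this toy case; for tighter envelopes the effect is larger.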

  13. Mass Analyzers Facilitate Research on Addiction

    Science.gov (United States)

    2012-01-01

    The famous go/no go command for Space Shuttle launches comes from a place called the Firing Room. Located at Kennedy Space Center in the Launch Control Center (LCC), there are actually four Firing Rooms that take up most of the third floor of the LCC. These rooms comprise the nerve center for Space Shuttle launch and processing. Test engineers in the Firing Rooms operate the Launch Processing System (LPS), which is a highly automated, computer-controlled system for assembly, checkout, and launch of the Space Shuttle. LPS monitors thousands of measurements on the Space Shuttle and its ground support equipment, compares them to predefined tolerance levels, and then displays values that are out of tolerance. Firing Room operators view the data and send commands about everything from propellant levels inside the external tank to temperatures inside the crew compartment. In many cases, LPS will automatically react to abnormal conditions and perform related functions without test engineer intervention; however, firing room engineers continue to look at each and every happening to ensure a safe launch. Some of the systems monitored during launch operations include electrical, cooling, communications, and computers. One of the thousands of measurements derived from these systems is the amount of hydrogen and oxygen inside the shuttle during launch.

  14. Analyzing the development of Indonesia shrimp industry

    Science.gov (United States)

    Wati, L. A.

    2018-04-01

    This research aimed to analyze the development of the shrimp industry in Indonesia, using Porter's Diamond as the analytical framework. Porter's Diamond is a framework for industry analysis and business-strategy development; the companion Five Forces model identifies five forces that determine competitive intensity in an industry, namely (1) the threat of substitute products, (2) the intensity of competitive rivalry, (3) the threat of new entrants, (4) the bargaining power of suppliers, and (5) the bargaining power of consumers. The development of the Indonesian shrimp industry is fairly good, as explained by the Porter Diamond analysis. The analysis covers the Diamond's four main components, namely factor conditions; demand conditions; related and supporting industries; and firm strategy, structure, and rivalry, coupled with two supporting components (government regulation and the factor of chance). The results show that the two supporting components (government regulation and the factor of chance) have a positive effect. Related and supporting industries have a negative effect, as do firm strategy and structure; rivalry has a positive effect, and factor conditions have a positive effect (except for science and technology resources).

  15. Analyzing Spatiotemporal Anomalies through Interactive Visualization

    Directory of Open Access Journals (Sweden)

    Tao Zhang

    2014-06-01

    Full Text Available As we move into the big data era, data grows not just in size but also in complexity, containing a rich set of attributes, including location and time information, such as data from mobile devices (e.g., smart phones), natural disasters (e.g., earthquakes and hurricanes), epidemic spread, etc. Motivated by this rising challenge, we built a visualization tool for exploring generic spatiotemporal data, i.e., records containing time and location information and numeric attribute values. Since the values often evolve over time and across geographic regions, we are particularly interested in detecting and analyzing anomalous changes over time and space. Our analytic tool is based on a geographic information system and is combined with spatiotemporal data mining algorithms, as well as various data visualization techniques, such as anomaly grids and anomaly bars superimposed on the map. We study how effectively the tool may guide users to find potential anomalies by demonstrating and evaluating it over publicly available spatiotemporal datasets. The tool for spatiotemporal anomaly analysis and visualization is useful in many domains, such as security investigation and monitoring, situational awareness, etc.
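    The paper does not specify its mining algorithms; as a hedged sketch of the kind of detection that could feed an anomaly grid, a per-cell z-score over a space-time cube flags values that deviate strongly from that cell's own history.

```python
import numpy as np

def anomaly_grid(cube, z_thresh=3.0):
    """cube: (time, rows, cols) array of a numeric attribute aggregated
    onto a space-time grid. Flags cells whose value deviates from that
    cell's own temporal mean by more than z_thresh standard deviations."""
    mean = cube.mean(axis=0)
    std = cube.std(axis=0) + 1e-9        # guard against zero variance
    z = (cube - mean) / std
    return np.abs(z) > z_thresh

# Synthetic data: a quiet 8x8 grid over 60 time steps, with one injected
# spatiotemporal anomaly.
rng = np.random.default_rng(3)
cube = rng.normal(10.0, 1.0, size=(60, 8, 8))
cube[45, 2, 5] = 25.0
flags = anomaly_grid(cube)
```

The boolean `flags` cube maps directly onto the anomaly-grid visualization: each flagged (time, row, col) triple becomes a highlighted cell on the map at that time step.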

  16. Alternative approach to analyzing occupational mortality data

    International Nuclear Information System (INIS)

    Gilbert, E.S.; Buchanan, J.A.

    1984-01-01

    It is widely recognized that analyzing occupational mortality by calculating standardized mortality ratios based on death rates from the general population is subject to a number of limitations. An alternative approach described in this report takes advantage of the fact that comparisons of mortality by subgroup and assessments of trends in mortality are often of equal or greater interest than overall assessments, and that such comparisons do not require an external control. A computer program, MOX (Mortality and Occupational Exposure), is available for performing the needed calculations for several diseases. MOX was written to assess the effect of radiation exposure on Hanford nuclear workers. For this application, analyses have been based on cumulative exposure computed (by MOX) from annual records of radiation exposure obtained from personal dosimeter readings. The program provides tests for differences and trends among subcategories defined by variables such as length of employment, job category, or exposure measurements, and also controls for age, calendar year, and several other potentially confounding variables. 29 references, 2 tables
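    The internal-comparison idea can be sketched as follows (numbers hypothetical; MOX's actual statistics, with stratification by age and calendar year, are more elaborate): expected deaths per subgroup come from the pooled internal rate, so no external population is needed, and a weighted slope over ordered exposure categories serves as a crude trend check.

```python
import numpy as np

# Hypothetical observed deaths and person-years in three ordered
# cumulative-exposure categories (low, medium, high).
deaths = np.array([40.0, 25.0, 15.0])
pyears = np.array([50000.0, 25000.0, 10000.0])

pooled_rate = deaths.sum() / pyears.sum()   # internal, not external, referent
expected = pooled_rate * pyears             # expected deaths per category
oe_ratio = deaths / expected                # observed/expected, an internal
                                            # analogue of an SMR

# Crude linear-trend check: person-year-weighted slope of the death rate
# across exposure-category scores.
scores = np.array([0.0, 1.0, 2.0])
rates = deaths / pyears
slope = np.polyfit(scores, rates, 1, w=pyears)[0]
```

By construction, expected deaths sum exactly to observed deaths, which is the defining property of an internal comparison; a positive slope here indicates rates rising with exposure category.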

  17. Analyzing Design Heating Loads in Superinsulated Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Arena, Lois [Consortium for Advanced Residential Buildings, Norwalk, CT (United States)

    2015-06-16

    The U.S. Department of Energy's Building America research team Consortium for Advanced Residential Buildings (CARB) worked with the EcoVillage cohousing community in Ithaca, New York, on the Third Residential EcoVillage Experience neighborhood. This community-scale project consists of 40 housing units: 15 apartments and 25 single-family residences. Units range in size from 450 ft² to 1,664 ft² and cost from $80,000 for a studio apartment to $235,000 for a three- or four-bedroom single-family home. For the research component of this project, CARB analyzed current heating system sizing methods for superinsulated homes in cold climates to determine if changes in building load calculation methodology should be recommended. Actual heating energy use was monitored and compared to results from the Air Conditioning Contractors of America's Manual J8 (MJ8) and the Passive House Planning Package software. Results from that research indicate that MJ8 significantly oversizes heating systems for superinsulated homes and that thermal inertia and internal gains should be considered for more accurate load calculations.

  18. A framework to analyze emissions implications of ...

    Science.gov (United States)

    Future-year emissions depend strongly on the evolution of the economy, technology, and current and future regulatory drivers. A scenario framework was adopted to analyze various technology development pathways and societal changes, while considering existing regulations and future regulatory uncertainty, and to evaluate the resulting emissions growth patterns. The framework integrates EPA's energy systems model with an economic Input-Output (I/O) Life Cycle Assessment model. The EPAUS9r MARKAL database is assembled from a set of technologies to represent the U.S. energy system within the MARKAL bottom-up, technology-rich energy modeling framework. The general state of the economy and the consequent demands for goods and services from these sectors are taken exogenously in MARKAL. It is important to characterize exogenous inputs about the economy to appropriately represent the industrial sector outlook for each of the scenarios and case studies evaluated. An economic input-output (I/O) model of the U.S. economy is constructed to link up with MARKAL. The I/O model enables the user to change input requirements (e.g., energy intensity) for different sectors or the share of consumer income expended on a given good. This gives end users a mechanism for modeling change in the two dimensions of technological progress and consumer preferences that define the future scenarios. The framework will then be extended to include an environmental I/O framework to track life cycle emissions associated
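    The I/O linkage rests on the standard Leontief relation x = (I − A)⁻¹ d, where A holds technical coefficients and d is final demand; a minimal sketch (coefficients hypothetical, not EPA data) shows the "energy intensity" lever the abstract mentions.

```python
import numpy as np

# Hypothetical 3-sector technical-coefficient matrix A: entry A[i, j] is
# the input from sector i required per dollar of sector j's output.
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.05, 0.10],   # row 1: a stand-in "energy" sector
              [0.05, 0.10, 0.08]])
d = np.array([100.0, 50.0, 80.0])   # final demand by sector

# Leontief relation x = A x + d  =>  x = (I - A)^(-1) d
x = np.linalg.solve(np.eye(3) - A, d)

# Scenario lever: cut the energy intensity of sector 0 by 20 percent
# (less energy input per unit of sector-0 output) and recompute.
A_low = A.copy()
A_low[1, 0] *= 0.8
x_low = np.linalg.solve(np.eye(3) - A_low, d)
```

Changing a single coefficient propagates through the whole economy via the Leontief inverse, which is exactly the mechanism that lets the framework translate technology or preference scenarios into sectoral output (and hence emissions) changes.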

  19. Update on the USNRC's nuclear plant analyzer

    International Nuclear Information System (INIS)

    Laats, E.T.

    1987-01-01

    The Nuclear Plant Analyzer (NPA) is the U.S. Nuclear Regulatory Commission's (NRC's) state-of-the-art nuclear reactor simulation capability. This computer software package integrates high-fidelity nuclear reactor simulation codes, such as the TRAC and RELAP5 series of codes, with color graphics display techniques and advanced workstation hardware. An overview of this program was given at the 1984 Summer Computer Simulation Conference (SCSC), with selected topics discussed at the 1985 and 1986 SCSCs. This paper addresses these activities and related experiences. First, the Class VI computer implementation is discussed. The trade-offs between gaining significantly greater computational speed and central memory, and the loss of performance due to many more simultaneous users, are shown. Second, the goal of the super-minicomputer implementation is to produce a very cost-effective system that utilizes advanced (multi-dimensional, two-phase coolant) simulation capabilities at real wall-clock simulation times. Benchmarking of the initial super-minicomputer implementation is discussed. Finally, the technical and economic feasibility is addressed for implementing the super-minicomputer version of the NPA with the RELAP5 simulation code onto the Black Fox full-scope nuclear power plant simulator

  20. Analyzing the Pension System of the USSR

    Directory of Open Access Journals (Sweden)

    Aleksei V. Pudovkin

    2015-01-01

    Full Text Available The article under the title "ANALYSIS OF THE PENSION SYSTEM OF THE USSR" deals with numerous aspects of the development of the pension system of the former USSR. Since the improvement of the Russian pension system is presently high on the agenda, the author believes that analyzing Russia's own historical experience is essential, first and foremost, in order to create a sound and efficient pension system in Russia. The study presented in the article aims to execute an in-depth analysis of legislation on the Soviet pension system with the view to recreating the architecture of the pension system of the USSR. In addition, the study also reflects on the official statistics for the said period to draw a qualified and fundamental conclusion on the efficiency of the Soviet pension system. The evolution of the pension system, as traced through statistical data, clearly demonstrates the efficiency of the Soviet pension system. It is highly recommended that the positive aspects of the Soviet pension system be taken into consideration when reforming the present pension system of the Russian Federation.

  1. Analyzing Music Services Positioning Through Qualitative Research

    Directory of Open Access Journals (Sweden)

    Manuel Cuadrado

    2015-12-01

    Full Text Available Information technologies have produced new ways of distributing and consuming music, mainly among youth, in relation to both goods and services. In the case of goods, there has been a dramatic shift from traditional ways of buying and listening to music to new digital platforms. There has also been an evolution in music services. In this sense, live music concerts have been losing their audiences over the past few years, as have music radio stations, in favor of streaming platforms. Curious about this phenomenon, we conducted exploratory research in order to analyze how all these services, both traditional and new, are perceived. Specifically, we aimed to study youth's assessment of the three most relevant music service categories: music radio stations, digital streaming platforms, and pop-rock music festivals. To do so, we used the projective technique of image association to gather information. The population of the study consisted of individuals between 18 and 25 years of age. Our results, after content analysis, were poor due to weak spontaneous recall. We therefore repeated the study in a more focused way. The information gathered this time allowed us not only to better understand how all these organizations are positioned but also to obtain a list of descriptors to be used in a subsequent descriptive research study.

  2. Analyzing the Existing Undergraduate Engineering Leadership Skills

    Directory of Open Access Journals (Sweden)

    Hamed M. Almalki

    2016-12-01

    Full Text Available Purpose: To study and analyze undergraduate engineering students' leadership skills in order to discover their potential leadership strengths and weaknesses. This study unveils potential ways to enhance how we teach engineering leadership, and offers insights that might assist engineering programs in improving curricula to better prepare engineers for industry's demands. Methodology and Findings: 441 undergraduate engineering students were surveyed in two undergraduate engineering programs to assess their leadership skills. The results in both programs reveal that undergraduate engineering students are lagging behind in visionary leadership skills compared to the directing, including, and cultivating leadership styles. Recommendation: A practical framework is proposed to enhance the lacking leadership skills by utilizing the Matrix of Change (MOC) and the Balanced Scorecard (BSC) to capture the best leadership scenarios and design a virtual simulation environment targeting the lacking skill, in this case visionary leadership. The virtual simulation will then be used to provide experiential learning by replacing human beings with avatars that can be managed or dramatized by real people, enabling the creation of live, practical, measurable, and customizable leadership development programs.

  3. PSAIA – Protein Structure and Interaction Analyzer

    Directory of Open Access Journals (Sweden)

    Vlahoviček Kristian

    2008-04-01

    Full Text Available Abstract Background PSAIA (Protein Structure and Interaction Analyzer) was developed to compute geometric parameters for large sets of protein structures in order to predict and investigate protein-protein interaction sites. Results In addition to the most relevant established algorithms, PSAIA offers a new method, PIADA (Protein Interaction Atom Distance Algorithm), for the determination of residue interaction pairs. We found that PIADA produced more satisfactory results than comparable algorithms implemented in PSAIA. Particular advantages of PSAIA include its capacity to combine different methods to detect the locations and types of interactions between residues and its ability, without any further automation steps, to handle large numbers of protein structures and complexes. Generally, the integration of a variety of methods enables PSAIA to offer easier automation of analysis and greater reliability of results. PSAIA can be used either via a graphical user interface or from the command line. Results are generated in either tabular or XML format. Conclusion In a straightforward fashion and for large sets of protein structures, PSAIA enables the calculation of protein geometric parameters and the determination of the location and type of protein-protein interaction sites. XML-formatted output enables easy conversion of results to various formats suitable for statistical analysis. Results from smaller data sets demonstrated the influence of geometry on protein interaction sites. Comprehensive analysis of the properties of large data sets led to new information useful in the prediction of protein-protein interaction sites.

  4. A Methodology to Analyze Photovoltaic Tracker Uptime

    Energy Technology Data Exchange (ETDEWEB)

    Muller, Matthew T [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ruth, Dan [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-04-17

    A metric is developed to analyze the daily performance of single-axis photovoltaic (PV) trackers. The metric relies on comparing correlations between the daily time series of the PV power output and an array of simulated plane-of-array irradiances for the given day. Mathematical thresholds and a logic sequence are presented so the daily tracking metric can be applied in an automated fashion on large-scale PV systems. The results of applying the metric are visually examined against the time series of the power output data for a large number of days and for various systems. The visual inspection results suggest that, overall, the algorithm is accurate in identifying stuck or functioning trackers on clear-sky days. Visual inspection also shows that there are days, not classified by the metric, where the power output data may be sufficient to identify a stuck tracker. Based on the daily tracking metric, uptime results are calculated for 83 different inverters at 34 PV sites. The mean tracker uptime is calculated at 99% based on two different calculation methods. The daily tracking metric clearly has limitations, but as there are no existing metrics in the literature, it provides a valuable tool for flagging stuck trackers.
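    A simplified version of such a daily metric (profiles, margin, and threshold logic here are hypothetical; the paper's thresholds and logic sequence differ) correlates measured power with simulated irradiance for a tracking orientation and for a fixed, "stuck" orientation, and flags whichever fits better.

```python
import numpy as np

def classify_day(power, poa_tracking, poa_fixed, margin=0.05):
    """Hypothetical daily tracking check: correlate measured power with
    simulated plane-of-array irradiance for a tracking and a fixed
    orientation; label the day by whichever correlates better by
    `margin`, else leave it unclassified."""
    r_track = np.corrcoef(power, poa_tracking)[0, 1]
    r_fixed = np.corrcoef(power, poa_fixed)[0, 1]
    if r_track > r_fixed + margin:
        return "tracking"
    if r_fixed > r_track + margin:
        return "stuck"
    return "unclassified"

# Synthetic clear-sky day, 5-minute samples from 6:00 to 18:00.
h = np.linspace(6, 18, 145)
poa_fixed = np.maximum(0.0, np.sin(np.pi * (h - 6) / 12))   # midday peak
poa_tracking = np.clip(3.0 * poa_fixed, 0.0, 1.0)           # broad plateau

healthy = 4.0 * poa_tracking   # kW, tracker following the sun
stuck = 4.0 * poa_fixed        # kW, tracker frozen flat

label_healthy = classify_day(healthy, poa_tracking, poa_fixed)
label_stuck = classify_day(stuck, poa_tracking, poa_fixed)
```

The "unclassified" branch mirrors the paper's observation that some days cannot be confidently labeled; on cloudy days both correlations degrade and the margin test fails.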

  5. Nuclear plant analyzer development and analysis applications

    International Nuclear Information System (INIS)

    Laats, E.T.

    1984-01-01

    The Nuclear Plant Analyzer (NPA) is being developed as the U.S. Nuclear Regulatory Commission's (NRC's) state-of-the-art safety analysis and engineering tool to address key nuclear plant safety issues. The NPA integrates the NRC's computerized reactor behavior simulation codes, such as RELAP5 and TRAC-BWR, with well-developed computer graphics programs and large repositories of reactor design and experimental data. Utilizing the complex reactor behavior codes as well as the experimental data repositories enables simulation applications of the NPA that are generally not possible with more simplistic, less mechanistic reactor behavior codes. These latter codes are used in training simulators or with other NPA-type software packages and are limited to displaying calculated data only. This paper describes four applications of the NPA in assisting reactor safety analyses. Two analyses evaluated reactor operating procedures during off-normal operation for a pressurized water reactor (PWR) and a boiling water reactor (BWR), respectively. The third analysis was performed in support of a reactor safety experiment conducted in the Semiscale facility. The final application demonstrated the usefulness of atmospheric dispersion computer codes for site emergency planning purposes. An overview of the NPA and how it supported these analyses are the topics of this paper

  6. temperature overspecification

    Directory of Open Access Journals (Sweden)

    Mehdi Dehghan

    2001-01-01

    Full Text Available Two different finite difference schemes for solving the two-dimensional parabolic inverse problem with temperature overspecification are considered. These schemes are developed for identifying the control parameter which produces, at any given time, a desired temperature distribution at a given point in the spatial domain. The numerical methods discussed are based on the (3,3) alternating direction implicit (ADI) finite difference scheme and the (3,9) alternating direction implicit formula. These schemes are unconditionally stable. The basis of analysis of the finite difference equations considered here is the modified equivalent partial differential equation approach, developed from the 1974 work of Warming and Hyett [17]. This allows direct and simple comparison of the errors associated with the equations, as well as providing a means to develop more accurate finite difference schemes. These schemes use less central processor time than the fully implicit schemes for two-dimensional diffusion with temperature overspecification. The alternating direction implicit schemes developed in this report use more CPU time than the fully explicit finite difference schemes, but their unconditional stability is significant. The results of numerical experiments are presented, and the accuracy and central processor (CPU) times needed for each of the methods are discussed. We also give error estimates in the maximum norm for each of these methods.
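    The paper's (3,3) and (3,9) schemes and the control-parameter identification are not reproduced here, but the ADI idea they build on, splitting the 2D implicit solve into two one-dimensional solves per step, can be sketched with a plain Peaceman-Rachford step for the forward diffusion problem (dense solves for clarity; production codes use tridiagonal solvers).

```python
import numpy as np

# Peaceman-Rachford ADI step for u_t = alpha * (u_xx + u_yy) on the unit
# square with homogeneous Dirichlet boundaries; u holds the n x n grid of
# interior values only.
def make_operators(n, alpha, dt, h):
    r = alpha * dt / (2.0 * h * h)
    lap = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
           + np.diag(np.ones(n - 1), -1))
    return np.eye(n) - r * lap, np.eye(n) + r * lap   # implicit A, explicit B

def adi_step(u, A, B):
    # Half step 1: implicit in x (left factor), explicit in y (right factor).
    u = np.linalg.solve(A, u @ B)
    # Half step 2: implicit in y, explicit in x (operators are symmetric).
    u = np.linalg.solve(A, (B @ u).T).T
    return u

# Validate against the exact decay of the first Fourier mode,
# u(x, y, t) = exp(-2 * pi**2 * alpha * t) * sin(pi x) * sin(pi y).
n, alpha, dt = 19, 1.0, 1e-3
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)
u = np.outer(np.sin(np.pi * x), np.sin(np.pi * x))
A, B = make_operators(n, alpha, dt, h)
for _ in range(100):                    # advance to t = 0.1
    u = adi_step(u, A, B)
exact_center = np.exp(-2.0 * np.pi ** 2 * 0.1)   # exact value at (0.5, 0.5)
```

Each half step only couples unknowns along one grid direction, which is what makes ADI cheaper per step than a fully implicit solve while retaining unconditional stability for this model problem.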

  7. Accelerated Testing of Polymeric Composites Using the Dynamic Mechanical Analyzer

    Science.gov (United States)

    Abdel-Magid, Becky M.; Gates, Thomas S.

    2000-01-01

    Creep properties of IM7/K3B composite material were obtained using three accelerated test methods at elevated temperatures. Results of flexural creep tests using the dynamic mechanical analyzer (DMA) were compared with results of conventional tensile and compression creep tests. The procedures of the three test methods are described and the results are presented. Despite minor differences in the time shift factor of the creep compliance curves, the DMA results compared favorably with the results from the tensile and compressive creep tests. Some insight is given into establishing correlations between creep compliance in flexure and creep compliance in tension and compression. It is shown that with careful consideration of the limitations of flexure creep, a viable and reliable accelerated test procedure can be developed using the DMA to obtain the viscoelastic properties of composites in extreme environments.

  8. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.

    2015-12-01

    Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for comprehensive and innovative evaluation of climate models, with the synergistic use of global satellite observations, in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based, multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and have implemented it as a web-service-based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), which is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service-based, cloud-enabled technology; (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution functions, conditional sampling, and time-lagged correlation maps.
We have implemented the
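    One of the listed diagnostics, the time-lagged correlation map, reduces at a single grid point to correlating two time series over a range of lags; a minimal sketch (synthetic series, not climate data) finds the lag at which the correlation peaks.

```python
import numpy as np

def lagged_correlation(x, y, max_lag):
    """Correlation of x with y at each lag in [-max_lag, max_lag].
    A positive lag means y is compared at later times than x."""
    lags = np.arange(-max_lag, max_lag + 1)
    out = []
    for lag in lags:
        if lag >= 0:
            a, b = x[:len(x) - lag], y[lag:]
        else:
            a, b = x[-lag:], y[:len(y) + lag]
        out.append(np.corrcoef(a, b)[0, 1])
    return lags, np.array(out)

# Synthetic check: y is x delayed by 7 samples, plus noise.
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = np.roll(x, 7) + 0.1 * rng.standard_normal(500)
lags, corrs = lagged_correlation(x, y, 15)
best_lag = lags[np.argmax(corrs)]
```

Applied at every grid point between a model field and an observed field, the lag of peak correlation (and the peak value itself) forms the map used to diagnose timing biases.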

  9. Analyzing wildfire exposure on Sardinia, Italy

    Science.gov (United States)

    Salis, Michele; Ager, Alan A.; Arca, Bachisio; Finney, Mark A.; Alcasena, Fermin; Bacciu, Valentina; Duce, Pierpaolo; Munoz Lozano, Olga; Spano, Donatella

    2014-05-01

    We used simulation modeling based on the minimum travel time (MTT) algorithm to analyze the wildfire exposure of key ecological, social and economic features on Sardinia, Italy. Sardinia is the second largest island of the Mediterranean Basin, and in the last fifty years has experienced large and dramatic wildfires, which caused losses and threatened urban interfaces, forests and natural areas, and agricultural production. Historical fire and environmental data for the period 1995-2009 were used as input to estimate fine-scale burn probability, conditional flame length, and potential fire size in the study area. For this purpose, we simulated 100,000 wildfire events within the study area, randomly drawing from the observed frequency distributions of burn periods and wind directions for each fire. Estimates of burn probability, excluding non-burnable fuels, ranged from 0 to 1.92×10⁻³, with a mean value of 6.48×10⁻⁵. Overall, the outputs provided a quantitative assessment of wildfire exposure at the landscape scale and captured landscape properties of wildfire exposure. We then examined how the exposure profiles varied among and within selected features and assets located on the island. Spatial variation in modeled outputs showed a strong effect of fuel models, coupled with slope and weather. In particular, the combined effect of Mediterranean maquis, woodland areas and complex topography on flame length was relevant, mainly in north-east Sardinia, whereas areas with herbaceous fuels and flat terrain were in general characterized by lower fire intensity but higher burn probability. The simulation modeling proposed in this work provides a quantitative approach to inform wildfire risk management activities, and represents one of the first applications of burn probability modeling to capture fire risk and exposure profiles in the Mediterranean Basin.
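    The burn-probability estimate is, at bottom, a per-pixel frequency over simulated fires; a toy sketch (random circular fires standing in for the weather- and fuel-driven MTT spread simulations, all parameters hypothetical) illustrates the bookkeeping.

```python
import numpy as np

rng = np.random.default_rng(7)
n_fires, size = 2000, 50
burned_count = np.zeros((size, size))

yy, xx = np.mgrid[0:size, 0:size]
for _ in range(n_fires):
    # Random ignition point and fire extent: crude stand-ins for the
    # simulated ignitions, burn periods, and wind-driven spread.
    cy, cx = rng.integers(0, size, 2)
    radius = rng.uniform(1.0, 6.0)
    burned_count += ((yy - cy) ** 2 + (xx - cx) ** 2) <= radius ** 2

# Per-pixel burn probability: the fraction of simulated fires that
# burned each cell.
burn_prob = burned_count / n_fires
```

In the real analysis each simulated perimeter comes from the MTT spread model, and companion grids accumulate flame length and fire size to give the conditional-flame-length and potential-fire-size outputs.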

  10. Modeling and Analyzing Academic Researcher Behavior

    Directory of Open Access Journals (Sweden)

    Phuc Huu Nguyen

    2016-12-01

    Full Text Available Abstract. This paper suggests a theoretical framework for analyzing the mechanism behind the behavior of academic researchers, whose interests are entangled and vary widely across academic factors (the intrinsic satisfaction of conducting research, the improvement of individual research ability, etc.) and non-academic factors (career rewards, financial rewards, etc.). Furthermore, each researcher also takes a different academic stance in his or her preferences regarding academic freedom and academic entrepreneurship. Understanding the behavior of academic researchers will contribute to nurturing young researchers, improving the standard of research and education, and boosting academia-industry collaboration. In particular, as open innovation increasingly requires the involvement of university researchers, to establish a successful approach to enticing researchers into enterprises' research, companies must comprehend the behavior of university researchers, who have multiple complex motivations. The paper explores academic researchers' behavior by optimizing their utility functions, i.e., the satisfaction obtained from their research outputs. This paper characterizes these outputs as the results of researchers' 3C: Competence (the ability to implement the research), Commitment (the effort put into the research), and Contribution (finding meaning in the research). Most previous research utilized empirical methods to study researchers' motivation. Without adopting economic theory into the analysis, the past literature could not offer a deeper understanding of researchers' behavior. Our contribution is important both conceptually and practically because it provides the first theoretical framework to study the mechanism of researchers' behavior. Keywords: Academia-Industry, researcher behavior, Ulrich model's 3C.

  11. Update on the USNRC's Nuclear Plant Analyzer

    International Nuclear Information System (INIS)

    Laats, E.T.

    1987-01-01

    The Nuclear Plant Analyzer (NPA) is the US Nuclear Regulatory Commission's (NRC's) state-of-the-art nuclear reactor simulation capability. This computer software package integrates high fidelity nuclear reactor simulation codes such as the TRAC and RELAP5 series of codes with color graphics display techniques and advanced workstation hardware. An overview of this program was given at the 1984 Summer Computer Simulation Conference (SCSC), with selected topics discussed at the 1985 and 1986 SCSCs. Since the 1984 presentation, major redirections of this NRC program have been taken. The original NPA system was developed for operation on a Control Data Corporation CYBER 176 computer, technology that is some 10 to 15 years old. The NPA system has recently been implemented on Class VI computers to gain increased computational capabilities, and is now being implemented on super-minicomputers for use by the scientific community and possibly by the commercial nuclear power plant simulator community. This paper addresses these activities and related experiences. First, the Class VI computer implementation is discussed. The trade-offs between gaining significantly greater computational speed and central memory, with the loss of performance due to many more simultaneous users is shown. Second, the goal of the super-minicomputer implementation is to produce a very cost-effective system that utilizes advanced (multi-dimensional, two-phase coolant) simulation capabilities at real wall-clock simulation times. Benchmarking of the initial super-minicomputer implementation is discussed. Finally, the technical and economic feasibility is addressed for implementing the super-minicomputer version of the NPA with the RELAP5 simulation code onto the Black Fox full scope nuclear power plant simulator

  12. Analyzing personalized policies for online biometric verification.

    Science.gov (United States)

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
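The likelihood-ratio decision rule described above can be sketched under a simplifying assumption of independent Gaussian score models; the paper fits a richer joint distribution over all 12 similarity scores, and all parameters below are hypothetical:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a normal distribution."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(scores, genuine_params, imposter_params):
    """Product of per-score likelihood ratios; independence across
    scores is a simplifying assumption (the paper fits a joint model)."""
    lr = 1.0
    for s, (mu_g, sd_g), (mu_i, sd_i) in zip(scores, genuine_params, imposter_params):
        lr *= gaussian_pdf(s, mu_g, sd_g) / gaussian_pdf(s, mu_i, sd_i)
    return lr

def verify(scores, genuine_params, imposter_params, threshold=1.0):
    """Accept the resident as genuine when the likelihood ratio
    exceeds the threshold (which trades FRR against FAR)."""
    return likelihood_ratio(scores, genuine_params, imposter_params) > threshold

# Hypothetical (mean, sd) score models for two acquired biometrics:
genuine = [(0.8, 0.1), (0.8, 0.1)]
imposter = [(0.3, 0.1), (0.3, 0.1)]
print(verify([0.75, 0.85], genuine, imposter))  # True: scores match the genuine model
```

Raising the threshold lowers the false accept rate at the cost of a higher false reject rate, which is the trade-off the paper's individualized policies optimize.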

  13. Novel topological descriptors for analyzing biological networks

    Directory of Open Access Journals (Sweden)

    Varmuza Kurt K

    2010-06-01

    Full Text Available Abstract Background Topological descriptors, other graph measures, and in a broader sense, graph-theoretical methods, have been proven as powerful tools to perform biological network analysis. However, the majority of the developed descriptors and graph-theoretical methods does not have the ability to take vertex- and edge-labels into account, e.g., atom- and bond-types when considering molecular graphs. Indeed, this feature is important to characterize biological networks more meaningfully instead of only considering pure topological information. Results In this paper, we put the emphasis on analyzing a special type of biological networks, namely bio-chemical structures. First, we derive entropic measures to calculate the information content of vertex- and edge-labeled graphs and investigate some useful properties thereof. Second, we apply the mentioned measures combined with other well-known descriptors to supervised machine learning methods for predicting Ames mutagenicity. Moreover, we investigate the influence of our topological descriptors - measures for only unlabeled vs. measures for labeled graphs - on the prediction performance of the underlying graph classification problem. Conclusions Our study demonstrates that the application of entropic measures to molecules representing graphs is useful to characterize such structures meaningfully. For instance, we have found that if one extends the measures for determining the structural information content of unlabeled graphs to labeled graphs, the uniqueness of the resulting indices is higher. Because measures to structurally characterize labeled graphs are clearly underrepresented so far, the further development of such methods might be valuable and fruitful for solving problems within biological network analysis.
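As a minimal illustration of an entropic measure on a labeled graph, the sketch below computes the Shannon entropy of a vertex-label (atom-type) distribution; the paper's information functionals on vertex- and edge-labeled graphs are considerably more elaborate:

```python
import math
from collections import Counter

def label_entropy(labels):
    """Shannon entropy (bits) of a vertex- or edge-label distribution.
    A toy stand-in for the paper's entropic measures, which are defined
    via information functionals over labeled graphs."""
    counts = Counter(labels)
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Vertex labels (atom types) of a hypothetical molecular graph:
atoms = ["C", "C", "C", "O", "N", "C"]
print(round(label_entropy(atoms), 3))  # higher than for an all-carbon graph
```

An unlabeled treatment of the same graph would assign every vertex the same label and thus zero entropy, which is why extending such measures to labeled graphs increases the uniqueness of the resulting indices.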

  14. Thromboelastography platelet mapping in healthy dogs using 1 analyzer versus 2 analyzers.

    Science.gov (United States)

    Blois, Shauna L; Banerjee, Amrita; Wood, R Darren; Park, Fiona M

    2013-07-01

    The objective of this study was to describe the results of thromboelastography platelet mapping (TEG-PM) carried out using 2 techniques in 20 healthy dogs. Maximum amplitudes (MA) generated by thrombin (MAthrombin), fibrin (MAfibrin), adenosine diphosphate (ADP) receptor activity (MAADP), and thromboxane A2 (TxA2) receptor activity (stimulated by arachidonic acid, MAAA) were recorded. Thromboelastography platelet mapping was carried out according to the manufacturer's guidelines (2-analyzer technique) and using a variation of this method employing only 1 analyzer (1-analyzer technique) on 2 separate blood samples obtained from each dog. Mean [± standard deviation (SD)] MA values for the 1-analyzer/2-analyzer techniques were: MAthrombin = 51.9 mm (± 7.1)/52.5 mm (± 8.0); MAfibrin = 20.7 mm (± 21.8)/23.0 mm (± 26.1); MAADP = 44.5 mm (± 15.6)/45.6 mm (± 17.0); and MAAA = 45.7 mm (± 11.6)/45.0 mm (± 15.4). Mean (± SD) percentage aggregation due to ADP receptor activity was 70.4% (± 32.8)/67.6% (± 33.7). Mean percentage aggregation due to TxA2 receptor activity was 77.3% (± 31.6)/78.1% (± 50.2). Results of TEG-PM were not significantly different for the 1-analyzer and 2-analyzer methods. High correlation was found between the 2 methods for MAfibrin [concordance correlation coefficient (r) = 0.930]; moderate correlation was found for MAthrombin (r = 0.70) and MAADP (r = 0.57); correlation between the 2 methods for MAAA was lower (r = 0.32). Thromboelastography platelet mapping (TEG-PM) should be further investigated to determine if it is a suitable method for measuring platelet dysfunction in dogs with thrombopathy.
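The concordance correlation coefficient used to compare the two techniques can be computed as in this sketch (Lin's CCC); the four paired mean MA values below are taken from the abstract purely for illustration, not as a reanalysis of the study's per-dog data:

```python
def concordance_cc(x, y):
    """Lin's concordance correlation coefficient between paired
    measurements from two methods. Unlike Pearson's r, it penalizes
    both location and scale shifts between the methods."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Mean MA values (mm) from the abstract: 1-analyzer vs 2-analyzer technique.
one_analyzer = [51.9, 20.7, 44.5, 45.7]
two_analyzer = [52.5, 23.0, 45.6, 45.0]
print(round(concordance_cc(one_analyzer, two_analyzer), 3))
```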

  15. Martian ionosphere as observed by the Viking retarding potential analyzers

    International Nuclear Information System (INIS)

    Hanson, W.B.; Sanatani, S.; Zuccaro, D.R.

    1977-01-01

    The retarding potential analyzers on the Viking landers obtained the first in situ measurements of ions from another planetary ionosphere. Mars has an F1 ionosphere layer with a peak ion concentration of approximately 10^5 cm^-3 just below 130 km altitude, of which approximately 90% are O2+ and 10% CO2+. At higher altitudes, O+ ions were detected with a peak concentration of less than 10^3 cm^-3 near 225 km. Viking 1 measured ion temperatures of approximately 150 K near the F1 peak, increasing to an apparent exospheric temperature of 210 K near 175 km. Above this altitude, departures from thermal equilibrium with the neutral gas occur, and the ion temperature Ti increases rapidly to >1000 K at 250 km. An equatorward horizontal ion velocity of the order of 100-200 m/s was observed near 200 km and near the F1 peak, with a minimum velocity at intermediate heights. Both landers entered the F1 layer at a solar zenith angle near 44°, though the local times of the Viking 1 and 2 entries were 16:13 and 9:49 LT, respectively. On Viking 2, considerably more structure was observed in the height profiles of ionospheric quantities, although they were similar in shape to the Viking 1 profiles.

  16. Analyzing the attributes of Indiana's STEM schools

    Science.gov (United States)

    Eltz, Jeremy

    "Primary and secondary schools do not seem able to produce enough students with the interest, motivation, knowledge, and skills they will need to compete and prosper in the emerging world" (National Academy of Sciences [NAS], 2007a, p. 94). This quote indicated that there are changing expectations for today's students which have ultimately led to new models of education, such as charters, online and blended programs, career and technical centers, and for the purposes of this research, STEM schools. STEM education as defined in this study is a non-traditional model of teaching and learning intended to "equip them [students] with critical thinking, problem solving, creative and collaborative skills, and ultimately establishes connections between the school, work place, community and the global economy" (Science Foundation Arizona, 2014, p. 1). Focusing on science, technology, engineering, and math (STEM) education is believed by many educational stakeholders to be the solution for the deficits many students hold as they move on to college and careers. The National Governors Association (NGA; 2011) believes that building STEM skills in the nation's students will lead to the ability to compete globally with a new workforce that has the capacity to innovate and will in turn spur economic growth. In order to accomplish the STEM model of education, a group of educators and business leaders from Indiana developed a comprehensive plan for STEM education as an option for schools to use in order to close this gap. This plan has been promoted by the Indiana Department of Education (IDOE, 2014a) with the goal of increasing STEM schools throughout Indiana. To determine what Indiana's elementary STEM schools are doing, this study analyzed two of the elementary schools that were certified STEM by the IDOE. This qualitative case study described the findings and themes from two elementary STEM schools. Specifically, the research looked at the vital components to accomplish STEM

  17. analyzers in overweight/obese renal patients

    Directory of Open Access Journals (Sweden)

    Mariusz Kusztal

    2015-05-01

    Full Text Available Bioelectrical impedance analysis (BIA) is an affordable, non-invasive and fast alternative method to assess body composition. The purpose of this study was to compare two different tetrapolar BIA devices for estimating body fluid volumes and body cell mass (BCM) in a clinical setting among patients with kidney failure. All double measurements were performed by multi-frequency (MF) and single-frequency (SF) BIA analyzers: a Body Composition Monitor (Fresenius Medical Care, Germany) and BIA-101 (Akern, Italy), respectively. All procedures were conducted according to the manufacturers' instructions (dedicated electrodes, measurement sites, positions, etc.). Total body water (TBW), extracellular water (ECW), intracellular water (ICW) and BCM were compared. The study included 39 chronic kidney disease patients (stage III-V) with a mean age of 45.8 ± 8 years (21 men and 18 women) who had a wide range of BMI [17-34 kg/m2 (mean 26.6 ± 5)]. A comparison of results from patients with BMI <25 vs ≥25 revealed a significant discrepancy in measurements between the two BIA devices. Namely, in the group with BMI <25 (n=16), acceptable correlations were obtained in TBW (r = 0.99; p<0.01), ICW (0.92; p<0.01), BCM (0.68; p<0.01), and ECW (0.96; p<0.05), but those with BMI ≥25 (n=23) showed a discrepancy (lower correlations) in TBW (r = 0.82; p<0.05), ICW (0.78; p<0.05), BCM (0.52; p<0.05), and ECW (0.76; p<0.01). Since estimates of TBW, ICW and BCM by the present BIA devices do not differ in patients with BMI <25, they might be interchangeable. This does not hold true for overweight/obese renal patients.

  18. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2014-12-01

    We have developed a cloud-enabled web-service system that empowers physics-based, multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks. The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the observational datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation, (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs, and (3) ECMWF reanalysis outputs for several environmental variables in order to supplement observational datasets. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, (4) the calculation of difference between two variables, and (5) the conditional sampling of one physical variable with respect to another variable. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA will be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. In order to support 30+ simultaneous users during the school, we have deployed CMDA to the Amazon cloud environment. The cloud-enabled CMDA will provide each student with a virtual machine while the user interaction with the system will remain the same
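The wrapper pattern described above, turning an existing science routine into a web service, might look like this minimal stdlib WSGI sketch; `seasonal_mean` and the query format are hypothetical stand-ins for CMDA's actual analysis codes and web-service frameworks:

```python
import json
from urllib.parse import parse_qs

def seasonal_mean(values, season_length=3):
    """Hypothetical stand-in for a science routine: means over
    consecutive non-overlapping seasons."""
    return [sum(values[i:i + season_length]) / season_length
            for i in range(0, len(values) - season_length + 1, season_length)]

def application(environ, start_response):
    """WSGI callable exposing the routine as a JSON web service."""
    qs = parse_qs(environ.get("QUERY_STRING", ""))
    values = [float(v) for v in qs.get("v", [])]
    body = json.dumps({"seasonal_mean": seasonal_mean(values)}).encode()
    start_response("200 OK", [("Content-Type", "application/json")])
    return [body]

# To serve locally:
#   from wsgiref.simple_server import make_server
#   make_server("", 8000, application).serve_forever()
```

Because the browser only talks to such an HTTP endpoint, the same interface works whether the backend runs on a local machine or, as for the summer school, on per-student virtual machines in the Amazon cloud.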

  19. The Albuquerque Seismological Laboratory Data Quality Analyzer

    Science.gov (United States)

    Ringler, A. T.; Hagerty, M.; Holland, J.; Gee, L. S.; Wilson, D.

    2013-12-01

    The U.S. Geological Survey's Albuquerque Seismological Laboratory (ASL) has several efforts underway to improve data quality at its stations. The Data Quality Analyzer (DQA) is one such development. The DQA is designed to characterize station data quality in a quantitative and automated manner. Station quality is based on the evaluation of various metrics, such as timing quality, noise levels, sensor coherence, and so on. These metrics are aggregated into a measurable grade for each station. The DQA consists of a website, a metric calculator (Seedscan), and a PostgreSQL database. The website allows the user to make requests for various time periods, review specific networks and stations, adjust weighting of the station's grade, and plot metrics as a function of time. The website dynamically loads all station data from a PostgreSQL database. The database is central to the application; it acts as a hub where metric values and limited station descriptions are stored. Data is stored at the level of one sensor's channel per day. The database is populated by Seedscan. Seedscan reads and processes miniSEED data, to generate metric values. Seedscan, written in Java, compares hashes of metadata and data to detect changes and perform subsequent recalculations. This ensures that the metric values are up to date and accurate. Seedscan can be run in a scheduled task or on demand by way of a config file. It will compute metrics specified in its configuration file. While many metrics are currently in development, some are completed and being actively used. These include: availability, timing quality, gap count, deviation from the New Low Noise Model, deviation from a station's noise baseline, inter-sensor coherence, and data-synthetic fits. In all, 20 metrics are planned, but any number could be added. ASL is actively using the DQA on a daily basis for station diagnostics and evaluation. 
As Seedscan is scheduled to run every night, data quality analysts are able to then use the
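The aggregation of per-metric values into an adjustable, weighted station grade, as the DQA website allows, could be sketched as follows; the metric names, scores, and weights are hypothetical:

```python
def station_grade(metrics, weights):
    """Aggregate per-metric scores (0-100) into one station grade
    using user-adjustable weights. A sketch of the DQA's weighting
    idea, not its actual formula."""
    total_w = sum(weights[name] for name in metrics)
    return sum(metrics[name] * weights[name] for name in metrics) / total_w

metrics = {"availability": 99.0, "timing_quality": 92.0, "gap_count": 80.0}
weights = {"availability": 2.0, "timing_quality": 1.0, "gap_count": 1.0}
print(round(station_grade(metrics, weights), 2))  # 92.5
```

Re-weighting lets an analyst emphasize, say, timing quality over availability without recomputing any metric, since Seedscan stores metric values per channel per day in the database.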

  20. Technology for collecting and analyzing relational data

    Directory of Open Access Journals (Sweden)

    E. N. Fedorova

    2016-01-01

    summarize the information, there is a data-grouping mechanism, which provides, for different groups of records, the number of entries and the maximum, minimum, and average values. Results. This technology has been tested in monitoring the requirements for additional professional education services and in defining the educational needs of teachers and executives of educational organizations of the Irkutsk region. The survey involved 2,780 respondents in 36 municipalities. Creating the data model took several hours. The survey was conducted over one month. Conclusion. The proposed technology makes it possible to collect information in relational form in a short time and then analyze it without the need for programming, with flexible assignment of the form's operating logic.
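The grouping mechanism described above (per-group count, maximum, minimum, and average) can be reproduced with a stdlib in-memory relational database; the table layout and survey values below are invented for illustration:

```python
import sqlite3

# A minimal sketch of the grouping mechanism, using an in-memory
# table of hypothetical survey responses.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE responses (municipality TEXT, score REAL)")
con.executemany("INSERT INTO responses VALUES (?, ?)",
                [("Irkutsk", 4.0), ("Irkutsk", 5.0), ("Angarsk", 3.0)])

# Per-group number of entries, maximum, minimum and average values:
rows = con.execute("""
    SELECT municipality, COUNT(*), MAX(score), MIN(score), AVG(score)
    FROM responses GROUP BY municipality ORDER BY municipality
""").fetchall()
for row in rows:
    print(row)
```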

  1. Analyzers Measure Greenhouse Gases, Airborne Pollutants

    Science.gov (United States)

    2012-01-01

    In complete darkness, a NASA observatory waits. When an eruption of boiling water billows from a nearby crack in the ground, the observatory's sensors seek particles in the fluid, measure shifts in carbon isotopes, and analyze samples for biological signatures. NASA has landed the observatory in this remote location, far removed from air and sunlight, to find life unlike any that scientists have ever seen. It might sound like a scene from a distant planet, but this NASA mission is actually exploring an ocean floor right here on Earth. NASA established a formal exobiology program in 1960, which expanded into the present-day Astrobiology Program. The program, which celebrated its 50th anniversary in 2010, not only explores the possibility of life elsewhere in the universe, but also examines how life begins and evolves, and what the future may hold for life on Earth and other planets. Answers to these questions may be found not only by launching rockets skyward, but by sending probes in the opposite direction. Research here on Earth can revise prevailing concepts of life and biochemistry and point to the possibilities for life on other planets, as was demonstrated in December 2010, when NASA researchers discovered microbes in Mono Lake in California that subsist and reproduce using arsenic, a toxic chemical. The Mono Lake discovery may be the first of many that could reveal possible models for extraterrestrial life. One primary area of interest for NASA astrobiologists lies with the hydrothermal vents on the ocean floor. These vents expel jets of water heated and enriched with chemicals from off-gassing magma below the Earth's crust. Also potentially within the vents: microbes that, like the Mono Lake microorganisms, defy the common characteristics of life on Earth. "Basically all organisms on our planet generate energy through the Krebs Cycle," explains Mike Flynn, research scientist at NASA's Ames Research Center. 
This metabolic process breaks down sugars for energy

  2. Temperature oscillations at critical temperature in two-phase flow

    International Nuclear Information System (INIS)

    Brevi, R.; Cumo, M.; Palmieri, A.; Pitimada, D.

    Some experiments are described on the temperature oscillations, or thermal cycling, which occur with steam-water flow in once-through cooling systems in the critical temperature zone, i.e., when dryout occurs. A theoretical analysis is given of the characteristic frequency of the oscillations and of its dependence on the operating characteristics and the physical properties of the fluid. Finally, the temperature distribution in the critical zone is analyzed, examining the thermal transitions that occur due to the rapid variations in the heat transfer coefficient.

  3. Determination of enthalpy, temperature, surface tension and geometry of the material transfer in PGMAW for the system argon–iron

    International Nuclear Information System (INIS)

    Siewert, E; Schein, J; Forster, G

    2013-01-01

    The metal transfer is a fundamental process in gas metal arc welding, which substantially determines the shape of the weld seam and strongly influences arc formation and stability. In this investigation the material transfer from the wire electrode (anode) to the workpiece (cathode) is analysed experimentally with high accuracy using various innovative diagnostic techniques for a pulsed gas metal arc welding (PGMAW) process. A high-speed two-colour pyrometer, a calorimeter, thermocouples, a stereo optical setup and a droplet oscillation technique are used to analyse a precisely defined PGMAW process. Thus, results obtained are verified by different measurement techniques and enable a comprehensive description of the material transfer procedure. The surface temperature of both electrodes as well as the droplet temperature, enthalpy and surface tension were determined. Furthermore, the geometry of the arc, wire, droplets and weld pool were extracted in three dimensions in order to describe the interaction between the material transfer and the formation of the weld seam. The experiments are performed using argon as shielding gas and pure iron as filler and base material to reduce complex chemical processes. It turned out that the wire feed rate has the biggest influence on droplet temperature and detachment. A correlation between weld pool formation and weld pool surface temperature gradient was observed, which is mainly a function of welding speed and wire feed rate. The experimental results obtained provide a detailed data pool for use in modelling. (paper)
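Two-colour (ratio) pyrometry of the kind mentioned above is conventionally based on Wien's approximation with a grey-body assumption, so that emissivity cancels in the intensity ratio. The sketch below is a textbook round-trip check at a droplet-like temperature, not the instrument's actual calibration; the wavelengths are hypothetical:

```python
import math

C2 = 1.4388e-2  # second radiation constant, m*K

def wien_intensity(lam, t):
    """Wien-approximation spectral intensity (arbitrary units)."""
    return lam ** -5 * math.exp(-C2 / (lam * t))

def ratio_temperature(i1, i2, lam1, lam2):
    """Surface temperature from the intensity ratio at two wavelengths,
    assuming a grey body (equal emissivity at both wavelengths)."""
    ln_r = math.log(i1 / i2)
    return C2 * (1.0 / lam2 - 1.0 / lam1) / (ln_r - 5.0 * math.log(lam2 / lam1))

# Round trip at 2500 K with hypothetical pyrometer wavelengths (m):
lam1, lam2 = 0.8e-6, 0.9e-6
i1, i2 = wien_intensity(lam1, 2500.0), wien_intensity(lam2, 2500.0)
print(round(ratio_temperature(i1, i2, lam1, lam2)))  # recovers 2500
```

Because the emissivity cancels, such a ratio measurement is comparatively insensitive to the changing surface condition of the molten droplet, which is one reason two-colour pyrometers suit this application.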

  4. Residential Indoor Temperature Study

    Energy Technology Data Exchange (ETDEWEB)

    Booten, Chuck [National Renewable Energy Lab. (NREL), Golden, CO (United States); Robertson, Joseph [National Renewable Energy Lab. (NREL), Golden, CO (United States); Christensen, Dane [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heaney, Mike [Arrow Electronics, Centennial, CO (United States); Brown, David [Univ. of Virginia, Charlottesville, VA (United States); Norton, Paul [Norton Energy Research and Development, Boulder, CO (United States); Smith, Chris [Ingersoll-Rand Corp., Dublin (Ireland)

    2017-04-07

    In this study, we are adding to the body of knowledge around answering the question: What are good assumptions for HVAC set points in U.S. homes? We collected and analyzed indoor temperature data from US homes using funding from the U.S. Department of Energy's Building America (BA) program, due to the program's reliance on accurate energy simulation of homes. Simulations are used to set Building America goals, predict the impact of new building techniques and technologies, inform research objectives, evaluate home performance, optimize efficiency packages to meet savings goals, customize savings approaches to specific climate zones, and myriad other uses.

  5. The Decompositioning of Volatile-Matter of Tanjung Enim Coal by using Thermogravimetry Analyzer (TGA

    Directory of Open Access Journals (Sweden)

    Nukman Nukman

    2010-10-01

    Full Text Available Coal is a natural material and a kind of energy source. The decomposition of coal can be analyzed by heat treatment using a thermogravimetric analyzer. In this way, the decomposition of the volatile matter for three kinds of Tanjung Enim coal could be determined. The activation energy values found differ; for Semi-Anthracite, Bituminous, and Sub-Bituminous coal, the initial temperatures are 60.8 °C, 70.7 °C, and 97.8 °C, and the final temperatures are 893.8 °C, 832 °C, and 584.6 °C, respectively.
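Activation energies like those reported are commonly extracted from TGA data via an Arrhenius fit of rate constants against inverse temperature. The sketch below recovers a known activation energy from synthetic rate constants; all values are hypothetical and not the study's data:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def activation_energy(temps_k, rates):
    """Least-squares Arrhenius fit ln(k) = ln(A) - Ea/(R*T);
    returns Ea in J/mol."""
    xs = [1.0 / t for t in temps_k]
    ys = [math.log(k) for k in rates]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope * R

# Synthetic devolatilization rate constants with Ea = 100 kJ/mol:
Ea_true, A = 1.0e5, 1e8
temps = [500.0, 600.0, 700.0, 800.0]
rates = [A * math.exp(-Ea_true / (R * t)) for t in temps]
print(round(activation_energy(temps, rates)))  # recovers ~100 kJ/mol
```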

  6. Noise and analyzer-crystal angular position analysis for analyzer-based phase-contrast imaging

    Science.gov (United States)

    Majidi, Keivan; Li, Jun; Muehleman, Carol; Brankov, Jovan G.

    2014-04-01

    The analyzer-based phase-contrast x-ray imaging (ABI) method is emerging as a potential alternative to conventional radiography. Like many of the modern imaging techniques, ABI is a computed imaging method (meaning that images are calculated from raw data). ABI can simultaneously generate a number of planar parametric images containing information about absorption, refraction, and scattering properties of an object. These images are estimated from raw data acquired by measuring (sampling) the angular intensity profile of the x-ray beam passed through the object at different angular positions of the analyzer crystal. The noise in the estimated ABI parametric images depends upon imaging conditions like the source intensity (flux), measurements angular positions, object properties, and the estimation method. In this paper, we use the Cramér-Rao lower bound (CRLB) to quantify the noise properties in parametric images and to investigate the effect of source intensity, different analyzer-crystal angular positions and object properties on this bound, assuming a fixed radiation dose delivered to an object. The CRLB is the minimum bound for the variance of an unbiased estimator and defines the best noise performance that one can obtain regardless of which estimation method is used to estimate ABI parametric images. The main result of this paper is that the variance (hence the noise) in parametric images is directly proportional to the source intensity and only a limited number of analyzer-crystal angular measurements (eleven for uniform and three for optimal non-uniform) are required to get the best parametric images. The following angular measurements only spread the total dose to the measurements without improving or worsening CRLB, but the added measurements may improve parametric images by reducing estimation bias. Next, using CRLB we evaluate the multiple-image radiography, diffraction enhanced imaging and scatter diffraction enhanced imaging estimation techniques
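For a single-parameter analogue of the paper's bound, the CRLB is the inverse of the Fisher information. The sketch below bounds the variance of an amplitude estimate for a Gaussian-shaped rocking curve sampled at a few analyzer-crystal angles; the profile shape and i.i.d. Gaussian noise model are simplifying assumptions, not the paper's multi-parameter formulation:

```python
import math

def crlb_amplitude(angles, sigma_noise, width):
    """CRLB for estimating the amplitude a of a model
    y_j = a * exp(-0.5*(theta_j/width)^2) + noise, with i.i.d.
    Gaussian noise of standard deviation sigma_noise. The Fisher
    information is sum(f_j^2)/sigma^2 and the CRLB is its inverse."""
    profile = [math.exp(-0.5 * (a / width) ** 2) for a in angles]
    fisher = sum(p * p for p in profile) / sigma_noise ** 2
    return 1.0 / fisher

# More angular samples near the peak tighten the bound:
few = crlb_amplitude([-1.0, 0.0, 1.0], sigma_noise=0.1, width=1.0)
many = crlb_amplitude([-1.0, -0.5, 0.0, 0.5, 1.0], sigma_noise=0.1, width=1.0)
print(many < few)  # True
```

In this toy setting extra samples always help because each carries fresh noise-free signal energy; under the paper's fixed-dose constraint, by contrast, added measurements dilute the dose, which is why only a limited number of angular positions is needed.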

  7. Noise and analyzer-crystal angular position analysis for analyzer-based phase-contrast imaging

    International Nuclear Information System (INIS)

    Majidi, Keivan; Brankov, Jovan G; Li, Jun; Muehleman, Carol

    2014-01-01

    The analyzer-based phase-contrast x-ray imaging (ABI) method is emerging as a potential alternative to conventional radiography. Like many of the modern imaging techniques, ABI is a computed imaging method (meaning that images are calculated from raw data). ABI can simultaneously generate a number of planar parametric images containing information about absorption, refraction, and scattering properties of an object. These images are estimated from raw data acquired by measuring (sampling) the angular intensity profile of the x-ray beam passed through the object at different angular positions of the analyzer crystal. The noise in the estimated ABI parametric images depends upon imaging conditions like the source intensity (flux), measurements angular positions, object properties, and the estimation method. In this paper, we use the Cramér–Rao lower bound (CRLB) to quantify the noise properties in parametric images and to investigate the effect of source intensity, different analyzer-crystal angular positions and object properties on this bound, assuming a fixed radiation dose delivered to an object. The CRLB is the minimum bound for the variance of an unbiased estimator and defines the best noise performance that one can obtain regardless of which estimation method is used to estimate ABI parametric images. The main result of this paper is that the variance (hence the noise) in parametric images is directly proportional to the source intensity and only a limited number of analyzer-crystal angular measurements (eleven for uniform and three for optimal non-uniform) are required to get the best parametric images. The following angular measurements only spread the total dose to the measurements without improving or worsening CRLB, but the added measurements may improve parametric images by reducing estimation bias. Next, using CRLB we evaluate the multiple-image radiography, diffraction enhanced imaging and scatter diffraction enhanced imaging estimation techniques

  8. Comparative evaluation of Plateletworks, Multiplate analyzer and Platelet function analyzer-200 in cardiology patients.

    Science.gov (United States)

    Kim, Jeeyong; Cho, Chi Hyun; Jung, Bo Kyeung; Nam, Jeonghun; Seo, Hong Seog; Shin, Sehyun; Lim, Chae Seung

    2018-04-14

    The objective of this study was to comparatively evaluate three commercial whole-blood platelet function analyzer systems: Platelet Function Analyzer-200 (PFA; Siemens Canada, Mississauga, Ontario, Canada), Multiplate analyzer (MP; Roche Diagnostics International Ltd., Rotkreuz, Switzerland), and Plateletworks Combo-25 kit (PLW; Helena Laboratories, Beaumont, TX, USA). Venipuncture was performed on 160 patients who visited a department of cardiology. Pairwise agreement among the three platelet function assays was assessed using Cohen's kappa coefficient and percent agreement within the reference limit. Kappa values with the same agonists were poor between PFA-collagen (COL; agonist)/adenosine diphosphate (ADP) and MP-ADP (-0.147), PFA-COL/ADP and PLW-ADP (0.089), MP-ADP and PLW-ADP (0.039), PFA-COL/ADP and MP-COL (-0.039), and between PFA-COL/ADP and PLW-COL (-0.067). Nonetheless, kappa values for the same assay principle with a different agonist were slightly higher between PFA-COL/ADP and PFA-COL/EPI (0.352), MP-ADP and MP-COL (0.235), and between PLW-ADP and PLW-COL (0.247). The range of percent agreement values was 38.7% to 73.8%. Therefore, various measurements of platelet function by more than one method were needed to obtain a reliable interpretation of platelet function considering low kappa coefficient and modest percent agreement rates among 3 different platelet function tests.
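Cohen's kappa for pairwise agreement between two dichotomized test results can be computed as below; the ratings are invented, and the degenerate case of perfect chance agreement (pe = 1) is not handled:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two paired categorical ratings: observed
    agreement corrected for the agreement expected by chance."""
    n = len(a)
    po = sum(1 for x, y in zip(a, b) if x == y) / n
    categories = set(a) | set(b)
    pe = sum((a.count(c) / n) * (b.count(c) / n) for c in categories)
    return (po - pe) / (1 - pe)

# Hypothetical normal (0) / abnormal (1) calls from two assays:
rater1 = [1, 1, 0, 0, 1, 0, 1, 0]
rater2 = [1, 0, 0, 0, 1, 0, 1, 1]
print(round(cohens_kappa(rater1, rater2), 3))  # 0.5
```

Kappa can be negative (worse than chance), which is why values such as the -0.147 between PFA-COL/ADP and MP-ADP indicate essentially no agreement despite non-trivial raw percent agreement.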

  9. AN EXPERIMENTAL STUDY OF CUTTING FLUID EFFECTS IN DRILLING. (R825370C057)

    Science.gov (United States)

    Experiments were designed and conducted on aluminum alloys and gray cast iron to determine the function of cutting fluid in drilling. The variables examined included speed, feed, hole depth, tool and workpiece material, cutting fluid condition, workpiece temperatures and drill...

  10. Borehole Stability in High-Temperature Formations

    Science.gov (United States)

    Yan, Chuanliang; Deng, Jingen; Yu, Baohua; Li, Wenliang; Chen, Zijian; Hu, Lianbo; Li, Yang

    2014-11-01

    In oil and gas drilling or geothermal well drilling, the temperature difference between the drilling fluid and formation will lead to an apparent temperature change around the borehole, which will influence the stress state around the borehole and tend to cause borehole instability in high geothermal gradient formations. The thermal effect is usually not considered as a factor in most of the conventional borehole stability models. In this research, in order to solve the borehole instability in high-temperature formations, a calculation model of the temperature field around the borehole during drilling is established. The effects of drilling fluid circulation, drilling fluid density, and mud displacement on the temperature field are analyzed. Besides these effects, the effect of temperature change on the stress around the borehole is analyzed based on thermoelasticity theory. In addition, the relationships between temperature and strength of four types of rocks are respectively established based on experimental results, and thermal expansion coefficients are also tested. On this basis, a borehole stability model is established considering thermal effects and the effect of temperature change on borehole stability is also analyzed. The results show that the fracture pressure and collapse pressure will both increase as the temperature of borehole rises, and vice versa. The fracture pressure is more sensitive to temperature. Temperature has different effects on collapse pressures due to different lithological characters; however, the variation of fracture pressure is unrelated to lithology. The research results can provide a reference for the design of drilling fluid density in high-temperature wells.
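A basic building block of such a thermoelastic model is the change in hoop stress at the borehole wall caused by a temperature change. The one-line formula below is standard linear thermoelasticity for a constrained wall, with hypothetical rock properties; the paper's full model couples this to the in-situ stress state and temperature-dependent strength:

```python
def thermal_hoop_stress(alpha, e_young, nu, delta_t):
    """Change in tangential (hoop) stress at the borehole wall for a
    wall-temperature change delta_t, from linear thermoelasticity:
    sigma = alpha * E * delta_t / (1 - nu)."""
    return alpha * e_young * delta_t / (1.0 - nu)

# Cooling the wall by 30 K (drilling fluid colder than the formation):
stress = thermal_hoop_stress(alpha=1.0e-5, e_young=20e9, nu=0.25, delta_t=-30.0)
print(stress / 1e6, "MPa")  # negative: cooling relaxes compressive hoop stress
```

The sign matches the abstract's conclusion: cooling lowers the hoop stress and hence the fracture pressure, while heating raises both.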

  11. Calculation of the electrical parameters of induction heating coils in two-dimensional axisymmetric geometry

    Energy Technology Data Exchange (ETDEWEB)

    Nerg, J.; Partanen, J. [Lappeenranta University of Technology (Finland). Department of Energy Technology, Laboratory of Electrical Engineering

    1997-12-31

    The effect of the workpiece temperature on the electrical parameters of a plane, spiral inductor is discussed. The effects of workpiece temperature on the electrical efficiency, power transfer to the workpiece, and electromagnetic distortion are also presented. The calculation is performed in two-dimensional axisymmetric geometry using a FEM program. (orig.) 5 refs.

  12. Development of remote controlled electron probe micro analyzer with crystal orientation analyzer

    International Nuclear Information System (INIS)

    Honda, Junichi; Matsui, Hiroki; Harada, Akio; Obata, Hiroki; Tomita, Takeshi

    2012-07-01

    The advanced utilization of Light Water Reactor (LWR) fuel is being pursued in Japan to reduce power generation costs and the volume of nuclear waste. The electric power companies have continued their efforts toward burnup extension and thermal power uprating of commercial fuel. The government must accumulate detailed information on the newest technologies in order to establish regulations and guidelines for the safety of advanced nuclear fuels. A remote-controlled Electron Probe Micro Analyzer (EPMA) fitted with a crystal orientation analyzer has been developed at the Japan Atomic Energy Agency (JAEA) to study the behavior of high-burnup fuels under accident conditions. The effects of the cladding microstructure on fuel behavior can be evaluated more conveniently and quantitatively with this EPMA. The commercial model of the EPMA was modified for airtightness and earthquake resistance in compliance with the government safety regulations for handling highly radioactive materials. This paper describes the specifications of the EPMA, which were specialised for post-irradiation examination, and the results of cold mock-up tests to confirm its performance and reliability. (author)

  13. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Roozbeh Rashed

    2013-01-01

    Full Text Available Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 of the more common variables are predefined as a default list. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software such as SPSS for Windows.

  14. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Farzin Heravi

    2012-09-01

    Full Text Available Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 of the more common variables are predefined as a default list. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software such as SPSS for Windows.

  15. Abbott Prism: a multichannel heterogeneous chemiluminescence immunoassay analyzer.

    Science.gov (United States)

    Khalil, O S; Zurek, T F; Tryba, J; Hanna, C F; Hollar, R; Pepe, C; Genger, K; Brentz, C; Murphy, B; Abunimeh, N

    1991-09-01

    We describe a multichannel heterogeneous immunoassay analyzer in which a sample is split between disposable reaction trays in a group of linear tracks. The system's pipettor uses noninvasive sensing of the sample volume and disposable pipet tips. Each assay track has (a) a conveyor belt for moving reaction trays to predetermined functional stations, (b) temperature-controlled tunnels, (c) noncontact transfer of the reaction mixture between incubation and detection wells, and (d) single-photon counting to detect a chemiluminescence (CL) signal from the captured immunochemical product. A novel disposable reaction tray, with separate reaction and detection wells and self-contained fluid removal, is used in conjunction with the transfer device on the track to produce a carryover-free system. The linear immunoassay track has nine predetermined positions for performing individual assay steps. Assay step sequence and timing are selected by changing the location of the assay modules between these predetermined positions. The assay methodology, a combination of microparticle capture and direct detection of a CL signal on a porous matrix, offers excellent sensitivity, specificity, and ease of automation. Immunoassay configurations have been tested for hepatitis B surface antigen and for antibodies to hepatitis B core antigen, hepatitis C virus, human immunodeficiency virus I and II, and human T-cell leukemia virus I and II.

  16. Analyzing reflectance spectra of human skin in legal medicine

    Science.gov (United States)

    Belenki, Liudmila; Sterzik, Vera; Schulz, Katharina; Bohnert, Michael

    2013-01-01

    Our current research in the framework of an interdisciplinary project focuses on modelling the dynamics of the hemoglobin reoxygenation process in post-mortem human skin by reflectance spectrometry. The observations of reoxygenation of hemoglobin in livores after postmortem exposure to a cold environment relate the reoxygenation to the commonly known phenomenon that the color impression of livores changes from livid to pink under low ambient temperatures. We analyze the spectra with respect to a physical model describing the optical properties of human skin, discuss the dynamics of the reoxygenation, and propose a phenomenological model for reoxygenation. For additional characterization of the reflectance spectra, the curvature of the local minimum and maximum in the investigated spectral range is considered. There is a strong correlation between the curvature of spectra at a wavelength of 560 nm and the concentration of O2-Hb. The analysis is carried out via C programs, as well as MySQL database queries in Java EE, JDBC, Matlab, and Python.
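As a rough illustration of the curvature feature mentioned in this record, the second derivative of a reflectance spectrum at 560 nm can be estimated numerically. The synthetic spectrum below is an assumption for illustration, not the study's data:

```python
# Hedged sketch: estimating the curvature of a reflectance spectrum at
# 560 nm via a numerical second derivative. The synthetic parabolic
# spectrum below is illustrative only; the study's actual data and
# physical model differ.
import numpy as np

def curvature_at(wavelengths, reflectance, wl0):
    """Second derivative of reflectance w.r.t. wavelength at wl0,
    used here as a simple curvature proxy."""
    d1 = np.gradient(reflectance, wavelengths)
    d2 = np.gradient(d1, wavelengths)
    return float(np.interp(wl0, wavelengths, d2))

wl = np.linspace(500.0, 600.0, 101)               # nm, 1 nm steps
spectrum = 0.4 + 0.1 * ((wl - 560.0) / 40.0)**2   # minimum at 560 nm
k = curvature_at(wl, spectrum, 560.0)
print(f"curvature at 560 nm: {k:.2e} per nm^2")
```

For a parabola on a uniform grid the central-difference estimate is exact at interior points, which makes this a convenient self-check before applying the same operator to measured spectra.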

  17. Improved data visualization techniques for analyzing macromolecule structural changes.

    Science.gov (United States)

    Kim, Jae Hyun; Iyer, Vidyashankara; Joshi, Sangeeta B; Volkin, David B; Middaugh, C Russell

    2012-10-01

    The empirical phase diagram (EPD) is a colored representation of overall structural integrity and conformational stability of macromolecules in response to various environmental perturbations. Numerous proteins and macromolecular complexes have been analyzed by EPDs to summarize results from large data sets from multiple biophysical techniques. The current EPD method suffers from a number of deficiencies including lack of a meaningful relationship between color and actual molecular features, difficulties in identifying contributions from individual techniques, and a limited ability to be interpreted by color-blind individuals. In this work, three improved data visualization approaches are proposed as techniques complementary to the EPD. The secondary, tertiary, and quaternary structural changes of multiple proteins as a function of environmental stress were first measured using circular dichroism, intrinsic fluorescence spectroscopy, and static light scattering, respectively. Data sets were then visualized as (1) RGB colors using three-index EPDs, (2) equiangular polygons using radar charts, and (3) human facial features using Chernoff face diagrams. Data as a function of temperature and pH for bovine serum albumin, aldolase, and chymotrypsin as well as candidate protein vaccine antigens including a serine threonine kinase protein (SP1732) and surface antigen A (SP1650) from S. pneumoniae and hemagglutinin from an H1N1 influenza virus are used to illustrate the advantages and disadvantages of each type of data visualization technique. Copyright © 2012 The Protein Society.
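A minimal sketch of the three-index idea behind the improved EPDs described above, assuming a simple min-max normalization of each technique's signal (the paper's actual color-mapping recipe may differ):

```python
# Hedged sketch of the "three-index EPD" idea: map three normalized
# biophysical measurements (e.g. CD signal, fluorescence peak position,
# static light scattering) onto the R, G and B channels of one color per
# measured condition. The normalization below is an assumption, not the
# paper's exact recipe.
def normalize(values):
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def three_index_colors(cd, fluor, scatter):
    """Return one (r, g, b) tuple in [0, 1] per measured condition."""
    return list(zip(normalize(cd), normalize(fluor), normalize(scatter)))

# Example: three temperature points for one protein (made-up numbers).
colors = three_index_colors(cd=[-20.0, -15.0, -5.0],
                            fluor=[330.0, 335.0, 350.0],
                            scatter=[0.1, 0.3, 2.0])
print(colors[0])  # native-like condition maps to the darkest color
```

Because each channel corresponds to exactly one technique, a reader can attribute a color shift to secondary, tertiary, or quaternary structural change directly, which addresses one deficiency the abstract raises about the original EPD.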

  18. Investigation on caloric requirement of biomass pyrolysis using TG-DSC analyzer

    Energy Technology Data Exchange (ETDEWEB)

    He Fang [Institute of Utilization of Biomass, Shandong University of Technology, No. 12, Zhangzhou Road, Zibo, Shandong 255049 (China)]. E-mail: hf@sdut.edu.cn; Yi Weiming [Institute of Utilization of Biomass, Shandong University of Technology, No. 12, Zhangzhou Road, Zibo, Shandong 255049 (China); Bai Xueyuan [Institute of Utilization of Biomass, Shandong University of Technology, No. 12, Zhangzhou Road, Zibo, Shandong 255049 (China)

    2006-09-15

    The caloric requirement of biomass pyrolysis has an important influence on the course of the thermal conversion. However, precise data are difficult to achieve by the current calculation method because of the complexity of the process. A new method for obtaining the caloric requirement of the process by integrating the differential scanning calorimetry (DSC) curves was proposed after the simultaneous thermal analyzer (TG-DSC) and DSC curves were investigated. Experiments were conducted for wheat straw, cotton stalk, pine and peanut shell on a Netzsch STA 449C analyzer. Powder samples were put into a platinum crucible with a lid on a high accuracy DSC-cp sample holder in the furnace and then heated from ambient temperature up to the maximum temperature of 973 K at the heating rate of 10 K/min in the analyzer. The product gases were swept away by 25 ml/min nitrogen. Mass changes (TG) and calorimetric effects (DSC) were recorded and analyzed. The process was investigated in detail through comparison of the DTG (differential thermogravimetric) and DSC curves of wheat straw. After the water influence in the DSC was eliminated, the relationship of the caloric requirement with the temperature of the aforementioned dry biomass was obtained by integrating the DSC curve. The results showed that 523 kJ, 459 kJ, 646 kJ and 385 kJ were required, respectively, to increase the temperature of 1 kg of dried wheat straw, cotton stalk, pine and peanut shell from 303 K to 673 K.
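The integration step described in this abstract can be sketched as follows. The constant heat-flow signal is synthetic, and the 2220 s window corresponds to heating from 303 K to 673 K at 10 K/min; real DSC data would replace both:

```python
# Hedged sketch: estimating a caloric requirement by integrating a DSC
# heat-flow curve over time, as the abstract describes. The heat-flow
# samples below are synthetic, not measured data.
import numpy as np

def caloric_requirement(time_s, heat_flow_w_per_g):
    """Trapezoidal integral of specific heat flow [W/g] over time [s].
    The result in J/g is numerically equal to kJ/kg."""
    t = np.asarray(time_s, dtype=float)
    q = np.asarray(heat_flow_w_per_g, dtype=float)
    return float(np.sum(0.5 * (q[1:] + q[:-1]) * np.diff(t)))

# Heating at 10 K/min from 303 K to 673 K takes 370 K / 10 K/min
# = 37 min = 2220 s.
t = np.linspace(0.0, 2220.0, 1000)
q = np.full_like(t, 0.2)  # assumed constant 0.2 W/g endothermic flow
print(f"{caloric_requirement(t, q):.0f} kJ/kg")  # -> 444 kJ/kg
```

With measured DSC curves the integrand varies with temperature, but the per-kilogram figures in the abstract (385 to 646 kJ) come out of exactly this kind of integral.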

  19. Design and calibration of a fast-time resolution charge exchange analyzer

    International Nuclear Information System (INIS)

    Scime, E.; Hokin, S.

    1992-04-01

    A five channel, fast time resolution, scanning charge exchange analyzer has been developed for the Madison Symmetric Torus (MST). The analyzer consists of an iron vacuum vessel, a gas stripping cell, an electrostatic bending field, and five continuous electron multiplier detectors. The incident neutral flux and operation of the detectors in current mode limit the time resolution of the analyzer to 10 μs. The analyzer was absolutely calibrated over the energy range of interest (500-2000 eV) with an H+ beam, so that the charge exchange power loss could also be measured. The analyzer can be swiveled on a shot-to-shot basis for measurements of T_i(r), where 0.3 < r/a < 0.7. The mechanical design was driven by the need for a low-cost, expandable ion temperature diagnostic

  20. Analysis of the sensitivity and sample-furnace thermal-lag of a differential thermal analyzer

    International Nuclear Information System (INIS)

    Roura, P.; Farjas, J.

    2005-01-01

    The heat exchange between the horizontal furnace of a differential thermal analyzer (DTA) and the sample is analyzed with the aim of understanding the parameters governing the thermal signal. The resistance due to radiation and conduction through the gas has been calculated and compared to the experimental values of the sample-furnace thermal lag and the apparatus sensitivity. The overall evolution of these parameters with temperature and their relative values are well understood by considering the temperature differences that arise between the sample and the holder. Two RC thermal models are used to describe the apparatus performance in different temperature ranges. Finally, the possibility of improving the signal quality through the control of the leak resistances is stressed

  1. Plant analyzer for high-speed interactive simulation of BWR plant transients

    International Nuclear Information System (INIS)

    Cheng, H.S.; Lekach, S.V.; Mallen, A.N.; Wulff, W.; Cerbone, R.J.

    1984-01-01

    A combination of advanced modeling techniques and modern, special-purpose peripheral minicomputer technology was utilized to develop a plant analyzer which affords realistic predictions of plant transients and severe off-normal events in LWR power plants through on-line simulations at speeds up to 10 times faster than actual process speeds. The mathematical models account for nonequilibrium, nonhomogeneous two-phase flow effects in the coolant, for acoustical effects in the steam line and for the dynamics of the entire balance of the plant. Reactor core models include point kinetics with reactivity feedback due to void fraction, fuel temperature, coolant temperature, and boron concentration as well as a conduction model for predicting fuel and clad temperatures. Control systems and trip logic for plant protection systems are also simulated. The AD10 of Applied Dynamics International, a special-purpose peripheral processor, is used as the principal hardware of the plant analyzer
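The point-kinetics core model mentioned above can be sketched with a one-delayed-group version. The parameter values are generic assumptions; the actual plant analyzer uses multiple delayed-neutron groups plus the reactivity feedbacks listed in the abstract:

```python
# Hedged sketch: one-delayed-group point-kinetics equations of the kind
# the plant analyzer's core model solves. Parameter values (beta, Lambda,
# lambda) are generic assumed numbers, not taken from the record.
def point_kinetics_step(P, C, rho, beta=0.0065, Lam=4e-5, lam=0.08, dt=1e-4):
    """One explicit Euler step of
         dP/dt = ((rho - beta)/Lam) * P + lam * C
         dC/dt = (beta/Lam) * P - lam * C."""
    dP = ((rho - beta) / Lam) * P + lam * C
    dC = (beta / Lam) * P - lam * C
    return P + dt * dP, C + dt * dC

# Start critical (rho = 0) in equilibrium: C = beta * P / (lam * Lam).
P, C = 1.0, 0.0065 * 1.0 / (0.08 * 4e-5)
for _ in range(10000):          # 1 s of simulated time
    P, C = point_kinetics_step(P, C, rho=0.0)
print(f"power after 1 s at criticality: {P:.6f}")  # stays ~1.0
```

In a real simulator the reactivity rho would itself be a function of void fraction, fuel temperature, coolant temperature, and boron concentration, closing the feedback loops the abstract describes.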

  2. Temperature compensation and entrainment in circadian rhythms

    International Nuclear Information System (INIS)

    Bodenstein, C; Heiland, I; Schuster, S

    2012-01-01

    To anticipate daily variations in the environment and coordinate biological activities into a daily cycle many organisms possess a circadian clock. In the absence of external time cues the circadian rhythm persists with a period of approximately 24 h. The clock phase can be shifted by single pulses of light, darkness, chemicals, or temperature and this allows entrainment of the clock to exactly 24 h by cycles of these zeitgebers. On the other hand, the period of the circadian rhythm is kept relatively constant within a physiological range of constant temperatures, which means that the oscillator is temperature compensated. The mechanisms behind temperature compensation and temperature entrainment are not fully understood, neither biochemically nor mathematically. Here, we theoretically investigate the interplay of temperature compensation and entrainment in general oscillatory systems. We first give an analytical treatment for small temperature shifts and derive that every temperature-compensated oscillator is entrainable to external small-amplitude temperature cycles. Temperature compensation ensures that this entrainment region is always centered at the endogenous period regardless of possible seasonal temperature differences. Moreover, for small temperature cycles the entrainment region of the oscillator is potentially larger for rectangular pulses. For large temperature shifts we numerically analyze different circadian clock models proposed in the literature with respect to these properties. We observe that for such large temperature shifts sinusoidal or gradual temperature cycles allow a larger entrainment region than rectangular cycles. (paper)
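Temperature compensation of a circadian period is commonly quantified with a Q10 coefficient, where a value near 1 indicates a compensated oscillator. A minimal sketch with illustrative period values (not taken from the paper):

```python
# Hedged sketch: the Q10 temperature coefficient applied to a circadian
# period. Q10 close to 1 means the period barely changes over a 10 K
# span, i.e. the clock is temperature compensated. Example periods are
# invented for illustration.
def q10(period_t1, period_t2, t1, t2):
    """Temperature coefficient of the period between temperatures t1 < t2.
    (For a period, the ratio is inverted relative to the usual rate form.)"""
    return (period_t1 / period_t2) ** (10.0 / (t2 - t1))

# A compensated clock: 24.2 h at 20 degC vs 23.8 h at 30 degC.
print(round(q10(24.2, 23.8, 20.0, 30.0), 3))  # close to 1
```

An uncompensated biochemical reaction typically has Q10 of 2 to 3, so values near 1 for the free-running period are the signature of the compensation mechanism the paper analyzes.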

  3. Thermography and Sonic Anemometry to Analyze Air Heaters in Mediterranean Greenhouses

    Directory of Open Access Journals (Sweden)

    Araceli Peña

    2012-10-01

    Full Text Available The present work has developed a methodology based on thermography and sonic anemometry for studying the microclimate in Mediterranean greenhouses equipped with air heaters and polyethylene distribution ducts to distribute the warm air. Sonic anemometry allows us to identify the airflow pattern generated by the heaters and to analyze the temperature distribution inside the greenhouse, while thermography provides accurate crop temperature data. Air distribution by means of perforated polyethylene ducts at ground level, widely used in Mediterranean-type greenhouses, can generate heterogeneous temperature distributions inside the greenhouse when the system is not correctly designed. The system analyzed in this work used a polyethylene duct with a row of hot air outlet holes (all of equal diameter) that expel warm air toward the ground to avoid plant damage. We have observed that this design (the most widely used in Almería’s greenhouses) produces stagnation of hot air in the highest part of the structure, reducing the heating of the crop zone. Using 88 kW heating power (146.7 W∙m−2) the temperature inside the greenhouse is maintained 7.2 to 11.2 °C above the outside temperature. The crop temperature (17.6 to 19.9 °C) was maintained above the minimum recommended value of 10 °C.

  4. Thermography and sonic anemometry to analyze air heaters in Mediterranean greenhouses.

    Science.gov (United States)

    López, Alejandro; Valera, Diego L; Molina-Aiz, Francisco; Peña, Araceli

    2012-10-16

    The present work has developed a methodology based on thermography and sonic anemometry for studying the microclimate in Mediterranean greenhouses equipped with air heaters and polyethylene distribution ducts to distribute the warm air. Sonic anemometry allows us to identify the airflow pattern generated by the heaters and to analyze the temperature distribution inside the greenhouse, while thermography provides accurate crop temperature data. Air distribution by means of perforated polyethylene ducts at ground level, widely used in Mediterranean-type greenhouses, can generate heterogeneous temperature distributions inside the greenhouse when the system is not correctly designed. The system analyzed in this work used a polyethylene duct with a row of hot air outlet holes (all of equal diameter) that expel warm air toward the ground to avoid plant damage. We have observed that this design (the most widely used in Almería's greenhouses) produces stagnation of hot air in the highest part of the structure, reducing the heating of the crop zone. Using 88 kW heating power (146.7 W ∙ m(-2)) the temperature inside the greenhouse is maintained 7.2 to 11.2 °C above the outside temperature. The crop temperature (17.6 to 19.9 °C) was maintained above the minimum recommended value of 10 °C.
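A quick arithmetic check of the figures quoted in both greenhouse records: the stated 88 kW total and 146.7 W/m² areal power imply a floor area of roughly 600 m², a value the abstracts do not state explicitly:

```python
# Hedged back-of-envelope check of the abstract's numbers. The implied
# floor area is an inference, not a figure given in the record.
power_w = 88_000.0          # total heating power [W]
power_per_area = 146.7      # stated areal power [W/m^2]
area_m2 = power_w / power_per_area
print(f"implied floor area: {area_m2:.0f} m^2")  # -> 600 m^2
```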

  5. Grinding temperature and energy ratio coefficient in MQL grinding of high-temperature nickel-base alloy by using different vegetable oils as base oil

    Institute of Scientific and Technical Information of China (English)

    Li Benkai; Li Changhe; Zhang Yanbin; Wang Yaogang; Jia Dongzhou; Yang Min

    2016-01-01

    Vegetable oil can be used as a base oil in minimal quantity of lubrication (MQL). This study compared the performances of MQL grinding by using castor oil, soybean oil, rapeseed oil, corn oil, sunflower oil, peanut oil, and palm oil as base oils. A K-P36 numerical-control precision surface grinder was used to perform plain grinding on a workpiece material with a high-temperature nickel base alloy. A YDM–III 99 three-dimensional dynamometer was used to measure grinding force, and a clip-type thermocouple was used to determine grinding temperature. The grinding force, grinding temperature, and energy ratio coefficient of MQL grinding were compared among the seven vegetable oil types. Results revealed that (1) castor oil-based MQL grinding yields the lowest grinding force but exhibits the highest grinding temperature and energy ratio coefficient; (2) palm oil-based MQL grinding generates the second lowest grinding force but shows the lowest grinding temperature and energy ratio coefficient; (3) MQL grinding based on the five other vegetable oils produces similar grinding forces, grinding temperatures, and energy ratio coefficients, with values ranging between those of castor oil and palm oil; (4) viscosity significantly influences grinding force and grinding temperature to a greater extent than fatty acid varieties and contents in vegetable oils; (5) although more viscous vegetable oil exhibits greater lubrication and significantly lower grinding force than less viscous vegetable oil, high viscosity reduces the heat exchange capability of vegetable oil and thus yields a high grinding temperature; (6) saturated fatty acid is a more efficient lubricant than unsaturated fatty acid; and (7) a short carbon chain transfers heat more effectively than a long carbon chain. Palm oil is the optimum base oil of MQL grinding, and this base oil yields 26.98 N tangential grinding force, 87.10 N normal grinding force, 119.6 °C grinding temperature, and 42.7% energy ratio coefficient.

  6. Grinding temperature and energy ratio coefficient in MQL grinding of high-temperature nickel-base alloy by using different vegetable oils as base oil

    Directory of Open Access Journals (Sweden)

    Li Benkai

    2016-08-01

    Full Text Available Vegetable oil can be used as a base oil in minimal quantity of lubrication (MQL). This study compared the performances of MQL grinding by using castor oil, soybean oil, rapeseed oil, corn oil, sunflower oil, peanut oil, and palm oil as base oils. A K-P36 numerical-control precision surface grinder was used to perform plain grinding on a workpiece material with a high-temperature nickel base alloy. A YDM–III 99 three-dimensional dynamometer was used to measure grinding force, and a clip-type thermocouple was used to determine grinding temperature. The grinding force, grinding temperature, and energy ratio coefficient of MQL grinding were compared among the seven vegetable oil types. Results revealed that (1) castor oil-based MQL grinding yields the lowest grinding force but exhibits the highest grinding temperature and energy ratio coefficient; (2) palm oil-based MQL grinding generates the second lowest grinding force but shows the lowest grinding temperature and energy ratio coefficient; (3) MQL grinding based on the five other vegetable oils produces similar grinding forces, grinding temperatures, and energy ratio coefficients, with values ranging between those of castor oil and palm oil; (4) viscosity significantly influences grinding force and grinding temperature to a greater extent than fatty acid varieties and contents in vegetable oils; (5) although more viscous vegetable oil exhibits greater lubrication and significantly lower grinding force than less viscous vegetable oil, high viscosity reduces the heat exchange capability of vegetable oil and thus yields a high grinding temperature; (6) saturated fatty acid is a more efficient lubricant than unsaturated fatty acid; and (7) a short carbon chain transfers heat more effectively than a long carbon chain. Palm oil is the optimum base oil of MQL grinding, and this base oil yields 26.98 N tangential grinding force, 87.10 N normal grinding force, 119.6 °C grinding temperature, and 42.7% energy ratio coefficient.

  7. Morphology and Temperatures at Pele

    Science.gov (United States)

    Howell, Robert R.; Lopes, R. M. C.

    2010-10-01

    The Pele region of Io has been the site of vigorous volcanic activity from the time of the first Voyager I observations in 1979 up through the final Galileo ones in 2001. There is high temperature thermal emission from what is thought to be a rapidly overturning lava lake, and the region is also the source of a large sulfur-rich plume. We present a new analysis of Voyager I visible wavelength images, and Galileo Solid State Imager (SSI) and Near Infrared Mapping Spectrometer (NIMS) thermal emission observations, which better define the morphology of the region and the intensity of the emission. The observations show remarkable correlations between the locations of the emission and the features seen in the Voyager images, which provide insight into eruption mechanisms and constrain the longevity of the activity. We also analyze an additional wavelength channel of NIMS data (1.87 micron) which, paradoxically because of its reduced sensitivity, allows us to estimate temperatures at the peak locations of emission. Measurements of eruption temperatures on Io are crucial because they provide our best clues to the composition of the magma. High color temperatures indicative of ultramafic composition have been reported for the Pillan hot spot and possibly for Pele, although recent work has called into question the requirement for magma temperatures above those expected for ordinary basalts. Our new analysis of the Pele emission near the peak of the hot spot shows color temperatures near the upper end of the basalt range during the I27 and I32 encounters. We also analyze those temperatures in terms of lava cooling models to determine the required magma temperatures.

  8. NASTRAN thermal analyzer: Theory and application including a guide to modeling engineering problems, volume 2. [sample problem library guide

    Science.gov (United States)

    Jackson, C. E., Jr.

    1977-01-01

    A sample problem library containing 20 problems covering most facets of NASTRAN Thermal Analyzer modeling is presented. Areas discussed include radiative interchange, arbitrary nonlinear loads, transient temperature and steady-state structural plots, temperature-dependent conductivities, simulated multi-layer insulation, and constraint techniques. The use of the major control options and important DMAP alters is demonstrated.

  9. Analysis of an intelligent temperature transmitter

    African Journals Online (AJOL)

    temperature sensors and analyze a typical Rosemount Intelligent Temperature Transmitter (RITT) with a view to identifying and ... material science and communication technologies [2]. ... Some benefits of the 4-20mA transmission standard.

  10. Thermal study of sintered (Th-U)O2 MOX pellet by a commercial thermo-gravimetric analyzer coupled with an evolved gas analyzer

    International Nuclear Information System (INIS)

    Mahanty, B.N.; Khan, F.A.; Karande, A.; Prakash, A.; Afzal, Md.; Panakkal, J.P.; Kamath, H.S.

    2010-01-01

    Full text: Fabrication of (Th-U)O2 MOX pellets by the impregnation agglomerate pelletization (IAP) process is being explored in the Advanced Fuel Fabrication Facility, BARC, Tarapur for the forthcoming Advanced Heavy Water Reactor (AHWR). High-temperature thermal study of this fuel is important in order to understand the behaviour of the fuel under the operational temperature of the reactor. In this study, fabrication of ThO2-3%UO2 was carried out by the impregnation agglomerate pelletization process and subsequently sintered in reducing or air atmosphere. The degassed pellets were broken into small pieces and subjected to high-temperature (1050-1250 deg C) heating under high-purity argon gas in a commercial thermal analyzer. Subsequently, the evolved gases were qualitatively analyzed by a quadrupole mass analyzer. The pellet sintered in reducing atmosphere (IAP-R) shows an increase in weight after the analysis, whereas the pellet sintered in oxidizing atmosphere (IAP-O) shows a decrease in final weight. The IAP-R pellet may become slightly hyper-stoichiometric on heating due to the presence of a small amount of oxygen in the high-purity argon gas. This is further supported by the mass spectrum at m/z 32 (O2+), which shows a decrease in signal intensity as the temperature of analysis increases. The sharp decrease of the signal intensity at m/z 32 (O2+) starting at 920 deg C may be attributed to the formation of SO2 (m/z = 64) and CO2 (m/z = 44) gases. On the other hand, the IAP-O pellet, being hyper-stoichiometric initially, may lose weight to form water on reaction of the excess oxygen with the small amount of hydrogen present in the high-purity argon gas. This is supported by the appearance of a small peak at m/z 18 (H2O+) in the mass spectrum. The formation of SO2 and CO2 gases started at a higher temperature in the case of the IAP-O pellet as compared to that of the IAP-R pellet. This may be due to the higher density achieved in the case of

  11. Evaluation of system codes for analyzing naturally circulating gas loop

    International Nuclear Information System (INIS)

    Lee, Jeong Ik; No, Hee Cheon; Hejzlar, Pavel

    2009-01-01

    Steady-state natural circulation data obtained in a 7 m-tall experimental loop with carbon dioxide and nitrogen are presented in this paper. The loop was originally designed to encompass operating range of a prototype gas-cooled fast reactor passive decay heat removal system, but the results and conclusions are applicable to any natural circulation loop operating in regimes having buoyancy and acceleration parameters within the ranges validated in this loop. Natural circulation steady-state data are compared to numerical predictions by two system analysis codes: GAMMA and RELAP5-3D. GAMMA is a computational tool for predicting various transients which can potentially occur in a gas-cooled reactor. The code has a capability of analyzing multi-dimensional multi-component mixtures and includes models for friction, heat transfer, chemical reaction, and multi-component molecular diffusion. Natural circulation data with two gases show that the loop operates in the deteriorated turbulent heat transfer (DTHT) regime which exhibits substantially reduced heat transfer coefficients compared to the forced turbulent flow. The GAMMA code with an original heat transfer package predicted conservative results in terms of peak wall temperature. However, the estimated peak location did not successfully match the data. Even though GAMMA's original heat transfer package included mixed-convection regime, which is a part of the DTHT regime, the results showed that the original heat transfer package could not reproduce the data with sufficient accuracy. After implementing a recently developed correlation and corresponding heat transfer regime map into GAMMA to cover the whole range of the DTHT regime, we obtained better agreement with the data. RELAP5-3D results are discussed in parallel.

  12. Innovative application of the moisture analyzer for determination of dry mass content of processed cheese

    Science.gov (United States)

    Kowalska, Małgorzata; Janas, Sławomir; Woźniak, Magdalena

    2018-04-01

    The aim of this work was to present an alternative method of determining the total dry mass content in processed cheese. The authors claim that the presented method can be used in industry's quality control laboratories for routine testing and for quick in-process control. For the tests, both the reference method of determining dry mass in processed cheese and the moisture analyzer method were used. The tests were carried out for three different kinds of processed cheese. In accordance with the reference method, the sample was placed on a layer of silica sand and dried at 102 °C for about 4 h. The moisture analyzer test required method validation with regard to the drying temperature range and the mass of the analyzed sample. An optimum drying temperature of 110 °C was determined experimentally. For the Hochland cream processed cheese sample, the total dry mass content obtained using the reference method was 38.92%, whereas using the moisture analyzer method it was 38.74%. The average analysis time for the moisture analyzer method was 9 min. For the sample of processed cheese with tomatoes, the reference method result was 40.37%, and the alternative method result was 40.67%. For the sample of cream processed cheese with garlic, the reference method gave a value of 36.88%, and the alternative method 37.02%. The average time of those determinations was 16 min. The obtained results confirmed that use of the moisture analyzer is effective: consistent values of dry mass content were obtained with both methods. According to the authors, the much shorter measurement time of the moisture analyzer method is a key criterion when selecting a method for in-process control and final quality control.
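Both methods in this record ultimately reduce to the same dry-mass ratio. A minimal sketch with invented sample masses (chosen only to reproduce the 38.92% figure for illustration; the actual sample sizes are not given in the abstract):

```python
# Hedged sketch: the dry-mass computation behind both the oven-reference
# and moisture-analyzer methods. Dry mass [%] is the residual mass after
# drying divided by the initial sample mass. Masses below are invented.
def dry_mass_percent(mass_initial_g, mass_dried_g):
    return 100.0 * mass_dried_g / mass_initial_g

# e.g. 5.000 g of processed cheese leaving 1.946 g of dry residue
print(f"{dry_mass_percent(5.000, 1.946):.2f} %")  # -> 38.92 %
```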

  13. Evaluation of coronary band temperatures in healthy horses

    DEFF Research Database (Denmark)

    Rosenmeier, Jesper G.; Strathe, Anders Bjerring; Andersen, Pia Haubro

    2012-01-01

    To measure coronary band temperature (CBT) in healthy horses fed high-fructan or low-carbohydrate diets and to analyze the association of CBT with diet, time of day, and ambient temperature.

  14. Low-temperature thermal expansion

    International Nuclear Information System (INIS)

    Collings, E.W.

    1986-01-01

    This chapter discusses the thermal expansion of insulators and metals. Harmonicity and anharmonicity in thermal expansion are examined. The electronic, magnetic, and other contributions to low-temperature thermal expansion are analyzed. The thermodynamics of the Debye isotropic continuum, the lattice-dynamical approach, and the thermal expansion of metals are discussed. Relative linear expansion at low temperatures is reviewed and further calculations of the electronic thermal expansion coefficient are given. Thermal expansions are given for Cu, Al and Ti. Phenomenological thermodynamic relationships are also discussed.

  15. Superhigh Temperatures and Acoustic Cavitation

    CERN Document Server

    Belyaev, V B; Miller, M B; Sermyagin, A V; Topolnikov, A S

    2003-01-01

    The experimental results on thermonuclear synthesis under acoustic cavitation are analyzed, taking into account the latest data and their discussion. The analysis testifies that this avenue of research is very promising. Numerical calculations of the D(d, n)^{3}He reaction rate in deuterated acetone (C_{3}D_{6}O) under the influence of ultrasound, as a function of the environment temperature in the range T = 249-295 K, have been carried out within the framework of a hydrodynamic model. The results show that it is possible to improve the effect/background ratio in experiments substantially by decreasing the fluid temperature to twenty to thirty degrees below zero.

  16. Effect of cutting temperature on hardness of SiC and diamond in the nano-cutting process of monocrystalline silicon

    Science.gov (United States)

    Wang, Jiachun; Li, Yuntao; Liu, Xiaoxuan; Lv, Maoqiang

    2016-10-01

    In the process of cutting silicon with natural diamond tools, groove wear frequently occurs on the flank face of the cutting tool. Scholars believe that one cause of this wear is mechanical scratching by hard particles such as SiC. To reveal the mechanical scratching mechanism, it is essential to study changes in the mechanical properties of the hard particles and of diamond, especially the effect of cutting temperature on their hardness. A molecular dynamics (MD) model of nano-cutting monocrystalline silicon was established in which the contact-zone temperature between tool and workpiece was calculated by dividing the zone, and cutting temperature values in different regions were computed as the simulation was carried out. On this basis, MD simulation models of SiC and diamond were established separately, with the initial temperature set to room temperature. The laws governing the change of C-C and Si-C bond lengths with increasing simulation temperature were studied. Drawing on earlier research on the theoretical calculation of the hardness of covalent crystals and on the relationship between crystal valence electron density and bond length, curves of the hardness of diamond and SiC as functions of bond length were obtained, and the effect of temperature on hardness was calculated. Results show that the local cutting temperature can reach 1300 K. The rise in cutting temperature led to a decrease in the hardness of the local diamond atomic clusters, while the hardness of the local SiC atomic clusters increased. When the cutting temperature exceeded 1100 K, diamond began to soften, and the local cluster hardness became less than that of SiC.

  17. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and

  18. Looking for a Framework for Analyzing Eco-innovation Dynamics

    DEFF Research Database (Denmark)

    Yang, Yan

    2011-01-01

    Looking for a Framework for Analyzing Eco-innovation Dynamics: A Triple Helix Model of Innovation Perspective.

  19. Portable Programmable Multifunction Body Fluids Analyzer, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced Liquid Logic proposes to develop a very capable analyzer based on its digital microfluidic technology. Such an analyzer would be:  Capable of both simple...

  20. 21 CFR 870.3640 - Indirect pacemaker generator function analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Indirect pacemaker generator function analyzer... Indirect pacemaker generator function analyzer. (a) Identification. An indirect pacemaker generator function analyzer is an electrically powered device that is used to determine pacemaker function or...

  1. 21 CFR 870.3630 - Pacemaker generator function analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Pacemaker generator function analyzer. 870.3630... (CONTINUED) MEDICAL DEVICES CARDIOVASCULAR DEVICES Cardiovascular Prosthetic Devices § 870.3630 Pacemaker generator function analyzer. (a) Identification. A pacemaker generator function analyzer is a device that is...

  2. 21 CFR 868.1700 - Nitrous oxide gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Nitrous oxide gas analyzer. 868.1700 Section 868...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1700 Nitrous oxide gas analyzer. (a) Identification. A nitrous oxide gas analyzer is a device intended to measure the concentration of nitrous oxide...

  3. 21 CFR 862.2500 - Enzyme analyzer for clinical use.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Enzyme analyzer for clinical use. 862.2500 Section... Instruments § 862.2500 Enzyme analyzer for clinical use. (a) Identification. An enzyme analyzer for clinical use is a device intended to measure enzymes in plasma or serum by nonkinetic or kinetic measurement of...

  4. Hardware Realization of an Ethernet Packet Analyzer Search Engine

    Science.gov (United States)

    2000-06-30

    specific for the home automation industry. This analyzer will be at the gateway of a network and analyze Ethernet packets as they go by. It will keep... home automation and not the computer network. This system is a stand-alone real-time network analyzer capable of decoding Ethernet protocols. The

  5. 40 CFR 86.1322-84 - Carbon monoxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... be used. (2) Zero the carbon monoxide analyzer with either zero-grade air or zero-grade nitrogen. (3... columns is one form of corrective action which may be taken.) (b) Initial and periodic calibration. Prior... calibrated. (1) Adjust the analyzer to optimize performance. (2) Zero the carbon monoxide analyzer with...

  6. Low temperature monitoring system for subsurface barriers

    Science.gov (United States)

    Vinegar, Harold J [Bellaire, TX; McKinzie, II Billy John [Houston, TX

    2009-08-18

    A system for monitoring temperature of a subsurface low temperature zone is described. The system includes a plurality of freeze wells configured to form the low temperature zone, one or more lasers, and a fiber optic cable coupled to at least one laser. A portion of the fiber optic cable is positioned in at least one freeze well. At least one laser is configured to transmit light pulses into a first end of the fiber optic cable. An analyzer is coupled to the fiber optic cable. The analyzer is configured to receive return signals from the light pulses.
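
    The patent abstract above describes distributed fiber-optic temperature sensing with laser pulses and an analyzer. One common implementation of such systems (not necessarily the one claimed here) is Raman distributed temperature sensing, which infers temperature from the anti-Stokes/Stokes backscatter ratio, proportional to exp(-theta/T). With one calibration point the inversion is a sketch like the following; the 440 cm^-1 Raman shift for silica fiber and the resulting theta of about 633 K are assumed values:

```python
import math

# Characteristic Raman temperature for silica fiber (assumed):
# theta = h*c*delta_nu / k_B with delta_nu ~ 440 cm^-1, giving ~633 K.
THETA_K = 633.0

def temperature_from_ratio(ratio, calib_ratio, calib_temp_k, theta_k=THETA_K):
    """Infer temperature from the anti-Stokes/Stokes intensity ratio.

    Uses ratio(T) proportional to exp(-theta/T), calibrated at one known point:
        1/T = 1/T_cal - ln(ratio / ratio_cal) / theta
    """
    inv_t = 1.0 / calib_temp_k - math.log(ratio / calib_ratio) / theta_k
    return 1.0 / inv_t

# A ratio equal to the calibration ratio must return the calibration temperature
print(round(temperature_from_ratio(0.25, 0.25, 293.15), 2))  # 293.15
# A larger anti-Stokes fraction means a warmer fiber section
print(temperature_from_ratio(0.27, 0.25, 293.15) > 293.15)  # True
```

    In a freeze-well application the sign of interest is the opposite: a falling ratio along the fiber marks the sections inside the low temperature zone.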

  7. Macroscopic cross sections for analyzing the transport of neutral particles in plasmas

    International Nuclear Information System (INIS)

    Suzuki, Tadakazu; Taji, Yuukichi; Nakahara, Yasuaki

    1975-05-01

    Algorithms have been developed for calculating the ionization and charge exchange cross sections required for analyzing neutral transport in plasmas. In our algorithms, the integral in the expression for the reaction rate of neutrals with the plasma is evaluated by expanding the integrand in polynomials. At present, multi-energy-group sets of cross sections depending on plasma temperature and neutral energy can be prepared by means of Maxwellian averages over energy. Calculational results are printed out in the FIDO format. Some numerical examples are given for several assumed spatial distributions of the plasma ion temperature and source neutral energy. (auth.)
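
    The Maxwellian averaging mentioned above can be sketched numerically. For a toy energy-independent cross section the average <sigma*v> must equal sigma times the Maxwellian mean speed sqrt(8kT/(pi*m)), which gives the quadrature an exact check; the constant-sigma choice and the parameter values are illustrative, not from the report:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def maxwellian_sigma_v(sigma_of_v, temp_k, mass_kg, n_steps=20000):
    """<sigma*v> = Int sigma(v) * v * f(v) dv over the Maxwell speed distribution."""
    a = mass_kg / (2.0 * K_B * temp_k)   # f(v) = 4*pi*(a/pi)^1.5 * v^2 * exp(-a v^2)
    v_th = math.sqrt(1.0 / a)            # thermal speed sqrt(2kT/m)
    dv = 10.0 * v_th / n_steps           # integrate out to 10 thermal speeds
    total = 0.0
    for i in range(1, n_steps + 1):
        v = i * dv
        f = 4.0 * math.pi * (a / math.pi) ** 1.5 * v * v * math.exp(-a * v * v)
        total += sigma_of_v(v) * v * f * dv
    return total

m_p = 1.67262192e-27   # proton mass, kg
sigma0 = 1e-20         # toy constant cross section, m^2
rate = maxwellian_sigma_v(lambda v: sigma0, 1.0e4, m_p)
mean_speed = math.sqrt(8.0 * K_B * 1.0e4 / (math.pi * m_p))
# For constant sigma the numerical average must match sigma0 * <v>
print(abs(rate - sigma0 * mean_speed) / (sigma0 * mean_speed) < 1e-3)  # True
```

    A production code would replace the brute-force sum with the polynomial expansion of the integrand that the report describes; the constant-sigma identity is still the standard sanity check.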

  8. Correlation between temperature dependence of elastic moduli and Debye temperature of paramagnetic metal

    International Nuclear Information System (INIS)

    Bodryakov, V.Yu.; Povzner, A.A.

    2000-01-01

    The correlation between the temperature dependence of the elastic moduli and the Debye temperature of a paramagnetic metal is analyzed, neglecting the temperature dependence of the Poisson ratio σ, within the framework of the Debye-Grueneisen model. It is shown that it is primarily the temperature dependence of the elastic moduli that determines the temperature dependence of the Debye temperature Θ(T); conversely, the temperature dependence Θ(T) affects the temperature dependence of the elastic moduli only very weakly. The latter made it possible to formulate a self-consistent approach to calculating the temperature dependence of the elastic moduli. Numerical estimates of the parameters of this dependence are given using the example of the bulk modulus of paramagnetic lutetium.

  9. Body temperature norms

    Science.gov (United States)

    Normal body temperature; Temperature - normal ... Morrison SF. Regulation of body temperature. In: Boron WF, Boulpaep EL, eds. Medical Physiology . 3rd ed. Philadelphia, PA: Elsevier; 2017:chap 59. Sajadi MM, Mackowiak ...

  10. Research of fuel temperature control in fuel pipeline of diesel engine using positive temperature coefficient material

    Directory of Open Access Journals (Sweden)

    Xiaolu Li

    2016-01-01

    As fuel temperature increases, both its viscosity and surface tension decrease, which helps to improve fuel atomization and thus the combustion and emission performance of the engine. Based on the self-regulated temperature property of positive temperature coefficient (PTC) material, this article used a PTC material as an electric heating element to heat diesel fuel in the fuel pipeline of a diesel engine. A BaTiO3-based PTC material, with a Curie temperature of 230°C and a rated voltage of 24 V, was developed, and its micrograph and elemental composition were also analyzed. With the fuel pipeline wrapped in six PTC ceramics, its resistivity–temperature and heating characteristics were tested on a fuel pump bench. The experiments showed that in this installation, the surface temperature of the six PTC ceramics rose to the equilibrium temperature in only 100 s at rated voltage. At rated power supply to the six PTC ceramics, the temperature of the injected fuel increased by 21°C–27°C within 100 s and could then be kept constant. Using PTC material to heat diesel in the fuel pipeline of a diesel engine, the injection mass per cycle changed very little, approximately 0.3%/°C. This study provides a beneficial reference for improving the atomization of high-viscosity liquids by employing PTC material without any control methods.
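
    The self-regulation described above follows from the steep rise of PTC resistance near the Curie point: heating power V^2/R(T) collapses as T approaches Tc, so the element settles at an equilibrium below Tc without any controller. A conceptual lumped-parameter sketch; the resistance curve shape and every parameter value are invented for illustration, not measured from the article's BaTiO3 element:

```python
import math

def ptc_resistance(temp_c, r0=2.0, t_curie=230.0, width=5.0):
    """Toy PTC curve: resistance rises steeply (logistic) around the Curie point."""
    return r0 * (1.0 + 1000.0 / (1.0 + math.exp(-(temp_c - t_curie) / width)))

def simulate(voltage=24.0, t_ambient=20.0, heat_cap=50.0, loss_coeff=0.05,
             dt=0.1, steps=50000):
    """Lumped thermal model: C * dT/dt = V**2 / R(T) - h * (T - T_ambient)."""
    t = t_ambient
    for _ in range(steps):
        power_in = voltage ** 2 / ptc_resistance(t)
        power_out = loss_coeff * (t - t_ambient)
        t += dt * (power_in - power_out) / heat_cap
    return t

t_eq = simulate()
# The element self-limits: equilibrium sits near, but below, the 230 C Curie point
print(150.0 < t_eq < 230.0)  # True
```

    The same mechanism explains the article's observation that the surface temperature stops rising after about 100 s: once R(T) starts climbing, input power falls until it just balances the losses.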

  11. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, comprising among others precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
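
    Of the verification items listed above, carryover has the most mechanical calculation. In the widely used three-high/three-low sample protocol, carryover % = (L1 - L3) / (H3 - L3) * 100, where H1..H3 are consecutive runs of a high sample and L1..L3 of a low sample. A sketch with invented counts (the paper does not give worked numbers):

```python
def carryover_percent(low_runs, high_runs):
    """Carryover from a high-low sequence (H1, H2, H3 then L1, L2, L3):

        carryover % = (L1 - L3) / (H3 - L3) * 100
    """
    l1, _, l3 = low_runs
    h3 = high_runs[-1]
    return 100.0 * (l1 - l3) / (h3 - l3)

# Invented WBC counts (10^9/L): three high-sample runs, then three low-sample runs
high = [95.1, 94.8, 95.0]
low = [3.4, 3.3, 3.3]
print(round(carryover_percent(low, high), 3))  # 0.109
```

    A result well under 1% like this would normally pass; the paper's point is that the acceptance limit itself is left to the laboratory specialist.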

  12. Development of a Telemetric, Miniaturized Electrochemical Amperometric Analyzer

    OpenAIRE

    Jaehyo Jung; Jihoon Lee; Siho Shin; Youn Tae Kim

    2017-01-01

    In this research, we developed a portable, three-electrode electrochemical amperometric analyzer that can transmit data to a PC or a tablet via Bluetooth communication. We performed experiments using an indium tin oxide (ITO) glass electrode to confirm the performance and reliability of the analyzer. The proposed analyzer uses a current-to-voltage (I/V) converter to convert the current generated by the reduction-oxidation (redox) reaction of the buffer solution to a voltage signal. This signa...

  13. Gribov gap equation at finite temperature

    International Nuclear Information System (INIS)

    Canfora, Fabrizio; Pais, Pablo; Salgado-Rebolledo, Patricio

    2014-01-01

    In this paper the Gribov gap equation at finite temperature is analyzed. The solutions of the gap equation (which depend explicitly on the temperature) determine the structure of the gluon propagator within the semi-classical Gribov approach. The present analysis is consistent with the standard confinement scenario for low temperatures, while for high enough temperatures, deconfinement takes place and a free gluon propagator is obtained. An intermediate regime in between the confined and free phases can be read off from the resulting gluon propagator, which appears to be closely related to partial deconfinement. (orig.)

  14. Gribov gap equation at finite temperature

    Energy Technology Data Exchange (ETDEWEB)

    Canfora, Fabrizio; Pais, Pablo [Centro de Estudios Cientificos (CECS), Valdivia (Chile); Universidad Andres Bello, Santiago (Chile); Salgado-Rebolledo, Patricio [Centro de Estudios Cientificos (CECS), Valdivia (Chile); Universidad de Concepcion, Departamento de Fisica, Concepcion (Chile); Universite Libre de Bruxelles and International Solvay Insitutes, Physique Theorique et Mathematique, Bruxelles (Belgium)

    2014-05-15

    In this paper the Gribov gap equation at finite temperature is analyzed. The solutions of the gap equation (which depend explicitly on the temperature) determine the structure of the gluon propagator within the semi-classical Gribov approach. The present analysis is consistent with the standard confinement scenario for low temperatures, while for high enough temperatures, deconfinement takes place and a free gluon propagator is obtained. An intermediate regime in between the confined and free phases can be read off from the resulting gluon propagator, which appears to be closely related to partial deconfinement. (orig.)

  15. Ionometric determination of fluorides at low temperatures

    International Nuclear Information System (INIS)

    Kostyukova, I.S.; Ennan, A.A.; Dzerzhko, E.K.; Leivikova, A.A.

    1995-01-01

    A method for determining fluoride ions in solution at low temperatures using a solid-contact fluorine-selective electrode (FSE) has been developed. The effect of temperature (60 to -15 degrees C) on the calibration slope, potential equilibrium time, and operational stability is studied; the effect of an organic additive (cryoprotector) on the calibration slope is also studied. The temperature relationships obtained for the solid-contact FSEs allow appropriate corrections to be applied to the operational algorithm of the "Ftoring" hand-held semiautomatic HF gas analyzer for the operational temperature range of -16 to 60 degrees C.

  16. L G-2 Scintrex manual.Fluorescence analyzer

    International Nuclear Information System (INIS)

    Pirelli, H.

    1987-01-01

    The Scintrex Fluorescence Analyzer LG-2 selectively detects the presence of certain fluorescent minerals through induced UV photoluminescence and provides quantitative information on their distribution.

  17. Visualizing Stress and Temperature Distribution During Elevated Temperature Deformation of IN-617 Using Nanomechanical Raman Spectroscopy

    Science.gov (United States)

    Zhang, Yang; Wang, Hao; Tomar, Vikas

    2018-04-01

    This work presents direct measurements of stress and temperature distribution during the mesoscale microstructural deformation of Inconel-617 (IN-617) during 3-point bending tests as a function of temperature. A novel nanomechanical Raman spectroscopy (NMRS)-based measurement platform was designed for simultaneous in situ temperature and stress mapping as a function of microstructure during deformation. The temperature distribution was found to be directly correlated to stress distribution for the analyzed microstructures. Stress concentration locations are shown to be directly related to higher heat conduction and result in microstructural hot spots with significant local temperature variation.

  18. 40 CFR 91.314 - Analyzer accuracy and specifications.

    Science.gov (United States)

    2010-07-01

    .... (3) Zero drift. The analyzer zero-response drift during a one-hour period must be less than two percent of full-scale chart deflection on the lowest range used. The zero-response is defined as the mean... calibration or span gas. (2) Noise. The analyzer peak-to-peak response to zero and calibration or span gases...

  19. A data mining approach to analyze occupant behavior motivation

    NARCIS (Netherlands)

    Ren, X.; Zhao, Y.; Zeiler, W.; Boxem, G.; Li, T.

    2017-01-01

    Occupants' behavior can have a significant impact on the performance of the built environment, yet methods for analyzing this behavior have not been adequately developed; traditional methods such as surveys or interviews are not efficient. This study proposed a data-driven method to analyze the

  20. 40 CFR 86.327-79 - Quench checks; NOX analyzer.

    Science.gov (United States)

    2010-07-01

    ... any flow rate into the reaction chamber. This includes, but is not limited to, sample capillary, ozone... Quench checks; NOX analyzer. (a) Perform the reaction chamber quench check for each model of high vacuum reaction chamber analyzer prior to initial use. (b) Perform the reaction chamber quench check for each new...

  1. Development of parallel/serial program analyzing tool

    International Nuclear Information System (INIS)

    Watanabe, Hiroshi; Nagao, Saichi; Takigawa, Yoshio; Kumakura, Toshimasa

    1999-03-01

    Japan Atomic Energy Research Institute has been developing 'KMtool', a parallel/serial program analyzing tool, in order to promote the parallelization of science and engineering computation programs. KMtool analyzes the performance of programs written in FORTRAN77 and MPI, and it reduces the effort required for parallelization. This paper describes the development purpose, design, utilization, and evaluation of KMtool. (author)

  2. Control of a pulse height analyzer using an RDX workstation

    International Nuclear Information System (INIS)

    Montelongo, S.; Hunt, D.N.

    1984-12-01

    The Nuclear Chemistry Division of Lawrence Livermore National Laboratory is in the midst of upgrading its radiation counting facilities to automate data acquisition and quality control. This upgrade requires control of a pulse height analyzer (PHA) from an interactive LSI-11/23 workstation running RSX-11M. The PHA is a microcomputer-based multichannel analyzer system providing data acquisition, storage, display, manipulation, and input/output from up to four independent acquisition interfaces. Control of the analyzer includes reading and writing energy spectra, issuing commands, and servicing device interrupts. The analyzer communicates with the host system over a 9600-baud serial line using the Digital Data Communications Message Protocol (DDCMP). We relieved the RSX workstation CPU of the DDCMP overhead by implementing a DEC-compatible, in-house designed DMA serial line board (the ISL-11) to communicate with the analyzer. An RSX I/O device driver was written to complete the path between the analyzer and the RSX system by providing the link between the communication board and an application task. The I/O driver is written to handle several ISL-11 cards all operating in parallel, thus providing support for control of multiple analyzers from a single workstation. The RSX device driver, its design and use by application code controlling the analyzer, and its operating environment will be discussed.

  3. Analyzing FCS Professionals in Higher Education: A Case Study

    Science.gov (United States)

    Hall, Scott S.; Harden, Amy; Pucciarelli, Deanna L.

    2016-01-01

    A national study of family and consumer sciences (FCS) professionals in higher education was analyzed as a case study to illustrate procedures useful for investigating issues related to FCS. The authors analyzed response rates of more than 1,900 FCS faculty and administrators by comparing those invited to participate and the 345 individuals who…

  4. A multichannel analyzer computer system for simultaneously measuring 64 spectra

    International Nuclear Information System (INIS)

    Jin Yuheng; Wan Yuqing; Zhang Jiahong; Li Li; Chen Guozhu

    2000-01-01

    The author introduces a multichannel analyzer computer system for simultaneously measuring 64 spectra with 64 coded independent inputs. The system was developed for a double-chopper neutron scattering time-of-flight spectrometer. The system structure, coding method, operating principle, and performance are presented. The system can also be used for other nuclear physics experiments that need a multichannel analyzer with independent coded inputs.

  5. Computer-based multi-channel analyzer based on internet

    International Nuclear Information System (INIS)

    Zhou Xinzhi; Ning Jiaoxian

    2001-01-01

    Combining Internet technology with the computer-based multichannel analyzer, a new kind of browser-based multichannel analyzer system is presented. Its framework and principle, as well as its implementation, are discussed.

  6. A Morphological Analyzer for Vocalized or Not Vocalized Arabic Language

    Science.gov (United States)

    El Amine Abderrahim, Med; Breksi Reguig, Fethi

    This research shows the realization of a morphological analyzer for the Arabic language (vocalized or not vocalized). This analyzer is based upon our object model for Arabic Natural Language Processing (NLP) and can be exploited by NLP applications such as machine translation, orthographical correction, and the search for information.

  7. Analyzing Population Genetics Data: A Comparison of the Software

    Science.gov (United States)

    Choosing a software program for analyzing population genetic data can be a challenge without prior knowledge of the methods used by each program. There are numerous web sites listing programs by type of data analyzed, type of analyses performed, or other criteria. Even with programs categorized in ...

  8. 40 CFR 90.318 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer calibration. 90.318 Section 90.318 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Emission Test Equipment Provisions § 90.318 Oxides of nitrogen analyzer calibration. (a) Calibrate the...

  9. 40 CFR 91.318 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer calibration. 91.318 Section 91.318 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Provisions § 91.318 Oxides of nitrogen analyzer calibration. (a) Calibrate the chemiluminescent oxides of...

  10. 40 CFR 89.321 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer calibration. 89.321 Section 89.321 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Test Equipment Provisions § 89.321 Oxides of nitrogen analyzer calibration. (a) The chemiluminescent...

  11. THE EXPERIENCE OF COMPARISON OF STATIC SECURITY CODE ANALYZERS

    Directory of Open Access Journals (Sweden)

    Alexey Markov

    2015-09-01

    This work presents a methodological approach to the comparison of static security code analyzers. It substantiates the comparison of static analyzers with respect to the efficiency and functionality indicators stipulated in international regulatory documents. The test data for assessing the efficiency of static analyzers consist of synthetic sets of open-source software containing vulnerabilities. We substantiated criteria for quality assessment of static security code analyzers subject to the standards NIST SP 500-268 and SATEC. We carried out experiments that allowed us to assess a number of Russian proprietary software tools and open-source tools. We came to the conclusion that it is of paramount importance to develop a Russian regulatory framework for testing software security (firstly, for controlling undocumented features and evaluating the quality of static security code analyzers).

  12. Experimental analysis of a new retarding field energy analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Yu-Xiang [Shanghai Institute of Mechanical and Electrical Engineering, No. 3888, Yuanjiang Road, Minhang District, Shanghai 201109 (China); Institute of Electronics, Chinese Academy of Sciences, No. 19, North 4th Ring Road West, Haidian District, Beijing 100190 (China); Liu, Shu-Qing; Li, Xian-Xia; Shen, Hong-Li; Huang, Ming-Guang [Institute of Electronics, Chinese Academy of Sciences, No. 19, North 4th Ring Road West, Haidian District, Beijing 100190 (China); Liu, Pu-Kun, E-mail: pkliu@pku.edu.cn [School of Electronics Engineering and Computer Science, Peking University, No. 5, Yiheyuan Road, Haidian District, Beijing 100871 (China)

    2015-06-11

    In this paper, a new compact retarding field energy analyzer (RFEA) is designed for diagnosing the electron beams of a K-band space travelling-wave tube (TWT). This analyzer has an aperture plate to sample the electron beams and a cylindrical electrode to overcome defocusing effects. The front end of the analyzer, constructed as a multistage depression collector (MDC) structure, is intended to shape the field to prevent electrons from being accelerated and escaping. The direct-current (DC) beams of the K-band space TWTs with the MDC removed can be investigated on the beam measurement system. The current density distribution of the DC beams is determined by the analyzer while the anode voltage and helix voltage of the TWTs are 7000 V and 6850 V, respectively. The slope effect of the current curve due to the reflection of secondary electrons on the copper collector of the analyzer is discussed. The experimental analysis shows this RFEA has a good enough energy resolution to satisfy the requirements of beam measurement. - Highlights: • A new retarding field energy analyzer (RFEA) is designed to diagnose the electron beam of a K-band space TWT. • The current density distribution of the direct-current beam is determined. • The reflection effect of secondary electrons on the copper collector of the analyzer is discussed.

  13. Analysis and discussion on the experimental data of electrolyte analyzer

    Science.gov (United States)

    Dong, XinYu; Jiang, JunJie; Liu, MengJun; Li, Weiwei

    2018-06-01

    During the subsequent verification of electrolyte analyzers, we found that an instrument can achieve good repeatability and stability in repeated measurements within a short period of time, in line with the verification regulation's requirements on linearity error and cross-contamination rate. However, large indication errors are very common, and the measurement results from different manufacturers differ greatly. In order to find and solve this problem, help enterprises improve product quality, and obtain accurate and reliable measurement data, we conducted an experimental evaluation of electrolyte analyzers and analyzed the data statistically.
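
    The statistical evaluation described above typically reduces to two quantities per analyte: repeatability (the standard deviation of replicate readings) and indication error (the mean reading minus the nominal value of the standard solution). A minimal sketch with invented sodium readings (no numbers from the record are reproduced):

```python
import statistics

def evaluate(readings, nominal):
    """Return (indication error, repeatability SD) for replicate measurements."""
    mean = statistics.mean(readings)
    return mean - nominal, statistics.stdev(readings)

# Invented replicate Na+ readings (mmol/L) against a 140.0 mmol/L standard
readings = [141.2, 141.0, 141.3, 141.1, 141.4]
error, sd = evaluate(readings, 140.0)
print(round(error, 2), round(sd, 3))
```

    Readings like these illustrate the paper's finding: the tight standard deviation passes the repeatability check while the systematic +1.2 mmol/L offset is exactly the kind of indication error the authors found to be common.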

  14. Transit time spreads in biased paracentric hemispherical deflection analyzers

    International Nuclear Information System (INIS)

    Sise, Omer; Zouros, Theo J.M.

    2016-01-01

    The biased paracentric hemispherical deflection analyzers (HDAs) are an alternative to conventional (centric) HDAs maintaining greater dispersion, lower angular aberrations, and hence better energy resolution without the use of any additional fringing field correctors. In the present work, the transit time spread of the biased paracentric HDA is computed over a wide range of analyzer parameters. The combination of high energy resolution with good time resolution and simplicity of design makes the biased paracentric analyzers very promising for both coincidence and singles spectroscopy applications.

  15. Transit time spreads in biased paracentric hemispherical deflection analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Sise, Omer, E-mail: omersise@sdu.edu.tr [Dept. of Science Education, Faculty of Education, Suleyman Demirel Univ., 32260 Isparta (Turkey); Zouros, Theo J.M. [Dept. of Physics, Univ. of Crete, P.O. Box 2208, GR 71003 Heraklion (Greece); Tandem Lab, INPP, NCSR Demokritos, P.O. Box 60228, GR 15310 Ag. Paraskevi (Greece)

    2016-02-15

    The biased paracentric hemispherical deflection analyzers (HDAs) are an alternative to conventional (centric) HDAs maintaining greater dispersion, lower angular aberrations, and hence better energy resolution without the use of any additional fringing field correctors. In the present work, the transit time spread of the biased paracentric HDA is computed over a wide range of analyzer parameters. The combination of high energy resolution with good time resolution and simplicity of design makes the biased paracentric analyzers very promising for both coincidence and singles spectroscopy applications.

  16. Transit time spreads in biased paracentric hemispherical deflection analyzers

    Science.gov (United States)

    Sise, Omer; Zouros, Theo J. M.

    2016-02-01

    The biased paracentric hemispherical deflection analyzers (HDAs) are an alternative to conventional (centric) HDAs maintaining greater dispersion, lower angular aberrations, and hence better energy resolution without the use of any additional fringing field correctors. In the present work, the transit time spread of the biased paracentric HDA is computed over a wide range of analyzer parameters. The combination of high energy resolution with good time resolution and simplicity of design makes the biased paracentric analyzers very promising for both coincidence and singles spectroscopy applications.

  17. Emergency response training with the BNL plant analyzer

    International Nuclear Information System (INIS)

    Cheng, H.S.; Guppy, J.G.; Mallen, A.N.; Wulff, W.

    1987-01-01

    Presented is the experience in the use of the BNL Plant Analyzer for NRC emergency response training to simulated accidents in a BWR. The unique features of the BNL Plant Analyzer that are important for the emergency response training are summarized. A closed-loop simulation of all the key systems of a power plant in question was found essential to the realism of the emergency drills conducted at NRC. The faster than real-time simulation speeds afforded by the BNL Plant Analyzer have demonstrated its usefulness for the timely conduct of the emergency response training

  18. Intermittency in multiparticle production analyzed by means of stochastic theories

    International Nuclear Information System (INIS)

    Bartl, A.; Suzuki, N.

    1990-01-01

    Intermittency in multiparticle production is described by means of probability distributions derived from pure birth stochastic equations. The UA1, TASSO, NA22 and cosmic ray data are analyzed. 24 refs., 1 fig. (Authors)

  19. Automated Real-Time Clearance Analyzer (ARCA), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The Automated Real-Time Clearance Analyzer (ARCA) addresses the future safety need for Real-Time System-Wide Safety Assurance (RSSA) in aviation and progressively...

  20. Triple Isotope Water Analyzer for Extraplanetary Studies, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Los Gatos Research (LGR) proposes to employ Off-Axis ICOS to develop triple-isotope water analyzers for lunar and other extraplanetary exploration. This instrument...

  1. Analyzing Software Errors in Safety-Critical Embedded Systems

    Science.gov (United States)

    Lutz, Robyn R.

    1994-01-01

    This paper analyzes the root causes of safety-related software faults. Faults identified as potentially hazardous to the system are distributed somewhat differently over the set of possible error causes than non-safety-related software faults.

  2. Mini Total Organic Carbon Analyzer (miniTOCA)

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this development is to create a prototype hand-held, 1 to 2 liter size battery-powered Total Organic Carbon Analyzer (TOCA). The majority of...

  3. The quality infrastructure measuring, analyzing, and improving library services

    CERN Document Server

    Murphy, Sarah Anne

    2013-01-01

    Summarizing specific tools for measuring service quality alongside tips for using these tools most effectively, this book helps libraries of all kinds take a programmatic approach to measuring, analyzing, and improving library services.

  4. Josephson junction spectrum analyzer for millimeter and submillimeter wavelengths

    International Nuclear Information System (INIS)

    Larkin, S.Y.; Anischenko, S.E.; Khabayev, P.V.

    1994-01-01

    A prototype of the Josephson-effect spectrum analyzer developed for the millimeter-wave band is described. The measurement results for spectra obtained in the frequency band from 50 to 250 GHz are presented

  5. Analyzing radial acceleration with a smartphone acceleration sensor

    Science.gov (United States)

    Vogt, Patrik; Kuhn, Jochen

    2013-03-01

    This paper continues the sequence of experiments using the acceleration sensor of smartphones (for description of the function and the use of the acceleration sensor, see Ref. 1) within this column, in this case for analyzing the radial acceleration.
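The quantity such a smartphone experiment measures is the centripetal (radial) acceleration a_r = omega^2 * r = (2*pi*f)^2 * r. A minimal sketch of the expected value for a phone on a rotating disk; the turntable frequency and radius are illustrative choices, not values from the paper.

```python
# Expected radial (centripetal) acceleration for a phone fixed to a rotating
# disk: a_r = omega^2 * r = (2*pi*f)^2 * r. Frequency and radius below are
# illustrative assumptions.

import math

def radial_acceleration(freq_hz, radius_m):
    """Centripetal acceleration (m/s^2) at rotation frequency f and radius r."""
    omega = 2.0 * math.pi * freq_hz   # angular velocity in rad/s
    return omega ** 2 * radius_m

# A 33 1/3 rpm turntable with the acceleration sensor 0.10 m from the axis
f = 33.3333 / 60.0                    # revolutions per second
a_r = radial_acceleration(f, 0.10)
print(f"expected a_r = {a_r:.2f} m/s^2")
```

Comparing this predicted value with the sensor's recorded radial component is the core of the classroom experiment.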

  6. The Photo-Pneumatic CO2 Analyzer, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We are proposing to build a new technology, the photo-pneumatic analyzer. It is small, solid-state, inexpensive, and appropriate for observations of atmospheric...

  7. Josephson junction spectrum analyzer for millimeter and submillimeter wavelengths

    Energy Technology Data Exchange (ETDEWEB)

    Larkin, S.Y.; Anischenko, S.E.; Khabayev, P.V. [State Research Center, Kiev (Ukraine)

    1994-12-31

    A prototype of the Josephson-effect spectrum analyzer developed for the millimeter-wave band is described. The measurement results for spectra obtained in the frequency band from 50 to 250 GHz are presented.

  8. Methyl-Analyzer--whole genome DNA methylation profiling.

    Science.gov (United States)

    Xin, Yurong; Ge, Yongchao; Haghighi, Fatemeh G

    2011-08-15

    Methyl-Analyzer is a python package that analyzes genome-wide DNA methylation data produced by the Methyl-MAPS (methylation mapping analysis by paired-end sequencing) method. Methyl-MAPS is an enzymatic-based method that uses both methylation-sensitive and -dependent enzymes covering >80% of CpG dinucleotides within mammalian genomes. It combines enzymatic-based approaches with high-throughput next-generation sequencing technology to provide whole genome DNA methylation profiles. Methyl-Analyzer processes and integrates sequencing reads from methylated and unmethylated compartments and estimates CpG methylation probabilities at single base resolution. Methyl-Analyzer is available at http://github.com/epigenomics/methylmaps. Sample dataset is available for download at http://epigenomicspub.columbia.edu/methylanalyzer_data.html. fgh3@columbia.edu Supplementary data are available at Bioinformatics online.
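The Methyl-Analyzer implementation itself is not reproduced here, but the underlying idea, estimating a per-CpG methylation probability from read counts in the methylated and unmethylated compartments, can be sketched as below; the pseudocount smoothing and the site counts are assumptions of this sketch.

```python
# Minimal sketch of per-CpG methylation probability estimation from read
# counts in methylated vs. unmethylated sequencing compartments. This is NOT
# the Methyl-Analyzer code; the Laplace pseudocount is an assumption here.

def methylation_probability(meth_reads, unmeth_reads, pseudocount=1.0):
    """Smoothed estimate of P(methylated) at a single CpG site."""
    total = meth_reads + unmeth_reads + 2.0 * pseudocount
    return (meth_reads + pseudocount) / total

# Hypothetical (methylated, unmethylated) read counts at three CpG sites
sites = [(18, 2), (0, 0), (5, 5)]
probs = [methylation_probability(m, u) for m, u in sites]
print([round(p, 3) for p in probs])   # an uncovered site falls back to 0.5
```

The pseudocount keeps the estimate defined at sites with no coverage, at the cost of pulling all estimates slightly toward 0.5.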

  9. Analyzed method for calculating the distribution of electrostatic field

    International Nuclear Information System (INIS)

    Lai, W.

    1981-01-01

    An analytical method for calculating the distribution of the electrostatic field under any given axial gradient in tandem accelerators is described. This method achieves satisfactory accuracy compared with the results of numerical calculation

  10. Generalized empirical likelihood methods for analyzing longitudinal data

    KAUST Repository

    Wang, S.; Qian, L.; Carroll, R. J.

    2010-01-01

    Efficient estimation of parameters is a major objective in analyzing longitudinal data. We propose two generalized empirical likelihood based methods that take into consideration within-subject correlations. A nonparametric version of the Wilks

  11. NRC nuclear-plant-analyzer concept and status at INEL

    International Nuclear Information System (INIS)

    Aguilar, F.; Wagner, R.J.

    1982-01-01

    The Office of Research of the US NRC has proposed development of a software-hardware system called the Nuclear Plant Analyzer (NPA). This paper describes how we at the INEL envision the nuclear-plant analyzer. The paper also describes a pilot RELAP5 plant-analyzer project completed during the past year and current work. A great deal of analysis is underway to determine nuclear-steam-system response. Because system transient analysis is so complex, analytical results must be presented in a way that makes the interconnections among phenomena and all the nuances of a transient apparent. The analyst also needs to dynamically control system calculations to simulate plant operation for what-if studies, and to perform system analysis within hours of a plant emergency in order to diagnose the state of the stricken plant and formulate recovery actions. The NRC-proposed nuclear-plant analyzer can meet these needs

  12. AmAMorph: Finite State Morphological Analyzer for Amazighe

    Directory of Open Access Journals (Sweden)

    Fatima Zahra Nejme

    2016-03-01

    Full Text Available This paper presents AmAMorph, a morphological analyzer for the Amazighe language built with the NooJ linguistic development environment. The paper begins with the development of large-coverage formalized Amazighe lexicons. The electronic lexicons, named ‘NAmLex’, ‘VAmLex’ and ‘PAmLex’, which stand for ‘Noun Amazighe Lexicon’, ‘Verb Amazighe Lexicon’ and ‘Particles Amazighe Lexicon’, link inflectional, morphological, and syntactic-semantic information to the list of lemmas. Automated inflectional and derivational routines are applied to each lemma, producing the inflected forms. To our knowledge, AmAMorph is the first morphological analyzer for Amazighe. It identifies the component morphemes of word forms using large-coverage morphological grammars. Along with a description of how the analyzer is implemented, this paper gives an evaluation of the analyzer.

  13. Radiometric flow injection analysis with an ASIA (Ismatec) analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Myint, U; Win, N; San, K; Han, B; Myoe, K M [Yangon Univ. (Myanmar). Dept. of Chemistry; Toelgyessy, J [Slovak Technical Univ., Bratislava (Slovakia). Dept. of Environmental Science

    1994-07-01

    Radiometric Flow Injection Analysis of a radioactive ([sup 131]I) sample is described. For analysis an ASIA (Ismatec) analyzer with a NaI(Tl) scintillation detector was used. (author) 5 refs.; 3 figs.

  14. Multisensor Analyzed Sea Ice Extent - Northern Hemisphere (MASIE-NH)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Multisensor Analyzed Sea Ice Extent Northern Hemisphere (MASIE-NH) products provide measurements of daily sea ice extent and sea ice edge boundary for the...

  15. Quality Performance of Drugs Analyzed in the Drug Analysis and ...

    African Journals Online (AJOL)

    ICT TEAM

    performance of drug samples analyzed therein. Previous reports have ... wholesalers, non-governmental organizations, hospitals, analytical ..... a dispute concerning discharge of waste water ... Healthcare Industry in Kenya, December. 2008.

  16. Generating and analyzing non-diffracting vector vortex beams

    CSIR Research Space (South Africa)

    Li, Y

    2013-08-01

    Full Text Available A single-order Bessel beam and superposition cases are studied. The polarization and the azimuthal modes of the generated beams are analyzed. The results of modal decompositions on polarization components are in good agreement with theory. We demonstrate...

  17. A Framework for Modeling and Analyzing Complex Distributed Systems

    National Research Council Canada - National Science Library

    Lynch, Nancy A; Shvartsman, Alex Allister

    2005-01-01

    Report developed under STTR contract for topic AF04-T023. This Phase I project developed a modeling language and laid a foundation for computational support tools for specifying, analyzing, and verifying complex distributed system designs...

  18. Analyzing Spread of Influence in Social Networks for Transportation Applications

    Science.gov (United States)

    2016-09-02

    This project analyzed the spread of influence in social media, in particular, the Twitter social media site, and identified the individuals who exert the most influence to those they interact with. There are published studies that use social media to...

  19. Analyzing Spread of Influence in Social Networks for Transportation Application.

    Science.gov (United States)

    2016-09-02

    This project analyzed the spread of influence in social media, in particular, the Twitter social media site, and identified the individuals who exert the most influence to those they interact with. There are published studies that use social media to...

  20. Airspace Analyzer for Assessing Airspace Directional Permeability, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We build a software tool which enables the user (airline or Air Traffic Service Provider (ATSP)) the ability to analyze the flight-level-by-flight-level permeability...

  1. Modelling of temperature distribution and temperature pulsations in elements of fast breeder reactor

    International Nuclear Information System (INIS)

    Sorokin, A.P.; Bogoslovskaia, G.P.; Ushakov, P.A.; Zhukov, A.V.; Ivanov, Eu.F.; Matjukhin, N.M.

    2004-01-01

    From a thermophysical point of view, the integrated configuration of a liquid metal cooled reactor has some limitations. The large volume of the mixing chamber causes complex thermal hydraulic behavior in such facilities. This volume is also responsible for large-scale eddies in the coolant, the existence of stagnant areas and flow stratification, and the occurrence of temperature non-uniformities and pulsations of coolant and structure temperatures. Temperature non-uniformities and temperature pulsations depend heavily on even small variations in reactor core design. The paper presents some results on the modeling of thermal hydraulic processes occurring in liquid metal cooled reactors. The behavior of the following parameters is discussed: temperature non-uniformities at the core outlet and the related temperature pulsations; temperature pulsations due to the mixing of sodium jets at different temperatures; temperature pulsations arising if a part of a loop (circuit) is shut off; and temperature non-uniformities and pulsations during transients and during the transition to natural convection cooling. The paper also considers the modeling of temperature behavior in a compact arrangement of fast reactor fuel pins using water as the modeling liquid. A further discussion concerns an experimental method for modeling liquid metal mixing with the use of air, based on a freon tracer technique. The results of the simulations of the thermal hydraulic processes mentioned above have been analyzed, which will allow the main lines of the study to be determined and conclusions to be drawn regarding the temperature behavior in fast reactor units. (author)

  2. Giessen polarization facility. III. Multi-detector analyzing system

    Energy Technology Data Exchange (ETDEWEB)

    Krause, H H; Stock, R; Arnold, W; Berg, H; Huttel, E; Ulbricht, J; Clausnitzer, G [Giessen Univ. (Germany, F.R.). Strahlenzentrum

    1977-06-15

    An analyzing system with a PDP 11 computer and a digital multiplexer is described. It allows signals from 16 detectors with individual ADCs to be accepted simultaneously. For measurements of analyzing powers, the polarization of the ion beam can be switched to zero with a frequency of 1 kHz. The switching operation additionally controls the handling of the detector pulses. The software contains special programs for the analysis of polarization experiments.

  3. APPROACHES TO ANALYZE THE QUALITY OF ROMANIAN TOURISM WEB SITES

    Directory of Open Access Journals (Sweden)

    Lacurezeanu Ramona

    2013-07-01

    The purpose of our work is to analyze travel web-sites, more exactly, whether the criteria used to analyze virtual stores are also adequate for the Romanian tourism product. Following the study, we concluded that the Romanian online tourism web-sites for the Romanian market have the features that we found listed on similar web-sites of France, England, Germany, etc. In conclusion, online Romanian tourism can be considered one of the factors of economic growth.

  4. Tests of the Royce ultrasonic interface level analyzer

    International Nuclear Information System (INIS)

    WITWER, K.S.

    1999-01-01

    This document describes testing carried out in 1995 on the Royce Interface Level Analyzer. The testing was carried out in the 305 Bldg., Engineering Testing Laboratory, 300 Area. The Level Analyzer was shown to be able to effectively locate the solid-liquid interface layer of two different simulants under various conditions, and was able to do so after being irradiated with over 5 million rads of gamma radiation from a Cobalt-60 source

  5. Evaluation of haematology analyzer CELL-DYN 3700 SL

    Directory of Open Access Journals (Sweden)

    Enver Suljević

    2003-05-01

    Full Text Available Research on the parameters of the full blood count and differential white blood count is included in the program of all medical laboratories at the primary, secondary and tertiary health care levels. Today, all haematological tests are performed exclusively on haematology analyzers. The automation of haematology laboratories is a result of the large demand for haematological testing, the need for timely issuing of haematological findings, and the possibilities offered by modern techniques. This work is an evaluation of the laser haematology analyzer Cell-Dyn 3700 SL. It investigates the reliability of test results through the following parameters: precision, accuracy, sensitivity and specificity of the determination methods. It also explores the influence of sample transfer and the correlation with the haematology analyzer MAXM Retti. The haematology parameters investigated are: white blood cells (WBC), neutrophils (NEU), lymphocytes (LYM), monocytes (MONO), eosinophils (EOS), basophils (BASO), red blood cells (RBC), haemoglobin (HGB), haematocrit (HCT), mean corpuscular volume (MCV), mean corpuscular haemoglobin (MCH), mean corpuscular haemoglobin concentration (MCHC), red cell distribution width (RDW), platelets (PLT), mean platelet volume (MPV), plateletcrit (PCT), and platelet distribution width (PDW). The results confirm that the precision of the analyzer fulfils the reproducibility requirements for the parameters WBC, RBC, HGB, MCV, MCH, MCHC, and PLT. The correlation coefficient values (r) obtained through statistical analysis, that is, the linear regression results obtained by comparing the two analyzers, are adequate except for MCHC (r = 0.64), which is in accordance with literature data. Accuracy was tested by the haematology analyzer method and by the microscopic differentiation method; the correlation coefficients for granulocytes, lymphocytes and monocytes indicate the accuracy of the methods.
    Sensitivity and specificity parameters fulfil the analytical criteria. It is confirmed that the haematology analyzer Cell-Dyn 3700 SL is reliable for

  6. Magnetic systems for wide-aperture neutron polarizers and analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Gilev, A.G. [Neutron Research Department, Petersburg Nuclear Physics Institute, NRC “Kurchatov Institute”, Orlova Roscha, Gatchina, St. Petersburg 188300 (Russian Federation); Pleshanov, N.K., E-mail: pnk@pnpi.spb.ru [Neutron Research Department, Petersburg Nuclear Physics Institute, NRC “Kurchatov Institute”, Orlova Roscha, Gatchina, St. Petersburg 188300 (Russian Federation); Bazarov, B.A.; Bulkin, A.P.; Schebetov, A.F. [Neutron Research Department, Petersburg Nuclear Physics Institute, NRC “Kurchatov Institute”, Orlova Roscha, Gatchina, St. Petersburg 188300 (Russian Federation); Syromyatnikov, V.G. [Neutron Research Department, Petersburg Nuclear Physics Institute, NRC “Kurchatov Institute”, Orlova Roscha, Gatchina, St. Petersburg 188300 (Russian Federation); Physical Department, St. Petersburg State University, Ulyanovskaya, 1, Petrodvorets, St. Petersburg 198504 (Russian Federation); Tarnavich, V.V.; Ulyanov, V.A. [Neutron Research Department, Petersburg Nuclear Physics Institute, NRC “Kurchatov Institute”, Orlova Roscha, Gatchina, St. Petersburg 188300 (Russian Federation)

    2016-10-11

    Requirements on the field uniformity in neutron polarizers are analyzed in view of the fact that neutron polarizing coatings have been improved during the past decade. The design of magnetic systems that meet new requirements is optimized by numerical simulations. Magnetic systems for wide-aperture multichannel polarizers and analyzers are represented, including (a) the polarizer to be built at channel 4-4′ of the reactor PIK (Gatchina, Russia) for high-flux experiments with a 100×150 mm{sup 2} beam of polarized cold neutrons; (b) the fan analyzer covering a 150×100 mm{sup 2} window of the detector at the Magnetism Reflectometer (SNS, ORNL, USA); (c) the polarizer and (d) the fan analyzer covering a 220×110 mm{sup 2} window of the detector at the reflectometer NERO, which is transferred to PNPI (Russia) from HZG (Germany). Deviations of the field from the vertical did not exceed 2°. The polarizing efficiency of the analyzer at the Magnetism Reflectometer reached 99%, a record level for wide-aperture supermirror analyzers.

  7. Temperature indicating device

    International Nuclear Information System (INIS)

    Angus, J.P.; Salt, D.

    1988-01-01

    A temperature indicating device comprises a plurality of planar elements, some undergoing a reversible change in appearance at a given temperature, the remainder undergoing an irreversible change in appearance at a given temperature. The device is useful in indicating the temperature which an object has achieved as well as its actual temperature. The reversible change is produced by liquid crystal devices. The irreversible change is produced by an absorbent surface carrying substances, e.g. waxes, which melt at predetermined temperatures and are absorbed by the surface; alternatively, paints may be used. The device is used for monitoring processes of encapsulation of radioactive waste. (author)

  8. High temperature estimation through computer vision

    International Nuclear Information System (INIS)

    Segovia de los R, J.A.

    1996-01-01

    Pattern recognition aims, among other purposes, to devise and analyze classification algorithms applied to representations of images, sounds or signals of any kind. In a thermal plasma reactor process, conventional devices and methods cannot be employed to measure the very high temperatures involved. The goal of this work was to determine these temperatures in an indirect way. (Author)

  9. Maine River Temperature Monitoring

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — We collect seasonal and annual temperature measurements on an hourly or quarter hourly basis to monitor habitat suitability for ATS and other species. Temperature...

  10. GISS Surface Temperature Analysis

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The GISTEMP dataset is a global 2x2 gridded temperature anomaly dataset. Temperature data is updated around the middle of every month using current data files from...

  11. Supersymmetry at high temperatures

    International Nuclear Information System (INIS)

    Das, A.; Kaku, M.

    1978-01-01

    We investigate the properties of Green's functions in a spontaneously broken supersymmetric model at high temperatures. We show that, even at high temperatures, we do not get restoration of supersymmetry, at least in the one-loop approximation

  12. Identify the dominant variables to predict stream water temperature

    Science.gov (United States)

    Chien, H.; Flagler, J.

    2016-12-01

    Stream water temperature is a critical variable controlling water quality and the health of aquatic ecosystems. Accurate prediction of water temperature and assessment of the impacts of environmental variables on its variation are essential for water resources management, particularly in the context of water quality and aquatic ecosystem sustainability. The objective of this study is to measure stream water temperature and air temperature and to examine the importance of streamflow for stream water temperature prediction. The measured stream water and air temperatures will be used to test two hypotheses: 1) streamflow is a relatively more important factor than air temperature in regulating water temperature, and 2) by combining air temperature and streamflow data, stream water temperature can be estimated more accurately. Water and air temperature data loggers are placed at two USGS stream gauge stations, #01362357 and #01362370, located in the upper Esopus Creek watershed in Phoenicia, NY. The ARIMA (autoregressive integrated moving average) time series model is used to analyze the measured water temperature data, identify the dominant environmental variables, and predict the water temperature with the identified dominant variable. The preliminary results show that streamflow is not a significant variable in predicting stream water temperature at either USGS gauge station. Daily mean air temperature is sufficient to predict stream water temperature at this site scale.
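The conclusion that daily mean air temperature alone suffices at this site scale can be illustrated with a plain ordinary least-squares fit of water temperature on air temperature; the temperature series below are illustrative placeholders, not the USGS gauge data.

```python
# Ordinary least-squares fit of daily stream water temperature on daily mean
# air temperature (the single predictor found sufficient at this site scale).
# The temperature series below are illustrative, not the USGS data.

def ols_fit(x, y):
    """Return (slope, intercept) minimizing the sum of squared residuals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

air   = [5.0, 10.0, 15.0, 20.0, 25.0]   # daily mean air temperature (deg C)
water = [6.0,  9.5, 13.0, 16.5, 20.0]   # stream water temperature (deg C)

slope, intercept = ols_fit(air, water)
pred = slope * 18.0 + intercept         # predicted water temp at 18 C air
print(f"T_water = {slope:.2f} * T_air + {intercept:.2f}")
```

A slope below 1 is typical of such regressions, since streams damp air-temperature swings; adding streamflow as a second regressor would extend this to the study's first hypothesis.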

  13. The comparison of automated urine analyzers with manual microscopic examination for urinalysis automated urine analyzers and manual urinalysis

    OpenAIRE

    İnce, Fatma Demet; Ellidağ, Hamit Yaşar; Köseoğlu, Mehmet; Şimşek, Neşe; Yalçın, Hülya; Zengin, Mustafa Osman

    2016-01-01

    Objectives: Urinalysis is one of the most commonly performed tests in the clinical laboratory. However, manual microscopic sediment examination is labor-intensive, time-consuming, and lacks standardization in high-volume laboratories. In this study, the concordance of analyses between manual microscopic examination and two different automatic urine sediment analyzers has been evaluated. Design and methods: 209 urine samples were analyzed by the Iris iQ200 ELITE (Iris Diagnostics, USA), Dirui...

  14. Thermochemical stability of Li-Cu-O ternary compounds stable at room temperature analyzed by experimental and theoretical methods

    Energy Technology Data Exchange (ETDEWEB)

    Lepple, Maren [Karlsruhe Institute of Technology, Eggenstein-Leopoldshafen (Germany). Inst. for Applied Materials - Applied Materials Physics; Technische Univ. Darmstadt (Germany). Eduard-Zintl-Inst. of Inorganic and Physical Chemistry; Rohrer, Jochen; Albe, Karsten [Technische Univ. Darmstadt (Germany). Fachgebiet Materialmodellierung; Adam, Robert; Rafaja, David [Technical Univ. Freiberg (Germany). Inst. of Materials Science; Cupid, Damian M. [Karlsruhe Institute of Technology, Eggenstein-Leopoldshafen (Germany). Inst. for Applied Materials - Applied Materials Physics; Austrian Institute of Technology GmbH, Vienna (Austria). Center for Low-Emission Transport TECHbase; Seifert, Hans J. [Karlsruhe Institute of Technology, Eggenstein-Leopoldshafen (Germany). Inst. for Applied Materials - Applied Materials Physics

    2017-11-15

    Compounds in the Li-Cu-O system are of technological interest due to their electrochemical properties, which make them attractive as electrode materials, e.g., in future lithium-ion batteries. In order to select promising compositions for such applications, reliable thermochemical data are a prerequisite. Although various groups have investigated individual ternary phases using different experimental setups, up to now no systematic study of all relevant phases is available in the literature. In this study, we combine drop solution calorimetry with density functional theory calculations to systematically investigate the thermodynamic properties of ternary Li-Cu-O phases. In particular, we present a consistently determined set of enthalpies of formation, Gibbs energies and heat capacities for LiCuO, Li{sub 2}CuO{sub 2} and LiCu{sub 2}O{sub 2} and compare our results with existing literature.

  15. Supersymmetry at finite temperature

    International Nuclear Information System (INIS)

    Clark, T.E.; Love, S.T.

    1983-01-01

    Finite-temperature supersymmetry (SUSY) is characterized by unbroken Ward identities for SUSY variations of ensemble averages of Klein-operator inserted imaginary time-ordered products of fields. Path-integral representations of these products are defined and the Feynman rules in superspace are given. The finite-temperature no-renormalization theorem is derived. Spontaneously broken SUSY at zero temperature is shown not to be restored at high temperature. (orig.)

  16. Room temperature superconductors

    International Nuclear Information System (INIS)

    Sleight, A.W.

    1995-01-01

    If the Holy Grail of room temperature superconductivity could be achieved, the impact could be enormous. However, a useful room temperature superconductor for most applications must possess a Tc somewhat above room temperature and must be capable of sustaining superconductivity in the presence of magnetic fields while carrying a significant current load. The authors return to the subject of just what characteristics one might seek in a compound for it to be a room temperature superconductor. 30 refs., 3 figs., 1 tab

  17. Applications of Electronstatic Lenses to Electron Gun and Energy Analyzers

    International Nuclear Information System (INIS)

    Sise, O.

    2004-01-01

    Focal properties and geometries are given for several types of electrostatic lens systems commonly needed in electron impact studies. One type is an electron gun, which focuses electrons over a wide range of energies onto a fixed point, such as a target; the other is an analyzer system, which focuses scattered electrons of variable energy onto a fixed position, such as the entrance plane of an analyzer. There are many different types and geometries of these lenses for controlling and focusing electron beams. In this presentation we discuss the criteria used for the design of the electrostatic lenses associated with the electron gun and energy analyzers, and determine the fundamental relationships between the operation and behaviour of multi-element electrostatic lenses containing five, six and seven elements. Focusing of the electron beam is achieved by applying suitable voltages to the series of lens elements. The design of the lens system for the electron gun was based on the requirements that the beam at the target have a small spot size and zero beam angle, that is, the afocal mode. For the energy analyzer systems, we considered the entrance of the hemispherical analyzer, which determines the energy of the electron beam, and discussed the focusing condition of this lens system

  18. Health Services Cost Analyzing in Tabriz Health Centers 2008

    Directory of Open Access Journals (Sweden)

    Massumeh gholizadeh

    2015-08-01

    Full Text Available Background and objectives: Health services cost analysis is an important management tool for evidence-based decision making in the health system. This study was conducted with the purpose of analyzing costs and identifying the proportion of different factors in the total cost of the health services provided in urban health centers in Tabriz. Material and Methods: This was a descriptive and analytic study. The Activity Based Costing (ABC) method was used for the cost analysis. This cross-sectional survey analyzed and identified the proportion of different factors in the total cost of the health services provided in Tabriz urban health centers. The statistical population of this study comprised the urban community health centers in Tabriz. A multi-stage sampling method was used to collect data, and Excel software was used for data analysis. The results are described with tables and graphs. Results: The study results showed the portion of different factors in various health services. Human factors at 58%, physical space at 8%, and medical equipment at 1.3% accounted for the largest identified portions of the expenditures and costs of health services in Tabriz urban health centers. Conclusion: Since human factors accounted for the highest portion of health service costs and expenditures in Tabriz urban health centers, balancing workload with staff numbers, institutionalizing performance-based management, and using multidisciplinary staff may lead to reduced costs of services.
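The percentage figures such an ABC analysis reports are each cost pool's share of the total cost. A minimal sketch of that calculation; the cost-pool names and values below are hypothetical, chosen only so the human-resources share lands near the 58% reported in the abstract.

```python
# Sketch of the cost-share step behind activity-based costing (ABC): each
# input factor's percentage of the total cost of a service. The cost pools
# below are hypothetical illustrative values.

def cost_shares(cost_pools):
    """Map each cost pool to its percentage of the total cost."""
    total = sum(cost_pools.values())
    return {name: 100.0 * cost / total for name, cost in cost_pools.items()}

pools = {                        # illustrative annual costs for one center
    "human resources":   580_000,
    "physical space":     80_000,
    "medical equipment":  13_000,
    "other inputs":      327_000,
}
shares = cost_shares(pools)
print({name: round(pct, 1) for name, pct in shares.items()})
```

In a full ABC study each pool would first be traced to activities via cost drivers (staff time, floor area, machine hours); the share calculation above is the final summarizing step.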

  19. Test of a two-dimensional neutron spin analyzer

    International Nuclear Information System (INIS)

    Falus, Peter; Vorobiev, Alexei; Krist, Thomas

    2006-01-01

    The aim of this measurement was to test the new large-area spin polarization analyzer for the EVA-SERGIS beamline at the Institut Laue-Langevin (ILL). The spin analyzer, which was built in Berlin, selects one of the two spin states of a neutron beam of wavelength 5.5 Å impinging on a horizontal sample and reflected or scattered from the sample. The spin is analyzed for all neutrons scattered into a detector with an area of 190 mm×190 mm positioned 2.7 m behind the sample, thus covering an angular interval of 4°×4°. The tests were done at the HMI V14 beamline, followed by tests at the EVA beamline at ILL. The transmission for the two spin components, the flipping ratio and the small-angle scattering were recorded while scanning the incoming beam on the analyzer. It was clearly visible that, due to the stacked construction, the intensity is blocked at regular intervals. Careful inspection shows that the transmission of the good spin component is more than 0.72 for 60% of the detector area and the corrected flipping ratio is more than 47 for 60% of the detector area. Although some small-angle scattering is visible, it is notable that this analyzer design has small scattering intensities.

  20. Test of a two-dimensional neutron spin analyzer

    Science.gov (United States)

    Falus, Péter; Vorobiev, Alexei; Krist, Thomas

    2006-11-01

    The aim of this measurement was to test the new large-area spin polarization analyzer for the EVA-SERGIS beamline at the Institut Laue-Langevin (ILL). The spin analyzer, which was built in Berlin, selects one of the two spin states of a neutron beam of wavelength 5.5 Å impinging on a horizontal sample and reflected or scattered from the sample. The spin is analyzed for all neutrons scattered into a detector with an area of 190 mm×190 mm positioned 2.7 m behind the sample, thus covering an angular interval of 4°×4°. The tests were done at the HMI V14 beamline, followed by tests at the EVA beamline at ILL. The transmission for the two spin components, the flipping ratio and the small-angle scattering were recorded while scanning the incoming beam on the analyzer. It was clearly visible that, due to the stacked construction, the intensity is blocked at regular intervals. Careful inspection shows that the transmission of the good spin component is more than 0.72 for 60% of the detector area and the corrected flipping ratio is more than 47 for 60% of the detector area. Although some small-angle scattering is visible, it is notable that this analyzer design has small scattering intensities.
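
    As a side note on the figures quoted above: a measured flipping ratio R can be converted to an effective polarizing efficiency via the idealized relation P = (R − 1)/(R + 1), which neglects flipper inefficiency and background. The conversion below is an illustration, not part of the paper.

```python
# Idealized relation between flipping ratio and polarizing efficiency:
#   P = (R - 1) / (R + 1)
# Assumes a perfect spin flipper and no background (illustrative only).

def polarizing_efficiency(R):
    """Effective polarizing efficiency for a measured flipping ratio R."""
    return (R - 1) / (R + 1)

# The corrected flipping ratio of 47 quoted for 60% of the detector area:
print(round(polarizing_efficiency(47), 3))  # → 0.958
```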

  1. Pseudocode Interpreter (Pseudocode Integrated Development Environment with Lexical Analyzer and Syntax Analyzer using Recursive Descent Parsing Algorithm

    Directory of Open Access Journals (Sweden)

    Christian Lester D. Gimeno

    2017-11-01

    Full Text Available – This research study focused on the development of software that helps students design, write, validate and run their pseudocode in a semi-integrated development environment (IDE) instead of manually writing it on a piece of paper. Specifically, the study aimed to develop a lexical analyzer (lexer), a syntax analyzer (parser) using a recursive descent parsing algorithm, and an interpreter. The lexical analyzer reads the pseudocode source as a sequence of symbols or characters, grouping them into lexemes. The lexemes are matched against patterns for valid tokens, which are passed to the syntax analyzer. The syntax analyzer takes those valid tokens and builds meaningful commands, using the recursive descent parsing algorithm, in the form of an abstract syntax tree. The generation of the abstract syntax tree is based on the grammar rules created by the researcher, expressed in Extended Backus-Naur Form. The interpreter takes the generated abstract syntax tree and evaluates it to produce the pseudocode output. The software was evaluated using white-box testing by several ICT professionals and black-box testing by several computer science students, based on the International Organization for Standardization (ISO) 9126 software quality standards. The overall results of both the white-box and black-box evaluations were described as “Excellent” in terms of functionality, reliability, usability, efficiency, maintainability and portability.
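
    The lexer → parser → interpreter pipeline described above can be sketched in miniature. The following is a hypothetical toy, not the study's software: a regex-based lexer, a recursive descent parser with one method per grammar rule, and on-the-fly evaluation of arithmetic expressions under a toy EBNF grammar.

```python
import re

# Hypothetical minimal pipeline (not the study's actual software).
# Toy grammar, expressed in Extended Backus-Naur Form:
#   expr   = term , { ( "+" | "-" ) , term } ;
#   term   = factor , { ( "*" | "/" ) , factor } ;
#   factor = NUMBER | "(" , expr , ")" ;

TOKEN_RE = re.compile(r"\s*(?:(\d+(?:\.\d+)?)|([-+*/()]))")

def tokenize(src):
    """Lexer: scan the character stream into (kind, value) tokens."""
    tokens, pos = [], 0
    src = src.rstrip()
    while pos < len(src):
        m = TOKEN_RE.match(src, pos)
        if not m:
            raise SyntaxError(f"bad character at position {pos}")
        num, op = m.groups()
        tokens.append(("NUMBER", float(num)) if num else ("OP", op))
        pos = m.end()
    return tokens

class Parser:
    """Recursive descent parser: one method per grammar rule. Here each
    rule evaluates directly; building AST nodes instead is a small change."""

    def __init__(self, tokens):
        self.tokens, self.i = tokens, 0

    def peek(self):
        return self.tokens[self.i] if self.i < len(self.tokens) else (None, None)

    def expr(self):
        val = self.term()
        while self.peek() in (("OP", "+"), ("OP", "-")):
            op = self.tokens[self.i][1]
            self.i += 1
            val = val + self.term() if op == "+" else val - self.term()
        return val

    def term(self):
        val = self.factor()
        while self.peek() in (("OP", "*"), ("OP", "/")):
            op = self.tokens[self.i][1]
            self.i += 1
            val = val * self.factor() if op == "*" else val / self.factor()
        return val

    def factor(self):
        kind, val = self.peek()
        self.i += 1
        if kind == "NUMBER":
            return val
        if (kind, val) == ("OP", "("):
            inner = self.expr()
            if self.peek() != ("OP", ")"):
                raise SyntaxError("expected ')'")
            self.i += 1
            return inner
        raise SyntaxError(f"unexpected token {val!r}")

def interpret(src):
    """Interpreter: lex, parse and evaluate in one pass."""
    return Parser(tokenize(src)).expr()

print(interpret("2 + 3 * (4 - 1)"))  # → 11.0
```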

  2. Application of grey model on analyzing the passive natural circulation residual heat removal system of HTR-10

    Institute of Scientific and Technical Information of China (English)

    ZHOU Tao; PENG Changhong; WANG Zenghui; WANG Ruosu

    2008-01-01

    Using grey correlation analysis, it can be concluded that the reactor pressure vessel wall temperature has the strongest effect on the passive residual heat removal system of the HTR (High Temperature gas-cooled Reactor), that the chimney height takes second place, and that the influence of the chimney inlet air temperature is the least. This conclusion agrees with that obtained by the traditional analysis method. According to grey model theory, GM(1,1) and GM(1,3) models are built based on the chimney inlet air temperature, the pressure vessel temperature and the chimney height, and the effect of these three factors on the heat removal power is studied in this paper. The model plays an important role in data prediction and offers a new method for studying the heat removal power, providing a new theoretical analysis of the passive residual heat removal system of the HTR.
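
    For illustration, a minimal GM(1,1) fit-and-forecast can be sketched as follows. This is not the authors' code, and the paper's GM(1,3) extension additionally takes two driving series; the input numbers below are made up.

```python
import numpy as np

def gm11_predict(x0, n_ahead=2):
    """Fit a GM(1,1) grey model to a short positive series x0 and forecast
    n_ahead further points. The model fits dx1/dz + a*x1 = b by least
    squares, where x1 is the accumulated (AGO) series of x0."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                     # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])          # background values (neighbour means)
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(len(x0) + n_ahead)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    # inverse AGO: first differences recover the original-scale series
    return np.concatenate([[x1_hat[0]], np.diff(x1_hat)])

# Made-up, roughly geometric series: fitted values track it closely
print(gm11_predict([100, 110, 121, 133.1], n_ahead=2))
```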

  3. Digital temperature meter

    Energy Technology Data Exchange (ETDEWEB)

    Glowacki, S

    1982-01-01

    Digital temperature meter for precise temperature measurements is presented. Its parts such as thermostat, voltage-frequency converter and digital frequency meter are described. Its technical parameters such as temperature range 50°C-700°C, measurement precision 1°C, measurement error ±1°C are given. (A.S.).

  4. Rescaling Temperature and Entropy

    Science.gov (United States)

    Olmsted, John, III

    2010-01-01

    Temperature and entropy traditionally are expressed in units of kelvin and joule/kelvin. These units obscure some important aspects of the natures of these thermodynamic quantities. Defining a rescaled temperature using the Boltzmann constant, T' = k_B T, expresses temperature in energy units, thereby emphasizing the close relationship…
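
    A quick numeric illustration of the rescaling T' = k_B·T; the constants are standard physical values, not taken from the article.

```python
# Rescaled temperature T' = k_B * T expresses temperature in energy units.
K_B = 1.380649e-23          # Boltzmann constant, J/K (exact, 2019 SI)
J_PER_EV = 1.602176634e-19  # joules per electronvolt (exact)

def rescaled_temperature_eV(T_kelvin):
    """T' = k_B * T, converted from joules to electronvolts."""
    return K_B * T_kelvin / J_PER_EV

# Room temperature, 300 K, corresponds to about 25.9 meV:
print(round(rescaled_temperature_eV(300), 4))  # → 0.0259
```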

  5. High-temperature superconductivity

    International Nuclear Information System (INIS)

    Lynn, J.W.

    1990-01-01

    This book discusses developments in oxide materials with high superconducting transition temperatures. Systems with Tc well above liquid nitrogen temperature are already a reality, and higher Tc's are anticipated. The author discusses how the idea of a room-temperature superconductor appears to be a distinctly possible outcome of materials research.

  6. Analyzing microporosity with vapor thermogravimetry and gas pycnometry

    NARCIS (Netherlands)

    Dral, A. Petra; ten Elshof, Johan E.

    2018-01-01

    The complementary use of thermogravimetry and pycnometry is demonstrated to expand the toolbox for experimental analysis of micropores <1 nm. Thermogravimetry is employed to assess the uptake of water, methanol, ethanol, 1-propanol and cyclohexane vapors in microporous structures at room temperature and

  7. Analyzing Control Challenges for Thermal Energy Storage in Foodstuffs

    DEFF Research Database (Denmark)

    Hovgaard, Tobias Gybel; Larsen, Lars F. S.; Skovrup, Morten Juel

    2012-01-01

    foodstuffs make them behave differently when exposed to changes in air temperature. We present a novel analysis based on Biot and Fourier numbers for the different foodstuffs. This provides a valuable tool for determining how different items can be utilized in load-shifting schemes on different timescales...
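
    The Biot and Fourier numbers mentioned above can be sketched numerically. The property values here are rough, assumed figures for moist foodstuffs, not data from the paper; the point is how strongly the characteristic length separates the timescales.

```python
# Biot number Bi = h*L/k (surface convection vs internal conduction) and
# Fourier number Fo = alpha*t/L^2 (dimensionless conduction time).
# All property values are assumed, illustrative figures.

def biot(h, L, k):
    """Bi << 1: the item heats/cools nearly uniformly (lumped behaviour)."""
    return h * L / k

def fourier(alpha, t, L):
    """Fo ~ 0.2 roughly marks when a temperature change penetrates depth L."""
    return alpha * t / L ** 2

h = 10.0        # W/(m^2 K), forced-air convection (assumed)
k_food = 0.5    # W/(m K), thermal conductivity of moist food (assumed)
alpha = 1.4e-7  # m^2/s, thermal diffusivity (assumed)

for name, L in [("berry, L = 5 mm", 0.005), ("ham joint, L = 5 cm", 0.05)]:
    t_hours = (0.2 * L ** 2 / alpha) / 3600.0   # time for Fo to reach 0.2
    print(f"{name}: Bi = {biot(h, L, k_food):.2f}, "
          f"Fo = 0.2 after ~{t_hours:.3f} h")
```

    With these assumed values the small item responds in well under a minute while the large one takes about an hour, which is why different foodstuffs suit load-shifting schemes on different timescales.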

  8. Analyzing the two-dimensional plot of the interannual climate ...

    African Journals Online (AJOL)

    For this purpose, at first it is necessary to calculate the interannual variability range of the region climatic variables, resulting from the interaction between the climate systems of the 'earth' (atmosphere, biosphere, etc.). Hence, long-term statistics (1000 years) of the temperature and precipitation, resulting from control run (fix ...

  9. The Common Technique for Analyzing the Financial Results Report

    Directory of Open Access Journals (Sweden)

    Pasternak Maria M.

    2017-04-01

    Full Text Available The article is aimed at generalizing the theoretical approaches to the structure and elements of a technique for analysis of the Financial results report (Cumulative income report) and providing suggestions for its improvement. The current methods are analyzed, and the relevance of applying a common technique for such analysis is substantiated. A common technique for analyzing the Financial results report is proposed, which includes definition of the objectives and tasks of the analysis, its subjects and objects, and the sources of its information. The stages of such an analysis are identified and described. The findings of the article can be used to theoretically substantiate and practically develop a technique for analyzing the Financial results report in the branches of the Ukrainian economy.

  10. Development of a low energy neutral analyzer (LENA). Final report

    International Nuclear Information System (INIS)

    Curtis, C.C.; Fan, C.Y.; Hsieh, K.C.; McCullen, J.D.

    1986-05-01

    A low energy neutral particle analyzer (LENA) has been developed at the University of Arizona to detect particles originating in the edge plasma of fusion reactors. LENA was designed to perform energy analysis and measure flux levels of neutrals having energies between 5 and 50 eV (with possible extension to 500 eV), with 1 to 10 ms time resolution. The instrument uses hot filaments to produce a 10 mA diffusion electron beam which ionizes incoming neutrals in a nearly field-free region, so that their velocity distribution is nearly undisturbed. The resultant ions are energy analyzed in a hyperbolic electrostatic analyzer and detected by an MCP detector. LENA has been installed and operated on the ALCATOR C tokamak at the MIT Plasma Fusion Center, and results to date are discussed. At present, LENA exhibits excessive sensitivity to the extremely high ultraviolet photon flux emanating from the plasma; measures to correct this are suggested.

  11. Demonstration of analyzers for multimode photonic time-bin qubits

    Science.gov (United States)

    Jin, Jeongwan; Agne, Sascha; Bourgoin, Jean-Philippe; Zhang, Yanbao; Lütkenhaus, Norbert; Jennewein, Thomas

    2018-04-01

    We demonstrate two approaches to unbalanced interferometers as time-bin qubit analyzers for quantum communication, robust against the mode distortions and polarization effects expected in free-space quantum communication systems, including wavefront deformations, path fluctuations, pointing errors, and optical elements. Despite strong spatial and temporal distortions of the optical mode of a time-bin qubit, entangled with a separate polarization qubit, we verify entanglement using the Negative Partial Transpose, with a measured visibility of up to 0.85 ± 0.01. The robustness of the analyzers is further demonstrated for angles of incidence up to 0.2°. The output of the interferometers is coupled into multimode fiber, yielding a high system throughput of 0.74. These analyzers are therefore suitable and efficient for quantum communication over multimode optical channels.
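
    The Negative Partial Transpose criterion used for the entanglement verification can be illustrated on a simple two-qubit model state. The Werner-state form below, with the quoted visibility plugged in, is an assumption for illustration, not the experiment's reconstructed density matrix.

```python
import numpy as np

# NPT criterion on a two-qubit Werner-like state with visibility V:
#   rho = V |phi+><phi+| + (1 - V) I/4
# The partial transpose of rho has a negative eigenvalue (so rho is
# entangled) exactly when V > 1/3.

def min_eig_partial_transpose(V):
    phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # |phi+> Bell state
    rho = V * np.outer(phi, phi) + (1 - V) * np.eye(4) / 4
    # partial transpose over the second qubit via an index swap
    rho_pt = rho.reshape(2, 2, 2, 2).transpose(0, 3, 2, 1).reshape(4, 4)
    return np.linalg.eigvalsh(rho_pt).min()

# At the measured visibility 0.85 the minimum eigenvalue, (1 - 3V)/4,
# is well below zero, certifying entanglement in this model:
print(min_eig_partial_transpose(0.85) < 0)  # → True
```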

  12. Coherent error study in a retarding field energy analyzer

    International Nuclear Information System (INIS)

    Cui, Y.; Zou, Y.; Reiser, M.; Kishek, R.A.; Haber, I.; Bernal, S.; O'Shea, P.G.

    2005-01-01

    A novel cylindrical retarding electrostatic field energy analyzer for low-energy beams has been designed, simulated, and tested with electron beams of several keV, in which space charge effects play an important role. A cylindrical focusing electrode is used to overcome the beam expansion inside the device due to space-charge forces, beam emittance, etc. In this paper, we present the coherent error analysis for this energy analyzer using the beam envelope equation, including space charge and emittance effects. The study shows that this energy analyzer can achieve very high resolution (with a relative error of around 10^-5) if the coherent errors are removed by using proper focusing voltages. The theoretical analysis is compared with experimental results.
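
    The beam envelope equation referred to above, in its axisymmetric form R'' = −k0·R + K/R + ε²/R³ (focusing, space-charge and emittance terms), can be integrated numerically. The parameter values below are arbitrary placeholders, not the analyzer's actual settings.

```python
# Numerical sketch of the axisymmetric beam envelope equation:
#   R'' = -k0*R + K/R + eps^2/R^3
# k0: external focusing strength, K: generalized perveance (space charge),
# eps: unnormalized emittance. All values below are assumed placeholders.

def envelope(R0, Rp0, k0, K, eps, z_end, dz=1e-4):
    """Semi-implicit Euler integration of the envelope radius R(z)."""
    R, Rp, z = R0, Rp0, 0.0
    while z < z_end:
        Rpp = -k0 * R + K / R + eps ** 2 / R ** 3
        Rp += Rpp * dz   # kick: update slope
        R += Rp * dz     # drift: update radius
        z += dz
    return R

# With no focusing (k0 = 0), space charge and emittance make a drifting
# beam expand from its initial 5 mm radius:
print(envelope(R0=0.005, Rp0=0.0, k0=0.0, K=1e-4, eps=5e-6, z_end=0.2) > 0.005)  # → True
```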

  13. Atmospheric analyzer, carbon monoxide monitor and toluene diisocyanate monitor

    Science.gov (United States)

    Shannon, A. V.

    1977-01-01

    The purpose of the atmospheric analyzer and the carbon monoxide and toluene diisocyanate monitors is to analyze the atmospheric volatiles and to monitor carbon monoxide and toluene diisocyanate levels in the cabin atmosphere of Skylab. The carbon monoxide monitor was used on Skylab 2, 3, and 4 to detect any carbon monoxide levels above 25 ppm. Air samples were taken once each week. The toluene diisocyanate monitor was used only on Skylab 2. The loss of a micrometeoroid shield following the launch of Skylab 1 resulted in overheating of the interior walls of the Orbital Workshop. A potential hazard existed from outgassing of an isocyanate derivative resulting from heat-decomposition of the rigid polyurethane wall insulation. The toluene diisocyanate monitor was used to detect any polymer decomposition. The atmospheric analyzer was used on Skylab 4 because of a suspected leak in the Skylab cabin. An air sample was taken at the beginning, middle, and the end of the mission.

  14. A cascading failure model for analyzing railway accident causation

    Science.gov (United States)

    Liu, Jin-Tao; Li, Ke-Ping

    2018-01-01

    In this paper, a new cascading failure model is proposed for quantitatively analyzing railway accident causation. In the model, the loads of nodes are redistributed according to the strength of the causal relationships between the nodes. By analyzing the actual situation of existing prevention measures, a critical threshold of the load parameter in the model is obtained. To verify the effectiveness of the proposed cascading model, simulation experiments on a train collision accident are performed. The results show that the cascading failure model describes the cascading process of a railway accident more accurately than previous models, and can quantitatively analyze the sensitivities and influence of the causes. In conclusion, this model can help reveal the latent rules of accident causation and thus reduce the occurrence of railway accidents.
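
    A minimal load-redistribution cascade of the kind described can be sketched as follows. This is an illustrative simplification, not the paper's exact model: when a node's load exceeds its capacity it fails, and its load is shared among surviving neighbours in proportion to edge weights (the "strength of the causal relationship"). The toy network and numbers are made up.

```python
# Toy cascading failure with weight-proportional load redistribution.

def cascade(loads, capacity, edges):
    """loads/capacity: {node: value}; edges: {node: {neighbour: weight}}.
    Returns the set of failed nodes once the cascade settles."""
    failed = set()
    frontier = [n for n, l in loads.items() if l > capacity[n]]
    while frontier:
        nxt = []
        for n in frontier:
            if n in failed:
                continue
            failed.add(n)
            nbrs = {m: w for m, w in edges.get(n, {}).items() if m not in failed}
            total = sum(nbrs.values())
            if total > 0:
                for m, w in nbrs.items():   # share load by causal-link weight
                    loads[m] += loads[n] * w / total
                    if loads[m] > capacity[m]:
                        nxt.append(m)
            loads[n] = 0.0
        frontier = nxt
    return failed

# A is overloaded; its load tips B over, and B's load in turn tips C over:
loads = {"A": 1.2, "B": 0.5, "C": 0.5}
capacity = {"A": 1.0, "B": 1.0, "C": 2.0}
edges = {"A": {"B": 3.0, "C": 1.0}, "B": {"C": 1.0}}
print(sorted(cascade(loads, capacity, edges)))  # → ['A', 'B', 'C']
```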

  15. Set-up with electrostatic analyzer for mass spectrometers

    International Nuclear Information System (INIS)

    Ivanov, V.P.; Sysoev, A.A.; Samsonov, G.A.

    1977-01-01

    An attachment with an electrostatic analyzer is suggested that enables double focusing of ion beams when used in conjunction with a magnetic analyzer. The electrostatic analyzer is a cylindrical capacitor placed in a vacuum chamber. Apart from this, the attachment includes a vacuum pump, a nitrogen trap, a battery supply unit, single-beam ion receivers and a bellows inlet for capacitor adjustment. All assemblies and parts of the attachment are made of stainless steel. Tests of the combined operation of the mass-spectrometer and the attachment indicate that the attachment enables the use of sources which form ion beams with an energy dispersion of up to 1.5%, the mass-spectrometer resolving power remaining unchanged.

  16. Research on key techniques in portable XRF analyzers

    International Nuclear Information System (INIS)

    Li Guodong; Jia Wenyi; Zhou Rongsheng; Tang Hong

    1999-01-01

    To address the problems of low sensitivity, poor detection limits, the small number of determinable elements and the poor matrix-effect correction of current field-portable X-ray fluorescence (XRF) analyzers, research was carried out on the key units: excitation source, detector, measurement circuit and microcomputerization. A miniature, low-power X-ray tube excitation source was developed; a low-dissipation 1024-channel analyzer, suited to high-resolution detectors, was prepared; and microcomputerization based on a notebook computer was realized. On this basis, a highly sensitive field XRF system was constituted. With this system, multiple elements can be determined with detection limits of less than 20 μg/g for elements with medium or lower atomic numbers, one order of magnitude or more lower than those of current portable XRF analyzers. The capabilities for matrix-effect correction and data processing are enhanced. The system dispenses with radionuclide sources, making it safe and convenient to use and carry.
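
    For context, detection limits of the kind quoted are commonly estimated with the 3σ counting-statistics criterion, DL = 3·sqrt(I_b·t)/(S·t). Both the formula choice and the numbers below are a textbook illustration, not taken from this work.

```python
# Textbook 3-sigma detection-limit estimate for XRF counting measurements:
#   DL = 3 * sqrt(I_b * t) / (S * t)
# I_b: background count rate (counts/s), t: counting time (s),
# S: sensitivity (counts/s per ug/g). All numbers below are assumed.

def detection_limit(I_b, t, S):
    """Minimum detectable concentration, in the units of 1/S (e.g. ug/g)."""
    return 3.0 * (I_b * t) ** 0.5 / (S * t)

# e.g. 50 cps background, 100 s measurement, 1.2 cps per ug/g sensitivity:
print(round(detection_limit(50.0, 100.0, 1.2), 1))  # → 1.8
```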

  17. np elastic scattering analyzing power characteristics at intermediate energies

    International Nuclear Information System (INIS)

    Abegg, R.; Davis, C.A.; Delheij, P.P.J.; Green, P.W.; Greeniaus, L.G.; Healey, D.C.; Miller, C.A.; Rodning, N.L.; Wait, G.D.; Ahmad, M.; Cairns, E.B.; Coombes, G.H.; Lapointe, C.; McDonald, W.J.; Moss, G.A.; Roy, G.; Soukup, J.; Tkachuk, R.R.; Ye, Y.; Watson, J.W.

    1989-06-01

    Recent measurements of charge symmetry breaking in the np system at 477 MeV, and of A_oonn for np elastic scattering at 220, 325 and 425 MeV, also yield accurate analyzing power data. These data allow the energy dependence of the analyzing power zero-crossing angle and the slope of the analyzing power at the zero-crossing to be determined. The incident neutron energies span a region where the zero-crossing angle is strongly energy dependent (E_n > 350 MeV). The results are compared to current phase shift analysis predictions, recently published LAMPF data, and the predictions of the Bonn and Paris potentials. (Author) 13 refs., 2 tabs., 2 figs.

  18. On-line analyzers to distributed control system linking

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, S.F.; Buchanan, B.R.; Sanders, M.A.

    1990-01-01

    The Analytical Development Section (ADS) of the Savannah River Laboratory is developing on-line analyzers (OLAs) to monitor various site processes. Data from some of the OLAs will be used for process control by distributed control systems (DCSs) such as the Fisher PRoVOX. A problem in the past has been finding an efficient and cost-effective way to get analyzer data onto the DCS data highway. ADS is developing a system to accomplish the linking of OLAs to PRoVOX DCSs. The system is described, and results of operation in a research and development environment are given. Plans for installation in the production environment are discussed.

  19. Novel Approach to Analyzing MFE of Noncoding RNA Sequences.

    Science.gov (United States)

    George, Tina P; Thomas, Tessamma

    2016-01-01

    Genomic studies have become noncoding RNA (ncRNA) centric after studies of different genomes provided enormous information on ncRNA over the past decades. The function of an ncRNA is decided by its secondary structure, and across organisms the secondary structure is more conserved than the sequence itself. In this study, the optimal secondary structure, or minimum free energy (MFE) structure, of ncRNA was found based on the thermodynamic nearest neighbor model. The MFE of over 2600 ncRNA sequences was analyzed in view of its signal properties. Mathematical models linking MFE to the signal properties were found for each of the four classes of ncRNA analyzed. MFE values computed with the proposed models were in concordance with those obtained with the standard web servers. A total of 95% of the sequences analyzed had MFE values deviating within ±15% relative to those obtained from standard web servers.
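
    Computing a true nearest-neighbour MFE requires large thermodynamic parameter tables (as in servers like RNAfold). As a simplified, self-contained stand-in, the classic Nussinov dynamic program below maximizes the number of nested base pairs in an ncRNA sequence; it illustrates the recursive structure of secondary-structure optimization, not the energy model used in the paper.

```python
# Nussinov base-pair maximization (Watson-Crick + GU wobble pairs) with a
# minimum hairpin loop size. Illustrative only: real MFE evaluation needs
# full thermodynamic parameter tables.

PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}

def nussinov_pairs(seq, min_loop=3):
    """Maximum number of nested base pairs in seq."""
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):
        for i in range(n - span):
            j = i + span
            best = dp[i][j - 1]                  # j left unpaired
            for k in range(i, j - min_loop):     # j paired with position k
                if (seq[k], seq[j]) in PAIRS:
                    left = dp[i][k - 1] if k > i else 0
                    best = max(best, left + 1 + dp[k + 1][j - 1])
            dp[i][j] = best
    return dp[0][n - 1] if n else 0

# A short hairpin: three stacked pairs closing an AAA/U loop
print(nussinov_pairs("GGGAAAUCCC"))  # → 3
```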

  20. A nuclear facility Security Analyzer written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.

    1987-01-01

    The Security Analyzer project was undertaken to use the Prolog artificial intelligence programming language and entity-relationship database construction techniques to produce an intelligent-database computer program capable of analyzing the effectiveness of a nuclear facility's security systems. The Security Analyzer program can search through a facility to find all possible surreptitious entry paths that meet various user-selected time and detection-probability criteria. The program can also respond to user-formulated queries concerning the database information. The intelligent-database approach allows the program to perform a more comprehensive path search than other programs that find only a single optimal path. The program is also more flexible in that the database, once constructed, can be interrogated and used for purposes independent of the searching function.