WorldWideScience

Sample records for tool temperature analysis

  1. Finite Element Modelling of the effect of tool rake angle on tool temperature and cutting force during high speed machining of AISI 4340 steel

    International Nuclear Information System (INIS)

    Sulaiman, S; Roshan, A; Ariffin, M K A

    2013-01-01

    In this paper, a Finite Element Method (FEM) simulation based on the ABAQUS explicit software, using the Johnson-Cook material model, was used to predict the cutting force and tool temperature during high speed machining (HSM) of AISI 4340 steel. In this simulation work, tool rake angles ranging from 0° to 20° and cutting speeds between 300 and 550 m/min were investigated. The purpose of this simulation analysis was to find the optimum tool rake angle, at which the cutting force is smallest and the tool temperature lowest during high speed machining. Cutting forces were found to decrease as the rake angle increased in the positive direction. The optimum rake angle was observed between 10° and 18°, where the cutting force decreased by about 20% for all simulated cutting speeds. In addition, increasing the tool rake angle beyond its optimum value had a negative influence on the tool's performance and led to an increase in cutting temperature. The results give a better understanding of cutting tool design for high speed machining processes.
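
    For reference, the Johnson-Cook flow stress model named above has the standard form below; the record does not list the fitted AISI 4340 parameters, so A, B, C, n and m are generic symbols.

```latex
\sigma = \left(A + B\,\varepsilon^{n}\right)
         \left(1 + C \ln\frac{\dot{\varepsilon}}{\dot{\varepsilon}_{0}}\right)
         \left(1 - \left(\frac{T - T_{\mathrm{room}}}{T_{\mathrm{melt}} - T_{\mathrm{room}}}\right)^{m}\right)
```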

  2. Development of a High-Temperature Diagnostics-While-Drilling Tool

    Energy Technology Data Exchange (ETDEWEB)

    Blankenship, Douglas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chavira, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Henfling, Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hetmaniak, Chris [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Huey, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jacobson, Ron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); King, Dennis [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Knudsen, Steve [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mansure, A. J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Polsky, Yarom [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2009-01-01

    This report documents work performed in the second phase of the Diagnostics While-Drilling (DWD) project in which a high-temperature (HT) version of the phase 1 low-temperature (LT) proof-of-concept (POC) DWD tool was built and tested. Descriptions of the design, fabrication and field testing of the HT tool are provided.

  3. Sensitivity Analysis of Weather Variables on Offsite Consequence Analysis Tools in South Korea and the United States

    Directory of Open Access Journals (Sweden)

    Min-Uk Kim

    2018-05-01

    We studied sensitive weather variables for consequence analysis, in the case of chemical leaks on the user side of offsite consequence analysis (OCA) tools. We used OCA tools Korea Offsite Risk Assessment (KORA) and Areal Location of Hazardous Atmospheres (ALOHA) in South Korea and the United States, respectively. The chemicals used for this analysis were 28% ammonia (NH₃), 35% hydrogen chloride (HCl), 50% hydrofluoric acid (HF), and 69% nitric acid (HNO₃). The accident scenarios were based on leakage accidents in storage tanks. The weather variables were air temperature, wind speed, humidity, and atmospheric stability. Sensitivity analysis was performed using the Statistical Package for the Social Sciences (SPSS) program for dummy regression analysis. Sensitivity analysis showed that impact distance was not sensitive to humidity. Impact distance was most sensitive to atmospheric stability, and was also more sensitive to air temperature than wind speed, according to both the KORA and ALOHA tools. Moreover, the weather variables were more sensitive in rural conditions than in urban conditions, with the ALOHA tool being more influenced by weather variables than the KORA tool. Therefore, if using the ALOHA tool instead of the KORA tool in rural conditions, users should be careful not to cause any differences in impact distance due to input errors of weather variables, with the most sensitive one being atmospheric stability.

  4. Sensitivity Analysis of Weather Variables on Offsite Consequence Analysis Tools in South Korea and the United States.

    Science.gov (United States)

    Kim, Min-Uk; Moon, Kyong Whan; Sohn, Jong-Ryeul; Byeon, Sang-Hoon

    2018-05-18

    We studied sensitive weather variables for consequence analysis, in the case of chemical leaks on the user side of offsite consequence analysis (OCA) tools. We used OCA tools Korea Offsite Risk Assessment (KORA) and Areal Location of Hazardous Atmospheres (ALOHA) in South Korea and the United States, respectively. The chemicals used for this analysis were 28% ammonia (NH₃), 35% hydrogen chloride (HCl), 50% hydrofluoric acid (HF), and 69% nitric acid (HNO₃). The accident scenarios were based on leakage accidents in storage tanks. The weather variables were air temperature, wind speed, humidity, and atmospheric stability. Sensitivity analysis was performed using the Statistical Package for the Social Sciences (SPSS) program for dummy regression analysis. Sensitivity analysis showed that impact distance was not sensitive to humidity. Impact distance was most sensitive to atmospheric stability, and was also more sensitive to air temperature than wind speed, according to both the KORA and ALOHA tools. Moreover, the weather variables were more sensitive in rural conditions than in urban conditions, with the ALOHA tool being more influenced by weather variables than the KORA tool. Therefore, if using the ALOHA tool instead of the KORA tool in rural conditions, users should be careful not to cause any differences in impact distance due to input errors of weather variables, with the most sensitive one being atmospheric stability.
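
    The dummy regression described above can be reproduced outside SPSS. A minimal sketch in Python with statsmodels, where the data frame, its values and column names are hypothetical stand-ins for the study's dataset; atmospheric stability enters as dummy variables via C():

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical observations: dispersion-model impact distances under
# varying weather inputs (NOT the paper's data).
df = pd.DataFrame({
    "impact_distance": [1200, 950, 1430, 800, 1600, 1100],  # m
    "air_temp":        [25, 10, 30, 5, 35, 20],             # deg C
    "wind_speed":      [1.5, 3.0, 1.0, 4.0, 0.8, 2.0],      # m/s
    "stability":       ["F", "D", "F", "C", "F", "D"],      # Pasquill class
})

# C(stability) expands the categorical stability class into dummy variables.
model = smf.ols("impact_distance ~ air_temp + wind_speed + C(stability)",
                data=df).fit()
print(model.params)  # coefficient magnitudes indicate sensitivity
```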

  5. LOW-TEMPERATURE SURFACE HARDENING FOR DIAMOND TOOLS

    Directory of Open Access Journals (Sweden)

    A. A. Shmatov

    2009-01-01

    The structure and properties of diamond cutting tools subjected to thermo-hydro-chemical treatment are examined in this paper. The process involves a chemical treatment of the tools in a specially prepared aqueous suspension of Ti and Mo oxides and other ingredients, followed by heat treatment (minimal process temperature 130 °C). The thermo-hydro-chemical method increases the wear resistance of diamond cutting tools by a factor of 1.3–4.0 in comparison with traditional treatment.

  6. Development of a new methodology for the creation of water temperature scenarios using frequency analysis tool.

    Science.gov (United States)

    Val, Jonatan; Pino, María Rosa; Chinarro, David

    2018-03-15

    Thermal quality in river ecosystems is a fundamental property for the development of biological processes and many of the human activities linked to the aquatic environment. In the future, this property is going to be threatened by global change impacts, and basin managers will need useful tools to evaluate them. Currently, future projections in temperature modelling are based on historical air and water temperatures and their relationship in past temperature scenarios; however, this is a problem when evaluating future scenarios with new thermal impacts. Here, we analysed the thermal impacts produced by several human activities and linked them with the degree of decoupling of the thermal transfer mechanism from natural systems, measured with frequency analysis tools (wavelet coherence). Once this relationship had been established, we developed a new methodology for simulating different thermal impact scenarios in order to project them into the future. Finally, we validated this methodology using a site that changed its thermal quality during the studied period due to human impacts. Results showed a high correlation (r² = 0.84) between the decoupling degree of the thermal transfer mechanisms and the quantified human impacts, yielding three thermal impact scenarios. Furthermore, the graphic representation of these thermal scenarios with their wavelet coherence spectra showed the impacts of an extreme drought period and of agricultural management. The inter-conversion between the scenarios gave high morphological similarities in the obtained wavelet coherence spectra, and the validation process clearly showed the high efficiency of the developed model against older methodologies when compared with the Nash-Sutcliffe criterion. Although there is a need for further investigation under different climatic and anthropogenic management conditions, the developed frequency models could be useful in decision-making processes by managers when faced with future global change.
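
    The Nash-Sutcliffe criterion used for the validation has a simple closed form, sketched here with hypothetical observed and simulated water temperature series:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the model
    does no better than predicting the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

observed  = np.array([12.1, 13.4, 15.0, 16.2, 14.8])  # deg C, hypothetical
simulated = np.array([12.5, 13.1, 14.6, 16.5, 15.2])
print(f"NSE = {nash_sutcliffe(observed, simulated):.3f}")
```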

  7. Implementation Analysis of Cutting Tool Carbide with Cast Iron Material S45 C on Universal Lathe

    Science.gov (United States)

    Junaidi; hestukoro, Soni; yanie, Ahmad; Jumadi; Eddy

    2017-12-01

    The cutting tool is the working tool of a lathe. This paper analyzes the cutting process of a carbide tool with S45C cast iron material on a universal lathe, characterized by several aspects, namely cutting force, cutting speed, cutting power, indicated cutting power, and the temperatures in zone 1 and zone 2. The purpose of this study was to determine the cutting speed, cutting power, electromotor power, and the zone 1 and zone 2 temperatures for the carbide cutting tool in the process of turning cast iron material. The cutting force was obtained from graphical analysis of the relationship between the recommended cutting force component and the plane of cut, and the cutting speed was obtained from graphical analysis of the relationship between the recommended cutting speed and the feed rate.

  8. Physics analysis tools

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1991-04-01

    There are many tools used in analysis in High Energy Physics (HEP). They range from low level tools, such as a programming language, to high level tools, such as a detector simulation package. This paper discusses some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, the analysis is broken down into five main stages, classified into the areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, so it is useful to define what is meant by them here. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper considers what analysis tools are available today, and what one might expect in the future. For each stage, the integration of the tools with other stages and the portability of the tool are analyzed.

  9. Development of a high-temperature diagnostics-while-drilling tool.

    Energy Technology Data Exchange (ETDEWEB)

    Chavira, David J.; Huey, David (Stress Engineering Services, Inc.); Hetmaniak, Chris (Stress Engineering Services, Inc.); Polsky, Yarom; King, Dennis K.; Jacobson, Ronald David; Blankenship, Douglas Alan; Knudsen, Steven Dell; Henfling, Joseph Anthony; Mansure, Arthur James

    2009-01-01

    The envisioned benefits of Diagnostics-While-Drilling (DWD) are based on the principle that high-speed, real-time information from the downhole environment will promote better control of the drilling process. Although in practice a DWD system could provide information related to any aspect of exploration and production of subsurface resources, the current DWD system provides data on drilling dynamics. This particular set of new tools provided by DWD will allow quicker detection of problems, reduce drilling flat-time and facilitate more efficient drilling (drilling optimization), with the overarching result of decreased drilling costs. In addition to providing the driller with an improved, real-time picture of the drilling conditions downhole, data generated from DWD systems provide researchers with the valuable, high fidelity data sets necessary for developing and validating an enhanced understanding of the drilling process. Toward this end, the availability of DWD creates a synergy with other Sandia Geothermal programs, such as the hard-rock bit program, where the introduction of alternative rock-reduction technologies is contingent on the reduction or elimination of damaging dynamic effects. The rationale for the program and early development efforts are described in more detail by others [SAND2003-2069 and SAND2000-0239]. A first-generation low-temperature (LT) DWD system was fielded in a series of proof-of-concept (POC) tests to validate functionality. Using the LT system, DWD was subsequently used to support a single-laboratory/multiple-partner CRADA (Cooperative Research and Development Agreement) entitled Advanced Drag Bits for Hard-Rock Drilling. The drag-bit CRADA was established between Sandia and four bit companies, and involved testing of a PDC bit from each company [Wise, et al., 2003, 2004] in the same lithologic interval at the Gas Technology Institute (GTI) test facility near Catoosa, OK. In addition, the LT DWD system has

  10. Technology development for high temperature logging tools

    Energy Technology Data Exchange (ETDEWEB)

    Veneruso, A.F.; Coquat, J.A.

    1979-01-01

    A set of prototype high temperature logging tools (temperature, pressure and flow) was tested successfully at temperatures up to 275°C in a Union geothermal well during November 1978 as part of the Geothermal Logging Instrumentation Development Program. This program is being conducted by Sandia Laboratories for the Department of Energy's Division of Geothermal Energy. The progress and plans of this industry based program to develop and apply the high temperature instrumentation technology needed to make reliable geothermal borehole measurements are described. Specifically, this program is upgrading existing sondes for improved high temperature performance, as well as applying new materials (elastomers, polymers, metals and ceramics) and developing component technology such as high temperature cables, cableheads and electronics to make borehole measurements such as formation temperature, flow rate, high resolution pressure and fracture mapping. In order to satisfy critical existing needs, the near term goal is operation up to 275°C and 7000 psi by the end of FY80. The long term goal is operation up to 350°C and 20,000 psi by the end of FY84.

  11. Oscillation Baselining and Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-27

    PNNL developed a new tool for oscillation analysis and baselining. This tool was developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) project (GM0072, “Suite of open-source applications and models for advanced synchrophasor analysis”) and is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillations (frequency, damping, energy, and shape). The tool also does oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
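
    OBAT's internals are not given in this record, but the mode quantities it reports (frequency and damping) can be illustrated on a synthetic ringdown signal; the sampling rate, mode frequency and damping below are invented for the sketch:

```python
import numpy as np
from scipy.signal import hilbert

fs, f0, zeta = 30.0, 0.7, 0.05          # PMU rate (Hz), mode freq, damping ratio
t = np.arange(0, 20, 1 / fs)
sigma = -zeta * 2 * np.pi * f0 / np.sqrt(1 - zeta**2)
x = np.exp(sigma * t) * np.cos(2 * np.pi * f0 * t)   # synthetic oscillation

# Frequency from the FFT peak; damping from the slope of the log envelope.
freqs = np.fft.rfftfreq(len(x), 1 / fs)
f_est = freqs[np.argmax(np.abs(np.fft.rfft(x)))]
sigma_est = np.polyfit(t, np.log(np.abs(hilbert(x))), 1)[0]
zeta_est = -sigma_est / np.sqrt(sigma_est**2 + (2 * np.pi * f_est) ** 2)
print(f"mode ~ {f_est:.2f} Hz, damping ratio ~ {zeta_est:.3f}")
```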

  12. Hurricane Data Analysis Tool

    Science.gov (United States)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users have full access to terabytes of data and can generate 2-D or time-series plots and animations without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km global (60°N–60°S) IR dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km global (60°N–60°S) IR dataset, is one of the TRMM ancillary datasets. It consists of globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of the data, beginning from February of 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, and mesoscale convective systems. Basic functions include selection of area of

  13. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for obtaining the maximum liquefaction of the plant, considering the constraints of the other parameters. The analysis results give a clear idea for deciding the various parameter values before implementation of the actual plant in the field. They also give an idea of the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.

  14. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Joshi, D.M.; Patel, H.K.

    2015-01-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for obtaining the maximum liquefaction of the plant, considering the constraints of the other parameters. The analysis results give a clear idea for deciding the various parameter values before implementation of the actual plant in the field. They also give an idea of the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
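
    The core balance a simulator solves for the Linde cycle can be sketched directly. A minimal example of the ideal Linde-Hampson liquid-yield calculation, using CoolProp and nitrogen as stand-ins (the paper models air liquefaction in Aspen HYSYS, and the pressures here are hypothetical):

```python
from CoolProp.CoolProp import PropsSI

T_amb, P_low, P_high = 300.0, 1.013e5, 200e5   # K, Pa, Pa (hypothetical)

h1 = PropsSI("H", "T", T_amb, "P", P_low,  "Nitrogen")  # suction state
h2 = PropsSI("H", "T", T_amb, "P", P_high, "Nitrogen")  # compressed, cooled back to T_amb
hf = PropsSI("H", "P", P_low, "Q", 0,      "Nitrogen")  # saturated liquid at P_low

# Energy balance around heat exchanger + JT valve + separator:
y = (h1 - h2) / (h1 - hf)   # fraction of compressed gas liquefied per pass
print(f"ideal liquid yield = {y:.3f}")
```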

  15. A LOW TEMPERATURE ALUMINIZING TREATMENT OF HOT WORK TOOL STEEL

    OpenAIRE

    Matijević, Božidar

    2013-01-01

    Conventional aluminizing processes by pack cementation are typically carried out at elevated temperatures. A low temperature powder aluminizing technology was applied to the X40CrMoV5-1 hot tool steel. The aluminizing temperature was from 550 °C to 620 °C. Effects of temperature and time on the microstructure and phase evolution were investigated. Also, the intermetallic layer thickness was measured in the aluminized layer of a steel substrate. The cross-sectional microstructures, the alumini...

  16. Economic and Financial Analysis Tools | Energy Analysis | NREL

    Science.gov (United States)

    Economic and Financial Analysis Tools: use these economic and financial analysis tools. Job and Economic Development Impact (JEDI) Model: use these easy-to-use, spreadsheet-based tools to analyze the economic impacts of constructing and operating power generation and biofuel plants.

  17. An all-optical system designed for the heating and temperature measurement of the diamond tool

    CSIR Research Space (South Africa)

    Masina, BN

    2012-07-01

    This work presents an all-optical system designed for the heating and subsequent temperature measurement of a diamond tool. A laser beam was used as the source to raise the temperature of the diamond tool, and the resultant temperature was measured using the blackbody principle. In this poster, we have successfully...

  18. Evaluating comfort with varying temperatures: a graphic design tool

    Energy Technology Data Exchange (ETDEWEB)

    Evans, J.M. [Research Centre Habitat and Energy, Faculty of Architecture, Design and Urbanism, University of Buenos Aires, Ciudad Universitaria (Argentina)

    2002-07-01

    This paper considers the need to define the comfort of indoor and outdoor spaces in relation to daily variations of temperature. A graphical tool is presented which indicates the daily swing of temperature, shown as a single point on a graph representing the average temperature and the maximum temperature swing. This point can be compared with the comfort zones for different activity levels, such as sedentary activity, sleeping, and indoor and outdoor circulation, according to the design proposals for different spaces. The graph allows the representation of climatic variables, the definition of comfort zones, the selection of bioclimatic design resources and the evaluation of indoor temperatures, measured in actual buildings or obtained from computer simulations. The development of the graph is explained and examples are given, with special emphasis on the use of thermal mass. (author)
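
    A minimal sketch of the kind of chart described, plotting an evaluated space at (average temperature, daily swing) against a comfort zone; the zone boundaries used here are placeholders, not the author's values:

```python
import matplotlib.pyplot as plt
import matplotlib.patches as patches

fig, ax = plt.subplots()
# Placeholder comfort zone for sedentary activity: mean 20-26 deg C,
# daily swing up to 6 K (hypothetical limits).
ax.add_patch(patches.Rectangle((20, 0), 6, 6, alpha=0.3, label="comfort zone"))

t_mean, t_swing = 23.5, 8.0   # one measured or simulated day (hypothetical)
ax.plot(t_mean, t_swing, "ko", label="evaluated space")

ax.set_xlabel("average daily temperature (deg C)")
ax.set_ylabel("daily temperature swing (K)")
ax.legend()
plt.show()
```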

  19. Incorporation of the equilibrium temperature approach in a Soil and Water Assessment Tool hydroclimatological stream temperature model

    Science.gov (United States)

    Du, Xinzhong; Shrestha, Narayan Kumar; Ficklin, Darren L.; Wang, Junye

    2018-04-01

    Stream temperature is an important indicator of biodiversity and sustainability in aquatic ecosystems. The stream temperature model currently in the Soil and Water Assessment Tool (SWAT) only considers the impact of air temperature on stream temperature, while the hydroclimatological stream temperature model developed within the SWAT model considers hydrology and the impact of air temperature in simulating the water-air heat transfer process. In this study, we modified the hydroclimatological model by including the equilibrium temperature approach to model heat transfer processes at the water-air interface, which reflects the influences of air temperature, solar radiation, wind speed and streamflow conditions on the heat transfer process. The thermal capacity of the streamflow is modeled through the variation of the stream water depth. An advantage of this equilibrium temperature model is its simple parameterization, with only two parameters added to model the heat transfer processes. The equilibrium temperature model proposed in this study was applied and tested in the Athabasca River basin (ARB) in Alberta, Canada. The model was calibrated and validated at five stations throughout different parts of the ARB, where close to monthly samplings of stream temperature are available. The results indicate that the equilibrium temperature model provided better and more consistent performance for the different regions of the ARB, with values of the Nash-Sutcliffe Efficiency coefficient (NSE) greater than those of the original SWAT model and the hydroclimatological model. To test the model performance under different hydrological and environmental conditions, the equilibrium temperature model was also applied to the North Fork Tolt River watershed in Washington, United States. The results indicate a reasonable simulation of stream temperature using the model proposed in this study, with minimum relative error values compared to the other two models.
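
    In its usual form, the equilibrium temperature approach linearizes the net surface heat flux about an equilibrium temperature T_e, with the stream's thermal capacity carried by the water depth d (generic notation; the paper's exact parameterization may differ):

```latex
\frac{dT_w}{dt} \;=\; \frac{K\,\left(T_e - T_w\right)}{\rho\, c_p\, d}
```

    Here T_w is the stream temperature, K is a bulk heat exchange coefficient reflecting air temperature, solar radiation and wind speed, ρ is the water density and c_p its specific heat.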

  20. Analyzing the effect of tool edge radius on cutting temperature in micro-milling process

    Science.gov (United States)

    Liang, Y. C.; Yang, K.; Zheng, K. N.; Bai, Q. S.; Chen, W. Q.; Sun, G. Y.

    2010-10-01

    Cutting heat is one of the important physical subjects in the cutting process. Cutting heat, together with the cutting temperature produced by the cutting process, directly affects tool wear and tool life, as well as workpiece precision and surface quality. In micro-milling, the feature size of the workpiece is usually several microns, so even tiny changes in cutting temperature will affect the surface quality and accuracy of the workpiece. Therefore, the cutting heat and temperature generated in micro-milling have a significantly different effect than those in traditional cutting. In this paper, a two-dimensional coupled thermal-mechanical finite element model is adopted to determine the thermal fields and cutting temperature during the micro-milling process, using the software Deform-2D. The effect of the tool edge radius on effective stress, effective strain, the velocity field and the cutting temperature distribution in micro-milling of the aluminum alloy Al2024-T6 was investigated and analyzed. Also, the transient cutting temperature distribution was simulated dynamically. The simulation results show that the cutting temperature in micro-milling is lower than that occurring in conventional milling processes due to the small loads and low cutting velocity. With an increase of the tool edge radius, the maximum temperature region gradually occurs at the contact region between the finished surface and the flank face of the micro-cutter, instead of the rake face or the corner of the micro-cutter. This phenomenon shows an obvious size effect.

  1. Multivariate data analysis as a fast tool in evaluation of solid state phenomena

    DEFF Research Database (Denmark)

    Jørgensen, Anna Cecilia; Miroshnyk, Inna; Karjalainen, Milja

    2006-01-01

    of information generated can be overwhelming and the need for more effective data analysis tools is well recognized. The aim of this study was to investigate the use of multivariate data analysis, in particular principal component analysis (PCA), for fast analysis of solid state information. The data sets...... the molecular level interpretation of the structural changes related to the loss of water, as well as interpretation of the phenomena related to the crystallization. The critical temperatures or critical time points were identified easily using the principal component analysis. The variables (diffraction angles...... or wavenumbers) that changed could be identified by the careful interpretation of the loadings plots. The PCA approach provides an effective tool for fast screening of solid state information....
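
    A minimal sketch of this PCA screening idea, using scikit-learn on a hypothetical matrix of diffraction patterns collected during heating (rows indexed by temperature, columns by diffraction angle):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
patterns = rng.normal(size=(40, 500))   # placeholder variable-temperature XRPD data
patterns[20:] += 5.0                    # fake solid-state transition at step 20

pca = PCA(n_components=2)
scores = pca.fit_transform(patterns)    # score trajectory breaks at the transition
loadings = pca.components_              # loadings show which angles change
print("critical step:", np.argmax(np.abs(np.diff(scores[:, 0]))) + 1)
```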

  2. Extended Testability Analysis Tool

    Science.gov (United States)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  3. Channel CAT: A Tactical Link Analysis Tool

    National Research Council Canada - National Science Library

    Coleman, Michael

    1997-01-01

    This thesis produced an analysis tool, the Channel Capacity Analysis Tool (Channel CAT), designed to provide an automated tool for the analysis of design decisions in developing client-server software...

  4. Analysis of the temperature of the hot tool in the cut of woven fabric using infrared images

    Science.gov (United States)

    Borelli, Joao E.; Verderio, Leonardo A.; Gonzaga, Adilson; Ruffino, Rosalvo T.

    2001-03-01

    Textile manufacture occupies a prominent place in the national economy. By virtue of its importance, research has been carried out on the development of new materials, equipment and methods used in the production process. The cutting of textiles is a basic stage at the start of the process of making clothes and other articles. In the hot cutting of fabric, one of the variables of great importance in the control of the process is the contact temperature between the tool and the fabric. This work presents a technique for the measurement of that temperature based on the processing of infrared images. For this, a system was developed composed of an infrared camera, a frame-grabber PC board, and software that analyzes the point temperature in the cut area, enabling the operator to achieve the necessary control of the other variables involved in the process.
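
    Recovering temperature from an infrared image amounts to inverting Planck's law at the camera's wavelength under the blackbody assumption the paper uses; a minimal sketch (the wavelength and measured exitance values are hypothetical):

```python
import numpy as np

C1 = 3.7418e-16   # W*m^2, first radiation constant
C2 = 1.4388e-2    # m*K, second radiation constant

def brightness_temperature(M, lam):
    """Invert Planck's law: spectral exitance M (W/m^3) at wavelength
    lam (m) -> blackbody temperature (K)."""
    return C2 / (lam * np.log(C1 / (lam**5 * M) + 1.0))

lam = 10e-6     # long-wave IR band (assumption)
M = 3.0e7       # exitance inferred from one pixel (hypothetical)
print(f"T = {brightness_temperature(M, lam):.1f} K")
```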

  5. A low temperature aluminizing treatment of hot work tool steel

    Energy Technology Data Exchange (ETDEWEB)

    Matijevic, B., E-mail: bozidar.matijevic@fsb.hr [University of Zagreb, Faculty of Mechanical Engineering and Naval Architecture, Zagreb (Croatia)

    2010-07-01

    Conventional aluminizing processes by pack cementation are typically carried out at elevated temperatures. A low temperature powder aluminizing technology was applied to hot work tool steel H13. The aluminizing treatment temperature was from 550 to 620°C. Effects of temperature and time on the microstructure and phase evolution were investigated. Also, the intermetallic layer thickness was measured in the aluminized layer of the steel substrate. The cross-sectional microstructures, the aluminized layer thickness and the oxide layer were studied. Scanning electron microscopy (SEM), energy dispersive X-ray spectroscopy (EDX) and glow discharge optical spectroscopy (GDOS) were applied to observe the cross-sections and the distribution of elements. (author)

  6. A low temperature aluminizing treatment of hot work tool steel

    International Nuclear Information System (INIS)

    Matijevic, B.

    2010-01-01

    Conventional aluminizing processes by pack cementation are typically carried out at elevated temperatures. A low temperature powder aluminizing technology was applied to hot work tool steel H13. The aluminizing treatment temperature was from 550 to 620°C. Effects of temperature and time on the microstructure and phase evolution were investigated. Also, the intermetallic layer thickness was measured in the aluminized layer of the steel substrate. The cross-sectional microstructures, the aluminized layer thickness and the oxide layer were studied. Scanning electron microscopy (SEM), energy dispersive X-ray spectroscopy (EDX) and glow discharge optical spectroscopy (GDOS) were applied to observe the cross-sections and the distribution of elements. (author)

  7. Physics Analysis Tools Workshop 2007

    CERN Multimedia

    Elizabeth Gallas

    The ATLAS PAT (Physics Analysis Tools) group evaluates, develops and tests software tools for the analysis of physics data, consistent with the ATLAS analysis and event data models. Following on from earlier PAT workshops in London (2004), Tucson (2005) and Tokyo (2006), this year's workshop was hosted by the University of Bergen in Norway on April 23-28 with more than 60 participants. The workshop brought together PAT developers and users to discuss the available tools with an emphasis on preparing for data taking. At the start of the week, workshop participants, laptops and power converters in-hand, jumped headfirst into tutorials, learning how to become trigger-aware and how to use grid computing resources via the distributed analysis tools Panda and Ganga. The well organised tutorials were well attended and soon the network was humming, providing rapid results to the users and ample feedback to the developers. A mid-week break was provided by a relaxing and enjoyable cruise through the majestic Norwegia...

  8. Channel CAT: A Tactical Link Analysis Tool

    Science.gov (United States)

    1997-09-01

    Naval Postgraduate School, Monterey, California. Master's thesis by Michael Glenn Coleman, September 1997: Channel CAT: A Tactical Link Analysis Tool. This thesis produced an analysis tool, the Channel Capacity Analysis Tool (Channel CAT), designed to provide an automated tool for the analysis of design decisions in developing client-server software...

  9. System analysis: Developing tools for the future

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, K.; Clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in-house. The tools included in this report are: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of such testing are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continued development of the tools and process.

  10. Influence of the ion nitriding temperature in the wear resistance of AISI H13 tool steel

    International Nuclear Information System (INIS)

    Heck, Stenio Cristaldo; Fernandes, Frederico Augusto Pires; Pereira, Ricardo Gomes; Casteletti, Luiz Carlos; Totten, George Edward

    2010-01-01

    The AISI H13 tool steel for hot work is the most used in its category. This steel was developed for injection molds and the extrusion of hot metals, as well as for forming in hot presses and hammers. Plasma nitriding can significantly improve the surface properties of these steels, but the treatment conditions, such as temperature, must be optimized. In this work, the influence of the nitriding treatment temperature on the wear behavior of this steel is investigated. Samples of AISI H13 steel were quenched and tempered and then ion nitrided at temperatures of 450, 550 and 650 °C, at 4 mbar pressure, for 5 hours. Samples of the treated material were characterized by optical microscopy, Vickers microhardness, X-ray analysis and wear tests. Plasma nitriding formed hard diffusion zones in all the treated samples. White layers were formed in the samples treated at 550 °C and 650 °C. The treatment temperature of 450 °C produced the highest hardness. The treatment temperature showed a great influence on the diffusion layer thickness. X-ray analysis indicated the formation of the Fe₃N, Fe₄N and CrN phases at all temperatures, but with different concentrations. Nitriding significantly increased the wear resistance of AISI H13. (author)

  11. A temperature dependent cyclic plasticity model for hot work tool steel including particle coarsening

    Science.gov (United States)

    Jilg, Andreas; Seifert, Thomas

    2018-05-01

    Hot work tools are subjected to complex thermal and mechanical loads during hot forming processes. Locally, the stresses can exceed the material's yield strength in highly loaded areas as e.g. in small radii in die cavities. To sustain the high loads, the hot forming tools are typically made of martensitic hot work steels. While temperatures for annealing of the tool steels usually lie in the range between 400 and 600 °C, the steels may experience even higher temperatures during hot forming, resulting in softening of the material due to coarsening of strengthening particles. In this paper, a temperature dependent cyclic plasticity model for the martensitic hot work tool steel 1.2367 (X38CrMoV5-3) is presented that includes softening due to particle coarsening and that can be applied in finite-element calculations to assess the effect of softening on the thermomechanical fatigue life of hot work tools. To this end, a kinetic model for the evolution of the mean size of secondary carbides based on Ostwald ripening is coupled with a cyclic plasticity model with kinematic hardening. Mechanism-based relations are developed to describe the dependency of the mechanical properties on carbide size and temperature. The material properties of the mechanical and kinetic model are determined on the basis of tempering hardness curves as well as monotonic and cyclic tests.
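
    The carbide coarsening invoked above is conventionally described by Lifshitz-Slyozov-Wagner (Ostwald ripening) kinetics with an Arrhenius rate constant; in generic form (not the paper's fitted parameters):

```latex
\bar{r}^{\,3}(t) - \bar{r}_{0}^{\,3} = k\,t,
\qquad
k = k_{0}\,\exp\!\left(-\frac{Q}{R\,T}\right)
```

    Here \bar{r} is the mean carbide radius, Q an activation energy and T the absolute temperature, so softening accelerates sharply at the upper end of the service temperature range.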

  12. Temperature Dependence and Magnetic Properties of Injection Molding Tool Materials Used in Induction Heating

    DEFF Research Database (Denmark)

    Guerrier, Patrick; Nielsen, Kaspar Kirstein; Hattel, Jesper Henri

    2015-01-01

    To analyze the heating phase of an induction heated injection molding tool precisely, the temperature-dependent magnetic properties, B–H curves, and the hysteresis loss are necessary for the molding tool materials. Hence, injection molding tool steels, core materials among other materials have...

  13. Physics Analysis Tools Workshop Report

    CERN Multimedia

    Assamagan, K A

    A Physics Analysis Tools (PAT) workshop was held at the University of Tokyo in Tokyo, Japan on May 15-19, 2006. Unlike the previous ones, this workshop brought together the core PAT developers and ATLAS users. The workshop was attended by 69 people from various institutions: Australia (5), Canada (1), China (6), CERN (4), Europe (7), Japan (32), Taiwan (3), USA (11). The agenda consisted of a 2-day tutorial for users, a 0.5-day user feedback discussion session between users and developers, and a 2-day core PAT workshop devoted to issues in Physics Analysis Tools activities. The tutorial, attended by users and developers, covered the following grounds: Event Selection with the TAG; Event Selection Using the Athena-Aware NTuple; Event Display; Interactive Analysis within ATHENA; Distributed Analysis; Monte Carlo Truth Tools; Trigger-Aware Analysis; Event View. By many accounts, the tutorial was useful. This workshop was the first time that the ATLAS Asia-Pacific community (Taiwan, Japan, China and Australia) go...

  14. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

  15. Analysis of Long-Term Temperature Variations in the Human Body.

    Science.gov (United States)

    Dakappa, Pradeepa Hoskeri; Mahabala, Chakrapani

    2015-01-01

    Body temperature is a continuous physiological variable. In normal healthy adults, oral temperature is estimated to vary between 36.1°C and 37.2°C. Fever is a complex host response to many external and internal agents and is a potential contributor to many clinical conditions. Despite temperature being one of the foremost vital signs, its variations during many pathological conditions have yet to be examined in detail using mathematical techniques. Classical fever patterns based on recordings obtained every 8-12 h have been developed. However, such patterns do not provide meaningful information in diagnosing diseases. Because fever is a host response, it is likely that there could be a unique response to specific etiologies. Continuous long-term temperature monitoring and pattern analysis, using specific analytical methods developed in engineering and physics, could aid in revealing unique fever responses of hosts in different clinical conditions. Furthermore, such analysis can potentially be used as a novel diagnostic tool and to study the effect of pharmaceutical agents and other therapeutic protocols. Thus, the goal of our article is to present a comprehensive review of the recent relevant literature and analyze the current state of research regarding temperature variations in the human body.

  16. Thermal Analysis for Condition Monitoring of Machine Tool Spindles

    International Nuclear Information System (INIS)

    Clough, D; Fletcher, S; Longstaff, A P; Willoughby, P

    2012-01-01

    Decreasing tolerances on parts manufactured, or inspected, on machine tools increase the requirement for a greater understanding of machine tool capabilities, error sources and factors affecting asset availability. Continuous usage of a machine tool during production causes heat generation, typically at the moving elements, resulting in distortion of the machine structure. These effects, known as thermal errors, can contribute a significant percentage of the total error in a machine tool. There are a number of design solutions available to the machine tool builder to reduce thermal error, including liquid cooling systems, low thermal expansion materials and symmetric machine tool structures. However, these can only reduce the error, not eliminate it altogether. It is therefore advisable, particularly in the production of high value parts, for manufacturers to obtain a thermal profile of their machine to ensure it is capable of producing in-tolerance parts. This paper considers factors affecting the practical implementation of condition monitoring of thermal errors. In particular, it considers the requirement to find links between temperature, which is easily measurable during production, and the errors, which are not. To this end, various methods of testing, including the advantages of thermal imaging, are shown. Results are presented from machines in typical manufacturing environments, which also highlight the value of condition monitoring using thermal analysis.
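
    One simple way to build the temperature-to-error link called for above is a least-squares map from a few key temperature sensors to the measured displacement; a minimal sketch with hypothetical characterization data:

```python
import numpy as np

# Hypothetical characterization test: sensor temperatures (deg C) at the
# spindle and column, and the measured Z displacement error (um).
T = np.array([[22.1, 21.8], [25.3, 23.0], [28.9, 24.6], [31.2, 26.1]])
z_err = np.array([0.0, 8.5, 17.2, 24.0])

# Fit z_err ~ a0 + a1*T_spindle + a2*T_column.
A = np.column_stack([np.ones(len(T)), T])
coeffs, *_ = np.linalg.lstsq(A, z_err, rcond=None)

# In production, only the temperatures are measured; the model estimates
# the thermal error for monitoring or compensation.
print(A @ coeffs)
```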

  17. The Development of High Temperature Thermoplastic Composite Materials for Additive Manufactured Autoclave Tooling

    Energy Technology Data Exchange (ETDEWEB)

    Kunc, Vlastimil [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Duty, Chad E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lindahl, John M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hassen, Ahmed A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-08-01

    In this work, ORNL and Techmer investigated and screened different high temperature thermoplastic reinforced materials to fabricate composite molds for autoclave processes using Additive Manufacturing (AM) techniques. This project directly led to the development and commercial release of two printable, high temperature composite materials available through Techmer PM. These new materials are targeted for high temperature tooling made via large scale additive manufacturing.

  18. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended to be used as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system’s...... of the graph. SaTool analyses the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and the ability to make autonomous recovery should faults occur........ The list of such variables and functional relations constitutes the system’s structure graph. Normal operation means all functional relations are intact. Should faults occur, one or more functional relations cease to be valid. In a structure graph, this is seen as the disappearance of one or more nodes

  19. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today’s markets. The authors address traditional machining topics, such as: single and multiple point cutting processes; grinding; component accuracy and metrology; shear stress in cutting; cutting temperature and analysis; chatter. They also address non-traditional machining, such as: electrical discharge machining; electrochemical machining; laser and electron beam machining. A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  20. Finite element analysis and modeling of temperature distribution in turning of titanium alloys

    Directory of Open Access Journals (Sweden)

    Moola Mohan Reddy

    2018-04-01

    The titanium alloy Ti-6Al-4V has been widely used in aerospace and medical applications, and demand is ever-growing due to its outstanding properties. In this paper, finite element modeling of the machinability of Ti-6Al-4V using cubic boron nitride and polycrystalline diamond tools in a dry turning environment was investigated. This research was carried out to generate mathematical models, at the 95% confidence level, for cutting force and temperature distribution in terms of cutting speed, feed rate and depth of cut. The Box-Behnken design of experiments was used as the response surface model to generate combinations of cutting variables for modeling. Then, finite element simulation was performed using AdvantEdge®. The influence of each cutting parameter on the cutting responses was investigated using analysis of variance. The analysis shows that depth of cut is the most influential parameter on the resultant cutting force, whereas feed rate is the most influential parameter on cutting temperature. Also, the effect of the cutting-edge radius was investigated for both tools. This research will help to maximize tool life and to improve surface finish.
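
    The Box-Behnken design named above is easy to reproduce; a minimal sketch that builds the 13-run, three-factor design in coded units and maps it to hypothetical cutting ranges (the paper's actual factor levels are not given in this record):

```python
import itertools
import numpy as np

def box_behnken_3():
    """Three-factor Box-Behnken: 12 edge midpoints (+/-1 on two factors,
    0 on the third) plus one center point."""
    runs = []
    for i, j in itertools.combinations(range(3), 2):
        for a, b in itertools.product((-1, 1), repeat=2):
            run = [0, 0, 0]
            run[i], run[j] = a, b
            runs.append(run)
    runs.append([0, 0, 0])
    return np.array(runs, dtype=float)

# Hypothetical ranges: speed (m/min), feed (mm/rev), depth of cut (mm).
lo = np.array([100.0, 0.10, 0.5])
hi = np.array([200.0, 0.20, 1.5])
design = lo + (box_behnken_3() + 1.0) / 2.0 * (hi - lo)
print(design)
```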

  1. Parameter Estimation of the Thermal Network Model of a Machine Tool Spindle by Self-made Bluetooth Temperature Sensor Module

    Directory of Open Access Journals (Sweden)

    Yuan-Chieh Lo

    2018-02-01

    Thermal characteristic analysis is essential for machine tool spindles because sudden failures may occur due to unexpected thermal issues. This article presents a lumped-parameter Thermal Network Model (TNM) and its parameter estimation scheme, including hardware and software, in order to characterize both the steady-state and transient thermal behavior of machine tool spindles. For the hardware, the authors developed a Bluetooth Temperature Sensor Module (BTSM) accompanied by three types of temperature-sensing probes (magnetic, screw, and probe). Through experimental tests, its specification achieves a precision of ±(0.1 + 0.0029|t|) °C, a resolution of 0.00489 °C, a power consumption of 7 mW, and a size of Ø40 mm × 27 mm. For the software, the heat transfer characteristics of the machine tool spindle correlated with rotating speed are derived based on the theory of heat transfer and empirical formulas. The predictive TNM of the spindle was developed by grey-box estimation and experimental results. Even under such complicated operating conditions as various speeds and different initial conditions, the experiments validate that the present modeling methodology provides a robust and reliable tool for temperature prediction, with a normalized mean square error of 99.5% agreement, and the present approach is transferable to other spindles with a similar structure. To realize edge computing in smart manufacturing, a reduced-order TNM is constructed by the Model Order Reduction (MOR) technique and implemented in a real-time embedded system.
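
    The grey-box idea can be sketched with a generic two-node lumped network: integrate the ODEs and fit the thermal resistances and capacitances to measured temperatures (this toy network and all its numbers are assumptions, not the paper's spindle model):

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

T_amb, Q_in = 25.0, 80.0             # deg C, W (hypothetical heat input)
t_meas = np.linspace(0, 3600, 60)    # s

def tnm(t, T, R1, R2, C1, C2):
    """Two-node network: heat source -> node 1 -> node 2 -> ambient."""
    T1, T2 = T
    dT1 = (Q_in - (T1 - T2) / R1) / C1
    dT2 = ((T1 - T2) / R1 - (T2 - T_amb) / R2) / C2
    return [dT1, dT2]

def simulate(p):
    sol = solve_ivp(tnm, (0, t_meas[-1]), [T_amb, T_amb],
                    t_eval=t_meas, args=tuple(p))
    return sol.y[1]                  # node 2 = the measured surface

# Synthetic "measurement" from known parameters, then re-estimation:
true_p = [0.05, 0.08, 2000.0, 6000.0]          # R (K/W), C (J/K)
meas = simulate(true_p) + np.random.default_rng(1).normal(0, 0.05, t_meas.size)
fit = least_squares(lambda p: simulate(p) - meas,
                    x0=[0.1, 0.1, 1000.0, 4000.0], bounds=(1e-6, np.inf))
print(fit.x)
```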

  2. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information for optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions, and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: demonstration of the ash transformation model, upgrading of spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on their fate in the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence and the heterogeneous and homogeneous interaction of the organically associated elements must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties of fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity by incorporating grey-scale binning acquired from the SEM image. A backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity
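
    The Newton-Raphson balances mentioned above are root-finding on an energy balance; a toy sketch for an adiabatic flame temperature with a linear-in-T product heat capacity (all values hypothetical and far simpler than the EERC spreadsheets):

```python
def newton(f, dfdT, T0, tol=1e-6, max_iter=50):
    """Newton-Raphson iteration for f(T) = 0."""
    T = T0
    for _ in range(max_iter):
        step = f(T) / dfdT(T)
        T -= step
        if abs(step) < tol:
            return T
    raise RuntimeError("no convergence")

Q_comb = 5.0e4             # J released per mol of fuel (hypothetical)
n, a, b = 1.2, 30.0, 0.01  # mol products; cp(T) = a + b*T in J/(mol*K)
T_in = 298.15              # K

# Energy balance: sensible heat of products equals heat released.
f  = lambda T: n * (a * (T - T_in) + 0.5 * b * (T**2 - T_in**2)) - Q_comb
df = lambda T: n * (a + b * T)
print(f"adiabatic flame temperature ~ {newton(f, df, 1000.0):.0f} K")
```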

  3. Paediatric Automatic Phonological Analysis Tools (APAT).

    Science.gov (United States)

    Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T

    2017-12-01

    To develop the pediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter- and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus of the Portuguese standardized test Teste Fonético-Fonológico - ALPE, produced by 24 children with phonological delay or phonological disorder, was recorded, transcribed, and then inserted into the APAT. The reliability and validity of the APAT were analyzed. The APAT present strong inter- and intrajudge reliability (>97%). Content validity was also analyzed (ICC = 0.71), and concurrent validity revealed strong correlations between the computerized and manual (traditional) methods. The development of these tools helps to fill existing gaps in clinical practice and research, since previously there were no valid and reliable tools/instruments for automatic phonological analysis allowing the analysis of different corpora.

  4. The Tracking Meteogram, an AWIPS II Tool for Time-Series Analysis

    Science.gov (United States)

    Burks, Jason Eric; Sperow, Ken

    2015-01-01

    A new tool has been developed for the National Weather Service (NWS) Advanced Weather Interactive Processing System (AWIPS) II through collaboration between NASA's Short-term Prediction Research and Transition (SPoRT) and the NWS Meteorological Development Laboratory (MDL). Referred to as the "Tracking Meteogram", the tool aids NWS forecasters in assessing meteorological parameters associated with moving phenomena. In severe weather situations it provides valuable satellite- and radar-derived trends such as cloud-top cooling rates, radial velocity couplets, reflectivity, and information from ground-based lightning networks. The Tracking Meteogram tool also aids synoptic and mesoscale analysis by tracking parameters such as the deepening of surface low pressure systems, changes in surface or upper air temperature, and other properties. The tool provides valuable new functionality and demonstrates the flexibility and extensibility of the NWS AWIPS II architecture. In 2014, the operational impact of the tool was formally evaluated through participation in the NOAA/NWS Operations Proving Ground (OPG), a risk reduction activity to assess the performance and operational impact of new forecasting concepts, tools, and applications. The performance of the Tracking Meteogram tool during the OPG assessment confirmed that it will be a valuable asset to operational forecasters. This presentation reviews the development of the Tracking Meteogram tool, the performance and feedback acquired during the OPG activity, and future goals for continued support and extension to other application areas.

  5. Structural analysis of ITER sub-assembly tools

    International Nuclear Information System (INIS)

    Nam, K.O.; Park, H.K.; Kim, D.J.; Ahn, H.J.; Lee, J.H.; Kim, K.K.; Im, K.; Shaw, R.

    2011-01-01

    The ITER Tokamak assembly tools are purpose-built tools for completing the ITER Tokamak machine, which includes the cryostat and the components contained therein. The sector sub-assembly tools described in this paper are the main assembly tools used to assemble the vacuum vessel, thermal shield, and toroidal field coils into a complete 40° sector. The 40° sector sub-assembly tools comprise the sector sub-assembly tool itself, including the radial beam, the vacuum vessel supports, and the mid-plane brace tools. These tools must have sufficient strength to transport and handle the heavy components of the ITER Tokamak machine, which reach several hundred tons; they should therefore be designed and analyzed to confirm both strength and structural stability, even under conservative assumptions. To verify the structural stability of the sector sub-assembly tools in terms of strength and deflection, the ANSYS code was used for linear static analysis. The results of the analysis show that these tools are designed with sufficient strength and stiffness. The conceptual designs of these tools are also briefly described in this paper.

  6. The RUBA Watchdog Video Analysis Tool

    DEFF Research Database (Denmark)

    Bahnsen, Chris Holmberg; Madsen, Tanja Kidholm Osmann; Jensen, Morten Bornø

    We have developed a watchdog video analysis tool called RUBA (Road User Behaviour Analysis) to use for processing of traffic video. This report provides an overview of the functions of RUBA and gives a brief introduction into how analyses can be made in RUBA.

  7. Contamination Analysis Tools

    Science.gov (United States)

    Brieda, Lubos

    2015-01-01

    This talk presents three different tools developed recently for contamination analysis: an HTML QCM analyzer, which runs in a web browser and allows for data analysis of QCM log files; a Java RGA extractor, which can load multiple SRS .ana files and extract pressure vs. time data; and a C++ contamination simulation code, a 3D particle tracing code for modeling the transport of dust particulates and molecules. The simulation code uses residence time to determine whether molecules stick; particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.
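
    The abstract states only that residence time decides whether a molecule sticks. One common way to express that idea is the Frenkel adsorption relation tau = tau0 * exp(Ea / (R*T)); the sketch below uses it with a hypothetical pre-exponential factor, activation energy, and transit window, none of which come from the talk.

        # Residence-time sticking test using the Frenkel relation
        # tau = tau0 * exp(Ea / (R * T)); all numeric values are hypothetical.
        import math

        R_GAS = 8.314                   # gas constant [J/(mol K)]

        def residence_time(T_surface, tau0=1e-13, Ea=8.0e4):
            """Mean adsorption residence time [s] on a surface at T_surface [K]."""
            return tau0 * math.exp(Ea / (R_GAS * T_surface))

        def sticks(T_surface, transit_time):
            """Treat a molecule as stuck if it resides longer than the transit time."""
            return residence_time(T_surface) > transit_time

        for T in (250.0, 300.0, 350.0):
            print(f"T = {T:.0f} K: tau = {residence_time(T):.2e} s, "
                  f"sticks in a 1 s window: {sticks(T, 1.0)}")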

  8. Integrated Radiation Analysis and Design Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Radiation Analysis and Design Tools (IRADT) Project develops and maintains an integrated tool set that collects the current best practices, databases,...

  9. Design of Monitoring Tool Heartbeat Rate and Human Body Temperature Based on WEB

    Directory of Open Access Journals (Sweden)

    Jalinas

    2018-01-01

    Full Text Available The heart is one of the most important organs in the human body. One way to assess heart health is to measure the number of heartbeats per minute, and body temperature is likewise an indicator of health; many devices measure heart rate and body temperature, but most can only be accessed offline. This research aims to design a heart rate and human body temperature monitor whose measurement results can be accessed via web pages anywhere and anytime. The device can be used by many users, each entering a different ID number. The design consists of input blocks (a pulse sensor, a DS18B20 sensor and a 3x4 keypad), process blocks (an Arduino Mega 2560 microcontroller, an Ethernet Shield, a router and a USB modem) and an output block (a 16x2 LCD and a mobile phone or PC to access the web page). Based on the test results, this tool successfully measures the heart rate with an average error of 2.702% compared with an oximeter, and body temperature with an average error of 2.18%.

  10. Infrared thermography--a non-invasive tool to evaluate thermal status of neonatal pigs based on surface temperature.

    Science.gov (United States)

    Kammersgaard, T S; Malmkvist, J; Pedersen, L J

    2013-12-01

    Hypothermia is a major cause of mortality in neonatal pigs. Infrared (IR) thermography is a promising non-invasive method to assess thermal status, but has not been evaluated for use on neonatal pigs from birth. The aim of this study was to evaluate the application of IR thermography as a non-invasive tool to estimate body temperature and assess thermal status in newborn pigs by (1) estimating the relationship between surface temperature and rectal temperature (RT) in neonatal pigs; and (2) estimating the influence of air temperature (AT), birth weight and time from birth on the relationship between surface temperature and RT. The method was evaluated on the basis of 1695 thermograms and 915 RTs from 91 neonatal pigs born in loose farrowing pens with floor heating at 34°C and three different ATs (15°C, 20°C and 25°C). Full-body thermograms of the back and the side of the pigs and RT were acquired at 11 sampling times between birth and 48 h after birth. The maximum (IRmax), minimum and average full-body IR surface temperatures and the ear minimum IR surface temperature were derived from the thermograms. IRmax had the highest correlation with RT (0.82) and was therefore used in the statistical analysis. The relationship of RT to IRmax depended on the time from birth (slope at 0 h: 0.20°C). The method has the potential to be used without the need for manual restraint of the pigs. On the basis of the results of this study, we propose that the IRmax temperature from full-body thermograms can serve as a valid tool to assess the thermal status of neonatal piglets, but not as an identical substitute for RT.
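
    The study's core quantity is a linear relationship between RT and IRmax. A minimal least-squares sketch of such a fit is shown below; the data points are synthetic placeholders, not the study's measurements.

        # Illustrative least-squares fit of rectal temperature (RT) on maximum IR
        # surface temperature (IRmax); the data points are synthetic, not the
        # study's measurements.
        import numpy as np

        irmax = np.array([35.2, 36.0, 36.8, 37.5, 38.1, 38.6])   # [degC], synthetic
        rt    = np.array([36.1, 36.7, 37.4, 37.9, 38.4, 38.8])   # [degC], synthetic

        slope, intercept = np.polyfit(irmax, rt, 1)
        r = np.corrcoef(irmax, rt)[0, 1]
        print(f"RT = {slope:.2f} * IRmax + {intercept:.2f}  (r = {r:.2f})")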

  11. New methods to get valid signals at high temperature conditions by using DSP tools of the ASSA (Abnormal Signal Simulation Analyzer)

    International Nuclear Information System (INIS)

    Koo, Kil-Mo; Hong, Seong-Wan; Song, Jin-Ho; Baek, Won-Pil; Jung, Myung-Kwan

    2012-01-01

    A new method to obtain valid signals under high-temperature conditions, using DSP (Digital Signal Processing) tools of an ASSA (Abnormal Signal Simulation Analyzer) module, is suggested on the basis of a signal analysis of important circuit models under severe accident conditions. Such DSP techniques already exist, operated by LabVIEW or MatLab code linked with the PSpice code, and provide convenient tools as special functions of the ASSA module, including a signal reconstruction method. If a shift in the transient parameters, such as the time constant of an R-L-C circuit affected by high temperature under severe accident conditions, can be obtained, it is possible to reconstruct an abnormal signal using a trained deconvolution algorithm, a form of DSP technique. (author)
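
    The ASSA's trained deconvolution algorithm is not spelled out in the abstract; as a generic stand-in for the signal-reconstruction step, here is a standard FFT-based Wiener deconvolution sketch applied to a first-order circuit response. The impulse response, time constant, and noise-power term are illustrative assumptions.

        # Generic FFT-based Wiener deconvolution, standing in for the abstract's
        # 'trained deconvolution' step; impulse response and noise level are assumed.
        import numpy as np

        def wiener_deconvolve(measured, impulse_response, noise_power=1e-4):
            """Estimate the input signal from a measured output and a known
            impulse response, regularized by an assumed noise-to-signal power."""
            n = len(measured)
            H = np.fft.rfft(impulse_response, n)
            Y = np.fft.rfft(measured, n)
            G = np.conj(H) / (np.abs(H) ** 2 + noise_power)   # Wiener filter
            return np.fft.irfft(G * Y, n)

        # Example: a first-order (R-C-like) response with time constant tau
        # smearing a step input, as a proxy for a degraded sensor channel.
        dt, tau = 1e-3, 5e-2
        t = np.arange(0.0, 1.0, dt)
        h = (dt / tau) * np.exp(-t / tau)        # discrete impulse response
        x = (t > 0.2).astype(float)              # true step input
        y = np.convolve(x, h)[: len(t)]          # distorted measurement
        x_hat = wiener_deconvolve(y, h)
        print(f"max reconstruction error: {np.max(np.abs(x_hat - x)):.3f}")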

  12. Analysis and modeling of the seasonal South China Sea temperature cycle using remote sensing

    Science.gov (United States)

    Twigt, Daniel J.; de Goede, Erik D.; Schrama, Ernst J. O.; Gerritsen, Herman

    2007-10-01

    The present paper describes the analysis and modeling of the South China Sea (SCS) temperature cycle on a seasonal scale. It investigates the possibility of modeling this cycle consistently while not taking into account tidal forcing and the associated tidal mixing and exchange, motivated by the significant gain in the model's computational efficiency when tides are neglected. The goal is to develop a flexible and efficient tool for seasonal scenario analysis and to generate transport boundary forcing for local models. Given the significant spatial extent of the SCS basin and the focus on seasonal time scales, synoptic remote sensing is an ideal tool for this analysis: it is used to assess the seasonal temperature cycle, to identify the relevant driving forces, and as a valuable source of input data for modeling. Model simulations are performed using a three-dimensional baroclinic reduced-depth model, driven by monthly mean sea surface anomaly boundary forcing, monthly mean lateral temperature and salinity forcing obtained from the World Ocean Atlas 2001 climatology, six-hourly meteorological forcing from the European Centre for Medium-Range Weather Forecasts ERA-40 dataset, and remotely sensed sea surface temperature (SST) data. A sensitivity analysis of model forcing and coefficients is performed, and the model results are quantitatively assessed against climatological temperature profiles using a goodness-of-fit norm. In the deep regions, the model results are in good agreement with this validation data; in the shallow regions, discrepancies are found. To improve the agreement there, we apply an SST nudging method at the free water surface, which considerably improves the model's vertical temperature representation in the shallow regions. Based on the model validation against climatological in situ and SST data, we conclude that the seasonal temperature cycle for the deep SCS basin can be represented to a good degree. For shallow
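
    SST nudging of the kind described is usually a relaxation of the modeled surface temperature toward the observed field. A minimal sketch, assuming a simple explicit relaxation with a hypothetical time step and relaxation time scale (the paper's actual values are not given in the abstract):

        # Relaxation (nudging) of modeled SST toward observed SST:
        # dT/dt += (T_obs - T_model) / tau. Time step and relaxation time
        # scale are hypothetical, not the paper's values.
        import numpy as np

        def nudge_sst(t_model, t_obs, dt, tau):
            """One explicit nudging step toward the observed field."""
            return t_model + (dt / tau) * (t_obs - t_model)

        t_model = np.array([28.4, 29.1, 30.2])   # model SST [degC], synthetic
        t_obs   = np.array([27.9, 29.0, 29.5])   # satellite SST [degC], synthetic
        for _ in range(48):                      # one day of 30 min steps
            t_model = nudge_sst(t_model, t_obs, dt=1800.0, tau=5.0 * 86400.0)
        print(np.round(t_model, 2))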

  13. Analysis of logging data from nuclear borehole tools

    International Nuclear Information System (INIS)

    Hovgaard, J.; Oelgaard, P.L.

    1989-12-01

    The processing procedure for logging data from a borehole of the Stenlille project of Dansk Naturgas A/S has been analysed. The tools considered in the analysis were an integral natural-gamma tool, a neutron porosity tool, a gamma density tool and a caliper tool. It is believed that in most cases the processing procedure used by the logging company in the interpretation of the raw data is fully understood; an exception is the epithermal part of the neutron porosity tool, for which not all data needed for an interpretation were available. The analysis has shown that some parts of the interpretation procedure may not be consistent with the physical principles of the tools. (author)

  14. Design A Prototype of Temperature Logging Tools for Geothermal Prospecting Areas

    Directory of Open Access Journals (Sweden)

    Supriyanto

    2013-08-01

    Full Text Available The costs of geothermal exploration are very high because the technology is still imported from other countries, and local businesses in the geothermal sector lack the ability to compete with global companies. To reduce costs, we need to develop our own equipment at competitive prices. Here in Indonesia, we have started to design a prototype of temperature logging tools for geothermal prospecting areas. This equipment can be used to measure temperature variation versus depth. To measure the thermal gradient, the platinum resistance temperature sensor is moved slowly down along the borehole, and the displacement along the borehole is measured by a rotary encoder. The system is controlled by a 16-bit H8/3069F microcontroller, and the acquired temperature data are displayed on a PC monitor using a Python graphical user interface. The system has already been tested in the Gunung Pancar geothermal prospect area in Bogor.
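
    The abstract names a platinum resistance temperature sensor read out by a microcontroller and displayed through a Python GUI. Below is a small Python sketch of the standard resistance-to-temperature conversion for such a sensor, using the IEC 60751 Callendar-Van Dusen coefficients for a Pt100 element; the prototype's exact sensor type and wiring are assumptions.

        # Convert a platinum RTD resistance reading to temperature with the
        # Callendar-Van Dusen equation (T >= 0 degC branch). Coefficients are
        # the IEC 60751 standard values for a Pt100 element; the prototype's
        # actual sensor type and wiring are assumptions.
        import math

        R0 = 100.0         # resistance at 0 degC [ohm] (Pt100)
        A = 3.9083e-3      # IEC 60751 coefficient [1/degC]
        B = -5.775e-7      # IEC 60751 coefficient [1/degC^2]

        def pt100_temperature(resistance):
            """Invert R(T) = R0 * (1 + A*T + B*T^2) with the quadratic formula."""
            c = 1.0 - resistance / R0
            return (-A + math.sqrt(A * A - 4.0 * B * c)) / (2.0 * B)

        for r in (100.00, 119.40, 138.51, 175.86):   # sample readings [ohm]
            print(f"R = {r:7.2f} ohm -> T = {pt100_temperature(r):7.2f} degC")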

  15. Pointer Analysis for JavaScript Programming Tools

    DEFF Research Database (Denmark)

    Feldthaus, Asger

    Tools that can assist the programmer with tasks such as refactoring or code navigation have proven popular for Java, C#, and other programming languages. JavaScript is a widely used programming language, and its users could likewise benefit from such tools, but the dynamic nature of the language is an obstacle for their development. Because of this, tools for JavaScript have long remained ineffective compared to those for many other programming languages. Static pointer analysis can provide a foundation for more powerful tools, although the design of this analysis is itself a complicated endeavor. In this work, we explore techniques for performing pointer analysis of JavaScript programs, and we find novel applications of these techniques. In particular, we demonstrate how these can be used for code navigation, automatic refactoring, semi-automatic refactoring of incomplete programs, and checking of type

  16. The CANDU alarm analysis tool (CAAT)

    Energy Technology Data Exchange (ETDEWEB)

    Davey, E C; Feher, M P; Lupton, L R [Control Centre Technology Branch, ON (Canada)

    1997-09-01

    AECL undertook the development of a software tool to assist alarm system designers and maintainers based on feedback from several utilities and design groups. The software application is called the CANDU Alarm Analysis Tool (CAAT) and is being developed to: reduce by one half the effort required to initially implement and commission alarm system improvements; improve the operational relevance, consistency and accuracy of station alarm information; record the basis for alarm-related decisions; provide printed reports of the current alarm configuration; and make day-to-day maintenance of the alarm database less tedious and more cost-effective. The CAAT assists users in accessing, sorting and recording relevant information, design rules and decisions, and provides reports in support of alarm system maintenance, analysis of design changes, or regulatory inquiry. The paper discusses the need for such a tool, outlines the application objectives and principles used to guide tool development, describes how specific tool features support user design and maintenance tasks, and relates the lessons learned from early application experience. (author). 4 refs, 2 figs.

  17. The CANDU alarm analysis tool (CAAT)

    International Nuclear Information System (INIS)

    Davey, E.C.; Feher, M.P.; Lupton, L.R.

    1997-01-01

    AECL undertook the development of a software tool to assist alarm system designers and maintainers based on feedback from several utilities and design groups. The software application is called the CANDU Alarm Analysis Tool (CAAT) and is being developed to: reduce by one half the effort required to initially implement and commission alarm system improvements; improve the operational relevance, consistency and accuracy of station alarm information; record the basis for alarm-related decisions; provide printed reports of the current alarm configuration; and make day-to-day maintenance of the alarm database less tedious and more cost-effective. The CAAT assists users in accessing, sorting and recording relevant information, design rules and decisions, and provides reports in support of alarm system maintenance, analysis of design changes, or regulatory inquiry. The paper discusses the need for such a tool, outlines the application objectives and principles used to guide tool development, describes how specific tool features support user design and maintenance tasks, and relates the lessons learned from early application experience. (author). 4 refs, 2 figs

  18. Affordances of agricultural systems analysis tools

    NARCIS (Netherlands)

    Ditzler, Lenora; Klerkx, Laurens; Chan-Dentoni, Jacqueline; Posthumus, Helena; Krupnik, Timothy J.; Ridaura, Santiago López; Andersson, Jens A.; Baudron, Frédéric; Groot, Jeroen C.J.

    2018-01-01

    The increasingly complex challenges facing agricultural systems require problem-solving processes and systems analysis (SA) tools that engage multiple actors across disciplines. In this article, we employ the theory of affordances to unravel what tools may furnish users, and how those affordances

  19. msBiodat analysis tool, big data analysis for high-throughput experiments.

    Science.gov (United States)

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) encompasses a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce a large amount of data, presented as lists of hundreds or thousands of proteins, and filtering those data efficiently is the first step in extracting biologically relevant information. The filtering can be sharpened by merging previous data with data obtained from public databases, resulting in an accurate list of proteins that meet the predetermined conditions. In this article we present the msBiodat Analysis Tool, a web-based application intended to bring proteomics closer to big data analysis. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of the msBiodat Analysis Tool is the possibility of selecting proteins by their Gene Ontology annotation using Gene ID, Ensembl or UniProt codes. The msBiodat Analysis Tool is a web-based application that gives researchers with any level of programming experience the advantages of efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. The msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.

  20. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS/E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.

  1. Prediction Of Abrasive And Diffusive Tool Wear Mechanisms In Machining

    Science.gov (United States)

    Rizzuti, S.; Umbrello, D.

    2011-01-01

    Tool wear prediction is regarded as a very important task for maximizing tool performance, minimizing cutting costs and improving workpiece quality in cutting. In this research work, an experimental campaign was carried out under varying cutting conditions with the aim of measuring both crater and flank tool wear during machining of AISI 1045 steel with an uncoated carbide tool (P40). In parallel, a FEM-based analysis was developed to study the tool wear mechanisms, taking into account the influence of the cutting conditions and of the temperature reached on the tool surfaces. The results show that, when the temperature of the tool rake surface is lower than the activation temperature of the diffusive phenomenon, the wear rate can be estimated by applying an abrasive model. In contrast, in the tool area where the temperature is higher than the diffusive activation temperature, the wear rate can be evaluated by applying a diffusive model. Finally, for temperatures between the above values, an adapted abrasive-diffusive wear model makes it possible to correctly evaluate the tool wear phenomena.
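
    A minimal sketch of the temperature-switched evaluation the abstract describes: an Archard-type abrasive term below the diffusion activation temperature and an Usui-type diffusive term above it. The activation temperature, coefficients, and units are hypothetical placeholders, not the paper's calibrated values, and the hard switch simplifies the paper's blended abrasive-diffusive treatment.

        # Temperature-switched wear rate: Archard-type abrasive term below the
        # diffusion activation temperature, Usui-type diffusive term above it.
        # All constants (and the arbitrary wear-rate units) are hypothetical,
        # and the hard switch simplifies the paper's blended treatment.
        import math

        T_ACT = 1000.0                  # assumed activation temperature [K]

        def wear_rate(sigma_n, v_s, T):
            """Local wear rate at normal stress sigma_n [MPa], sliding speed
            v_s [m/s], and tool-surface temperature T [K]."""
            if T < T_ACT:
                k_abr = 1.0e-9                          # abrasive coefficient
                return k_abr * sigma_n * v_s            # Archard-type term
            k_dif, Q = 50.0, 2.0e4                      # diffusive constants
            return k_dif * sigma_n * v_s * math.exp(-Q / T)   # Usui-type term

        for temp in (800.0, 950.0, 1100.0, 1250.0):
            print(f"T = {temp:6.1f} K -> wear rate = {wear_rate(400.0, 3.0, temp):.3e}")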

  2. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation, also provided with the ETA Tool software release package, that were used to generate the reports presented in the manual.

  3. The physics analysis tools project for the ATLAS experiment

    International Nuclear Information System (INIS)

    Lenzi, Bruno

    2012-01-01

    The Large Hadron Collider is expected to start colliding proton beams in 2009. The enormous amount of data produced by the ATLAS experiment (≅1 PB per year) will be used in searches for the Higgs boson and physics beyond the Standard Model. In order to meet this challenge, a suite of common Physics Analysis Tools has been developed as part of the Physics Analysis software project. These tools run within the ATLAS software framework, ATHENA, and cover a wide range of applications. There are tools responsible for event selection based on analysed data and detector quality information, tools responsible for specific physics analysis operations including data quality monitoring and physics validation, and complete analysis tool-kits (frameworks) whose goal is to aid physicists in performing their analyses while hiding the details of the ATHENA framework. (authors)

  4. Cluster tool for in situ processing and comprehensive characterization of thin films at high temperatures.

    Science.gov (United States)

    Wenisch, Robert; Lungwitz, Frank; Hanf, Daniel; Heller, Rene; Zscharschuch, Jens; Hübner, René; von Borany, Johannes; Abrasonis, Gintautas; Gemming, Sibylle; Escobar-Galindo, Ramon; Krause, Matthias

    2018-05-31

    A new cluster tool for in situ real-time processing and depth-resolved compositional, structural and optical characterization of thin films at temperatures from -100 to 800 °C is described. The implemented techniques comprise magnetron sputtering, ion irradiation, Rutherford backscattering spectrometry, Raman spectroscopy and spectroscopic ellipsometry. The capability of the cluster tool is demonstrated for a layer stack of MgO/amorphous Si (~60 nm)/Ag (~30 nm), deposited at room temperature and crystallized, with partial layer exchange, by heating up to 650 °C. Its initial and final composition, stacking order and structure were monitored in situ in real time, and a reaction progress was defined as a function of time and temperature.

  5. Abstract interfaces for data analysis - component architecture for data analysis tools

    International Nuclear Information System (INIS)

    Barrand, G.; Binko, P.; Doenszelmann, M.; Pfeiffer, A.; Johnson, A.

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualisation), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis'99 workshop, a working group has been formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. The authors give an overview of the architecture and design of the various components for data analysis as discussed in AIDA.

  6. SBAT. A stochastic BPMN analysis tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents SBAT, a tool framework for the modelling and analysis of complex business workflows. SBAT is applied to analyse an example from the Danish baked goods industry. Based upon the Business Process Modelling and Notation (BPMN) language for business process modelling, we describe a formalised variant of this language extended to support the addition of intention-preserving stochastic branching and parameterised reward annotations. Building on previous work, we detail the design of SBAT, a software tool which allows for the analysis of BPMN models. Within SBAT, properties of interest

  7. PROMOTION OF PRODUCTS AND ANALYSIS OF MARKET OF POWER TOOLS

    Directory of Open Access Journals (Sweden)

    Sergey S. Rakhmanov

    2014-01-01

    Full Text Available The article describes the general situation on the power tool market, both in Russia and worldwide. It provides a comparative analysis of competitors, an analysis of the structure of the power tool market, and an assessment of the competitiveness of some major product lines. It also analyses the promotion methods used by companies selling tools, including a competitive analysis of the range of Bosch, the leader in its segment, among the power tools available on the Russian market.

  8. Vehicle Technology Simulation and Analysis Tools

    Science.gov (United States)

    NREL developed the following modeling, simulation, and analysis tools to investigate novel design goals (e.g., fuel economy versus performance) and find cost-competitive solutions, including the ADOPT Vehicle Simulator to analyze the performance and fuel economy of conventional and advanced light- and

  9. Decision Analysis Tools for Volcano Observatories

    Science.gov (United States)

    Hincks, T. H.; Aspinall, W.; Woo, G.

    2005-12-01

    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.

  10. Protein analysis tools and services at IBIVU

    Directory of Open Access Journals (Sweden)

    Brandt Bernd W.

    2011-06-01

    Full Text Available In recent years, several new tools applicable to protein analysis have been made available on the IBIVU web site. Recently, a number of tools, ranging from multiple sequence alignment construction to domain prediction, have been updated and/or extended with services for programmatic access using SOAP. We provide an overview of these tools and their application.

  11. Medical decision making tools: Bayesian analysis and ROC analysis

    International Nuclear Information System (INIS)

    Lee, Byung Do

    2006-01-01

    During the diagnostic process for various oral and maxillofacial lesions, we should consider the following: when should we order diagnostic tests? What tests should be ordered? How should we interpret the results clinically? And how should we use this frequently imperfect information to make optimal medical decisions? To help clinicians make proper judgements, several decision making tools have been suggested. This article discusses the concept of diagnostic accuracy (sensitivity and specificity values) together with several decision making tools, such as the decision matrix, ROC analysis and Bayesian analysis. The article also explains the introductory concept of the ORAD program.
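
    As a worked example of the Bayesian side of the article, the post-test probability of disease can be computed from a test's sensitivity and specificity and the pre-test prevalence. The numbers below are hypothetical, chosen only to illustrate the arithmetic:

        # Post-test probability of disease from sensitivity, specificity, and
        # pre-test prevalence (Bayes' theorem); the numbers are hypothetical.
        def post_test_probability(sensitivity, specificity, prevalence, positive=True):
            """Posterior probability of disease given the test result."""
            if positive:
                tp = sensitivity * prevalence                    # true positives
                fp = (1.0 - specificity) * (1.0 - prevalence)    # false positives
                return tp / (tp + fp)
            fn = (1.0 - sensitivity) * prevalence                # false negatives
            tn = specificity * (1.0 - prevalence)                # true negatives
            return fn / (fn + tn)

        print(f"P(disease | positive) = {post_test_probability(0.90, 0.85, 0.10):.2f}")
        print(f"P(disease | negative) = "
              f"{post_test_probability(0.90, 0.85, 0.10, positive=False):.2f}")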

  12. Nutrition screening tools: an analysis of the evidence.

    Science.gov (United States)

    Skipper, Annalynn; Ferguson, Maree; Thompson, Kyle; Castellanos, Victoria H; Porcari, Judy

    2012-05-01

    In response to questions about tools for nutrition screening, an evidence analysis project was developed to identify the most valid and reliable nutrition screening tools for use in acute care and hospital-based ambulatory care settings. An oversight group defined nutrition screening and literature search criteria. A trained analyst conducted structured searches of the literature for studies of nutrition screening tools according to predetermined criteria. Eleven nutrition screening tools designed to detect undernutrition in patients in acute care and hospital-based ambulatory care were identified. Trained analysts evaluated articles for quality using criteria specified by the American Dietetic Association's Evidence Analysis Library. Members of the oversight group assigned quality grades to the tools based on the quality of the supporting evidence, including reliability and validity data. One tool, the NRS-2002, received a grade I, and four tools (the Simple Two-Part Tool, the Mini-Nutritional Assessment-Short Form (MNA-SF), the Malnutrition Screening Tool (MST), and the Malnutrition Universal Screening Tool (MUST)) received a grade II. The MST was the only tool shown to be both valid and reliable for identifying undernutrition in the settings studied. Thus, validated nutrition screening tools that are simple and easy to use are available for application in acute care and hospital-based ambulatory care settings.

  13. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-II analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This presentation will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  14. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara Kristina; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  15. Study of Cutting Edge Temperature and Cutting Force of End Mill Tool in High Speed Machining

    Directory of Open Access Journals (Sweden)

    Kiprawi Mohammad Ashaari

    2017-01-01

    Full Text Available Wear of cutting tools during the machining process is unavoidable due to the frictional forces that accompany the removal of unwanted workpiece material. Although unavoidable, wear can be held to a slower rate if the cutting speed is fixed at the point that gives optimum cutting conditions. The wear of cutting tools is closely related to the thermal deformation that occurs at the frictional contact point between the cutting edge of the tool and the workpiece. This research paper focuses on determining the relationships among cutting temperature, cutting speed, cutting force and radial depth of cut. The cutting temperature is determined by using Indium Arsenide (InAs) and Indium Antimonide (InSb) photocells to measure the infrared radiation emitted from the cutting tool, and the cutting forces are determined by using a dynamometer. The high-speed machining process consists of end milling the outer surface of carbon steel. The signal from the photocells is digitally visualized on a digital oscilloscope. Based on the results, the cutting temperature increased as the radial depth and cutting speed increased. The cutting forces increased when the radial depth increased but decreased when the cutting speed increased. The calibration setup and a discussion of the experiment are presented in this paper.

  16. Determining Ms temperature on a AISI D2 cold work tool steel using magnetic Barkhausen noise

    Energy Technology Data Exchange (ETDEWEB)

    Huallpa, Edgar Apaza, E-mail: gared1@gmail.com [Escola Politécnica da Universidade de São Paulo, Av. Prof. Mello Moraes 2463, 05508-030 SP (Brazil); Sánchez, J. Capó, E-mail: jcapo@usp.br [Departamento de Física, Facultad de Ciencias Naturales, Universidad de Oriente, Av. Patricio Lumumba s/n 90500, Santiago de Cuba (Cuba); Padovese, L.R., E-mail: lrpadove@usp.br [Escola Politécnica da Universidade de São Paulo, Av. Prof. Mello Moraes 2463, 05508-030 SP (Brazil); Goldenstein, Hélio, E-mail: hgoldens@usp.br [Escola Politécnica da Universidade de São Paulo, Av. Prof. Mello Moraes 2463, 05508-030 SP (Brazil)

    2013-11-15

    Highlights: ► MBN was used to follow the martensite transformation in a tool steel. ► The results were compared with resistivity experiments. ► Ms was estimated with the Andrews equation coupled to ThermoCalc calculations; the experimental results showed good agreement. -- Abstract: The use of Magnetic Barkhausen Noise (MBN) as an experimental method for measuring the martensite start (Ms) temperature was explored, using as a model system a cold-work tool steel (AISI D2) austenitized at a very high temperature (1473 K) so as to transform at sub-zero temperatures. The progress of the transformation was also followed with electrical resistance measurements, optical microscopy and scanning electron microscopy. Both MBN and resistivity measurements showed a change near 230 K during cooling, corresponding to the Ms temperature, compared with the 245 K estimated with the Andrews empirical equation applied to the austenite composition calculated using ThermoCalc.
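
    The Andrews (linear) relation the authors used is a simple composition-weighted formula and is easy to reproduce. In the sketch below, the dissolved-alloy contents are illustrative guesses for D2 after a 1473 K austenitization (much of the nominal C and Cr remains tied up in carbides), not the paper's ThermoCalc-computed values:

        # Andrews' linear equation for the martensite start temperature, applied
        # to an illustrative guess of the dissolved austenite composition for D2
        # after a 1473 K austenitization (not the paper's ThermoCalc values).
        def ms_andrews(C, Mn=0.0, Ni=0.0, Cr=0.0, Mo=0.0):
            """Ms estimate [degC] from wt% of elements dissolved in austenite."""
            return 539.0 - 423.0 * C - 30.4 * Mn - 17.7 * Ni - 12.1 * Cr - 7.5 * Mo

        ms_c = ms_andrews(C=1.0, Mn=0.3, Cr=10.0, Mo=0.6)   # hypothetical composition
        print(f"Ms ~ {ms_c:.0f} degC = {ms_c + 273.15:.0f} K")  # sub-zero, as observed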

  17. Direct implementation of an axial-flow helium gas turbine tool in a system analysis tool for HTGRs

    International Nuclear Information System (INIS)

    Kim, Ji Hwan; No, Hee Cheon; Kim, Hyeun Min; Lim, Hong Sik

    2008-01-01

    This study concerns the development of dynamic models for a high-temperature gas-cooled reactor (HTGR) through the direct implementation of a gas turbine analysis code in a transient analysis code. We have developed a streamline curvature analysis code based on a Newton-Raphson numerical application (SANA) to analyze the off-design performance of helium gas turbines under normal operating conditions. The SANA code performs a detailed two-dimensional analysis by means of a throughflow calculation with allowances for losses in axial-flow multistage compressors and turbines. To evaluate the steady-state and load-transient performance of HTGRs, we developed GAMMA-T by implementing SANA in the transient system code GAMMA, a multidimensional, multicomponent analysis tool for HTGRs. The reactor, heat exchangers, and connecting pipes were modeled with the one-dimensional thermal-hydraulic model of the GAMMA code. We assessed GAMMA-T by comparing its results with the steady-state results of JAEA's GTHTR300 and concluded that the results are in good agreement, including those for the vessel cooling bypass flow and the turbine cooling flow.

  18. Sustainability Tools Inventory - Initial Gaps Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite's analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation, including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferrable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional tools needed to help communities achieve sustainability. It contributes to SHC 1.61.4

  19. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of the Matlab (R)-based implementation is presented and special features are introduced, motivated by industrial users. Salient features of the tool are presented, including the ability to specify the behavior of a complex system at a high level of functional abstraction, analyze single and multiple fault scenarios and automatically generate parity relations for diagnosis for the system in normal and impaired conditions. User interface and algorithmic details are presented.

  20. SECIMTools: a suite of metabolomics data analysis tools.

    Science.gov (United States)

    Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M

    2018-04-20

    Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high-throughput technology for untargeted analysis of metabolites. Open-access, easy-to-use analytic tools that are broadly accessible to the biological community need to be developed. While the technology used in metabolomics varies, most metabolomics studies have a set of features identified. Galaxy is an open-access platform that enables scientists at all levels to interact with big data; it promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modular modularity clustering), basic statistical analysis methods (partial least squares discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator (LASSO) and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.

  1. Post-Flight Data Analysis Tool

    Science.gov (United States)

    George, Marina

    2018-01-01

    A software tool that facilitates the retrieval and analysis of post-flight data. This allows our team and other teams to effectively and efficiently analyze and evaluate post-flight data in order to certify commercial providers.

  2. The Arabidopsis co-expression tool (act): a WWW-based tool and database for microarray-based gene expression analysis

    DEFF Research Database (Denmark)

    Jen, C. H.; Manfield, I. W.; Michalopoulos, D. W.

    2006-01-01

    We present a new WWW-based tool for plant gene analysis, the Arabidopsis Co-Expression Tool (act), based on a large Arabidopsis thaliana microarray data set obtained from the Nottingham Arabidopsis Stock Centre. The co-expression analysis tool allows users to identify genes whose expression is correlated across the data set; such lists can then be examined using the novel clique finder tool to determine the sets of genes most likely to be regulated in a similar manner. In combination, these tools offer three levels of analysis: creation of correlation lists of co-expressed genes, refinement of these lists using two-dimensional scatter plots, and examination of the resulting gene sets with the clique finder tool.

  3. Abstract Interfaces for Data Analysis Component Architecture for Data Analysis Tools

    CERN Document Server

    Barrand, G; Dönszelmann, M; Johnson, A; Pfeiffer, A

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualisation), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group has been formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of any component individually. The interfaces have been defined in Java and C++, and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++.

  4. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.
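
    The frequency response measure in BAL-003-1 is, in essence, the MW change between the pre-disturbance point A and the post-disturbance stabilization point B divided by the frequency change expressed in 0.1 Hz units. A simplified sketch of that arithmetic follows; the event numbers are invented, and the FRAT's actual data handling, PMU/SCADA ingestion, and baselining are not shown.

        # Simplified BAL-003-1-style frequency response arithmetic: MW change
        # between pre-disturbance point A and stabilization point B over the
        # frequency change in 0.1 Hz units. Event values are invented.
        def frequency_response(p_a_mw, p_b_mw, f_a_hz, f_b_hz):
            """FRM = (P_B - P_A) / ((f_A - f_B) * 10), in MW per 0.1 Hz."""
            return (p_b_mw - p_a_mw) / ((f_a_hz - f_b_hz) * 10.0)

        # Example: output swings by 90 MW while frequency sags 60.000 -> 59.955 Hz.
        frm = frequency_response(0.0, 90.0, 60.000, 59.955)
        print(f"FRM = {frm:.1f} MW per 0.1 Hz")   # 200.0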

  5. Analysis Tool Web Services from the EMBL-EBI.

    Science.gov (United States)

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-07-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods.

  6. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...

  7. Database tools for enhanced analysis of TMX-U data

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed offline from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving offline data analysis environment on the USC computers

  8. Database tools for enhanced analysis of TMX-U data

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed offline from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving offline data analysis environment on the USC computers

  9. Designing a Tool for History Textbook Analysis

    Directory of Open Access Journals (Sweden)

    Katalin Eszter Morgan

    2012-11-01

    Full Text Available This article describes the process by which a five-dimensional tool for history textbook analysis was conceptualized and developed in three stages. The first stage consisted of a grounded theory approach to code the content of the sampled chapters of the books inductively. After that the findings from this coding process were combined with principles of text analysis as derived from the literature, specifically focusing on the notion of semiotic mediation as theorized by Lev VYGOTSKY. We explain how we then entered the third stage of the development of the tool, comprising five dimensions. Towards the end of the article we show how the tool could be adapted to serve other disciplines as well. The argument we forward in the article is for systematic and well theorized tools with which to investigate textbooks as semiotic mediators in education. By implication, textbook authors can also use these as guidelines. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs130170

  10. "Soft"or "hard" ionisation? Investigation of metastable gas temperature effect on direct analysis in real-time analysis of Voriconazole.

    Science.gov (United States)

    Lapthorn, Cris; Pullen, Frank

    2009-01-01

    The performance of the direct analysis in real-time (DART) technique was evaluated across a range of metastable gas temperatures for a pharmaceutical compound, Voriconazole, in order to investigate the effect of metastable gas temperature on molecular ion intensity and fragmentation. The DART source has been used to analyse a range of analytes and from a range of matrices including drugs in solid tablet form and preparations, active ingredients in ointment, naturally occurring plant alkaloids, flavours and fragrances, from thin layer chromatography (TLC) plates, melting point tubes and biological matrices including hair, urine and blood. The advantages of this technique include rapid analysis time (as little as 5 s), a reduction in sample preparation requirements, elimination of mobile phase requirement and analysis of samples not typically amenable to atmospheric pressure ionisation (API) techniques. This technology has therefore been proposed as an everyday tool for identification of components in crude organic reaction mixtures.

  11. INNOVATIVE INSTRUMENTATION AND ANALYSIS OF THE TEMPERATURE MEASUREMENT FOR HIGH TEMPERATURE GASIFICATION

    Energy Technology Data Exchange (ETDEWEB)

    Seong W. Lee

    2003-09-01

    During this reporting period, the literature survey, covering the gasifier temperature measurement literature, the ultrasonic cleaning application and its background, and the spray coating process, has been completed. The gasifier simulator (cold model) testing has been successfully conducted. Four factors (blower voltage, ultrasonic application, injection time intervals, particle weight) were considered as significant factors that affect the temperature measurement. Analysis of Variance (ANOVA) was applied to analyze the test data. The analysis shows that all four factors are significant to the temperature measurements in the gasifier simulator (cold model). The regression analysis for the case with the normalized room temperature shows that a linear model fits the temperature data with 82% accuracy (18% error); for the case without the normalized room temperature, the accuracy is 72.5% (27.5% error). The nonlinear regression analysis indicates a better fit than the linear regression: the nonlinear model's accuracy is 88.7% (11.3% error) for the normalized room temperature case. The hot model thermocouple sleeve design and fabrication are completed, as are the design and fabrication of the gasifier simulator (hot model). System tests of the gasifier simulator (hot model) have been conducted and some modifications have been made. Based on the system tests and the analysis of their results, the gasifier simulator (hot model) has met the proposed design requirements and is ready for system testing. The ultrasonic cleaning method is under evaluation and will be further studied for the gasifier simulator (hot model) application. The progress of this project has been on schedule.
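
    As a concrete illustration of the linear versus nonlinear regression comparison reported above, the following is a minimal sketch in Python; the four-factor design, the synthetic response, and the scikit-learn dependency are assumptions for illustration, not the report's actual data or code.

```python
# Illustrative sketch only: four-factor response data and the
# scikit-learn dependency are assumptions, not the report's data/code.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
# Factors: blower voltage, ultrasonic application, injection interval,
# particle weight (all scaled to [0, 1] for the sketch).
X = rng.uniform(0.0, 1.0, size=(40, 4))
# Synthetic temperature response with a mild nonlinearity plus noise.
y = 2.0 + X @ np.array([1.5, 0.8, -1.2, 0.5]) + 0.9 * X[:, 0] ** 2 \
    + rng.normal(0.0, 0.1, 40)

linear = LinearRegression().fit(X, y)
print("linear R^2:", r2_score(y, linear.predict(X)))

# A second-order (nonlinear-in-the-factors) model fits better, mirroring
# the linear-vs-nonlinear comparison reported above.
X2 = PolynomialFeatures(degree=2, include_bias=False).fit_transform(X)
quadratic = LinearRegression().fit(X2, y)
print("quadratic R^2:", r2_score(y, quadratic.predict(X2)))
```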

  12. Cemented carbide cutting tool: Laser processing and thermal stress analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yilbas, B.S. [Mechanical Engineering Department, KFUPM, Box 1913, Dhahran 31261 (Saudi Arabia)]. E-mail: bsyilbas@kfupm.edu.sa; Arif, A.F.M. [Mechanical Engineering Department, KFUPM, Box 1913, Dhahran 31261 (Saudi Arabia); Karatas, C. [Engineering Faculty, Hacettepe University, Ankara (Turkey); Ahsan, M. [Mechanical Engineering Department, KFUPM, Box 1913, Dhahran 31261 (Saudi Arabia)

    2007-04-15

    Laser treatment of a cemented carbide tool surface consisting of W, C, TiC and TaC is examined, and the thermal stress developed due to temperature gradients in the laser-treated region is predicted numerically. The temperature rise in the substrate material is computed numerically using the Fourier heating model. Experiments were carried out to treat the tool surfaces using a CO2 laser, while SEM, XRD and EDS analyses were carried out for morphological and structural characterization of the treated surface. The laser parameters selected include the laser output power, duty cycle, assisting gas pressure, scanning speed, and nominal focus setting of the focusing lens. It is found that the temperature gradient attains significantly high values below the surface, particularly for titanium and tantalum carbides, which in turn results in high thermal stress generation in this region. SEM examination of the laser-treated surface and its cross section reveals that cracks initiate below the surface and extend over the depth of the laser-treated region.

  13. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Jyri Pakarinen

    2010-01-01

    Full Text Available Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in the Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
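
    One of the basic measurements such a distortion-analysis tool automates is total harmonic distortion. The sketch below (in Python rather than the tool's Matlab, purely for illustration) drives a memoryless tanh soft clipper, standing in for a real distorting effect, with a sine wave and reads the harmonic levels off an FFT.

```python
# Total harmonic distortion of a memoryless tanh "device"; the clipper
# stands in for a real distorting audio effect.
import numpy as np

fs, f0, n = 48000, 1000, 48000            # sample rate, test tone, 1 s
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t)
y = np.tanh(3.0 * x)                      # nonlinearity under test

spectrum = np.abs(np.fft.rfft(y * np.hanning(n)))
bin0 = f0 * n // fs                       # fundamental lands on a bin
fundamental = spectrum[bin0]
harmonics = np.array([spectrum[k * bin0] for k in range(2, 10)])
thd = np.sqrt(np.sum(harmonics ** 2)) / fundamental
print(f"THD: {100 * thd:.2f} %")
```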

  14. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    Poucet, A.; Guagnini, E.

    1989-01-01

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of computer-aided reliability analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, and logic model construction and evaluation. Expert system technology offers the most promising perspective for developing such a computer-aided reliability analysis tool. Combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so the analyst can become aware of why and how results are obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved.

  15. Dedicated tool to assess the impact of a rhetorical task on human body temperature.

    Science.gov (United States)

    Koprowski, Robert; Wilczyński, Sławomir; Martowska, Katarzyna; Gołuch, Dominik; Wrocławska-Warchala, Emilia

    2017-10-01

    Functional infrared thermal imaging is a method widely used in medicine, including analysis of the mechanisms related to the effect of emotions on physiological processes. The article shows how the body temperature may change during stress associated with performing a rhetorical task and proposes new parameters useful for dynamic thermal imaging measurements. MATERIALS AND METHODS: 29 healthy male subjects were examined. They were given a rhetorical task that induced stress. Analysis and processing of the collected body temperature data, at a spatial resolution of 256×512 pixels and a temperature resolution of 0.1°C, made it possible to show the dynamics of temperature changes. This analysis was preceded by dedicated image analysis and processing methods. RESULTS: The presented dedicated algorithm for image analysis and processing allows for fully automated, reproducible and quantitative assessment of temperature changes and time constants in a sequence of thermal images of the patient. When performing the rhetorical task, the temperature rose by 0.47±0.19°C in 72.41% of the subjects, including 20.69% in whom the temperature subsequently decreased by 0.49±0.14°C after 237±141 s. For 20.69% of the subjects only a drop in temperature was registered. For the remaining 6.89% of the cases, no temperature changes were registered. CONCLUSIONS: The performance of the rhetorical task by the subjects causes body temperature changes. The ambiguous temperature response to the given stress factor indicates the complex mechanisms responsible for regulating stressful situations. Stress associated with the examination itself induces body temperature changes. These changes should always be taken into account in the analysis of infrared data. Copyright © 2017 Elsevier B.V. All rights reserved.
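
    The time constants mentioned above are typically obtained by fitting an exponential law to each pixel's temperature trace. The following is an illustrative sketch of such a fit on a synthetic trace; the model form, the synthetic data, and the SciPy dependency are assumptions for illustration, not the paper's algorithm.

```python
# Fit T(t) = T0 + dT * (1 - exp(-t / tau)) to a synthetic pixel trace;
# the trace and the SciPy dependency are assumptions for illustration.
import numpy as np
from scipy.optimize import curve_fit

def step_response(t, T0, dT, tau):
    return T0 + dT * (1.0 - np.exp(-t / tau))

t = np.arange(0.0, 600.0, 2.0)            # seconds, one frame per 2 s
rng = np.random.default_rng(2)
trace = step_response(t, 36.2, 0.47, 150.0) + rng.normal(0.0, 0.05, t.size)

params, _ = curve_fit(step_response, t, trace, p0=(36.0, 0.5, 100.0))
print("T0 = %.2f degC, dT = %.2f degC, tau = %.0f s" % tuple(params))
```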

  16. Thermal buckling comparative analysis using Different FE (Finite Element) tools

    Energy Technology Data Exchange (ETDEWEB)

    Banasiak, Waldemar; Labouriau, Pedro [INTECSEA do Brasil, Rio de Janeiro, RJ (Brazil); Burnett, Christopher [INTECSEA UK, Surrey (United Kingdom); Falepin, Hendrik [Fugro Engineers SA/NV, Brussels (Belgium)

    2009-12-19

    High operational temperature and pressure in offshore pipelines may lead to unexpected lateral movements, sometimes called lateral buckling, which can have serious consequences for the integrity of the pipeline. The phenomenon of lateral buckling in offshore pipelines needs to be analysed in the design phase using FEM. The analysis should take into account many parameters, including operational temperature and pressure, fluid characteristics, seabed profile, soil parameters, coatings of the pipe, free spans, etc. The buckling initiation force is sensitive to small changes of any initial geometric out-of-straightness, thus the modelling of the as-laid state of the pipeline is an important part of the design process. Recently some dedicated finite element programs have been created, making modelling of the offshore environment more convenient than has been the case with general purpose finite element software. The present paper aims to compare thermal buckling analyses of a subsea pipeline performed using different finite element tools, i.e. general purpose programs (ANSYS, ABAQUS) and dedicated software (SAGE Profile 3D), for a single pipeline resting on the seabed. The analyses considered the pipeline resting on a flat seabed with small levels of out-of-straightness initiating the lateral buckling. The results show quite good agreement for buckling in the elastic range, and in the conclusions further comparative analyses with sensitivity cases are recommended. (author)

  17. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  18. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.

  19. Investigation, sensitivity analysis, and multi-objective optimization of effective parameters on temperature and force in robotic drilling cortical bone.

    Science.gov (United States)

    Tahmasbi, Vahid; Ghoreishi, Majid; Zolfaghari, Mojtaba

    2017-11-01

    The bone drilling process is very prominent in orthopedic surgeries and in the repair of bone fractures. It is also very common in dentistry and bone sampling operations. Due to the complexity of bone and the sensitivity of the process, bone drilling is one of the most important and sensitive processes in biomedical engineering. Orthopedic surgeries can be improved using robotic systems and mechatronic tools. The most crucial problem during drilling is an unwanted increase in process temperature (higher than 47 °C), which causes thermal osteonecrosis or cell death and local burning of the bone tissue. Moreover, imposing higher forces on the bone may lead to breaking or cracking and consequently cause serious damage. In this study, a mathematical second-order linear regression model as a function of tool drilling speed, feed rate, tool diameter, and their effective interactions is introduced to predict temperature and force during the bone drilling process. This model can determine the maximum speed of surgery that remains within an acceptable temperature range. Moreover, for the first time, using designed experiments, the bone drilling process was modeled, and the drilling speed, feed rate, and tool diameter were optimized. Then, using response surface methodology and applying a multi-objective optimization, drilling force was minimized to sustain an acceptable temperature range without damaging the bone or the surrounding tissue. In addition, for the first time, Sobol statistical sensitivity analysis is used to ascertain the effect of process input parameters on process temperature and force. The results show that among all effective input parameters, tool rotational speed, feed rate, and tool diameter have the highest influence on process temperature and force, respectively. The behavior of each output parameter with variation in each input parameter is further investigated. Finally, a multi-objective optimization has been performed considering all the
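
    To make the Sobol step concrete, the following is a hedged sketch of such a sensitivity study using the SALib package; the factor bounds and the quadratic response surface are stand-ins for the paper's fitted temperature model, not its actual coefficients.

```python
# Hedged sketch of a Sobol study with the SALib package; bounds and the
# response surface are stand-ins, not the paper's fitted model.
import numpy as np
from SALib.analyze import sobol
from SALib.sample import saltelli

problem = {
    "num_vars": 3,
    "names": ["rotational_speed", "feed_rate", "tool_diameter"],
    "bounds": [[500, 2000], [20, 80], [2.5, 4.5]],  # rpm, mm/min, mm (assumed)
}

X = saltelli.sample(problem, 1024)

def temperature(s, f, d):
    # Illustrative second-order response surface for drilling temperature.
    return 20.0 + 0.01 * s + 0.15 * f + 2.0 * d + 1e-5 * s * f

Y = temperature(X[:, 0], X[:, 1], X[:, 2])
Si = sobol.analyze(problem, Y)
print(dict(zip(problem["names"], Si["S1"])))  # first-order Sobol indices
```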

  20. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2007-01-01

    The recent developments in computational design tools have evolved into a sometimes purely digital process, which opens up new perspectives and problems in the sketching process. One of the interesting possibilities lies within the hybrid practitioner or architect-engineer approach, where an architect-engineer or hybrid practitioner works simultaneously with both aesthetic and technical design requirements. In this paper the problem of a vague or not existing link between digital design tools, used by architects and designers, and the analysis tools developed by and for engineers is considered. The aim of this research is to look into integrated digital design and analysis tools in order to find out if they are suited for use by architects and designers or only by specialists and technicians, and if not, to look at what can be done to make them more available to architects and designers...

  1. SBOAT: A Stochastic BPMN Analysis and Optimisation Tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    In this paper we present a description of a tool development framework, called SBOAT, for the quantitative analysis of graph based process modelling languages based upon the Business Process Modelling and Notation (BPMN) language, extended with intention preserving stochastic branching and parame...

  2. A combined stochastic analysis of mean daily temperature and diurnal temperature range

    Science.gov (United States)

    Sirangelo, B.; Caloiero, T.; Coscarelli, R.; Ferrari, E.

    2018-03-01

    In this paper, a stochastic model, previously proposed for the maximum daily temperature, has been improved for the combined analysis of mean daily temperature and diurnal temperature range. In particular, the procedure applied to each variable sequentially performs the deseasonalization, by means of truncated Fourier series expansions, and the normalization of the temperature data, with the use of proper transformation functions. Then, a joint stochastic analysis of both the climatic variables has been performed by means of a FARIMA model, taking into account the stochastic dependency between the variables, namely introducing a cross-correlation between the standardized noises. The model has been applied to five daily temperature series of southern Italy. After the application of a Monte Carlo simulation procedure, the return periods of the joint behavior of the mean daily temperature and the diurnal temperature range have been evaluated. Moreover, the annual maxima of the temperature excursions in consecutive days have been analyzed for the synthetic series. The results obtained showed different behaviors probably linked to the distance from the sea and to the latitude of the station.
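
    The deseasonalization step described above can be illustrated with a short sketch: fit a truncated Fourier series (an annual cycle plus a few harmonics) to the daily series by least squares and subtract it. The series length, harmonic count and synthetic data below are assumptions, not the paper's settings.

```python
# Remove an annual cycle by least-squares fitting a truncated Fourier
# series; series length, harmonic count and data are assumptions.
import numpy as np

def deseasonalize(temps, n_harmonics=2, period=365.25):
    t = np.arange(len(temps))
    cols = [np.ones(len(temps))]           # constant term
    for k in range(1, n_harmonics + 1):    # sine/cosine pairs
        cols.append(np.sin(2 * np.pi * k * t / period))
        cols.append(np.cos(2 * np.pi * k * t / period))
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, temps, rcond=None)
    seasonal = A @ coeffs
    return temps - seasonal, seasonal

rng = np.random.default_rng(0)
t = np.arange(3650)
series = 15 + 10 * np.sin(2 * np.pi * t / 365.25) + rng.normal(0, 2, t.size)
residual, _ = deseasonalize(series)
print(residual.std())   # close to the noise level once the cycle is removed
```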

  3. Development of Visualization Tools for ZPPR-15 Analysis

    International Nuclear Information System (INIS)

    Lee, Min Jae; Kim, Sang Ji

    2014-01-01

    ZPPR-15 cores consist of various drawer masters that have great heterogeneity. In order to build a proper homogenization strategy, the geometry of the drawer masters should be carefully analyzed with a visualization. Additionally, a visualization of drawer masters and the core configuration is necessary for minimizing human error during the input processing. For this purpose, visualization tools for a ZPPR-15 analysis have been developed based on a Perl script. In the following section, the implementation of the visualization tools will be described and various visualization samples for both drawer masters and ZPPR-15 cores will be demonstrated. Visualization tools for drawer masters and a core configuration were successfully developed for a ZPPR-15 analysis. The visualization tools are expected to be useful for understanding ZPPR-15 experiments and finding deterministic models of ZPPR-15. It turned out that generating VTK files is handy, and their application is powerful with the aid of the VISIT program.

  4. Physics analysis tools for beauty physics in ATLAS

    International Nuclear Information System (INIS)

    Anastopoulos, C; Bouhova-Thacker, E; Catmore, J; Mora, L de; Dallison, S; Derue, F; Epp, B; Jussel, P; Kaczmarska, A; Radziewski, H v; Stahl, T; Reznicek, P

    2008-01-01

    The Large Hadron Collider experiments will search for physics phenomena beyond the Standard Model. Highly sensitive tests of beauty hadrons will represent an alternative approach to this research. The analysis of complex decay chains of the beauty hadrons has to efficiently extract the detector tracks made by these reactions and reject other events in order to make sufficiently precise measurements. This places severe demands on the software used to analyze the B-physics data. The ATLAS B-physics group has written a series of tools and algorithms for performing these tasks, to be run within the ATLAS offline software framework Athena. This paper describes this analysis suite, paying particular attention to mechanisms for handling combinatorics, interfaces to secondary vertex fitting packages, B-flavor tagging tools and, finally, Monte Carlo truth information association used to process simulation data in the software validations, which are an important part of the development of the physics analysis tools.

  5. 5D Task Analysis Visualization Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  6. Risk analysis tools for force protection and infrastructure/asset protection

    International Nuclear Information System (INIS)

    Jaeger, C.D.; Duggan, R.A.; Paulus, W.K.

    1998-01-01

    The Security Systems and Technology Center at Sandia National Laboratories has for many years been involved in the development and use of vulnerability assessment and risk analysis tools. In particular, two of these tools, ASSESS and JTS, have been used extensively for Department of Energy facilities. Increasingly, Sandia has been called upon to evaluate critical assets and infrastructures, support DoD force protection activities and assist in the protection of facilities from terrorist attacks using weapons of mass destruction. Sandia is involved in many different activities related to security and force protection and is expanding its capabilities by developing new risk analysis tools to support a variety of users. One tool, in the very early stages of development, is EnSURE, Engineered Surety Using the Risk Equation. EnSURE addresses all of the risk equation and integrates the many components into a single, tool-supported process to help determine the most cost-effective ways to reduce risk. This paper will briefly discuss some of these risk analysis tools within the EnSURE framework.

  7. Applying AI tools to operational space environmental analysis

    Science.gov (United States)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predicts future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines

  8. Development of temperature statistical model when machining of aerospace alloy materials

    Directory of Open Access Journals (Sweden)

    Kadirgama Kumaran

    2014-01-01

    Full Text Available This paper presents the development of first-order models for predicting the cutting temperature in end-milling of Hastelloy C-22HS using four different coated carbide cutting tools under two different cutting environments. The first-order equations for cutting temperature are developed using the response surface methodology (RSM). The cutting variables are cutting speed, feed rate, and axial depth. The analyses are carried out with the aid of a statistical software package. It can be seen that the model is suitable for predicting the longitudinal component of the cutting temperature close to the readings recorded experimentally, with a 95% confidence level. The results obtained from the predictive models are also compared with results obtained from finite-element analysis (FEA). The developed first-order equations for the cutting temperature revealed that the feed rate is the most crucial factor, followed by axial depth and cutting speed. The PVD-coated cutting tools perform better than the CVD-coated cutting tools in terms of cutting temperature. The cutting tools coated with TiAlN perform better than the other cutting tools during the machining of Hastelloy C-22HS, followed by TiN/TiCN/TiN and the CVD coatings TiN/TiCN/Al2O3 and TiN/TiCN/TiN. From the finite-element analysis, the distribution of the cutting temperature can be discussed. High temperature appears in the lower sliding friction zone and at the cutting tip of the cutting tool. Maximum temperature develops at the rake face some distance away from the tool nose, before the chip lifts away.
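
    For illustration, a first-order model of the kind described, T = b0 + b1*v + b2*f + b3*d in cutting speed v, feed rate f and axial depth d, can be fitted by ordinary least squares; the design points and temperatures below are placeholders, not the paper's measured values.

```python
# Ordinary-least-squares fit of T = b0 + b1*v + b2*f + b3*d; design
# points and temperatures are placeholders, not the measured values.
import numpy as np

# Columns: cutting speed v (m/min), feed rate f (mm/tooth), axial depth d (mm)
X = np.array([[100, 0.10, 1.0],
              [140, 0.15, 1.5],
              [180, 0.20, 2.0],
              [100, 0.20, 1.5],
              [180, 0.10, 1.5],
              [140, 0.15, 1.0]])
T = np.array([320.0, 410.0, 505.0, 430.0, 390.0, 395.0])  # degC

A = np.column_stack([np.ones(len(X)), X])
b, *_ = np.linalg.lstsq(A, T, rcond=None)
print("T = %.1f + %.2f*v + %.1f*f + %.1f*d" % tuple(b))
```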

  9. Data-base tools for enhanced analysis of TMX-U data

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    The authors use a commercial data-base software package to create several data-base products that enhance the ability of experimental physicists to analyze data from the TMX-U experiment. This software resides on a Dec-20 computer in M-Division's user service center (USC), where data can be analyzed separately from the main acquisition computers. When these data-base tools are combined with interactive data analysis programs, physicists can perform automated (batch-style) processing or interactive data analysis on the computers in the USC or on the supercomputers of the NMFECC, in addition to the normal processing done on the acquisition system. One data-base tool provides highly reduced data for searching and correlation analysis of several diagnostic signals for a single shot or many shots. A second data-base tool provides retrieval and storage of unreduced data for detailed analysis of one or more diagnostic signals. The authors report how these data-base tools form the core of an evolving off-line data-analysis environment on the USC computers.

  10. Tools for Developing a Quality Management Program: Proactive Tools (Process Mapping, Value Stream Mapping, Fault Tree Analysis, and Failure Mode and Effects Analysis)

    International Nuclear Information System (INIS)

    Rath, Frank

    2008-01-01

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve through-put, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings.

  11. Temperature sensitivity analysis of polarity controlled electrostatically doped tunnel field-effect transistor

    Science.gov (United States)

    Nigam, Kaushal; Pandey, Sunil; Kondekar, P. N.; Sharma, Dheeraj

    2016-09-01

    The conventional tunnel field-effect transistors (TFETs) have shown potential to scale down into the sub-22 nm regime due to their lower sub-threshold slope and robustness against short-channel effects (SCEs); however, sensitivity towards temperature variation is a major concern. Therefore, for the first time, we investigate the temperature sensitivity of a polarity controlled electrostatically doped tunnel field-effect transistor (ED-TFET). Different performance metrics and analog/RF figures-of-merit were considered and compared for both devices, and simulations were performed using the Silvaco ATLAS device tool. We found that the variation in ON-state current in the ED-TFET is almost temperature independent due to the electrostatically doped mechanism, while it increases in the conventional TFET at higher temperature. Above room temperature, the variations in ION, IOFF, and SS sensitivity in the ED-TFET are only 0.11%/K, 2.21%/K, and 0.63%/K, while in the conventional TFET the variations are 0.43%/K, 2.99%/K, and 0.71%/K, respectively. Below room temperature, the variation in ED-TFET ION is 0.195%/K compared to 0.27%/K for the conventional TFET. Moreover, the analysis shows that the incomplete ionization effect in the conventional TFET severely affects the drive current and the threshold voltage, while the ED-TFET remains unaffected. Hence, the proposed ED-TFET is less sensitive towards temperature variation and can be used for cryogenic as well as high-temperature applications.

  12. Soft X-ray and cathodoluminescence measurement, optimisation and analysis at liquid nitrogen temperatures

    Science.gov (United States)

    MacRae, C. M.; Wilson, N. C.; Torpy, A.; Delle Piane, C.

    2018-01-01

    Advances in field emission gun electron microprobes have led to significant gains in the beam power density, and when analysis at high resolution is required, low voltages are often selected. The resulting beam power can lead to damage, which can be minimised by cooling the sample down to cryogenic temperatures, allowing sub-micrometre imaging using a variety of spectrometers. Recent advances in soft X-ray emission spectrometers (SXES) offer a spectral tool to measure both chemistry and bonding, and when combined with spectral cathodoluminescence the complementary techniques enable new knowledge to be gained from both minerals and materials. Magnesium and aluminium metals have been examined at both room and liquid nitrogen temperatures by SXES, and the L-emission Fermi edge has been observed to sharpen at the lower temperatures, directly confirming thermal broadening of the X-ray spectra. Gains in emission intensity and resolution have been observed in cathodoluminescence for liquid nitrogen cooled quartz grains compared to ambient temperature quartz. This has enabled subtle growth features at quartz to quartz-cement boundaries to be imaged for the first time.

  13. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool.

    Science.gov (United States)

    Clark, Neil R; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D; Jones, Matthew R; Ma'ayan, Avi

    2015-11-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method had not been assessed, nor had it been implemented as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community.
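
    The geometric core of such a method is the computation of principal angles between subspaces, which reduces to a QR factorization followed by an SVD. The sketch below illustrates this on random matrices standing in for real expression data; it is not the PAEA implementation itself.

```python
# Principal angles between two subspaces via QR + SVD; random matrices
# stand in for expression data. Not the PAEA implementation itself.
import numpy as np

def principal_angles(A, B):
    """Columns of A and B span the two subspaces."""
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    # Singular values of Qa^T Qb are the cosines of the principal angles.
    sigma = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    return np.arccos(np.clip(sigma, -1.0, 1.0))

rng = np.random.default_rng(1)
signature = rng.normal(size=(500, 3))   # 500 genes, 3 signature components
gene_set = rng.normal(size=(500, 10))   # gene-set matrix (indicator-like)
print(principal_angles(signature, gene_set))
```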

  14. A Review of Pathway-Based Analysis Tools That Visualize Genetic Variants

    Directory of Open Access Journals (Sweden)

    Elisa Cirillo

    2017-11-01

    Full Text Available Pathway analysis is a powerful method for data analysis in genomics, most often applied to gene expression analysis. It is also promising for single-nucleotide polymorphism (SNP data analysis, such as genome-wide association study data, because it allows the interpretation of variants with respect to the biological processes in which the affected genes and proteins are involved. Such analyses support an interactive evaluation of the possible effects of variations on function, regulation or interaction of gene products. Current pathway analysis software often does not support data visualization of variants in pathways as an alternate method to interpret genetic association results, and specific statistical methods for pathway analysis of SNP data are not combined with these visualization features. In this review, we first describe the visualization options of the tools that were identified by a literature review, in order to provide insight for improvements in this developing field. Tool evaluation was performed using a computational epistatic dataset of gene–gene interactions for obesity risk. Next, we report the necessity to include in these tools statistical methods designed for the pathway-based analysis with SNP data, expressly aiming to define features for more comprehensive pathway-based analysis tools. We conclude by recognizing that pathway analysis of genetic variations data requires a sophisticated combination of the most useful and informative visual aspects of the various tools evaluated.

  15. Response Analysis on Electrical Pulses under Severe Nuclear Accident Temperature Conditions Using an Abnormal Signal Simulation Analysis Module

    Directory of Open Access Journals (Sweden)

    Kil-Mo Koo

    2012-01-01

    Full Text Available Unlike design basis accidents, some inherent uncertainties in the reliability of instrumentation are expected when subjected to the harsh environments (e.g., high temperature and pressure, high humidity, and high radioactivity) occurring in severe nuclear accident conditions. Even under such conditions, an electrical signal should be within its expected range so that mitigating actions can be taken based on the signal in the control room. For example, an industrial process control standard requires that the normal signal level for pressure, flow, and resistance temperature detector sensors be in the range of 4~20 mA for most instruments. In the case that an abnormal signal is expected from an instrument, however, the signal should be refined through a signal validation process so that the refined signal is available in the control room. For abnormal signals expected under severe accident conditions, diagnostics and response analysis have to date been evaluated with an equivalent circuit model of the real instruments, which is regarded as the best method. The main objective of this paper is to introduce a program designed to implement diagnostic and response analysis for equivalent circuit modeling. The program links signal analysis tool code to abnormal signal simulation engine code, both as a single integrated system and as part of the functions of a PC-based ASSA (abnormal signal simulation analysis) module developed to obtain the varying range of the R-C circuit elements in high temperature conditions. As a result, a special function for abnormal pulse signal patterns can be obtained through the program, which in turn makes it possible to analyze the abnormal output pulse signals through the response characteristic of a 4~20 mA circuit model with element values that change with temperature under accident conditions.
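
    As an illustration of the equivalent-circuit response analysis described above, the sketch below drives a first-order R-C model of a 4~20 mA loop with a current step and compares the response for two assumed temperature-dependent resistance values; the component values and drift law are placeholders, not the module's calibrated parameters.

```python
# First-order R-C model of a 4~20 mA loop driven by a current step;
# component values and the temperature drift of R are placeholders.
import numpy as np

def rc_response(i_in, dt, R, C):
    """Voltage across the parallel R-C pair (forward-Euler integration)."""
    v = np.zeros_like(i_in)
    for k in range(1, len(i_in)):
        v[k] = v[k - 1] + dt * (i_in[k - 1] - v[k - 1] / R) / C
    return v

dt = 1e-4
t = np.arange(0.0, 1.0, dt)
i_in = np.where(t > 0.1, 20e-3, 4e-3)          # 4 mA -> 20 mA step

for temp_C, R in [(25, 250.0), (300, 180.0)]:  # assumed drift of R with T
    v = rc_response(i_in, dt, R=R, C=1e-3)
    print(f"T = {temp_C} degC: late-time output ~ {v[-1]:.2f} V")
```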

  16. Towards the standardization of time--temperature parameter usage in elevated temperature data analysis

    International Nuclear Information System (INIS)

    Goldhoff, R.M.

    1975-01-01

    Work devoted to the establishment of recommended practices for correlating and extrapolating relevant data on the creep-rupture properties of materials at high temperatures is described. An analysis of the time-temperature parameter is included, along with descriptions of analysis and evaluation methods. Results of applying the methods are compared.
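
    A common example of such a time-temperature parameter is the Larson-Miller parameter, LMP = T(C + log10 t_r), with T the absolute temperature, t_r the rupture time in hours, and C a material constant often taken near 20. The sketch below evaluates it on illustrative data, not values from the survey.

```python
# Larson-Miller parameter LMP = T * (C + log10(t_r)); the data rows are
# illustrative, not taken from the survey.
import numpy as np

def larson_miller(T_kelvin, t_rupture_hours, C=20.0):
    return T_kelvin * (C + np.log10(t_rupture_hours))

T = np.array([810.0, 840.0, 870.0])        # test temperatures, K
t_r = np.array([12000.0, 3500.0, 900.0])   # rupture lives, h
print(larson_miller(T, t_r))               # one LMP value per test
```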

  17. Quick Spacecraft Thermal Analysis Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — For spacecraft design and development teams concerned with cost and schedule, the Quick Spacecraft Thermal Analysis Tool (QuickSTAT) is an innovative software suite...

  18. Accelerator physics analysis with interactive tools

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.

    1993-05-01

    Work is in progress on interactive tools for linear and nonlinear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties.

  19. Porcupine: A visual pipeline tool for neuroimaging analysis.

    Directory of Open Access Journals (Sweden)

    Tim van Mourik

    2018-05-01

    Full Text Available The field of neuroimaging is rapidly adopting a more reproducible approach to data acquisition and analysis. Data structures and formats are being standardised and data analyses are getting more automated. However, as data analysis becomes more complicated, researchers often have to write longer analysis scripts, spanning different tools across multiple programming languages. This makes it more difficult to share or recreate code, reducing the reproducibility of the analysis. We present a tool, Porcupine, that constructs one's analysis visually and automatically produces analysis code. The graphical representation improves understanding of the performed analysis, while retaining the flexibility of modifying the produced code manually to custom needs. Not only does Porcupine produce the analysis code, it also creates a shareable environment for running the code in the form of a Docker image. Together, this forms a reproducible way of constructing, visualising and sharing one's analysis. Currently, Porcupine links to Nipype functionalities, which in turn accesses most standard neuroimaging analysis tools. Our goal is to release researchers from the constraints of specific implementation details, thereby freeing them to think about novel and creative ways to solve a given problem. Porcupine improves the overview researchers have of their processing pipelines, and facilitates both the development and communication of their work. This will reduce the threshold at which less expert users can generate reusable pipelines. With Porcupine, we bridge the gap between a conceptual and an implementational level of analysis and make it easier for researchers to create reproducible and shareable science. We provide a wide range of examples and documentation, as well as installer files for all platforms on our website: https://timvanmourik.github.io/Porcupine. Porcupine is free, open source, and released under the GNU General Public License v3.0.

  20. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  1. Application of parameters space analysis tools for empirical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Paloma del Barrio, E. [LEPT-ENSAM UMR 8508, Talence (France); Guyon, G. [Electricite de France, Moret-sur-Loing (France)

    2004-01-01

    A new methodology for empirical model validation has been proposed in the framework of Task 22 (Building Energy Analysis Tools) of the International Energy Agency. It involves two main steps: checking model validity and diagnosis. Both steps, as well as the underlying methods, have been presented in the first part of the paper. In this part, they are applied for testing modelling hypotheses in the framework of the thermal analysis of an actual building. Sensitivity analysis tools have first been used to identify the parts of the model that can really be tested on the available data. A preliminary diagnosis is then supplied by principal components analysis. Useful information for model behaviour improvement has finally been obtained by optimisation techniques. This example of application shows how model parameter space analysis is a powerful tool for empirical validation. In particular, diagnosis possibilities are largely increased in comparison with residuals analysis techniques. (author)

  2. New Tools for Sea Ice Data Analysis and Visualization: NSIDC's Arctic Sea Ice News and Analysis

    Science.gov (United States)

    Vizcarra, N.; Stroeve, J.; Beam, K.; Beitler, J.; Brandt, M.; Kovarik, J.; Savoie, M. H.; Skaug, M.; Stafford, T.

    2017-12-01

    Arctic sea ice has long been recognized as a sensitive climate indicator and has undergone a dramatic decline over the past thirty years. Antarctic sea ice continues to be an intriguing and active field of research. The National Snow and Ice Data Center's Arctic Sea Ice News & Analysis (ASINA) offers researchers and the public a transparent view of sea ice data and analysis. We have released a new set of tools for sea ice analysis and visualization. In addition to Charctic, our interactive sea ice extent graph, the new Sea Ice Data and Analysis Tools page provides access to Arctic and Antarctic sea ice data organized in seven different data workbooks, updated daily or monthly. An interactive tool lets scientists, or the public, quickly compare changes in ice extent and location. Another tool allows users to map trends, anomalies, and means for user-defined time periods. Animations of September Arctic and Antarctic monthly average sea ice extent and concentration may also be accessed from this page. Our tools help the NSIDC scientists monitor and understand sea ice conditions in near real time. They also allow the public to easily interact with and explore sea ice data. Technical innovations in our data center helped NSIDC quickly build these tools and more easily maintain them. The tools were made publicly accessible to meet the desire from the public and members of the media to access the numbers and calculations that power our visualizations and analysis. This poster explores these tools and how other researchers, the media, and the general public are using them.

  3. Draper Station Analysis Tool

    Science.gov (United States)

    Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip

    2011-01-01

    Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation based design and verification of complex dynamic systems.

  4. Inverse analysis of inner surface temperature history from outer surface temperature measurement of a pipe

    International Nuclear Information System (INIS)

    Kubo, S; Ioka, S; Onchi, S; Matsumoto, Y

    2010-01-01

    When slug flow runs through a pipe, nonuniform and time-varying thermal stresses develop, and there is a possibility that thermal fatigue occurs. It is therefore necessary to know the temperature and stress distributions in the pipe for its integrity assessment. It is, however, difficult to measure the inner surface temperature directly, so a method for estimating the temperature history on the inner surface of the pipe needs to be established. As a basic study on such an estimation method for a pipe with slug flow, this paper presents a method for estimating the temperature on the inner surface of a plate from the temperature on the outer surface. The relationship between the temperature histories on the outer and inner surfaces is obtained analytically. Using the results of the mathematical analysis, an inverse analysis method for estimating the inner surface temperature history from the outer surface temperature history is proposed. It is found that the inner surface temperature history can be estimated from the outer surface temperature history by applying the inverse analysis method, even when it contains multiple frequency components.
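
    The flavour of such a frequency-component inversion can be sketched in the frequency domain: under a semi-infinite-solid approximation, a component of angular frequency omega is attenuated by exp(-kL) and delayed by kL radians across a wall of thickness L, with k = sqrt(omega/(2*alpha)), so the inversion amplifies and phase-advances each component, with a gain cap to keep measurement noise bounded. This is an illustrative sketch under these stated assumptions, not the paper's analytical method.

```python
# Frequency-domain inversion under a semi-infinite-solid approximation:
# each component is attenuated by exp(-k*L) and delayed by k*L radians,
# k = sqrt(omega / (2*alpha)); a gain cap regularizes the inversion.
import numpy as np

def estimate_inner(T_outer, dt, L, alpha, max_gain=50.0):
    n = len(T_outer)
    spec = np.fft.rfft(T_outer - T_outer.mean())
    omega = 2 * np.pi * np.fft.rfftfreq(n, d=dt)
    k = np.sqrt(omega / (2.0 * alpha))
    H_inv = np.exp(k * L) * np.exp(1j * k * L)   # inverse transfer function
    H_inv = np.where(np.abs(H_inv) > max_gain, 0.0, H_inv)
    return T_outer.mean() + np.fft.irfft(spec * H_inv, n)

# Synthetic check: damp a known inner-surface wave forward, then invert.
dt, L, alpha = 1.0, 0.01, 4e-6               # s, m, m^2/s (steel-like)
t = np.arange(4096) * dt
w = 2 * np.pi / 512.0
k0 = np.sqrt(w / (2 * alpha))
inner = 300 + 5 * np.sin(w * t)
outer = 300 + 5 * np.exp(-k0 * L) * np.sin(w * t - k0 * L)
print(np.max(np.abs(estimate_inner(outer, dt, L, alpha) - inner)))  # ~0
```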

  5. Enhancement of Local Climate Analysis Tool

    Science.gov (United States)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, and correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of artificial intelligence in connecting human and computer perceptions of the application of data and scientific techniques, while multiprocessing simultaneous users' tasks. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR), NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in development of tools that facilitate scientific advancements and use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular, through the University Corporation for Atmospheric Research (UCAR).

  6. Tool Efficiency Analysis model research in SEMI industry

    Directory of Open Access Journals (Sweden)

    Lei Ma

    2018-01-01

    Full Text Available One of the key goals in the SEMI industry is to improve equipment throughput and ensure that equipment production efficiency is maximized. Based on SEMI standards for semiconductor equipment control, this paper defines the transition rules between different tool states and presents a TEA system model that analyses tool performance automatically based on a finite state machine. The system was applied to fab tools and its effectiveness verified successfully; it yielded the parameter values used to measure equipment performance, along with suggestions for improvement.
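
    A minimal sketch of the state-machine idea is shown below: timestamped transitions between tool states are validated against a transition table and accumulated into time-per-state, from which a utilization figure follows. The state names and transition rules are assumptions in the spirit of SEMI E10, not the paper's actual model.

```python
# Timestamped transitions are validated against a transition table and
# accumulated into time-per-state; states/rules are assumptions in the
# spirit of SEMI E10, not the paper's actual model.
from collections import defaultdict

ALLOWED = {
    "STANDBY": {"PRODUCTIVE", "SCHEDULED_DOWN"},
    "PRODUCTIVE": {"STANDBY", "UNSCHEDULED_DOWN"},
    "SCHEDULED_DOWN": {"STANDBY"},
    "UNSCHEDULED_DOWN": {"STANDBY"},
}

def state_times(events):
    """events: time-ordered (timestamp_seconds, state); the final state
    is open-ended and therefore not counted."""
    totals = defaultdict(float)
    for (t0, s0), (t1, s1) in zip(events, events[1:]):
        if s1 not in ALLOWED[s0]:
            raise ValueError(f"illegal transition {s0} -> {s1}")
        totals[s0] += t1 - t0
    return dict(totals)

log = [(0, "STANDBY"), (600, "PRODUCTIVE"),
       (7800, "UNSCHEDULED_DOWN"), (9000, "STANDBY")]
times = state_times(log)
print(times, "utilization:", times["PRODUCTIVE"] / sum(times.values()))
```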

  7. Database tools for enhanced analysis of TMX-U data. Revision 1

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed offline from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving offline data analysis environment on the USC computers.

  8. Improvements to Integrated Tradespace Analysis of Communications Architectures (ITACA) Network Loading Analysis Tool

    Science.gov (United States)

    Lee, Nathaniel; Welch, Bryan W.

    2018-01-01

    NASA's SCENIC project aims to simplify and reduce the cost of space mission planning by replicating the analysis capabilities of commercially licensed software, integrated with relevant analysis parameters specific to SCaN assets and SCaN-supported user missions. SCENIC differs from current tools that perform similar analyses in that it 1) does not require any licensing fees, and 2) will provide an all-in-one package for various analysis capabilities that normally require add-ons or multiple tools to complete. As part of SCENIC's capabilities, the ITACA network loading analysis tool will be responsible for assessing the loading on a given network architecture and generating a network service schedule. ITACA will allow users to evaluate the quality of service of a given network architecture and determine whether or not the architecture will satisfy the mission's requirements. ITACA is currently under development, and the following improvements were made during the fall of 2017: optimization of runtime, augmentation of network asset pre-service configuration time, augmentation of Brent's method of root finding, augmentation of network asset FOV restrictions, augmentation of mission lifetimes, and the integration of a SCaN link budget calculation tool. The improvements resulted in (a) a 25% reduction in runtime, (b) more accurate contact window predictions when compared to STK® contact window predictions, and (c) increased fidelity through the use of specific SCaN asset parameters.
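
    Of the items above, the root-finding step is easy to make concrete: a bracketing solver such as Brent's method can pin down the times at which an asset's elevation crosses the visibility mask, i.e. the contact window edges. The elevation profile below is a toy stand-in for a real geometry model, and the SciPy dependency is an assumption.

```python
# Bracketing root-finder (Brent's method) locating contact-window edges
# where elevation crosses the mask; the profile is a toy stand-in.
import numpy as np
from scipy.optimize import brentq

MASK_DEG = 10.0

def elevation(t_minutes):
    # Toy pass profile peaking mid-pass; a real tool would evaluate the
    # orbit/geometry model here.
    return 60.0 * np.sin(np.pi * t_minutes / 12.0) - 5.0

f = lambda t: elevation(t) - MASK_DEG
rise = brentq(f, 0.0, 6.0)     # bracket on the rising side
fall = brentq(f, 6.0, 12.0)    # bracket on the setting side
print(f"contact window: {rise:.2f} to {fall:.2f} minutes")
```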

  9. High Temperature Plasmas Theory and Mathematical Tools for Laser and Fusion Plasmas

    CERN Document Server

    Spatschek, Karl-Heinz

    2012-01-01

    Filling the gap for a treatment of the subject as an advanced course in theoretical physics with a huge potential for future applications, this monograph discusses aspects of these applications and provides theoretical methods and tools for their investigation. Throughout this coherent and up-to-date work the main emphasis is on classical plasmas at high-temperatures, drawing on the experienced author's specialist background. As such, it covers the key areas of magnetic fusion plasma, laser-plasma-interaction and astrophysical plasmas, while also including nonlinear waves and phenomena.

  10. Analysis of PL spectrum shape of Si-based materials as a tool for determination of Si crystallites' distribution

    Energy Technology Data Exchange (ETDEWEB)

    Khomenkova, L., E-mail: khomen@isp.kiev.ua

    2014-11-15

    This paper presents an analysis of the shape of the photoluminescence spectra of Si-based nanomaterials versus the energy of the excitation light and the measurement temperature, as a tool for estimating the Si nanocrystallite distribution. Samples fabricated by electrochemical etching (which allowed different terminations of the Si nanocrystallites to be obtained) were used as the model material. Bright emission at room temperature was observed for oxygen-terminated Si nanocrystallites, whereas hydrogen-terminated samples emit at low temperatures only. For most samples the photoluminescence spectrum was found to be complex, demonstrating competitive emission from Si crystallites and oxide defects. In the latter case, to separate the contribution of each recombination channel and to obtain information about the crystallite distribution, low-temperature measurements of the photoluminescence spectra under different excitation light energies were performed.

  11. Network Analysis Tools: from biological networks to clusters and pathways.

    Science.gov (United States)

    Brohée, Sylvain; Faust, Karoline; Lima-Mendez, Gipsi; Vanderstocken, Gilles; van Helden, Jacques

    2008-01-01

    Network Analysis Tools (NeAT) is a suite of computer tools that integrate various algorithms for the analysis of biological networks: comparison between graphs, between clusters, or between graphs and clusters; network randomization; analysis of degree distribution; network-based clustering and path finding. The tools are interconnected to enable a stepwise analysis of the network through a complete analytical workflow. In this protocol, we present a typical case of utilization, where the tasks above are combined to decipher a protein-protein interaction network retrieved from the STRING database. The results returned by NeAT are typically subnetworks, networks enriched with additional information (i.e., clusters or paths) or tables displaying statistics. Typical networks comprising several thousands of nodes and arcs can be analyzed within a few minutes. The complete protocol can be read and executed in approximately 1 h.

  12. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    Science.gov (United States)

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.
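
    A minimal sketch of what tool-writing against the Python interface looks like, assuming the pyloos bindings expose createSystem and selectAtoms as described in the LOOS documentation; the file name is a hypothetical placeholder.

```python
# Sketch of the LOOS Python interface (assumed API per the LOOS docs).
import loos

model = loos.createSystem("model.pdb")             # hypothetical input file
# Dynamic atom selection language based on C expression syntax:
calphas = loos.selectAtoms(model, 'name == "CA"')
print(len(calphas), "alpha-carbons selected")
```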

  13. Campaign effects and self-analysis Internet tool

    Energy Technology Data Exchange (ETDEWEB)

    Brange, Birgitte [Danish Electricity Saving Trust (Denmark); Fjordbak Larsen, Troels [IT Energy ApS (Denmark); Wilke, Goeran [Danish Electricity Saving Trust (Denmark)

    2007-07-01

    In October 2006, the Danish Electricity Saving Trust launched a large TV campaign targeting domestic electricity consumption. The campaign was based on the central message '1000 kWh/year per person is enough'. The campaign was accompanied by a new internet portal with updated information about numerous household appliances, and by analysis tools for bringing down electricity consumption to 1000 kWh/year per person. The effects of the campaign are monitored through repeated surveys and analysed in relation to usage of internet tools.

  14. A reliability analysis tool for SpaceWire network

    Science.gov (United States)

    Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou

    2017-04-01

    SpaceWire is a standard for on-board satellite networks and a basis for future data-handling architectures. It is becoming more and more popular in space applications due to its technical advantages, including reliability, low power and fault protection. High reliability is a vital issue for spacecraft; therefore, it is very important to analyze and improve the reliability performance of the SpaceWire network. This paper deals with the problem of reliability modeling and analysis of SpaceWire networks. According to the function division of a distributed network, a task-based reliability analysis method is proposed: the reliability analysis of every task leads to a system reliability matrix, and the reliability of the network system can be deduced by integrating the reliability indexes in this matrix. With this method, we developed a reliability analysis tool for SpaceWire networks based on VC, in which the computation schemes for the reliability matrix and the multi-path-task reliability are also implemented. Using this tool, we analyzed several cases on typical architectures. The analytic results indicate that a redundant architecture has better reliability performance than a basic one; in practice, the dual-redundancy scheme has been adopted for some key units to improve the reliability index of the system or task. Finally, this reliability analysis tool should have a direct influence on both task division and topology selection in the design phase of SpaceWire network systems.
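
    The paper's reliability matrix itself is not reproduced above; the arithmetic it integrates is the standard series/parallel combination sketched below, which also shows why the dual-redundancy scheme scores higher than the basic one. The per-link reliabilities are made up.

```python
# Series/parallel reliability arithmetic behind a task-based analysis:
# components along one path combine in series, redundant paths in parallel.
from math import prod

def series(reliabilities):
    return prod(reliabilities)

def parallel(path_reliabilities):
    return 1.0 - prod(1.0 - r for r in path_reliabilities)

# Hypothetical per-link reliabilities for one SpaceWire task path.
basic = series([0.99, 0.98, 0.97])      # single path: links in series
dual = parallel([basic, basic])         # dual-redundant identical paths
print(f"basic: {basic:.4f}, dual-redundant: {dual:.4f}")
```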

  15. A Useful Tool for Atmospheric Correction and Surface Temperature Estimation of Landsat Infrared Thermal Data

    Science.gov (United States)

    Rivalland, Vincent; Tardy, Benjamin; Huc, Mireille; Hagolle, Olivier; Marcq, Sébastien; Boulet, Gilles

    2016-04-01

    Land Surface Temperature (LST) is a critical variable for studying the energy and water budgets at the Earth surface, and is a key component of many aspects of climate research and services. The Landsat program, jointly carried out by NASA and USGS, has been providing thermal infrared data for 40 years, but no associated LST product has yet been routinely provided to the community. To derive LST values, radiances measured at sensor level need to be corrected for atmospheric absorption, atmospheric emission and the surface emissivity effect. Until now, existing LST products have been generated with multi-channel methods such as the Temperature/Emissivity Separation (TES) adapted to ASTER data or the generalized split-window algorithm adapted to MODIS multispectral data. Those approaches are ill-adapted to the single thermal channel (mono-window) of Landsat data. The atmospheric correction methodology usually used for Landsat data requires detailed information about the state of the atmosphere. This information may be obtained from radio-sounding or model atmospheric reanalysis and is supplied to a radiative transfer model in order to estimate atmospheric parameters for a given coordinate. In this work, we present a new automatic tool dedicated to Landsat thermal data correction which improves the common atmospheric correction methodology by introducing the spatial dimension in the process. The Python tool developed during this study, named LANDARTs for LANDsat Automatic Retrieval of surface Temperature, is fully automatic and provides atmospheric corrections for a whole Landsat tile. Vertical atmospheric conditions are downloaded from the ECMWF ERA-Interim dataset, which provides them at 0.125-degree resolution, at a global scale and with a 6-hour time step. The atmospheric correction parameters are estimated on the atmospheric grid using the commercial software MODTRAN, then interpolated to 30 m resolution. We detail the processing steps
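
    LANDARTs' internals are only summarized above; the sketch below shows the standard single-channel inversion such a tool performs once per-pixel atmospheric parameters are available. The transmittance, path radiances and emissivity are assumed MODTRAN-style values, and K1/K2 are taken as the published Landsat 8 TIRS band 10 Planck constants.

```python
# Sketch of single-channel LST retrieval from at-sensor thermal radiance.
# tau, L_up, L_down and eps are assumed values (normally MODTRAN outputs);
# K1/K2 are band-average Planck constants for Landsat 8 TIRS band 10.
import numpy as np

K1, K2 = 774.89, 1321.08            # W/(m^2 sr um), K
tau, L_up, L_down = 0.85, 1.2, 2.0  # transmittance, up/downwelling radiance
eps = 0.97                          # assumed surface emissivity

def surface_temperature(L_sensor):
    """Invert L_sensor = tau*(eps*B(Ts) + (1-eps)*L_down) + L_up for Ts."""
    B_Ts = (L_sensor - L_up - tau * (1.0 - eps) * L_down) / (tau * eps)
    return K2 / np.log(K1 / B_Ts + 1.0)

print(f"LST = {surface_temperature(9.5):.1f} K")  # roughly 303 K here
```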

  16. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Dhodapkar, S.D.; Bhattacharjee, A.K.; Sen, Gopa

    1991-01-01

    Software Verification and Validation (V and V) is an important step in assuring reliability and quality of the software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependency between modules etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs
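
    As an illustration of one analysis named above, the sketch below flags unreachable code as the basic blocks that cannot be reached from the entry point of a control-flow graph; the tiny CFG is hypothetical.

```python
# Unreachable-code detection as reachability over a control-flow graph.
def reachable(cfg, entry):
    seen, stack = set(), [entry]
    while stack:
        block = stack.pop()
        if block not in seen:
            seen.add(block)
            stack.extend(cfg.get(block, ()))
    return seen

cfg = {                 # basic block -> successor blocks (hypothetical)
    "start": ["check"],
    "check": ["loop", "exit"],
    "loop": ["check"],
    "dead": ["exit"],   # never referenced from any reachable block
    "exit": [],
}
print("unreachable blocks:", set(cfg) - reachable(cfg, "start"))
```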

  17. Parallel Enhancements of the General Mission Analysis Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  18. Statistical methods for the forensic analysis of striated tool marks

    Energy Technology Data Exchange (ETDEWEB)

    Hoeksema, Amy Beth [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
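
    The authors' exact statistic is not given in the abstract, so the following is only an indicative sketch of a likelihood ratio comparison: known-match (lab-to-lab) similarity scores against questioned (lab-to-field) scores, under a common-variance normal model with made-up values.

```python
# Hedged sketch of a two-sample likelihood-ratio test on similarity scores.
# H0: both samples share one mean; H1: separate means. Scores are made up
# and this is not the authors' exact procedure.
import numpy as np
from scipy.stats import norm, chi2

lab_lab = np.array([0.82, 0.79, 0.85, 0.81, 0.84])   # known-match scores
lab_field = np.array([0.52, 0.57, 0.49, 0.55])       # questioned scores

def loglik(x, mu, sigma):
    return norm.logpdf(x, mu, sigma).sum()

pooled = np.concatenate([lab_lab, lab_field])
s = pooled.std()                            # common scale for both models
ll0 = loglik(pooled, pooled.mean(), s)      # one shared mean
ll1 = loglik(lab_lab, lab_lab.mean(), s) + loglik(lab_field, lab_field.mean(), s)
stat = 2.0 * (ll1 - ll0)                    # ~ chi2(1) under H0
print(f"LR statistic = {stat:.2f}, p = {chi2.sf(stat, df=1):.4g}")
```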

  19. Simulation and statistical analysis for the optimization of nitrogen liquefaction plant with cryogenic Claude cycle using process modeling tool: ASPEN HYSYS

    International Nuclear Information System (INIS)

    Joshi, D.M.

    2017-01-01

    Cryogenic technology is used for liquefaction of many gases and it has several applications in food process engineering. Temperatures below 123 K are considered to be in the field of cryogenics. Extreme low temperatures are a basic need for many industrial processes and have several applications, such as superconductivity of magnets, space, medicine and gas industries. Several methods can be used to obtain the low temperatures required for liquefaction of gases. The process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure, which is below the critical pressure, is the basic liquefaction process. Different cryogenic cycle configurations are designed for getting the liquefied form of gases at different temperatures. Each of the cryogenic cycles, like the Linde cycle, Claude cycle, Kapitza cycle or modified Claude cycle, has its own advantages and disadvantages. The placement of heat exchangers, Joule-Thomson valve and turboexpander decides the configuration of a cryogenic cycle. Each configuration has its own efficiency according to the application. Here, a nitrogen liquefaction plant is used for the analysis. The process modeling tool ASPEN HYSYS can provide a software simulation approach before the actual implementation of the plant in the field. This paper presents the simulation and statistical analysis of the Claude cycle with the process modeling tool ASPEN HYSYS. It covers the technique used to optimize the liquefaction of the plant. The simulation results so obtained can be used as a reference for the design and optimization of the nitrogen liquefaction plant. Efficient liquefaction will give the best performance and productivity to the plant.
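
    The paper's simulations run in the commercial ASPEN HYSYS tool; as a minimal open-source illustration of the underlying liquefaction energy balance, the sketch below computes the ideal Linde-Hampson liquid yield for nitrogen with CoolProp. The Claude cycle improves on this figure by expanding part of the stream through a turboexpander; operating pressures here are assumed.

```python
# Ideal Linde-Hampson liquid yield for nitrogen via an energy balance
# around the cold box: y = (h1 - h2) / (h1 - hf). Pressures are assumed;
# this is a cross-check sketch, not the paper's HYSYS model.
from CoolProp.CoolProp import PropsSI

T_amb, p_low, p_high = 300.0, 101325.0, 200e5   # K, Pa, Pa

h1 = PropsSI("H", "T", T_amb, "P", p_low, "Nitrogen")   # low-pressure gas
h2 = PropsSI("H", "T", T_amb, "P", p_high, "Nitrogen")  # after compression
hf = PropsSI("H", "P", p_low, "Q", 0, "Nitrogen")       # saturated liquid

y = (h1 - h2) / (h1 - hf)
print(f"ideal liquid yield y = {y:.3f}")
```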

  20. Simulation and statistical analysis for the optimization of nitrogen liquefaction plant with cryogenic Claude cycle using process modeling tool: ASPEN HYSYS

    Science.gov (United States)

    Joshi, D. M.

    2017-09-01

    Cryogenic technology is used for liquefaction of many gases and it has several applications in food process engineering. Temperatures below 123 K are considered to be in the field of cryogenics. Extreme low temperatures are a basic need for many industrial processes and have several applications, such as superconductivity of magnets, space, medicine and gas industries. Several methods can be used to obtain the low temperatures required for liquefaction of gases. The process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure, which is below the critical pressure, is the basic liquefaction process. Different cryogenic cycle configurations are designed for getting the liquefied form of gases at different temperatures. Each of the cryogenic cycles, like the Linde cycle, Claude cycle, Kapitza cycle or modified Claude cycle, has its own advantages and disadvantages. The placement of heat exchangers, Joule-Thomson valve and turboexpander decides the configuration of a cryogenic cycle. Each configuration has its own efficiency according to the application. Here, a nitrogen liquefaction plant is used for the analysis. The process modeling tool ASPEN HYSYS can provide a software simulation approach before the actual implementation of the plant in the field. This paper presents the simulation and statistical analysis of the Claude cycle with the process modeling tool ASPEN HYSYS. It covers the technique used to optimize the liquefaction of the plant. The simulation results so obtained can be used as a reference for the design and optimization of the nitrogen liquefaction plant. Efficient liquefaction will give the best performance and productivity to the plant.

  1. Conformal polishing approach: Tool footprint analysis

    Directory of Open Access Journals (Sweden)

    José A Dieste

    2016-02-01

    Full Text Available The polishing process is one of the most critical manufacturing processes in metal part production, because it determines the final quality of the product. Free-form surface polishing is a handmade process with many rejected parts, scrap generation, and high time and energy consumption. Two research lines are being developed: prediction models of the final surface quality parameters, and an analysis of the amount of material removed as a function of the polishing parameters in order to predict the tool footprint during the polishing task. This research lays the foundations for a future automatic conformal polishing system. It is based on a rotational and translational tool with dry abrasive at its front, mounted at the end of a robot. A tool-to-part concept is used, which is useful for large or heavy workpieces. Results are applied to different curved parts typically used in the tooling industry, aeronautics or automotive. A mathematical model has been developed to predict the amount of material removed as a function of the polishing parameters. The model has been fitted for different abrasives and raw materials. Results have shown deviations under 20%, which implies a reliable and controllable process. Smaller amounts of material can be removed in controlled areas of a three-dimensional workpiece.
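
    The paper's fitted removal model is not reproduced in the abstract; as an indicative sketch, the classical Preston law (removal rate proportional to contact pressure times relative speed) can be fitted to polishing measurements by least squares. All numbers below are hypothetical.

```python
# Fitting the Preston law MRR = kp * P * V to hypothetical measurements.
import numpy as np

P = np.array([10.0, 10.0, 20.0, 20.0, 30.0])    # contact pressure, kPa
V = np.array([0.5, 1.0, 0.5, 1.0, 1.0])         # relative speed, m/s
mrr = np.array([0.26, 0.49, 0.52, 1.01, 1.48])  # measured removal, um/min

x = P * V
kp = float(np.dot(x, mrr) / np.dot(x, x))       # closed-form 1-D least squares
deviation = np.abs(kp * x - mrr) / mrr
print(f"kp = {kp:.4f}, max deviation = {deviation.max():.1%}")
```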

  2. Thermocouple and infrared sensor-based measurement of temperature distribution in metal cutting.

    Science.gov (United States)

    Kus, Abdil; Isik, Yahya; Cakir, M Cemal; Coşkun, Salih; Özdemir, Kadir

    2015-01-12

    In metal cutting, the magnitude of the temperature at the tool-chip interface is a function of the cutting parameters. This temperature directly affects production; therefore, increased research on the role of cutting temperatures can lead to improved machining operations. In this study, tool temperature was estimated by simultaneous temperature measurement employing both a K-type thermocouple and an infrared radiation (IR) pyrometer to measure the tool-chip interface temperature. Due to the complexity of the machining processes, the integration of different measuring techniques was necessary in order to obtain consistent temperature data. The thermal analysis results were compared via the ANSYS finite element method. Experiments were carried out in dry machining using workpiece material of AISI 4140 alloy steel that was heat treated by an induction process to a hardness of 50 HRC. A PVD TiAlN-TiN-coated WNVG 080404-IC907 carbide insert was used during the turning process. The results showed that with increasing cutting speed, feed rate and depth of cut, the tool temperature increased; the cutting speed was found to be the most effective parameter in assessing the temperature rise. The heat distribution of the cutting tool, tool-chip interface and workpiece provided effective and useful data for the optimization of selected cutting parameters during orthogonal machining.
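
    As a small illustration of the thermocouple side of such measurements, the sketch below converts a K-type EMF reading to temperature by interpolating a calibration table. The table entries are rounded ITS-90-style values for illustration; real work should interpolate the full NIST tables or use the inverse polynomials.

```python
# K-type thermocouple EMF-to-temperature conversion by table interpolation.
# Table values are rounded illustrations of ITS-90 data.
import numpy as np

emf_mv = np.array([0.0, 4.096, 8.138, 12.209, 16.397, 20.644])
temp_c = np.array([0.0, 100.0, 200.0, 300.0, 400.0, 500.0])

def k_type_temperature(reading_mv):
    return float(np.interp(reading_mv, emf_mv, temp_c))

print(f"{k_type_temperature(13.5):.1f} C")  # tool-side junction estimate
```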

  3. The use of case tools in OPG safety analysis code qualification

    International Nuclear Information System (INIS)

    Pascoe, J.; Cheung, A.; Westbye, C.

    2001-01-01

    Ontario Power Generation (OPG) is currently qualifying its critical safety analysis software. The software quality assurance (SQA) framework is described. Given the legacy nature of much of the safety analysis software, the reverse engineering methodology has been adopted. The safety analysis suite of codes was developed over a period of many years to differing standards of quality and had sparse or incomplete documentation. Key elements of the reverse engineering process require recovery of design information from existing coding. This recovery, if performed manually, could represent an enormous effort. Driven by a need to maximize productivity and enhance the repeatability and objectivity of software qualification activities, the decision was made to acquire or develop and implement Computer Aided Software Engineering (CASE) tools. This paper presents relevant background information on CASE tools and discusses how the OPG SQA requirements were used to assess the suitability of available CASE tools. Key findings from the application of CASE tools to the qualification of the OPG safety analysis software are discussed. (author)

  4. A survey of tools for the analysis of quantitative PCR (qPCR) data.

    Science.gov (United States)

    Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas

    2014-09-01

    Real-time quantitative polymerase-chain-reaction (qPCR) is a standard technique in most laboratories used for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows, 5 web-based, 9 R-based and 5 tools from other platforms. Reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview about quantification strategies, and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adapt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.
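
    As an example of the quantification strategies such tools implement, the sketch below computes the widely used delta-delta-Ct (Livak) relative expression with made-up Ct values.

```python
# Delta-delta-Ct relative quantification (assumes ~100% PCR efficiency).
def ddct_fold_change(ct_target_sample, ct_ref_sample,
                     ct_target_control, ct_ref_control):
    d_sample = ct_target_sample - ct_ref_sample    # normalize to reference gene
    d_control = ct_target_control - ct_ref_control
    return 2.0 ** -(d_sample - d_control)

# Hypothetical Ct values: target and reference gene, treated vs. control.
print(f"fold change = {ddct_fold_change(24.1, 18.0, 26.5, 18.2):.2f}")
```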

  5. EXPERIMENTAL INVESTIGATION OF THE TOOL-CHIP INTERFACE TEMPERATURES ON UNCOATED CEMENTED CARBIDE CUTTING TOOLS

    Directory of Open Access Journals (Sweden)

    Kasım HABALI

    2005-01-01

    Full Text Available It is known that the temperature resulting from the heat developed at the tool-chip interface during machining influences tool life and workpiece surface quality, and methods for measuring this temperature are constantly under investigation. In this study, the measurement of the tool-chip interface temperature using the tool-workpiece thermocouple method was investigated. The tests were carried out on AISI 1040 steel, and the tool-chip interface temperature variation was examined as a function of the cutting speed and feed rate. The obtained results show that cutting speed has more influence on the temperature than feed rate.

  6. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    Science.gov (United States)

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mis-match (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. This analytical tool box allows one to conduct assessments of all habitat elements within an area of interest

  7. Selecting Tools for Renewable Energy Analysis in Developing Countries: An Expanded Review

    Energy Technology Data Exchange (ETDEWEB)

    Irsyad, M. Indra al [School of Earth and Environmental Science, University of Queensland, Brisbane, QLD (Australia); Ministry of Energy and Mineral Resources, Jakarta (Indonesia); Halog, Anthony Basco, E-mail: a.halog@uq.edu.au [School of Earth and Environmental Science, University of Queensland, Brisbane, QLD (Australia); Nepal, Rabindra [Massey Business School, Massey University, Palmerston North (New Zealand); Koesrindartoto, Deddy P. [School of Business and Management, Institut Teknologi Bandung, Bandung (Indonesia)

    2017-12-20

    Renewable energy planners in developing countries should be cautious in using analytical tools formulated in developed countries. Traditional energy consumption, economic and demographic transitions, high income inequality, and informal economies are some characteristics of developing countries that may contradict the assumptions of mainstream, widely used analytical tools. In this study, we synthesize the debate in previous review studies on energy models for developing countries and then extend the scope of those studies by highlighting emerging methods of system thinking, life-cycle thinking, and decision support analysis. We then discuss how these tools have been used for renewable energy analysis in developing countries and find that not all studies are aware of the emerging critical issues in developing countries. We offer guidance on selecting the most appropriate analytical tool, mainly when dealing with energy modeling and analysis for developing countries. We also suggest potential future improvements to analytical tools for renewable energy modeling and analysis in developing countries.

  8. Selecting Tools for Renewable Energy Analysis in Developing Countries: An Expanded Review

    International Nuclear Information System (INIS)

    Irsyad, M. Indra al; Halog, Anthony Basco; Nepal, Rabindra; Koesrindartoto, Deddy P.

    2017-01-01

    Renewable energy planners in developing countries should be cautious in using analytical tools formulated in developed countries. Traditional energy consumption, economic and demographic transitions, high income inequality, and informal economies are some characteristics of developing countries that may contradict the assumptions of mainstream, widely used analytical tools. In this study, we synthesize the debate in previous review studies on energy models for developing countries and then extend the scope of those studies by highlighting emerging methods of system thinking, life-cycle thinking, and decision support analysis. We then discuss how these tools have been used for renewable energy analysis in developing countries and find that not all studies are aware of the emerging critical issues in developing countries. We offer guidance on selecting the most appropriate analytical tool, mainly when dealing with energy modeling and analysis for developing countries. We also suggest potential future improvements to analytical tools for renewable energy modeling and analysis in developing countries.

  9. Development of data analysis tool for combat system integration

    Directory of Open Access Journals (Sweden)

    Seung-Chun Shin

    2013-03-01

    Full Text Available System integration is an important element in the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated to confirm that it fulfills the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is requisite. This paper proposes the Data Extraction, Recording and Analysis Tool (DERAT) for the data analysis of the integrated performance of the combat system, presenting the functional definition, architecture and effectiveness of the DERAT through test results.

  10. APPLICATION OF THE SPECTRUM ANALYSIS WITH USING BERG METHOD TO DEVELOPED SPECIAL SOFTWARE TOOLS FOR OPTICAL VIBRATION DIAGNOSTICS SYSTEM

    Directory of Open Access Journals (Sweden)

    E. O. Zaitsev

    2016-01-01

    Full Text Available The objective of this paper is the development and experimental verification of special spectral analysis software for objects under controlled vibration. The spectral analysis of vibration is based on the maximum-entropy autoregressive method using the Berg (Burg) algorithm. Measured signals first undergo a preliminary analysis based on regression analysis, which eliminates uninformative components such as noise and trend; special software tools were developed for this preliminary step. Non-contact measurement of the mechanical vibration parameters of rotating, diffusely reflecting surfaces is used in circumstances where contact sensors are difficult or impossible to apply, for example because of lack of access to the object, the small size of the controlled area, a high temperature of the controlled portion, or strong electromagnetic fields. A laser measuring system is proposed for this purpose; it overcomes shortcomings of interference and Doppler optical measuring systems, such as difficulties in measuring large-amplitude and inharmonic vibration. On the basis of the proposed methods, special software tools for the laser measuring system were developed in LabVIEW. The proposed method of vibration signal processing was verified experimentally by analyzing the diagnostic information obtained from vibration measurements of a system grinding the solid tungsten-containing alloy TK8 with a diamond wheel. The result of the special software tools is a complex spectrum 'purified' of non-informative parameters and corresponding to the vibration of the observed object.
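
    A minimal sketch of the maximum-entropy autoregressive estimate named above, implemented from the standard Burg recursion and evaluated with scipy; the test signal is synthetic (two tones in noise standing in for a vibration record), not data from the laser system.

```python
# Burg (maximum-entropy) AR spectral estimation on a synthetic signal.
import numpy as np
from scipy.signal import freqz

def burg(x, order):
    """Return AR coefficients a (A(z) = 1 + a1 z^-1 + ...) and noise power."""
    x = np.asarray(x, dtype=float)
    a = np.zeros(0)
    ef, eb = x[1:].copy(), x[:-1].copy()
    for _ in range(order):
        k = -2.0 * np.dot(ef, eb) / (np.dot(ef, ef) + np.dot(eb, eb))
        a = np.concatenate([a, [0.0]])
        a = a + k * np.concatenate([a[-2::-1], [1.0]])
        ef, eb = (ef + k * eb)[1:], (eb + k * ef)[:-1]
    sigma2 = (np.dot(ef, ef) + np.dot(eb, eb)) / (len(ef) + len(eb))
    return a, sigma2

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(0)
sig = (np.sin(2 * np.pi * 80 * t) + 0.5 * np.sin(2 * np.pi * 220 * t)
       + 0.1 * rng.standard_normal(t.size))

a, sigma2 = burg(sig, order=12)
w, h = freqz([1.0], np.concatenate([[1.0], a]), worN=2048, fs=fs)
psd = sigma2 * np.abs(h) ** 2              # AR power spectral density
print(f"dominant peak near {w[np.argmax(psd)]:.0f} Hz")
```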

  11. Risk analysis for dengue suitability in Africa using the ArcGIS predictive analysis tools (PA tools).

    Science.gov (United States)

    Attaway, David F; Jacobsen, Kathryn H; Falconer, Allan; Manca, Germana; Waters, Nigel M

    2016-06-01

    Risk maps identifying suitable locations for infection transmission are important for public health planning. Data on dengue infection rates are not readily available in most places where the disease is known to occur. A newly available add-in to Esri's ArcGIS software package, the ArcGIS Predictive Analysis Toolset (PA Tools), was used to identify locations within Africa with environmental characteristics likely to be suitable for transmission of dengue virus. A more accurate, robust, and localized (1 km × 1 km) dengue risk map for Africa was created based on bioclimatic layers, elevation data, high-resolution population data, and other environmental factors that a search of the peer-reviewed literature showed to be associated with dengue risk. Variables related to temperature, precipitation, elevation, and population density were identified as good predictors of dengue suitability. Areas of high dengue suitability occur primarily within West Africa and parts of Central Africa and East Africa, but even in these regions the suitability is not homogenous. This risk mapping technique for an infection transmitted by Aedes mosquitoes draws on entomological, epidemiological, and geographic data. The method could be applied to other infectious diseases (such as Zika) in order to provide new insights for public health officials and others making decisions about where to increase disease surveillance activities and implement infection prevention and control efforts. The ability to map threats to human and animal health is important for tracking vectorborne and other emerging infectious diseases and modeling the likely impacts of climate change. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Chemical Tool Peer Review Summary.

    Energy Technology Data Exchange (ETDEWEB)

    Cashion, Avery Ted [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cieslewski, Grzegorz [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-10-01

    Chemical tracers are commonly used to characterize fracture networks and to determine the connectivity between the injection and production wells. Currently, most tracer experiments involve injecting the tracer at the injection well, manually collecting liquid samples at the wellhead of the production well, and sending the samples off for laboratory analysis. While this method provides accurate tracer concentration data, it does not provide information regarding the location of the fractures conducting the tracer between wellbores. The goal of this project is to develop chemical sensors and design a prototype tool to help understand the fracture properties of a geothermal reservoir by monitoring tracer concentrations along the depth of the well. The sensors will be able to detect certain species of the ionic tracers (mainly iodide) and pH in-situ during the tracer experiment. The proposed high-temperature (HT) tool will house the chemical sensors as well as a standard logging sensor package of pressure, temperature, and flow sensors in order to provide additional information on the state of the geothermal reservoir. The sensors and the tool will be able to survive extended deployments at temperatures up to 225 °C and high pressures to provide real-time temporal and spatial feedback of tracer concentration. Data collected from this tool will allow for the real-time identification of the fractures conducting chemical tracers between wellbores along with the pH of the reservoir fluid at various depths.

  13. Application of Statistical Tools for Data Analysis and Interpretation in Rice Plant Pathology

    Directory of Open Access Journals (Sweden)

    Parsuram Nayak

    2018-01-01

    Full Text Available There has been a significant advancement in the application of statistical tools in plant pathology during the past four decades. These tools include multivariate analysis of disease dynamics involving principal component analysis, cluster analysis, factor analysis, pattern analysis, discriminant analysis, multivariate analysis of variance, correspondence analysis, canonical correlation analysis, redundancy analysis, genetic diversity analysis, and stability analysis, which involves joint regression, additive main effects and multiplicative interactions, and genotype-by-environment interaction biplot analysis. The advanced statistical tools, such as non-parametric analysis of disease association, meta-analysis, Bayesian analysis, and decision theory, take an important place in the analysis of disease dynamics. Disease forecasting by simulation models for plant diseases has great potential in practical disease control strategies. Common mathematical tools such as monomolecular, exponential, logistic, Gompertz and linked differential equations take an important place in growth curve analysis of disease epidemics. The highly informative means of displaying a range of numerical data through the construction of box and whisker plots has been suggested. The probable applications of recent advanced tools of linear and non-linear mixed models, like the linear mixed model, generalized linear model, and generalized linear mixed models, have been presented. The most recent technologies such as micro-array analysis, though cost effective, provide estimates of gene expression for thousands of genes simultaneously and need attention from molecular biologists. Some of these advanced tools can be well applied in different branches of rice research, including crop improvement, crop production, crop protection, social sciences as well as agricultural engineering. Rice research scientists should take advantage of these new opportunities adequately in
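
    As a worked example of the growth-curve analysis mentioned above, the sketch below fits a logistic disease-progress curve to hypothetical severity observations with scipy.

```python
# Fitting a logistic disease-progress curve y(t) = 1/(1 + exp(-r*(t - t50))).
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, r, t50):
    return 1.0 / (1.0 + np.exp(-r * (t - t50)))

days = np.array([0, 7, 14, 21, 28, 35, 42], dtype=float)
severity = np.array([0.02, 0.05, 0.12, 0.30, 0.55, 0.78, 0.90])  # made up

(r, t50), _ = curve_fit(logistic, days, severity, p0=(0.1, 21.0))
print(f"apparent infection rate r = {r:.3f}/day, 50% severity at day {t50:.1f}")
```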

  14. Experimental and numerical investigations on the temperature distribution in PVD AlTiN coated and uncoated Al2O3/TiCN mixed ceramic cutting tools in hard turning of AISI 52100 steel

    Science.gov (United States)

    Sateesh Kumar, Ch; Patel, Saroj Kumar; Das, Anshuman

    2018-03-01

    Temperature generation in cutting tools is one of the major causes of tool failure, especially during hard machining, where machining forces are quite high, resulting in elevated temperatures. Thus, the present work investigates the temperature generation during hard machining of AISI 52100 steel (62 HRC hardness) with uncoated and PVD AlTiN coated Al2O3/TiCN mixed ceramic cutting tools. The experiments were performed on a heavy duty lathe machine with both coated and uncoated cutting tools under a dry cutting environment. The temperature of the cutting zone was measured using an infrared thermometer, and a finite element model was adopted to predict the temperature distribution in the cutting tools during machining for comparative assessment with the measured temperature. The experimental and numerical results revealed a significant reduction of cutting zone temperature during machining with PVD AlTiN coated cutting tools when compared to uncoated cutting tools in each experimental run. The main reason for the decrease in temperature with AlTiN coated tools is the lower coefficient of friction offered by the coating material, which allows the free flow of the chips on the rake surface when compared with uncoated cutting tools. Further, the superior wear behaviour of the AlTiN coating resulted in a reduction of cutting temperature.

  15. A survey of tools for the analysis of quantitative PCR (qPCR data

    Directory of Open Access Journals (Sweden)

    Stephan Pabinger

    2014-09-01

    Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adapt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.

  16. A Quality Assessment Tool for Non-Specialist Users of Regression Analysis

    Science.gov (United States)

    Argyrous, George

    2015-01-01

    This paper illustrates the use of a quality assessment tool for regression analysis. It is designed for non-specialist "consumers" of evidence, such as policy makers. The tool provides a series of questions such consumers of evidence can ask to interrogate regression analysis, and is illustrated with reference to a recent study published…

  17. INTERFACING INTERACTIVE DATA ANALYSIS TOOLS WITH THE GRID: THE PPDG CS-11 ACTIVITY

    International Nuclear Information System (INIS)

    Perl, Joseph

    2003-01-01

    For today's physicists, who work in large geographically distributed collaborations, the data grid promises significantly greater capabilities for analysis of experimental data and production of physics results than is possible with today's "remote access" technologies. The goal of letting scientists at their home institutions interact with and analyze data as if they were physically present at the major laboratory that houses their detector and computer center has yet to be accomplished. The Particle Physics Data Grid project (www.ppdg.net) has recently embarked on an effort to "Interface and Integrate Interactive Data Analysis Tools with the grid and identify Common Components and Services". The initial activities are to collect known and identify new requirements for grid services and analysis tools from a range of current and future experiments to determine if existing plans for tools and services meet these requirements. Follow-on activities will foster the interaction between grid service developers, analysis tool developers, experiment analysis framework developers and end user physicists, and will identify and carry out specific development/integration work so that interactive analysis tools utilizing grid services actually provide the capabilities that users need. This talk will summarize what we know of requirements for analysis tools and grid services, as well as describe the identified areas where more development work is needed

  18. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  19. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    A well designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring and analysis system. A software to achieve this has been developed. Two developed supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods & tools) and a model library (consisting of the process operational models), have been extended rigorously and integrated with the user interface, which made the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example.

  20. MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling.

    Science.gov (United States)

    Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y

    2017-08-14

    Many metagenome analysis tools are presently available to classify sequences and profile environmental samples. In particular, taxonomic profiling and binning methods are commonly used for such tasks. Tools available in these two categories make use of several techniques, e.g., read mapping, k-mer alignment, and composition analysis. Variations in the construction of the corresponding reference sequence databases are also common. In addition, different tools provide good results in different datasets and configurations. All this variation creates a complicated scenario for researchers deciding which methods to use. Installation, configuration and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute and integrate results from metagenome analysis tools. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms from different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, it provides more sensitive and reliable results, with the presence of each organism being supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available at the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open-source and can be downloaded at: https://gitlab.com/rki_bioinformatics.
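
    MetaMeta's actual merging is more elaborate than this, but the co-occurrence idea it relies on can be sketched as keeping taxa reported by at least k tools and averaging their abundances; the profiles below are made up.

```python
# Illustrative co-occurrence integration of per-tool taxonomic profiles.
def integrate(profiles, min_tools=2):
    merged = {}
    for taxon in {t for p in profiles for t in p}:
        hits = [p[taxon] for p in profiles if taxon in p]
        if len(hits) >= min_tools:           # supported by several methods
            merged[taxon] = sum(hits) / len(hits)
    return merged

tool_a = {"E. coli": 0.40, "B. subtilis": 0.10}
tool_b = {"E. coli": 0.35, "S. aureus": 0.05}
tool_c = {"E. coli": 0.42, "B. subtilis": 0.08}
print(integrate([tool_a, tool_b, tool_c]))
```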

  1. Integrated tools for control-system analysis

    Science.gov (United States)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user friendly software package (MATRIXx) are used to provide a high level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and at selected points, such as in the plotting section, by inputting data. There are five evaluations such as the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations. A time response for random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
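
    MATRIXx is a commercial package; purely as an open-source analogue of two of the evaluations listed above, the sketch below computes a Bode frequency response and closed-loop eigenvalues for a made-up plant under state feedback.

```python
# Bode response and closed-loop eigenvalues for a toy second-order plant.
import numpy as np
from scipy import signal

A = np.array([[0.0, 1.0], [-2.0, -0.5]])   # made-up plant matrices
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])
K = np.array([[3.0, 1.5]])                 # assumed state-feedback gain

w, mag, phase = signal.bode(signal.StateSpace(A, B, C, D))
print(f"low-frequency gain: {mag[0]:.1f} dB")

eigs = np.linalg.eigvals(A - B @ K)        # closed loop under u = -K x
print("closed-loop eigenvalues:", np.round(eigs, 3))
```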

  2. The use of current risk analysis tools evaluated towards preventing external domino accidents

    NARCIS (Netherlands)

    Reniers, Genserik L L; Dullaert, W.; Ale, B. J.M.; Soudan, K.

    Risk analysis is an essential tool for company safety policy. Risk analysis consists of identifying and evaluating all possible risks. The efficiency of risk analysis tools depends on the rigor of identifying and evaluating all possible risks. The diversity in risk analysis procedures is such that

  3. Thermocouple and Infrared Sensor-Based Measurement of Temperature Distribution in Metal Cutting

    Directory of Open Access Journals (Sweden)

    Abdil Kus

    2015-01-01

    Full Text Available In metal cutting, the magnitude of the temperature at the tool-chip interface is a function of the cutting parameters. This temperature directly affects production; therefore, increased research on the role of cutting temperatures can lead to improved machining operations. In this study, tool temperature was estimated by simultaneous temperature measurement employing both a K-type thermocouple and an infrared radiation (IR) pyrometer to measure the tool-chip interface temperature. Due to the complexity of the machining processes, the integration of different measuring techniques was necessary in order to obtain consistent temperature data. The thermal analysis results were compared via the ANSYS finite element method. Experiments were carried out in dry machining using workpiece material of AISI 4140 alloy steel that was heat treated by an induction process to a hardness of 50 HRC. A PVD TiAlN-TiN-coated WNVG 080404-IC907 carbide insert was used during the turning process. The results showed that with increasing cutting speed, feed rate and depth of cut, the tool temperature increased; the cutting speed was found to be the most effective parameter in assessing the temperature rise. The heat distribution of the cutting tool, tool-chip interface and workpiece provided effective and useful data for the optimization of selected cutting parameters during orthogonal machining.

  4. Thermocouple and Infrared Sensor-Based Measurement of Temperature Distribution in Metal Cutting

    Science.gov (United States)

    Kus, Abdil; Isik, Yahya; Cakir, M. Cemal; Coşkun, Salih; Özdemir, Kadir

    2015-01-01

    In metal cutting, the magnitude of the temperature at the tool-chip interface is a function of the cutting parameters. This temperature directly affects production; therefore, increased research on the role of cutting temperatures can lead to improved machining operations. In this study, tool temperature was estimated by simultaneous temperature measurement employing both a K-type thermocouple and an infrared radiation (IR) pyrometer to measure the tool-chip interface temperature. Due to the complexity of the machining processes, the integration of different measuring techniques was necessary in order to obtain consistent temperature data. The thermal analysis results were compared via the ANSYS finite element method. Experiments were carried out in dry machining using workpiece material of AISI 4140 alloy steel that was heat treated by an induction process to a hardness of 50 HRC. A PVD TiAlN-TiN-coated WNVG 080404-IC907 carbide insert was used during the turning process. The results showed that with increasing cutting speed, feed rate and depth of cut, the tool temperature increased; the cutting speed was found to be the most effective parameter in assessing the temperature rise. The heat distribution of the cutting tool, tool-chip interface and workpiece provided effective and useful data for the optimization of selected cutting parameters during orthogonal machining. PMID:25587976

  5. Aeroelastic Ground Wind Loads Analysis Tool for Launch Vehicles

    Science.gov (United States)

    Ivanco, Thomas G.

    2016-01-01

    Launch vehicles are exposed to ground winds during rollout and on the launch pad that can induce static and dynamic loads. Of particular concern are the dynamic loads caused by vortex shedding from nearly-cylindrical structures. When the frequency of vortex shedding nears that of a lightly damped structural mode, the dynamic loads can be more than an order of magnitude greater than mean drag loads. Accurately predicting vehicle response to vortex shedding during the design and analysis cycles is difficult and typically exceeds the practical capabilities of modern computational fluid dynamics codes. Therefore, mitigating the ground wind loads risk typically requires wind-tunnel tests of dynamically-scaled models that are time consuming and expensive to conduct. In recent years, NASA has developed a ground wind loads analysis tool for launch vehicles to fill this analytical capability gap in order to provide predictions for prelaunch static and dynamic loads. This paper includes a background of the ground wind loads problem and the current state-of-the-art. It then discusses the history and significance of the analysis tool and the methodology used to develop it. Finally, results of the analysis tool are compared to wind-tunnel and full-scale data of various geometries and Reynolds numbers.
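
    The NASA tool itself is not public; the first-order check behind the vortex-shedding concern can be sketched by comparing the Strouhal shedding frequency with a structural mode frequency. All numbers below are illustrative only.

```python
# Vortex-shedding frequency f = St * U / D versus a structural mode.
STROUHAL = 0.2    # typical for circular cylinders at high Reynolds number

def shedding_frequency(wind_speed, diameter):
    return STROUHAL * wind_speed / diameter

mode_hz = 0.9     # assumed first bending mode of the vehicle
for u in (5.0, 10.0, 18.0, 25.0):            # m/s ground winds
    f = shedding_frequency(u, diameter=4.0)  # assumed 4 m core diameter
    flag = "  <-- near resonance" if abs(f - mode_hz) < 0.15 else ""
    print(f"U = {u:4.1f} m/s: f_shed = {f:.2f} Hz{flag}")
```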

  6. Paramedir: A Tool for Programmable Performance Analysis

    Science.gov (United States)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

  7. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Havlůj, F.; Hejzlar, J.; Vočka, R.

    2013-01-01

    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify, that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces his task to defining fuel pin types, enrichments, assembly maps and operational parameters all through a very nice and user-friendly GUI. The second part in reload safety analysis calculations is done by CycleKit, a code which is linked with our core physics code ANDREA. Through CycleKit large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable html format. Using this set of tools for reload safety analysis simplifies

  8. Anaphe - OO Libraries and Tools for Data Analysis

    CERN Document Server

    Couet, O; Molnar, Z; Moscicki, J T; Pfeiffer, A; Sang, M

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, we have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create "shadow classes" for the Python language, which map the definitio...

  9. SIMULATION TOOL OF VELOCITY AND TEMPERATURE PROFILES IN THE ACCELERATED COOLING PROCESS OF HEAVY PLATES

    Directory of Open Access Journals (Sweden)

    Antônio Adel dos Santos

    2014-10-01

    Full Text Available The aim of this paper was to develop and apply mathematical models for determining the velocity and temperature profiles of heavy plates processed by accelerated cooling at Usiminas’ Plate Mill in Ipatinga. The development was based on the mathematical/numerical representation of physical phenomena occurring in the processing line. Production data from 3334 plates processed in the Plate Mill were used for validating the models. A user-friendly simulation tool was developed within the Visual Basic framework, taking into account all steel grades produced, the configuration parameters of the production line and these models. With the aid of this tool the thermal profile through the plate thickness for any steel grade and dimensions can be generated, which allows the tuning of online process control models. The simulation tool has been very useful for the development of new steel grades, since the process variables can be related to the thermal profile, which affects the mechanical properties of the steels.
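
    The plant models themselves are proprietary, but the core numerical idea of such thermal-profile simulators can be sketched as an explicit finite-difference solution of one-dimensional conduction through the plate thickness with convective accelerated-cooling boundaries. Material and cooling parameters below are illustrative, not Usiminas' values.

```python
# Explicit 1-D finite-difference cooling of a plate with convective faces.
import numpy as np

alpha, k = 5.5e-6, 30.0        # thermal diffusivity (m^2/s), conductivity
h, T_water = 2000.0, 25.0      # convection coefficient (W/m^2 K), coolant temp
L, n = 0.03, 31                # plate thickness (m), grid nodes
dx = L / (n - 1)
dt = 0.4 * dx * dx / alpha     # stable explicit time step
T = np.full(n, 850.0)          # initial temperature after rolling (deg C)

for _ in range(int(5.0 / dt)):           # simulate 5 s of accelerated cooling
    Tn = T.copy()
    Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    for i, j in ((0, 1), (-1, -2)):      # energy balance on the half-cells
        Tn[i] = (T[i] + 2 * alpha * dt / dx**2 * (T[j] - T[i])
                 - 2 * alpha * dt * h / (k * dx) * (T[i] - T_water))
    T = Tn

print(f"surface {T[0]:.0f} C, centre {T[n // 2]:.0f} C after 5 s")
```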

  10. Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs

    Science.gov (United States)

    Min, James B.

    2005-01-01

    The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.

  11. SpirPro: A Spirulina proteome database and web-based tools for the analysis of protein-protein interactions at the metabolic level in Spirulina (Arthrospira) platensis C1.

    Science.gov (United States)

    Senachak, Jittisak; Cheevadhanarak, Supapon; Hongsthong, Apiradee

    2015-07-29

    Spirulina (Arthrospira) platensis is the only cyanobacterium that, in addition to being studied at the molecular level and subjected to gene manipulation, can also be mass cultivated in outdoor ponds for commercial use as a food supplement. Thus, encountering environmental changes, including temperature stresses, is common during the mass production of Spirulina. The use of cyanobacteria as an experimental platform, especially for photosynthetic gene manipulation in plants and bacteria, is becoming increasingly important. Understanding the mechanisms and protein-protein interaction networks that underlie low- and high-temperature responses is relevant to Spirulina mass production. To accomplish this goal, high-throughput techniques such as OMICs analyses are used. Thus, large datasets must be collected, managed and subjected to information extraction. Therefore, databases including (i) proteomic analysis and protein-protein interaction (PPI) data and (ii) domain/motif visualization tools are required for potential use in temperature response models for plant chloroplasts and photosynthetic bacteria. A web-based repository was developed including an embedded database, SpirPro, and tools for network visualization. Proteome data were analyzed and integrated with protein-protein interactions and/or metabolic pathways from KEGG. The repository provides various information, ranging from raw data (2D-gel images) to associated results, such as data from interaction and/or pathway analyses. This integration allows in silico analyses of protein-protein interactions affected at the metabolic level and, particularly, analyses of interactions between and within the affected metabolic pathways under temperature stresses for comparative proteomic analysis. The developed tool, which is coded in HTML with CSS/JavaScript and depicted in Scalable Vector Graphics (SVG), is designed for interactive analysis and exploration of the constructed network. SpirPro is publicly available on the web

  12. Development of a climate data analysis tool (CDAT)

    Energy Technology Data Exchange (ETDEWEB)

    Marlais, S.M.

    1997-09-01

    The Climate Data Analysis Tool (CDAT) is designed to provide the Program for Climate Model Diagnosis and Intercomparison (PCMDI) at Lawrence Livermore National Laboratory, California, with the capabilities needed to analyze model data with little effort on the part of the scientist, while performing complex mathematical calculations, and graphically displaying the results. This computer software will meet the demanding need of climate scientists by providing the necessary tools to diagnose, validate, and intercompare large observational and global climate model datasets.

  13. Standardizing Exoplanet Analysis with the Exoplanet Characterization Tool Kit (ExoCTK)

    Science.gov (United States)

    Fowler, Julia; Stevenson, Kevin B.; Lewis, Nikole K.; Fraine, Jonathan D.; Pueyo, Laurent; Bruno, Giovanni; Filippazzo, Joe; Hill, Matthew; Batalha, Natasha; Wakeford, Hannah; Bushra, Rafia

    2018-06-01

    Exoplanet characterization depends critically on analysis tools, models, and spectral libraries that are constantly under development and have no single source nor sense of unified style or methods. The complexity of spectroscopic analysis and the initial time commitment required to become competitive is prohibitive to new researchers entering the field, as well as a remaining obstacle for established groups hoping to contribute in a manner comparable to their peers. As a solution, we are developing an open-source, modular data analysis package in Python and a publicly facing web interface including tools that address atmospheric characterization, transit observation planning with JWST, JWST coronagraphy simulations, limb darkening, forward modeling, and data reduction, as well as libraries of stellar, planet, and opacity models. The foundation of these software tools and libraries exists within pockets of the exoplanet community, but our project will gather these seedling tools and grow a robust, uniform, and well-maintained exoplanet characterization toolkit.

  14. Analysis of the functionality of free CASE-tools for database design

    Directory of Open Access Journals (Sweden)

    A. V. Gavrilov

    2016-01-01

    Full Text Available The introduction of CASE technologies for database design into the educational process requires significant costs for the purchase of software. A possible solution could be the use of free software counterparts. At the same time, this kind of substitution should be based on a comparable representation of the functional characteristics and operating features of these programs. The purpose of this article is a review of free and non-profit CASE-tools for database design, as well as their classification on the basis of an analysis of their functionality. In writing this article, materials from the official websites of the tool developers were used. Evaluation of the functional characteristics of CASE-tools for database design was made exclusively empirically, through direct work with the software products. Analysis of the tools' functionality allows two categories of CASE-tools for database design to be distinguished. The first category includes systems with a basic set of features and tools. The most important basic functions of these systems are: management of connections to database servers, visual tools to create and modify database objects (tables, views, triggers, procedures), the ability to enter and edit data in table mode, user and privilege management tools, an SQL-code editor, and means of data export/import. CASE-systems in the first category can be used to design and develop simple databases and manage data, as well as a means of administering a database server. A distinctive feature of the second category of CASE-tools for database design (full-featured systems) is the presence of a visual designer, allowing the construction of the database model and automatic creation of the database on the server based on this model. CASE-systems in this category can be used for the design and development of databases of any structural complexity, as well as a database server administration tool. The article concluded that the

  15. Using the Signal Tools and Statistical Tools to Redefine the 24 Solar Terms in Peasant Calendar by Analyzing Surface Temperature and Precipitation

    Science.gov (United States)

    Huang, J. Y.; Tung, C. P.

    2017-12-01

    There is an important book called "Peasant Calendar" in Chinese society. The Peasant Calendar is originally based on the orbit of the Sun, and each year is divided into 24 solar terms. Each term has its own special meaning and conception. For example, "Spring Begins" means the end of winter and the beginning of spring. In Taiwan, the 24 solar terms play an important role in agriculture because farmers always use the Peasant Calendar to decide when to sow. However, each solar term in Taiwan is currently fixed at about 15 days. This fixed scheme doesn't capture the temporal variability of climate and also can't truly reflect the regional climate characteristics of different areas. The number of days in each solar term should be more flexible. Since weather is associated with climate, all weather phenomena can be regarded as a multiple fluctuation signal. In this research, observation data of surface temperature and precipitation from 1976 to 2016 are used. The data are cut into different time series, such as a week, a month, six months to one year, and so on. Signal analysis tools such as wavelets, change point analysis and the Fourier transform are used to determine the length of each solar term. After determining the days of each solar term, statistical tests are used to find the relationships between the length of solar terms and climate fluctuations (e.g., ENSO and PDO). For example, one of the solar terms, called "Major Heat", should typically be more than 20 days in Taiwan due to global warming and the heat island effect. The refinement of the Peasant Calendar can help farmers make better decisions, controlling the crop schedule and using the farmland more efficiently. For instance, warmer conditions can accelerate the accumulation of accumulated temperature, which is the key to a crop's growth stage. The result can also be used for disaster reduction (e.g., preventing agricultural damage) and water resources projects.
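
    As a minimal sketch of the signal-analysis step described above (not the authors' exact wavelet pipeline), the following detects a single mean-shift change point and the dominant Fourier period in a synthetic daily temperature series standing in for the 1976-2016 observations:

        import numpy as np

        rng = np.random.default_rng(0)
        days = np.arange(365)
        temp = 23 + 6*np.sin(2*np.pi*(days - 100)/365) + rng.normal(0, 1, 365)

        # change point: the day that maximizes the shift between left/right means
        shift = [abs(temp[:i].mean() - temp[i:].mean()) for i in range(30, 335)]
        change_day = 30 + int(np.argmax(shift))

        # dominant period from the FFT of the de-meaned series (skip the DC bin)
        spec = np.abs(np.fft.rfft(temp - temp.mean()))
        period = 365 / (np.argmax(spec[1:]) + 1)

        print("strongest mean shift at day", change_day,
              "; dominant period ~ %.0f days" % period)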

  16. Hybrid analysis for indicating patients with breast cancer using temperature time series.

    Science.gov (United States)

    Silva, Lincoln F; Santos, Alair Augusto S M D; Bravo, Renato S; Silva, Aristófanes C; Muchaluat-Saade, Débora C; Conci, Aura

    2016-07-01

    Breast cancer is the most common cancer among women worldwide. Diagnosis and treatment in early stages increase the chances of cure. The temperature of cancerous tissue is generally higher than that of healthy surrounding tissues, making thermography an option to be considered in screening strategies for this cancer type. This paper proposes a hybrid methodology for analyzing dynamic infrared thermography in order to indicate patients with risk of breast cancer, using unsupervised and supervised machine learning techniques, which characterizes the methodology as hybrid. Dynamic infrared thermography monitors or quantitatively measures temperature changes on the examined surface after a thermal stress. In the execution of dynamic infrared thermography, a sequence of breast thermograms is generated. In the proposed methodology, this sequence is processed and analyzed by several techniques. First, the region of the breasts is segmented and the thermograms of the sequence are registered. Then, temperature time series are built and the k-means algorithm is applied to these series using various values of k. The clustering formed by the k-means algorithm for each k value is evaluated using clustering validation indices, generating values treated as features in the classification model construction step. A data mining tool was used to solve the combined algorithm selection and hyperparameter optimization (CASH) problem in classification tasks. Besides the classification algorithm recommended by the data mining tool, classifiers based on Bayesian networks, neural networks, decision rules and decision trees were executed on the data set used for evaluation. Test results support that the proposed analysis methodology is able to indicate patients with breast cancer. Among 39 tested classification algorithms, K-Star and Bayes Net presented 100% classification accuracy. Furthermore, among the Bayes Net, multi-layer perceptron, decision table and random forest classification algorithms, an
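
    The feature-construction step can be sketched as follows; the synthetic series below stand in for registered breast-thermogram pixel time series, and the two validation indices are illustrative choices, not necessarily the ones used in the paper.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.metrics import silhouette_score, davies_bouldin_score

        # Cluster per-pixel temperature time series for several k and keep the
        # clustering-validation indices as classifier features (sketch only).
        rng = np.random.default_rng(1)
        series = rng.normal(33.0, 0.5, size=(200, 20))   # 200 pixels x 20 thermograms

        features = []
        for k in range(2, 6):
            labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(series)
            features += [silhouette_score(series, labels),
                         davies_bouldin_score(series, labels)]

        print("feature vector for the classifier:", np.round(features, 3))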

  17. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.

    Science.gov (United States)

    Simonyan, Vahan; Mazumder, Raja

    2014-09-30

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  18. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Vahan Simonyan

    2014-09-01

    Full Text Available The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  19. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel.

    Science.gov (United States)

    Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari

    2009-10-23

    Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer easy access to the results, which may require conversions between data formats. First-hand SNP data are often entered in or saved in the MS-Excel format, but this software lacks genetic and epidemiological functions. A general tool for basic genetic and epidemiological analysis and data conversion in MS-Excel is needed. The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and requirements for basic statistical analysis). Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software.
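
    SNP_tools itself is written in VBA; as an equivalent illustration of one basic genetic analysis such an add-in performs, the sketch below runs a Hardy-Weinberg equilibrium chi-square test on genotype counts (the example counts are made up).

        from scipy.stats import chi2

        # Hardy-Weinberg equilibrium chi-square test from genotype counts
        # (Python illustration of a typical add-in function, not SNP_tools code).
        def hwe_chi2(n_AA, n_Aa, n_aa):
            n = n_AA + n_Aa + n_aa
            p = (2*n_AA + n_Aa) / (2*n)              # allele A frequency
            exp = [p*p*n, 2*p*(1-p)*n, (1-p)**2*n]   # expected genotype counts
            chi = sum((o - e)**2 / e for o, e in zip((n_AA, n_Aa, n_aa), exp))
            return chi, chi2.sf(chi, df=1)

        chi, pval = hwe_chi2(640, 320, 40)           # placeholder counts
        print("chi2 = %.2f, p = %.3f" % (chi, pval))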

  20. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were carefully reviewed and selected from a total of 162 submissions.

  1. Scoring Tools for the Analysis of Infant Respiratory Inductive Plethysmography Signals.

    Science.gov (United States)

    Robles-Rubio, Carlos Alejandro; Bertolizio, Gianluca; Brown, Karen A; Kearney, Robert E

    2015-01-01

    Infants recovering from anesthesia are at risk of life threatening Postoperative Apnea (POA). POA events are rare, and so the study of POA requires the analysis of long cardiorespiratory records. Manual scoring is the preferred method of analysis for these data, but it is limited by low intra- and inter-scorer repeatability. Furthermore, recommended scoring rules do not provide a comprehensive description of the respiratory patterns. This work describes a set of manual scoring tools that address these limitations. These tools include: (i) a set of definitions and scoring rules for 6 mutually exclusive, unique patterns that fully characterize infant respiratory inductive plethysmography (RIP) signals; (ii) RIPScore, a graphical, manual scoring software to apply these rules to infant data; (iii) a library of data segments representing each of the 6 patterns; (iv) a fully automated, interactive formal training protocol to standardize the analysis and establish intra- and inter-scorer repeatability; and (v) a quality control method to monitor scorer ongoing performance over time. To evaluate these tools, three scorers from varied backgrounds were recruited and trained to reach a performance level similar to that of an expert. These scorers used RIPScore to analyze data from infants at risk of POA in two separate, independent instances. Scorers performed with high accuracy and consistency, analyzed data efficiently, had very good intra- and inter-scorer repeatability, and exhibited only minor confusion between patterns. These results indicate that our tools represent an excellent method for the analysis of respiratory patterns in long data records. Although the tools were developed for the study of POA, their use extends to any study of respiratory patterns using RIP (e.g., sleep apnea, extubation readiness). Moreover, by establishing and monitoring scorer repeatability, our tools enable the analysis of large data sets by multiple scorers, which is essential for

  2. Scoring Tools for the Analysis of Infant Respiratory Inductive Plethysmography Signals.

    Directory of Open Access Journals (Sweden)

    Carlos Alejandro Robles-Rubio

    Full Text Available Infants recovering from anesthesia are at risk of life threatening Postoperative Apnea (POA). POA events are rare, and so the study of POA requires the analysis of long cardiorespiratory records. Manual scoring is the preferred method of analysis for these data, but it is limited by low intra- and inter-scorer repeatability. Furthermore, recommended scoring rules do not provide a comprehensive description of the respiratory patterns. This work describes a set of manual scoring tools that address these limitations. These tools include: (i) a set of definitions and scoring rules for 6 mutually exclusive, unique patterns that fully characterize infant respiratory inductive plethysmography (RIP) signals; (ii) RIPScore, a graphical, manual scoring software to apply these rules to infant data; (iii) a library of data segments representing each of the 6 patterns; (iv) a fully automated, interactive formal training protocol to standardize the analysis and establish intra- and inter-scorer repeatability; and (v) a quality control method to monitor scorer ongoing performance over time. To evaluate these tools, three scorers from varied backgrounds were recruited and trained to reach a performance level similar to that of an expert. These scorers used RIPScore to analyze data from infants at risk of POA in two separate, independent instances. Scorers performed with high accuracy and consistency, analyzed data efficiently, had very good intra- and inter-scorer repeatability, and exhibited only minor confusion between patterns. These results indicate that our tools represent an excellent method for the analysis of respiratory patterns in long data records. Although the tools were developed for the study of POA, their use extends to any study of respiratory patterns using RIP (e.g., sleep apnea, extubation readiness). Moreover, by establishing and monitoring scorer repeatability, our tools enable the analysis of large data sets by multiple scorers, which is essential

  3. 5D Task Analysis Visualization Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  4. 454-Pyrosequencing Analysis of Bacterial Communities from Autotrophic Nitrogen Removal Bioreactors Utilizing Universal Primers: Effect of Annealing Temperature.

    Science.gov (United States)

    Gonzalez-Martinez, Alejandro; Rodriguez-Sanchez, Alejandro; Rodelas, Belén; Abbas, Ben A; Martinez-Toledo, Maria Victoria; van Loosdrecht, Mark C M; Osorio, F; Gonzalez-Lopez, Jesus

    2015-01-01

    Identification of anaerobic ammonium oxidizing (anammox) bacteria by molecular tools aimed at the evaluation of bacterial diversity in autotrophic nitrogen removal systems is limited by the difficulty to design universal primers for the Bacteria domain able to amplify the anammox 16S rRNA genes. A metagenomic analysis (pyrosequencing) of total bacterial diversity including anammox population in five autotrophic nitrogen removal technologies, two bench-scale models (MBR and Low Temperature CANON) and three full-scale bioreactors (anammox, CANON, and DEMON), was successfully carried out by optimization of primer selection and PCR conditions (annealing temperature). The universal primer 530F was identified as the best candidate for total bacteria and anammox bacteria diversity coverage. Salt-adjusted optimum annealing temperature of primer 530F was calculated (47°C) and hence a range of annealing temperatures of 44-49°C was tested. Pyrosequencing data showed that annealing temperature of 45°C yielded the best results in terms of species richness and diversity for all bioreactors analyzed.
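
    The abstract reports a salt-adjusted optimum annealing temperature of 47°C for primer 530F; a minimal sketch of such a calculation, assuming the classic salt-adjusted Tm formula and a placeholder primer sequence (the paper does not state which exact correction was used), is:

        import math

        # Salt-adjusted Tm: 81.5 + 16.6*log10([Na+]) + 0.41*%GC - 675/N, with the
        # common rule of thumb of annealing ~5 C below Tm. The sequence below is a
        # placeholder, not necessarily the actual 530F primer.
        def annealing_temp(seq, na_molar=0.05, offset=5.0):
            n = len(seq)
            gc = 100.0 * sum(seq.count(b) for b in "GC") / n
            tm = 81.5 + 16.6 * math.log10(na_molar) + 0.41 * gc - 675.0 / n
            return tm - offset

        print("Ta ~ %.1f C" % annealing_temp("GTGCCAGCMGCCGCGG"))  # placeholder primer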

  5. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.

  6. Cintichem modified process - {sup 99}Mo precipitation step: application of statistical analysis tools over the reaction parameters

    Energy Technology Data Exchange (ETDEWEB)

    Teodoro, Rodrigo; Dias, Carla R.B.R.; Osso Junior, Joao A., E-mail: jaosso@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Fernandez Nunez, Eutimio Gustavo [Universidade de Sao Paulo (EP/USP), SP (Brazil). Escola Politecnica. Dept. de Engenharia Quimica

    2011-07-01

    Precipitation of ⁹⁹Mo by α-benzoin oxime (α-Bz) is a standard precipitation method for molybdenum due to the high selectivity of this agent. Nowadays, statistical analysis tools are employed in analytical systems to prove their efficiency and feasibility. IPEN has a project aiming at the production of ⁹⁹Mo via the fission of ²³⁵U route. The processing uses as its first step the precipitation of ⁹⁹Mo with α-Bz. This precipitation step involves many key reaction parameters. The aim of this work is the development of the already known acidic route to produce ⁹⁹Mo as well as the optimization of the reaction parameters applying statistical tools. In order to simulate ⁹⁹Mo precipitation, the study was conducted in acidic media using HNO₃, with α-Bz as precipitant agent and NaOH/1% H₂O₂ as dissolver solution. Then, a Mo carrier, KMnO₄ solutions and a ⁹⁹Mo tracer were added to the reaction flask. The reaction parameters (α-Bz/Mo ratio, Mo carrier, reaction time and temperature, and cooling reaction time before filtration) were evaluated under a fractional factorial design of resolution V. The best values of each reaction parameter were determined by a response surface statistical planning. The precipitation and recovery yields of ⁹⁹Mo were measured using an HPGe detector. Statistical analysis of the experimental data suggested that the reaction parameters α-Bz/Mo ratio, reaction time and temperature have a significant impact on ⁹⁹Mo precipitation. The optimization statistical planning showed that higher α-Bz/Mo ratios, room temperature, and lower reaction time lead to higher ⁹⁹Mo yields. (author)
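
    A minimal sketch of the experimental design named in the abstract, a two-level fractional factorial of resolution V for the five reaction parameters, can be generated from the defining relation I = ABCDE; the yields below are placeholders, not the paper's data.

        import itertools
        import numpy as np

        # 2^(5-1) resolution V design: the fifth factor is generated as E = ABCD.
        factors = ["aBz/Mo ratio", "Mo carrier", "time", "temperature", "cooling time"]
        runs = np.array([(a, b, c, d, a*b*c*d)
                         for a, b, c, d in itertools.product((-1, 1), repeat=4)])

        y = np.random.default_rng(2).normal(90, 3, len(runs))  # placeholder yields [%]

        # main effect of each factor: mean response at +1 minus mean at -1
        for name, col in zip(factors, runs.T):
            print("%-14s effect = %+.2f" % (name, y[col == 1].mean() - y[col == -1].mean()))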

  7. Featureous: A Tool for Feature-Centric Analysis of Java Software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    Feature-centric comprehension of source code is necessary for incorporating user-requested modifications during software evolution and maintenance. However, such comprehension is difficult to achieve in case of large object-oriented programs due to the size, complexity, and implicit character of mappings between features and source code. To support programmers in overcoming these difficulties, we present a feature-centric analysis tool, Featureous. Our tool extends the NetBeans IDE with mechanisms for efficient location of feature implementations in legacy source code, and an extensive analysis...

  8. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    Science.gov (United States)

    Battaglieri, M.; Briscoe, B. J.; Celentano, A.; Chung, S.-U.; D'Angelo, A.; De Vita, R.; Döring, M.; Dudek, J.; Eidelman, S.; Fegan, S.; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, D. I.; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, D. G.; Ketzer, B.; Klein, F. J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, V.; McKinnon, B.; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, A.; Salgado, C.; Santopinto, E.; Sarantsev, A. V.; Sato, T.; Schlüter, T.; da Silva, M. L. L.; Stankovic, I.; Strakovsky, I.; Szczepaniak, A.; Vassallo, A.; Walford, N. K.; Watts, D. P.; Zana, L.

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  9. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    International Nuclear Information System (INIS)

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; Garcia-Tecocoatzi, H.; Glazier, Derek; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, David G.; Ketzer, B.; Klein, Franz J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, Vincent; McKinnon, Brian; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, Alessandro; Salgado, Carlos; Santopinto, E.; Sarantsev, Andrey V.; Sato, Toru; Schlüter, T.; Da Silva, M. L.L.; Stankovic, I.; Strakovsky, Igor; Szczepaniak, Adam; Vassallo, A.; Walford, Natalie K.; Watts, Daniel P.

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document

  10. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
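
    The case-generation idea, combining n-factor combinatorial extremes with Monte Carlo sampling of the remaining parameters, can be sketched as follows; the parameter names and ranges are illustrative, not taken from the Trick simulation itself.

        import itertools
        import random

        # Vary every pair (n = 2) of parameters over their extremes while Monte
        # Carlo sampling the rest (sketch with illustrative parameters).
        random.seed(0)
        params = {"mass": (400.0, 600.0), "thrust": (0.8, 1.2),
                  "sensor_bias": (-0.1, 0.1), "delay": (0.0, 0.05)}

        cases = []
        for pair in itertools.combinations(params, 2):              # n-factor part
            for extremes in itertools.product((0, 1), repeat=2):
                case = {k: random.uniform(*params[k]) for k in params}  # MC part
                for name, idx in zip(pair, extremes):
                    case[name] = params[name][idx]                  # pin the pair
                cases.append(case)

        print(len(cases), "test cases generated")   # 6 pairs x 4 corners = 24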

  11. Making Culturally Responsive Mathematics Teaching Explicit: A Lesson Analysis Tool

    Science.gov (United States)

    Aguirre, Julia M.; Zavala, Maria del Rosario

    2013-01-01

    In the United States, there is a need for pedagogical tools that help teachers develop essential pedagogical content knowledge and practices to meet the mathematical education needs of a growing culturally and linguistically diverse student population. In this article, we introduce an innovative lesson analysis tool that focuses on integrating…

  12. Parametric analysis of the curved slats fixed mirror solar concentrator for medium temperature applications

    International Nuclear Information System (INIS)

    Pujol-Nadal, Ramon; Martínez-Moll, Víctor

    2014-01-01

    Highlights: • We thermally modeled the Curved Slats Fixed Mirror Solar Concentrator (CSFMSC). • A parametric analysis for three climates and two axial orientations is given. • The optimum values are determined for a range of the design parameters. • The CSFMSC has been well characterized for medium range temperature operation. - Abstract: The Curved Slats Fixed Mirror Solar Concentrator (CSFMSC) is a solar concentrator with a static reflector and a moving receiver. An optical analysis using ray-tracing tools was presented in a previous study as a function of three design parameters: the number of mirrors N, the ratio of focal length to reflector width F/W, and the aperture concentration Ca. However, less is known about the thermal behavior of this geometry. In this communication, the integrated thermal output of the CSFMSC has been determined in order to find the optimal values of the design parameters at a working temperature of 200 °C. The results were obtained for three different climates and two axial orientations (North–South and East–West). The results show that the CSFMSC can produce heat at 200 °C with an annual thermal efficiency of 41, 47, and 51%, depending on the location considered (Munich, Palma de Mallorca, and Cairo). The best FMSC geometries as a function of the design parameters are presented for medium temperature applications.

  13. New Tools in Orthology Analysis: A Brief Review of Promising Perspectives.

    Science.gov (United States)

    Nichio, Bruno T L; Marchaukoski, Jeroniza Nunes; Raittz, Roberto Tadeu

    2017-01-01

    Nowadays defining homology relationships among sequences is essential for biological research. Within homology, the analysis of orthologous sequences is of great importance for computational biology, annotation of genomes and phylogenetic inference. Since 2007, with the increase in the number of new sequences being deposited in large biological databases, researchers have begun to analyse computerized methodologies and tools aimed at selecting the most promising ones in the prediction of orthologous groups. Literature in this field of research describes the problems that the majority of available tools show, such as those encountered in accuracy, time required for analysis (especially in light of the increasing volume of data being submitted, which requires faster techniques) and the automatization of the process without requiring manual intervention. Conducting our search through BMC, Google Scholar, NCBI PubMed, and Expasy, we examined more than 600 articles pursuing the most recent techniques and tools developed to solve most of the problems still existing in orthology detection. We listed the main computational tools created and developed between 2011 and 2017, taking into consideration the differences in the type of orthology analysis, outlining the main features of each tool and pointing to the problems that each one tries to address. We also observed that several tools still use as their main algorithm the BLAST "all-against-all" methodology, which entails some limitations, such as a limited number of queries, computational cost, and high processing time to complete the analysis. However, new promising tools are being developed, like OrthoVenn (which uses the Venn diagram to show the relationship of ortholog groups generated by its algorithm); or proteinOrtho (which improves the accuracy of ortholog groups); or ReMark (tackling the integration of the pipeline to turn the entry process automatic); or OrthAgogue (using algorithms developed to minimize processing
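
    As a minimal illustration of the BLAST "all-against-all" style of orthology detection discussed above, the sketch below applies the classic reciprocal-best-hit rule to placeholder bit scores (not output from any of the named tools):

        # Reciprocal-best-hit (RBH) rule on placeholder BLAST-like bit scores.
        hits_ab = {"geneA1": [("geneB1", 250.0), ("geneB2", 90.0)]}   # genome A -> B
        hits_ba = {"geneB1": [("geneA1", 245.0), ("geneA2", 80.0)],
                   "geneB2": [("geneA2", 150.0)]}                     # genome B -> A

        def best(hits, query):
            return max(hits.get(query, []), key=lambda h: h[1], default=(None, 0))[0]

        orthologs = [(a, best(hits_ab, a)) for a in hits_ab
                     if best(hits_ba, best(hits_ab, a)) == a]
        print(orthologs)   # [('geneA1', 'geneB1')] -- a reciprocal best hit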

  14. New Tools in Orthology Analysis: A Brief Review of Promising Perspectives

    Directory of Open Access Journals (Sweden)

    Bruno T. L. Nichio

    2017-10-01

    Full Text Available Nowadays defining homology relationships among sequences is essential for biological research. Within homology, the analysis of orthologous sequences is of great importance for computational biology, annotation of genomes and phylogenetic inference. Since 2007, with the increase in the number of new sequences being deposited in large biological databases, researchers have begun to analyse computerized methodologies and tools aimed at selecting the most promising ones in the prediction of orthologous groups. Literature in this field of research describes the problems that the majority of available tools show, such as those encountered in accuracy, time required for analysis (especially in light of the increasing volume of data being submitted, which requires faster techniques) and the automatization of the process without requiring manual intervention. Conducting our search through BMC, Google Scholar, NCBI PubMed, and Expasy, we examined more than 600 articles pursuing the most recent techniques and tools developed to solve most of the problems still existing in orthology detection. We listed the main computational tools created and developed between 2011 and 2017, taking into consideration the differences in the type of orthology analysis, outlining the main features of each tool and pointing to the problems that each one tries to address. We also observed that several tools still use as their main algorithm the BLAST "all-against-all" methodology, which entails some limitations, such as a limited number of queries, computational cost, and high processing time to complete the analysis. However, new promising tools are being developed, like OrthoVenn (which uses the Venn diagram to show the relationship of ortholog groups generated by its algorithm); or proteinOrtho (which improves the accuracy of ortholog groups); or ReMark (tackling the integration of the pipeline to turn the entry process automatic); or OrthAgogue (using algorithms developed to

  15. An Experimental Investigation of Cutting Temperature and Tool Wear in 2 Dimensional Ultrasonic Vibrations Assisted Micro-Milling

    Directory of Open Access Journals (Sweden)

    Ibrahim Mohd Rasidi

    2017-01-01

    Full Text Available Two-dimensional ultrasonic vibration assisted milling (2D UVAM) is a well-known process in which high-frequency vibration is superimposed on the milling process. More industries are now taking this opportunity to improve their productivity without decreasing product accuracy. This paper presents a comparative investigation of UVAM and conventional machining (CM) with respect to tool wear and cutting temperature in the milling process. A micro-amplitude, sine-wave vibration is imparted to the workpiece jig by a piezo-actuator, creating a micro gap that allows heat to be removed effectively with the chips produced. A more complex tool trajectory mechanics of 2D UVAM was found during this research: the approach of the tool tip to the workpiece surface is affected by the amplitude displacement and the applied frequency. It was found that, when the optimum amplitude and an appropriate frequency are chosen, 2D UVAM reduces tool wear and improves surface roughness compared to CM.
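
    A minimal sketch of the superimposed tool-tip kinematics, with illustrative amplitude and frequency values rather than the paper's machining parameters, shows how the vibration can make the instantaneous feed velocity change sign (allowing tool-chip separation):

        import numpy as np

        # Feed motion plus a 2D ultrasonic ellipse (illustrative values only).
        f, ax, ay = 20_000.0, 2e-6, 2e-6     # frequency [Hz], amplitudes [m]
        feed = 1e-3                          # nominal feed speed [m/s]

        t = np.linspace(0, 5/f, 1000)        # five vibration cycles
        x = feed*t + ax*np.sin(2*np.pi*f*t)  # feed + vibration in x
        y = ay*np.sin(2*np.pi*f*t + np.pi/2) # phase-shifted vibration in y

        # instantaneous x velocity; sign changes mean tool-chip separation can occur
        vx = feed + 2*np.pi*f*ax*np.cos(2*np.pi*f*t)
        print("vibration ellipse span: %.1f x %.1f micrometres"
              % (np.ptp(x - feed*t)*1e6, np.ptp(y)*1e6))
        print("tool-chip separation possible:", (vx < 0).any())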

  16. RDNAnalyzer: A tool for DNA secondary structure prediction and sequence analysis.

    Science.gov (United States)

    Afzal, Muhammad; Shahid, Ahmad Ali; Shehzadi, Abida; Nadeem, Shahid; Husnain, Tayyab

    2012-01-01

    RDNAnalyzer is an innovative computer-based tool designed for DNA secondary structure prediction and sequence analysis. It can randomly generate a DNA sequence, or the user can upload sequences of their own interest in RAW format. It uses and extends the Nussinov dynamic programming algorithm and has various applications for sequence analysis. It predicts the DNA secondary structure and base pairings. It also provides tools for sequence analyses routinely performed by biological scientists, such as DNA replication, reverse complement generation, transcription, translation, sequence-specific information such as the total number of nucleotide bases and ATGC base contents along with their respective percentages, and a sequence cleaner. RDNAnalyzer is a unique tool developed in Microsoft Visual Studio 2008 using Microsoft Visual C# and Windows Presentation Foundation and provides a user-friendly environment for sequence analysis. It is freely available at http://www.cemb.edu.pk/sw.html. RDNAnalyzer - Random DNA Analyser; GUI - Graphical user interface; XAML - Extensible Application Markup Language.
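
    Since the abstract names the Nussinov dynamic programming algorithm, a minimal base-pair-maximization sketch for DNA (A-T and G-C pairs only) may help; RDNAnalyzer's own extensions to the algorithm are not reproduced here.

        # Nussinov recurrence: dp[i][j] = max pairs in seq[i..j], leaving i
        # unpaired or pairing it with some k in (i, j].
        def nussinov_pairs(seq):
            n = len(seq)
            pairable = ({"A", "T"}, {"G", "C"})
            dp = [[0] * n for _ in range(n)]
            for span in range(1, n):
                for i in range(n - span):
                    j = i + span
                    best = dp[i + 1][j]                     # base i left unpaired
                    for k in range(i + 1, j + 1):           # or pair i with some k
                        if {seq[i], seq[k]} in pairable:
                            left = dp[i + 1][k - 1] if k > i + 1 else 0
                            right = dp[k + 1][j] if k < j else 0
                            best = max(best, 1 + left + right)
                    dp[i][j] = best
            return dp[0][n - 1] if n else 0

        print(nussinov_pairs("GGGAAATCCC"))   # maximum number of nested base pairs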

  17. REVEAL - A tool for rule driven analysis of safety critical software

    International Nuclear Information System (INIS)

    Miedl, H.; Kersken, M.

    1998-01-01

    As the determination of ultrahigh reliability figures for safety critical software is hardly possible, national and international guidelines and standards mainly give requirements for the qualitative evaluation of software. An analysis of whether all these requirements are fulfilled is time and effort consuming and prone to errors if performed manually by analysts, and should instead be delegated to tools as far as possible. There are many "general-purpose" software analysis tools, both static and dynamic, which help in analyzing the source code. However, they are not designed to assess the adherence to specific requirements of guidelines and standards in the nuclear field. Against the background of the development of I and C systems in the nuclear field which are based on digital techniques and implemented in high level languages, it is essential that the assessor or licenser has a tool with which he can automatically and uniformly qualify as many aspects as possible of the high level language software. For this purpose the software analysis tool REVEAL has been developed at ISTec and the Halden Reactor Project. (author)

  18. XQCAT eXtra Quark Combined Analysis Tool

    CERN Document Server

    Barducci, D; Buchkremer, M; Marrouche, J; Moretti, S; Panizzi, L

    2015-01-01

    XQCAT (eXtra Quark Combined Analysis Tool) is a tool aimed at determining exclusion Confidence Levels (eCLs) for scenarios of new physics characterised by the presence of one or multiple heavy extra quarks (XQ) which interact through Yukawa couplings with any of the Standard Model (SM) quarks. The code uses a database of efficiencies for pre-simulated processes of Quantum Chromo-Dynamics (QCD) pair production and on-shell decays of extra quarks. In version 1.0 of XQCAT the efficiencies have been computed for a set of seven publicly available search results by the CMS experiment, and the package is subject to future updates to include further searches by both the ATLAS and CMS collaborations. The input for the code is a text file in which masses, branching ratios (BRs) and the dominant chirality of the couplings of the new quarks are provided. The output of the code is the eCL of the test point for each implemented experimental analysis considered individually and, when possible, in statistical combination.

  19. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool.

    Science.gov (United States)

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-08-15

    It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. The systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML-compatible software tools are limited in their ability to perform global sensitivity analyses of these models. This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, the partial rank correlation coefficient, Sobol's method, and the weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes.
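
    Of the listed global methods, the partial rank correlation coefficient is easy to sketch: rank-transform the inputs and output, remove the linear (rank) effect of the other parameters from both, and correlate the residuals. The demo data below are synthetic, not an SBML model.

        import numpy as np
        from scipy.stats import rankdata

        def prcc(X, y):
            """Partial rank correlation of each parameter (column of X) with y."""
            Xr = np.apply_along_axis(rankdata, 0, X)
            yr = rankdata(y)
            n, k = Xr.shape
            out = np.empty(k)
            for j in range(k):
                others = np.column_stack([np.ones(n), np.delete(Xr, j, axis=1)])
                beta_x = np.linalg.lstsq(others, Xr[:, j], rcond=None)[0]
                beta_y = np.linalg.lstsq(others, yr, rcond=None)[0]
                rx, ry = Xr[:, j] - others @ beta_x, yr - others @ beta_y
                out[j] = np.corrcoef(rx, ry)[0, 1]
            return out

        X = np.random.default_rng(5).uniform(size=(100, 3))   # sampled parameters
        y = 2*X[:, 0] - 0.5*X[:, 1] + np.random.default_rng(6).normal(0, 0.1, 100)
        print(np.round(prcc(X, y), 2))   # strong +, moderate -, near-zero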

  20. STRESS ANALYSIS IN CUTTING TOOLS COATED TiN AND EFFECT OF THE FRICTION COEFFICIENT IN TOOL-CHIP INTERFACE

    Directory of Open Access Journals (Sweden)

    Kubilay ASLANTAŞ

    2003-02-01

    Full Text Available Coated tools are regularly used in today's metal cutting industry, because it is well known that thin, hard coatings can reduce tool wear and improve tool life and productivity. Such coatings have significantly contributed to improved cutting economies and cutting tool performance through lower tool wear and reduced cutting forces. TiN coatings in particular have high strength and low friction coefficients. During the cutting process, a low friction coefficient reduces damage to the cutting tool. In addition, maximum stress values between coating and substrate also decrease as the friction coefficient decreases. In the present study, stress analysis is carried out for an HSS (High Speed Steel) cutting tool coated with TiN. The effect of the friction coefficient between tool and chip on the stresses developed at the cutting tool surface and at the interface of the coating and the HSS is investigated. An attempt was also made to determine the damage zones during the cutting process. The finite element method is used for the solution of the problem, and the FRANC2D finite element program is selected for the numerical solutions.

  1. The cumulative verification image analysis tool for offline evaluation of portal images

    International Nuclear Information System (INIS)

    Wong, John; Yan Di; Michalski, Jeff; Graham, Mary; Halverson, Karen; Harms, William; Purdy, James

    1995-01-01

    Purpose: Daily portal images acquired using electronic portal imaging devices contain important information about the setup variation of the individual patient. The data can be used to evaluate the treatment and to derive corrections for the individual patient. The large volume of images also requires software tools for efficient analysis. This article describes the approach of cumulative verification image analysis (CVIA), specifically designed as an offline tool to extract quantitative information from daily portal images. Methods and Materials: The user interface, image and graphics display, and algorithms of the CVIA tool have been implemented in ANSI C using the X Window graphics standards. The tool consists of three major components: (a) definition of treatment geometry and anatomical information; (b) registration of portal images with a reference image to determine setup variation; and (c) quantitative analysis of all setup variation measurements. The CVIA tool is not automated. User interaction is required and preferred. Successful alignment of anatomies on portal images at present remains mostly dependent on clinical judgment. Predefined templates of block shapes and anatomies are used for image registration to enhance efficiency, taking advantage of the fact that much of the tool's operation is repeated in the analysis of daily portal images. Results: The CVIA tool is portable and has been implemented on workstations with different operating systems. Analysis of 20 sequential daily portal images can be completed in less than 1 h. The temporal information is used to characterize setup variation in terms of its systematic, random and time-dependent components. The cumulative information is used to derive block overlap isofrequency distributions (BOIDs), which quantify the effective coverage of the prescribed treatment area throughout the course of treatment. Finally, a set of software utilities is available to facilitate feedback of the information for
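
    The decomposition into systematic and random components can be sketched as follows, with synthetic daily displacements standing in for measured portal-image setup errors:

        import numpy as np

        # Per-patient mean (systematic error) and per-patient SD (random error),
        # then population-level summaries; synthetic data, one row per patient.
        rng = np.random.default_rng(3)
        daily = rng.normal(loc=rng.normal(0, 2, (5, 1)), scale=1.5, size=(5, 20))

        systematic = daily.mean(axis=1)            # one value per patient [mm]
        random_sd = daily.std(axis=1, ddof=1)      # day-to-day spread per patient

        print("population systematic error Sigma = %.2f mm" % systematic.std(ddof=1))
        print("mean random error sigma          = %.2f mm" % random_sd.mean())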

  2. Analysis and Prediction of Micromilling Stability with Variable Tool Geometry

    Directory of Open Access Journals (Sweden)

    Ziyang Cao

    2014-11-01

    Full Text Available Micromilling can fabricate miniaturized components using micro-end mills at high rotational speeds. The analysis of machining stability in micromilling plays an important role in characterizing the cutting process, estimating the tool life, and optimizing the process. A numerical analysis and an experimental method are presented to investigate chatter stability in the micro-end milling process with variable milling tool geometry. A schematic model of the micromilling process is constructed and the calculation formulae to predict cutting force and displacements are derived. This is followed by a detailed numerical analysis of micromilling forces for helical ball and square end mills using time-domain and frequency-domain methods, and the results are compared. Furthermore, a detailed time-domain simulation for micro-end milling with straight-tooth and helical-tooth end mills is conducted based on the machine-tool system frequency response function obtained through a modal experiment. The forces and displacements are predicted, and the simulation results for the different cutter geometries are compared in depth. The simulation results have important significance for the actual milling process.

  3. TACIT: An open-source text analysis, crawling, and interpretation tool.

    Science.gov (United States)

    Dehghani, Morteza; Johnson, Kate M; Garten, Justin; Boghrati, Reihane; Hoover, Joe; Balasubramanian, Vijayan; Singh, Anurag; Shankar, Yuvarani; Pulickal, Linda; Rajkumar, Aswin; Parmar, Niki Jitendra

    2017-04-01

    As human activity and interaction increasingly take place online, the digital residues of these activities provide a valuable window into a range of psychological and social processes. A great deal of progress has been made toward utilizing these opportunities; however, the complexity of managing and analyzing the quantities of data currently available has limited both the types of analysis used and the number of researchers able to make use of these data. Although fields such as computer science have developed a range of techniques and methods for handling these difficulties, making use of those tools has often required specialized knowledge and programming experience. The Text Analysis, Crawling, and Interpretation Tool (TACIT) is designed to bridge this gap by providing an intuitive tool and interface for making use of state-of-the-art methods in text analysis and large-scale data management. Furthermore, TACIT is implemented as an open, extensible, plugin-driven architecture, which will allow other researchers to extend and expand these capabilities as new methods become available.

  4. Meta-analysis of changes in temperature and precipitation in Florida in the context of food-energy-water nexus

    Science.gov (United States)

    Anandhi, A.; Sharma, A.

    2017-12-01

    Florida is a hotspot of endemism for plants, vertebrates, and insects outside of the tropics. The state has an extensive coastline, with the maximum distance from the coast less than 150 km, and has diverse ecosystems and landscapes, as well as habitat for many endangered species. Additionally, agriculture is one of the most important economic resources in Florida, which is ranked second in the U.S. for value of vegetable production. Florida's biodiversity is threatened by stressors such as increasing urbanization and population, land-use change and socio-economic growth. Given that climate change and variability will interact with these stresses, potentially accentuating their negative impacts, there are several studies to date concerning climate change impacts on Florida's ecosystems. The specific objective of this study was to demonstrate a decision support tool developed from meta-analysis. The tool was developed using the temperature and precipitation changes in Florida identified from peer-reviewed studies. These change values were then synthesized using simple statistical techniques (e.g., histograms, line plots and density plots). Our results indicate wide variability in the temperature and precipitation changes reported in the studies for Florida. The studies showed temperature changes ranging between +5 °C and -3 °C, while precipitation changes ranged between +30% and -40% across the state. These changes have serious implications for the food-energy-water nexus. Some of the potential implications of these changes in the context of the nexus are discussed using causal chains developed from the meta-analysis.

  5. Introduction, comparison, and validation of Meta-Essentials : A free and simple tool for meta-analysis

    NARCIS (Netherlands)

    R. Suurmond (Robert); H.J. van Rhee (Henk); A. Hak (Tony)

    2017-01-01

    We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of
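
    A minimal sketch of the core computation such a tool automates, a fixed-effect (inverse-variance) meta-analysis, is shown below; the effect sizes and standard errors are made-up example data, not from the paper.

        import math

        effects = [0.30, 0.10, 0.45, 0.25]      # per-study effect sizes (placeholders)
        ses     = [0.12, 0.15, 0.20, 0.10]      # per-study standard errors

        weights = [1 / se**2 for se in ses]     # inverse-variance weights
        combined = sum(w*e for w, e in zip(weights, effects)) / sum(weights)
        se_comb = math.sqrt(1 / sum(weights))

        print("combined effect = %.3f (95%% CI %.3f to %.3f)"
              % (combined, combined - 1.96*se_comb, combined + 1.96*se_comb))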

  6. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were carefully reviewed and selected from a total of 162 submissions. The papers are organized in topical sections on theorem proving, probabilistic model checking, testing, tools, explicit state and Petri nets, scheduling, constraint solving, timed systems, case studies, software, temporal logic, abstraction...

  7. Analysis of low-temperature tar fractions

    Energy Technology Data Exchange (ETDEWEB)

    Kikkawa, S; Yamada, F

    1952-01-01

    A preliminary comparative study was made on the applicability of the methods commonly used for the type analysis of petroleum products to the low-temperature tar fractions. The usability of chromatography was also studied.

  8. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment

  9. Automated Steel Cleanliness Analysis Tool (ASCAT)

    International Nuclear Information System (INIS)

    Gary Casuccio; Michael Potter; Fred Schwerer; Richard J. Fruehan; Dr. Scott Story

    2005-01-01

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized ''intelligent'' software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet

  10. Validation of the dynamics of SDS and RRS flux, flow, pressure and temperature signals using noise analysis technique

    International Nuclear Information System (INIS)

    Glockler, O.; Cooke, D.F.; Tulett, M.V.

    1995-01-01

    In 1992, a program was initiated to establish reactor noise analysis as a practical tool for plant performance monitoring and system diagnostics in Ontario Hydro's CANDU reactors. Since then, various CANDU-specific noise analysis applications have been developed and validated. The noise-based statistical techniques are being successfully applied as powerful troubleshooting and diagnostic tools to a wide variety of actual operational I and C problems. Critical plant components, instrumentation and processes are monitored on a regular basis, and their dynamic characteristics are verified on-power. Recent applications of noise analysis include (1) validating the dynamics of in-core flux detectors (ICFDs) and ion chambers, (2) estimating the prompt fraction of ICFDs in noise measurements at full power and in power rundown tests, (3) identifying the cause of excessive signal fluctuations in certain flux detectors, (4) validating the dynamic coupling between liquid zone control signals, (5) detecting and monitoring mechanical vibrations of detector tubes, reactivity devices and fuel channels induced by moderator/coolant flow, (6) estimating the dynamics and response time of RTD temperature signals, (7) isolating the cause of RTD signal anomalies, (8) investigating the source of abnormal flow signal behaviour, (9) estimating the overall response time of flow and pressure signals, (10) detecting coolant boiling in fully instrumented fuel channels, (11) monitoring moderator circulation via temperature noise, and (12) predicting the performance of shut-off rods. Some of these applications are performed on an as-needed basis. The noise analysis program, in the Pickering-B station alone, has saved Ontario Hydro millions of dollars during its first three years. The results of the noise analysis program have also been reviewed by the regulator (Atomic Energy Control Board of Canada) with favorable results. The AECB has expressed interest in Ontario Hydro further exploiting the
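    As a rough illustration of application (9), a sensor's response time can be estimated from its noise spectrum if the sensor is assumed to behave as a first-order lag. The sketch below uses a Welch spectral estimate on a synthetic signal; the sampling rate and time constant are invented placeholders, not CANDU plant data.

```python
# Sketch: estimate a sensor's response time from its noise spectrum,
# assuming first-order (single time constant) dynamics. All numbers are
# synthetic placeholders.
import numpy as np
from scipy import signal

fs = 100.0                      # sampling rate [Hz] (assumed)
tau_true = 0.5                  # "true" sensor time constant [s]
fc_true = 1.0 / (2 * np.pi * tau_true)

# Synthetic measurement: white process noise seen through the sensor lag.
rng = np.random.default_rng(0)
b, a = signal.butter(1, fc_true, fs=fs)
x = signal.lfilter(b, a, rng.standard_normal(60_000))

# Welch auto power spectral density of the noise record.
f, pxx = signal.welch(x, fs=fs, nperseg=4096)

# Corner frequency: where the PSD falls to half its low-frequency plateau;
# for a first-order lag, tau = 1 / (2*pi*fc).
plateau = pxx[1:10].mean()
fc_est = f[np.argmin(np.abs(pxx - plateau / 2))]
print(f"estimated time constant: {1 / (2 * np.pi * fc_est):.2f} s "
      f"(true {tau_true} s)")
```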

  11. Correlation analysis on alpha attenuation and nasal skin temperature

    International Nuclear Information System (INIS)

    Nozawa, Akio; Tacano, Munecazu

    2009-01-01

    Some serious accidents caused by declines in arousal level, such as traffic accidents and mechanical control mistakes, have become issues of social concern. The physiological index obtained by human body measurement is expected to offer a leading tool for evaluating arousal level as an objective indicator. In this study, declines in temporal arousal levels were evaluated by nasal skin temperature. As arousal level declines, sympathetic nervous activity is decreased and blood flow in peripheral vessels is increased. Since peripheral vessels exist just under the skin on the fingers and nose, the psychophysiological state can be judged from the displacement of skin temperature caused by changing blood flow volume. Declining arousal level is expected to be observable as a temperature rise in peripheral parts of the body. The objective of this experiment was to obtain assessment criteria for judging declines in arousal level by nasal skin temperature using the alpha attenuation coefficient (AAC) of electroencephalography (EEG) as a reference benchmark. Furthermore, a psychophysical index of sleepiness was also measured using a visual analogue scale (VAS). Correlations between nasal skin temperature index and EEG index were analyzed. AAC and maximum displacement of nasal skin temperature displayed a clear negative correlation, with a correlation coefficient of −0.55
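    As a toy illustration of the correlation step reported here, the sketch below computes a Pearson coefficient between an alpha attenuation index and nasal temperature displacement; the six data points are invented, not the study's measurements.

```python
# Sketch: Pearson correlation between the alpha attenuation coefficient
# (AAC) and the maximum displacement of nasal skin temperature.
# Placeholder data only.
import numpy as np
from scipy import stats

aac = np.array([0.92, 0.85, 0.77, 0.70, 0.66, 0.58])       # EEG-derived index
dT_nose = np.array([0.10, 0.25, 0.20, 0.45, 0.55, 0.70])   # temp. rise [K]

r, p = stats.pearsonr(aac, dT_nose)
print(f"r = {r:.2f} (the study reports about -0.55), p = {p:.3f}")
```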

  12. EZ and GOSSIP, two new VO compliant tools for spectral analysis

    Science.gov (United States)

    Franzetti, P.; Garill, B.; Fumana, M.; Paioro, L.; Scodeggio, M.; Paltani, S.; Scaramella, R.

    2008-10-01

    We present EZ and GOSSIP, two new VO compliant tools dedicated to spectral analysis. EZ is a tool to perform automatic redshift measurement; GOSSIP is a tool created to perform the SED fitting procedure in a simple, user-friendly and efficient way. These two tools have been developed by the PANDORA Group at INAF-IASF (Milano); EZ has been developed in collaboration with Osservatorio Monte Porzio (Roma) and Integral Science Data Center (Geneve). EZ is released to the astronomical community; GOSSIP is currently in beta-testing.

  13. BBAT: Bunch and bucket analysis tool

    International Nuclear Information System (INIS)

    Deng, D.P.

    1995-01-01

    BBAT was written to meet the need for an interactive graphical tool to explore the longitudinal phase space. It is designed for quickly testing new ideas or tricks, and is especially suitable for machine physicists and operations staff, both in the control room during machine studies and off-line for data analysis. The heart of the package is a set of C routines that do the number crunching. The graphics part is wired with the scripting language tcl/tk and BLT. The C routines are general enough that one can write new applications, such as an animation of the bucket as a machine parameter is varied via a sliding scale. BBAT deals with a single rf system. For a double rf system, one can use Dr. BBAT, which stands for Double rf Bunch and Bucket Analysis Tool. One use of Dr. BBAT is to visualize the process of bunch coalescing and flat bunch creation.

  14. User-friendly Tool for Power Flow Analysis and Distributed ...

    African Journals Online (AJOL)

    Akorede


  15. Surface Operations Data Analysis and Adaptation Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — This effort undertook the creation of a Surface Operations Data Analysis and Adaptation (SODAA) tool to store data relevant to airport surface research and...

  16. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X; Martins, N; Bianco, A; Pinto, H J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies, or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.

  17. MEL-IRIS: An Online Tool for Audio Analysis and Music Indexing

    Directory of Open Access Journals (Sweden)

    Dimitrios Margounakis

    2009-01-01

    Full Text Available Chroma is an important attribute of music and sound, although it has not yet been adequately defined in the literature. As such, it can be used for further analysis of sound, resulting in interesting colorful representations that can be used in many tasks: indexing, classification, and retrieval. Especially in Music Information Retrieval (MIR), the visualization of the chromatic analysis can be used for comparison, pattern recognition, melodic sequence prediction, and color-based searching. MEL-IRIS is the tool which has been developed in order to analyze audio files and characterize music based on chroma. The tool implements specially designed algorithms and a unique way of visualizing the results. The tool is network-oriented and can be installed on audio servers, in order to manipulate large music collections. Several samples from world music have been tested and processed, in order to demonstrate the possible uses of such an analysis.

  18. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  19. Evaluation of static analysis tools used to assess software important to nuclear power plant safety

    Energy Technology Data Exchange (ETDEWEB)

    Ourghanlian, Alain [EDF Lab CHATOU, Simulation and Information Technologies for Power Generation Systems Department, EDF R and D, Cedex (France)

    2015-03-15

    We describe a comparative analysis of different tools used to assess safety-critical software used in nuclear power plants. To enhance the credibility of safety assessments and to optimize safety justification costs, Electricité de France (EDF) investigates the use of methods and tools for source code semantic analysis, to obtain indisputable evidence and help assessors focus on the most critical issues. EDF has been using the PolySpace tool for more than 10 years. Currently, new industrial tools based on the same formal approach, Abstract Interpretation, are available. Practical experimentation with these new tools shows that the precision obtained on one of our shutdown systems software packages is substantially improved. In the first part of this article, we present the analysis principles of the tools used in our experimentation. In the second part, we present the main characteristics of protection-system software, and why these characteristics are well adapted for the new analysis tools.

  20. BWR stability: analysis of cladding temperature for high amplitude oscillations - 146

    International Nuclear Information System (INIS)

    Pohl, P.; Wehle, F.

    2010-01-01

    Power oscillations associated with density waves in boiling water reactors (BWRs) have been studied widely; industrial research in this area has been active since the first BWRs. Stability measurements have been performed in various plants during the commissioning phase, but the magnitude and divergent nature of the oscillations during the LaSalle Unit 2 nuclear power plant event on March 9, 1988, in particular renewed concern about the state of knowledge on BWR instabilities and the possible consequences for fuel rod integrity. The objective of this paper is to present a simplified stability tool, applicable for stability analysis in the non-linear regime, which extends to high amplitude oscillations where inlet reverse flow occurs. In the case of high amplitude oscillations, a cyclical dryout and rewetting process may take place at the fuel rod, which leads in turn to rapid changes of the heat transfer from the fuel rod to the coolant. The application of this stability tool allows for a conservative determination of the fuel rod cladding temperature during the dryout/rewet phase of high amplitude oscillations. Moreover, it reveals, in good agreement with experimental findings, the stabilizing effect of reverse bundle inlet flow, which may be obtained for large oscillation amplitudes. (authors)

  1. Hemispherical Resonator Gyroscope Accuracy Analysis Under Temperature Influence

    Directory of Open Access Journals (Sweden)

    Boran LI

    2014-06-01

    Full Text Available The frequency splitting of a hemispherical resonator gyroscope changes as the system operating temperature changes, and this phenomenon reduces the navigation accuracy of the gyroscope. By studying the dynamical model of the hemispherical resonator gyroscope and its frequency characteristics, the frequency splitting formula and the precession angle formula of the gyroscope vibrating mode are derived from the parameters of the gyroscope's dynamic equation. By comparison, the precession angle deviation caused by frequency splitting can be obtained. Based on an analysis of temperature variation acting on the gyroscope resonator, the design of a feedback controller for the gyroscope under temperature variation is investigated, and the maximum theoretical fluctuation of the gyroscope dynamics is determined using a numerical example.

  2. Infrared thermography - a non-invasive tool to evaluate thermal status of neonatal pigs based on surface temperature

    DEFF Research Database (Denmark)

    Sund Kammersgaard, Trine; Malmkvist, Jens; Pedersen, Lene Juul

    2013-01-01

    and IRmax was improved (Ppiglets having RT... to be used without the need for manual restraint of the pigs. On the basis of the results of this study, we propose that IRmax temperature from full-body thermograms has implication as a valid tool to assess the thermal status in neonatal piglets but not as an identical substitute for RT....

  3. Capturing district nursing through a knowledge-based electronic caseload analysis tool (eCAT).

    Science.gov (United States)

    Kane, Kay

    2014-03-01

    The Electronic Caseload Analysis Tool (eCAT) is a knowledge-based software tool to assist the caseload analysis process. The tool provides a wide range of graphical reports, along with an integrated clinical advisor, to assist district nurses, team leaders, operational and strategic managers with caseload analysis by describing, comparing and benchmarking district nursing practice in the context of population need, staff resources, and service structure. District nurses and clinical lead nurses in Northern Ireland developed the tool, along with academic colleagues from the University of Ulster, working in partnership with a leading software company. The aim was to use the eCAT tool to identify the nursing need of local populations, along with the variances in district nursing practice, and match the workforce accordingly. This article reviews the literature, describes the eCAT solution and discusses the impact of eCAT on nursing practice, staff allocation, service delivery and workforce planning, using fictitious exemplars and a post-implementation evaluation from the trusts.

  4. An operational analysis of Lake Surface Water Temperature

    Directory of Open Access Journals (Sweden)

    Emma K. Fiedler

    2014-07-01

    Full Text Available Operational analyses of Lake Surface Water Temperature (LSWT have many potential uses including improvement of numerical weather prediction (NWP models on regional scales. In November 2011, LSWT was included in the Met Office Operational Sea Surface Temperature and Ice Analysis (OSTIA product, for 248 lakes globally. The OSTIA analysis procedure, which has been optimised for oceans, has also been used for the lakes in this first version of the product. Infra-red satellite observations of lakes and in situ measurements are assimilated. The satellite observations are based on retrievals optimised for Sea Surface Temperature (SST which, although they may introduce inaccuracies into the LSWT data, are currently the only near-real-time information available. The LSWT analysis has a global root mean square difference of 1.31 K and a mean difference of 0.65 K (including a cool skin effect of 0.2 K compared to independent data from the ESA ARC-Lake project for a 3-month period (June to August 2009. It is demonstrated that the OSTIA LSWT is an improvement over the use of climatology to capture the day-to-day variation in global lake surface temperatures.

  5. Generalized Aliasing as a Basis for Program Analysis Tools

    National Research Council Canada - National Science Library

    O'Callahan, Robert

    2000-01-01

    .... This dissertation describes the design of a system, Ajax, that addresses this problem by using semantics-based program analysis as the basis for a number of different tools to aid Java programmers...

  6. Anaphe - OO libraries and tools for data analysis

    International Nuclear Information System (INIS)

    Couet, O.; Ferrero-Merlino, B.; Molnar, Z.; Moscicki, J.T.; Pfeiffer, A.; Sang, M.

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, the authors have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create 'shadow classes' for the Python language, which map the definitions of the Abstract Interfaces almost at a one-to-one level. The authors will give an overview of the architecture and design choices and will present the current status and future developments of the project

  7. IPMP Global Fit - A one-step direct data analysis tool for predictive microbiology.

    Science.gov (United States)

    Huang, Lihan

    2017-12-04

    The objective of this work is to develop and validate a unified optimization algorithm for performing one-step global regression analysis of isothermal growth and survival curves for the determination of kinetic parameters in predictive microbiology. The algorithm is wrapped in user-friendly graphical user interfaces (GUIs) to form a data analysis tool, the USDA IPMP-Global Fit. The GUIs are designed to guide users through the data analysis process and the proper selection of initial parameters for different combinations of mathematical models. The software performs one-step kinetic analysis to directly construct tertiary models by minimizing the global error between the experimental observations and the mathematical models. The current version is specifically designed for constructing tertiary models with time and temperature as the independent variables. The software was tested with a total of 9 different combinations of primary and secondary models for growth and survival of various microorganisms. The results of the data analysis show that this software provides accurate estimates of kinetic parameters. In addition, it can be used to improve the experimental design and data collection for more accurate estimation of kinetic parameters. IPMP-Global Fit can be used in combination with the regular USDA-IPMP for solving inverse problems and developing tertiary models in predictive microbiology. Published by Elsevier B.V.
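    To illustrate the one-step idea, the sketch below fits a log-linear survival (primary) model and a Bigelow-type secondary model to all isothermal curves simultaneously. The model pair, reference temperature, and data are assumptions chosen for illustration, not the specific combinations shipped with IPMP Global Fit.

```python
# Sketch of a one-step global fit: every isothermal survival curve is
# fitted at once with a log-linear primary model and a Bigelow secondary
# model, log10 D(T) = log10 Dref - (T - Tref)/z. Synthetic data.
import numpy as np
from scipy.optimize import least_squares

T_REF = 60.0  # reference temperature [C] (assumed)

def residuals(params, t, T, y):
    logN0, logD_ref, z = params
    logD = logD_ref - (T - T_REF) / z        # secondary model
    return (logN0 - t / 10.0 ** logD) - y    # primary model minus data

t = np.array([0, 5, 10, 15, 0, 2, 4, 6, 0, 1, 2, 3], float)   # [min]
T = np.array([55] * 4 + [60] * 4 + [65] * 4, float)           # [C]
y = np.array([7.0, 6.1, 5.0, 4.1, 7.0, 5.9, 5.1, 3.9,
              7.1, 5.8, 4.4, 3.2])                            # log10 CFU/g

fit = least_squares(residuals, x0=[7.0, 0.5, 6.0], args=(t, T, y))
logN0, logD_ref, z = fit.x
print(f"log10 N0 = {logN0:.2f}, log10 D(60 C) = {logD_ref:.2f}, z = {z:.1f} C")
```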

  8. Microgrid Analysis Tools Summary

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Antonio [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Haase, Scott G [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mathur, Shivani [Formerly NREL

    2018-03-05

    The over-arching goal of the Alaska Microgrid Partnership is to reduce the use of total imported fuel into communities to secure all energy services by at least 50% in Alaska's remote microgrids without increasing system life cycle costs while also improving overall system reliability, security, and resilience. One goal of the Alaska Microgrid Partnership is to investigate whether a combination of energy efficiency and high-contribution (from renewable energy) power systems can reduce total imported energy usage by 50% while reducing life cycle costs and improving reliability and resiliency. This presentation provides an overview of the following four renewable energy optimization tools: the Distributed Energy Resources Customer Adoption Model (DER-CAM), the Microgrid Design Toolkit (MDT), the Renewable Energy Optimization (REopt) tool, and the Hybrid Optimization Model for Electric Renewables (HOMER). Information is from the respective tool websites, tool developers, and author experience.

  9. 454-Pyrosequencing Analysis of Bacterial Communities from Autotrophic Nitrogen Removal Bioreactors Utilizing Universal Primers: Effect of Annealing Temperature

    Directory of Open Access Journals (Sweden)

    Alejandro Gonzalez-Martinez

    2015-01-01

    Full Text Available Identification of anaerobic ammonium oxidizing (anammox) bacteria by molecular tools aimed at the evaluation of bacterial diversity in autotrophic nitrogen removal systems is limited by the difficulty of designing universal primers for the Bacteria domain able to amplify the anammox 16S rRNA genes. A metagenomic analysis (pyrosequencing) of total bacterial diversity, including the anammox population, in five autotrophic nitrogen removal technologies, two bench-scale models (MBR and Low Temperature CANON) and three full-scale bioreactors (anammox, CANON, and DEMON), was successfully carried out by optimization of primer selection and PCR conditions (annealing temperature). The universal primer 530F was identified as the best candidate for coverage of total bacterial and anammox diversity. The salt-adjusted optimum annealing temperature of primer 530F was calculated (47°C) and hence a range of annealing temperatures of 44–49°C was tested. Pyrosequencing data showed that an annealing temperature of 45°C yielded the best results in terms of species richness and diversity for all bioreactors analyzed.
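    The salt-adjusted calculation mentioned here can be done with a standard empirical formula; the sketch below shows one common form. The primer sequence is a placeholder (the record does not give it), and annealing a few degrees below Tm is only a rule of thumb.

```python
# Sketch: salt-adjusted primer melting temperature, one common empirical
# form: Tm = 81.5 + 16.6*log10([Na+]) + 0.41*(%GC) - 600/N.
# The sequence below is hypothetical; degenerate bases count as non-GC.
import math

def salt_adjusted_tm(seq: str, na_molar: float = 0.05) -> float:
    seq = seq.upper()
    gc_percent = 100.0 * sum(seq.count(b) for b in "GC") / len(seq)
    return (81.5 + 16.6 * math.log10(na_molar)
            + 0.41 * gc_percent - 600.0 / len(seq))

primer = "GTGCCAGCMGCNGCGG"      # placeholder 16-mer, not the actual 530F
tm = salt_adjusted_tm(primer)
print(f"Tm ~ {tm:.1f} C; annealing is often tried ~5 C lower ({tm - 5:.1f} C)")
```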

  10. A compilation of Web-based research tools for miRNA analysis.

    Science.gov (United States)

    Shukla, Vaibhav; Varghese, Vinay Koshy; Kabekkodu, Shama Prasada; Mallya, Sandeep; Satyamoorthy, Kapaettu

    2017-09-01

    Since the discovery of microRNAs (miRNAs), a class of noncoding RNAs that regulate gene expression posttranscriptionally in a sequence-specific manner, a number of tools have been released that are useful for both basic and advanced applications. This reflects the significance of miRNAs in many pathophysiological conditions, including cancer. The numerous bioinformatics tools developed for miRNA analysis cover detection, expression, function, target prediction and many other related features. This review provides a comprehensive assessment of web-based tools for miRNA analysis that do not require prior knowledge of any computing languages. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  11. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)

    2012-02-28

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool's interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  12. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Logical Framework Analysis (LFA): An Essential Tool for Designing Agricultural Project ... overview of the process and the structure of the Logical Framework Matrix or Logframe, derivable from it, ..... System Approach to Managing The Project.

  13. High temperature structure design for FBRs and analysis technology

    International Nuclear Information System (INIS)

    Iwata, Koji

    1986-01-01

    In the case of FBRs, the operating temperature exceeds 500 deg C; therefore, a design that takes the inelastic characteristics of structural materials, such as plasticity and creep, into account is required, and high-grade, detailed design evaluation is demanded. This new high temperature structure design technology has been advanced in the respective countries, taking experimental, prototype and demonstration reactors as the targets. The development of FBRs in Japan began with the experimental reactor 'Joyo', which has been operated since 1977, and now the prototype FBR 'Monju' of 280 MWe is under construction, expected to attain criticality in 1992. In order to realize FBRs which can compete with LWRs through the construction of a demonstration FBR, the construction of large scale plants and improvements in economy and reliability are necessary. The features and role of FBR structural design, the method of high temperature structure design and the trend of its standardization, the trend of structural analysis technology for FBRs, such as inelastic analysis, buckling analysis and coupled fluid-structure vibration analysis, the present status of structural analysis programs, and the subjects for the future of high temperature structure design are explained. (Kako, I.)

  14. Virtual tool mark generation for efficient striation analysis in forensic science

    Energy Technology Data Exchange (ETDEWEB)

    Ekstrand, Laura [Iowa State Univ., Ames, IA (United States)

    2012-01-01

    In 2009, a National Academy of Sciences report called for investigation into the scientific basis behind tool mark comparisons (National Academy of Sciences, 2009). Answering this call, Chumbley et al. (2010) attempted to prove or disprove the hypothesis that tool marks are unique to a single tool. They developed a statistical algorithm that could, in most cases, discern matching and non-matching tool marks made at different angles by sequentially numbered screwdriver tips. Moreover, in the cases where the algorithm misinterpreted a pair of marks, an experienced forensics examiner could discern the correct outcome. While this research served to confirm the basic assumptions behind tool mark analysis, it also suggested that statistical analysis software could help to reduce the examiner's workload. This led to a new tool mark analysis approach, introduced in this thesis, that relies on 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. These scans are carefully cleaned to remove noise from the data acquisition process and assigned a coordinate system that mathematically defines angles and twists in a natural way. The marking process is then simulated by using a 3D graphics software package to impart rotations to the tip and take the projection of the tip's geometry in the direction of tool travel. The edge of this projection, retrieved from the 3D graphics software, becomes a virtual tool mark. Using this method, virtual marks are made at increments of 5° and compared to a scan of the evidence mark. The previously developed statistical package from Chumbley et al. (2010) performs the comparison, comparing the similarity of the geometry of both marks to the similarity that would occur due to random chance. The resulting statistical measure of the likelihood of the match informs the examiner of the angle of the best matching virtual mark, allowing the examiner to focus his/her mark analysis on a smaller range of angles
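    The comparison stage can be pictured with a toy version: score an evidence profile against candidate virtual marks generated at 5° increments and keep the best. The mark generator below is a stand-in for the 3D projection step, and the zero-lag normalized correlation is far simpler than the statistical algorithm of Chumbley et al. (2010).

```python
# Sketch: pick the best-matching virtual mark by normalized correlation.
# make_virtual_mark() fakes the 3D-projection step with angle-seeded noise.
import numpy as np

def make_virtual_mark(angle_deg: int, n: int = 2000) -> np.ndarray:
    rng = np.random.default_rng(angle_deg)        # deterministic per angle
    return np.convolve(rng.standard_normal(n), np.ones(5) / 5, mode="same")

def zero_lag_ncc(a: np.ndarray, b: np.ndarray) -> float:
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(a @ b) / a.size

# Evidence mark: the (unknown) 35-degree mark plus measurement noise.
noise = np.random.default_rng(99).standard_normal(2000)
evidence = make_virtual_mark(35) + 0.3 * noise

scores = {a: zero_lag_ncc(evidence, make_virtual_mark(a))
          for a in range(0, 91, 5)}
best = max(scores, key=scores.get)
print(f"best match at {best} deg, score {scores[best]:.2f}")
```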

  15. Analysis and evaluation system for elevated temperature design of pressure vessels

    International Nuclear Information System (INIS)

    Hayakawa, Teiji; Sayawaki, Masaaki; Nishitani, Masahiro; Mii, Tatsuo; Murasawa, Kanji

    1977-01-01

    In pressure vessel technology, intensive efforts have recently been made to develop elevated temperature design methods. Much of the impetus for these efforts has been provided mainly by the results of the Liquid Metal Fast Breeder Reactor (LMFBR) and, more recently, the High Temperature Gas-cooled Reactor (HTGR) programs. The pressure vessels and associated components in these new types of nuclear power plant must operate for long periods at elevated temperature where creep effects are significant, and must therefore be designed by rigorous analysis for high reliability and safety. To carry out such elevated temperature design, a number of highly developed analysis and evaluation techniques, too complicated to be performed manually, are indispensable. Under these circumstances, the authors have made the following approaches in this study: (1) study of the basic concepts and associated techniques in elevated temperature design; (2) systematization (Analysis System) of the procedure for load and stress analyses; (3) development of a post-processor, ''POST-1592'', for strength evaluation based on ASME Code Case 1592-7. By linking POST-1592 with the Analysis System, an analysis and evaluation system is developed for the elevated temperature design of pressure vessels. Consequently, the design of elevated temperature vessels by detailed analysis and evaluation has become easy and effective with this software system. (auth.)

  16. Tool for efficient intermodulation analysis using conventional HB packages

    OpenAIRE

    Vannini, G.; Filicori, F.; Traverso, P.

    1999-01-01

    A simple and efficient approach is proposed for the intermodulation analysis of nonlinear microwave circuits. The algorithm, which is based on a very mild assumption about the frequency response of the linear part of the circuit, allows for a reduction in computing time and memory requirements. Moreover, it can be easily implemented using any conventional tool for harmonic-balance circuit analysis.

  17. Interactive tool that empowers structural understanding and enables FEM analysis in a parametric design environment

    DEFF Research Database (Denmark)

    Christensen, Jesper Thøger; Parigi, Dario; Kirkegaard, Poul Henning

    2014-01-01

    This paper introduces an interactive tool developed to integrate structural analysis into the architectural design environment from the early conceptual design stage. The tool improves the exchange of data between the design environment of Rhino Grasshopper and the FEM analysis of Autodesk Robot Structural Analysis. Further, the tool provides an intuitive setup and visual aids in order to facilitate the process, enabling students and professionals to quickly analyze and evaluate multiple design variations. The tool has been developed inside the Performance Aided Design course at the Master of Architecture and Design at Aalborg University.

  18. A portable borehole temperature logging system using the four-wire resistance method

    Science.gov (United States)

    Erkan, Kamil; Akkoyunlu, Bülent; Balkan, Elif; Tayanç, Mete

    2017-12-01

    High-quality temperature-depth information from boreholes with a depth of 100 m or more is used in geothermal studies and in studies of climate change. Electrical wireline tools with thermistor sensors are capable of measuring borehole temperatures with millikelvin resolution. The use of a surface readout mode allows analysis of the thermally conductive state of a borehole, which is especially important for climatic and regional heat flow studies. In this study we describe the design of a portable temperature logging tool that uses the four-wire resistance measurement method. The four-wire method enables the elimination of cable resistance effects, thus allowing millikelvin resolution of temperature data at depth. A preliminary two-wire model of the system is also described. The portability of the tool enables one to collect data from boreholes down to 300 m, even in locations with limited accessibility.
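    As a rough illustration of the measurement principle, the sketch below converts a four-wire sense voltage to resistance and then to temperature with the Steinhart-Hart equation. The excitation current and thermistor coefficients are typical textbook values for a 10 kOhm NTC thermistor, not this tool's actual calibration.

```python
# Sketch: the four-wire principle and thermistor conversion. Two leads
# carry a known excitation current and two separate leads sense the
# voltage, so R = V/I is free of cable resistance.
import math

def four_wire_resistance(v_sense: float, i_excite: float) -> float:
    return v_sense / i_excite          # cable resistance drops out

def steinhart_hart(r_ohm: float,
                   a=1.129241e-3, b=2.341077e-4, c=8.775468e-8) -> float:
    # 1/T = A + B*ln(R) + C*ln(R)^3, with T in kelvin
    ln_r = math.log(r_ohm)
    t_kelvin = 1.0 / (a + b * ln_r + c * ln_r ** 3)
    return t_kelvin - 273.15

r = four_wire_resistance(v_sense=1.053, i_excite=100e-6)   # 100 uA excitation
print(f"R = {r:.1f} Ohm -> T = {steinhart_hart(r):.3f} C")
```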

  19. Image edge detection based tool condition monitoring with morphological component analysis.

    Science.gov (United States)

    Yu, Xiaolong; Lin, Xin; Dai, Yiquan; Zhu, Kunpeng

    2017-07-01

    The measurement and monitoring of tool condition are key to product precision in automated manufacturing. To meet this need, this study proposes a novel tool wear monitoring approach based on edge detection in the monitored image. Image edge detection is a fundamental means of obtaining image features. The approach extracts the tool edge with morphological component analysis: through the decomposition of the original tool wear image, it reduces the influence of texture and noise on edge measurement. Based on a sparse representation of the target image and edge detection, the approach can accurately extract the tool wear edge with a continuous and complete contour, and is convenient for characterizing tool conditions. Compared to celebrated algorithms from the literature, this approach improves the integrity and connectivity of edges, and the results have shown that it achieves better geometric accuracy and a lower error rate in the estimation of tool conditions. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
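    A heavily simplified stand-in for the edge-extraction step is sketched below: smooth the monitored image, then take a morphological gradient and threshold it. The record's actual method first separates texture and cartoon components via morphological component analysis, which this sketch omits.

```python
# Sketch: morphological-gradient edge extraction (dilation minus erosion)
# on a synthetic "tool wear" image. A toy stand-in, not the paper's method.
import numpy as np
from scipy import ndimage

# Synthetic image: bright worn region on a dark, noisy background.
img = np.zeros((64, 64))
img[20:44, 18:50] = 1.0
img += 0.1 * np.random.default_rng(4).standard_normal(img.shape)

smooth = ndimage.gaussian_filter(img, sigma=1.5)     # suppress texture/noise
grad = (ndimage.grey_dilation(smooth, size=3)
        - ndimage.grey_erosion(smooth, size=3))      # morphological gradient
edges = grad > 0.5 * grad.max()                      # threshold the gradient
print(f"edge pixels found: {edges.sum()}")
```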

  20. Analysis of mechanism of carbide tool wear and control by wear process

    Directory of Open Access Journals (Sweden)

    Pham Hoang Trung

    2017-01-01

    Full Text Available An analysis of the physico-mechanical and thermophysical properties of hard alloys, depending on their chemical composition, is conducted. The correlation of the cutting properties and wear regularities of carbide tools with the cutting conditions and the thermophysical properties of the tool material is disclosed. Research by Russian scientists has established that not only the mechanical but, in the first place, the thermophysical properties of the tool and structural materials significantly influence tool wear, because in the range of industrially used cutting speeds tool wear is caused by diffusion processes. Directions for decreasing the intensity of tool wear are defined: determining rational processing conditions, and choosing tool materials and wear-resistant coatings for the tool surface.

  1. Managing complex research datasets using electronic tools: A meta-analysis exemplar

    Science.gov (United States)

    Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.

    2013-01-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256

  2. Simulation Tools for Forest Health Analysis: An Application in the Red River Watershed, Idaho

    Science.gov (United States)

    Andrew J. McMahan; Eric L. Smith

    2006-01-01

    Software tools for landscape analyses--including FVS model extensions, and a number of FVS-related pre- and post-processing “tools”--are presented, using an analysis in the Red River Watershed, Nez Perce National Forest as an example. We present (1) a discussion of pre-simulation data analysis; (2) the Physiographic Information Extraction System (PIES), a tool that can...

  3. Models as Tools of Analysis of a Network Organisation

    Directory of Open Access Journals (Sweden)

    Wojciech Pająk

    2013-06-01

    Full Text Available The paper presents models which may be applied as tools for the analysis of a network organisation. The starting point of the discussion is a definition of the terms supply chain and network organisation. Further parts of the paper present the basic assumptions of the analysis of a network organisation, and then characterise the best-known models used in such analysis. The purpose of the article is to define the notion and essence of network organisations and to present the models used for their analysis.

  4. Regression Analysis and Calibration Recommendations for the Characterization of Balance Temperature Effects

    Science.gov (United States)

    Ulbrich, N.; Volden, T.

    2018-01-01

    Analysis and use of temperature-dependent wind tunnel strain-gage balance calibration data are discussed in the paper. First, three different methods are presented and compared that may be used to process temperature-dependent strain-gage balance data. The first method uses an extended set of independent variables in order to process the data and predict balance loads. The second method applies an extended load iteration equation during the analysis of balance calibration data. The third method uses temperature-dependent sensitivities for the data analysis. Physical interpretations of the most important temperature-dependent regression model terms are provided that relate temperature compensation imperfections and the temperature-dependent nature of the gage factor to sets of regression model terms. Finally, balance calibration recommendations are listed so that temperature-dependent calibration data can be obtained and successfully processed using the reviewed analysis methods.
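    The first method (an extended set of independent variables) can be pictured with a toy linear model in which a gage output, temperature, and their product enter the load equation; the cross term absorbs temperature-dependent sensitivity. The data are synthetic, and real balance calibrations involve many gages and higher-order terms.

```python
# Sketch: least-squares fit of a load to an extended regressor set
# [1, rG, T, rG*T], where rG is a gage output and T is temperature.
import numpy as np

rng = np.random.default_rng(2)
n = 200
rG = rng.uniform(-1, 1, n)                  # gage output (normalized)
T = rng.uniform(10, 40, n)                  # temperature [C]
# Synthetic "truth": sensitivity drifts linearly with temperature.
load = 100 * rG + 0.05 * rG * (T - 20) + 0.2 * rng.standard_normal(n)

X = np.column_stack([np.ones(n), rG, T, rG * T])
coef, *_ = np.linalg.lstsq(X, load, rcond=None)
print("intercept, rG, T, rG*T coefficients:", np.round(coef, 3))
```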

  5. Toxic release consequence analysis tool (TORCAT) for inherently safer design plant

    International Nuclear Information System (INIS)

    Shariff, Azmi Mohd; Zaini, Dzulkarnain

    2010-01-01

    Many major accidents due to toxic releases in the past have caused many fatalities, such as the tragedy of the MIC release in Bhopal, India (1984). One approach is the inherently safer design technique, which utilizes inherent safety principles to eliminate or minimize accidents rather than control the hazard. This technique is best implemented in the preliminary design stage, where the consequences of a toxic release can be evaluated and necessary design improvements can be implemented to eliminate or minimize accidents to as low as reasonably practicable (ALARP) without resorting to costly protective systems. However, there is currently no commercial tool available with such capability. This paper reports preliminary findings on the development of a prototype tool for consequence analysis and design improvement via inherent safety principles, utilizing a process design simulator integrated with a toxic release consequence analysis model. Consequence analyses based on worst-case scenarios during the process flowsheeting stage were conducted as case studies. The preliminary findings show that the toxic release consequence analysis tool (TORCAT) has the capability to eliminate or minimize potential toxic release accidents by adopting inherent safety principles early in the preliminary design stage.

  6. A population MRI brain template and analysis tools for the macaque.

    Science.gov (United States)

    Seidlitz, Jakob; Sponheim, Caleb; Glen, Daniel; Ye, Frank Q; Saleem, Kadharbatcha S; Leopold, David A; Ungerleider, Leslie; Messinger, Adam

    2018-04-15

    The use of standard anatomical templates is common in human neuroimaging, as it facilitates data analysis and comparison across subjects and studies. For non-human primates, previous in vivo templates have lacked sufficient contrast to reliably validate known anatomical brain regions and have not provided tools for automated single-subject processing. Here we present the "National Institute of Mental Health Macaque Template", or NMT for short. The NMT is a high-resolution in vivo MRI template of the average macaque brain generated from 31 subjects, as well as a neuroimaging tool for improved data analysis and visualization. From the NMT volume, we generated maps of tissue segmentation and cortical thickness. Surface reconstructions and transformations to previously published digital brain atlases are also provided. We further provide an analysis pipeline using the NMT that automates and standardizes the time-consuming processes of brain extraction, tissue segmentation, and morphometric feature estimation for anatomical scans of individual subjects. The NMT and associated tools thus provide a common platform for precise single-subject data analysis and for characterizations of neuroimaging results across subjects and studies. Copyright © 2017 Elsevier. All rights reserved.

  7. Natural funnel asymmetries. A simulation analysis of the three basic tools of meta analysis

    DEFF Research Database (Denmark)

    Callot, Laurent Abdelkader Francois; Paldam, Martin

    Meta-analysis studies a set of estimates of one parameter with three basic tools: the funnel diagram, which shows the distribution of the estimates as a function of their precision; the funnel asymmetry test, FAT; and the meta average, of which PET is an estimate. The FAT-PET MRA is a meta regression analysis...
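    A minimal FAT-PET sketch follows: regress the reported effects on their standard errors by weighted least squares, so the intercept estimates the underlying effect (PET) and the slope tests funnel asymmetry (FAT). The data are simulated with deliberate publication bias, not drawn from any real literature.

```python
# Sketch of the FAT-PET regression: effect_i = PET + FAT * SE_i + error,
# estimated by WLS with weights 1/SE^2.
import numpy as np

rng = np.random.default_rng(3)
n = 50
se = rng.uniform(0.05, 0.5, n)                           # standard errors
effect = 0.2 + 1.5 * se + se * rng.standard_normal(n)    # biased literature

X = np.column_stack([np.ones(n), se])
W = np.diag(1.0 / se**2)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ effect)
print(f"PET (underlying effect) = {beta[0]:.3f}, FAT (asymmetry) = {beta[1]:.3f}")
```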

  8. Status of CONRAD, a nuclear reaction analysis tool

    International Nuclear Information System (INIS)

    Saint Jean, C. de; Habert, B.; Litaize, O.; Noguere, G.; Suteau, C.

    2008-01-01

    The development of a software tool (CONRAD) was initiated at CEA/Cadarache to give answers to various problems arising in the data analysis of nuclear reactions. This tool is then characterized by the handling of uncertainties from experimental values to covariance matrices for multi-group cross sections. An object oriented design was chosen allowing an easy interface with graphical tool for input/output data and being a natural framework for innovative nuclear models (Fission). The major achieved developments are a data model for describing channels, nuclear reactions, nuclear models and processes with interface to classical data formats, theoretical calculations for the resolved resonance range (Reich-Moore) and unresolved resonance range (Hauser-Feshbach, Gilbert-Cameron,...) with nuclear model parameters adjustment on experimental data sets and a Monte Carlo method based on conditional probabilities developed to calculate properly covariance matrices. The on-going developments deal with the experimental data description (covariance matrices) and the graphical user interface. (authors)

  9. Simulation analysis of temperature control on RCC arch dam of hydropower station

    Science.gov (United States)

    XIA, Shi-fa

    2017-12-01

    The temperature analysis of roller compacted concrete (RCC) dams plays an important role in their design and construction. Based on the three-dimensional finite element method, the computation of the temperature field takes many factors into account, such as the air temperature, the temperature rise due to cement hydration heat, the concrete temperature during placing, the influence of water in the reservoir, and the boundary temperatures. Using the corresponding parameters of an RCC arch dam, an analysis of the temperature and stress fields during the construction and operation periods is performed. The study demonstrates that a detailed thermal stress analysis should be performed for RCC dams to provide a basis for minimizing and controlling the occurrence of thermal cracking.
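    As a toy counterpart of such a simulation, the sketch below solves 1D transient heat conduction through a fresh RCC lift with an exponential hydration heat source. The geometry, material constants, and heat-release law are assumed for illustration and do not come from the study.

```python
# Sketch: 1D explicit finite-difference conduction with a hydration heat
# source whose release rate is Q0 * m * exp(-m*t). Assumed constants.
import numpy as np

L, nx = 3.0, 61                 # lift thickness [m], grid points
dx = L / (nx - 1)
alpha = 1.0e-6                  # thermal diffusivity [m^2/s] (typical concrete)
Q0, m = 20.0, 0.3 / 86400.0     # total adiabatic rise [K], release rate [1/s]
dt = 0.4 * dx * dx / alpha      # within the explicit stability limit
T = np.full(nx, 20.0)           # placing temperature [C]
T_air = 15.0

t = 0.0
while t < 28 * 86400:           # simulate 28 days
    lap = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    source = Q0 * m * np.exp(-m * t)        # hydration heat as a rate [K/s]
    T[1:-1] += dt * (alpha * lap + source)
    T[0] = T[-1] = T_air                    # fixed surface temperatures
    t += dt
print(f"core temperature after 28 d: {T[nx // 2]:.1f} C")
```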

  10. Gene Ontology-Based Analysis of Zebrafish Omics Data Using the Web Tool Comparative Gene Ontology.

    Science.gov (United States)

    Ebrahimie, Esmaeil; Fruzangohar, Mario; Moussavi Nik, Seyyed Hani; Newman, Morgan

    2017-10-01

    Gene Ontology (GO) analysis is a powerful tool in systems biology, which uses a defined nomenclature to annotate genes/proteins within three categories: "Molecular Function," "Biological Process," and "Cellular Component." GO analysis can assist in revealing functional mechanisms underlying observed patterns in transcriptomic, genomic, and proteomic data. The already extensive and increasing use of zebrafish for modeling genetic and other diseases highlights the need to develop a GO analytical tool for this organism. The web tool Comparative GO was originally developed for GO analysis of bacterial data in 2013 (www.comparativego.com). We have now upgraded and elaborated this web tool for analysis of zebrafish genetic data using GOs and annotations from the Gene Ontology Consortium.

  11. Analysis of thermodynamic properties for high-temperature superconducting oxides

    International Nuclear Information System (INIS)

    Kushwah, S.S.; Shanker, J.

    1993-01-01

    Analysis of thermodynamic properties such as specific heat, Debye temperature, Einstein temperature, thermal expansion coefficient, bulk modulus, and Grueneisen parameter is performed for rare-earth-based, Tl-based, and Bi-based superconducting copper oxides. Values of thermodynamic parameters are calculated and reported. The relationship between the Debye temperature and the superconducting transition temperature is used to estimate the values of T c using the interaction parameters from Ginzburg. (orig.)

  12. Prototype Development of a Tradespace Analysis Tool for Spaceflight Medical Resources.

    Science.gov (United States)

    Antonsen, Erik L; Mulcahy, Robert A; Rubin, David; Blue, Rebecca S; Canga, Michael A; Shah, Ronak

    2018-02-01

    The provision of medical care in exploration-class spaceflight is limited by mass, volume, and power constraints, as well as limitations of available skillsets of crewmembers. A quantitative means of exploring the risks and benefits of inclusion or exclusion of onboard medical capabilities may help to inform the development of an appropriate medical system. A pilot project was designed to demonstrate the utility of an early tradespace analysis tool for identifying high-priority resources geared toward properly equipping an exploration mission medical system. Physician subject matter experts identified resources, tools, and skillsets required, as well as associated criticality scores of the same, to meet terrestrial, U.S.-specific ideal medical solutions for conditions concerning for exploration-class spaceflight. A database of diagnostic and treatment actions and resources was created based on this input and weighed against the probabilities of mission-specific medical events to help identify common and critical elements needed in a future exploration medical capability. Analysis of repository data demonstrates the utility of a quantitative method of comparing various medical resources and skillsets for future missions. Directed database queries can provide detailed comparative estimates concerning likelihood of resource utilization within a given mission and the weighted utility of tangible and intangible resources. This prototype tool demonstrates one quantitative approach to the complex needs and limitations of an exploration medical system. While this early version identified areas for refinement in future version development, more robust analysis tools may help to inform the development of a comprehensive medical system for future exploration missions.Antonsen EL, Mulcahy RA, Rubin D, Blue RS, Canga MA, Shah R. Prototype development of a tradespace analysis tool for spaceflight medical resources. Aerosp Med Hum Perform. 2018; 89(2):108-114.

  13. Self propagating high temperature synthesis of mixed carbide and boride powder systems for cutting tools manufacturing

    International Nuclear Information System (INIS)

    Vallauri, D.; Cola, P.L. de; Piscone, F.; Amato, I.

    2001-01-01

    TiC-TiB2 composites have been produced via the SHS technique starting from low cost raw materials such as TiO2, B4C and Mg. The influence of the diluent phase (Mg, TiC) content on combustion temperature has been investigated. The use of magnesium as the reductant phase allowed acid leaching of the undesired oxide product (MgO), leaving pure hard materials with a fine particle size suitable for cutting tool manufacturing through a cold pressing and sintering route. The densification has been shown to be strongly dependent on the wetting additions. The influence of the metal binder and wetting additions on the sintering process has been investigated. The obtained materials were characterized from the point of view of cutting tool life (hardness, toughness, strength). (author)

  14. SIMONE: Tool for Data Analysis and Simulation

    International Nuclear Information System (INIS)

    Chudoba, V.; Hnatio, B.; Sharov, P.; Papka, Paul

    2013-06-01

    SIMONE is a software tool based on the ROOT Data Analysis Framework and developed in collaboration of FLNR JINR and iThemba LABS. It is intended for physicists planning experiments and analysing experimental data. The goal of the SIMONE framework is to provide a flexible system, user friendly, efficient and well documented. It is intended for simulation of a wide range of Nuclear Physics experiments. The most significant conditions and physical processes can be taken into account during simulation of the experiment. The user can create his own experimental setup through the access of predefined detector geometries. Simulated data is made available in the same format as for the real experiment for identical analysis of both experimental and simulated data. Significant time reduction is expected during experiment planning and data analysis. (authors)

  15. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.

  16. A dataflow analysis tool for parallel processing of algorithms

    Science.gov (United States)

    Jones, Robert L., III

    1993-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.
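    Two of the graph-theoretic quantities such a tool derives, the critical-path bound on iteration time and a lower bound on processor count, can be computed as below for a made-up dataflow graph.

```python
# Sketch: critical-path (longest path) bound and a processor lower bound
# for a small, made-up dataflow graph executed repetitively.
import math
from collections import defaultdict

times = {"A": 2, "B": 3, "C": 4, "D": 1, "E": 2}      # node execution times
edges = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D"), ("D", "E")]

succ, pred = defaultdict(list), defaultdict(list)
indeg = {v: 0 for v in times}
for u, v in edges:
    succ[u].append(v)
    pred[v].append(u)
    indeg[v] += 1

# Topological traversal computing earliest-finish times.
ready = [v for v in times if indeg[v] == 0]
finish = {}
while ready:
    u = ready.pop()
    finish[u] = times[u] + max((finish[p] for p in pred[u]), default=0)
    for v in succ[u]:
        indeg[v] -= 1
        if indeg[v] == 0:
            ready.append(v)

critical_path = max(finish.values())                  # latency lower bound
processors = math.ceil(sum(times.values()) / critical_path)
print(f"critical path = {critical_path}, processor lower bound = {processors}")
```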

  17. Tools and Algorithms for Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 6th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2000, held as part of ETAPS 2000 in Berlin, Germany, in March/April 2000. The 33 revised full papers presented together with one invited...... paper and two short tool descriptions were carefully reviewed and selected from a total of 107 submissions. The papers are organized in topical sections on software and formal methods, formal methods, timed and hybrid systems, infinite and parameterized systems, diagnostic and test generation, efficient...

  18. Enabling Collaborative Analysis: State Evaluation Groups, the Electronic State File, and Collaborative Analysis Tools

    International Nuclear Information System (INIS)

    Eldridge, C.; Gagne, D.; Wilson, B.; Murray, J.; Gazze, C.; Feldman, Y.; Rorif, F.

    2015-01-01

    The timely collection and analysis of all safeguards-relevant information is the key to drawing and maintaining soundly-based safeguards conclusions. In this regard, the IAEA has made multidisciplinary State Evaluation Groups (SEGs) central to this process. To date, SEGs have been established for all States and tasked with developing State-level approaches (including the identification of technical objectives), drafting annual implementation plans specifying the field and headquarters activities necessary to meet technical objectives, updating the State evaluation on an ongoing basis to incorporate new information, preparing an annual evaluation summary, and recommending a safeguards conclusion to IAEA senior management. To accomplish these tasks, SEGs need to be staffed with relevant expertise and empowered with tools that allow for collaborative access to, and analysis of, disparate information sets. To ensure SEGs have the requisite expertise, members are drawn from across the Department of Safeguards based on their knowledge of relevant data sets (e.g., nuclear material accountancy, material balance evaluation, environmental sampling, satellite imagery, open source information, etc.) or their relevant technical (e.g., fuel cycle) expertise. SEG members also require access to all available safeguards-relevant data on the State. To facilitate this, the IAEA is developing a common, secure platform where all safeguards information can be electronically stored and made available for analysis (an electronic State file). The structure of this SharePoint-based system supports IAEA information collection processes, enables collaborative analysis by SEGs, and provides for management insight and review. In addition to this common platform, the Agency is developing, deploying, and/or testing sophisticated data analysis tools that can synthesize information from diverse information sources and analyze diverse datasets from multiple viewpoints (e.g., temporal, geospatial).

  19. Development of a coupled neutronic/thermal-hydraulic tool with multi-scale capabilities and applications to HPLWR core analysis

    International Nuclear Information System (INIS)

    Monti, Lanfranco; Starflinger, Joerg; Schulenberg, Thomas

    2011-01-01

    Highlights: → Advanced analysis and design techniques for innovative reactors are addressed. → Detailed investigation of a 3-pass core design with a multi-physics, multi-scale tool. → Coupled 40-group neutron transport/equivalent-channel TH core analysis methods. → Multi-scale capabilities: from equivalent channels to sub-channel pin-by-pin study. → High fidelity approach: reduction of conservatism involved in core simulations. - Abstract: The High Performance Light Water Reactor (HPLWR) is a thermal spectrum nuclear reactor cooled and moderated with light water operated at supercritical pressure. It is an innovative reactor concept, which requires developing and applying advanced analysis tools as described in the paper. The strong water density reduction associated with the heat-up, together with the multi-pass core design, results in a pronounced coupling between the neutronic and thermal-hydraulic analyses, which must take into account the strong mutual influence of the in-core distributions of power generation and water properties. The neutron flux gradients within the multi-pass core, together with the pronounced dependence of water properties on temperature, require a fine spatial resolution in which the individual fuel pins are resolved to provide a precise evaluation of the clad temperature, currently considered one of the crucial design criteria. These goals have been achieved with an advanced analysis method based on existing codes coupled through developed interfaces. Initially, neutronic and thermal-hydraulic full core calculations are iterated until a consistent solution is found to determine the steady state full power condition of the HPLWR core. Results of few-group neutronic analyses might be less reliable for the HPLWR 3-pass core than for conventional LWRs because of considerable changes of the neutron spectrum within the core, hence 40-group transport theory has been preferred to the

  20. Innovative Instrumentation and Analysis of the Temperature Measurement for High Temperature Gasification

    Energy Technology Data Exchange (ETDEWEB)

    Seong W. Lee

    2006-09-30

    The project entitled "Innovative Instrumentation and Analysis of the Temperature Measurement for High Temperature Gasification" was successfully completed by the Principal Investigator, Dr. S. Lee, and his research team in the Center for Advanced Energy Systems and Environmental Control Technologies at Morgan State University. The major results and outcomes were presented in semi-annual progress reports and annual project review meetings/presentations. Specifically, a literature survey covering gasifier temperature measurement, ultrasonic cleaning applications, and the spray coating process, along with gasifier simulator (cold model) testing, was successfully completed during the first year. The results show that four factors (blower voltage, ultrasonic application, injection time intervals, particle weight) significantly affect the temperature measurement. The gasifier simulator (hot model) design and fabrication, as well as systematic hot-model tests of the significant factors on temperature measurement, were then completed in the second year. Advanced industrial analytic methods such as statistics-based experimental design, analysis of variance (ANOVA) and regression methods were applied in the hot model tests. The results show that operational parameters (i.e. air flow rate, water flow rate, fine dust particle amount, ammonia addition) had a significant impact on the temperature measurement inside the gasifier simulator, and that experimental design and ANOVA are very efficient ways to design and analyze the experiments. The air flow rate and fine dust particle amount are statistically significant for the temperature measurement, and the regression model provided the functional relation between the temperature and these factors with substantial accuracy. In the last year of the project period, the ultrasonic and subsonic cleaning methods and coating

  1. Analysis of optimal design of low temperature economizer

    Science.gov (United States)

    Song, J. H.; Wang, S.

    2017-11-01

    This paper studies the off-design characteristics of a low temperature economizer system based on thermodynamic analysis. Using data from a 1000 MW coal-fired unit, two modes of operation are contrasted and analyzed: one fixes the exhaust gas temperature, and the other takes into account both the average temperature difference and the exhaust gas temperature. The cause of the change in energy saving effect is also explored. Results show that in mode 1 the reduction in coal consumption falls from 1.11 g/kWh (at full load) to 0.54 g/kWh (at half load), and in mode 2, as load decreases from 90% to 50%, the reduction falls from 1.29 g/kWh to 0.84 g/kWh. Thus the energy saving effect is superior at high load and declines rapidly as load is reduced. When load changes, the heat transfer temperature difference, gas flow, flue gas heat rejection and waste heat recovery all change, which makes the energy saving effect at high load both superior and more stable. However, rational adjustment of the outlet gas temperature can alleviate the decline of the energy saving effect at low load. The results provide theoretical analysis data for the optimal design and operation of low temperature economizer systems in power plants.

  2. Electronic tools for health information exchange: an evidence-based analysis.

    Science.gov (United States)

    2013-01-01

    As patients experience transitions in care, there is a need to share information between care providers in an accurate and timely manner. With the push towards electronic medical records and other electronic tools (eTools) for health information exchange, and away from paper-based health records, there remains uncertainty around the impact of eTools as a form of communication. The objective was to examine the impact of eTools for health information exchange in the context of care coordination for individuals with chronic disease in the community. A literature search was performed on April 26, 2012, using OVID MEDLINE, OVID MEDLINE In-Process and Other Non-Indexed Citations, OVID EMBASE, EBSCO Cumulative Index to Nursing & Allied Health Literature (CINAHL), the Wiley Cochrane Library, and the Centre for Reviews and Dissemination database, for studies published until April 26, 2012 (no start date limit was applied). A systematic literature search was conducted, and meta-analysis conducted where appropriate. Outcomes of interest fell into 4 categories: health services utilization, disease-specific clinical outcomes, process-of-care indicators, and measures of efficiency. The quality of the evidence was assessed individually for each outcome. Expert panels were assembled for stakeholder engagement and contextualization. Eleven articles were identified (4 randomized controlled trials and 7 observational studies). There was moderate quality evidence of a reduction in hospitalizations, hospital length of stay, and emergency department visits following the implementation of an electronically generated laboratory report with recommendations based on clinical guidelines. The evidence showed no difference in disease-specific outcomes; there was no evidence of a positive impact on process-of-care indicators or measures of efficiency. A limited body of research specifically examined eTools for health information exchange in the population and setting of interest. This evidence included a

  3. Principles and tools for collaborative entity-based intelligence analysis.

    Science.gov (United States)

    Bier, Eric A; Card, Stuart K; Bodnar, John W

    2010-01-01

    Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested in whether software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have modified the Entity Workspace system, described previously, to test such designs. We have evaluated the resulting design in both a laboratory study and a study where it is situated with an analysis team. In both cases, effects on collaboration appear to be positive. Key aspects of the design include an evidence notebook optimized for organizing entities (rather than text characters), information structures that can be collapsed and expanded, visualization of evidence that emphasizes events and documents (rather than emphasizing the entity graph), and a notification system that finds entities of mutual interest to multiple analysts. Long-term tests suggest that this approach can support both top-down and bottom-up styles of analysis.

  4. RdTools: An Open Source Python Library for PV Degradation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Deceglie, Michael G [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Jordan, Dirk [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Nag, Ambarish [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Deline, Christopher A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Shinn, Adam [kWh Analytics

    2018-05-04

    RdTools is a set of Python tools for analysis of photovoltaic data. In particular, PV production data is evaluated over several years to obtain rates of performance degradation over time. RdTools can handle both high frequency (hourly or better) and low frequency (daily, weekly, etc.) datasets. Best results are obtained with higher frequency data.
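
    The year-on-year approach can be sketched directly in pandas: compare each normalized-energy sample with the sample one year later and take the median annualized change. The synthetic series below is an assumption for illustration; RdTools itself adds filtering, normalization, and bootstrapped confidence intervals:

```python
# Minimal year-on-year degradation sketch on a synthetic daily series.
# The real library adds filtering, normalization and bootstrapped
# confidence intervals; this only shows the core idea.
import numpy as np
import pandas as pd

idx = pd.date_range("2015-01-01", periods=5 * 365, freq="D")
true_rd = -0.005  # -0.5 %/year, synthetic ground truth
energy = pd.Series(
    (1 + true_rd) ** (np.arange(len(idx)) / 365.0)
    * (1 + 0.01 * np.random.default_rng(1).standard_normal(len(idx))),
    index=idx,
)

# Ratio of each sample to the sample exactly one year earlier.
yoy = energy / energy.shift(365) - 1.0
rd_per_year = yoy.median()
print(f"estimated degradation rate: {rd_per_year * 100:.2f} %/yr")
```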

  5. Comparison of the Effects of Tool Geometry for Friction Stir Welding Thin Sheet Aluminum Alloys for Aerospace Applications

    Science.gov (United States)

    Merry, Josh; Takeshita, Jennifer; Tweedy, Bryan; Burford, Dwight

    2006-01-01

    In this presentation, the results of a recent study on the effect of pin tool design for friction stir welding thin sheets (0.040") of aluminum alloys 2024 and 7075 are provided. The objective of this study was to investigate and document the effect of tool shoulder and pin diameter, as well as the presence of pin flutes, on the resultant microstructure and mechanical properties at both room temperature and cryogenic temperature. Specifically, the comparison between the three tools includes: FSW process load analysis (tool forces required to fabricate the welds), static mechanical properties (ultimate tensile strength, yield strength, and elongation), and a process window documenting the range of parameters that can be used with the three pin tools investigated. All samples were naturally aged for a period greater than 10 days. Prior research has shown that 7075 may require post-weld heat treatment; therefore, an additional pair of room temperature and cryogenic temperature samples was post-weld aged to the 7075-T7 condition prior to mechanical testing.

  6. Microscopy image segmentation tool: Robust image data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Valmianski, Ilya, E-mail: ivalmian@ucsd.edu; Monton, Carlos; Schuller, Ivan K. [Department of Physics and Center for Advanced Nanoscience, University of California San Diego, 9500 Gilman Drive, La Jolla, California 92093 (United States)

    2014-03-15

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible, allowing incorporation of specialized user-developed analyses. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  7. Microscopy image segmentation tool: Robust image data analysis

    Science.gov (United States)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-03-01

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible, allowing incorporation of specialized user-developed analyses. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  8. Microscopy image segmentation tool: Robust image data analysis

    International Nuclear Information System (INIS)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-01-01

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible, allowing incorporation of specialized user-developed analyses. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  9. Numerical and experimental investigation of thermoelectric cooling in down-hole measuring tools; a case study

    Directory of Open Access Journals (Sweden)

    Rohitha Weerasinghe

    2017-09-01

    The use of Peltier cooling in down-hole seismic tooling has been restricted by the performance of such devices at elevated temperatures. This paper analyses the performance of Peltier cooling at temperatures typical of down-hole measuring equipment, using measurements, manufacturer performance data and computational fluid dynamic analysis. Peltier performance prediction techniques are presented together with measurements, and the validity of extrapolating thermoelectric cooling performance to elevated temperatures is tested using computational models of the thermoelectric cooling device. The method is used to model the cooling characteristics of a prototype downhole tool, and the computational technique is shown to be valid.
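
    For orientation, the textbook single-stage thermoelectric model captures the trade-off such predictions rest on: the Peltier term grows linearly with current while Joule heating grows quadratically, so each module has an optimum current. The constant module properties below are illustrative assumptions; real device properties drift with temperature, which is precisely what the paper's extrapolation test probes:

```python
# Textbook single-stage thermoelectric cooler model with constant
# (idealized) module properties; parameter values are illustrative.
import numpy as np

alpha = 0.05   # module Seebeck coefficient [V/K]
R = 2.0        # module electrical resistance [ohm]
K = 0.5        # module thermal conductance [W/K]

def cooling_power(I, T_cold, T_hot):
    """Net heat pumped from the cold side [W]."""
    return alpha * I * T_cold - 0.5 * I**2 * R - K * (T_hot - T_cold)

T_hot, T_cold = 420.0, 400.0               # a down-hole-like condition [K]
currents = np.linspace(0, 15, 301)
Qc = cooling_power(currents, T_cold, T_hot)

I_opt = currents[np.argmax(Qc)]            # optimum where d(Qc)/dI = 0, i.e. alpha*T_cold/R
print(f"optimum current {I_opt:.1f} A, max cooling {Qc.max():.1f} W")
```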

  10. Battery switch for downhole tools

    Science.gov (United States)

    Boling, Brian E.

    2010-02-23

    An electrical circuit for a downhole tool may include a battery, a load electrically connected to the battery, and at least one switch electrically connected in series with the battery and to the load. The at least one switch may be configured to close when a tool temperature exceeds a selected temperature.

  11. Gender analysis of use of participatory tools among extension workers

    African Journals Online (AJOL)

    (χ² = 0.833, p = 0.361; t = 0.737, p = 0.737, CC = 0.396) Participatory tools used by both male and female extension personnel include resource map, mobility map, transect map, focus group discussion, Venn diagram, seasonal calendar, SWOT analysis, semi-structured interview, daily activity schedule, resource analysis, ...

  12. Introduction, comparison, and validation of Meta‐Essentials: A free and simple tool for meta‐analysis

    Science.gov (United States)

    van Rhee, Henk; Hak, Tony

    2017-01-01

    We present a new tool for meta‐analysis, Meta‐Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta‐analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta‐Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta‐analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp‐Hartung adjustment of the DerSimonian‐Laird estimator. However, more advanced meta‐analysis methods such as meta‐analytical structural equation modelling and meta‐regression with multiple covariates are not available. In summary, Meta‐Essentials may prove a valuable resource for meta‐analysts, including researchers, teachers, and students. PMID:28801932

  13. Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-05-19

    A partnership across government, academic, and private sectors has created a novel system that enables climate researchers to solve current and emerging data analysis and visualization challenges. The Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) software project utilizes the Python application programming interface (API) combined with C/C++/Fortran implementations for performance-critical software, offering the best compromise between "scalability" and "ease-of-use." The UV-CDAT system is highly extensible and customizable for high-performance interactive and batch visualization and analysis for climate science and other disciplines of the geosciences. For complex, climate data-intensive computing, UV-CDAT's inclusive framework supports Message Passing Interface (MPI) parallelism as well as task farming and other forms of parallelism. More specifically, the UV-CDAT framework supports the execution of Python scripts running in parallel using the MPI executable commands and leverages Department of Energy (DOE)-funded general-purpose, scalable parallel visualization tools such as ParaView and VisIt. This is the first system to be successfully designed in this way and with these features. The climate community leverages these tools and others in support of a parallel client-server paradigm, allowing extreme-scale, server-side computing for maximum possible speed-up.
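
    The parallel-script pattern described, the same Python analysis launched under MPI with ranks dividing the work, can be sketched with mpi4py; the file names and the per-file analysis function below are placeholders, not UV-CDAT's API:

```python
# Minimal MPI task-farming sketch with mpi4py: each rank processes an
# interleaved share of the files, and rank 0 gathers the results.
# Run with e.g.: mpiexec -n 8 python analyze.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

files = [f"tas_{year}.nc" for year in range(1950, 2010)]  # placeholder names

def analyze(path):
    # Placeholder for the real per-file climate analysis.
    return (path, len(path))

local_results = [analyze(f) for f in files[rank::size]]
all_results = comm.gather(local_results, root=0)

if rank == 0:
    merged = [r for chunk in all_results for r in chunk]
    print(f"processed {len(merged)} files on {size} ranks")
```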

  14. AN ANALYSIS OF THE CAUSES OF PRODUCT DEFECTS USING QUALITY MANAGEMENT TOOLS

    Directory of Open Access Journals (Sweden)

    Katarzyna MIDOR

    2014-10-01

    To maintain or strengthen its market position, a modern business needs to follow the principles of quality control in its actions. Especially important is the Zero Defects concept developed by Philip Crosby, which aims at flawless production by preventing the occurrence of defects and flaws in all production stages. Achieving this requires, among other things, the use of quality management tools. This article presents an analysis of the reasons for the return of damaged or faulty goods in the automotive industry by means of quality management tools such as the Ishikawa diagram and Pareto analysis, which allow the causes of product defectiveness to be identified. Based on the results, preventive measures are proposed. The actions presented in this article and the results of the analysis demonstrate the effectiveness of these quality management tools.
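
    A Pareto analysis of return causes reduces to sorting defect counts and accumulating their percentage share; the cause names and counts below are invented for illustration:

```python
# Minimal Pareto analysis: rank defect causes and find the "vital few"
# that account for ~80% of returns. Data are invented for illustration.
import pandas as pd

counts = pd.Series(
    {"paint scratch": 120, "loose connector": 75, "wrong label": 30,
     "missing screw": 15, "packaging damage": 10},
    name="returns",
).sort_values(ascending=False)

cum_pct = counts.cumsum() / counts.sum() * 100
pareto = pd.DataFrame({"returns": counts, "cumulative %": cum_pct.round(1)})
print(pareto)
print("vital few:", list(pareto.index[(cum_pct <= 80).to_numpy()]))
```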

  15. Federal metering data analysis needs and existing tools

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-01

    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs come into place. Unfortunately, there is no "one-size-fits-all" solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses Federal metering data analysis needs and some existing tools.

  16. Introduction, comparison, and validation of Meta-Essentials: A free and simple tool for meta-analysis.

    Science.gov (United States)

    Suurmond, Robert; van Rhee, Henk; Hak, Tony

    2017-12-01

    We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
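
    The random-effects machinery named in the abstract is compact enough to sketch: DerSimonian-Laird estimates the between-study variance from Cochran's Q, and the Knapp-Hartung adjustment replaces the normal-theory variance with a t-based one. The effect sizes and variances below are invented:

```python
# DerSimonian-Laird random-effects pooling with the Knapp-Hartung
# adjustment, as a sketch. Effect sizes and variances are invented.
import numpy as np
from scipy import stats

y = np.array([0.30, 0.10, 0.45, 0.25, 0.05])   # study effect sizes
v = np.array([0.02, 0.03, 0.05, 0.01, 0.04])   # within-study variances
k = len(y)

w = 1.0 / v
y_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fixed) ** 2)              # Cochran's Q
C = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (Q - (k - 1)) / C)              # DerSimonian-Laird tau^2

w_star = 1.0 / (v + tau2)                       # random-effects weights
mu = np.sum(w_star * y) / np.sum(w_star)

# Knapp-Hartung variance and t-based confidence interval.
var_kh = np.sum(w_star * (y - mu) ** 2) / ((k - 1) * np.sum(w_star))
half = stats.t.ppf(0.975, k - 1) * np.sqrt(var_kh)
print(f"pooled effect {mu:.3f}, 95% CI [{mu - half:.3f}, {mu + half:.3f}]")
```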

  17. Multispectral analysis tools can increase utility of RGB color images in histology

    Science.gov (United States)

    Fereidouni, Farzad; Griffin, Croix; Todd, Austin; Levenson, Richard

    2018-04-01

    Multispectral imaging (MSI) is increasingly finding application in the study and characterization of biological specimens. However, the methods typically used come with challenges on both the acquisition and the analysis front. MSI can be slow and photon-inefficient, leading to long imaging times and possible phototoxicity and photobleaching. The resulting datasets can be large and complex, prompting the development of a number of mathematical approaches for segmentation and signal unmixing. We show that under certain circumstances, just the three spectral channels provided by standard color cameras, coupled with multispectral analysis tools, including a more recent spectral phasor approach, can efficiently provide useful insights. These findings are supported by a mathematical model relating spectral bandwidth and spectral channel number to achievable spectral accuracy. The utility of 3-band RGB and MSI analysis tools is demonstrated on images acquired using brightfield and fluorescence techniques, as well as a novel microscopy approach employing UV surface excitation. Supervised linear unmixing, automated non-negative matrix factorization and phasor analysis tools all provide useful results, with phasors generating particularly helpful spectral display plots for sample exploration.
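
    A spectral phasor maps each pixel's N-channel spectrum to a point via its first Fourier harmonic; with N = 3 (RGB) the transform is a handful of lines. A minimal sketch on a random stand-in image, assuming this standard phasor definition:

```python
# Spectral phasor coordinates for an RGB image: project each pixel's
# 3-point "spectrum" onto the first Fourier harmonic. Image is random.
import numpy as np

rgb = np.random.default_rng(2).random((64, 64, 3))  # stand-in image
n = np.arange(3)
total = rgb.sum(axis=-1) + 1e-12                    # avoid divide-by-zero

# G and S are the normalized real/imaginary first-harmonic components.
G = (rgb * np.cos(2 * np.pi * n / 3)).sum(axis=-1) / total
S = (rgb * np.sin(2 * np.pi * n / 3)).sum(axis=-1) / total

# Each pixel now maps to a point (G, S) in the phasor plane; clusters
# in that plane correspond to spectrally similar pixels.
print(G.shape, S.shape, float(G.mean()), float(S.mean()))
```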

  18. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software used a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), which is an area of constant intraplate seismicity and non-orogenic active tectonics and exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing using Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
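
    Hack's (1973) stream-length gradient index, one of the methods the tool builds on, flags anomalous steepening along a drainage profile; a minimal sketch over a synthetic longitudinal profile (elevations, distances and the detection threshold are invented) follows:

```python
# Hack's stream-length gradient index SL = (dH/dL) * L along a river
# profile; spikes in SL mark candidate knickpoints. Profile is synthetic.
import numpy as np

L = np.linspace(100, 10_000, 200)          # distance from headwater [m]
H = 500 * np.exp(-L / 4000)                # smooth concave profile [m]
H[120:125] -= 15                           # inject a local step (knickzone)

dH = -np.diff(H)                           # elevation drop per segment
dL = np.diff(L)
L_mid = 0.5 * (L[:-1] + L[1:])
SL = (dH / dL) * L_mid

threshold = SL.mean() + 2 * SL.std()       # simple anomaly threshold
knicks = L_mid[SL > threshold]
print(f"{len(knicks)} candidate knickpoints near L = {knicks.round(0)}")
```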

  19. Tool wear of a single-crystal diamond tool in nano-groove machining of a quartz glass plate

    International Nuclear Information System (INIS)

    Yoshino, Masahiko; Nakajima, Satoshi; Terano, Motoki

    2015-01-01

    Tool wear characteristics of a diamond tool in ductile mode machining are presented in this paper. Nano-groove machining of a quartz glass plate was conducted to examine the tool wear rate of a single-crystal diamond tool. Effects of lubrication on the tool wear rate were also evaluated. A numerical simulation technique was developed to evaluate the tool temperature and normal stress acting on the wear surface. From the simulation results it was found that the tool temperature does not increase during the machining experiment. It is also demonstrated that tool wear is attributed to the abrasive wear mechanism, but the effect of the adhesion wear mechanism is minor in nano-groove machining. It is found that the tool wear rate is reduced by using water or kerosene as a lubricant. (paper)

  20. High-Performance Data Analysis Tools for Sun-Earth Connection Missions

    Science.gov (United States)

    Messmer, Peter

    2011-01-01

    The data analysis tool of choice for many Sun-Earth Connection missions is the Interactive Data Language (IDL) by ITT VIS. The increasing amount of data produced by these missions and the increasing complexity of image processing algorithms requires access to higher computing power. Parallel computing is a cost-effective way to increase the speed of computation, but algorithms oftentimes have to be modified to take advantage of parallel systems. Enhancing IDL to work on clusters gives scientists access to increased performance in a familiar programming environment. The goal of this project was to enable IDL applications to benefit from both computing clusters as well as graphics processing units (GPUs) for accelerating data analysis tasks. The tool suite developed in this project enables scientists now to solve demanding data analysis problems in IDL that previously required specialized software, and it allows them to be solved orders of magnitude faster than on conventional PCs. The tool suite consists of three components: (1) TaskDL, a software tool that simplifies the creation and management of task farms, collections of tasks that can be processed independently and require only small amounts of data communication; (2) mpiDL, a tool that allows IDL developers to use the Message Passing Interface (MPI) inside IDL for problems that require large amounts of data to be exchanged among multiple processors; and (3) GPULib, a tool that simplifies the use of GPUs as mathematical coprocessors from within IDL. mpiDL is unique in its support for the full MPI standard and its support of a broad range of MPI implementations. GPULib is unique in enabling users to take advantage of an inexpensive piece of hardware, possibly already installed in their computer, and achieve orders of magnitude faster execution time for numerically complex algorithms. TaskDL enables the simple setup and management of task farms on compute clusters. The products developed in this project have the

  1. HDAT: web-based high-throughput screening data analysis tools

    International Nuclear Information System (INIS)

    Liu, Rong; Hassan, Taimur; Rallo, Robert; Cohen, Yoram

    2013-01-01

    The increasing utilization of high-throughput screening (HTS) in toxicity studies of engineered nanomaterials (ENMs) requires tools for the rapid and reliable processing and analysis of large HTS datasets. To meet this need, a web-based platform for HTS data analysis tools (HDAT) was developed that provides statistical methods suitable for ENM toxicity data. As a publicly available computational nanoinformatics infrastructure, HDAT provides different plate normalization methods, various HTS summarization statistics, self-organizing map (SOM)-based clustering analysis, and visualization of raw and processed data using both heat maps and SOMs. HDAT has been successfully used in a number of HTS studies of ENM toxicity, thereby enabling analysis of toxicity mechanisms and development of structure–activity relationships for ENM toxicity. The online approach afforded by HDAT should encourage standardization of, and future advances in, HTS as well as facilitate convenient inter-laboratory comparisons of HTS datasets. (paper)
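
    Plate normalization, one of the first processing steps such a platform offers, can be as simple as a per-plate Z-score that puts wells from different plates on a common scale; a sketch on invented plate readings (not HDAT's actual methods):

```python
# Per-plate Z-score normalization, a common first step for HTS data.
# Readings are invented; each row of `raw` is one 96-well plate.
import numpy as np

rng = np.random.default_rng(3)
n_plates, n_wells = 4, 96
plate_offsets = rng.normal(0, 50, size=(n_plates, 1))   # plate-to-plate drift
raw = 1000 + plate_offsets + rng.normal(0, 30, size=(n_plates, n_wells))

# Normalize each plate by its own mean and standard deviation so that
# wells from different plates live on a common scale.
z = (raw - raw.mean(axis=1, keepdims=True)) / raw.std(axis=1, keepdims=True)
print(z.mean(axis=1).round(3), z.std(axis=1).round(3))  # ~0 and ~1 per plate
```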

  2. Tools-4-Metatool (T4M): online suite of web-tools to process stoichiometric network analysis data from Metatool.

    Science.gov (United States)

    Xavier, Daniela; Vázquez, Sara; Higuera, Clara; Morán, Federico; Montero, Francisco

    2011-08-01

    Tools-4-Metatool (T4M) is a suite of web-tools, implemented in PERL, which analyses, parses, and manipulates files related to Metatool. Its main goal is to assist work with Metatool. T4M has two major sets of tools: Analysis and Compare. Analysis visualizes the results of Metatool (convex basis, elementary flux modes, and enzyme subsets) and facilitates the study of metabolic networks. It is composed of five tools: MDigraph, MetaMatrix, CBGraph, EMGraph, and SortEM. Compare was developed to compare Metatool results from different networks. This set consists of Compara and ComparaSub, which compare network subsets and provide outputs in different formats, and ComparaEM, which searches for identical elementary modes in two metabolic networks. The suite also includes a script that generates Metatool input, CBasis2Metatool, based on a Metatool output file filtered by a list of convex-basis metabolites. Finally, the utility CheckMIn checks the consistency of the Metatool input file. T4M is available at http://solea.quim.ucm.es/t4m. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  3. Analysis of design tool attributes with regards to sustainability benefits

    Science.gov (United States)

    Zain, S.; Ismail, A. F.; Ahmad, Z.; Adesta, E. Y. T.

    2018-01-01

    The trend of global manufacturing competitiveness has shown a significant shift from profit- and customer-driven business to a more harmonious sustainability paradigm. This new direction, which emphasises the interests of the three pillars of sustainability, i.e., the social, economic and environmental dimensions, has changed the way products are designed. As a result, the role of design tools in the product development stage of manufacturing in adapting to the new strategy is vital and increasingly challenging. The aim of this paper is to review the literature on the attributes of design tools from the sustainability perspective. Four well-established design tools are selected, namely Quality Function Deployment (QFD), Failure Mode and Effects Analysis (FMEA), Design for Six Sigma (DFSS) and Design for Environment (DfE). By analysing previous studies, the main attributes of each design tool and its benefits with respect to each sustainability dimension throughout the four stages of the product lifecycle are discussed. From this study, it is learnt that each of the design tools contributes to the three pillars of sustainability either directly or indirectly, but in an unbalanced and non-holistic way. Therefore, the prospect of improving and optimising the design tools is projected, and the possibility of collaboration between the different tools is discussed.

  4. Thermal Error Test and Intelligent Modeling Research on the Spindle of High Speed CNC Machine Tools

    Science.gov (United States)

    Luo, Zhonghui; Peng, Bin; Xiao, Qijun; Bai, Lu

    2018-03-01

    Thermal error is the main factor affecting the accuracy of precision machining. Given that thermal error is a current research focus for machine tools, this paper studies, through experiments, thermal error testing and intelligent modeling for the spindle of a vertical high speed CNC machine tool. Several thermal error testing devices are designed, in which 7 temperature sensors measure the temperature of the machine tool spindle system and 2 displacement sensors detect the thermal error displacement. A thermal error compensation model with good inversion prediction ability is established by applying principal component analysis to optimize the temperature measuring points and extract the characteristic values closely associated with the thermal error displacement, and by using artificial neural network technology.
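
    The modeling chain described, reducing correlated sensor channels with principal component analysis and then regressing displacement on the reduced features, can be sketched with scikit-learn. The sensor data are simulated, and a small MLP stands in for the paper's network:

```python
# PCA over correlated temperature channels, then a neural network maps
# the reduced features to thermal displacement. Data are simulated.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
t = np.linspace(0, 4, 400)                       # hours of spindle running
heat = 1 - np.exp(-t / 1.5)                      # warm-up curve

# 7 sensor channels: shared warm-up signal plus channel-specific noise.
temps = np.outer(heat, rng.uniform(5, 15, 7)) + rng.normal(0, 0.3, (400, 7))
displacement = 20 * heat + rng.normal(0, 0.5, 400)   # microns

model = make_pipeline(
    StandardScaler(),
    PCA(n_components=2),                         # keep dominant thermal modes
    MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0),
)
model.fit(temps[:300], displacement[:300])
print("R^2 on held-out data:", round(model.score(temps[300:], displacement[300:]), 3))
```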

  5. Situational Awareness Analysis Tools for Aiding Discovery of Security Events and Patterns

    National Research Council Canada - National Science Library

    Kumar, Vipin; Kim, Yongdae; Srivastava, Jaideep; Zhang, Zhi-Li; Shaneck, Mark; Chandola, Varun; Liu, Haiyang; Choi, Changho; Simon, Gyorgy; Eilertson, Eric

    2005-01-01

    .... The University of Minnesota team has developed a comprehensive, multi-stage analysis framework which provides tools and analysis methodologies to aid cyber security analysts in improving the quality...

  6. RankProdIt: A web-interactive Rank Products analysis tool

    Directory of Open Access Journals (Sweden)

    Laing Emma

    2010-08-01

    Background: The first objective of a DNA microarray experiment is typically to generate a list of genes or probes that are found to be differentially expressed or represented (in the case of comparative genomic hybridizations and/or copy number variation) between two conditions or strains. Rank Products analysis is a robust algorithm for deriving such lists from microarray experiments that comprise small numbers of replicates, for example fewer than required for the commonly used t-test. Until now, users wishing to apply Rank Products analysis to their own microarray data sets have been restricted to command line-based software, which can limit its usage within the biological community. Findings: We have developed a web interface to existing Rank Products analysis tools that allows users to quickly process their data in an intuitive, step-wise manner and obtain the respective Rank Product or Rank Sum, probability of false prediction, and p-values in a downloadable file. Conclusions: The online interactive Rank Products analysis tool RankProdIt, for analysis of any data set containing measurements for multiple replicated conditions, is available at: http://strep-microarray.sbs.surrey.ac.uk/RankProducts
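
    The Rank Product statistic itself is simple: rank the genes within each replicate, then take the geometric mean of each gene's ranks; consistently small values indicate consistent up-regulation. A sketch on invented fold-changes (the published method additionally derives significance by permutation):

```python
# Rank Product statistic: geometric mean of a gene's ranks across
# replicates. Fold-change data are invented; significance would normally
# be assessed by permuting the expression values.
import numpy as np

rng = np.random.default_rng(5)
n_genes, n_reps = 1000, 3
fold_change = rng.normal(0, 1, (n_genes, n_reps))
fold_change[:5] += 3.0                      # 5 genes truly up-regulated

# Rank 1 = strongest up-regulation, computed within each replicate.
ranks = (-fold_change).argsort(axis=0).argsort(axis=0) + 1
rank_product = np.exp(np.log(ranks).mean(axis=1))   # geometric mean

top = np.argsort(rank_product)[:5]
print("top genes by Rank Product:", top)
```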

  7. Study on Real-Time Simulation Analysis and Inverse Analysis System for Temperature and Stress of Concrete Dam

    Directory of Open Access Journals (Sweden)

    Lei Zhang

    2015-01-01

    In concrete dam construction it is very necessary to strengthen the real-time monitoring and scientific management of concrete temperature control. This paper constructs a simulation analysis and inverse analysis system for temperature stress, based on the various data collected in real time during concrete construction. The system automatically produces the data files for temperature and stress calculation and then achieves remote real-time simulation of temperature stress using high-performance computing techniques, so that inverse analysis can be carried out on the basis of the monitoring data in the database. It performs automatic feedback calculation to a specified error tolerance and generates the corresponding curves and charts after automatically processing and analysing the results. The system automates the complex data preparation and analysis of the simulation process and the complex data adjustment of the inverse analysis process, which facilitates real-time tracking simulation and feedback analysis of concrete temperature stress during construction, enables problems to be discovered and measures to be taken in a timely manner, allows the construction scheme to be adjusted, and helps ensure project quality.

  8. Risk D&D Rapid Prototype: Scenario Documentation and Analysis Tool

    International Nuclear Information System (INIS)

    Unwin, Stephen D.; Seiple, Timothy E.

    2009-01-01

    This report describes the process and methodology associated with a rapid prototype tool for integrating project risk analysis and health and safety (H&S) risk analysis for decontamination and decommissioning (D&D) projects. The objective of the D&D Risk Management Evaluation and Work Sequencing Standardization Project under DOE EM-23 is to recommend or develop practical risk-management tools for decommissioning of nuclear facilities. PNNL has responsibility under this project for recommending or developing computer-based tools that facilitate the evaluation of risks in order to optimize the sequencing of D&D work. PNNL's approach is to adapt, augment, and integrate existing resources rather than to develop a new suite of tools. Methods for the evaluation of H&S risks associated with work in potentially hazardous environments are well-established. Several approaches exist which, collectively, are referred to as process hazard analysis (PHA). A PHA generally involves the systematic identification of accidents, exposures, and other adverse events associated with a given process or work flow. This identification process is usually achieved in a brainstorming environment or by other means of eliciting informed opinion. The likelihoods of adverse events (scenarios) and their associated consequence severities are estimated against pre-defined scales, based on which risk indices are then calculated. A similar process is encoded in various project risk software products that facilitate the quantification of schedule and cost risks associated with adverse scenarios. However, risk models do not generally capture both project risk and H&S risk. The intent of the project reported here is to produce a tool that facilitates the elicitation, characterization, and documentation of both project risk and H&S risk based on defined sequences of D&D activities. By considering alternative D&D sequences, comparison of the predicted risks can
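
    The risk-index arithmetic described, elicited likelihood and severity scores combined into a ranking, has a standard minimal form; the scales and scenario entries below are invented, not drawn from the prototype:

```python
# Minimal risk-index sketch: score each elicited scenario on predefined
# likelihood/severity scales and rank by their product. Entries invented.
scenarios = [
    # (description, likelihood 1-5, consequence severity 1-5)
    ("dropped load during demolition", 2, 4),
    ("worker exposure during pipe cut", 3, 3),
    ("schedule slip from permit delay", 4, 2),
]

ranked = sorted(scenarios, key=lambda s: s[1] * s[2], reverse=True)
for desc, like, sev in ranked:
    print(f"risk index {like * sev:2d}  {desc}")
```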

  9. On exergy analysis of industrial plants and significance of ambient temperature

    Energy Technology Data Exchange (ETDEWEB)

    Rian, Berit

    2011-07-01

    Exergy analysis has been a relatively mature theory for more than 30 years. However, it remains underdeveloped in terms of procedures for optimizing systems, which partly explains why it is not more widely used; misconceptions and prejudices, even among scientists, are also partly to blame. The main objective of this work was to contribute to the development of the understanding and methodology of exergy analysis. The thesis is mainly based on three papers, two of which provide very different examples from existing industrial systems in Norway, showing the societal perspective in terms of resource utilization and thermodynamics. The last paper and the subsequent investigation are limited to certain aspects of ambient conditions. Two Norwegian operational plants have been studied: one operative for close to 30 years (the Kaarstoe steam production and distribution system), while the other has just started its expected 30 years of production (the Snoehvit LNG plant). In addition to mapping the current operational status of these plants, the study of the Kaarstoe steam production and distribution system concluded that the potential for increasing thermodynamic performance through rather cautious actions was significant, whereas the study of the Snoehvit LNG plant showed the considerable benefit the Arctic location provides in terms of reduced fuel consumption. The significance of the ambient temperature led to the study of systems with two ambient bodies (i.e. ambient water and ambient air) of different temperatures; three different systems were investigated: a regenerative steam injection gas turbine (RSTIG), a simple Linde air liquefaction plant (Air Liq) and an air-source heat pump water heater (HPWH). In particular, the effect of the chosen environment on the exergy analysis was negligible for RSTIG, modest for Air Liq and critical for HPWH. It was found that the amount of exergy received from the alternative ambient body, compared to the main exergy flow of

  10. Development of a Method for Tool Wear Analysis Using 3D Scanning

    Directory of Open Access Journals (Sweden)

    Hawryluk Marek

    2017-12-01

    Full Text Available The paper deals with evaluation of a 3D scanning method elaborated by the authors, by applying it to the analysis of the wear of forging tools. The 3D scanning method in the first place consists in the application of scanning to the analysis of changes in geometry of a forging tool by way of comparing the images of a worn tool with a CAD model or an image of a new tool. The method was evaluated in the context of the important measurement problems resulting from the extreme conditions present during the industrial hot forging processes. The method was used to evaluate wear of tools with an increasing wear degree, which made it possible to determine the wear characteristics in a function of the number of produced forgings. The following stage was the use it for a direct control of the quality and geometry changes of forging tools (without their disassembly by way of a direct measurement of the geometry of periodically collected forgings (indirect method based on forgings. The final part of the study points to the advantages and disadvantages of the elaborated method as well as the potential directions of its further development.

  11. Analysis of temperature data at the Olkiluoto

    Energy Technology Data Exchange (ETDEWEB)

    Sedighi, M.; Bennett, D.; Masum, S.; Thomas, H. [Cardiff Univ. (United Kingdom); Johansson, E. [Saanio and Riekkola Oy, Helsinki (Finland)

    2014-03-15

    As part of the rock mechanics monitoring programme 2012 at Olkiluoto, temperature data have been recorded. Temperature data have been measured, collected and monitored at the Olkiluoto site and in ONKALO in various locations, by different methods and in conjunction with other investigations carried out at the site. This report provides a detailed description of the investigation and analysis carried out on the temperature datasets, and aims to provide a better understanding of the in-situ temperature of the rock and soil at the site. Three categories of datasets from the Posiva thermal monitoring programme have been analysed and studied: (i) data collected from the various drillholes during geophysical logging and Posiva Flow Log (PFL) measurements, (ii) measurements in the ONKALO ramp, the investigation niche located at elevation -140 m and a technical room located 437 m below the surface, and (iii) surface temperature measurements from four weather stations and four measurement ditches. Time-series data obtained from the groundwater temperature measurements during the Posiva Flow Log (PFL) tests in drillholes OL-KR1 to KR55 at different depths and years have been analysed. Temperature at a depth of 400 m was found to be in the range of 10 to 11 deg C. The geothermal gradient obtained from the PFL data without pumping was found to be approximately 1.4 deg C/100 m, with relatively uniform temporal and spatial patterns at the repository depth, i.e. at 400 m. The geothermal gradient obtained from the results of the PFL measurements and geophysical loggings indicates similar temperature values at the repository depth, i.e. 400 m. The characteristics of the time series data related to the ONKALO measurements have been obtained through a series of non-uniform discrete Fourier transform analyses. Datasets related to the various chainages and the investigation niche at ONKALO have been studied. The largest variation in the temperature

  12. Analysis of temperature data at the Olkiluoto

    International Nuclear Information System (INIS)

    Sedighi, M.; Bennett, D.; Masum, S.; Thomas, H.; Johansson, E.

    2014-03-01

    As part of the rock mechanics monitoring programme 2012 at Olkiluoto, temperature data have been recorded. Temperature data have been measured, collected and monitored at the Olkiluoto site and in ONKALO in various locations, by different methods and in conjunction with other investigations carried out at the site. This report provides a detailed description of the investigation and analysis carried out on the temperature datasets, and aims to provide a better understanding of the in-situ temperature of the rock and soil at the site. Three categories of datasets from the Posiva thermal monitoring programme have been analysed and studied: (i) data collected from the various drillholes during geophysical logging and Posiva Flow Log (PFL) measurements, (ii) measurements in the ONKALO ramp, the investigation niche located at elevation -140 m and a technical room located 437 m below the surface, and (iii) surface temperature measurements from four weather stations and four measurement ditches. Time-series data obtained from the groundwater temperature measurements during the Posiva Flow Log (PFL) tests in drillholes OL-KR1 to KR55 at different depths and years have been analysed. Temperature at a depth of 400 m was found to be in the range of 10 to 11 deg C. The geothermal gradient obtained from the PFL data without pumping was found to be approximately 1.4 deg C/100 m, with relatively uniform temporal and spatial patterns at the repository depth, i.e. at 400 m. The geothermal gradient obtained from the results of the PFL measurements and geophysical loggings indicates similar temperature values at the repository depth, i.e. 400 m. The characteristics of the time series data related to the ONKALO measurements have been obtained through a series of non-uniform discrete Fourier transform analyses. Datasets related to the various chainages and the investigation niche at ONKALO have been studied. The largest variation in the temperature amplitude of data
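
    A geothermal gradient like the roughly 1.4 deg C/100 m reported here is simply the slope of a linear fit of temperature against depth; a sketch on invented borehole readings:

```python
# Geothermal gradient as the slope of temperature vs. depth.
# Borehole readings are invented to roughly match the reported values.
import numpy as np

depth_m = np.array([100, 200, 300, 400, 500], dtype=float)
temp_c = np.array([6.8, 8.1, 9.6, 10.7, 12.3])

slope, intercept = np.polyfit(depth_m, temp_c, 1)
print(f"gradient: {slope * 100:.2f} deg C/100 m, surface intercept: {intercept:.1f} deg C")
```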

  13. Assessment of Available Numerical Tools for Dynamic Mooring Analysis

    DEFF Research Database (Denmark)

    Thomsen, Jonas Bjerg; Eskilsson, Claes; Ferri, Francesco

    This report covers a preliminary assessment of available numerical tools to be used in the upcoming full dynamic analysis of the mooring systems assessed in the project "Mooring Solutions for Large Wave Energy Converters". The assessment covers potential candidate software and subsequently c...

  14. Numerical Analysis of Exergy for Air-Conditioning Influenced by Ambient Temperature

    Directory of Open Access Journals (Sweden)

    Jing-Nang Lee

    2014-07-01

    This article presents a numerical analysis of exergy for air-conditioning as influenced by ambient temperature. The numerical simulation models an integrated air-conditioning system exposed to varied ambient temperatures in order to observe the response of its four main devices: the compressor, the condenser, the capillary, and the evaporator. Analysing the exergy of the four devices under varying ambient temperature reveals that the capillary shows an unusual increase in exergy loss with rising ambient temperature in comparison to the other devices. The result shows that reducing the exergy loss of the capillary is the key to improving the working efficiency of an air-conditioning system when the influence of ambient temperature is considered: higher ambient temperature causes a larger pressure drop across the capillary and hence more exergy loss.
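
    The capillary is the classic throttling case: enthalpy is conserved, entropy is generated, and the exergy destroyed equals the dead-state temperature times the entropy rise. A sketch using CoolProp, where the refrigerant and state points are assumptions for illustration:

```python
# Exergy destroyed in an adiabatic capillary (throttling): h is constant,
# so destruction = T0 * (s_out - s_in). Fluid and states are assumptions.
from CoolProp.CoolProp import PropsSI

T0 = 308.15                    # ambient/dead-state temperature [K] (35 C)
fluid = "R134a"

p_in, T_in = 1.2e6, 315.0      # subcooled liquid entering capillary [Pa, K]
p_out = 0.35e6                 # evaporator pressure [Pa]

h = PropsSI("H", "P", p_in, "T", T_in, fluid)      # conserved enthalpy
s_in = PropsSI("S", "P", p_in, "T", T_in, fluid)
s_out = PropsSI("S", "P", p_out, "H", h, fluid)    # two-phase exit state

exergy_destroyed = T0 * (s_out - s_in)             # J per kg of refrigerant
print(f"specific exergy destruction: {exergy_destroyed:.0f} J/kg")
```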

  15. The development of a visualization tool for displaying analysis and test results

    International Nuclear Information System (INIS)

    Uncapher, W.L.; Ammerman, D.J.; Ludwigsen, J.S.; Wix, S.D.

    1995-01-01

    The evaluation and certification of packages for the transportation of radioactive materials is performed by analysis, testing, or a combination of both. Within the last few years, many transport packages have been certified using a combination of analysis and testing. The ability to combine and display both kinds of data with interactive graphical tools allows a faster and more complete understanding of the response of the package to these environments. Sandia National Laboratories has developed an initial version of a visualization tool that allows the comparison and display of test and analytical data as part of a Department of Energy-sponsored program to support advanced analytical techniques and test methodologies. The capability of the tool extends to both mechanical (structural) and thermal data.

  16. Automated tool for virtual screening and pharmacology-based pathway prediction and analysis

    Directory of Open Access Journals (Sweden)

    Sugandh Kumar

    2017-10-01

    Virtual screening is an effective tool for lead identification in drug discovery. However, the limited number of available crystal structures compared to the number of biological sequences makes structure-based drug discovery (SBDD) a difficult choice. The current tool is an attempt to automate protein structure modelling and virtual screening followed by pharmacology-based pathway prediction and analysis. Starting from sequence(s), the tool automates protein structure modelling, binding site identification, docking, ligand preparation, post-docking analysis, and the identification of hits in the biological pathways that can be modulated by a group of ligands. This automation helps characterize ligand selectivity and the action of ligands on a complex biological molecular network as well as on individual receptors. A judicious combination of ligands binding different receptors can be used to inhibit selected biological pathways in a disease. The tool also allows the user to systematically investigate network-dependent effects of a drug or drug candidate.

  17. Temperature and composition profile during double-track laser cladding of H13 tool steel

    Science.gov (United States)

    He, X.; Yu, G.; Mazumder, J.

    2010-01-01

    Multi-track laser cladding is now applied commercially in a range of industries such as automotive, mining and aerospace due to its diversified potential for material processing. The knowledge of temperature, velocity and composition distribution history is essential for a better understanding of the process and subsequent microstructure evolution and properties. Numerical simulation not only helps to understand the complex physical phenomena and underlying principles involved in this process, but it can also be used in the process prediction and system control. The double-track coaxial laser cladding with H13 tool steel powder injection is simulated using a comprehensive three-dimensional model, based on the mass, momentum, energy conservation and solute transport equation. Some important physical phenomena, such as heat transfer, phase changes, mass addition and fluid flow, are taken into account in the calculation. The physical properties for a mixture of solid and liquid phase are defined by treating it as a continuum media. The velocity of the laser beam during the transition between two tracks is considered. The evolution of temperature and composition of different monitoring locations is simulated.
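
    While the paper's model couples mass, momentum, energy and solute transport in three dimensions, the core of the temperature-evolution calculation can be illustrated with a one-dimensional explicit finite-difference conduction sketch driven by a transient surface heat flux. All parameter values are invented, and phase change, fluid flow and mass addition are omitted:

```python
# 1D explicit (FTCS) heat conduction with a surface heat flux: a toy
# stand-in for the full 3D cladding model. Parameters are invented;
# phase change, fluid flow and mass addition are omitted.
import numpy as np

nx, dx = 100, 1e-4             # 10 mm deep domain, 0.1 mm cells
alpha, k = 5e-6, 25.0          # diffusivity [m^2/s], conductivity [W/m/K]
dt = 0.4 * dx**2 / alpha       # time step satisfying the FTCS stability limit
T = np.full(nx, 300.0)         # initial temperature [K]
q = 2e6                        # absorbed laser flux while the beam dwells [W/m^2]

for step in range(2000):
    Tn = T.copy()
    T[1:-1] = Tn[1:-1] + alpha * dt * (Tn[2:] - 2 * Tn[1:-1] + Tn[:-2]) / dx**2
    # ghost-node flux boundary at the irradiated surface
    T[0] = Tn[0] + alpha * dt * (2 * (Tn[1] - Tn[0]) + 2 * q * dx / k) / dx**2
    T[-1] = 300.0              # far side held at ambient
    if step == 999:
        q = 0.0                # beam moves to the next track; surface cools

print(f"surface temperature after the pass: {T[0]:.0f} K")
```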

  18. Analysis tools for discovering strong parity violation at hadron colliders

    Science.gov (United States)

    Backović, Mihailo; Ralston, John P.

    2011-07-01

    Several arguments suggest parity violation may be observable in high energy strong interactions. We introduce new analysis tools to describe the azimuthal dependence of multiparticle distributions, or “azimuthal flow.” Analysis uses the representations of the orthogonal group O(2) and dihedral groups D_N necessary to define parity completely in two dimensions. Classification finds that collective angles used in event-by-event statistics represent inequivalent tensor observables that cannot generally be represented by a single “reaction plane.” Many new parity-violating observables exist that have never been measured, while many parity-conserving observables formerly lumped together are now distinguished. We use the concept of “event-shape sorting” to suggest separating right- and left-handed events, and we discuss the effects of transverse and longitudinal spin. The analysis tools are statistically robust, and can be applied equally to low or high multiplicity events at the Tevatron, RHIC or RHIC Spin, and the LHC.

  19. Analysis tools for discovering strong parity violation at hadron colliders

    International Nuclear Information System (INIS)

    Backovic, Mihailo; Ralston, John P.

    2011-01-01

    Several arguments suggest parity violation may be observable in high energy strong interactions. We introduce new analysis tools to describe the azimuthal dependence of multiparticle distributions, or 'azimuthal flow'. Analysis uses the representations of the orthogonal group O(2) and dihedral groups DN necessary to define parity completely in two dimensions. Classification finds that collective angles used in event-by-event statistics represent inequivalent tensor observables that cannot generally be represented by a single 'reaction plane'. Many new parity-violating observables exist that have never been measured, while many parity-conserving observables formerly lumped together are now distinguished. We use the concept of 'event-shape sorting' to suggest separating right- and left-handed events, and we discuss the effects of transverse and longitudinal spin. The analysis tools are statistically robust, and can be applied equally to low or high multiplicity events at the Tevatron, RHIC or RHIC Spin, and the LHC.

  20. Comparison study of inelastic analysis codes for high temperature structure

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Bum; Lee, H. Y.; Park, C. K.; Geon, G. P.; Lee, J. H

    2004-02-01

    LMR high temperature structures subjected to operating and transient loadings may exhibit very complex deformation behaviors due to the use of ductile materials such as 316SS, so systematic analysis technology for high temperature structures is essential for reliable safety assessment. In this project, a comparative study of the developed inelastic analysis program NONSTA and existing analysis codes was performed, applying various types of loading including non-proportional loading. The performance of NONSTA was confirmed and the effect of the inelastic constants on the analysis results was analyzed. The applicability of inelastic analysis was also broadened by applying both the developed program and the existing codes to analyses of the enhanced creep behavior and the elastic follow-up behavior of high temperature structures, and the items needing improvement were identified. Further studies on the improvement of the NONSTA program and on the choice of proper values for the inelastic constants are necessary.

  1. MetaGenyo: a web tool for meta-analysis of genetic association studies.

    Science.gov (United States)

    Martorell-Marugan, Jordi; Toro-Dominguez, Daniel; Alarcon-Riquelme, Marta E; Carmona-Saez, Pedro

    2017-12-16

    Genetic association studies (GAS) aim to evaluate the association between genetic variants and phenotypes. In the last few years, the number of studies of this type has increased exponentially, but the results are not always reproducible due to experimental design, low sample sizes and other methodological errors. In this field, meta-analysis techniques are becoming very popular tools to combine results across studies, increase statistical power and resolve discrepancies in genetic association studies. A meta-analysis summarizes research findings, increases statistical power and enables the identification of genuine associations between genotypes and phenotypes. Meta-analysis techniques are increasingly used in GAS, but the number of published meta-analyses containing errors is also increasing. Although there are several software packages that implement meta-analysis, none of them is specifically designed for genetic association studies, and in most cases their use requires advanced programming or scripting expertise. We have developed MetaGenyo, a web tool for meta-analysis in GAS. MetaGenyo implements a complete and comprehensive workflow that can be executed in an easy-to-use environment without programming knowledge. MetaGenyo has been developed to guide users through the main steps of a GAS meta-analysis, covering the Hardy-Weinberg test, statistical association for different genetic models, analysis of heterogeneity, testing for publication bias, subgroup analysis and robustness testing of the results. MetaGenyo is a useful tool to conduct comprehensive genetic association meta-analyses. The application is freely available at http://bioinfo.genyo.es/metagenyo/.
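
    As an illustration of the kind of computation such a workflow automates, the following is a minimal sketch of an inverse-variance fixed-effect pooling of odds ratios with a Cochran's Q heterogeneity check. The 2x2 tables are invented example data; MetaGenyo's actual implementation may differ.

```python
# Minimal fixed-effect meta-analysis sketch: pool log odds ratios across
# studies by inverse-variance weighting, then test heterogeneity.
import math

# (case_carrier, case_noncarrier, control_carrier, control_noncarrier)
studies = [(30, 70, 20, 80), (45, 155, 30, 170), (12, 38, 9, 41)]

weights, effects = [], []
for a, b, c, d in studies:
    log_or = math.log((a * d) / (b * c))      # per-study log odds ratio
    var = 1 / a + 1 / b + 1 / c + 1 / d       # Woolf variance estimate
    effects.append(log_or)
    weights.append(1 / var)                   # inverse-variance weight

pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
se = math.sqrt(1 / sum(weights))
print(f"pooled OR = {math.exp(pooled):.2f}, "
      f"95% CI = ({math.exp(pooled - 1.96 * se):.2f}, "
      f"{math.exp(pooled + 1.96 * se):.2f})")

# Cochran's Q statistic for between-study heterogeneity
q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
print(f"Cochran's Q = {q:.2f} on {len(studies) - 1} df")
```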

  2. Special Section on "Tools and Algorithms for the Construction and Analysis of Systems"

    DEFF Research Database (Denmark)

    2006-01-01

    This special section contains the revised and expanded versions of eight of the papers from the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS) held in March/April 2004 in Barcelona, Spain. The conference proceedings appeared as volume 2988 in the Lecture Notes in Computer Science series published by Springer. TACAS is a forum for researchers, developers and users interested in rigorously based tools for the construction and analysis of systems. The conference serves to bridge the gaps between different communities – including but not limited to those devoted to formal methods, software and hardware verification, static analysis, programming languages, software engineering, real-time systems, and communications protocols – that share common interests in, and techniques for, tool development. Other more theoretical papers from the conference...

  3. THE SMALL BODY GEOPHYSICAL ANALYSIS TOOL

    Science.gov (United States)

    Bercovici, Benjamin; McMahon, Jay

    2017-10-01

    The Small Body Geophysical Analysis Tool (SBGAT) that we are developing aims at providing scientists and mission designers with a comprehensive, easy-to-use, open-source analysis tool. SBGAT is meant for seamless generation of valuable simulated data originating from small-body shape models, combined with advanced shape-modification properties. The current status of SBGAT is as follows: The modular software architecture that was specified in the original SBGAT proposal was implemented in the form of two distinct packages: a dynamic library, SBGAT Core, containing the data structure and algorithm backbone of SBGAT, and SBGAT Gui, which wraps the former inside a VTK/Qt user interface to facilitate user/data interaction. This modular development facilitates maintenance and addition of new features. Note that SBGAT Core can be utilized independently from SBGAT Gui. SBGAT is presently hosted on a GitHub repository owned by SBGAT's main developer. This repository is public and can be accessed at https://github.com/bbercovici/SBGAT. Along with the commented code, one can find the code documentation at https://bbercovici.github.io/sbgat-doc/index.html. This code documentation is constantly updated in order to reflect new functionalities. SBGAT's user's manual is available at https://github.com/bbercovici/SBGAT/wiki. This document contains a comprehensive tutorial indicating how to retrieve, compile and run SBGAT from scratch. Some of the upcoming development goals are listed hereafter. First, SBGAT's dynamics module will be extended: the PGM algorithm is the only type of analysis method currently implemented, so future work will consist of broadening SBGAT's capabilities with the Spherical Harmonics Expansion of the gravity field and the calculation of YORP coefficients. Second, synthetic measurements will soon be available within SBGAT. The software should be able to generate synthetic observations of different types (radar, lightcurve, point clouds

  4. High Thermal Conductivity and High Wear Resistance Tool Steels for cost-effective Hot Stamping Tools

    Science.gov (United States)

    Valls, I.; Hamasaiid, A.; Padré, A.

    2017-09-01

    In hot stamping/press hardening, in addition to its shaping function, the tool controls the cycle time, the quality of the stamped components (through determining the cooling rate of the stamped blank), the production costs and the feasibility frontier for stamping a given component. During stamping, heat is extracted from the stamped blank and transported through the tool to the cooling medium in the cooling lines. Hence, the tool's thermal properties determine the cooling rate of the blank, the heat transport mechanism, stamping times and temperature distribution. The tool surface's resistance to adhesive and abrasive wear is also an important cost factor, as it determines tool durability and maintenance costs. Wear is influenced by many tool material parameters, such as the microstructure, composition, hardness level and distribution of strengthening phases, as well as the tool's working temperature. A decade ago, Rovalma developed a hot work tool steel for hot stamping that features a thermal conductivity of more than double that of any conventional hot work tool steel. Since that time, many complementary grades have been developed in order to provide tailored material solutions as a function of the production volume, degree of blank cooling and wear resistance requirements, tool geometries, tool manufacturing method, type and thickness of the blank material, etc. Recently, Rovalma has developed a new generation of high thermal conductivity, high wear resistance tool steel grades that enable the manufacture of cost-effective tools for hot stamping, increasing process productivity and reducing tool manufacturing costs and lead times. Both of these novel grades feature high wear resistance and high thermal conductivity to enhance tool durability and cut cycle times in the production process of hot stamped components. Furthermore, one of these new grades reduces tool manufacturing costs through low tool material cost and hardening through readily

  5. Interactive exploratory data analysis tool in Alzheimer’s disease

    Directory of Open Access Journals (Sweden)

    Diana Furcila

    2015-04-01

    Thus, MorExAn provides us with the possibility of relating histopathological data to neuropsychological and clinical variables. The aid of this interactive visualization tool brings the possibility of finding unexpected conclusions beyond the insight provided by simple statistical analysis, as well as improving neuroscientists' productivity.

  6. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, C.; Horowitz, S.

    2008-07-01

    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  7. Nanosensors as Reservoir Engineering Tools to Map Insitu Temperature Distributions in Geothermal Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Morgan Ames

    2011-06-15

    The feasibility of using nanosensors to measure temperature distribution and predict thermal breakthrough in geothermal reservoirs is addressed in this report. Four candidate sensors were identified: melting tin-bismuth alloy nanoparticles, silica nanoparticles with covalently-attached dye, hollow silica nanoparticles with encapsulated dye and impermeable melting shells, and dye-polymer composite time-temperature indicators. Four main challenges associated with the successful implementation of temperature nanosensors were identified: nanoparticle mobility in porous and fractured media, the collection and detection of nanoparticles at the production well, engineering temperature sensing mechanisms that are both detectable and irreversible, and inferring the spatial geolocation of temperature measurements in order to map temperature distribution. Initial experiments were carried out to investigate each of these challenges. It was demonstrated in a slim-tube injection experiment that it is possible to transport silica nanoparticles over large distances through porous media. The feasibility of magnetic collection of nanoparticles from produced fluid was evaluated experimentally, and it was estimated that 3% of the injected nanoparticles were recovered in a prototype magnetic collection device. An analysis technique was tailored to nanosensors with a dye-release mechanism to estimate temperature measurement geolocation by analyzing the return curve of the released dye. This technique was used in a hypothetical example problem, and good estimates of geolocation were achieved. Tin-bismuth alloy nanoparticles were synthesized using a sonochemical method, and a bench heating experiment was performed using these nanoparticles. Particle growth due to melting was observed, indicating that tin-bismuth nanoparticles have potential as temperature nanosensors.

  8. Spaceborne Differential SAR Interferometry: Data Analysis Tools for Deformation Measurement

    Directory of Open Access Journals (Sweden)

    Michele Crosetto

    2011-02-01

    Full Text Available This paper is focused on spaceborne Differential Interferometric SAR (DInSAR) for land deformation measurement and monitoring. In the last two decades several DInSAR data analysis procedures have been proposed. The objective of this paper is to describe the DInSAR data processing and analysis tools developed at the Institute of Geomatics in almost ten years of research activities. Four main DInSAR analysis procedures are described, which range from the standard DInSAR analysis based on a single interferogram to more advanced Persistent Scatterer Interferometry (PSI) approaches. These different procedures guarantee sufficient flexibility in DInSAR data processing. In order to provide a technical insight into these analysis procedures, a whole section discusses their main data processing and analysis steps, especially those needed in PSI analyses. A specific section is devoted to the core of our PSI analysis tools: the so-called 2+1D phase unwrapping procedure, which couples a 2D phase unwrapping, performed interferogram-wise, with a kind of 1D phase unwrapping along time, performed pixel-wise. In the last part of the paper, some examples of DInSAR results are discussed, which were derived by standard DInSAR or PSI analyses. Most of these results were derived from X-band SAR data coming from the TerraSAR-X and CosmoSkyMed sensors.

  9. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

    Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heteregeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools

  10. Spectral wave analysis at the mesopause from SCIAMACHY airglow data compared to SABER temperature spectra

    Directory of Open Access Journals (Sweden)

    M. Ern

    2009-01-01

    Full Text Available Space-time spectral analysis of satellite data is an important method to derive a synoptic picture of the atmosphere from measurements sampled asynoptically by satellite instruments. In addition, it serves as a powerful tool to identify and separate different wave modes in the atmospheric data. In our work we present space-time spectral analyses of chemical heating rates derived from Scanning Imaging Absorption SpectroMeter for Atmospheric CHartographY (SCIAMACHY) hydroxyl nightglow emission measurements onboard Envisat for the years 2002–2006 at mesopause heights. Since SCIAMACHY nightglow hydroxyl emission measurements are restricted to the ascending (nighttime) part of the satellite orbit, our analysis also includes temperature spectra derived from 15 μm CO2 emissions measured by the Sounding of the Atmosphere using Broadband Emission Radiometry (SABER) instrument. SABER offers better temporal and spatial coverage (daytime and night-time values of temperature) and a more regular sampling grid. Therefore SABER spectra also contain information about higher frequency waves. Comparison of SCIAMACHY and SABER results shows that SCIAMACHY, in spite of its observational restrictions, provides valuable information on most of the wave modes present in the mesopause region. The main differences between wave spectra obtained from these sensors can be attributed to the differences in their sampling patterns.
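
    For readers unfamiliar with space-time spectral analysis, a minimal sketch on a synthetic, regularly sampled field is given below; real satellite sampling is asynoptic and requires the more careful treatment used in the paper.

```python
# Minimal space-time (wavenumber-frequency) spectral analysis sketch on a
# synthetic field sampled on a regular longitude/time grid.
import numpy as np

nlon, nt = 144, 120                       # 2.5 deg grid, 120 daily samples
lon = np.linspace(0, 2 * np.pi, nlon, endpoint=False)
t = np.arange(nt)

# Synthetic travelling wave: zonal wavenumber 1, period 5 days
field = np.cos(1 * lon[None, :] + 2 * np.pi / 5 * t[:, None])

spec = np.fft.fft2(field)                 # axes: (frequency, wavenumber)
power = np.abs(spec) ** 2 / spec.size

# The peak location identifies the dominant wavenumber/frequency pair
f_idx, k_idx = np.unravel_index(np.argmax(power), power.shape)
freq = np.fft.fftfreq(nt)[f_idx]          # cycles per day (sign = direction)
wavenum = np.fft.fftfreq(nlon, d=1 / nlon)[k_idx]
print(f"dominant mode: k = {wavenum:.0f}, frequency = {freq:.3f} /day")
```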

  11. Customer Data Analysis Model using Business Intelligence Tools in Telecommunication Companies

    Directory of Open Access Journals (Sweden)

    Monica LIA

    2015-10-01

    Full Text Available This article presents a customer data analysis model for a telecommunication company and business intelligence tools for data modelling, transformation, data visualization and dynamic report building. For a mature market, knowing the information inside the data and making forecasts for strategic decisions is becoming more important in the Romanian market. Business intelligence tools are used in business organizations as support for decision making.

  12. High-Temperature Structural Analysis of a Small-Scale Prototype of a Process Heat Exchanger (IV) - Macroscopic High-Temperature Elastic-Plastic Analysis -

    International Nuclear Information System (INIS)

    Song, Kee Nam; Hong, Sung Deok; Park, Hong Yoon

    2011-01-01

    A PHE (Process Heat Exchanger) is a key component required to transfer heat energy of 950 °C generated in a VHTR (Very High Temperature Reactor) to a chemical reaction that yields a large quantity of hydrogen. A small-scale PHE prototype made of Hastelloy-X was scheduled for testing in a small-scale gas loop at the Korea Atomic Energy Research Institute. In this study, as a part of the evaluation of the high-temperature structural integrity of the PHE prototype, high-temperature structural analysis modeling and macroscopic thermal and elastic-plastic structural analysis of the PHE prototype were carried out under the gas-loop test conditions as a preliminary study before carrying out the performance test in the gas loop. The results obtained in this study will be used to design the performance test setup for the modified PHE prototype.

  13. An ontological knowledge based system for selection of process monitoring and analysis tools

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    The increasing number of process monitoring and analysis tools for a wide range of operations has made their selection a difficult, time-consuming and challenging task. Therefore, an efficient and systematic knowledge base coupled with an inference system is necessary to support the optimal selection of process monitoring and analysis tools, satisfying the process and user constraints. A knowledge base consisting of the process knowledge as well as knowledge on measurement methods and tools has been developed. An ontology has been designed for knowledge representation and management. The developed knowledge base has a dual feature. On the one... procedures has been developed to retrieve the data/information stored in the knowledge base....

  14. Thermal analysis of high temperature phase transformations of steel

    Directory of Open Access Journals (Sweden)

    K. Gryc

    2013-10-01

    Full Text Available A series of thermal analysis measurements of the high temperature phase transformations of a real grain-oriented electrical steel grade was carried out using two analytical devices (Netzsch STA 449 F3 Jupiter; Setaram SETSYS 18TM). Two thermoanalytical methods were used (DTA and direct thermal analysis), with samples of different weights (200 mg and 23 g). The stability/reproducibility of the results obtained by these methodologies was verified. The liquidus and solidus temperatures were determined for close-to-equilibrium conditions and during cooling (20 °C/min; 80 °C/min). It has been shown that higher cooling rates lead to lower temperatures for the start and end of the solidification process of the studied steel grade.

  15. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    Ure, James; Chen, Haofeng; Tipping, David

    2014-01-01

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  16. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    Science.gov (United States)

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  17. Colossal Tooling Design: 3D Simulation for Ergonomic Analysis

    Science.gov (United States)

    Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid

    2003-01-01

    The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

  18. Basic statistical tools in research and data analysis

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2016-01-01

    Full Text Available Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretations and reporting the research findings. Statistical analysis gives meaning to meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.

  19. Standardised risk analysis as a communication tool

    International Nuclear Information System (INIS)

    Pluess, Ch.; Montanarini, M.; Bernauer, M.

    1998-01-01

    Full text of publication follows: several European countries require a risk analysis for the production, storage or transport of dangerous goods. This requirement imposes considerable administrative effort on some sectors of the industry. In order to minimize the effort of such studies, a generic risk analysis for an industrial sector proved helpful. Standardised procedures can consequently be derived for efficient performance of the risk investigations. This procedure was successfully established in Switzerland for natural gas transmission lines and fossil fuel storage plants. The development process of the generic risk analysis involved an intense discussion between industry and authorities about the methodology of assessment and the criteria of acceptance. This process finally led to scientifically consistent modelling tools for risk analysis and to improved communication from the industry to the authorities and the public. As a recent example, the Holland-Italy natural gas transmission pipeline is presented, where this method was successfully employed. Although this pipeline traverses densely populated areas in Switzerland, the risk problems could be solved without delaying the planning process by using this established communication method. (authors)

  20. Temperature control of fimbriation circuit switch in uropathogenic Escherichia coli: quantitative analysis via automated model abstraction.

    Science.gov (United States)

    Kuwahara, Hiroyuki; Myers, Chris J; Samoilov, Michael S

    2010-03-26

    Uropathogenic Escherichia coli (UPEC) represent the predominant cause of urinary tract infections (UTIs). A key UPEC molecular virulence mechanism is type 1 fimbriae, whose expression is controlled by the orientation of an invertible chromosomal DNA element-the fim switch. Temperature has been shown to act as a major regulator of fim switching behavior and is overall an important indicator as well as functional feature of many urologic diseases, including UPEC host-pathogen interaction dynamics. Given this panoptic physiological role of temperature during UTI progression and notable empirical challenges to its direct in vivo studies, in silico modeling of corresponding biochemical and biophysical mechanisms essential to UPEC pathogenicity may significantly aid our understanding of the underlying disease processes. However, rigorous computational analysis of biological systems, such as fim switch temperature control circuit, has hereto presented a notoriously demanding problem due to both the substantial complexity of the gene regulatory networks involved as well as their often characteristically discrete and stochastic dynamics. To address these issues, we have developed an approach that enables automated multiscale abstraction of biological system descriptions based on reaction kinetics. Implemented as a computational tool, this method has allowed us to efficiently analyze the modular organization and behavior of the E. coli fimbriation switch circuit at different temperature settings, thus facilitating new insights into this mode of UPEC molecular virulence regulation. In particular, our results suggest that, with respect to its role in shutting down fimbriae expression, the primary function of FimB recombinase may be to effect a controlled down-regulation (rather than increase) of the ON-to-OFF fim switching rate via temperature-dependent suppression of competing dynamics mediated by recombinase FimE. Our computational analysis further implies that this down
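
    A minimal sketch of the kind of discrete-stochastic simulation such reaction-kinetics models rest on is shown below: a Gillespie simulation of a toy two-state ON/OFF switch with temperature-dependent flip rates. The rate constants and their temperature dependence are invented stand-ins, not the paper's model.

```python
# Minimal Gillespie (SSA) sketch of a two-state switch whose flipping
# propensities depend on temperature (Arrhenius-like form, invented).
import math, random

def flip_rate(state, temp_k):
    # Hypothetical temperature dependence of the flip propensities
    k_on_to_off = 0.5 * math.exp(-2000.0 * (1 / temp_k - 1 / 310.0))
    k_off_to_on = 0.1 * math.exp(-4000.0 * (1 / temp_k - 1 / 310.0))
    return k_on_to_off if state == "ON" else k_off_to_on

def simulate(temp_k, t_end=1000.0, seed=1):
    random.seed(seed)
    t, state, time_on = 0.0, "ON", 0.0
    while t < t_end:
        a = flip_rate(state, temp_k)
        dt = -math.log(random.random()) / a   # exponential waiting time
        if state == "ON":
            time_on += min(dt, t_end - t)
        t += dt
        state = "OFF" if state == "ON" else "ON"
    return time_on / t_end

for temp in (298.0, 310.0):
    print(f"T = {temp} K: fraction of time ON = {simulate(temp):.2f}")
```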

  1. Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools

    Science.gov (United States)

    Orr, Stanley A.; Narducci, Robert P.

    2009-01-01

    A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.

  2. Integrated analysis tools for trade studies of spacecraft controller and sensor locations

    Science.gov (United States)

    Rowell, L. F.

    1986-01-01

    The present investigation was conducted with the aim to evaluate the practicality and difficulties of modern control design methods for large space structure controls. The evaluation is used as a basis for the identification of useful computer-based analysis tools which would provide insight into control characteristics of a spacecraft concept. A description is presented of the wrap-rib antenna and its packaging concept. Attention is given to active control requirements, a mathematical model of structural dynamics, aspects of sensor and actuator location, the analysis approach, controllability, observability, the concept of balanced realization, transmission zeros, singular value plots, analysis results, model reduction, and an interactive computer program. It is pointed out that the application of selected control analysis tools to the wrap-rib antenna demonstrates several capabilities which can be useful during conceptual design.
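
    A minimal sketch of the controllability/observability computations mentioned above is given below: Gramians obtained from Lyapunov equations and the Hankel singular values underlying the balanced-realization step. The two-state system matrices are invented for illustration.

```python
# Controllability/observability Gramians of a toy stable LTI system, and
# the Hankel singular values used in balanced realization and reduction.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.array([[-1.0, 1.0], [0.0, -2.0]])   # stable state matrix
B = np.array([[0.0], [1.0]])               # actuator location
C = np.array([[1.0, 0.0]])                 # sensor location

# Wc solves A Wc + Wc A^T + B B^T = 0; Wo solves A^T Wo + Wo A + C^T C = 0
Wc = solve_continuous_lyapunov(A, -B @ B.T)
Wo = solve_continuous_lyapunov(A.T, -C.T @ C)

hsv = np.sqrt(np.linalg.eigvals(Wc @ Wo).real)
print("Hankel singular values:", np.sort(hsv)[::-1])
```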

  3. Rotorcraft Optimization Tools: Incorporating Rotorcraft Design Codes into Multi-Disciplinary Design, Analysis, and Optimization

    Science.gov (United States)

    Meyn, Larry A.

    2018-01-01

    One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package, RotorCraft Optimization Tools (RCOTOOLS), is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multi-disciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute these codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including Graphical User Interface (GUI) tools for browsing input and output files in order to identify text strings that are used to identify specific variables as optimization input and response variables. This paper provides an overview of RCOTOOLS and its use

  4. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  5. Ionic liquid thermal stabilities: decomposition mechanisms and analysis tools.

    Science.gov (United States)

    Maton, Cedric; De Vos, Nils; Stevens, Christian V

    2013-07-07

    The increasing number of papers published on ionic liquids generates an extensive quantity of data. The thermal stability data of divergent ionic liquids are collected in this paper with attention to the experimental set-up. The influence and importance of the latter parameters are broadly addressed. Both ramped-temperature and isothermal thermogravimetric analysis are discussed, along with state-of-the-art methods such as TGA-MS and pyrolysis-GC. The strengths and weaknesses of the different methodologies known to date demonstrate that analysis methods should be in line with the application. The combination of data from advanced analysis methods allows us to obtain in-depth information on the degradation processes. Aided by computational methods, the kinetics and thermodynamics of thermal degradation are revealed piece by piece. The better understanding of the behaviour of ionic liquids at high temperature allows selective and application-driven design, as well as mathematical prediction for engineering purposes.
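
    As a hedged illustration of the kinetic analysis described, the following sketch fits Arrhenius parameters to synthetic isothermal decomposition rate constants; the numbers are invented and the paper's methods are considerably more sophisticated.

```python
# Arrhenius fit from isothermal TGA-style decomposition rates:
# ln k = ln A - Ea/(R T), so ln k is linear in 1/T.
import numpy as np

temps_k = np.array([500.0, 525.0, 550.0, 575.0])    # isothermal holds, K
k_obs = np.array([1.2e-5, 4.1e-5, 1.3e-4, 3.8e-4])  # rate constants, 1/s

slope, intercept = np.polyfit(1.0 / temps_k, np.log(k_obs), 1)
R = 8.314  # J/(mol K)
print(f"Ea ~ {-slope * R / 1000:.0f} kJ/mol, "
      f"A ~ {np.exp(intercept):.2e} 1/s")
```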

  6. Screening of Gas-Cooled Reactor Thermal-Hydraulic and Safety Analysis Tools and Experimental Database

    International Nuclear Information System (INIS)

    Lee, Won Jae; Kim, Min Hwan; Lee, Seung Wook

    2007-08-01

    This report is the final report of the I-NERI Project, 'Screening of Gas-cooled Reactor Thermal Hydraulic and Safety Analysis Tools and Experimental Database', jointly carried out by KAERI, ANL and INL. In this study, we developed the basic technologies required to develop and validate the VHTR TH/safety analysis tools and evaluated the TH/safety database information. The research tasks consist of: 1) code qualification methodology (INL), 2) high-level PIRTs for a major nucleus set of events (KAERI, ANL, INL), 3) initial scaling and scoping analysis (ANL, KAERI, INL), 4) filtering of TH/safety tools (KAERI, INL), 5) evaluation of TH/safety database information (KAERI, INL, ANL) and 6) key scoping analysis (KAERI). The code qualification methodology identifies the role of PIRTs in the R&D process and the bottom-up and top-down code validation methods. Since the design of the VHTR is still evolving, we generated the high-level PIRTs referencing the 600MWth block-type GT-MHR and the 400MWth pebble-type PBMR. The nucleus set of events that represents the VHTR safety and operational transients consists of the enveloping scenarios of HPCC (high pressure conduction cooling: loss of primary flow), LPCC/Air-Ingress (low pressure conduction cooling: loss of coolant), LC (load changes: power maneuvering), ATWS (anticipated transients without scram: reactivity insertion), WS (water ingress: water-interfacing system break) and HU (hydrogen-side upset: loss of heat sink). The initial scaling analysis defines dimensionless parameters that need to be reflected in mixed convection modeling, and the initial scoping analysis provided the reference system transients used in the PIRTs generation. For the PIRTs phenomena, we evaluated the modeling capability of the candidate TH/safety tools and derived model improvement needs. By surveying and evaluating the TH/safety database information, a tools V&V matrix has been developed. Through the key scoping analysis using the available database, the modeling

  7. Screening of Gas-Cooled Reactor Thermal-Hydraulic and Safety Analysis Tools and Experimental Database

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Kim, Min Hwan; Lee, Seung Wook (and others)

    2007-08-15

    This report is the final report of the I-NERI Project, 'Screening of Gas-cooled Reactor Thermal Hydraulic and Safety Analysis Tools and Experimental Database', jointly carried out by KAERI, ANL and INL. In this study, we developed the basic technologies required to develop and validate the VHTR TH/safety analysis tools and evaluated the TH/safety database information. The research tasks consist of: 1) code qualification methodology (INL), 2) high-level PIRTs for a major nucleus set of events (KAERI, ANL, INL), 3) initial scaling and scoping analysis (ANL, KAERI, INL), 4) filtering of TH/safety tools (KAERI, INL), 5) evaluation of TH/safety database information (KAERI, INL, ANL) and 6) key scoping analysis (KAERI). The code qualification methodology identifies the role of PIRTs in the R&D process and the bottom-up and top-down code validation methods. Since the design of the VHTR is still evolving, we generated the high-level PIRTs referencing the 600MWth block-type GT-MHR and the 400MWth pebble-type PBMR. The nucleus set of events that represents the VHTR safety and operational transients consists of the enveloping scenarios of HPCC (high pressure conduction cooling: loss of primary flow), LPCC/Air-Ingress (low pressure conduction cooling: loss of coolant), LC (load changes: power maneuvering), ATWS (anticipated transients without scram: reactivity insertion), WS (water ingress: water-interfacing system break) and HU (hydrogen-side upset: loss of heat sink). The initial scaling analysis defines dimensionless parameters that need to be reflected in mixed convection modeling, and the initial scoping analysis provided the reference system transients used in the PIRTs generation. For the PIRTs phenomena, we evaluated the modeling capability of the candidate TH/safety tools and derived model improvement needs. By surveying and evaluating the TH/safety database information, a tools V&V matrix has been developed. Through the key scoping analysis using the available database, the

  8. Experience with conventional inelastic analysis procedures in very high temperature applications

    International Nuclear Information System (INIS)

    Mallett, R.H.; Thompson, J.M.; Swindeman, R.W.

    1991-01-01

    Conventional incremental plasticity and creep analysis procedures for inelastic analysis are applied to hot flue gas cleanup system components. These flue gas systems operate at temperatures where plasticity and creep are very much intertwined, while the two phenomena are treated separately in the conventional inelastic analysis procedure. Data for RA333 material are represented in forms appropriate for the conventional inelastic analysis procedures. Behavior is predicted for typical operating cycles. Creep-fatigue damage is estimated based upon usage fractions. Excessive creep damage is predicted; the major contributions occur during high-stress short-term intervals caused by rapid temperature changes. These results are presented and discussed in terms of creep-fatigue damage for very high temperature applications.

  9. Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, Marc

    2015-04-21

    This presentation describes the Hydrogen Financial Analysis Scenario Tool, H2FAST, and provides an overview of each of the three H2FAST formats: the H2FAST web tool, the H2FAST Excel spreadsheet, and the H2FAST Business Case Scenario (BCS) tool. Examples are presented to illustrate the types of questions that H2FAST can help answer.

  10. Investigation of approximate models of experimental temperature characteristics of machines

    Science.gov (United States)

    Parfenov, I. V.; Polyakov, A. N.

    2018-05-01

    This work is devoted to the investigation of various approaches to the approximation of experimental data and the creation of simulation mathematical models of thermal processes in machines, with the aim of finding ways to reduce the duration of their field tests and the temperature-induced machining error. The main research methods used by the authors in this work are: full-scale thermal testing of machines; realization of various approaches to the approximation of experimental temperature characteristics of machine tools by polynomial models; and analysis and evaluation of the modelling results (model quality) for the temperature characteristics of machines and their derivatives up to the third order in time. As a result of the performed research, rational methods, and the type, parameters and complexity of simulation mathematical models of thermal processes in machine tools, are proposed.
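
    A minimal sketch of the polynomial-approximation approach under discussion is shown below, fitted to a synthetic heating curve and differentiated up to the orders the study considers.

```python
# Fit a polynomial model to an experimental-style temperature characteristic
# and inspect its time derivatives. The heating-curve data are synthetic.
import numpy as np

t = np.linspace(0, 240, 25)                        # test time, minutes
rng = np.random.default_rng(0)
temp = 20 + 15 * (1 - np.exp(-t / 60)) + rng.normal(0, 0.1, t.size)

model = np.poly1d(np.polyfit(t, temp, deg=3))      # cubic approximation
d1, d3 = model.deriv(1), model.deriv(3)            # 1st and 3rd derivatives

rms = np.sqrt(np.mean((model(t) - temp) ** 2))
print(f"RMS fit error = {rms:.3f} K")
print(f"dT/dt at t=60 min = {d1(60):.4f} K/min")
print(f"d3T/dt3 at t=60 min = {d3(60):.6f} K/min^3")
```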

  11. Curie temperature determination via thermogravimetric and continuous wavelet transformation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hasier, John; Nash, Philip [Thermal Processing Technology Center, IIT, Chicago, IL (United States); Riolo, Maria Annichia [University of Michigan, Center for the Study of Complex Systems, Ann Arbor, MI (United States)

    2017-12-15

    A cost effective method for conversion of a vertical tube thermogravimetric analysis system into a magnetic balance capable of measuring Curie Temperatures is presented. Reference and preliminary experimental data generated using this system is analyzed via a general-purpose wavelet based Curie point edge detection technique allowing for enhanced speed, ease and repeatability of magnetic balance data analysis. The Curie temperatures for a number of Heusler compounds are reported. (orig.)
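
    A minimal sketch of a wavelet-style Curie-point edge detection is given below: convolve the (here synthetic) magnetic-balance signal with derivative-of-Gaussian kernels at several scales and take the strongest interior response. This is a generic continuous-wavelet edge detector, not the authors' exact algorithm.

```python
# Wavelet-style edge detection on a synthetic apparent-weight trace whose
# magnetic contribution vanishes near the (assumed) Curie point of 627 C.
import numpy as np

temp = np.linspace(300, 800, 1000)                  # deg C sweep
weight = 1.0 / (1.0 + np.exp((temp - 627.0) / 2.0))

best_resp, best_temp = 0.0, None
for scale in (5, 10, 20):                           # kernel width in samples
    x = np.arange(-4 * scale, 4 * scale + 1)
    kernel = -x * np.exp(-x**2 / (2.0 * scale**2))  # derivative of Gaussian
    response = np.abs(np.convolve(weight, kernel, mode="same"))
    response[:4 * scale] = 0.0                      # mask convolution edge
    response[-4 * scale:] = 0.0                     # artifacts at boundaries
    i = int(np.argmax(response))
    if response[i] > best_resp:
        best_resp, best_temp = response[i], temp[i]

print(f"detected Curie point ~ {best_temp:.0f} deg C")
```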

  12. Conception of a PWR simulator as a tool for safety analysis

    International Nuclear Information System (INIS)

    Lanore, J.M.; Bernard, P.; Romeyer Dherbey, J.; Bonnet, C.; Quilchini, P.

    1982-09-01

    A simulator can be a very useful tool for safety analysis to study accident sequences involving malfunctions of the systems and operator interventions. The main characteristics of the simulator SALAMANDRE (description of the systems, physical models, programming organization, control desk) have thus been selected according to the objectives of safety analysis.

  13. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state result in a large amount of data per sale site. Sale summaries for an individual sale…

  14. Mobility analysis tool based on the fundamental principle of conservation of energy.

    Energy Technology Data Exchange (ETDEWEB)

    Spletzer, Barry Louis; Nho, Hyuchul C.; Salton, Jonathan Robert

    2007-08-01

    In the past decade, a great deal of effort has been focused on research and development of versatile robotic ground vehicles without understanding their performance in a particular operating environment. As the usage of robotic ground vehicles for intelligence applications increases, understanding the mobility of the vehicles becomes critical to increasing the probability of their successful operations. This paper describes a framework based on conservation of energy to predict the maximum mobility of robotic ground vehicles over general terrain. The basis of the prediction is the difference between traction capability and energy loss at the vehicle-terrain interface. The mission success of a robotic ground vehicle is primarily a function of mobility, defined as the overall capability of a vehicle to move from place to place while retaining its ability to perform its primary mission. The mobility analysis tool described in this document is a graphical user interface application developed at Sandia National Laboratories, Albuquerque, NM. The tool is at an initial stage of development; in the future, it will be expanded to include all vehicles and terrain types.
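
    A minimal sketch of the energy-balance idea (traction capability versus losses at the vehicle-terrain interface) is shown below; all vehicle and terrain parameters are invented for illustration.

```python
# A slope is traversable while available traction work exceeds the losses
# at the wheel-terrain interface plus the gravitational climb term.
import math

def max_slope_deg(mass_kg, mu_traction, rolling_resist):
    """Largest slope (deg) where net tractive effort is non-negative."""
    g = 9.81
    for angle in range(0, 90):
        a = math.radians(angle)
        traction = mu_traction * mass_kg * g * math.cos(a)    # usable grip
        losses = (rolling_resist * math.cos(a) + math.sin(a)) * mass_kg * g
        if traction < losses:                                 # energy deficit
            return angle - 1
    return 89

print("max climbable slope:", max_slope_deg(50.0, 0.6, 0.08), "deg")
```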

  15. Assessing the Possibility of Implementing Tools of Technical Analysis for Real Estate Market Analysis

    Directory of Open Access Journals (Sweden)

    Brzezicka Justyna

    2016-06-01

    Full Text Available Technical analysis (TA) and its different aspects are widely used to study the capital market. In the traditional approach, this analysis is used to determine the probability of changes in current rates on the basis of their past changes, accounting for factors which had, have or may have an influence on shaping the supply and demand of a given asset. In the practical sense, TA is a set of techniques used for assessing the value of an asset based on the analysis of the asset's trajectories as well as statistical tools.

  16. ISAC: A tool for aeroservoelastic modeling and analysis

    Science.gov (United States)

    Adams, William M., Jr.; Hoadley, Sherwood Tiffany

    1993-01-01

    The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules is discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. Linear time invariant state-space equations of motion that result are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle which illustrates some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.

  17. Operations other than war: Requirements for analysis tools research report

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  18. Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA) Users' Guide

    Science.gov (United States)

    Csank, Jeffrey T.; Zinnecker, Alicia M.

    2014-01-01

    The tool for turbine engine closed-loop transient analysis (TTECTrA) is a semi-automated control design tool for subsonic aircraft engine simulations. At a specific flight condition, TTECTrA produces a basic controller designed to meet user-defined goals and containing only the fundamental limiters that affect the transient performance of the engine. The purpose of this tool is to provide the user a preliminary estimate of the transient performance of an engine model without the need to design a full nonlinear controller.

  19. Temperature control characteristics analysis of lead-cooled fast reactor with natural circulation

    International Nuclear Information System (INIS)

    Yang, Minghan; Song, Yong; Wang, Jianye; Xu, Peng; Zhang, Guangyu

    2016-01-01

    Highlights: • The LFR temperature control system is analyzed with a frequency domain method. • The temperature control compensator is designed according to the frequency analysis. • Dynamic simulation is performed with SIMULINK and RELAP5-HD. - Abstract: The Lead-cooled Fast Reactor (LFR) with natural circulation in the primary system is among the highlights of advanced nuclear reactor research, due to its great superiority in reactor safety and reliability. In this work, a transfer function matrix describing the coolant temperature dynamic process, obtained by Laplace transform of the one-dimensional system dynamic model, is developed in order to investigate the temperature control characteristics of the LFR. Based on the transfer function matrix, a closed-loop coolant temperature control system without a compensator is built. The frequency domain analysis indicates that the stability and steady-state behavior of the temperature control system need to be improved. Accordingly, a temperature compensator based on Proportion-Integration (PI) and feed-forward control is designed. The dynamic simulation of the whole system with the temperature compensator for a core power step change is performed with SIMULINK and RELAP5-HD. The result shows that the temperature compensator can provide superior coolant temperature control capabilities in an LFR with natural circulation, demonstrating the efficiency of the frequency domain analysis method.
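
    A minimal sketch of the frequency-domain workflow described is given below: a toy first-order plant as a crude stand-in for the coolant-temperature dynamics, a PI compensator, and the resulting crossover and phase margin. All plant and controller numbers are invented.

```python
# Open-loop frequency response of plant G(s) = K/(tau s + 1) with a PI
# compensator C(s) = (kp s + ki)/s, evaluated via a Bode plot.
import numpy as np
from scipy import signal

K, tau = 2.0, 50.0          # invented plant gain and time constant
kp, ki = 1.5, 0.05          # invented PI compensator gains

open_loop = signal.TransferFunction(
    np.polymul([K], [kp, ki]),               # numerator: K (kp s + ki)
    np.polymul([tau, 1.0], [1.0, 0.0]))      # denominator: (tau s + 1) s

w, mag_db, phase_deg = signal.bode(open_loop, w=np.logspace(-4, 1, 500))
i0 = np.argmin(np.abs(mag_db))               # gain crossover: |L(jw)| = 0 dB
print(f"gain crossover ~ {w[i0]:.3f} rad/s, "
      f"phase margin ~ {180.0 + phase_deg[i0]:.1f} deg")
```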

  20. Experimental Investigation on Tool Wear Behavior and Cutting Temperature during Dry Machining of Carbon Steel SAE 1030 Using KC810 and KC910 Coated Inserts

    Directory of Open Access Journals (Sweden)

    Y. Tamerabet

    2018-03-01

    Full Text Available The removal of cutting fluids and lubrication in dry machining operations requires a good knowledge and full control of all the mechanisms that lead to tool damage. In order to optimize dry machining operations, it is necessary to clearly identify the wear patterns, determine the contact conditions and define the relationship between the contact parameters and the operating conditions. The idea is to choose optimal cutting conditions which lead to the best contact conditions, limiting the triggering or aggravation of wear phenomena. The purpose of this paper is to determine the impact of multilayer coatings and cutting parameters on tool wear and on the temperature at the tool-chip interface for two types of commercialized coated carbide inserts (KC810 and KC910) during dry turning of carbon steel SAE 1030, in order to determine the ideal parameters and guarantee the best performance of the cutting tools. Cutting temperature and crater and flank wear have been systematically recorded in order to determine their influence on tool life time. To ensure the optimum choice of machining conditions, the Taguchi method associated with a multi-factorial method was applied to plan the experiments. It was noted that cutting speed was the most influential factor on temperature and wear evolution. It was also noted that the KC810 insert was more suitable for machining SAE 1030 carbon steel, with the best tool life being recorded (T = 228 min). The KC810 inserts offer 30 min of additional machining time under the same working conditions.

  1. Suspended Cell Culture ANalysis (SCAN) Tool to Enhance ISS On-Orbit Capabilities, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Aurora Flight Sciences and partner, Draper Laboratory, propose to develop an on-orbit immuno-based label-free Suspension Cell Culture ANalysis tool, SCAN tool, which...

  2. Thermal analysis of annular fins with temperature-dependent thermal properties

    Institute of Scientific and Technical Information of China (English)

    I. G. AKSOY

    2013-01-01

    The thermal analysis of the annular rectangular profile fins with variable thermal properties is investigated by using the homotopy analysis method (HAM). The thermal conductivity and heat transfer coefficient are assumed to vary with a linear and power-law function of temperature, respectively. The effects of the thermal-geometric fin parameter and the thermal conductivity parameter variations on the temperature distribution and fin efficiency are investigated for different heat transfer modes. Results from the HAM are compared with numerical results of the finite difference method (FDM). It can be seen that the variation of dimensionless parameters has a significant effect on the temperature distribution and fin efficiency.
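
    For reference, a generic form of the fin equation such studies solve is sketched below, with a linear k(T) and a power-law h(T) as the abstract states; the exact formulation and nondimensionalization in the paper may differ.

```latex
% Generic annular-fin energy balance with temperature-dependent properties
% (assumed form; 2*delta = fin thickness, T_inf = ambient, T_b = base
% temperature, k_a and h_b = reference property values)
\[
\frac{1}{r}\frac{\mathrm{d}}{\mathrm{d}r}\!\left[ r\,k(T)\,
  \frac{\mathrm{d}T}{\mathrm{d}r} \right]
  = \frac{h(T)}{\delta}\,\bigl(T - T_\infty\bigr),
\qquad
k(T) = k_a\bigl[1 + \beta\,(T - T_\infty)\bigr],
\qquad
h(T) = h_b\!\left(\frac{T - T_\infty}{T_b - T_\infty}\right)^{\!n}.
\]
```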

  3. Experiment and calculation of reinforced concrete at elevated temperatures

    CERN Document Server

    Guo, Zhenhai

    2011-01-01

    Concrete as a construction material goes through both physical and chemical changes under extreme elevated temperatures. As one of the most widely used building materials, it is important that both engineers and architects are able to understand and predict its behavior under extreme heat conditions. Brief and readable, this book provides the tools and techniques to properly analyze the effects of high temperature on reinforced concrete, which will lead to more stable, safer structures. Based on years of the author's research, Reinforced Concrete at Elevated Temperatures four par

  4. Prediction of the wear and evolution of cutting tools in a carbide / titanium-aluminum-vanadium machining tribosystem by volumetric tool wear characterization and modeling

    Science.gov (United States)

    Kuttolamadom, Mathew Abraham

    The objective of this research work is to create a comprehensive microstructural wear mechanism-based predictive model of tool wear in the tungsten carbide / Ti-6Al-4V machining tribosystem, and to develop a new topology characterization method for worn cutting tools in order to validate the model predictions. This is accomplished by blending first principle wear mechanism models using a weighting scheme derived from scanning electron microscopy (SEM) imaging and energy dispersive x-ray spectroscopy (EDS) analysis of tools worn under different operational conditions. In addition, the topology of worn tools is characterized through scanning by white light interferometry (WLI), and then application of an algorithm to stitch and solidify data sets to calculate the volume of the tool worn away. The methodology was to first combine and weight dominant microstructural wear mechanism models, to be able to effectively predict the tool volume worn away. Then, by developing a new metrology method for accurately quantifying the bulk-3D wear, the model-predicted wear was validated against worn tool volumes obtained from corresponding machining experiments. On analyzing worn crater faces using SEM/EDS, adhesion was found dominant at lower surface speeds, while dissolution wear dominated with increasing speeds -- this is in conformance with the lower relative surface speed requirement for micro welds to form and rupture, essentially defining the mechanical load limit of the tool material. It also conforms to the known dominance of high temperature-controlled wear mechanisms with increasing surface speed, which is known to exponentially increase temperatures especially when machining Ti-6Al-4V due to its low thermal conductivity. Thus, straight tungsten carbide wear when machining Ti-6Al-4V is mechanically-driven at low surface speeds and thermally-driven at high surface speeds. Further, at high surface speeds, craters were formed due to carbon diffusing to the tool surface and
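
    A minimal sketch of the weighting scheme described, blending first-principle wear-rate models with SEM/EDS-derived weights, is given below. The model forms, constants and weights are all illustrative stand-ins, not the dissertation's calibrated models.

```python
# Weighted blend of two wear mechanisms: adhesion dominates at low surface
# speed, thermally activated dissolution dominates at high speed.
import math

def adhesion_rate(speed, load):
    return 1e-4 * load / max(speed, 1e-6)      # micro-weld rupture term

def dissolution_rate(speed, temp_k):
    return 5e-2 * speed * math.exp(-8000.0 / temp_k)   # Arrhenius-like

def cutting_temp(speed):
    return 400.0 + 6.0 * speed                 # crude speed-temperature map

def worn_volume(speed, load, time_min, w_adh):
    """Weighted blend; w_adh shifts toward dissolution as speed rises."""
    rate = (w_adh * adhesion_rate(speed, load)
            + (1.0 - w_adh) * dissolution_rate(speed, cutting_temp(speed)))
    return rate * time_min

for v, w in ((30.0, 0.8), (90.0, 0.3)):        # m/min, SEM/EDS-style weight
    print(f"v = {v:>4} m/min: predicted wear volume "
          f"= {worn_volume(v, 500.0, 20.0, w):.4f} mm^3")
```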

  5. Sustainability Tools Inventory Initial Gap Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  6. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Evaluation of a project at any stage of its life cycle, especially at its planning stage, is necessary for its successful execution and completion. The Logical Framework Analysis or the Logical Framework Approach (LFA) is an essential tool in designing such evaluation because it is a process that serves as a reference guide in ...

  7. GANALYZER: A TOOL FOR AUTOMATIC GALAXY IMAGE ANALYSIS

    International Nuclear Information System (INIS)

    Shamir, Lior

    2011-01-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ∼10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.

  8. Ganalyzer: A Tool for Automatic Galaxy Image Analysis

    Science.gov (United States)

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
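
    For readers who want the gist of the radial-intensity approach described in the two Ganalyzer records above, here is a rough, hypothetical re-implementation of the idea (not the Ganalyzer source): locate the intensity centroid, sample rings of increasing radius, track the brightest angle on each ring, and use its slope against radius as a spirality measure.

    ```python
    import numpy as np

    # Rough, hypothetical re-implementation of the radial-intensity idea
    # (not the Ganalyzer source). `img` is a 2-D grayscale galaxy image.
    def spirality(img):
        ys, xs = np.indices(img.shape)
        total = img.sum()
        cy, cx = (ys * img).sum() / total, (xs * img).sum() / total  # intensity centroid
        angles = np.linspace(0, 2 * np.pi, 360, endpoint=False)
        radii = np.arange(5, min(img.shape) // 2 - 1, 2)
        peak_angle = []
        for r in radii:                       # brightest angle on each ring
            y = np.clip((cy + r * np.sin(angles)).astype(int), 0, img.shape[0] - 1)
            x = np.clip((cx + r * np.cos(angles)).astype(int), 0, img.shape[1] - 1)
            peak_angle.append(angles[np.argmax(img[y, x])])
        slope = np.polyfit(radii, np.unwrap(peak_angle), 1)[0]   # rad per pixel of radius
        return abs(slope)                     # ~0 for elliptical, larger for spirals

    demo = np.random.rand(128, 128)           # stand-in image
    print(f"spirality ~ {spirality(demo):.4f}")
    ```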

  9. Simple Strategic Analysis Tools at SMEs in Ecuador

    Directory of Open Access Journals (Sweden)

    Diego H. Álvarez Peralta

    2015-06-01

    Full Text Available This article explores the possible applications of Strategic Analysis Tools (SAT) in SMEs located in emerging countries such as Ecuador (where there are no formal studies on the subject). It analyzes whether it is feasible to effectively apply a set of proposed tools to guide the mental maps of executives when decisions on strategy have to be made. Through an in-depth review of the state of the art regarding SAT, together with interviews of key participants such as chambers of commerce and executives of different firms, the feasibility of their application is shown. This analysis is complemented with specialists' interviews to deepen the insights and obtain valid conclusions. The conclusion is that SMEs can smoothly develop and apply an appropriate set of SAT when facing highly relevant choices. However, some obstacles remain to be solved, connected with resources (such as people's abilities and technology), behavioral (cultural) factors, and methodological processes. Once these barriers are removed, current approaches to strategic decision-making could become even more effective. This is a qualitative investigation with a non-experimental, transversal research design, as it relates to a specific moment in time.

  10. CFD Analysis of the Fuel Temperature in High Temperature Gas-Cooled Reactors

    International Nuclear Information System (INIS)

    In, W. K.; Chun, T. H.; Lee, W. J.; Chang, J. H.

    2005-01-01

    High temperature gas-cooled reactors (HTGR) have received renewed interest as potential sources for future energy needs, particularly for hydrogen production. Among the HTGRs, the pebble bed reactor (PBR) and the prismatic modular reactor (PMR) are considered as the nuclear heat source in Korea's nuclear hydrogen development and demonstration project. The PBR uses coated fuel particles embedded in spherical graphite fuel pebbles. The fuel pebbles flow down through the core during operation. The PMR uses graphite fuel blocks which contain cylindrical fuel compacts consisting of the fuel particles. The fuel blocks also contain coolant passages and locations for absorber and control material. The maximum fuel temperature at the core hot spot is one of the important design parameters for both the PBR and the PMR. The objective of this study is to predict the fuel temperature distributions in the PBR and the PMR using a computational fluid dynamics (CFD) code, CFX-5. The reference reactor designs used in this analysis are the PBMR400 and the GT-MHR600

  11. A Student Assessment Tool for Standardized Patient Simulations (SAT-SPS): Psychometric analysis.

    Science.gov (United States)

    Castro-Yuste, Cristina; García-Cabanillas, María José; Rodríguez-Cornejo, María Jesús; Carnicer-Fuentes, Concepción; Paloma-Castro, Olga; Moreno-Corral, Luis Javier

    2018-05-01

    The evaluation of the level of clinical competence acquired by the student is a complex process that must meet various requirements to ensure its quality. The psychometric analysis of the data collected by the assessment tools used is a fundamental aspect to guarantee the student's competence level. To conduct a psychometric analysis of an instrument which assesses clinical competence in nursing students at simulation stations with standardized patients in OSCE-format tests. The construct of clinical competence was operationalized as a set of observable and measurable behaviors, measured by the newly-created Student Assessment Tool for Standardized Patient Simulations (SAT-SPS), which was comprised of 27 items. The categories assigned to the items were 'incorrect or not performed' (0), 'acceptable' (1), and 'correct' (2). 499 nursing students. Data were collected by two independent observers during the assessment of the students' performance at a four-station OSCE with standardized patients. Descriptive statistics were used to summarize the variables. The difficulty levels and floor and ceiling effects were determined for each item. Reliability was analyzed using internal consistency and inter-observer reliability. The validity analysis was performed considering face validity, content and construct validity (through exploratory factor analysis), and criterion validity. Internal reliability and inter-observer reliability were higher than 0.80. The construct validity analysis suggested a three-factor model accounting for 37.1% of the variance. These three factors were named 'Nursing process', 'Communication skills', and 'Safe practice'. A significant correlation was found between the scores obtained and the students' grades in general, as well as with the grades obtained in subjects with clinical content. The assessment tool has proven to be sufficiently reliable and valid for the assessment of the clinical competence of nursing students using standardized patients

  12. The NOAA Local Climate Analysis Tool - An Application in Support of a Weather Ready Nation

    Science.gov (United States)

    Timofeyeva, M. M.; Horsfall, F. M.

    2012-12-01

    Citizens across the U.S., including decision makers from the local to the national level, have a multitude of questions about climate, such as the current state and how that state fits into the historical context, and more importantly, how climate will impact them, especially with regard to linkages to extreme weather events. Developing answers to these types of questions for locations has typically required extensive work to gather data, conduct analyses, and generate relevant explanations and graphics. Too frequently providers don't have ready access to or knowledge of reliable, trusted data sets, nor sound, scientifically accepted analysis techniques such that they can provide a rapid response to queries they receive. In order to support National Weather Service (NWS) local office forecasters with information they need to deliver timely responses to climate-related questions from their customers, we have developed the Local Climate Analysis Tool (LCAT). LCAT uses the principles of artificial intelligence to respond to queries, in particular, through use of machine technology that responds intelligently to input from users. A user translates customer questions into primary variables and issues and LCAT pulls the most relevant data and analysis techniques to provide information back to the user, who in turn responds to their customer. Most responses take on the order of 10 seconds, which includes providing statistics, graphical displays of information, translations for users, metadata, and a summary of the user request to LCAT. Applications in Phase I of LCAT, which is targeted for the NWS field offices, include Climate Change Impacts, Climate Variability Impacts, Drought Analysis and Impacts, Water Resources Applications, Attribution of Extreme Events, and analysis techniques such as time series analysis, trend analysis, compositing, and correlation and regression techniques. Data accessed by LCAT are homogenized historical COOP and Climate Prediction Center

  13. Dynamic analysis of the CTAR (constant temperature adsorption refrigeration) cycle

    International Nuclear Information System (INIS)

    Hassan, H.Z.; Mohamad, A.A.; Al-Ansary, H.A.; Alyousef, Y.M.

    2014-01-01

    The basic SAR (solar-driven adsorption refrigeration) machine is an intermittent cold production system. Recently, the CO-SAR (continuous operation solar-powered adsorption refrigeration) system was developed. The CO-SAR machine is based on the theoretical CTAR (constant temperature adsorption refrigeration) cycle, in which the adsorption process takes place at a constant temperature equal to the ambient temperature. Practically, there should be a temperature gradient between the adsorption bed and the surrounding atmosphere to provide a driving potential for heat transfer. In the present study, the dynamic analysis of the CTAR cycle is developed. This analysis provides a comparison between the theoretical and the dynamic operation of the CTAR cycle. The developed dynamic model is based on the D-A adsorption equilibrium equation and the energy and mass balances in the adsorption reactor. Results obtained from the present work demonstrate that the idealization of the constant-temperature adsorption process in the theoretical CTAR cycle is not far from the real situation and can be approached. Furthermore, enhancing the heat transfer between the adsorption bed and the ambient during the bed pre-cooling process helps accelerate the heat rejection from the adsorption reactor and therefore approach the isothermal process. - Highlights: • The dynamic analysis of the CTAR (constant temperature adsorption refrigeration) cycle is developed. • The CTAR theoretical and dynamic cycles are compared. • The dynamic cycle approaches the ideal one by enhancing the bed precooling

  14. Modal Analysis and Experimental Determination of Optimum Tool Shank Overhang of a Lathe Machine

    Directory of Open Access Journals (Sweden)

    Nabin SARDAR

    2008-12-01

    Full Text Available Vibration of the tool shank of a cutting tool has a large influence on the tolerances and surface finish of products. The frequency and amplitude of vibration depend on the overhang of the shank of the cutting tool. In turning operations, when the tool overhang is about 2 times the tool height, the amplitude of the vibration is almost zero and the dimensional tolerances and surface finish of the product become high. In this paper, the above statement is verified firstly by a finite element analysis of the cutting tool with the ANSYS software package and secondly by experimental verification with a piezoelectric sensor.
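
    A quick way to see why the overhang matters is the textbook cantilever estimate below; it is only a back-of-envelope sketch with hypothetical shank dimensions, not the paper's ANSYS model, but it shows how the first natural frequency drops as the overhang grows (frequency scales as 1/L^2).

    ```python
    import numpy as np

    # First bending natural frequency of the shank treated as a rectangular
    # cantilever: f1 = (lambda1^2 / (2*pi)) * sqrt(E*I / (rho*A*L^4)), lambda1 = 1.875.
    # Dimensions are hypothetical stand-ins for a typical lathe tool shank.
    E, rho = 210e9, 7850.0             # steel, Pa and kg/m^3
    b = h = 0.020                      # 20 mm square shank
    A, I = b * h, b * h**3 / 12.0
    for L in (0.02, 0.04, 0.06, 0.08):              # overhang, m
        f1 = (1.875**2 / (2 * np.pi)) * np.sqrt(E * I / (rho * A * L**4))
        print(f"overhang {L*1000:.0f} mm: f1 ~ {f1/1000:.1f} kHz")
    ```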

  15. First results from the in-situ temperature measurements by the newly developed downhole tool during the drilling cruise in the hydrothermal fields of the mid-Okinawa Trough

    Science.gov (United States)

    Kitada, K.; Wu, H. Y.; Miyazaki, J.; Akiyama, K.; Nozaki, T.; Ishibashi, J. I.; Kumagai, H.; Maeda, L.

    2016-12-01

    The Okinawa Trough is an active backarc basin behind the Ryukyu subduction zone and exhibits active rifting associated with extension of the continental margin. Temperature measurement in this area is essential for understanding the hydrothermal system and hydraulic structure. During the CK16-01 cruise in March 2016, we conducted in-situ temperature measurements with the newly developed downhole tool, TRDT (Thermo-Resistant Downhole Thermometer), in hydrothermal fields of the mid-Okinawa Trough. The purpose of these measurements is to investigate the in-situ temperature structure in deep-hot zones and its variation after coring and/or drilling. The TRDT was designed by JAMSTEC as a memory downhole tool to measure in-situ borehole temperature under extremely high temperature environments. The first trial was conducted during the CK14-04 cruise by free-fall deployment to reduce the operation time. However, no temperature data were recorded due to the strong vibration during the operation. After the CK14-04 cruise, the TRDT was modified to improve its resistance to vibration and shock. The improved TRDT passed high-temperature, vibration, and shock tests to ensure the data acquisition of borehole logging. During the CK16-01 cruise, we successfully collected in-situ temperature data from a hydrothermal borehole in the Iheya North Knoll with a wireline system for the first time. The temperature at a depth of 187 mbsf continued to increase almost linearly from 220 to 245°C during the 20-minute measurement. This suggests that the inside of the borehole was cooled down by pumping seawater through drill pipes during the coring and the lowering of the TRDT tool to the bottom hole. The in-situ temperatures were extrapolated with an exponential curve using nonlinear least-squares fitting, and the estimated equilibrium temperature was 278°C. To recover the in-situ temperature more precisely, the measurement time should be kept as long as possible within the temperature rating. The operational
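
    The extrapolation step described above can be sketched as follows; the data here are synthetic stand-ins for the 20-minute warming record, and the model is the usual exponential approach to equilibrium, T(t) = T_eq - (T_eq - T0)*exp(-t/tau).

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Synthetic stand-in for a borehole warming record (not the CK16-01 data).
    rng = np.random.default_rng(0)
    t = np.linspace(0, 20, 41)                       # minutes
    T_obs = 278 - (278 - 220) * np.exp(-t / 25) + rng.normal(0, 0.3, t.size)

    def model(t, T_eq, T0, tau):
        # exponential approach to the equilibrium temperature T_eq
        return T_eq - (T_eq - T0) * np.exp(-t / tau)

    popt, _ = curve_fit(model, t, T_obs, p0=(260.0, 220.0, 10.0))
    print(f"estimated equilibrium temperature ~ {popt[0]:.1f} degC")
    ```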

  16. Evaluating control displays with the Engineering Control Analysis Tool (ECAT)

    International Nuclear Information System (INIS)

    Plott, B.

    2006-01-01

    In the Nuclear Power Industry increased use of automated sensors and advanced control systems is expected to reduce and/or change manning requirements. However, critical questions remain regarding the extent to which safety will be compromised if the cognitive workload associated with monitoring multiple automated systems is increased. Can operators/engineers maintain an acceptable level of performance if they are required to supervise multiple automated systems and respond appropriately to off-normal conditions? The interface to/from the automated systems must provide the information necessary for making appropriate decisions regarding intervention in the automated process, but be designed so that the cognitive load is neither too high nor too low for the operator who is responsible for the monitoring and decision making. This paper will describe a new tool that was developed to enhance the ability of human systems integration (HSI) professionals and systems engineers to identify operational tasks in which a high potential for human overload and error can be expected. The tool is entitled the Engineering Control Analysis Tool (ECAT). ECAT was designed and developed to assist in the analysis of: Reliability Centered Maintenance (RCM), operator task requirements, human error probabilities, workload prediction, potential control and display problems, and potential panel layout problems. (authors)

  17. Evaluating control displays with the Engineering Control Analysis Tool (ECAT)

    Energy Technology Data Exchange (ETDEWEB)

    Plott, B. [Alion Science and Technology, MA and D Operation, 4949 Pearl E. Circle, 300, Boulder, CO 80301 (United States)

    2006-07-01

    In the Nuclear Power Industry increased use of automated sensors and advanced control systems is expected to reduce and/or change manning requirements. However, critical questions remain regarding the extent to which safety will be compromised if the cognitive workload associated with monitoring multiple automated systems is increased. Can operators/engineers maintain an acceptable level of performance if they are required to supervise multiple automated systems and respond appropriately to off-normal conditions? The interface to/from the automated systems must provide the information necessary for making appropriate decisions regarding intervention in the automated process, but be designed so that the cognitive load is neither too high nor too low for the operator who is responsible for the monitoring and decision making. This paper will describe a new tool that was developed to enhance the ability of human systems integration (HSI) professionals and systems engineers to identify operational tasks in which a high potential for human overload and error can be expected. The tool is entitled the Engineering Control Analysis Tool (ECAT). ECAT was designed and developed to assist in the analysis of: Reliability Centered Maintenance (RCM), operator task requirements, human error probabilities, workload prediction, potential control and display problems, and potential panel layout problems. (authors)

  18. Investigation of tool engagement and cutting performance in machining a pocket

    Science.gov (United States)

    Adesta, E. Y. T.; Hamidon, R.; Riza, M.; Alrashidi, R. F. F. A.; Alazemi, A. F. F. S.

    2018-01-01

    This study investigates the variation of tool engagement for different cutting profiles. In addition, the behavior of cutting force and cutting temperature for different tool engagements when machining a pocket is also explored. Initially, simple tool engagement models were developed for peripheral and slot cutting with different types of corner. Based on these models, the tool engagements for contour and zig-zag tool path strategies for a rectangular pocket with dimensions 80 mm x 60 mm were analyzed. Experiments were conducted to investigate the effect of tool engagement on cutting force and cutting temperature for the machining of a pocket of AISI H13 material. The cutting parameters used were 150 m/min cutting speed, 0.05 mm/tooth feed, and 0.1 mm depth of cut. The results show that there exists a relationship between cutting force, cutting temperature, and tool engagement: a higher cutting force and cutting temperature are obtained when the cutting tool goes through up milling and when the cutting tool makes full engagement with the workpiece.
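
    As a geometric aside (not the authors' engagement model), the engagement angle for peripheral milling follows directly from the radial depth of cut ae and the tool diameter D, which makes clear why slotting (ae = D) and corner passes produce force and temperature peaks.

    ```python
    import numpy as np

    # Engagement angle for peripheral milling: arccos(1 - 2*ae/D).
    # A slot (ae = D) gives the full 180 degrees of engagement.
    def engagement_angle_deg(ae, D):
        return np.degrees(np.arccos(1.0 - 2.0 * ae / D))

    D = 10.0                                   # mm, hypothetical end mill
    for ae in (1.0, 2.5, 5.0, 10.0):           # radial depths of cut, mm
        print(f"ae = {ae:4.1f} mm -> engagement = {engagement_angle_deg(ae, D):6.1f} deg")
    ```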

  19. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2006-01-01

    The aim of this research is to look into integrated digital design and analysis tools in order to find out if it is suited for use by architects and designers or only by specialists and technicians - and if not, then to look at what can be done to make them more available to architects and design...

  20. INTERDEPENDENCE BETWEEN DRY DAYS AND TEMPERATURE OF SYLHET REGION: CORRELATION ANALYSIS

    Directory of Open Access Journals (Sweden)

    Syed Mustakim Ali Shah

    2016-01-01

    Full Text Available Climate change can have a profound impact on weather conditions around the world, such as heavy rainfall, drought, global warming and so on. Understanding and predicting these natural variations is now a key research challenge for a disaster-prone country like Bangladesh. This study focuses on the north-eastern part of Bangladesh, a hilly region that plays an important role in the ecological balance of the country along with its socio-economic development. The present study analyses the behavior of maximum temperature and dry days using different statistical tools. Pearson's correlation matrix and Mann-Kendall's tau are used to correlate monthly dry days with monthly maximum temperature, and also to assess their annual trends. A moderate correlation was found mostly in dry summer months. In addition, a positive trend was observed in the Mann-Kendall trend test of yearly temperature, which might be an indication of global warming in this region.
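
    The two statistics named above are standard; a minimal sketch on synthetic stand-in series (not the Sylhet data) looks like this, with the Mann-Kendall trend test expressed as Kendall's tau between the series and a time index.

    ```python
    import numpy as np
    from scipy.stats import pearsonr, kendalltau

    rng = np.random.default_rng(0)
    dry_days = rng.integers(5, 28, size=120)                  # 10 years of months
    tmax = 25 + 0.25 * dry_days + rng.normal(0, 1.5, 120)     # built-in moderate link

    r, p_r = pearsonr(dry_days, tmax)                         # Pearson's correlation
    yearly_t = 29 + 0.03 * np.arange(30) + rng.normal(0, 0.4, 30)
    tau, p_tau = kendalltau(np.arange(30), yearly_t)          # Mann-Kendall vs. time index

    print(f"Pearson r = {r:.2f} (p = {p_r:.3f}); MK tau = {tau:.2f} (p = {p_tau:.3f})")
    ```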

  1. EpiTools: An Open-Source Image Analysis Toolkit for Quantifying Epithelial Growth Dynamics.

    Science.gov (United States)

    Heller, Davide; Hoppe, Andreas; Restrepo, Simon; Gatti, Lorenzo; Tournier, Alexander L; Tapon, Nicolas; Basler, Konrad; Mao, Yanlan

    2016-01-11

    Epithelia grow and undergo extensive rearrangements to achieve their final size and shape. Imaging the dynamics of tissue growth and morphogenesis is now possible with advances in time-lapse microscopy, but a true understanding of their complexities is limited by the automated image analysis tools available to extract quantitative data. To overcome such limitations, we have designed a new open-source image analysis toolkit called EpiTools. It provides user-friendly graphical user interfaces for accurately segmenting and tracking the contours of cell membrane signals obtained from 4D confocal imaging. It is designed for a broad audience, especially biologists with no computer-science background. Quantitative data extraction is integrated into a larger bioimaging platform, Icy, to increase the visibility and usability of our tools. We demonstrate the usefulness of EpiTools by analyzing Drosophila wing imaginal disc growth, revealing previously overlooked properties of this dynamic tissue, such as the patterns of cellular rearrangements. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  2. Measurement and analysis of reactivity temperature coefficient of CEFR

    International Nuclear Information System (INIS)

    Chen Yiyu; Hu Yun; Yang Xiaoyan; Fan Zhendong; Zhang Qiang; Zhao Jinkun; Li Zehua

    2013-01-01

    The reactivity temperature coefficient of CEFR was calculated by the CITATION program and compared with the results calculated by related programs and with values measured in temperature-effect experiments. The results indicate that the CITATION calculations agree well with the measured values. The reactivity temperature coefficient of CEFR is about -4 pcm/℃. The deviation of the measured values between the temperature-increasing and temperature-decreasing processes is about 11%, which satisfies the experiment acceptance criteria. The measured results validate the program calculations and provide important reference data for the safe operation of CEFR and for the analysis of the reactivity balance in the reactor refueling situation. (authors)

  3. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa D.; Malin, Jane T.; Fleming, Land D.

    2013-09-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  4. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  5. Risk analysis as a decision tool

    International Nuclear Information System (INIS)

    Yadigaroglu, G.; Chakraborty, S.

    1985-01-01

    From 1983 to 1985 a lecture series entitled ''Risk-benefit analysis'' was held at the Swiss Federal Institute of Technology (ETH), Zurich, in cooperation with the Central Department for the Safety of Nuclear Installations of the Swiss Federal Agency of Energy Economy. In that setting, the value of risk-oriented evaluation models as a decision tool in safety questions was discussed on a broad basis. Experts of international reputation from the Federal Republic of Germany, France, Canada, the United States and Switzerland contributed reports to this joint volume on the uses of such models. Following an introductory synopsis on risk analysis and risk assessment, the book deals with practical examples in the fields of medicine, nuclear power, chemistry, transport and civil engineering. Particular attention is paid to the dialogue between analysts and decision makers, taking into account economic-technical aspects and social values. The recent chemical disaster in the Indian city of Bhopal again signals the necessity of such analyses. All the lectures were recorded individually. (orig./HP) [de

  6. Temperature analysis of laser ignited metalized material using spectroscopic technique

    Science.gov (United States)

    Bassi, Ishaan; Sharma, Pallavi; Daipuriya, Ritu; Singh, Manpreet

    2018-05-01

    Temperature measurement of laser-ignited aluminized nano-energetic mixtures using spectroscopy has great scope in analysing material characteristics and combustion. Spectroscopic analysis enables an in-depth study of the combustion of materials that is difficult to achieve with standard pyrometric methods. Laser ignition was used because it consumes less energy than electric ignition, while the ignited material dissipates the same energy, and with the same impact, as under electric ignition. The presented research is primarily focused on the temperature analysis of an energetic material comprising explosive material mixed with nano-material and ignited by laser. A spectroscopy technique is used to estimate the temperature during the ignition process. The nano-energetic mixture used in the research does not contain any material that is sensitive to high impact.
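
    The record does not state which spectroscopic method was used, so the following is only a generic two-colour (ratio) pyrometry sketch, a common way to estimate temperature from emission spectra under a Wien approximation and a grey-body assumption; the analysis wavelengths are hypothetical.

    ```python
    import numpy as np

    C2 = 1.4388e-2                     # second radiation constant, m*K
    lam1, lam2 = 650e-9, 750e-9        # hypothetical analysis wavelengths, m

    def ratio_temperature(I1, I2):
        # invert the Wien-approximation intensity ratio for temperature
        lnR = np.log(I1 / I2)
        return C2 * (1/lam1 - 1/lam2) / (5 * np.log(lam2/lam1) - lnR)

    # check: synthesize Wien intensities at 2500 K and recover the temperature
    T_true = 2500.0
    I = lambda lam: lam**-5 * np.exp(-C2 / (lam * T_true))
    print(f"recovered T ~ {ratio_temperature(I(lam1), I(lam2)):.0f} K")
    ```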

  7. Spectral wave analysis at the mesopause from SCIAMACHY airglow data compared to SABER temperature spectra

    Directory of Open Access Journals (Sweden)

    M. Ern

    2009-01-01

    Full Text Available Space-time spectral analysis of satellite data is an important method to derive a synoptic picture of the atmosphere from measurements sampled asynoptically by satellite instruments. In addition, it serves as a powerful tool to identify and separate different wave modes in atmospheric data. In our work we present space-time spectral analyses of chemical heating rates derived from Scanning Imaging Absorption SpectroMeter for Atmospheric CHartographY (SCIAMACHY) hydroxyl nightglow emission measurements onboard Envisat for the years 2002–2006 at mesopause heights.

    Since SCIAMACHY nightglow hydroxyl emission measurements are restricted to the ascending (nighttime) part of the satellite orbit, our analysis also includes temperature spectra derived from 15 μm CO2 emissions measured by the Sounding of the Atmosphere using Broadband Emission Radiometry (SABER) instrument. SABER offers better temporal and spatial coverage (daytime and nighttime values of temperature) and a more regular sampling grid. Therefore SABER spectra also contain information about higher-frequency waves.

    Comparison of SCIAMACHY and SABER results shows that SCIAMACHY, in spite of its observational restrictions, provides valuable information on most of the wave modes present in the mesopause region. The main differences between wave spectra obtained from these sensors can be attributed to the differences in their sampling patterns.
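
    A minimal space-time spectral analysis of the kind described above can be sketched with a 2-D FFT over (time, longitude); the input below is a synthetic wavenumber-1 wave with a 2-day period, not SCIAMACHY or SABER data.

    ```python
    import numpy as np

    nt, nlon = 240, 144                            # 60 days at 6-h sampling, 2.5 deg grid
    dt = 0.25                                      # days
    t = np.arange(nt) * dt
    lon = np.linspace(0.0, 2.0 * np.pi, nlon, endpoint=False)
    T, L = np.meshgrid(t, lon, indexing="ij")
    field = np.cos(1.0 * L - 2.0 * np.pi / 2.0 * T)   # k = 1, period = 2 days

    spec = np.abs(np.fft.fft2(field)) ** 2          # power over (frequency, wavenumber)
    freqs = np.fft.fftfreq(nt, d=dt)                # cycles per day
    ks = np.fft.fftfreq(nlon, d=1.0 / nlon)         # zonal wavenumber
    i, j = np.unravel_index(np.argmax(spec), spec.shape)
    print(f"dominant mode: |frequency| = {abs(freqs[i]):.2f} cpd, |k| = {abs(ks[j]):.0f}")
    ```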

  8. Adaptive tools in virtual environments: Independent component analysis for multimedia

    DEFF Research Database (Denmark)

    Kolenda, Thomas

    2002-01-01

    The thesis investigates the role of independent component analysis in the setting of virtual environments, with the purpose of finding properties that reflect human context. A general framework for performing unsupervised classification with ICA is presented in extension to the latent semantic in...... were compared to investigate computational differences and separation results. The ICA properties were finally implemented in a chat room analysis tool and briefly investigated for visualization of search engine results....

  9. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    Science.gov (United States)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component, multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and the data transfer links between them. This creative modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detail design phase of the product development process.

  10. The Pelagics Habitat Analysis Module (PHAM): Decision Support Tools for Pelagic Fisheries

    Science.gov (United States)

    Armstrong, E. M.; Harrison, D. P.; Kiefer, D.; O'Brien, F.; Hinton, M.; Kohin, S.; Snyder, S.

    2009-12-01

    PHAM is a project funded by NASA to integrate satellite imagery and circulation models into the management of commercial and threatened pelagic species. Specifically, the project merges data from fishery surveys, and fisheries catch and effort data, with satellite imagery and circulation models to define the habitat of each species. This new information on habitat will then be used to inform population distribution models and models of population dynamics that are used for management. During the first year of the project, we created two prototype modules. One module, which was developed for the Inter-American Tropical Tuna Commission, is designed to help improve the information available to manage the tuna fisheries of the eastern Pacific Ocean. The other module, which was developed for the Coastal Pelagics Division of the Southwest Fishery Science Center, assists management of by-catch of mako, blue, and thresher sharks along the Californian coast. Both modules were built with the EASy marine geographic information system, which provides a 4-dimensional (latitude, longitude, depth, and time) home for integration of the data. The projects currently provide tools for automated downloading and geo-referencing of satellite imagery of sea surface temperature, height, and chlorophyll concentrations; output from JPL's ECCO2 global circulation model and its ROM California current model; and gridded data from fisheries and fishery surveys. They also provide statistical tools for defining species habitat from these and other types of environmental data. These tools include unbalanced ANOVA, EOF analysis of satellite imagery, and multivariate search routines for fitting fishery data to transforms of the environmental data. Output from the projects consists of dynamic maps of the distribution of the species, driven by the time series of satellite imagery and output from the circulation models. It also includes relationships between environmental variables and recruitment. During

  11. Temperature analysis with voltage-current time differential operation of electrochemical sensors

    Energy Technology Data Exchange (ETDEWEB)

    Woo, Leta Yar-Li; Glass, Robert Scott; Fitzpatrick, Joseph Jay; Wang, Gangqiang; Henderson, Brett Tamatea; Lourdhusamy, Anthoniraj; Steppan, James John; Allmendinger, Klaus Karl

    2018-01-02

    A method for temperature analysis of a gas stream. The method includes identifying a temperature parameter of an affected waveform signal. The method also includes calculating a change in the temperature parameter by comparing the affected waveform signal with an original waveform signal. The method also includes generating a value from the calculated change which corresponds to the temperature of the gas stream.
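
    The patent abstract above is deliberately generic, so the sketch below is only a schematic reading of it: take the time shift between the original and affected waveforms as the temperature parameter, and map its change to gas temperature through a hypothetical linear calibration.

    ```python
    import numpy as np

    # Schematic illustration only; the "temperature parameter", waveforms, and
    # calibration constants are all hypothetical placeholders.
    def time_shift(orig, affected, dt):
        # lag of the cross-correlation peak, in seconds
        lag = np.argmax(np.correlate(affected, orig, mode="full")) - (len(orig) - 1)
        return lag * dt

    dt = 1e-5
    t = np.arange(0, 0.01, dt)
    orig = np.sin(2 * np.pi * 500 * t)
    affected = np.sin(2 * np.pi * 500 * (t - 2e-4))      # response shifted by 0.2 ms

    shift = time_shift(orig, affected, dt)
    T_gas = 300.0 + 1.5e6 * shift                        # placeholder linear calibration
    print(f"shift = {shift*1e3:.2f} ms -> T ~ {T_gas:.0f} K")
    ```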

  12. Methods and tools for analysis and optimization of power plants

    Energy Technology Data Exchange (ETDEWEB)

    Assadi, Mohsen

    2000-09-01

    The most noticeable advantage of the introduction of computer-aided tools in the field of power generation has been the ability to study a plant's performance prior to the construction phase. The results of these studies have made it possible to change and adjust the plant layout to match pre-defined requirements. Further development of computers in recent years has opened the way for the implementation of new features in existing tools and also for the development of new tools for specific applications, like thermodynamic and economic optimization, prediction of the remaining component lifetime, and fault diagnostics, resulting in improvements in plant performance, availability and reliability. The most common tools for pre-design studies are heat and mass balance programs. Further thermodynamic and economic optimization of plant layouts, generated by the heat and mass balance programs, can be accomplished by using pinch programs, exergy analysis and thermoeconomics. Surveillance and fault diagnostics of existing systems can be performed using tools like condition monitoring systems and artificial neural networks. The increased number of tools, and their varied construction and application areas, make the choice of the most adequate tool for a certain application difficult. In this thesis the development of different categories of tools and techniques, and their application areas, is reviewed and presented. Case studies on both existing and theoretical power plant layouts have been performed using different commercially available tools to illuminate their advantages and shortcomings. The development of power plant technology and the requirements for new tools and measurement systems are also briefly reviewed. This thesis also contains programming techniques and calculation methods concerning part-load calculations using local linearization, which have been implemented in an in-house heat and mass balance program developed by the author

  13. ELM - A SIMPLE TOOL FOR THERMAL-HYDRAULIC ANALYSIS OF SOLID-CORE NUCLEAR ROCKET FUEL ELEMENTS

    Science.gov (United States)

    Walton, J. T.

    1994-01-01

    ELM is a simple computational tool for modeling the steady-state thermal-hydraulics of propellant flow through fuel element coolant channels in nuclear thermal rockets. Written for the nuclear propulsion project of the Space Exploration Initiative, ELM evaluates the various heat transfer coefficient and friction factor correlations available for turbulent pipe flow with heat addition. In the past, these correlations were found in different reactor analysis codes, but now comparisons are possible within one program. The logic of ELM is based on the one-dimensional conservation of energy in combination with Newton's Law of Cooling to determine the bulk flow temperature and the wall temperature across a control volume. Since the control volume is an incremental length of tube, the corresponding pressure drop is determined by application of the Law of Conservation of Momentum. The size, speed, and accuracy of ELM make it a simple tool for use in fuel element parametric studies. ELM is a machine independent program written in FORTRAN 77. It has been successfully compiled on an IBM PC compatible running MS-DOS using Lahey FORTRAN 77, a DEC VAX series computer running VMS, and a Sun4 series computer running SunOS UNIX. ELM requires 565K of RAM under SunOS 4.1, 360K of RAM under VMS 5.4, and 406K of RAM under MS-DOS. Because this program is machine independent, no executable is provided on the distribution media. The standard distribution medium for ELM is one 5.25 inch 360K MS-DOS format diskette. ELM was developed in 1991. DEC, VAX, and VMS are trademarks of Digital Equipment Corporation. Sun4 and SunOS are trademarks of Sun Microsystems, Inc. IBM PC is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation.
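
    The control-volume logic the abstract describes can be sketched as a marching energy/momentum balance; this is not the ELM source (which is FORTRAN 77), and the channel geometry, the correlations chosen, and the property values are illustrative placeholders.

    ```python
    import numpy as np

    # March along the channel: 1-D energy balance for bulk temperature, Newton's
    # law of cooling with a Dittus-Boelter-type correlation for wall temperature,
    # and a friction pressure drop from the momentum balance.
    D, L, N = 2.5e-3, 0.9, 90                    # channel diameter (m), length (m), segments
    mdot, q_flux = 2.0e-3, 5.0e6                 # kg/s per channel, W/m^2 wall heat flux
    cp, mu, k_f, rho = 15e3, 1.2e-5, 0.35, 1.0   # hydrogen-like props, held constant here
    A = np.pi * D**2 / 4
    dx = L / N
    T_b, p = 300.0, 5.0e6                        # inlet bulk temperature (K), pressure (Pa)

    for _ in range(N):
        T_b += q_flux * np.pi * D * dx / (mdot * cp)   # energy: mdot*cp*dT = q''*P*dx
        Re = mdot * D / (A * mu)
        Pr = cp * mu / k_f
        Nu = 0.023 * Re**0.8 * Pr**0.4           # Dittus-Boelter (heating)
        h = Nu * k_f / D
        T_wall = T_b + q_flux / h                # Newton's law of cooling
        f = 0.184 * Re**-0.2                     # smooth-tube friction factor
        G = mdot / A
        p -= f * (dx / D) * G**2 / (2 * rho)     # momentum: friction pressure drop

    print(f"exit bulk T ~ {T_b:.0f} K, wall T ~ {T_wall:.0f} K, exit p ~ {p/1e6:.2f} MPa")
    ```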

  14. Performance Analysis Tool for HPC and Big Data Applications on Scientific Clusters

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Wucherl [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Koo, Michelle [Univ. of California, Berkeley, CA (United States); Cao, Yu [California Inst. of Technology (CalTech), Pasadena, CA (United States); Sim, Alex [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Univ. of California, Berkeley, CA (United States); Wu, Kesheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-09-17

    Big data is prevalent in HPC computing. Many HPC projects rely on complex workflows to analyze terabytes or petabytes of data. These workflows often require running over thousands of CPU cores and performing simultaneous data accesses, data movements, and computation. It is challenging to analyze the performance involving terabytes or petabytes of workflow data or measurement data of the executions, from complex workflows over a large number of nodes and multiple parallel task executions. To help identify performance bottlenecks or debug the performance issues in large-scale scientific applications and scientific clusters, we have developed a performance analysis framework, using state-of-the-art open-source big data processing tools. Our tool can ingest system logs and application performance measurements to extract key performance features, and apply the most sophisticated statistical tools and data mining methods on the performance data. It utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of the big data analysis framework, we conduct case studies on the workflows from an astronomy project known as the Palomar Transient Factory (PTF) and the job logs from the genome analysis scientific cluster. Our study processed many terabytes of system logs and application performance measurements collected on the HPC systems at NERSC. The implementation of our tool is generic enough to be used for analyzing the performance of other HPC systems and Big Data workflows.

  15. Analysis of Minimum Quantity Lubrication (MQL) for Different Coating Tools during Turning of TC11 Titanium Alloy

    Directory of Open Access Journals (Sweden)

    Sheng Qin

    2016-09-01

    Full Text Available The tool coating and the cooling strategy are two key factors when machining difficult-to-cut materials such as titanium alloys. In this paper, a diamond coating was deposited on a commercial carbide insert in an attempt to increase the machinability of TC11 alloy during the turning process. An uncoated carbide insert and a commercial Al2O3/TiAlN-coated tool were also tested as a comparison. Furthermore, MQL was applied to improve the cutting condition. Cutting performance was analyzed by cutting force, cutting temperature and surface roughness measurements. Tool wear and tool life were evaluated to find a good matchup between tool coating and cooling strategy. According to the results, using MQL can slightly reduce the cutting force. By applying MQL, cutting temperatures and tool wear were greatly reduced. Besides, MQL can affect the tool wear mechanism and tool failure modes. The tool life of the Al2O3/TiAlN-coated tool was prolonged by 88.4% under the MQL condition. Diamond-coated tools can obtain a good surface finish when cutting parameters and lubrication strategies are properly chosen.

  16. Thermal Insulation System Analysis Tool (TISTool) User's Manual. Version 1.0.0

    Science.gov (United States)

    Johnson, Wesley; Fesmire, James; Leucht, Kurt; Demko, Jonathan

    2010-01-01

    The Thermal Insulation System Analysis Tool (TISTool) was developed starting in 2004 by Jonathan Demko and James Fesmire. The first edition was written in Excel and Visual Basic as macros. It included the basic shapes such as a flat plate, cylinder, dished head, and sphere. The data came from several KSC tests already in the public literature, as well as data from NIST and other highly respected sources. More recently, the tool has been updated with more test data from the Cryogenics Test Laboratory, and the tank shape was added. Additionally, the tool was converted to FORTRAN 95 to allow for easier distribution of the material and the tool. This document reviews the user instructions for the operation of this system.
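
    As an illustration of the kind of relation such a tool evaluates for its cylindrical shape, here is the textbook radial-conduction heat-leak formula with assumed numbers; a real tool like TISTool would use measured effective conductivities rather than the placeholder value below.

    ```python
    import numpy as np

    # Radial conduction heat leak through a cylindrical insulation layer:
    #   Q = 2*pi*k*L*(T_hot - T_cold) / ln(r2/r1)
    k_eff = 1.0e-4                 # W/m-K, hypothetical effective conductivity under vacuum
    r1, r2 = 0.50, 0.55            # inner/outer radii of the insulation, m
    L = 3.0                        # tank cylinder length, m
    T_hot, T_cold = 300.0, 77.0    # ambient and liquid-nitrogen temperatures, K

    Q = 2 * np.pi * k_eff * L * (T_hot - T_cold) / np.log(r2 / r1)
    print(f"heat leak ~ {Q:.2f} W")
    ```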

  17. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    Science.gov (United States)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata is used to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify

  18. Smart Cutting Tools and Smart Machining: Development Approaches, and Their Implementation and Application Perspectives

    Science.gov (United States)

    Cheng, Kai; Niu, Zhi-Chao; Wang, Robin C.; Rakowski, Richard; Bateman, Richard

    2017-09-01

    Smart machining has tremendous potential and is becoming one of the new generation of high-value precision manufacturing technologies, in line with the advance of Industry 4.0 concepts. This paper presents some innovative design concepts and, in particular, the development of four types of smart cutting tools, including a force-based smart cutting tool, a temperature-based internally-cooled cutting tool, a fast tool servo (FTS) and smart collets for ultraprecision and micro manufacturing purposes. Implementation and application perspectives of these smart cutting tools are explored and discussed, particularly for smart machining against a number of industrial application requirements. These include contamination-free machining, machining of tool-wear-prone Si-based infra-red devices and medical applications, and high-speed micro milling and micro drilling. Furthermore, implementation techniques are presented focusing on: (a) the plug-and-produce design principle and the associated smart control algorithms, (b) piezoelectric film and surface acoustic wave transducers to measure cutting forces in process, (c) critical cutting temperature control in real-time machining, (d) in-process calibration through machining trials, (e) FE-based design and analysis of smart cutting tools, and (f) application exemplars in adaptive smart machining.

  19. Finite element analysis of cutting tools prior to fracture in hard turning operations

    International Nuclear Information System (INIS)

    Cakir, M. Cemal; Işık, Yahya

    2005-01-01

    In this work, FEA of cutting tools prior to fracture is investigated. Fracture is the catastrophic end of the cutting edge and should be avoided in order to obtain a longer tool life. This paper presents finite element modelling of a cutting tool just before its fracture. The data used in the FEA are gathered from a tool breakage detection system that detects fracture according to variations in the cutting forces measured by a three-dimensional force dynamometer. The workpiece material used in the experiments is cold work tool steel, AISI O1 (60 HRC), and the cutting tool material is uncoated tungsten carbide (DNMG 150608). In order to investigate the cutting tool conditions in longitudinal external turning operations prior to fracture, static and dynamic finite element analyses are conducted. After the static finite element analysis, modal and harmonic response analyses are carried out and the dynamic behaviour of the cutting tool structure is investigated. All FE analyses were performed using the commercial finite element package ANSYS

  20. Simulation for Prediction of Entry Article Demise (SPEAD): An Analysis Tool for Spacecraft Safety Analysis and Ascent/Reentry Risk Assessment

    Science.gov (United States)

    Ling, Lisa

    2014-01-01

    For the purpose of performing safety analysis and risk assessment for a potential off-nominal atmospheric reentry resulting in vehicle breakup, a synthesis of trajectory propagation coupled with thermal analysis and the evaluation of node failure is required to predict the sequence of events, the timeline, and the progressive demise of spacecraft components. To provide this capability, the Simulation for Prediction of Entry Article Demise (SPEAD) analysis tool was developed. The software and methodology have been validated against actual flights, telemetry data, and validated software, and safety/risk analyses were performed for various programs using SPEAD. This report discusses the capabilities, modeling, validation, and application of the SPEAD analysis tool.

  1. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. The first places a qualitative focus on reviewing web analytics tools, exploring their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst's efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section shifts from theory to an empirical approach, presenting output data resulting from a study based on the perceived user satisfaction of web analytics tools. The empirical study was carried out on employees from 200 Croatian firms, from either an IT or a marketing branch. The paper contributes by highlighting the support that the web analytics and web metrics tools available on the market have to offer to management, based on the growing need to understand and predict global market trends.

  2. Spatial-temporal analysis of building surface temperatures in Hung Hom

    Science.gov (United States)

    Zeng, Ying; Shen, Yueqian

    2015-12-01

    This thesis presents a study on the spatial-temporal analysis of building surface temperatures in Hung Hom. Observations were collected from Aug 2013 to Oct 2013 at a 30-min interval, using iButton sensors (N=20) covering twelve locations in Hung Hom, and thermal images were captured at PolyU from 05 Aug 2013 to 06 Aug 2013. A linear regression model of the iButton and thermal records is established to calibrate the temperature data. A 3D modeling system is developed on the Visual Studio 2010 development platform, using the ArcEngine 10.0 component, a Microsoft Access 2010 database and the C# programming language. The system supports data processing, spatial analysis, compound queries, 3D face temperature rendering and so on. Statistical analyses show that building face azimuths have a statistically significant relationship with sun azimuths at peak time, and seasonal building temperature changes also correspond to sun angle and sun azimuth variations. Building materials are found to have a significant effect on building surface temperatures: buildings with lower-albedo materials tend to have higher temperatures, and materials with larger thermal conductivity show significant diurnal variations. Regarding geographical location, the peripheral faces of the campus have higher temperatures than the inner faces during daytime, and buildings located in the southeast are cooler than those in the west. Furthermore, human activity is found to have a strong relationship with building surface temperatures through weekday and weekend comparisons.
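
    The calibration step mentioned above amounts to an ordinary least-squares line between coincident iButton and thermal-image temperatures; the numbers below are synthetic stand-ins.

    ```python
    import numpy as np

    ibutton = np.array([28.1, 29.4, 31.0, 32.2, 33.5, 34.1])   # deg C, reference sensors
    thermal = np.array([29.0, 30.6, 32.5, 33.4, 35.1, 35.6])   # deg C, thermal camera

    slope, intercept = np.polyfit(thermal, ibutton, 1)          # least-squares line
    calibrated = slope * thermal + intercept
    rmse = np.sqrt(np.mean((calibrated - ibutton) ** 2))
    print(f"T_cal = {slope:.3f} * T_thermal + {intercept:.2f}; RMSE = {rmse:.2f} degC")
    ```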

  3. Mathematical tool from corn stover TGA to determine its composition.

    Science.gov (United States)

    Freda, Cesare; Zimbardi, Francesco; Nanna, Francesco; Viola, Egidio

    2012-08-01

    Corn stover was treated by a steam explosion process at four different temperatures. A fraction of each of the four exploded materials was extracted with water. The eight samples (four from steam explosion and four from water extraction of the exploded materials) were analysed by wet chemistry to quantify the amounts of cellulose, hemicellulose and lignin. Thermogravimetric analysis in an air atmosphere was performed on the eight samples. A mathematical tool was developed, using the TGA data, to determine the composition of corn stover in terms of cellulose, hemicellulose and lignin. It expresses the biomass degradation temperature as a multiple linear function of the cellulose, hemicellulose and lignin content of the biomass, with interaction terms. The mathematical tool predicted cellulose, hemicellulose and lignin contents with average absolute errors of 1.69, 5.59 and 0.74%, respectively, compared to the wet chemical method.
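
    A minimal sketch of the regression idea, with synthetic data and coefficients standing in for the paper's: express a degradation temperature as a multiple linear function of cellulose, hemicellulose and lignin contents with an interaction term, and fit by least squares. With several such temperatures, the fitted system can be inverted to estimate composition from a new TGA curve.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    C = rng.uniform(0.2, 0.5, 8)       # cellulose fractions (synthetic)
    H = rng.uniform(0.1, 0.3, 8)       # hemicellulose fractions
    L = rng.uniform(0.1, 0.3, 8)       # lignin fractions
    T_deg = 280 + 120 * C + 60 * H - 40 * L + 90 * C * H + rng.normal(0, 1.0, 8)

    X = np.column_stack([np.ones_like(C), C, H, L, C * H])   # design matrix w/ interaction
    beta, *_ = np.linalg.lstsq(X, T_deg, rcond=None)         # least-squares fit
    print("fitted coefficients:", np.round(beta, 1))
    ```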

  4. INSIGHT: an integrated scoping analysis tool for in-core fuel management of PWR

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Noda, Hidefumi; Ito, Nobuaki; Maruyama, Taiji.

    1997-01-01

    An integrated software tool for the scoping analysis of in-core fuel management, INSIGHT, has been developed to automate scoping analysis and to improve the fuel cycle cost using advanced optimization techniques. INSIGHT is an interactive software tool executed on UNIX-based workstations equipped with the X Window System. INSIGHT incorporates the GALLOP loading pattern (LP) optimization module, which utilizes hybrid genetic algorithms; the PATMAKER interactive LP design module; the MCA multicycle analysis module; an integrated database; and other utilities. Two benchmark problems were analyzed to confirm the key capabilities of INSIGHT: LP optimization and multicycle analysis. The first was a single-cycle LP optimization problem that included various constraints. The second was a multicycle LP optimization problem that includes the assembly burnup limitation at rod cluster control (RCC) positions. The results for these problems showed the feasibility of INSIGHT for practical scoping analysis, which consists almost entirely of LP generation and multicycle analysis. (author)

  5. Methodological tools for the collection and analysis of participant observation data using grounded theory.

    Science.gov (United States)

    Laitinen, Heleena; Kaunonen, Marja; Astedt-Kurki, Päivi

    2014-11-01

    To give clarity to the analysis of participant observation in nursing when implementing the grounded theory method. Participant observation (PO) is a method of collecting data that reveals the reality of daily life in a specific context. In grounded theory, interviews are the primary method of collecting data, but PO gives a distinctive insight, revealing what people are really doing instead of what they say they are doing. However, more focus is needed on the analysis of PO. An observational study was carried out to gain awareness of nursing care and its electronic documentation in four acute care wards in hospitals in Finland. Discussion of using the grounded theory method and PO as a data collection tool. The following methodological tools are discussed: an observational protocol, jotting of notes, microanalysis, the use of questioning, constant comparison, and writing and illustrating. Each tool has specific significance in collecting and analysing data, working in constant interaction. Grounded theory and participant observation supplied rich data and revealed the complexity of the daily reality of acute care. In this study, the methodological tools provided a base for the study at the research sites and outside. The process as a whole was challenging. It was time-consuming and it required rigorous and simultaneous data collection and analysis, including reflective writing. Using these methodological tools helped the researcher stay focused from data collection and analysis to building theory. Using PO as a data collection method in qualitative nursing research provides insights that cannot be seen or revealed by other data collection methods. It is not commonly discussed in nursing research, and therefore this study can serve as a useful guide for those who intend to use PO and grounded theory in their nursing research.

  6. Setup Analysis: Combining SMED with Other Tools

    Directory of Open Access Journals (Sweden)

    Stadnicka Dorota

    2015-02-01

    Full Text Available The purpose of this paper is to propose a methodology for setup analysis that can be implemented mainly in small and medium enterprises that are not yet convinced of the value of setup improvement. The methodology was developed after research that identified the problem: companies still struggle with long setup times, yet many of them do nothing to reduce them, and a long setup alone is not a sufficient reason for companies to take action towards setup time reduction. To encourage companies to implement SMED, it is essential to analyse changeovers in order to uncover problems. The proposed methodology can genuinely encourage management to decide on a SMED implementation, as was verified in a production company. The setup analysis methodology is made up of seven steps. Four of them concern a setup analysis in a chosen area of a company, such as a work stand that is a bottleneck with many setups; the goal there is to convince management to begin actions concerning setup improvement. The last three steps relate to a specific setup, where the goal is to reduce the setup time and the risk of problems that can appear during the setup. Tools such as SMED, Pareto analysis, statistical analysis and FMEA were used, as sketched below.
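
    As a toy illustration of the Pareto-analysis step (activity names and durations below are invented, not taken from the paper), setup activities can be ranked by their share of total changeover time to expose the few activities that dominate it:

        # Hypothetical setup activities with measured durations (minutes).
        activities = {
            "search for tools": 18.0,
            "fetch fixture from store": 25.0,
            "adjust machine settings": 12.0,
            "clean work area": 6.0,
            "trial runs / quality checks": 30.0,
            "paperwork": 4.0,
        }

        total = sum(activities.values())
        cumulative = 0.0
        print(f"{'activity':<30}{'share':>8}{'cum.':>8}")
        # Sort by duration, longest first, and print cumulative contribution.
        for name, minutes in sorted(activities.items(), key=lambda kv: -kv[1]):
            cumulative += minutes
            print(f"{name:<30}{minutes / total:>8.1%}{cumulative / total:>8.1%}")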

  7. Photomat: A Mobile Tool for Aiding in Student Construction of Research Questions and Data Analysis

    Science.gov (United States)

    Shelley, Tia Renee; Dasgupta, Chandan; Silva, Alexandra; Lyons, Leilah; Moher, Tom

    2015-01-01

    This paper presents a new mobile software tool, PhotoMAT (Photo Management and Analysis Tool), and students' experiences with this tool within a scaffolded curricular unit--Neighborhood Safari. PhotoMAT was designed to support learners' investigations of backyard animal behavior and works with image sets obtained using fixed-position field cameras…

  8. Fractal analysis as a potential tool for surface morphology of thin films

    Science.gov (United States)

    Soumya, S.; Swapna, M. S.; Raj, Vimal; Mahadevan Pillai, V. P.; Sankararaman, S.

    2017-12-01

    Fractal geometry developed by Mandelbrot has emerged as a potential tool for analyzing complex systems in the diversified fields of science, social science, and technology. Self-similar objects having the same details at different scales are referred to as fractals and are analyzed using the mathematics of non-Euclidean geometry. The present work is an attempt to use the fractal dimension for surface characterization by Atomic Force Microscopy (AFM). Using AFM images of zinc sulphide (ZnS) thin films prepared by the pulsed laser deposition (PLD) technique at different annealing temperatures, the effect of annealing temperature and surface roughness on the fractal dimension is studied. The annealing temperature and surface roughness show a strong correlation with the fractal dimension. From the regression equation set, the surface roughness at a given annealing temperature can be calculated from the fractal dimension. The AFM images are processed using Photoshop and the fractal dimension is calculated by the box-counting method. The fractal dimension decreases from 1.986 to 1.633 while the surface roughness increases from 1.110 to 3.427 as the annealing temperature changes from 30 °C to 600 °C. The images are also analyzed by the power spectrum method to find the fractal dimension. The study reveals that the box-counting method gives better results than the power spectrum method.
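
    A minimal sketch of the box-counting estimate on a 2-D image, assuming a simple normalise-and-threshold binarisation (the paper's Photoshop preprocessing is not reproduced here):

        import numpy as np

        def box_counting_dimension(img, threshold=0.5):
            """Slope of log N(s) versus log(1/s), where N(s) is the number
            of boxes of side s containing at least one above-threshold pixel."""
            span = float(img.max() - img.min()) or 1.0
            binary = (img - img.min()) / span > threshold
            n = min(binary.shape)
            sizes = [s for s in (2, 4, 8, 16, 32, 64) if s <= n // 2]
            counts = []
            for s in sizes:
                m = n - n % s  # trim so the image tiles exactly into s-boxes
                boxes = binary[:m, :m].reshape(m // s, s, m // s, s).any(axis=(1, 3))
                counts.append(boxes.sum())
            slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
            return slope

        rng = np.random.default_rng(0)
        print(box_counting_dimension(rng.random((256, 256))))  # near 2 for noise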

  9. Accounting and Financial Data Analysis Data Mining Tools

    Directory of Open Access Journals (Sweden)

    Diana Elena Codreanu

    2011-05-01

    Full Text Available Computerized accounting systems have seen an increase in complexity in recent years due to the competitive economic environment, but with the help of data analysis solutions such as OLAP and Data Mining, multidimensional data analysis can be performed, fraud can be detected and knowledge hidden in data can be discovered, ensuring that such information is useful for decision making within the organization. In the literature there are many definitions of data mining, but they all boil down to the same idea: the process of extracting new information from large data collections, information that would be very difficult to obtain without the aid of data mining tools. Information obtained by the data mining process has the advantage that it not only answers the question of what is happening but at the same time explains why it is happening. In this paper we present advanced techniques for the analysis and exploitation of data stored in a multidimensional database.

  10. Spectacle and SpecViz: New Spectral Analysis and Visualization Tools

    Science.gov (United States)

    Earl, Nicholas; Peeples, Molly; JDADF Developers

    2018-01-01

    A new era of spectroscopic exploration of our universe is being ushered in with advances in instrumentation and next-generation space telescopes. The advent of new spectroscopic instruments has highlighted a pressing need for tools scientists can use to analyze and explore these new data. We have developed Spectacle, a software package for analyzing both synthetic spectra from hydrodynamic simulations and real COS data, with the aim of characterizing the behavior of the circumgalactic medium. It allows easy reduction of spectral data and provides analytic line generation capabilities. Currently, the package is focused on automatic determination of absorption regions and line identification with custom line list support, simultaneous line fitting using Voigt profiles via least-squares or MCMC methods, and multi-component modeling of blended features. Non-parametric measurements, such as equivalent widths, delta v90, and full-width half-max, are available. Spectacle also provides the ability to compose compound models used to generate synthetic spectra, allowing the user to define various LSF kernels and uncertainties, and to specify sampling. We also present updates to the visualization tool SpecViz, developed in conjunction with the JWST data analysis tools development team, to aid in the exploration of spectral data. SpecViz is an open source, Python-based spectral 1-D interactive visualization and analysis application built around high-performance interactive plotting. It supports handling general and instrument-specific data and includes advanced tool-sets for filtering and detrending one-dimensional data, along with the ability to isolate absorption regions using slicing and manipulate spectral features via spectral arithmetic. Multi-component modeling is also possible using a flexible model fitting tool-set that supports custom models to be used with various fitting routines. It also features robust user extensions such as custom data loaders and support for user…

  11. Climate Prediction Center(CPC)Ensemble Canonical Correlation Analysis Forecast of Temperature

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Ensemble Canonical Correlation Analysis (ECCA) temperature forecast is a 90-day (seasonal) outlook of US surface temperature anomalies. The ECCA uses Canonical...

  12. Efficient thermal error prediction in a machine tool using finite element analysis

    International Nuclear Information System (INIS)

    Mian, Naeem S; Fletcher, Simon; Longstaff, Andrew P; Myers, Alan

    2011-01-01

    Thermally induced errors have a major significance on the positional accuracy of a machine tool. Heat generated during the machining process produces thermal gradients that flow through the machine structure, causing linear and nonlinear thermal expansions and distortions of the associated complex discrete structures and producing deformations that adversely affect structural stability. The heat passes through structural linkages and mechanical joints, where interfacial parameters such as the roughness and form of the contacting surfaces affect the thermal resistance and thus the heat transfer coefficients. This paper presents a novel offline technique using finite element analysis (FEA) to simulate the effects of the major internal heat sources, such as bearings, motors and belt drives, of a small vertical milling machine (VMC) and the effects of ambient temperature pockets that build up during machine operation. Simplified models of the machine were created offline using FEA software, and evaluated experimental results were applied for offline thermal behaviour simulation of the full machine structure. The FEA-simulated results are in close agreement with the experimental results, ranging from 65% to 90% for a variety of testing regimes, and show that a maximum error of 70 µm can be reduced to less than 10 µm.

  13. ADVANCED AND RAPID DEVELOPMENT OF DYNAMIC ANALYSIS TOOLS FOR JAVA

    Directory of Open Access Journals (Sweden)

    Alex Villazón

    2012-01-01

    Full Text Available Low-level bytecode instrumentation techniques are widely used in many software-engineering tools for the Java Virtual Machine (JVM) that perform some form of dynamic program analysis, such as profilers or debuggers. While program manipulation at the bytecode level is very flexible, because the possible bytecode transformations are not restricted, tool development based on this technique is tedious and error-prone. As a promising alternative, the specification of bytecode instrumentation at a higher level using aspect-oriented programming (AOP) can reduce tool development time and cost. Unfortunately, prevailing AOP frameworks lack some features that are essential for certain dynamic analyses. In this article, we focus on three common shortcomings in AOP frameworks with respect to the development of aspect-based tools: (1) the lack of mechanisms for passing data between woven advices in local variables, (2) the lack of support for user-defined static analyses at weaving time, and (3) the absence of pointcuts at the level of individual basic blocks of code. We propose @J, an annotation-based AOP language and weaver that integrates support for these three features. The benefits of the proposed features are illustrated with concrete examples.

  14. Analysis of Global Urban Temperature Trends and Urbanization Impacts

    Science.gov (United States)

    Lee, K. I.; Ryu, J.; Jeon, S. W.

    2018-04-01

    Due to urbanization, urban areas are losing green space and gaining concrete and asphalt pavement, so urban climates differ from those of non-urban areas. In addition, long-term macroscopic studies of urban climate change are becoming more important as global urbanization affects global warming. This requires analyzing the effect of urbanization on temporal changes in urban temperature with the same temperature data and standards for urban areas around the world. In this study, time series analysis was performed on the maximum, minimum, mean and standard deviation values of surface temperature from 1980 to 2010, and the effect of urbanization was analyzed through linear regression with explanatory variables (population, night light, NDVI, urban area). As a result, the minimum surface temperature of urban areas increased at a rate of 0.28 K decade-1 over the past 31 years, the maximum at a rate of 0.372 K decade-1 and the mean at a rate of 0.208 K decade-1, while the standard deviation decreased at a rate of 0.023 K decade-1. The change of surface temperature in urban areas is affected by urbanization related to land cover, such as the decrease of greenery and the increase of paved area, whereas socioeconomic variables were less influential than NDVI in this study. This study is expected to provide an approach for future research and policy planning on urban temperature change and urbanization impacts.
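
    A minimal sketch of the decadal-trend estimate, applying ordinary least squares to an invented annual series (the study's satellite-derived surface temperatures are not reproduced here):

        import numpy as np

        years = np.arange(1980, 2011)
        rng = np.random.default_rng(1)
        # Synthetic annual minimum surface temperature (K) with a small trend.
        t_min = 285.0 + 0.028 * (years - years[0]) + rng.normal(0.0, 0.15, years.size)

        # Slope of the least-squares line, expressed per decade (K decade-1).
        slope_per_year = np.polyfit(years, t_min, 1)[0]
        print(f"trend: {10.0 * slope_per_year:+.3f} K per decade")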

  15. An ultrasonic inspection tool for production tubulars

    Energy Technology Data Exchange (ETDEWEB)

    Newton, K; Martin, R; Ravenscroft, F [AEA Technology, Harwell (United Kingdom)

    1994-06-01

    Advances in ultrasonic technology, high temperature techniques and remote processing power are enabling a new generation of inspection tools to be developed. This paper describes a particular new ultrasonic caliper system, developed by AEA Technology, with the aim of providing improved information about the condition of production tubulars of oil and gas wells. The system is designed to provide enhanced surface area coverage compared to the current devices, which are typically mechanical 'finger' calipers. It also provides a non-contacting measure of corrosion and wear together with direct on-line output and automated data analysis. The new tool is designed to operate in oil and gas, vertical or deviated wells and has the potential for modification to inspect small diameter pipes in topside or other plant. (author)

  16. HYDROLOGIC AND FEATURE-BASED SURFACE ANALYSIS FOR TOOL MARK INVESTIGATION ON ARCHAEOLOGICAL FINDS

    Directory of Open Access Journals (Sweden)

    K. Kovács

    2012-07-01

    Full Text Available The improvement of detailed surface documentation methods provides unique tool mark-study opportunities in the field of archaeological research. One of these data collection techniques is short-range laser scanning, which creates a digital copy of the object's morphological characteristics from high-resolution datasets. The aim of our work was the accurate documentation of a Bronze Age sluice box from Mitterberg, Austria, with a spatial resolution of 0.2 mm, and the investigation of the entirely preserved tool marks on the surface of this archaeological find using these datasets. The methodology of this tool mark-study can be summarized in the following way: At first, a local hydrologic analysis was applied to separate the various patterns of tools on the find's surface. As a result, the XYZ coordinates of the special points, which represent the edge lines of the sliding tool marks, were calculated by buffer operations in a GIS environment. During the second part of the workflow, these edge points were utilized to manually clip the triangle meshes of these patterns in reverse engineering software. Finally, circle features were generated and analysed to determine the different sections along these sliding tool marks. In conclusion, the movement of the hand tool could be reproduced by the spatial analysis of the created features, since the horizontal and vertical positions of the defined circle centre points indicated the various phases of the movements. This research shows an exact workflow to determine the fine morphological structures on the surface of the archaeological find.

  17. Next Generation Electromagnetic Pump Analysis Tools (PLM DOC-0005-2188). Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Stregy, Seth [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States); Dasilva, Ana [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States); Yilmaz, Serkan [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States); Saha, Pradip [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States); Loewen, Eric [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States)

    2015-10-29

    This report provides the broad historical review of EM Pump development and details of MATRIX development under this project. This report summarizes the efforts made to modernize the legacy performance models used in previous EM Pump designs and the improvements made to the analysis tools. This report provides information on Tasks 1, 3, and 4 of the entire project. The research for Task 4 builds upon Task 1: Update EM Pump Databank and Task 3: Modernize the Existing EM Pump Analysis Model, which are summarized within this report. Where research for Task 2: Insulation Materials Development and Evaluation identified parameters applicable to the analysis model with Task 4, the analysis code was updated, and analyses were made for additional materials. The important design variables for the manufacture and operation of an EM Pump that the model improvement can evaluate are: space constraints; voltage capability of insulation system; maximum flux density through iron; flow rate and outlet pressure; efficiency and manufacturability. The development of the next-generation EM Pump analysis tools during this two-year program provides information in three broad areas: Status of analysis model development; Improvements made to older simulations; and Comparison to experimental data.

  18. Final report on LDRD project: Simulation/optimization tools for system variability analysis

    Energy Technology Data Exchange (ETDEWEB)

    R. L. Bierbaum; R. F. Billau; J. E. Campbell; K. D. Marx; R. J. Sikorski; B. M. Thompson; S. D. Wix

    1999-10-01

    This work was conducted during FY98 (Proposal Number 98-0036) and FY99 (Proposal Number 99-0818) under the auspices of the Sandia National Laboratories Laboratory-Directed Research and Development (LDRD) program. Electrical simulation typically treats a single data point in the very large input space of component properties. For electrical simulation to reach its full potential as a design tool, it must be able to address the unavoidable variability and uncertainty in component properties. Component variability is strongly related to the design margin (and reliability) of the end product. During the course of this project, both tools and methodologies were developed to enable analysis of variability in the context of electrical simulation tools. Two avenues to link relevant tools were also developed, and the resultant toolset was applied to a major component.

  19. Temperature control of fimbriation circuit switch in uropathogenic Escherichia coli: quantitative analysis via automated model abstraction.

    Directory of Open Access Journals (Sweden)

    Hiroyuki Kuwahara

    2010-03-01

    Full Text Available Uropathogenic Escherichia coli (UPEC represent the predominant cause of urinary tract infections (UTIs. A key UPEC molecular virulence mechanism is type 1 fimbriae, whose expression is controlled by the orientation of an invertible chromosomal DNA element-the fim switch. Temperature has been shown to act as a major regulator of fim switching behavior and is overall an important indicator as well as functional feature of many urologic diseases, including UPEC host-pathogen interaction dynamics. Given this panoptic physiological role of temperature during UTI progression and notable empirical challenges to its direct in vivo studies, in silico modeling of corresponding biochemical and biophysical mechanisms essential to UPEC pathogenicity may significantly aid our understanding of the underlying disease processes. However, rigorous computational analysis of biological systems, such as fim switch temperature control circuit, has hereto presented a notoriously demanding problem due to both the substantial complexity of the gene regulatory networks involved as well as their often characteristically discrete and stochastic dynamics. To address these issues, we have developed an approach that enables automated multiscale abstraction of biological system descriptions based on reaction kinetics. Implemented as a computational tool, this method has allowed us to efficiently analyze the modular organization and behavior of the E. coli fimbriation switch circuit at different temperature settings, thus facilitating new insights into this mode of UPEC molecular virulence regulation. In particular, our results suggest that, with respect to its role in shutting down fimbriae expression, the primary function of FimB recombinase may be to effect a controlled down-regulation (rather than increase of the ON-to-OFF fim switching rate via temperature-dependent suppression of competing dynamics mediated by recombinase FimE. Our computational analysis further implies

  20. A New Tool for Separating the Magnetic Mineralogy of Complex Mineral Assemblages from Low Temperature Magnetic Behavior

    Directory of Open Access Journals (Sweden)

    France Lagroix

    2017-07-01

    Full Text Available One timeless challenge in rock magnetic studies, inclusive of paleomagnetism and environmental magnetism, is decomposing a sample's bulk magnetic behavior into its individual magnetic mineral components. We present a method permitting to decompose the magnetic behavior of a bulk sample experimentally and at low temperature avoiding any ambiguities in data interpretation due to heating-induced alteration. A single instrument is used to measure the temperature dependence of remanent magnetizations and to apply an isothermal demagnetization step at any temperature between 2 and 400 K. The experimental method is validated on synthetic mixtures of magnetite, hematite, goethite as well as on natural loess samples where the contributions of magnetite, goethite, hematite and maghemite are successfully isolated. The experimental protocol can be adapted to target other iron bearing minerals relevant to the rock or sediment under study. One limitation rests on the fact that the method is based on remanent magnetizations. Consequently, a quantitative decomposition of absolute concentration of individual components remains unachievable without assumptions. Nonetheless, semi-quantitative magnetic mineral concentrations were determined on synthetic and natural loess/paleosol samples in order to validate and test the method as a semi-quantitative tool in environmental magnetism studies.

  1. The design and use of reliability data base with analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    Doorepall, J.; Cooke, R.; Paulsen, J.; Hokstadt, P.

    1996-06-01

    With the advent of sophisticated computer tools, it is possible to give a distributed population of users direct access to reliability component operational histories. This allows the user a greater freedom in defining statistical populations of components and selecting failure modes. However, the reliability data analyst's current analytical instrumentarium is not adequate for this purpose. The terminology used in organizing and gathering reliability data is standardized, and the statistical methods used in analyzing this data are not always suitably chosen. This report attempts to establish a baseline with regard to terminology and analysis methods, to support the use of a new analysis tool. It builds on results obtained in several projects for the ESTEC and SKI on the design of reliability databases. Starting with component socket time histories, we identify a sequence of questions which should be answered prior to the employment of analytical methods. These questions concern the homogeneity and stationarity of (possibly dependent) competing failure modes and the independence of competing failure modes. Statistical tests, some of them new, are proposed for answering these questions (a standard trend test of this kind is sketched below). Attention is given to issues of non-identifiability of competing risk and clustering of failure-repair events. These ideas have been implemented in an analysis tool for grazing component socket time histories, and illustrative results are presented. The appendix provides background on statistical tests and competing failure modes. (au) 4 tabs., 17 ills., 61 refs.
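
    A standard stationarity check of this kind is the Laplace trend test (sketched below on invented failure times; it is not necessarily one of the report's own tests): under a homogeneous Poisson process the statistic is approximately standard normal, so large |U| signals non-stationarity.

        import math

        def laplace_trend_test(event_times, horizon):
            """Laplace statistic for a point process observed on (0, horizon].
            Positive values suggest deterioration (failures cluster late),
            negative values suggest improvement."""
            n = len(event_times)
            return (sum(event_times) - n * horizon / 2.0) / (
                horizon * math.sqrt(n / 12.0))

        # Hypothetical failure times (hours) for one component socket.
        failures = [400.0, 1100.0, 2900.0, 3600.0, 4100.0, 4500.0]
        print(f"Laplace U = {laplace_trend_test(failures, horizon=5000.0):+.2f}")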

  2. The design and use of reliability data base with analysis tool

    International Nuclear Information System (INIS)

    Doorepall, J.; Cooke, R.; Paulsen, J.; Hokstadt, P.

    1996-06-01

    With the advent of sophisticated computer tools, it is possible to give a distributed population of users direct access to reliability component operational histories. This allows the user a greater freedom in defining statistical populations of components and selecting failure modes. However, the reliability data analyst's current analytical instrumentarium is not adequate for this purpose. The terminology used in organizing and gathering reliability data is standardized, and the statistical methods used in analyzing this data are not always suitably chosen. This report attempts to establish a baseline with regard to terminology and analysis methods, to support the use of a new analysis tool. It builds on results obtained in several projects for the ESTEC and SKI on the design of reliability databases. Starting with component socket time histories, we identify a sequence of questions which should be answered prior to the employment of analytical methods. These questions concern the homogeneity and stationarity of (possible dependent) competing failure modes and the independence of competing failure modes. Statistical tests, some of them new, are proposed for answering these questions. Attention is given to issues of non-identifiability of competing risk and clustering of failure-repair events. These ideas have been implemented in an analysis tool for grazing component socket time histories, and illustrative results are presented. The appendix provides background on statistical tests and competing failure modes. (au) 4 tabs., 17 ills., 61 refs

  3. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  4. Evaluation of an Automated Analysis Tool for Prostate Cancer Prediction Using Multiparametric Magnetic Resonance Imaging.

    Directory of Open Access Journals (Sweden)

    Matthias C Roethke

    Full Text Available To evaluate the diagnostic performance of an automated analysis tool for the assessment of prostate cancer based on multiparametric magnetic resonance imaging (mpMRI) of the prostate. A fully automated analysis tool was used for a retrospective analysis of mpMRI sets (T2-weighted, T1-weighted dynamic contrast-enhanced, and diffusion-weighted sequences). The software provided a malignancy prediction value for each image pixel, defined as the Malignancy Attention Index (MAI), which can be depicted as a colour map overlay on the original images. The malignancy maps were compared to histopathology derived from a combination of MRI-targeted and systematic transperineal MRI/TRUS-fusion biopsies. In total, mpMRI data of 45 patients were evaluated. With a sensitivity of 85.7% (95% CI 65.4-95.0), a specificity of 87.5% (95% CI 69.0-95.7) and a diagnostic accuracy of 86.7% (95% CI 73.8-93.8) for the detection of prostate cancer, the automated analysis results corresponded well with the diagnostic accuracies reported by human readers based on the PI-RADS system in the current literature. The study revealed comparable diagnostic accuracies for the detection of prostate cancer between a user-independent MAI-based automated analysis tool and PI-RADS-scoring-based human reader analysis of mpMRI. Thus, the analysis tool could serve as a detection support system for less experienced readers. The results of the study also suggest the potential of MAI-based analysis for advanced lesion assessments, such as cancer extent and staging prediction.

  5. A variable-temperature nanostencil compatible with a low-temperature scanning tunneling microscope/atomic force microscope

    International Nuclear Information System (INIS)

    Steurer, Wolfram; Gross, Leo; Schlittler, Reto R.; Meyer, Gerhard

    2014-01-01

    We describe a nanostencil lithography tool capable of operating at variable temperatures down to 30 K. The setup is compatible with a combined low-temperature scanning tunneling microscope/atomic force microscope located within the same ultra-high-vacuum apparatus. The lateral movement capability of the mask allows the patterning of complex structures. To demonstrate operational functionality of the tool and estimate temperature drift and blurring, we fabricated LiF and NaCl nanostructures on Cu(111) at 77 K

  6. A variable-temperature nanostencil compatible with a low-temperature scanning tunneling microscope/atomic force microscope

    Energy Technology Data Exchange (ETDEWEB)

    Steurer, Wolfram, E-mail: wst@zurich.ibm.com; Gross, Leo; Schlittler, Reto R.; Meyer, Gerhard [IBM Research-Zurich, 8803 Rüschlikon (Switzerland)

    2014-02-15

    We describe a nanostencil lithography tool capable of operating at variable temperatures down to 30 K. The setup is compatible with a combined low-temperature scanning tunneling microscope/atomic force microscope located within the same ultra-high-vacuum apparatus. The lateral movement capability of the mask allows the patterning of complex structures. To demonstrate operational functionality of the tool and estimate temperature drift and blurring, we fabricated LiF and NaCl nanostructures on Cu(111) at 77 K.

  7. A variable-temperature nanostencil compatible with a low-temperature scanning tunneling microscope/atomic force microscope.

    Science.gov (United States)

    Steurer, Wolfram; Gross, Leo; Schlittler, Reto R; Meyer, Gerhard

    2014-02-01

    We describe a nanostencil lithography tool capable of operating at variable temperatures down to 30 K. The setup is compatible with a combined low-temperature scanning tunneling microscope/atomic force microscope located within the same ultra-high-vacuum apparatus. The lateral movement capability of the mask allows the patterning of complex structures. To demonstrate operational functionality of the tool and estimate temperature drift and blurring, we fabricated LiF and NaCl nanostructures on Cu(111) at 77 K.

  8. Modeling high temperature materials behavior for structural analysis

    CERN Document Server

    Naumenko, Konstantin

    2016-01-01

    This monograph presents approaches to characterize the inelastic behavior of materials and structures at high temperature. Starting from experimental observations, it discusses basic features of inelastic phenomena including creep, plasticity, relaxation, low-cycle and thermal fatigue. The authors formulate constitutive equations to describe the inelastic response for given states of stress and microstructure. They introduce evolution equations to capture hardening, recovery, softening, ageing and damage processes. Principles of continuum mechanics and thermodynamics are presented to provide a framework for modeling materials behavior with the aim of structural analysis of high-temperature engineering components.

  9. Thermal Analysis of Bending Under Tension Test

    DEFF Research Database (Denmark)

    Ceron, Ermanno; Martins, Paulo A.F.; Bay, Niels

    2014-01-01

    The tribological conditions in deep drawing can be simulated in the Bending Under Tension test to evaluate the performance of new lubricants, tool materials, etc. Deep drawing production with automatic handling normally runs at high rate. This implies considerable heating of the tools, which sometimes can cause lubricant film breakdown and galling. In order to replicate the production conditions in bending under tension testing it is thus important to control the tool/workpiece interface temperature. This can be done by pre-heating the tool, but it is essential that the interface temperature during testing is similar to the one in the production tool. A universal sheet tribo-tester has been developed, which can run multiple tests automatically from coil. This allows emulating the temperature increase as in production. The present work performs finite element analysis of the evolution…

  10. BiNChE: a web tool and library for chemical enrichment analysis based on the ChEBI ontology.

    Science.gov (United States)

    Moreno, Pablo; Beisken, Stephan; Harsha, Bhavana; Muthukrishnan, Venkatesh; Tudose, Ilinca; Dekker, Adriano; Dornfeldt, Stefanie; Taruttis, Franziska; Grosse, Ivo; Hastings, Janna; Neumann, Steffen; Steinbeck, Christoph

    2015-02-21

    Ontology-based enrichment analysis aids in the interpretation and understanding of large-scale biological data. Ontologies are hierarchies of biologically relevant groupings. Using ontology annotations, which link ontology classes to biological entities, enrichment analysis methods assess whether there is a significant over- or under-representation of entities for ontology classes. While many tools exist that run enrichment analysis for protein sets annotated with the Gene Ontology, only a few can be used for small-molecule enrichment analysis. We describe BiNChE, an enrichment analysis tool for small molecules based on the ChEBI Ontology. BiNChE displays an interactive graph that can be exported as a high-resolution image or in network formats. The tool provides plain, weighted and fragment analysis based on either the ChEBI Role Ontology or the ChEBI Structural Ontology. BiNChE aids in the exploration of large sets of small molecules produced within metabolomics or other systems biology research contexts. The open-source tool provides easy and highly interactive web access to enrichment analysis with the ChEBI ontology and is additionally available as a standalone library.
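
    The core computation behind such over-representation analysis is typically a hypergeometric tail test. A minimal sketch with invented counts, assuming SciPy (BiNChE itself is a Java library, and its plain, weighted and fragment analyses go beyond this):

        from scipy.stats import hypergeom

        def enrichment_p(hits, sample_size, class_size, universe_size):
            """P(X >= hits) for X ~ Hypergeom(universe, class, sample):
            chance of drawing at least this many class members by luck."""
            return hypergeom.sf(hits - 1, universe_size, class_size, sample_size)

        # 12 of 50 study metabolites carry a ChEBI role annotation that tags
        # 300 of the 20000 compounds in the background set (made-up numbers).
        print(f"p = {enrichment_p(12, 50, 300, 20000):.3g}")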

  11. Analysis of Multiple Genomic Sequence Alignments: A Web Resource, Online Tools, and Lessons Learned From Analysis of Mammalian SCL Loci

    Science.gov (United States)

    Chapman, Michael A.; Donaldson, Ian J.; Gilbert, James; Grafham, Darren; Rogers, Jane; Green, Anthony R.; Göttgens, Berthold

    2004-01-01

    Comparative analysis of genomic sequences is becoming a standard technique for studying gene regulation. However, only a limited number of tools are currently available for the analysis of multiple genomic sequences. An extensive data set for the testing and training of such tools is provided by the SCL gene locus. Here we have expanded the data set to eight vertebrate species by sequencing the dog SCL locus and by annotating the dog and rat SCL loci. To provide a resource for the bioinformatics community, all SCL sequences and functional annotations, comprising a collation of the extensive experimental evidence pertaining to SCL regulation, have been made available via a Web server. A Web interface to new tools specifically designed for the display and analysis of multiple sequence alignments was also implemented. The unique SCL data set and new sequence comparison tools allowed us to perform a rigorous examination of the true benefits of multiple sequence comparisons. We demonstrate that multiple sequence alignments are, overall, superior to pairwise alignments for identification of mammalian regulatory regions. In the search for individual transcription factor binding sites, multiple alignments markedly increase the signal-to-noise ratio compared to pairwise alignments. PMID:14718377

  12. Materials corrosion and protection at high temperatures

    International Nuclear Information System (INIS)

    Balbaud, F.; Desgranges, Clara; Martinelli, Laure; Rouillard, Fabien; Duhamel, Cecile; Marchetti, Loic; Perrin, Stephane; Molins, Regine; Chevalier, S.; Heintz, O.; David, N.; Fiorani, J.M.; Vilasi, M.; Wouters, Y.; Galerie, A.; Mangelinck, D.; Viguier, B.; Monceau, D.; Soustelle, M.; Pijolat, M.; Favergeon, J.; Brancherie, D.; Moulin, G.; Dawi, K.; Wolski, K.; Barnier, V.; Rebillat, F.; Lavigne, O.; Brossard, J.M.; Ropital, F.; Mougin, J.

    2011-01-01

    This book was made from the lectures given in 2010 at the thematic school on 'materials corrosion and protection at high temperatures'. It gathers the contributions from scientists and engineers coming from various communities and presents a state-of-the-art of the scientific and technological developments concerning the behaviour of materials at high temperature, in aggressive environments and in various domains (aerospace, nuclear, energy valorization, and chemical industries). It supplies pedagogical tools to grasp high temperature corrosion thanks to the understanding of oxidation mechanisms. It proposes some protection solutions for materials and structures. Content: 1 - corrosion costs; macro-economical and metallurgical approach; 2 - basic concepts of thermo-chemistry; 3 - introduction to the Calphad (calculation of phase diagrams) method; 4 - use of the thermodynamic tool: application to pack-cementation; 5 - elements of crystallography and of real solids description; 6 - diffusion in solids; 7 - notions of mechanics inside crystals; 8 - high temperature corrosion: phenomena, models, simulations; 9 - pseudo-stationary regime in heterogeneous kinetics; 10 - nucleation, growth and kinetic models; 11 - test experiments in heterogeneous kinetics; 12 - mechanical aspects of metal/oxide systems; 13 - coupling phenomena in high temperature oxidation; 14 - other corrosion types; 15 - methods of oxidized surfaces analysis at micro- and nano-scales; 16 - use of SIMS in the study of high temperature corrosion of metals and alloys; 17 - oxidation of ceramics and of ceramic matrix composite materials; 18 - protective coatings against corrosion and oxidation; 19 - high temperature corrosion in the 4. generation of nuclear reactor systems; 20 - heat exchangers corrosion in municipal waste energy valorization facilities; 21 - high temperature corrosion in oil refining and petrochemistry; 22 - high temperature corrosion in new energies industry. (J.S.)

  13. OOTW COST TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    HARTLEY, D.S.III; PACKARD, S.L.

    1998-09-01

    This document reports the results of a study of cost tools to support the analysis of Operations Other Than War (OOTW). It recommends the continued development of the Department of Defense (DoD) Contingency Operational Support Tool (COST) as the basic cost analysis tool for OOTWs. It also recommends modifications to be included in future versions of COST and the development of an OOTW mission planning tool to supply valid input for costing.

  14. Application of the Finite Elemental Analysis to Modeling Temperature Change of the Vaccine in an Insulated Packaging Container during Transport.

    Science.gov (United States)

    Ge, Changfeng; Cheng, Yujie; Shen, Yan

    2013-01-01

    This study demonstrated an attempt to predict temperatures of a perishable product, such as vaccine, inside an insulated packaging container during transport through finite element analysis (FEA) modeling. In order to use standard FEA software for the simulation, an equivalent heat conduction coefficient is proposed and calculated to describe the heat transfer of the air trapped inside the insulated packaging container. The three-dimensional insulated packaging container is regarded as a combination of six panels, and the heat flow at each side panel is a one-dimensional diffusion process. A transient thermal analysis was applied to simulate the heat transfer process from the ambient environment to the inside of the container. Field measurements were carried out to collect temperatures during transport, and the collected data were compared to the FEA simulation results. Insulated packaging containers are used to transport temperature-sensitive products such as vaccine and other pharmaceutical products. The container is usually made of extruded polystyrene foam filled with gel packs. World Health Organization guidelines recommend that all vaccines except oral polio vaccine be distributed in an environment where the temperature ranges between +2 and +8 °C. The primary concerns in designing the packaging for vaccine are how much foam thickness and how many gel packs should be used in order to keep the temperature in the desired range, and how to prevent the vaccine from exposure to freezing temperatures. This study uses numerical simulation to predict temperature change within an insulated packaging container in the vaccine cold chain. It is our hope that this simulation will provide the vaccine industries with an alternative engineering tool to validate vaccine packaging and project thermal equilibrium within the insulated packaging container.
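
    A minimal sketch of the one-dimensional transient analysis through a single panel, using an explicit finite-difference scheme with illustrative material values and simplified boundary conditions (the paper's equivalent heat conduction coefficient for the trapped air, and the gel packs, are not modelled here):

        import numpy as np

        k, rho, cp = 0.033, 35.0, 1300.0   # polystyrene foam: W/(m K), kg/m3, J/(kg K)
        alpha = k / (rho * cp)             # thermal diffusivity, m2/s
        L, nx = 0.04, 41                   # panel thickness (m) and grid points
        dx = L / (nx - 1)
        dt = 0.4 * dx**2 / alpha           # Fourier number 0.4 < 0.5: stable
        T = np.full(nx, 5.0)               # panel initially at 5 deg C
        T_ambient = 30.0

        t, t_end = 0.0, 6 * 3600.0         # simulate six hours of transport
        while t < t_end:
            T[0] = T_ambient               # hot exterior face (Dirichlet)
            T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
            T[-1] = T[-2]                  # interior face treated as zero-flux
            t += dt

        print(f"interior face after 6 h: {T[-1]:.2f} deg C")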

  15. Analysis on High Temperature Aging Property of Self-brazing Aluminum Honeycomb Core at Middle Temperature

    Directory of Open Access Journals (Sweden)

    ZHAO Huan

    2016-11-01

    Full Text Available A tension-shear test was carried out on middle-temperature self-brazing aluminum honeycomb cores after high temperature aging using a micro mechanical test system, and the microstructure and composition of the joints were observed and analyzed using scanning electron microscopy and energy dispersive spectroscopy to study the relationship between brazing seam microstructure, composition and high temperature aging properties. Results show that the tensile-shear strength of aluminum honeycomb core joints brazed with 1060 aluminum foil and aluminum composite brazing plate after high temperature aging (200 °C/12 h, 200 °C/24 h, 200 °C/36 h) is similar to that of as-welded joints, and the weak part of the joint is the base metal near the brazing joint. Observation and analysis of the aluminum honeycomb core microstructure and composition show that the Zn and Sn contents at the brazing seam are not much affected and no compound phase forms after high temperature aging; therefore, the main reason for the good high temperature aging performance of the self-brazing aluminum honeycomb core is that no obvious change of brazing seam microstructure and composition occurs.

  16. Message Correlation Analysis Tool for NOvA

    International Nuclear Information System (INIS)

    Lu Qiming; Biery, Kurt A; Kowalkowski, James B

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of analysis engine is driven by the data acquisition (DAQ) of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and to minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  17. Message Correlation Analysis Tool for NOvA

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic realtime correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of analysis engine is driven by the DAQ of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and to minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  18. Message correlation analysis tool for NOvA

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Qiming [Fermilab; Biery, Kurt A. [Fermilab; Kowalkowski, James B. [Fermilab

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of analysis engine is driven by the data acquisition (DAQ) of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and to minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  19. SGRAPH (SeismoGRAPHer): Seismic waveform analysis and integrated tools in seismology

    Science.gov (United States)

    Abdelwahed, Mohamed F.

    2012-03-01

    Although numerous seismological programs are currently available, most of them suffer from the inability to manipulate different data formats and the lack of embedded seismological tools. SeismoGRAPHer, or simply SGRAPH, is a new system for maintaining and analyzing seismic waveform data in a stand-alone, Windows-based application that manipulates a wide range of data formats. SGRAPH was intended to be a tool sufficient for performing basic waveform analysis and solving advanced seismological problems. The graphical user interface (GUI) utilities and the Windows functionalities, such as dialog boxes, menus, and toolbars, simplify the user interaction with the data. SGRAPH supports common data formats, such as SAC, SEED, GSE, ASCII, and Nanometrics Y-format, and provides the ability to solve many seismological problems with built-in inversion tools. Loaded traces are maintained, processed, plotted, and saved as SAC, ASCII, or PS (post script) file formats. SGRAPH includes Generalized Ray Theory (GRT), genetic algorithm (GA), least-square fitting, auto-picking, fast Fourier transforms (FFT), and many additional tools. This program provides rapid estimation of earthquake source parameters, location, attenuation, and focal mechanisms. Advanced waveform modeling techniques are provided for crustal structure and focal mechanism estimation. SGRAPH has been employed in the Egyptian National Seismic Network (ENSN) as a tool assisting with routine work and data analysis. More than 30 users have been using previous versions of SGRAPH in their research for more than 3 years. The main features of this application are ease of use, speed, small disk space requirements, and the absence of third-party developed components. Because of its architectural structure, SGRAPH can be interfaced with newly developed methods or applications in seismology. A complete setup file, including the SGRAPH package with the online user guide, is available.

  20. CGHPRO – A comprehensive data analysis tool for array CGH

    Directory of Open Access Journals (Sweden)

    Lenzner Steffen

    2005-04-01

    Full Text Available Abstract Background Array CGH (Comparative Genomic Hybridisation) is a molecular cytogenetic technique for the genome-wide detection of chromosomal imbalances. It is based on the co-hybridisation of differentially labelled test and reference DNA onto arrays of genomic BAC clones, cDNAs or oligonucleotides, and, after correction for various intervening variables, loss or gain in the test DNA can be indicated from spots showing aberrant signal intensity ratios. Now that this technique is no longer confined to highly specialized laboratories and is entering the realm of clinical application, there is a need for a user-friendly software package that facilitates estimates of DNA dosage from raw signal intensities obtained by array CGH experiments, and which does not depend on a sophisticated computational environment. Results We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection and comparative analysis of array CGH data. CGHPRO is a stand-alone JAVA application that guides the user through the whole process of data analysis. The import option for image analysis data covers several data formats, but users can also customize their own data formats. Several graphical representation tools assist in the selection of the appropriate normalization method. Intensity ratios of each clone can be plotted in a size-dependent manner along the chromosome ideograms. The interactive graphical interface offers the chance to explore the characteristics of each clone, such as the involvement of the clone's sequence in segmental duplications. Circular Binary Segmentation and unsupervised Hidden Markov Model algorithms facilitate objective detection of chromosomal breakpoints. The storage of all essential data in a back-end database allows the simultaneous comparative analysis of different cases. The various display options also facilitate the definition of shortest regions of overlap and simplify the…
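
    As a rough stand-in for Circular Binary Segmentation (CBS proper uses a circular max-t statistic with permutation testing), a greedy binary segmentation of a synthetic log2-ratio profile illustrates the breakpoint-detection idea:

        import numpy as np

        def segment(x, lo=0, hi=None, min_len=5, threshold=4.0, out=None):
            """Recursively split at the index maximising a two-sample
            t-like statistic; stop when the best split is too weak."""
            if hi is None:
                hi, out = len(x), []
            best_t, best_i = 0.0, None
            for i in range(lo + min_len, hi - min_len):
                a, b = x[lo:i], x[i:hi]
                se = np.sqrt(a.var() / len(a) + b.var() / len(b)) or 1e-12
                t = abs(a.mean() - b.mean()) / se
                if t > best_t:
                    best_t, best_i = t, i
            if best_i is None or best_t < threshold:
                out.append((lo, hi, float(x[lo:hi].mean())))
            else:
                segment(x, lo, best_i, min_len, threshold, out)
                segment(x, best_i, hi, min_len, threshold, out)
            return out

        rng = np.random.default_rng(2)
        profile = np.concatenate([rng.normal(0.0, 0.2, 60),   # normal copy number
                                  rng.normal(0.8, 0.2, 30),   # gained region
                                  rng.normal(0.0, 0.2, 60)])
        for start, stop, mean in segment(profile):
            print(f"clones {start:3d}-{stop:3d}: mean log2 ratio {mean:+.2f}")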

  1. Online characterization of planetary surfaces: PlanetServer, an open-source analysis and visualization tool

    Science.gov (United States)

    Marco Figuera, R.; Pham Huu, B.; Rossi, A. P.; Minin, M.; Flahaut, J.; Halder, A.

    2018-01-01

    The lack of open-source tools for hyperspectral data visualization and analysis creates a demand for new tools. In this paper we present the new PlanetServer, a set of tools comprising a web Geographic Information System (GIS) and a recently developed Python Application Programming Interface (API) capable of visualizing and analyzing a wide variety of hyperspectral data from different planetary bodies. Current open-source WebGIS tools are evaluated in order to give an overview and to contextualize how PlanetServer can help in these matters. The web client is thoroughly described, as are the datasets available in PlanetServer. The Python API is also described, along with the reasons for its development. Two examples of mineral characterization of different hydrosilicates, such as chlorites, prehnites and kaolinites, in the Nili Fossae area on Mars are presented. As the obtained results show positive outcomes in hyperspectral analysis and visualization compared to previous literature, we suggest using the PlanetServer approach for such investigations.

  2. Combined numerical and experimental determination of the convective heat transfer coefficient between an AlCrN-coated Vanadis 4 tool and Rhenus oil

    DEFF Research Database (Denmark)

    Üstünyagiz, Esmeray; Nielsen, Chris V.; Tiedje, Niels S.

    2018-01-01

    Regardless of the field of application, the reliability of numerical simulations depends on a correct description of the boundary conditions. In thermal simulation, determination of heat transfer coefficients is important because they vary with material properties and process conditions. This paper presents a combined experimental and numerical analysis applied to the determination of the heat transfer coefficient between an AlCrN-coated Vanadis 4 tool and Rhenus LA722086 oil in an unloaded condition, i.e. without the tool being in contact with a workpiece. It is found that the heat transfer coefficient in unloaded conditions at 80 °C oil temperature is 0.1 kW/(m2·K) between the selected stamping tool and the mineral oil. A sensitivity analysis of the numerical model was performed to verify the effects of mesh discretization, temperature measurement location and tool geometry. Among these parameters…

  3. Renormalization group analysis of the temperature dependent coupling constant in massless theory

    International Nuclear Information System (INIS)

    Yamada, Hirofumi.

    1987-06-01

    A general analysis of finite temperature renormalization group equations for massless theories is presented. It is found that, in a direction where momenta and temperature are scaled up with their ratio fixed, the coupling constant behaves in the same manner as at zero temperature, and that asymptotic freedom at short distances is also maintained at finite temperature. (author)
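
    A sketch of the underlying argument in standard Callan-Symanzik form (reconstructed from the abstract, not taken from the paper itself): since temperature carries the dimension of momentum, scaling momenta and temperature together reduces the problem to the zero-temperature running coupling.

        % For a massless theory, an n-point vertex function obeys
        \left[\mu\frac{\partial}{\partial\mu}
              + \beta(g)\frac{\partial}{\partial g}
              - n\,\gamma(g)\right]
        \Gamma^{(n)}(p_i, T; g, \mu) = 0,
        % and, since T scales like a momentum, under p_i -> s p_i, T -> s T:
        \Gamma^{(n)}(s p_i, s T; g, \mu)
          = s^{d_n}\exp\!\left[-n\int_{1}^{s}\gamma\bigl(\bar g(s')\bigr)
            \,\frac{ds'}{s'}\right]
            \Gamma^{(n)}(p_i, T; \bar g(s), \mu),
        % with s\,d\bar g/ds = \beta(\bar g) exactly as at T = 0, so
        % asymptotic freedom survives along this direction.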

  4. Importance Performance Analysis as a Trade Show Performance Evaluation and Benchmarking Tool

    OpenAIRE

    Tafesse, Wondwesen; Skallerud, Kåre; Korneliussen, Tor

    2010-01-01

    Author's accepted version (post-print). The purpose of this study is to introduce importance performance analysis as a trade show performance evaluation and benchmarking tool. Importance performance analysis considers exhibitors’ performance expectation and perceived performance in unison to evaluate and benchmark trade show performance. The present study uses data obtained from exhibitors of an international trade show to demonstrate how importance performance analysis can be used to eval...

  5. On the frequency dependence of the high temperature background

    International Nuclear Information System (INIS)

    Povolo, F.; Hermida, E.B.

    1996-01-01

    The high temperature background (HTB) damping in metals and alloys has been measured mostly as a function of temperature. These data have been described by several empirical expressions proposed in the literature. In the present work, HTB in pure Mg and in two alloys (Zry-4 and Cu-5 at.%Au), measured with a torsion pendulum with variable moment of inertia, is analyzed using a new treatment of the data. This analysis provides a useful tool to determine whether a damping process is linear or not. (orig.)

  6. GOMA: functional enrichment analysis tool based on GO modules

    Institute of Scientific and Technical Information of China (English)

    Qiang Huang; Ling-Yun Wu; Yong Wang; Xiang-Sun Zhang

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results.

  7. New method development in prehistoric stone tool research: evaluating use duration and data analysis protocols.

    Science.gov (United States)

    Evans, Adrian A; Macdonald, Danielle A; Giusca, Claudiu L; Leach, Richard K

    2014-10-01

    Lithic microwear is a research field of prehistoric stone tool (lithic) analysis that has been developed with the aim to identify how stone tools were used. It has been shown that laser scanning confocal microscopy has the potential to be a useful quantitative tool in the study of prehistoric stone tool function. In this paper, two important lines of inquiry are investigated: (1) whether the texture of worn surfaces is constant under varying durations of tool use, and (2) the development of rapid objective data analysis protocols. This study reports on the attempt to further develop these areas of study and results in a better understanding of the complexities underlying the development of flexible analytical algorithms for surface analysis. The results show that when sampling is optimised, surface texture may be linked to contact material type, independent of use duration. Further research is needed to validate this finding and test an expanded range of contact materials. The use of automated analytical protocols has shown promise but is only reliable if sampling location and scale are defined. Results suggest that the sampling protocol reports on the degree of worn surface invasiveness, complicating the ability to investigate duration related textural characterisation. Copyright © 2014. Published by Elsevier Ltd.

  8. The thermodynamic meaning of local temperature of nonequilibrium open quantum systems

    OpenAIRE

    Ye, LvZhou; Zheng, Xiao; Yan, YiJing; Di Ventra, Massimiliano

    2016-01-01

    Measuring the local temperature of nanoscale systems out of equilibrium has emerged as a new tool to study local heating effects and other local thermal properties of systems driven by external fields. Although various experimental protocols and theoretical definitions have been proposed to determine the local temperature, the thermodynamic meaning of the measured or defined quantities remains unclear. By performing analytical and numerical analysis of bias-driven quantum dot systems both in ...

  9. Choosing your weapons : on sentiment analysis tools for software engineering research

    NARCIS (Netherlands)

    Jongeling, R.M.; Datta, S.; Serebrenik, A.; Koschke, R.; Krinke, J.; Robillard, M.

    2015-01-01

    Recent years have seen an increasing attention to social aspects of software engineering, including studies of emotions and sentiments experienced and expressed by the software developers. Most of these studies reuse existing sentiment analysis tools such as SentiStrength and NLTK. However, these

  10. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built environment scenarios on transportation energy use and greenhouse gas (GHG) emissions, and provides guidance on how to apply the tool.

  11. BFPTool: a software tool for analysis of Biomembrane Force Probe experiments.

    Science.gov (United States)

    Šmít, Daniel; Fouquet, Coralie; Doulazmi, Mohamed; Pincet, Frédéric; Trembleau, Alain; Zapotocky, Martin

    2017-01-01

    The Biomembrane Force Probe is an approachable experimental technique commonly used for single-molecule force spectroscopy and experiments on biological interfaces. The technique operates in the range of forces from 0.1 pN to 1000 pN. Experiments are typically repeated many times, conditions are often not optimal, the captured video can be unstable and lose focus; this makes efficient analysis challenging, while out-of-the-box non-proprietary solutions are not freely available. This dedicated tool was developed to integrate and simplify the image processing and analysis of videomicroscopy recordings from BFP experiments. A novel processing feature, allowing the tracking of the pipette, was incorporated to address a limitation of preceding methods. Emphasis was placed on versatility and comprehensible user interface implemented in a graphical form. An integrated analytical tool was implemented to provide a faster, simpler and more convenient way to process and analyse BFP experiments.

  12. Utilization of mathematical models to manage risk of holding cold food without temperature control.

    Science.gov (United States)

    Schaffner, Donald W

    2013-06-01

    This document describes the development of a tool to manage the risk of the transportation of cold food without temperature control. The tool uses predictions from the ComBase predictor and builds on the 2009 U.S. Food and Drug Administration Model Food Code and supporting scientific data in the Food Code annex. I selected Salmonella spp. and Listeria monocytogenes as the organisms for risk management. Salmonella spp. were selected because they are associated with a wide variety of foods and grow rapidly at temperatures >17°C. L. monocytogenes was selected because it is frequently present in the food processing environment, it was used in the original analysis contained in the Food Code Annex, and it grows relatively rapidly at low temperatures; temperature data from a wholesale cash and carry food service supplier were collected as part of this project. The resulting model-based tool will be a useful aid to risk managers and customers of wholesale cash and carry food service suppliers, as well as to anyone interested in assessing and managing the risks posed by holding cold foods out of temperature control in supermarkets, delis, restaurants, cafeterias, and homes.
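
    The record relies on ComBase predictions, which are not reproduced here; as a hedged stand-in, the sketch below integrates a generic Ratkowsky square-root growth model over a time-temperature profile. The coefficients B and T_MIN are illustrative placeholders, not the values used by the tool.

    ```python
    # Hedged sketch: log10 growth over a time-temperature profile using a
    # Ratkowsky square-root model, sqrt(mu) = B * (T - Tmin). Illustrative only.
    import numpy as np

    B, T_MIN = 0.023, 1.0        # assumed: 1/(sqrt(h) degC) and degC

    def log10_growth(times_h, temps_c):
        """Integrate the growth rate mu(T) (log10 units/h) over the profile."""
        mu = (B * np.maximum(np.asarray(temps_c) - T_MIN, 0.0)) ** 2
        return float(np.sum((mu[1:] + mu[:-1]) / 2 * np.diff(times_h)))

    # cold food held 6 h while warming linearly from 5 degC to 21 degC
    t = np.linspace(0.0, 6.0, 61)
    growth = log10_growth(t, 5.0 + (21.0 - 5.0) * t / 6.0)
    print(f"predicted growth: {growth:.2f} log10 units")
    ```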

  13. Exergy analysis for stationary flow systems with several heat exchange temperatures

    Energy Technology Data Exchange (ETDEWEB)

    Lampinen, M J; Heikkinen, M A [Helsinki Univ. of Technology, Espoo (Finland). Dept. of Energy Engineering

    1995-07-01

    A thermodynamic theory of exergy analysis for a stationary flow system having several heat inputs and outputs at different temperature levels is presented. As a new result a relevant reference temperature of the surroundings is derived for each case. Also a general formula which combines exergy analysis with a modified Carnot efficiency is derived. The results are illustrated by numerical examples for mechanical multi-circuit heat pump cycles, for a Brayton process and for an absorption heat pump. (Author)
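
    The paper's case-specific reference temperature derivation is not reproduced here, but the underlying exergy bookkeeping for heat flows at several temperature levels is simple to sketch, using Ex = Q(1 - T0/T) per stream with an assumed ambient temperature T0.

    ```python
    # Minimal sketch of exergy bookkeeping for a stationary flow system with
    # several heat inputs/outputs. T0 is simply an assumed ambient temperature,
    # not the paper's derived case-specific reference temperature.
    T0 = 293.15   # assumed ambient reference temperature, K

    def heat_exergy(q_w, t_k, t0=T0):
        """Exergy rate (W) of heat flow q_w crossing a boundary at t_k (K)."""
        return q_w * (1.0 - t0 / t_k)

    # illustrative streams: heat in at 350 K, heat out at 320 K (+ in, - out)
    streams = [(+10.0e3, 350.0), (-12.0e3, 320.0)]
    print(sum(heat_exergy(q, t) for q, t in streams), "W net exergy of heat flows")
    ```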

  14. Mapping air temperature using time series analysis of LST : The SINTESI approach

    NARCIS (Netherlands)

    Alfieri, S.M.; De Lorenzi, F.; Menenti, M.

    2013-01-01

    This paper presents a new procedure to map time series of air temperature (Ta) at fine spatial resolution using time series analysis of satellite-derived land surface temperature (LST) observations. The method assumes that air temperature is known at a single (reference) location such as in gridded

  15. Thermal model to investigate the temperature in bone grinding for skull base neurosurgery.

    Science.gov (United States)

    Zhang, Lihui; Tai, Bruce L; Wang, Guangjun; Zhang, Kuibang; Sullivan, Stephen; Shih, Albert J

    2013-10-01

    This study develops a thermal model utilizing the inverse heat transfer method (IHTM) to investigate the bone grinding temperature created by a spherical diamond tool used for skull base neurosurgery. Bone grinding is a critical procedure in the expanded endonasal approach to remove the cranial bone and access the skull base tumor via the nasal corridor. Heat generated during grinding could damage nerves or coagulate the blood in the carotid artery adjacent to the bone. Finite element analysis is adopted to investigate the grinding-induced bone temperature rise. The heat source distribution is defined by the thermal model, and the temperature distribution is solved using the IHTM with experimental inputs. Grinding experiments were conducted on a bovine cortical bone with embedded thermocouples. Results show a significant temperature rise in bone grinding. Using 50°C as the threshold, the thermal injury can propagate about 3 mm in the traverse direction and 3 mm below the ground surface under dry grinding conditions. The presented methodology demonstrates the capability of serving as a thermal analysis tool for bone grinding studies. Copyright © 2013 IPEM. Published by Elsevier Ltd. All rights reserved.
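
    As a much simpler stand-in for the paper's finite element and IHTM machinery, the sketch below marches a 1D explicit finite-difference conduction model with a constant surface heat flux; the bone properties are rough literature figures and the flux value is illustrative, not the paper's identified heat source.

    ```python
    # Simplified 1D explicit conduction sketch: heat flux into bone from the
    # grinding contact. Properties and flux are illustrative approximations.
    import numpy as np

    K, RHO, CP = 0.55, 1900.0, 1300.0   # W/(m K), kg/m^3, J/(kg K), approx. bone
    ALPHA = K / (RHO * CP)
    DX, NX = 0.1e-3, 60                 # 0.1 mm grid over a 6 mm deep domain
    DT = 0.4 * DX**2 / ALPHA            # explicit stability: Fourier number 0.4
    Q = 5.0e4                           # surface heat flux, W/m^2 (assumed)

    T = np.full(NX, 37.0)               # start at body temperature, degC
    for _ in range(int(2.0 / DT)):      # simulate 2 s of grinding contact
        Tn = T.copy()
        Tn[1:-1] += ALPHA * DT / DX**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
        # simplified surface node: conduction to node 1 plus the imposed flux
        Tn[0] += ALPHA * DT / DX**2 * (T[1] - T[0]) + Q * DT / (RHO * CP * DX)
        T = Tn

    print(f"surface {T[0]:.0f} degC, 3 mm deep {T[30]:.0f} degC")
    ```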

  16. Multidisciplinary Design, Analysis, and Optimization Tool Development Using a Genetic Algorithm

    Science.gov (United States)

    Pak, Chan-gi; Li, Wesley

    2009-01-01

    Multidisciplinary design, analysis, and optimization using a genetic algorithm is being developed at the National Aeronautics and Space Administration Dryden Flight Research Center (Edwards, California) to automate the analysis and design process by leveraging existing tools, enabling true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. This is a promising technology, but it faces many challenges in large-scale, real-world applications. This report describes current approaches, recent results, and challenges for multidisciplinary design, analysis, and optimization as demonstrated by experience with the Ikhana fire pod design.
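
    To make the optimization scheme concrete, here is a minimal real-coded genetic algorithm of the general kind described; the quadratic objective is a stand-in for the actual multidisciplinary analyses, which are not reproduced here.

    ```python
    # Minimal real-coded genetic algorithm sketch. The objective is a
    # placeholder; a real MDAO run would evaluate coupled discipline models.
    import numpy as np

    rng = np.random.default_rng(0)

    def objective(x):                    # stand-in design objective (minimize)
        return np.sum((x - 0.7) ** 2, axis=1)

    pop = rng.uniform(0.0, 1.0, size=(40, 5))   # 40 designs, 5 design variables
    for gen in range(100):
        parents = pop[np.argsort(objective(pop))[:20]]     # keep the best half
        mates = parents[rng.integers(0, 20, size=(40, 2))]
        w = rng.uniform(size=(40, 5))
        pop = w * mates[:, 0] + (1 - w) * mates[:, 1]      # blend crossover
        pop += rng.normal(0.0, 0.02, size=pop.shape)       # Gaussian mutation
        pop = np.clip(pop, 0.0, 1.0)

    print("best design found:", pop[np.argmin(objective(pop))])
    ```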

  17. The VI-Suite: a set of environmental analysis tools with geospatial data applications

    NARCIS (Netherlands)

    Southall, Ryan; Biljecki, F.

    2017-01-01

    Background: The VI-Suite is a free and open-source addon for the 3D content creation application Blender, developed primarily as a tool for the contextual and performative analysis of buildings. Its functionality has grown from simple, static lighting analysis to fully parametric lighting,

  18. Nonlinear analysis of reinforced concrete structures subjected to high temperature and external load

    International Nuclear Information System (INIS)

    Sugawara, Y.; Goto, M.; Saito, K.; Suzuki, N.; Muto, A.; Ueda, M.

    1993-01-01

    A quarter of a century has passed since the finite element method was first applied to nonlinear problems concerning reinforced concrete structures, and the reliability of analysis at ordinary temperature has been enhanced accordingly. By contrast, few studies have tried to deal with the nonlinear behavior of reinforced concrete structures subjected to high temperature and external loads simultaneously. It is generally known that the mechanical properties of concrete and steel are greatly affected by temperature. Therefore, in order to analyze the nonlinear behavior of reinforced concrete subjected to external loads at high temperature, it is necessary to construct constitutive models of the materials reflecting the influence of temperature. In this study, constitutive models of concrete and reinforcement that can express decreases in strength and stiffness at high temperature have been developed. A two-dimensional nonlinear finite element analysis program has been developed using these material models. The behavior of reinforced concrete beams subjected simultaneously to high temperature and shear forces was simulated using the developed analytical method. The results of the simulation agreed well with the experimental results, evidencing the validity of the developed material models and the finite element analysis program
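
    Constitutive models of this type encode strength loss with temperature; the sketch below interpolates a compressive-strength reduction factor over temperature. The table roughly follows Eurocode 2 values for siliceous-aggregate concrete and is not the authors' model.

    ```python
    # Illustrative temperature-dependent strength reduction for concrete.
    # Table values approximate Eurocode 2 (siliceous aggregate); assumption only.
    import numpy as np

    T_PTS = [20, 100, 200, 300, 400, 500, 600, 700, 800]            # degC
    K_PTS = [1.00, 1.00, 0.95, 0.85, 0.75, 0.60, 0.45, 0.30, 0.15]  # fc(T)/fc(20)

    def fc_at_temperature(fc20_mpa, temp_c):
        """Compressive strength at temperature via linear interpolation."""
        return fc20_mpa * np.interp(temp_c, T_PTS, K_PTS)

    for T in (20, 350, 600):
        print(f"{T:4d} degC: fc = {fc_at_temperature(30.0, T):5.1f} MPa")
    ```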

  19. Thermophysical characterization tools and numerical models for high temperature thermo-structural composite materials

    International Nuclear Information System (INIS)

    Lorrette, Ch.

    2007-04-01

    This work is an original contribution to the study of the thermal behaviour of thermo-structural composite materials. It aims to develop a methodology, with a new experimental device, for thermal characterization adapted to this type of material, and to model conductive heat transfer within these heterogeneous media. The first part deals with prediction of the effective thermal conductivity of stratified composite materials in the three space directions. For that, a multi-scale model using a rigorous morphological analysis of the structure and the elementary properties is proposed and implemented. The second part deals with thermal characterization at high temperature. It shows how to estimate the thermal effusivity and the thermal conductivity simultaneously. The present method is based on observing the heating of a plane sample subjected to a continuous excitation generated by the Joule effect. Heat transfer is modelled with the quadrupole formalism; temperature is measured on two sides of the sample. The development of both resistive probes for excitation and linear probes for temperature measurements enables the thermal properties to be measured up to 1000 °C. Finally, some experimental and numerical application examples review the obtained results. (author)

  20. A developmental screening tool for toddlers with multiple domains based on Rasch analysis.

    Science.gov (United States)

    Hwang, Ai-Wen; Chou, Yeh-Tai; Hsieh, Ching-Lin; Hsieh, Wu-Shiun; Liao, Hua-Fang; Wong, Alice May-Kuen

    2015-01-01

    Using multidomain developmental screening tools is a feasible way for pediatric health care professionals to identify children at risk of developmental problems in multiple domains simultaneously. The purpose of this study was to develop a Rasch-based tool for Multidimensional Screening in Child Development (MuSiC) for children aged 0-3 years. MuSiC was developed by constructing an item bank based on three commonly used screening tools and validating it against developmental status (at risk for delay or not) in five developmental domains. Parents of a convenience sample of 632 children (aged 3-35.5 months) with and without developmental delays responded to items from the three screening tools funded by health authorities in Taiwan. The item bank was determined by item fit in Rasch analysis for each of the five developmental domains (cognitive skills, language skills, gross motor skills, fine motor skills, and socioadaptive skills). Children's performance scores, in logits derived from the Rasch analysis, were validated against developmental status for each domain using the area under receiver operating characteristic curves. MuSiC, a 75-item developmental screening tool for five domains, was derived. The diagnostic validity of all five domains was acceptable for all stages of development, except for the infant stage (≤11 months and 15 days). MuSiC can be applied in well-child care visits as a universal screening tool for children aged 1-3 years across multiple domains. Items with sound validity for infants need to be further developed. Copyright © 2014. Published by Elsevier B.V.
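
    The validation step can be illustrated in a few lines: area under the ROC curve for Rasch person measures (logits) against at-risk status, via the Mann-Whitney formulation. The data below are synthetic, not the Taiwanese sample.

    ```python
    # Sketch of ROC-AUC validation of Rasch logit scores; synthetic data.
    import numpy as np

    rng = np.random.default_rng(0)

    def auc(scores, labels):
        """AUC as P(score of at-risk child < score of typical child),
        since lower ability logits should indicate risk."""
        pos = np.asarray(scores)[np.asarray(labels) == 1]
        neg = np.asarray(scores)[np.asarray(labels) == 0]
        return np.mean(pos[:, None] < neg[None, :])

    logits = np.r_[rng.normal(-1, 1, 50), rng.normal(1, 1, 200)]
    status = np.r_[np.ones(50), np.zeros(200)]      # 1 = at risk for delay
    print(f"AUC = {auc(logits, status):.2f}")
    ```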

  1. Analysis of temperature and stress distribution of superheater tubes after attemperation or sootblower activation

    International Nuclear Information System (INIS)

    Madejski, Paweł; Taler, Dawid

    2013-01-01

    Highlights: • CFD simulation was used to calculate 3D steam and tube wall temperature distributions in the platen superheater. • The CFD results can be used in the design of superheaters made of tubes with complex cross-sections. • The CFD analysis enables the proper selection of the steel grade. • The transient temperature and stress distributions were calculated using the Finite Volume Method. • The detailed analysis prevents excessive stresses in superheater tubes during sootblower or attemperator activation. - Abstract: Superheaters are characterized by high metal temperatures due to higher steam temperature and low heat transfer coefficients on the tube inner surfaces. Superheaters have especially difficult operating conditions, particularly during attemperator and sootblower activations, when temperature and steam flow rate as well as tube wall temperature change with time. A detailed thermo-mechanical analysis of the superheater tubes makes it possible to identify the cause of premature high-temperature failures and aids greatly in changing the tubing arrangement and improving start-up technology. This paper presents a thermal and strength analysis of a “double omega” tube used in the steam superheaters of CFB boilers
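
    A back-of-envelope companion to the paper's finite-volume analysis: the classical quasi-steady thermal stress in a wall with a linear through-wall temperature difference, sigma = E*alpha*dT/(2(1-nu)). The material constants are generic steel figures, not the "double omega" tube data.

    ```python
    # Quasi-steady through-wall thermal stress estimate; generic steel values.
    E, ALPHA, NU = 1.7e11, 1.7e-5, 0.3    # Pa, 1/K, Poisson ratio (assumed)

    def wall_thermal_stress(dT):
        """Max thermal stress for a linear through-wall temperature difference."""
        return E * ALPHA * dT / (2.0 * (1.0 - NU))

    for dT in (20.0, 50.0, 80.0):         # e.g. during attemperator activation
        print(f"dT = {dT:4.0f} K -> sigma = {wall_thermal_stress(dT) / 1e6:5.1f} MPa")
    ```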

  2. PANDA-view: An easy-to-use tool for statistical analysis and visualization of quantitative proteomics data.

    Science.gov (United States)

    Chang, Cheng; Xu, Kaikun; Guo, Chaoping; Wang, Jinxia; Yan, Qi; Zhang, Jian; He, Fuchu; Zhu, Yunping

    2018-05-22

    Compared with the numerous software tools developed for identification and quantification of -omics data, there remains a lack of suitable tools for both downstream analysis and data visualization. To help researchers better understand the biological meanings in their -omics data, we present an easy-to-use tool, named PANDA-view, for both statistical analysis and visualization of quantitative proteomics data and other -omics data. PANDA-view contains various kinds of analysis methods such as normalization, missing value imputation, statistical tests, clustering and principal component analysis, as well as the most commonly-used data visualization methods including an interactive volcano plot. Additionally, it provides user-friendly interfaces for protein-peptide-spectrum representation of the quantitative proteomics data. PANDA-view is freely available at https://sourceforge.net/projects/panda-view/. Contact: 1987ccpacer@163.com and zhuyunping@gmail.com. Supplementary data are available at Bioinformatics online.

  3. NOAA High-Resolution Sea Surface Temperature (SST) Analysis Products

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This archive covers two high resolution sea surface temperature (SST) analysis products developed using an optimum interpolation (OI) technique. The analyses have a...

  4. CytoSpectre: a tool for spectral analysis of oriented structures on cellular and subcellular levels.

    Science.gov (United States)

    Kartasalo, Kimmo; Pölönen, Risto-Pekka; Ojala, Marisa; Rasku, Jyrki; Lekkala, Jukka; Aalto-Setälä, Katriina; Kallio, Pasi

    2015-10-26

    Orientation and the degree of isotropy are important in many biological systems such as the sarcomeres of cardiomyocytes and other fibrillar structures of the cytoskeleton. Image based analysis of such structures is often limited to qualitative evaluation by human experts, hampering the throughput, repeatability and reliability of the analyses. Software tools are not readily available for this purpose and the existing methods typically rely at least partly on manual operation. We developed CytoSpectre, an automated tool based on spectral analysis, allowing the quantification of orientation and also size distributions of structures in microscopy images. CytoSpectre utilizes the Fourier transform to estimate the power spectrum of an image and based on the spectrum, computes parameter values describing, among others, the mean orientation, isotropy and size of target structures. The analysis can be further tuned to focus on targets of particular size at cellular or subcellular scales. The software can be operated via a graphical user interface without any programming expertise. We analyzed the performance of CytoSpectre by extensive simulations using artificial images, by benchmarking against FibrilTool and by comparisons with manual measurements performed for real images by a panel of human experts. The software was found to be tolerant against noise and blurring and superior to FibrilTool when analyzing realistic targets with degraded image quality. The analysis of real images indicated general good agreement between computational and manual results while also revealing notable expert-to-expert variation. Moreover, the experiment showed that CytoSpectre can handle images obtained of different cell types using different microscopy techniques. Finally, we studied the effect of mechanical stretching on cardiomyocytes to demonstrate the software in an actual experiment and observed changes in cellular orientation in response to stretching. CytoSpectre, a versatile, easy
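
    The core spectral idea is easy to sketch: estimate the dominant orientation of image structures from the angular distribution of the 2D power spectrum. This bare-bones stand-in omits CytoSpectre's spectral decomposition, size analysis and graphical interface; the test image is synthetic.

    ```python
    # Sketch of spectral orientation analysis via the 2D FFT power spectrum.
    import numpy as np

    def dominant_orientation(img):
        """Dominant structure orientation (deg, 0-180) from the power spectrum."""
        spec = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
        ny, nx = img.shape
        v, u = np.indices(spec.shape)
        u, v = u - nx // 2, v - ny // 2
        mask = (u**2 + v**2) > 4              # discard DC and lowest frequencies
        ang = np.arctan2(v[mask], u[mask])
        # power-weighted circular mean of the pi-periodic spectral angle
        mean2 = np.angle(np.sum(spec[mask] * np.exp(2j * ang)))
        # spectral energy lies perpendicular to the stripes, hence the 90 deg shift
        return (np.degrees(mean2 / 2) + 90.0) % 180.0

    y, x = np.mgrid[0:128, 0:128]
    stripes = np.sin(2 * np.pi * (x * np.cos(0.5) + y * np.sin(0.5)) / 8.0)
    print(f"estimated orientation: {dominant_orientation(stripes):.1f} deg")
    ```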

  5. Error analysis for mesospheric temperature profiling by absorptive occultation sensors

    Directory of Open Access Journals (Sweden)

    M. J. Rieder

    2001-01-01

    An error analysis for mesospheric profiles retrieved from absorptive occultation data has been performed, starting with realistic error assumptions as would apply to intensity data collected by available high-precision UV photodiode sensors. Propagation of statistical errors was investigated through the complete retrieval chain from measured intensity profiles to atmospheric density, pressure, and temperature profiles. We assumed unbiased errors, as the occultation method is essentially self-calibrating, and straight-line propagation of occulted signals, as we focus on heights of 50–100 km, where refractive bending of the sensed radiation is negligible. Throughout the analysis the errors were characterized at each retrieval step by their mean profile, their covariance matrix and their probability density function (pdf). This furnishes, compared to a variance-only estimation, a much improved insight into the error propagation mechanism. We applied the procedure to a baseline analysis of the performance of a recently proposed solar UV occultation sensor (SMAS – Sun Monitor and Atmospheric Sounder) and provide, using a reasonable exponential atmospheric model as background, results on error standard deviations and error correlation functions of density, pressure, and temperature profiles. Two different sensor photodiode assumptions are discussed: diamond diodes (DD) with 0.03% and silicon diodes (SD) with 0.1% (unattenuated) intensity measurement noise at 10 Hz sampling rate. A factor-of-2 margin was applied to these noise values in order to roughly account for unmodeled cross-section uncertainties. Within the entire height domain (50–100 km) we find temperature to be retrieved to better than 0.3 K (DD) / 1 K (SD) accuracy, respectively, at 2 km height resolution. The results indicate that absorptive occultations acquired by a SMAS-type sensor could provide mesospheric profiles of fundamental variables such as temperature with
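
    The backbone of such an analysis is linear propagation of a covariance matrix through each step of the retrieval chain, cov_out = J cov_in J^T. The sketch below runs a toy two-step chain with made-up Jacobians; the actual SMAS retrieval operators are not reproduced.

    ```python
    # Toy linear error propagation through a two-step retrieval chain.
    # Jacobians are illustrative stand-ins, not the occultation operators.
    import numpy as np

    def propagate(cov_in, jacobian):
        """Linear error propagation: cov_out = J cov_in J^T."""
        return jacobian @ cov_in @ jacobian.T

    n = 5
    cov_intensity = (3e-4) ** 2 * np.eye(n)   # ~0.03% intensity noise (DD case)
    J_density = np.tril(np.ones((n, n)))      # toy cumulative (Abel-like) operator
    J_temperature = 0.1 * np.eye(n)           # toy density-to-temperature map

    cov_T = propagate(propagate(cov_intensity, J_density), J_temperature)
    print("temperature error std devs:", np.sqrt(np.diag(cov_T)))
    ```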

  7. Overview of the Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, Marc; Bush, Brian; Penev, Michael

    2015-05-12

    This presentation provides an introduction to the Hydrogen Financial Analysis Scenario Tool (H2FAST) and includes an overview of each of the three versions of H2FAST: the Web tool, the Excel spreadsheet version, and the beta version of the H2FAST Business Case Scenario tool.

  8. Fast-NPS-A Markov Chain Monte Carlo-based analysis tool to obtain structural information from single-molecule FRET measurements

    Science.gov (United States)

    Eilert, Tobias; Beckers, Maximilian; Drechsler, Florian; Michaelis, Jens

    2017-10-01

    The analysis tool and software package Fast-NPS can be used to analyse smFRET data to obtain quantitative structural information about macromolecules in their natural environment. In the algorithm a Bayesian model gives rise to a multivariate probability distribution describing the uncertainty of the structure determination. Since Fast-NPS aims to be an easy-to-use general-purpose analysis tool for a large variety of smFRET networks, we established an MCMC based sampling engine that approximates the target distribution and requires no parameter specification by the user at all. For an efficient local exploration we automatically adapt the multivariate proposal kernel according to the shape of the target distribution. In order to handle multimodality, the sampler is equipped with a parallel tempering scheme that is fully adaptive with respect to temperature spacing and number of chains. Since the molecular surrounding of a dye molecule affects its spatial mobility and thus the smFRET efficiency, we introduce dye models which can be selected for every dye molecule individually. These models allow the user to represent the smFRET network in great detail leading to an increased localisation precision. Finally, a tool to validate the chosen model combination is provided. Programme Files doi:http://dx.doi.org/10.17632/7ztzj63r68.1 Licencing provisions: Apache-2.0 Programming language: GUI in MATLAB (The MathWorks) and the core sampling engine in C++ Nature of problem: Sampling of highly diverse multivariate probability distributions in order to solve for macromolecular structures from smFRET data. Solution method: MCMC algorithm with fully adaptive proposal kernel and parallel tempering scheme.
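
    A bare-bones parallel tempering Metropolis sampler illustrates the scheme Fast-NPS automates; its adaptive proposal kernel and automatic temperature spacing are omitted, and the bimodal target below is a stand-in for an smFRET posterior.

    ```python
    # Minimal parallel tempering sketch: tempered Metropolis chains plus
    # neighbour swaps. Target and temperature ladder are illustrative.
    import numpy as np

    rng = np.random.default_rng(1)
    logp = lambda x: np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

    betas = [1.0, 0.4, 0.1]                  # inverse temperatures of the chains
    x = np.zeros(len(betas))
    samples = []
    for step in range(20000):
        for i, b in enumerate(betas):        # Metropolis move within each chain
            prop = x[i] + rng.normal(0.0, 1.0)
            if np.log(rng.uniform()) < b * (logp(prop) - logp(x[i])):
                x[i] = prop
        i = rng.integers(0, len(betas) - 1)  # propose a neighbour swap
        if np.log(rng.uniform()) < (betas[i] - betas[i + 1]) * (logp(x[i + 1]) - logp(x[i])):
            x[i], x[i + 1] = x[i + 1], x[i]
        samples.append(x[0])                 # keep only the beta = 1 chain

    print("fraction of samples in the +3 mode:", np.mean(np.array(samples[2000:]) > 0))
    ```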

  9. Airports’ Operational Performance and Efficiency Evaluation Based on Multicriteria Decision Analysis (MCDA) and Data Envelopment Analysis (DEA) Tools

    Directory of Open Access Journals (Sweden)

    João Jardim

    2015-12-01

    Airport benchmarking depends on airports’ operational performance and efficiency indicators, which are important for business agents, operational managers, regulatory agencies, airlines and passengers. There are several sets of single and complex indicators to evaluate airports’ performance and efficiency, as well as several techniques to benchmark such infrastructures. The general aim of this work is twofold: to balance the data envelopment analysis (DEA) and multicriteria decision analysis (MCDA) tools, and to show that airport benchmarking is also possible using a multicriteria decision analysis tool called Measuring Attractiveness by a Categorical Based Evaluation Technique (MACBETH). Whilst DEA measures relative performance in the presence of multiple inputs and outputs, MCDA/MACBETH uses performance and efficiency indicators to support benchmark results, being useful for evaluating the real importance and weight of the selected indicators. The work is structured as follows: first, a state-of-the-art review concerning both airport benchmarking and performance indicators and the DEA and MCDA tool techniques; second, an overview of the impacts of emergent operational factors (sudden meteorological/natural phenomena) on airports’ operational performance and efficiency; third, two case studies on a set of worldwide airports and Madeira (FNC) Airport; and fourth, some insights into and challenges for future research that are still under development.

  10. Analysis of maizena drying system using temperature control based fuzzy logic method

    Science.gov (United States)

    Arief, Ulfah Mediaty; Nugroho, Fajar; Purbawanto, Sugeng; Setyaningsih, Dyah Nurani; Suryono

    2018-03-01

    Corn is one of the rice-substitute foods with good potential. Corn can be processed into maizena (corn starch), which can be used to make foods such as brownies, egg rolls, and other cookies. Maizena is generally obtained by a drying process carried out for 2-3 days under the sun. However, sun drying is not possible during the rainy season; instead, the drying can be done using an automatic drying tool. This study analyzes the design and manufacture of a maizena drying system with temperature control based on the fuzzy logic method. The results show that the drying system with set points of 40°C-60°C works in suitable condition. A water content of 15% (BSN) and a temperature of 50°C correspond to a good drying process. The time required to reach the 50°C set point is 7.05 minutes. The drying time for 500 g samples at 50°C and a power capacity of 127.6 W was 1 hour. Based on these results, drying using temperature control based on the fuzzy logic method can improve energy efficiency compared with the conventional method of drying under direct sunlight, whose temperature cannot be directly controlled, which makes the quality of the dried flour erratic.
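
    For concreteness, here is a minimal Mamdani-style fuzzy controller in the spirit described; the membership shapes, rule consequents and the 50°C set point below are illustrative assumptions, not the authors' rule base.

    ```python
    # Minimal fuzzy temperature controller sketch (illustrative rule base).
    def up(x, a, b):          # saturating ramp membership: 0 below a, 1 above b
        return max(0.0, min(1.0, (x - a) / (b - a)))

    def tri(x, a, b, c):      # triangular membership function
        return max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

    def heater_power(temp_c, setpoint=50.0):
        e = setpoint - temp_c                 # positive error means too cold
        cold, ok, hot = up(e, 2.0, 10.0), tri(e, -5.0, 0.0, 5.0), up(-e, 2.0, 10.0)
        # rules: cold -> high power, ok -> medium, hot -> off (weighted average)
        w = cold + ok + hot
        return 0.0 if w == 0.0 else (cold * 100.0 + ok * 40.0 + hot * 0.0) / w

    for t in (35.0, 48.0, 55.0):
        print(f"T = {t:.0f} degC -> heater at {heater_power(t):.0f}% power")
    ```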

  11. AQME: A forensic mitochondrial DNA analysis tool for next-generation sequencing data.

    Science.gov (United States)

    Sturk-Andreaggi, Kimberly; Peck, Michelle A; Boysen, Cecilie; Dekker, Patrick; McMahon, Timothy P; Marshall, Charla K

    2017-11-01

    The feasibility of generating mitochondrial DNA (mtDNA) data has expanded considerably with the advent of next-generation sequencing (NGS), specifically in the generation of entire mtDNA genome (mitogenome) sequences. However, the analysis of these data has emerged as the greatest challenge to implementation in forensics. To address this need, a custom toolkit for use in the CLC Genomics Workbench (QIAGEN, Hilden, Germany) was developed through a collaborative effort between the Armed Forces Medical Examiner System - Armed Forces DNA Identification Laboratory (AFMES-AFDIL) and QIAGEN Bioinformatics. The AFDIL-QIAGEN mtDNA Expert, or AQME, generates an editable mtDNA profile that employs forensic conventions and includes the interpretation range required for mtDNA data reporting. AQME also integrates an mtDNA haplogroup estimate into the analysis workflow, which provides the analyst with phylogenetic nomenclature guidance and a profile quality check without the use of an external tool. Supplemental AQME outputs such as nucleotide-per-position metrics, configurable export files, and an audit trail are produced to assist the analyst during review. AQME is applied to standard CLC outputs and thus can be incorporated into any mtDNA bioinformatics pipeline within CLC regardless of sample type, library preparation or NGS platform. An evaluation of AQME was performed to demonstrate its functionality and reliability for the analysis of mitogenome NGS data. The study analyzed Illumina mitogenome data from 21 samples (including associated controls) of varying quality and sample preparations with the AQME toolkit. A total of 211 tool edits were automatically applied to 130 of the 698 total variants reported in an effort to adhere to forensic nomenclature. Although additional manual edits were required for three samples, supplemental tools such as mtDNA haplogroup estimation assisted in identifying and guiding these necessary modifications to the AQME-generated profile. Along

  12. Analysis of the Requirements Generation Process for the Logistics Analysis and Wargame Support Tool

    Science.gov (United States)

    2017-06-01

    Excerpts from a June 2017 thesis by Jonathan M. Swan, "Requirements Generation Process for the Logistics Analysis and Wargame Support Tool." The recoverable fragments address impacts ranging from strategic logistics operations down to company-level energy demands, the force structure, and a requirement that the system determine the efficiency of the logistics network with respect to an estimated cost of fuel used for delivery.

  13. Development and Application of a Tool for Optimizing Composite Matrix Viscoplastic Material Parameters

    Science.gov (United States)

    Murthy, Pappu L. N.; Naghipour Ghezeljeh, Paria; Bednarcyk, Brett A.

    2018-01-01

    This document describes a recently developed analysis tool that enhances the resident capabilities of the Micromechanics Analysis Code with the Generalized Method of Cells (MAC/GMC) and its application. MAC/GMC is a composite material and laminate analysis software package developed at NASA Glenn Research Center. The primary focus of the current effort is to provide a graphical user interface (GUI) capability that helps users optimize highly nonlinear viscoplastic constitutive law parameters by fitting experimentally observed/measured stress-strain responses under various thermo-mechanical conditions for braided composites. The tool has been developed utilizing the MATrix LABoratory (MATLAB) (The Mathworks, Inc., Natick, MA) programming language. The illustrative examples shown are for a specific braided composite system wherein the matrix viscoplastic behavior is represented by a constitutive law described by seven parameters. The tool is general enough to fit any number of experimentally observed stress-strain responses of the material. The number of parameters to be optimized, as well as the importance given to each stress-strain response, are the user's choice. Three different optimization algorithms are included: (1) optimization based on a gradient method, (2) genetic algorithm (GA) based optimization and (3) particle swarm optimization (PSO). The user can mix and match the three algorithms. For example, one can start optimization with either 2 or 3 and then use the optimized solution to further fine-tune with approach 1. The secondary focus of this paper is to demonstrate the application of this tool to optimize/calibrate parameters for a nonlinear viscoplastic matrix to predict stress-strain curves (for constituent and composite levels) at different rates, temperatures and/or loading conditions utilizing the Generalized Method of Cells. After preliminary validation of the tool through comparison with experimental results, a detailed virtual parametric study is

  14. Inverse analysis of non-uniform temperature distributions using multispectral pyrometry

    Science.gov (United States)

    Fu, Tairan; Duan, Minghao; Tian, Jibin; Shi, Congling

    2016-05-01

    Optical diagnostics can be used to obtain sub-pixel temperature information in remote sensing. A multispectral pyrometry method was developed using multiple spectral radiation intensities to deduce the temperature area distribution in the measurement region. The method transforms a spot multispectral pyrometer with a fixed field of view into a pyrometer with enhanced spatial resolution that can give sub-pixel temperature information from a "one pixel" measurement region. A temperature area fraction function was defined to represent the spatial temperature distribution in the measurement region. The method is illustrated by simulations of a multispectral pyrometer with a spectral range of 8.0-13.0 μm measuring a non-isothermal region with a temperature range of 500-800 K in the spot pyrometer field of view. The inverse algorithm for the sub-pixel temperature distribution (temperature area fractions) in the "one pixel" verifies this multispectral pyrometry method. The results show that an improved Levenberg-Marquardt algorithm is effective for this ill-posed inverse problem with relative errors in the temperature area fractions of (-3%, 3%) for most of the temperatures. The analysis provides a valuable reference for the use of spot multispectral pyrometers for sub-pixel temperature distributions in remote sensing measurements.
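
    To make the inverse problem concrete: the sketch below recovers temperature area fractions from synthetic multi-channel radiances by non-negative least squares on a temperature grid, a simpler linear stand-in for the paper's improved Levenberg-Marquardt scheme. Planck's law is exact; the band positions, temperature grid and unit emissivity are assumptions.

    ```python
    # Sub-pixel temperature area fractions from multispectral radiances,
    # via NNLS on a temperature grid (a stand-in for the paper's method).
    import numpy as np
    from scipy.optimize import nnls

    H, C, KB = 6.626e-34, 2.998e8, 1.381e-23

    def planck(lam, T):            # spectral radiance, W m^-3 sr^-1
        return 2 * H * C**2 / lam**5 / np.expm1(H * C / (lam * KB * T))

    lam = np.linspace(8e-6, 13e-6, 12)        # pyrometer channels (assumed)
    t_grid = np.arange(450.0, 850.0, 25.0)    # candidate temperatures, K

    # synthetic "one pixel" measurement: 30% of the area at 520 K, 70% at 780 K
    y = 0.3 * planck(lam, 520.0) + 0.7 * planck(lam, 780.0)

    A = np.column_stack([planck(lam, T) for T in t_grid])
    frac, _ = nnls(A, y)                      # non-negative area fractions
    for T, f in zip(t_grid, frac):
        if f > 0.01:
            print(f"{T:.0f} K: area fraction {f:.2f}")
    ```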

  15. Forecasting municipal solid waste generation using prognostic tools and regression analysis.

    Science.gov (United States)

    Ghinea, Cristina; Drăgoi, Elena Niculina; Comăniţă, Elena-Diana; Gavrilescu, Marius; Câmpean, Teofil; Curteanu, Silvia; Gavrilescu, Maria

    2016-11-01

    For adequate planning of waste management systems, accurate forecasting of waste generation is an essential step, since various factors can affect waste trends. Predictive and prognostic models are useful tools that provide reliable support for decision-making processes. In this paper some indicators such as number of residents, population age, urban life expectancy, and total municipal solid waste were used as input variables in prognostic models in order to predict the amount of solid waste fractions. We applied the Waste Prognostic Tool, regression analysis and time series analysis to forecast municipal solid waste generation and composition, considering the Iasi, Romania case study. Regression equations were determined for six solid waste fractions (paper, plastic, metal, glass, biodegradable and other waste). Accuracy measures were calculated and the results showed that the S-curve trend model is the most suitable for municipal solid waste (MSW) prediction. Copyright © 2016 Elsevier Ltd. All rights reserved.
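
    The S-curve trend model the authors favour can be illustrated by fitting a logistic curve to annual totals; the synthetic figures below are placeholders, not the Iasi data.

    ```python
    # Logistic (S-curve) trend fit to synthetic annual waste totals.
    import numpy as np
    from scipy.optimize import curve_fit

    def s_curve(t, K, r, t0):      # logistic trend: K / (1 + exp(-r (t - t0)))
        return K / (1.0 + np.exp(-r * (t - t0)))

    rng = np.random.default_rng(3)
    years = np.arange(2000, 2015, dtype=float)
    waste = s_curve(years, 260.0, 0.35, 2006.0) + rng.normal(0.0, 4.0, years.size)

    popt, _ = curve_fit(s_curve, years, waste, p0=(300.0, 0.3, 2005.0))
    print("fitted K, r, t0:", popt)
    print(f"forecast for 2020: {s_curve(2020.0, *popt):.0f} kt/yr")
    ```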

  16. Structural analysis for elevated temperature design of the LMFBR

    International Nuclear Information System (INIS)

    Griffin, D.S.

    1976-02-01

    In the structural design of LMFBR components for elevated temperature service it is necessary to take account of the time-dependent, creep behavior of materials. The accommodation of creep to assure design reliability has required (1) development of new design limits and criteria, (2) development of more detailed representations of material behavior, and (3) application of the most advanced analysis techniques. These developments are summarized and examples are given to illustrate the current state of technology in elevated temperature design

  17. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    Science.gov (United States)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  18. Nuclear Tools For Oilfield Logging-While-Drilling Applications

    International Nuclear Information System (INIS)

    Reijonen, Jani

    2011-01-01

    Schlumberger is an international oilfield service company with nearly 80,000 employees of 140 nationalities, operating globally in 80 countries. As a market leader in oilfield services, Schlumberger has developed a suite of technologies to assess the downhole environment, including, among others, electromagnetic, seismic, chemical, and nuclear measurements. In the past 10 years there has been a radical shift in the oilfield service industry from traditional wireline measurements to logging-while-drilling (LWD) analysis. For LWD measurements, the analysis is performed and the instruments are operated while the borehole is being drilled. The high temperature, high shock, and extreme vibration environment of LWD imposes stringent requirements for the devices used in these applications. This has a significant impact on the design of the components and subcomponents of a downhole tool. Another significant change in the past few years for nuclear-based oilwell logging tools is the desire to replace the sealed radioisotope sources with active, electronic ones. These active radiation sources provide great benefits compared to the isotopic sources, ranging from handling and safety to nonproliferation and well contamination issues. The challenge is to develop electronic generators that have a high degree of reliability for the entire lifetime of a downhole tool. LWD tool testing and operations are highlighted with particular emphasis on electronic radiation sources and nuclear detectors for the downhole environment.

  19. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    Magill, J.; Dreher, R.; Soti, Z.

    2014-01-01

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources, and some applications is given.

  20. Analysis of Low-Temperature Utilization of Geothermal Resources

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Brian

    2015-06-30

    Full realization of the potential of what might be considered “low-grade” geothermal resources will require that we examine many more uses for the heat than traditional electricity generation. To demonstrate that geothermal energy truly has the potential to be a national energy source we will be designing, assessing, and evaluating innovative uses for geothermal-produced water such as hybrid biomass-geothermal cogeneration of electricity and district heating and efficiency improvements to the use of cellulosic biomass in addition to utilization of geothermal in district heating for community redevelopment projects. The objectives of this project were: 1) to perform a techno-economic analysis of the integration and utilization potential of low-temperature geothermal sources. Innovative uses of low-enthalpy geothermal water were designed and examined for their ability to offset fossil fuels and decrease CO2 emissions. 2) To perform process optimizations and economic analyses of processes that can utilize low-temperature geothermal fluids. These processes included electricity generation using biomass and district heating systems. 3) To scale up and generalize the results of three case study locations to develop a regionalized model of the utilization of low-temperature geothermal resources. A national-level, GIS-based, low-temperature geothermal resource supply model was developed and used to develop a series of national supply curves. We performed an in-depth analysis of the low-temperature geothermal resources that dominate the eastern half of the United States. The final products of this study include 17 publications, an updated version of the cost estimation software GEOPHIRES, and direct-use supply curves for low-temperature utilization of geothermal resources. The supply curves for direct use geothermal include utilization from known hydrothermal, undiscovered hydrothermal, and near-hydrothermal EGS resources and presented these results at the Stanford

  1. Hotspot temperature calculation and quench analysis on ITER busbar

    International Nuclear Information System (INIS)

    Rong, J.; Huang, X.Y.; Song, Y.T.; Wu, S.T.

    2014-01-01

    Highlights: • The hotspot temperature is calculated for different amounts of extra copper. • The MQE (minimum quench energy) is used as the external heating to trigger a quench in the busbar. • The temperature evolution after quench is analyzed with the Gandalf code for different amounts of extra copper and for the no-helium case. • The normal length is calculated for different amounts of extra copper with the Gandalf code. - Abstract: This paper describes the analysis of the ITER feeder busbar. The hotspot temperature of the busbar is calculated by the classical method for the cases of 0%, 50%, 75% and 100% extra copper (copper strands). The quench behavior of the busbar is simulated with the 1-D Gandalf code, and the MQE (minimum quench energy), estimated by the classical method, is applied as the initial external heat in the Gandalf input file. The temperature and the normal length of the conductor are analyzed for the cases of 0%, 50% and 100% extra copper and no helium. By contrasting the hotspot temperature, conductor temperature and normal length across the extra copper cases, it is shown that the extra copper plays an important role in quench protection
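
    The classical adiabatic hot-spot method is easy to sketch: all Joule heat stays in the copper, so dT/dt = rho_e(T) j(t)^2 / C(T) is integrated over the detection delay plus the current dump. The crude copper property fits and the cable numbers below (current, copper area, delays) are illustrative only, not ITER design values.

    ```python
    # Adiabatic hot-spot sketch with crude copper property fits (assumptions).
    import math

    def rho_cu(T):   # ohm*m; crude fit: residual term plus linear phonon term
        return 2.0e-10 + 1.7e-8 * max(T - 50.0, 0.0) / 250.0

    def c_cu(T):     # J/(m^3 K); crude Debye-like volumetric heat capacity fit
        return 3.4e6 / (1.0 + (170.0 / T) ** 3)

    I0, A_CU = 68.0e3, 1.0e-3      # operating current (A), copper area (m^2)
    T_DETECT, TAU = 2.0, 12.0      # detection delay (s), dump time constant (s)

    T, t, dt = 5.0, 0.0, 1e-4
    while t < T_DETECT + 5 * TAU:
        I = I0 if t < T_DETECT else I0 * math.exp(-(t - T_DETECT) / TAU)
        j = I / A_CU
        T += rho_cu(T) * j * j / c_cu(T) * dt
        t += dt

    print(f"adiabatic hot-spot temperature: {T:.0f} K")
    ```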

  2. Forensic analysis of video steganography tools

    Directory of Open Access Journals (Sweden)

    Thomas Sloan

    2015-05-01

    Steganography is the art and science of concealing information in such a way that only the sender and intended recipient of a message should be aware of its presence. Digital steganography has been used in the past on a variety of media including executable files, audio, text, games and, notably, images. Additionally, there is increasing research interest in the use of video as a medium for steganography, due to its pervasive nature and diverse embedding capabilities. In this work, we examine the embedding algorithms and other security characteristics of several video steganography tools. We show that all of them feature basic and severe security weaknesses. This is potentially a very serious threat to the security, privacy and anonymity of their users. It is important to highlight that most steganography users have perfectly legal and ethical reasons to employ it. Some common scenarios would include citizens in oppressive regimes whose freedom of speech is compromised, people trying to avoid massive surveillance or censorship, political activists, whistle blowers, journalists, etc. As a result of our findings, we strongly recommend ceasing any use of these tools and removing any contents that may have been hidden, as well as any carriers stored, exchanged and/or uploaded online. For many of these tools, carrier files will be trivial to detect, potentially compromising any hidden data and the parties involved in the communication. We finish this work by presenting our steganalytic results, which highlight a very poor current state of the art in practical video steganography tools. There is unfortunately a complete lack of secure and publicly available tools, and even commercial tools offer very poor security. We therefore encourage the steganography community to work towards the development of more secure and accessible video steganography tools, and to make them available to the general public. The results presented in this work can also be seen as a useful

  3. Digital Elevation Profile: A Complex Tool for the Spatial Analysis of Hiking

    Directory of Open Access Journals (Sweden)

    Laura TÎRLĂ

    2014-11-01

    One of the current roles of mountain geomorphology is to provide information for tourism purposes, such as the spatial analysis of hiking trails. Geomorphic tools are therefore indispensable for terrain analyses. The elevation profile is one of the most adequate tools for assessing the morphometric patterns of hiking trails. In this study we tested several applications in order to manage raw data, create profile graphs and obtain the morphometric parameters of five hiking trails in the Căpățânii Mountains (South Carpathians, Romania). Different data complexity was explored: distance, elevation, cumulative gain or loss, slope, etc. Furthermore, a comparative morphometric analysis was performed in order to emphasize the multiple possibilities provided by the elevation profile. Results show that GPS Visualizer, Geocontext and, in some manner, Google Earth are the most adequate applications, providing high-quality elevation profiles and detailed data, with multiple additional functions according to the user's needs. The applied tools and techniques are very useful for mountain route planning, elaborating mountain guides, enhancing knowledge about specific trails or routes, or assessing the landscape and tourism value of a mountain area.
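
    The basic quantities such profile tools report are straightforward to compute from a track; the sketch below derives length, cumulative gain/loss and maximum grade from synthetic distance-elevation data.

    ```python
    # Elevation-profile metrics from a (synthetic) distance-elevation track.
    import numpy as np

    def profile_stats(dist_km, elev_m):
        d = np.diff(np.asarray(elev_m, dtype=float))
        return {
            "length_km": float(dist_km[-1] - dist_km[0]),
            "gain_m": float(d[d > 0].sum()),
            "loss_m": float(-d[d < 0].sum()),
            "max_grade_pct": float(np.max(np.abs(d) / (np.diff(dist_km) * 1000)) * 100),
        }

    dist = np.linspace(0, 12, 200)                       # km along the trail
    elev = 800 + 400 * np.sin(dist / 2) + 50 * np.sin(3 * dist)
    print(profile_stats(dist, elev))
    ```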

  4. Fuel temperature analysis method for channel-blockage accident in HTTR

    International Nuclear Information System (INIS)

    Maruyama, So; Fujimoto, Nozomu; Sudo, Yukio; Kiso, Yoshihiro; Hayakawa, Hitoshi

    1994-01-01

    During operation of the High Temperature Engineering Test Reactor (HTTR), coolability must be maintained without core damage under all postulated accident conditions. Channel blockage of a fuel element was selected as one of the design-basis accidents in the safety evaluation of the reactor. The maximum fuel temperature for such a scenario has been evaluated in the safety analysis and is compared to the core damage limits. For the design of the HTTR, an in-core thermal and hydraulic analysis code, FLOWNET/TRUMP, was developed. This code calculates the fuel temperature distribution, not only for a channel blockage accident but also for transient conditions. The validation of the FLOWNET/TRUMP code was made by comparison of the analytical results with the results of thermal and hydraulic tests with the Helium Engineering Demonstration Loop (HENDEL) multi-channel test rig (T1-M), which simulated one fuel column in the core. The analytical results agreed well with the experiments in which the HTTR operating conditions were simulated. The maximum fuel temperature during a channel blockage accident is 1653 °C. Therefore, it is confirmed that the integrity of the core is maintained during a channel blockage accident. ((orig.))

  5. Energy and exergy analysis of low temperature district heating network

    International Nuclear Information System (INIS)

    Li, Hongwei; Svendsen, Svend

    2012-01-01

    Low temperature district heating with reduced network supply and return temperatures provides a better match between the low-quality building heating demand and the low-quality heat supply from waste heat or renewable energy. In this paper, a hypothetical low temperature district heating network is designed to supply heating for 30 low-energy detached residential houses. The network operational supply/return temperatures are set at 55 °C/25 °C, in line with a pilot project carried out in Denmark. Two types of in-house substations are analyzed to supply the consumer domestic hot water demand. The space heating demand is supplied through floor heating in the bathroom and low temperature radiators in the rest of the rooms. The network thermal and hydraulic conditions are simulated under steady state. A district heating network design and simulation code is developed to incorporate the network optimization procedure and the network simultaneity factor. Through the simulation, the overall system energy and exergy efficiencies are calculated and the exergy losses of the major district heating system components are identified. Based on the results, suggestions are given to further reduce the system energy/exergy losses and increase the quality match between the consumer heating demand and the district heating supply. -- Highlights: ► Exergy and energy analysis for low and medium temperature district heating systems. ► Different district heating network dimensioning methods are analyzed. ► Major exergy losses are identified in the district heating network and the in-house substations. ► Advantages of applying low temperature district heating are highlighted through exergy analysis. ► The influence of thermal by-pass on system exergy/energy performance is analyzed.

  6. REGRESSION ANALYSIS OF SEA-SURFACE-TEMPERATURE PATTERNS FOR THE NORTH PACIFIC OCEAN.

    Science.gov (United States)

    SEA WATER, *SURFACE TEMPERATURE, *OCEANOGRAPHIC DATA, PACIFIC OCEAN, REGRESSION ANALYSIS, STATISTICAL ANALYSIS, UNDERWATER EQUIPMENT, DETECTION, UNDERWATER COMMUNICATIONS, DISTRIBUTION, THERMAL PROPERTIES, COMPUTERS.

  7. UAV: Warnings From Multiple Automated Static Analysis Tools At A Glance

    NARCIS (Netherlands)

    Buckers, T.B.; Cao, C.S.; Doesburg, M.S.; Gong, Boning; Wang, Sunwei; Beller, M.M.; Zaidman, A.E.; Pinzger, Martin; Bavota, Gabriele; Marcus, Andrian

    2017-01-01

    Automated Static Analysis Tools (ASATs) are an integral part of today’s software quality assurance practices. At present, a plethora of ASATs exist, each with different strengths. However, there is little guidance for developers on which of these ASATs to choose and combine for a project. As a

  8. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology.

    Science.gov (United States)

    Cock, Peter J A; Grüning, Björn A; Paszkiewicz, Konrad; Pritchard, Leighton

    2013-01-01

    The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of "effector" proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen's predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu).

  9. Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT)

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Silva, Claudio [New York Univ. (NYU), NY (United States). Computer Science and Engineering Dept.

    2013-09-30

    For the past three years, a large analysis and visualization effort—funded by the Department of Energy’s Office of Biological and Environmental Research (BER), the National Aeronautics and Space Administration (NASA), and the National Oceanic and Atmospheric Administration (NOAA)—has brought together a wide variety of industry-standard scientific computing libraries and applications to create Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) to serve the global climate simulation and observational research communities. To support interactive analysis and visualization, all components connect through a provenance application–programming interface to capture meaningful history and workflow. Components can be loosely coupled into the framework for fast integration or tightly coupled for greater system functionality and communication with other components. The overarching goal of UV-CDAT is to provide a new paradigm for access to and analysis of massive, distributed scientific data collections by leveraging distributed data architectures located throughout the world. The UV-CDAT framework addresses challenges in analysis and visualization and incorporates new opportunities, including parallelism for better efficiency, higher speed, and more accurate scientific inferences. Today, it provides more than 600 users access to more analysis and visualization products than any other single source.

  10. Schema for the LANL infrasound analysis tool, infrapy

    Energy Technology Data Exchange (ETDEWEB)

    Dannemann, Fransiska Kate [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Marcillo, Omar Eduardo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-04-14

    The purpose of this document is to define the schema used for the operation of the infrasound analysis tool, infrapy. The tables described by this document extend the CSS3.0 or KB core schema to include information required for the operation of infrapy. This document is divided into three sections, the first being this introduction. Section two defines eight new, infrasonic data processing-specific database tables. Both internal (ORACLE) and external formats for the attributes are defined, along with a short description of each attribute. Section three of the document shows the relationships between the different tables by using entity-relationship diagrams.

  11. Analytical studies on hotspot temperature of cable-in-conduit conductors

    International Nuclear Information System (INIS)

    Yoshida, Kiyoshi; Takigami, Hiroyoshi; Kubo, Hiroatsu

    2001-01-01

    This paper describes an analytical study to review the hotspot temperature design criteria of the cable-in-conduit conductors for the ITER magnet system. The ITER magnet system uses three kinds of cable-in-conduit conductors for the Toroidal Field (TF) coils, the Central Solenoid (CS) and the Poloidal Field (PF) coils. The amount of copper in the superconducting cable has been defined using the classical hotspot temperature design criterion, which is based on the adiabatic condition. In the current design, ITER superconducting cables therefore include a large number of pure copper strands to satisfy the classical criterion. However, the temperature and stress in the conduit and insulation after a quench can now be simulated with the latest analysis tools, namely a quench simulation program and a stress analysis program. This analysis shows that the strand temperature is dominated by conduction along the strands and by the heat capacity of the other conductor materials and the coolant. The hotspot temperature depends strongly on the delay time for quench detection, and the analysis provides an estimate of acceptable detection delay times. The thermal and stress analysis can provide the maximum allowable temperature after a quench by determining the failure or functional-disorder condition of the jacket material and turn insulation. In conclusion, it is found that the current density of the cable space can be increased by reducing the extra copper strands, thereby allowing a reduction of the coil radial build. (author)
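
    For reference, the classical adiabatic hot-spot criterion mentioned above balances Joule heating in the copper against the heat absorbed by the conductor; in one common schematic form (notation assumed here, not taken from the paper),

    $$\int_{0}^{t_{q}} J_{Cu}^{2}(t)\,dt \;=\; \int_{T_{0}}^{T_{max}} \frac{\gamma\,C_{p}(T)}{\rho_{Cu}(T)}\,dT$$

    where J_Cu is the current density in the copper stabilizer, γC_p the volumetric heat capacity, ρ_Cu the copper resistivity, and t_q the time to current dump, which includes the quench-detection delay discussed above.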

  12. miRanalyzer: a microRNA detection and analysis tool for next-generation sequencing experiments.

    Science.gov (United States)

    Hackenberg, Michael; Sturm, Martin; Langenberger, David; Falcón-Pérez, Juan Manuel; Aransay, Ana M

    2009-07-01

    Next-generation sequencing now allows the sequencing of small RNA molecules and the estimation of their expression levels. Consequently, there is a high demand for bioinformatics tools to cope with the several gigabytes of sequence data generated in each single deep-sequencing experiment. Against this background, we developed miRanalyzer, a web server tool for the analysis of deep-sequencing experiments on small RNAs. The web server tool requires a simple input file containing a list of unique reads and their copy numbers (expression levels). Using these data, miRanalyzer (i) detects all known microRNA sequences annotated in miRBase, (ii) finds all perfect matches against other libraries of transcribed sequences and (iii) predicts new microRNAs. The prediction of new microRNAs is an especially important point, as there are many species with very few known microRNAs. Therefore, we implemented a highly accurate machine learning algorithm for the prediction of new microRNAs that reaches AUC values of 97.9% and recall values of up to 75% on unseen data. The web tool summarizes all the described steps in a single output page, which provides a comprehensive overview of the analysis, adding links to more detailed output pages for each analysis module. miRanalyzer is available at http://web.bioinformatics.cicbiogune.es/microRNA/.
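
    As an illustration of the expected input, the sketch below (a hypothetical pre-processing helper, not part of miRanalyzer itself) collapses a FASTQ file into the list of unique reads with copy numbers that the web server consumes:

```python
# Hypothetical pre-processing helper (not part of miRanalyzer): collapse a
# FASTQ file into the "unique read + copy number" list the web server expects.
from collections import Counter

def collapse_reads(fastq_path):
    counts = Counter()
    with open(fastq_path) as fh:
        for i, line in enumerate(fh):
            if i % 4 == 1:                      # FASTQ stores the sequence on line 2 of 4
                counts[line.strip()] += 1
    return counts

if __name__ == "__main__":
    for seq, n in sorted(collapse_reads("reads.fastq").items(),
                         key=lambda kv: kv[1], reverse=True):
        print(f"{seq}\t{n}")                    # one unique read and its expression level
```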

  13. Urban temperature analysis and impact on the building cooling energy performances: an Italian case study

    Directory of Open Access Journals (Sweden)

    Michele Zinzi

    2016-06-01

    Climate change and urban sprawl are dramatically increasing the heat island effect in urban environments of every size and latitude, although both parameters affect the intensity of the effect. The urban heat island is a phenomenon observed since the last decades of the XIX century, but demonstrated at large scale only one century later; it is characterised by an increase of air temperature in densely built urban environments with respect to the countryside surrounding the cities. Many studies are available, showing urban heat island intensities of up to 12°C. This thermal stress causes social, health and environmental hazards, with major consequences for weaker social groups such as elderly and low-income people; surveys have indeed demonstrated increased mortality in these categories during intense and extended heat waves. This study presents the first results from air temperature observations at different spots in Rome, a city characterised by a typical Mediterranean climate and by a complex urban texture, in which densely built areas are separated by relatively green or unbuilt zones. Six spots have been monitored since June 2014, including the historical city centre, semi-central zones with different construction typologies, and surrounding areas again with various urban and building designs. The paper focuses on the analysis of summer temperature profiles, their increase with respect to temperatures outside the city, and the impact on the cooling performance of buildings. The temperature datasets and a reference building model were inputted into the well-known and calibrated dynamic tool TRNSYS. The cooling net energy demand of the reference building was calculated, as well as the operative temperature evolution in the non-cooled building configuration. The results of the calculation allow a comparison of the energy and thermal performances in the urban environment against the reference conditions usually adopted by building codes. Advice and

  14. Creative design-by-analysis solutions applied to high-temperature components

    International Nuclear Information System (INIS)

    Dhalla, A.K.

    1993-01-01

    Elevated temperature design has evolved over the last two decades from the design-by-formula philosophy of the ASME Boiler and Pressure Vessel Code, Sections I and VIII (Division 1), to the design-by-analysis philosophy of Section III, Code Case N-47. The benefits of design-by-analysis procedures, which were developed under a US-DOE-sponsored high-temperature structural design (HTSD) program, are illustrated in the paper through five design examples taken from two U.S. liquid metal reactor (LMR) plants. Emphasis in the paper is placed upon the use of a detailed, nonlinear finite element analysis method to understand the structural response and to suggest design optimization so as to comply with Code Case N-47 criteria. A detailed analysis is cost-effective, if selectively used, to qualify an LMR component for service when long-lead-time structural forgings, procured based upon simplified preliminary analysis, do not meet the design criteria, or the operational loads are increased after the components have been fabricated. In the future, the overall costs of a detailed analysis will be reduced even further with the availability of finite element software used on workstations or PCs.

  15. Thermodynamic analysis of a new design of temperature controlled parabolic trough collector

    International Nuclear Information System (INIS)

    Ceylan, İlhan; Ergun, Alper

    2013-01-01

    Highlights: • The new parabolic trough collector is designed for temperature control. • The TCPTC system is well suited to industrial processes that require high temperatures. • The TCPTC can provide hot water even under low solar radiation. • The TCPTC costs less than other systems (thermosiphon systems, pumped systems, etc.). - Abstract: Numerous types of solar water heater are used throughout the world. These heaters can be classified into two groups, pumped systems and thermosiphon systems; however, the water temperature cannot be controlled by these systems. In this study, a new temperature-controlled parabolic trough collector (TCPTC) was designed and analyzed experimentally. The analysis was made over a temperature range of 40–100 °C, at intervals of 10 °C. A detailed analysis was performed by calculating energy efficiencies, exergy efficiencies, water temperatures and water amounts. The highest energy efficiency of the TCPTC was calculated as 61.2% for the 100 °C set temperature; as the set temperature increased, the energy efficiency increased as well. The highest exergy efficiency was calculated as 63% for 70 °C. The exergy efficiency, however, did not increase monotonically with the set temperature; the optimum exergy efficiency was obtained at 70 °C.
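
    For orientation, one common set of definitions for the reported quantities (assumed here; the paper may use variants) is

    $$\eta = \frac{\dot m\,c_{p}\,(T_{out}-T_{in})}{A\,G}, \qquad \psi = \frac{\dot m\left[(h_{out}-h_{in}) - T_{a}\,(s_{out}-s_{in})\right]}{A\,G\,\left(1 - T_{a}/T_{s}\right)}$$

    where G is the solar irradiance on aperture area A, T_a the ambient temperature and T_s an apparent sun temperature. Because the exergy of the delivered hot water grows with temperature while thermal losses also grow, ψ can peak at an intermediate set temperature, consistent with the 70 °C optimum reported.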

  16. Systematic review and meta-analysis: tools for the information age.

    Science.gov (United States)

    Weatherall, Mark

    2017-11-01

    The amount of available biomedical information is vast and growing. Natural limitations of the way clinicians and researchers approach this treasure trove of information include difficulty locating the information and, once it is located, cognitive biases that may lead to its inappropriate use. Systematic reviews and meta-analyses represent important tools in the information age to improve knowledge and action. Systematic reviews represent a census approach to identifying literature, so as to avoid non-response bias; they are a necessary prelude to producing combined quantitative summaries of associations or treatment effects. Meta-analysis comprises the arithmetical techniques for producing combined summaries from individual study reports. Careful, thoughtful and rigorous use of these tools is likely to enhance knowledge and action. Use of standard guidelines, such as the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, or embedding these activities within collaborative groups such as the Cochrane Collaboration, is likely to lead to more useful systematic review and meta-analysis reporting.
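
    Concretely, the simplest of these arithmetical techniques is the fixed-effect inverse-variance combination (a standard textbook form, not specific to this article):

    $$\hat\theta \;=\; \frac{\sum_i w_i\,\hat\theta_i}{\sum_i w_i}, \qquad w_i = \frac{1}{v_i}, \qquad \operatorname{Var}\big(\hat\theta\big) = \frac{1}{\sum_i w_i}$$

    where θ̂_i and v_i are the effect estimate and its variance from study i; random-effects variants inflate each v_i with a between-study heterogeneity component.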

  17. A Modeling approach for analysis and improvement of spindle-holder-tool assembly dynamics

    OpenAIRE

    Budak, Erhan; Ertürk, A.; Özgüven, H. N.

    2006-01-01

    The most important information required for chatter stability analysis is the dynamics of the involved structures, i.e. the frequency response functions (FRFs) which are usually determined experimentally. In this study, the tool point FRF of a spindle-holder-tool assembly is analytically determined by using the receptance coupling and structural modification techniques. Timoshenko’s beam model is used for increased accuracy. The spindle is also modeled analytically with elastic supports repre...
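
    In schematic receptance-coupling form (notation assumed here, not quoted from the paper), the tool-point FRF of substructure A (the tool) coupled to substructure B (the spindle-holder) through a complex interface stiffness matrix K is

    $$H_{tool} \;=\; H^{A}_{11} \;-\; H^{A}_{12}\left[\,H^{A}_{22} + H^{B}_{11} + K^{-1}\right]^{-1} H^{A}_{21}$$

    so that measured or analytically modelled substructure receptances can be combined to predict the assembly's tool-point FRF without testing every tool/holder/spindle combination.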

  18. System Evaluation and Life-Cycle Cost Analysis of a Commercial-Scale High-Temperature Electrolysis Hydrogen Production Plant

    Energy Technology Data Exchange (ETDEWEB)

    Edwin A. Harvego; James E. O'Brien; Michael G. McKellar

    2012-11-01

    Results of a system evaluation and lifecycle cost analysis are presented for a commercial-scale high-temperature electrolysis (HTE) central hydrogen production plant. The plant design relies on grid electricity to power the electrolysis process and system components, and industrial natural gas to provide process heat. The HYSYS process analysis software was used to evaluate the reference central plant design capable of producing 50,000 kg/day of hydrogen. The HYSYS software performs mass and energy balances across all components to allow optimization of the design using a detailed process flow sheet and realistic operating conditions specified by the analyst. The lifecycle cost analysis was performed using the H2A analysis methodology developed by the Department of Energy (DOE) Hydrogen Program. This methodology utilizes Microsoft Excel spreadsheet analysis tools that require detailed plant performance information (obtained from HYSYS), along with financial and cost information to calculate lifecycle costs. The results of the lifecycle analyses indicate that for a 10% internal rate of return, a large central commercial-scale hydrogen production plant can produce 50,000 kg/day of hydrogen at an average cost of $2.68/kg. When the cost of carbon sequestration is taken into account, the average cost of hydrogen production increases by $0.40/kg to $3.08/kg.
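
    The levelized-cost logic behind such an analysis can be condensed into a toy discounted-cash-flow sketch (all numbers below are illustrative assumptions; the report's $2.68/kg result comes from the detailed H2A spreadsheets and HYSYS performance data):

```python
# Toy levelized-cost sketch in the spirit of an H2A-style discounted cash flow.
# capex, opex and plant size below are illustrative, not the report's inputs.
def levelized_cost(capex, opex_per_year, kg_per_day, rate=0.10, years=20):
    """Return $/kg such that discounted production covers discounted costs at `rate` IRR."""
    pv_costs = capex + sum(opex_per_year / (1 + rate) ** y for y in range(1, years + 1))
    pv_kg = sum(kg_per_day * 365 / (1 + rate) ** y for y in range(1, years + 1))
    return pv_costs / pv_kg

print(f"${levelized_cost(6.0e8, 7.5e7, 50_000):.2f}/kg")
```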

  19. Influence of minimum quantity of lubricant (MQL) on tool life of carbide cutting tools during the milling process of steel AISI 1018

    Directory of Open Access Journals (Sweden)

    Diego Núñez

    2017-03-01

    Nowadays, high machining productivity is an important route to economic benefit in industry. It can be reached with high cutting velocities and feed rates; however, these conditions inherently produce high temperatures at the cutting tool/workpiece interface. Many cutting fluids have been developed to control the process temperature and increase tool life. The objective of this paper is to compare carbide milling tool wear under different cutting fluid systems: flood lubrication and minimum quantity of lubrication (MQL). Carbide milling tool wear was evaluated according to the standard ISO 8688-1:1989. The experimental results showed that using MQL significantly reduces tool wear (by about 40%) when milling AISI 1018 steel at industrial cutting conditions.

  20. A Method to Optimize Geometric Errors of Machine Tool based on SNR Quality Loss Function and Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Cai Ligang

    2017-01-01

    Rather than blindly improving the accuracy of a machine tool by increasing the precision of its key components during production, this work optimizes the geometric errors of a five-axis machine tool by combining an SNR quality loss function with a correlation analysis of the machine tool geometric errors. Firstly, the homogeneous transformation matrix method is used to build the five-axis machine tool geometric error model. Secondly, the SNR quality loss function is used for cost modeling. A machine tool accuracy optimization objective function is then established based on the correlation analysis. Finally, ISIGHT combined with MATLAB is applied to optimize each error. The results show that this method is reasonable, and that it appropriately relaxes the range of tolerance values, thereby reducing the manufacturing cost of machine tools.
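
    A minimal sketch of the homogeneous-transformation-matrix step (assumed notation and toy error values; the paper's full model chains one ideal/error pair per axis) might look like:

```python
# Minimal sketch (assumed notation, toy error values) of geometric error modelling
# with homogeneous transformation matrices (HTM): ideal axis motion composed with
# a first-order error transform.
import numpy as np

def translation(x, y, z):
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def small_error(dx, dy, dz, ex, ey, ez):
    """First-order error HTM: small translational (dx,dy,dz) and angular (ex,ey,ez) errors."""
    E = np.eye(4)
    E[0, 1], E[0, 2] = -ez, ey
    E[1, 0], E[1, 2] = ez, -ex
    E[2, 0], E[2, 1] = -ey, ex
    E[:3, 3] = [dx, dy, dz]
    return E

# Tool-to-workpiece transform: product of ideal and error HTMs along the kinematic chain.
T_ideal = translation(0.0, 0.0, 100.0)                       # one axis moved 100 mm
T_actual = T_ideal @ small_error(5e-3, 0.0, 2e-3, 1e-6, 0.0, 0.0)
print((T_actual - T_ideal)[:3, 3])                           # volumetric error at the tool tip
```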

  1. Adapting the capacities and vulnerabilities approach: a gender analysis tool.

    Science.gov (United States)

    Birks, Lauren; Powell, Christopher; Hatfield, Jennifer

    2017-12-01

    Gender analysis methodology is increasingly considered essential to health research because 'women's social, economic and political status undermine their ability to protect and promote their own physical, emotional and mental health, including their effective use of health information and services' {World Health Organization [Gender Analysis in Health: a review of selected tools. 2003; www.who.int/gender/documents/en/Gender.pdf (20 February 2008, date last accessed)]}. By examining gendered roles, responsibilities and norms through the lens of gender analysis, we can develop an in-depth understanding of social power differentials and be better able to address gender inequalities and inequities within institutions and between men and women. When conducting gender analysis, tools and frameworks may help to aid community engagement and to provide a framework ensuring that relevant gendered nuances are assessed. The capacities and vulnerabilities approach (CVA) is one such gender analysis framework; it critically considers gender and its associated roles, responsibilities and power dynamics in a particular community and seeks to meet a social need of that community. Although the original intent of the CVA was to guide humanitarian intervention and disaster preparedness, we adapted this framework to a different context, focusing on identifying and addressing emerging problems and social issues that affect a particular community's specific needs, such as an infectious disease outbreak or difficulty accessing health information and resources. We provide an example of our CVA adaptation, which served to facilitate a better understanding of how health-related disparities affect Maasai women in a remote, resource-poor setting in Northern Tanzania.

  2. Development of a computer tool to support scenario analysis for safety assessment of HLW geological disposal

    International Nuclear Information System (INIS)

    Makino, Hitoshi; Kawamura, Makoto; Wakasugi, Keiichiro; Okubo, Hiroo; Takase, Hiroyasu

    2007-02-01

    In the H12 project to establish a technical basis for HLW disposal in Japan, a systematic approach based on an international consensus was adopted to develop the scenarios to be considered in performance assessment. The adequacy of the approach was, in general terms, appreciated through domestic and international peer review. However, it was also suggested that there were issues related to improving the transparency and traceability of the procedure. To achieve this, improvements to the scenario analysis method have been studied. In this study, building on an improved method for the treatment of FEP interactions, a computer tool to support scenario analysis by performance assessment specialists has been developed. The anticipated effects of this tool are to improve the efficiency of complex and time-consuming scenario analysis work and to reduce the possibility of human error in this work. The tool also makes it possible to describe the interactions among a vast number of FEPs and the related information as an interaction matrix, and to analyse those interactions from a variety of perspectives. (author)
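
    The interaction-matrix idea can be pictured with a toy sketch (hypothetical FEP names and notes; the real tool manages far larger matrices together with supporting documentation):

```python
# Toy sketch of a FEP interaction matrix: diagonal cells hold FEPs, off-diagonal
# cells record how the row FEP influences the column FEP (hypothetical entries).
feps = ["glass dissolution", "groundwater flow", "sorption"]
influence = {
    ("groundwater flow", "glass dissolution"): "controls silica transport",
    ("glass dissolution", "sorption"): "alters solution chemistry",
}

def influences_on(target):
    """Query the matrix from one perspective: everything acting on `target`."""
    return {src: note for (src, dst), note in influence.items() if dst == target}

print(influences_on("glass dissolution"))
```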

  3. A new energy analysis tool for ground source heat pump systems

    Energy Technology Data Exchange (ETDEWEB)

    Michopoulos, A.; Kyriakis, N. [Process Equipment Design Laboratory, Mechanical Engineering Department, Aristotle University of Thessaloniki, POB 487, 541 24 Thessaloniki (Greece)

    2009-09-15

    A new tool, suitable for the energy analysis of vertical ground source heat pump systems, is presented. The tool is based on analytical equations describing the heat exchanged with the ground, developed in the Matlab® environment. The time step of the simulation can be freely chosen by the user (e.g. 1 h, 2 h, etc.) and the required calculation time is very short. The heating and cooling loads of the building at the aforementioned time step are needed as input, along with the thermophysical properties of the soil and of the ground heat exchanger, the operating characteristic curves of the system's heat pumps and the basic ground heat exchanger dimensions. The results include the electricity consumption of the system and the heat absorbed from or rejected to the ground. The efficiency of the tool is verified through comparison with actual electricity consumption data collected from an existing large-scale ground coupled heat pump installation over a three-year period. (author)
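
    Although the abstract does not state which analytical equations the tool uses, a classical building block for vertical ground heat exchangers is the infinite line source solution

    $$\Delta T(r,t) \;=\; \frac{q}{4\pi\lambda}\,E_{1}\!\left(\frac{r^{2}}{4\alpha t}\right)$$

    where q is the heat rate per unit borehole length, λ and α are the ground thermal conductivity and diffusivity, and E_1 is the exponential integral; superposing such solutions over the building load history yields the kind of fast simulation described.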

  4. Development of a Matlab/Simulink tool to facilitate system analysis and simulation via the adjoint and covariance methods

    NARCIS (Netherlands)

    Bucco, D.; Weiss, M.

    2007-01-01

    The COVariance and ADjoint Analysis Tool (COVAD) is a specially designed software tool, written for the Matlab/Simulink environment, which allows the user the capability to carry out system analysis and simulation using the adjoint, covariance or Monte Carlo methods. This paper describes phase one
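
    For orientation, the two methods named in the tool's title have standard discrete-time forms (generic textbook notation, not COVAD's internals): for a linear system x_{k+1} = Φ_k x_k + w_k, the covariance method propagates the state covariance forward as

    $$P_{k+1} \;=\; \Phi_k\,P_k\,\Phi_k^{T} + Q_k,$$

    while the adjoint method propagates the sensitivity of a terminal quantity backwards, λ_k = Φ_k^T λ_{k+1}, obtaining the same terminal statistics in a single backward pass instead of many Monte Carlo runs.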

  5. Integration of modern statistical tools for the analysis of climate extremes into the web-GIS “CLIMATE”

    Science.gov (United States)

    Ryazanova, A. A.; Okladnikov, I. G.; Gordov, E. P.

    2017-11-01

    The frequency of occurrence and magnitude of extreme precipitation and temperature events show positive trends in several geographical regions. These events must be analyzed and studied in order to better understand their impact on the environment, predict their occurrence, and mitigate their effects. For this purpose, we augmented the web-GIS called “CLIMATE” with a dedicated statistical package developed in the R language. The web-GIS “CLIMATE” is a software platform for cloud storage, processing and visualization of distributed archives of spatial datasets. It is based on the combined use of web and GIS technologies with reliable procedures for searching, extracting, processing, and visualizing spatial data archives. The system provides a set of thematic online tools for the complex analysis of current and future climate changes and their effects on the environment. The package includes powerful new methods for the time-dependent statistics of extremes, quantile regression and the copula approach for the detailed analysis of various climate extreme events. In particular, the very promising copula approach allows one to obtain the structural connections between the extremes and various environmental characteristics. The new statistical methods integrated into the web-GIS “CLIMATE” can significantly facilitate and accelerate the complex analysis of climate extremes using only a desktop PC connected to the Internet.
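
    The copula approach rests on Sklar's theorem: any joint distribution H of two variables can be written as

    $$H(x,y) \;=\; C\big(F_X(x),\,F_Y(y)\big)$$

    where F_X and F_Y are the marginal distributions and the copula C carries the entire dependence structure. Separating margins from dependence in this way is what makes copulas attractive for linking climate extremes to environmental covariates.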

  6. Development of NONSTA code for the design and analysis of LMR high temperature structure

    International Nuclear Information System (INIS)

    Kim, Jong Bum; Lee, H. Y.; Yoo, B.

    1999-02-01

    Liquid metal reactors (LMR) operate at high temperature (500–550 °C), where structural materials undergo complex deformation behaviour such as diffusion, dislocation glide and dislocation climb. Material life is also reduced rapidly by the interaction between cavities created inside the structural materials and high-temperature fatigue cracks. The establishment of high-temperature structural analysis techniques is therefore necessary for the reliability and safety evaluation of such structures. The objectives of this study are to develop the NONSTA code as a subprogram of the ABAQUS code, adopting constitutive equations that can predict high-temperature material behaviour precisely, and to build systematic analysis procedures. The developed program was applied to example problems such as a tensile analysis using an exponential creep model and a repeated tension-compression analysis using the Chaboche unified viscoplastic model. In addition, the problem of a plate with a central hole subjected to tensile load was solved to show the applicability of the program to multiaxial problems, and the time-dependent stress redistribution was observed. (Author). 40 refs., 2 tabs., 24 figs
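
    As an example of the kind of constitutive equation involved, a typical power-law (Norton-Arrhenius) creep form is

    $$\dot\varepsilon_{c} \;=\; A\,\sigma^{n}\,\exp\!\left(-\frac{Q}{RT}\right)$$

    with stress exponent n and activation energy Q. This generic form is given only for illustration; the models actually implemented in NONSTA are the exponential creep and Chaboche unified viscoplastic equations cited above.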

  7. Trade-Space Analysis Tool for Constellations (TAT-C)

    Science.gov (United States)

    Le Moigne, Jacqueline; Dabney, Philip; de Weck, Olivier; Foreman, Veronica; Grogan, Paul; Holland, Matthew; Hughes, Steven; Nag, Sreeja

    2016-01-01

    Traditionally, space missions have relied on relatively large and monolithic satellites, but in the past few years, under a changing technological and economic environment that includes instrument and spacecraft miniaturization, scalable launchers, secondary launches and hosted payloads, there is growing interest in implementing future NASA missions as Distributed Spacecraft Missions (DSM). The objective of our project is to provide a framework that facilitates DSM Pre-Phase A investigations and optimizes DSM designs with respect to a-priori science goals. In this first version of our Trade-space Analysis Tool for Constellations (TAT-C), we are investigating questions such as: How many spacecraft should be included in the constellation? Which design has the best cost/risk value? The main goals of TAT-C are to: handle multiple spacecraft sharing a mission objective, from SmallSats up through flagships; explore the trade space of variables for pre-defined science, cost and risk goals and pre-defined metrics; and optimize cost and performance across multiple instruments and platforms rather than one at a time. This paper describes the overall architecture of TAT-C, including: a User Interface (UI) interacting with multiple users - scientists, mission designers or program managers; and an Executive Driver gathering requirements from the UI, then formulating Trade-space Search Requests for the Trade-space Search Iterator, which, first with inputs from the Knowledge Base and then in collaboration with the Orbit Coverage, Reduction Metrics, and Cost Risk modules, generates multiple potential architectures and their associated characteristics. TAT-C leverages the Goddard Mission Analysis Tool (GMAT) to compute coverage and ancillary data, streamlining the computations by modeling orbits in a way that balances accuracy and performance. The current version of TAT-C includes uniform Walker constellations as well as ad-hoc constellations, and its cost model represents an aggregate model consisting of

  8. GISS Surface Temperature Analysis

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The GISTEMP dataset is a global 2x2 gridded temperature anomaly dataset. Temperature data is updated around the middle of every month using current data files from...

  9. Visualizing decoupling in nanocrystalline alloys: A FORC-temperature analysis

    Science.gov (United States)

    Rivas, M.; Martínez-García, J. C.; Gorria, P.

    2016-02-01

    Devitrifying ferromagnetic amorphous precursors under adequate conditions may give rise to disordered assemblies of densely packed nanocrystals with extraordinary magnetic softness, well explained by the exchange coupling among multiple crystallites. Whether the magnetic exchange interaction is produced by direct contact or mediated by the intergranular amorphous matrix has a strong influence on the behaviour of the system above room temperature. Multi-phase amorphous-nanocrystalline systems harden dramatically when approaching the Curie temperature (TC) of the amorphous phase, due to the decoupling of the hard grains. In this work, the thermally induced decoupling of nanosized crystallites embedded in an amorphous matrix has been studied by first-order reversal curve (FORC) analysis. We selected a Fe-rich amorphous alloy with TC = 330 K in order to follow the evolution of the FORC diagrams obtained below and above this temperature in samples with different percentages of nanocrystalline phase. The existence of up to four regions exhibiting distinct magnetic behaviours is unambiguously determined from the temperature evolution of the FORC diagrams.
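
    For readers unfamiliar with the method, the FORC distribution is defined from the magnetization M measured along first-order reversal curves as

    $$\rho(H_r, H) \;=\; -\,\frac{1}{2}\,\frac{\partial^{2} M(H_r, H)}{\partial H_r\,\partial H}$$

    where H_r is the reversal field and H the applied field; maps of ρ measured at different temperatures are what reveal the coupled and decoupled regions discussed here.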

  10. SafetyBarrierManager, a software tool to perform risk analysis using ARAMIS's principles

    DEFF Research Database (Denmark)

    Duijm, Nijs Jan

    2017-01-01

    The ARAMIS project resulted in a number of methodologies dealing with, among others: the development of standard fault trees and “bowties”; the identification and classification of safety barriers; and the inclusion of the quality of safety management in the quantified risk assessment. After conclusion of the ARAMIS project, Risø National Laboratory started developing a tool that could implement these methodologies, leading to SafetyBarrierManager. The tool is based on the principles of “safety-barrier diagrams”, which are very similar to “bowties”, with the possibility of performing quantitative analysis. The tool allows constructing comprehensive fault trees, event trees and safety-barrier diagrams. The tool implements the ARAMIS idea of a set of safety barrier types, to which a number of safety management issues can be linked. By rating the quality of these management issues, the operational probability...

  11. DEBRISK, a Tool for Re-Entry Risk Analysis

    Science.gov (United States)

    Omaly, P.; Spel, M.

    2012-01-01

    An act of the French parliament, adopted in 2008, requires satellite constructors to evaluate end-of-life operations in order to assure the risk mitigation of their satellites. One important element in this evaluation is the estimation of the mass and impact energy of the satellite debris after atmospheric re-entry. For this purpose, CNES has developed the tool DEBRISK, which allows the operator to simulate the re-entry phase and to study the demise altitudes or impact energy of the individual fragments of the original satellite. DEBRISK is based on the so-called object-based approach. In this approach, a breakup altitude is assumed at which the satellite disintegrates due to the pressure loads; this altitude is typically around 78 km. After breakup, the satellite structure is modelled by a parent-child approach, where each child has its own birth criterion. In the simplest approach the child is born after demise of the parent object; this could be the case for an object B contained in the interior of an object A and thus not exposed to the atmosphere. Each object is defined by:
    - its shape, attitude and dimensions,
    - its materials along with their physical properties,
    - its state and velocity vectors.
    The shape, attitude and dimensions define the aerodynamic drag of the object, which is input to the 3DOF trajectory modelling. The aerodynamic mass used in the equation of motion is defined as the sum of the object's own mass and the mass of the object's offspring. A newborn object inherits the state vector of its parent object. The shape, attitude and dimensions also define the heating rates experienced by the object. The heating rate is integrated in time up to the point where the melting temperature is reached. The mass of melted material is computed from the excess heat and the material properties. After each step the amount of ablated material is determined using the lumped-mass approach and is peeled off from the object, updating mass and shape of the
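
    The demise logic described above can be condensed into a toy sketch (assumed aluminium-like property values and a deliberately simplified constant heating rate; this is not the DEBRISK implementation):

```python
# Toy sketch of lumped-mass demise: heat the object until the melting temperature
# is reached, then peel off melted mass with the excess heat.
def step(obj, q_dot_area, dt):
    """Advance one step; q_dot_area is the heating rate per unit area [W/m^2]."""
    heat = q_dot_area * obj["area"] * dt                    # heat absorbed this step [J]
    if obj["T"] < obj["T_melt"]:
        # lumped-mass heating; leftover heat at the crossing step is neglected
        obj["T"] = min(obj["T"] + heat / (obj["mass"] * obj["c_p"]), obj["T_melt"])
    else:
        # excess heat melts material, which is ablated (peeled off) from the object
        obj["mass"] = max(obj["mass"] - heat / obj["h_fusion"], 0.0)
    return obj["mass"] > 0.0                                # False once the object demises

sphere = {"mass": 5.0, "area": 0.05, "c_p": 900.0,          # aluminium-like toy values
          "T": 300.0, "T_melt": 933.0, "h_fusion": 3.9e5}
t = 0.0
while step(sphere, q_dot_area=2e6, dt=0.1) and t < 120.0:
    t += 0.1
print(f"end at t = {t:.1f} s, residual mass = {sphere['mass']:.2f} kg")
```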

  12. Aspects of Low Temperature Irradiation in Neutron Activation Analysis

    International Nuclear Information System (INIS)

    Brune, D.

    1968-08-01

    Neutron irradiation of the sample while frozen in a cooling device inserted in a reactor channel has been carried out in the analysis of iodine in aqueous samples as well as of mercury in biological tissue and water. For the simultaneous irradiation of a large number of aqueous solutions the samples were arranged in a suitable geometry in order to avoid mutual flux perturbation effects. The influence of the neutron temperature on the activation process has been discussed. Potential applications of the low temperature irradiation technique are outlined

  13. Aspects of Low Temperature Irradiation in Neutron Activation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Brune, D

    1968-08-15

    Neutron irradiation of the sample while frozen in a cooling device inserted in a reactor channel has been carried out in the analysis of iodine in aqueous samples as well as of mercury in biological tissue and water. For the simultaneous irradiation of a large number of aqueous solutions the samples were arranged in a suitable geometry in order to avoid mutual flux perturbation effects. The influence of the neutron temperature on the activation process has been discussed. Potential applications of the low temperature irradiation technique are outlined.

  14. Surface Temperature Data Analysis

    Science.gov (United States)

    Hansen, James; Ruedy, Reto

    2012-01-01

    Small global mean temperature changes may have significant to disastrous consequences for the Earth's climate if they persist for an extended period. Obtaining global means from local weather reports is hampered by the uneven spatial distribution of the reliably reporting weather stations. Methods had to be developed that minimize as far as possible the impact of that situation. This software is a method of combining temperature data of individual stations to obtain a global mean trend, overcoming/estimating the uncertainty introduced by the spatial and temporal gaps in the available data. Useful estimates were obtained by the introduction of a special grid, subdividing the Earth's surface into 8,000 equal-area boxes, using the existing data to create virtual stations at the center of each of these boxes, and combining temperature anomalies (after assessing the radius of high correlation) rather than temperatures.
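
    A toy sketch of the combination scheme (illustrative only, not the GISTEMP code): convert each station to anomalies against its own baseline, average the stations in a box into that box's virtual station, then area-weight the boxes into a global series:

```python
# Toy sketch of anomaly combination over equal-area boxes (not the GISTEMP code).
import numpy as np

def anomalies(series, baseline=slice(0, 10)):
    """Absolute station temperatures -> anomalies vs the station's own baseline mean."""
    s = np.asarray(series, dtype=float)
    return s - s[baseline].mean()

def global_mean_anomaly(boxes):
    """boxes: iterable of (area_weight, [station_series, ...]) with equal-length series."""
    total_w, acc = 0.0, 0.0
    for weight, stations in boxes:
        virtual = np.mean([anomalies(s) for s in stations], axis=0)  # the box's virtual station
        acc = acc + weight * virtual
        total_w += weight
    return acc / total_w                                             # area-weighted global series
```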

  15. System analysis: A tool for determining the feasibility of PACS for radiology

    International Nuclear Information System (INIS)

    Parrish, D.M.; Thompson, B.G.; Creasy, J.L.; Wallace, R.J.

    1987-01-01

    In the emerging technology of picture archival and communication systems, the real productivity improvements that such a system may provide are being evaluated by a systems analysis tool. This computer model allows a simulated comparison of manual versus digital departmental functions, by allowing changes of operational parameter times, physical distances in the department, and equipment and manpower resources; examples are presented. The presentation will focus on the analysis approach, the operational parameters most important in the digital environment, and an analysis of the potential productivity improvements

  16. Sequence Quality Analysis Tool for HIV Type 1 Protease and Reverse Transcriptase

    OpenAIRE

    DeLong, Allison K.; Wu, Mingham; Bennett, Diane; Parkin, Neil; Wu, Zhijin; Hogan, Joseph W.; Kantor, Rami

    2012-01-01

    Access to antiretroviral therapy is increasing globally and drug resistance evolution is anticipated. Currently, protease (PR) and reverse transcriptase (RT) sequence generation is increasing, including the use of in-house sequencing assays, and quality assessment prior to sequence analysis is essential. We created a computational HIV PR/RT Sequence Quality Analysis Tool (SQUAT) that runs in the R statistical environment. Sequence quality thresholds are calculated from a large dataset (46,802...

  17. A Delay Time Measurement of ULTRAS (Ultra-high Temperature Ultrasonic Response Analysis System) for a High Temperature Experiment

    International Nuclear Information System (INIS)

    Koo, Kil Mo; Kim, Sang Baik

    2010-01-01

    The temperature measurement of very-high-temperature core melt is important in molten pool experiments, in which the gap formation between the core melt and the reactor lower head, and the effect of that gap on thermal behaviour, are to be measured. Existing temperature measurement techniques have limitations: the thermocouple, a contact method, is restricted to temperatures under 2000 °C, while infrared thermometry, a non-contact method, cannot measure an internal temperature and is very sensitive to interference from reacted gases. To solve these problems, the delay-time technique using ultrasonic wavelets at high temperature was developed in two stages. In the first stage, a delay time measurement with ULTRAS (Ultra-high Temperature Ultrasonic Response Analysis System) is suggested. In the second stage, a molten material temperature was measured up to 2300 °C. An optimised design of the ultrasonic temperature sensor (UTS) with persistence at high temperature is also suggested in this paper. The theory suggested in this paper and the efficiency of the developed system were verified with special equipment and experiments supported by KRISS (Korea Research Institute of Standards and Science)
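
    The underlying principle (stated here in generic form, with assumed notation) is pulse-echo time-of-flight thermometry: the measured round-trip delay over a sensor segment of length L,

    $$\Delta t(T) \;=\; \frac{2L}{c(T)},$$

    is inverted through the known temperature dependence of the sound speed c(T) to yield the segment's average temperature, which is what makes an internal measurement at very high temperature possible.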

  18. Analysis tools for the interplay between genome layout and regulation.

    Science.gov (United States)

    Bouyioukos, Costas; Elati, Mohamed; Képès, François

    2016-06-06

    Genome layout and gene regulation appear to be interdependent. Understanding this interdependence is key to exploring the dynamic nature of chromosome conformation and to engineering functional genomes. Evidence for non-random genome layout, defined as the relative positioning of either co-functional or co-regulated genes, stems from two main approaches. Firstly, the analysis of contiguous genome segments across species has highlighted the conservation of gene arrangement (synteny) along chromosomal regions. Secondly, the study of long-range interactions along a chromosome has emphasised regularities in the positioning of microbial genes that are co-regulated, co-expressed or evolutionarily correlated. While one-dimensional pattern analysis is a mature field, it is often powerless on biological datasets, which tend to be incomplete and partly incorrect. Moreover, there is a lack of comprehensive, user-friendly tools to systematically analyse, visualise, integrate and exploit regularities along genomes. Here we present the Genome REgulatory and Architecture Tools SCAN (GREAT:SCAN) software for the systematic study of the interplay between genome layout and gene expression regulation. SCAN is a collection of related and interconnected applications currently able to perform systematic analyses of genome regularities, as well as to improve transcription factor binding site (TFBS) and gene regulatory network predictions based on gene positional information. We demonstrate the capabilities of these tools by studying, on the one hand, the regular patterns of genome layout in the major regulons of the bacterium Escherichia coli and, on the other hand, improvements to TFBS prediction in microbes. Finally, we highlight, through visualisation of multivariate techniques, the interplay between position and sequence information for effective transcription regulation.

  19. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|SpeedShop

    Energy Technology Data Exchange (ETDEWEB)

    Galarowicz, James E. [Krell Institute, Ames, IA (United States); Miller, Barton P. [Univ. of Wisconsin, Madison, WI (United States). Computer Sciences Dept.; Hollingsworth, Jeffrey K. [Univ. of Maryland, College Park, MD (United States). Computer Sciences Dept.; Roth, Philip [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Future Technologies Group, Computer Science and Math Division; Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Center for Applied Scientific Computing (CASC)

    2013-12-19

    In this project we created a community tool infrastructure for program development tools targeting Petascale class machines and beyond. This includes tools for performance analysis, debugging, and correctness, as well as tuning and optimization frameworks. The developed infrastructure provides a comprehensive and extensible set of individual tool building components. We started with the basic elements necessary across all tools in such an infrastructure, followed by a set of generic core modules that allow a comprehensive performance analysis at scale. Further, we developed a methodology and workflow that allows others to add or replace modules, to integrate parts into their own tools, or to customize existing solutions. In order to form the core modules, we built on the existing Open|SpeedShop infrastructure and decomposed it into individual modules that match the necessary tool components. At the same time, we addressed the challenges found in performance tools for petascale systems in each module. When assembled, this instantiation of community tool infrastructure provides an enhanced version of Open|SpeedShop, which, while completely different in its architecture, provides scalable performance analysis for petascale applications through a familiar interface. This project also built upon and enhanced the capabilities and reusability of project partner components, as specified in the original project proposal. The overall project team's work over the project funding cycle was focused on several areas of research, which are described in the following sections. The remainder of this report also highlights related work as well as preliminary work that supported the project. In addition to the project partners funded by the Office of Science under this grant, the project team included several collaborators who contributed to the overall design of the envisioned tool infrastructure. In particular, the project team worked closely with the other two DOE NNSA

  20. Real-Time Estimation for Cutting Tool Wear Based on Modal Analysis of Monitored Signals

    Directory of Open Access Journals (Sweden)

    Yongjiao Chi

    2018-05-01

    There is a growing body of literature that recognizes the importance of product safety and of quality problems during processing. An unexpected cutting tool breakdown may cause project delays and cost overruns, and tool wear is crucial to processing precision in mechanical manufacturing; this study therefore contributes to this growing area of research by monitoring tool condition and estimating wear. In this research, an effective method for tool wear estimation was constructed, in which the signal features of the machining process were extracted by ensemble empirical mode decomposition (EEMD) and used to estimate the tool wear. Based on signal analysis, the vibration signals that had the best linear relationship with the tool wearing process were decomposed; the intrinsic mode functions (IMFs), the frequency spectra of the IMFs, and features describing the amplitude changes of the spectra were then obtained. The relationship between tool wear and these features was fitted with a Gaussian function to estimate the tool wear. An experimental investigation was used to verify the effectiveness of this method, and the results illustrated the correlation between tool wear and the modal features of the monitored signals.
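
    A minimal sketch of the EEMD feature-extraction step (assuming the third-party PyEMD package, installable as EMD-signal; the authors' own implementation and feature set may differ):

```python
# Minimal sketch of EEMD-based feature extraction from a vibration signal,
# assuming the third-party PyEMD package (pip install EMD-signal).
import numpy as np
from PyEMD import EEMD

fs = 10_000                                     # assumed sampling rate [Hz]
t = np.arange(0, 0.5, 1 / fs)
vibration = np.sin(2 * np.pi * 180 * t) + 0.3 * np.random.randn(t.size)

eemd = EEMD(trials=100)                         # ensemble of noise-assisted EMD runs
imfs = eemd.eemd(vibration, t)                  # intrinsic mode functions (IMFs)

# One candidate wear feature per IMF: the peak amplitude of its frequency spectrum.
features = [np.abs(np.fft.rfft(imf)).max() for imf in imfs]
print(features)
```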