WorldWideScience

Sample records for models temperature logs

  1. Well-log based prediction of temperature models in the exploration of sedimentary settings

    DEFF Research Database (Denmark)

    Fuchs, Sven; Förster, Andrea; Wonik, Thomas

    Temperature-depth distributions are pivotal in subsurface studies in academia as well as in georesources applications. In this regard, high-resolution temperature profiles, logged under equilibrium thermal borehole conditions, are the ultimate measure. However, there are circumstances in which...

  2. An Alternative Approach to Non-Log-Linear Thermal Microbial Inactivation: Modelling the Number of Log Cycles Reduction with Respect to Temperature

    Directory of Open Access Journals (Sweden)

    Vasilis Panagiotis Valdramidis

    2005-01-01

    A mathematical approach incorporating the shoulder effect during the quantification of microbial heat inactivation is developed based on the "number of log cycles of reduction" concept. To this end, the heat resistance of Escherichia coli K12 in BHI broth has been quantitatively determined in a generic and accurate way by defining the time t for x log reductions in the microbial population, i.e. txD, as a function of the treatment temperature T. Survival data of the examined microorganism were collected at temperatures between 52 and 60.6 °C. Shoulder length Sl and specific inactivation rate kmax are derived from a mathematical expression that describes a non-log-linear behaviour. The temperature dependencies of Sl and kmax are used for structuring the txD(T) function. Estimation of the txD(T) parameters through a global identification procedure permits reliable predictions of the time to achieve a pre-decided microbial reduction. One of the parameters of the txD(T) function is proposed as "the reference minimum temperature for inactivation". For the case study considered, a value of 51.80 °C (with a standard error, SE, of 3.47) was identified. Finally, the times to achieve commercial sterilization and pasteurization for the product at hand, i.e. BHI broth, were found to be 11.70 s (SE = 5.22) and 5.10 min (SE = 1.22), respectively. Accounting for the uncertainty (based on the 90% confidence intervals, CI), a fail-safe treatment of these two processes takes 20.36 s and 7.12 min, respectively.
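
The txD quantity above can be illustrated with a minimal sketch. Assuming the common shoulder-plus-log-linear survival form (as in Geeraerd-type inactivation models), the time for x log10 reductions is the shoulder length plus x log cycles at rate kmax; the function name and the numerical values below are hypothetical, not the paper's estimates:

```python
import math

def t_xd(x, sl, kmax):
    """Time to achieve x log10 reductions for a shoulder + log-linear
    survival curve: a shoulder of length sl, then exponential decline at
    specific rate kmax, so each log10 cycle takes ln(10) / kmax."""
    return sl + x * math.log(10) / kmax

# Hypothetical values: a 2 min shoulder and kmax = 1.5 per min.
print(round(t_xd(5, sl=2.0, kmax=1.5), 2))  # 9.68 min for a 5-log reduction
```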

  3. Technology development for high temperature logging tools

    Energy Technology Data Exchange (ETDEWEB)

    Veneruso, A.F.; Coquat, J.A.

    1979-01-01

    A set of prototype, high temperature logging tools (temperature, pressure and flow) were tested successfully to temperatures up to 275 °C in a Union geothermal well during November 1978 as part of the Geothermal Logging Instrumentation Development Program. This program is being conducted by Sandia Laboratories for the Department of Energy's Division of Geothermal Energy. The progress and plans of this industry based program to develop and apply the high temperature instrumentation technology needed to make reliable geothermal borehole measurements are described. Specifically, this program is upgrading existing sondes for improved high temperature performance, as well as applying new materials (elastomers, polymers, metals and ceramics) and developing component technology such as high temperature cables, cableheads and electronics to make borehole measurements such as formation temperature, flow rate, high resolution pressure and fracture mapping. In order to satisfy critical existing needs, the near term goal is for operation up to 275 °C and 7000 psi by the end of FY80. The long term goal is for operation up to 350 °C and 20,000 psi by the end of FY84.

  4. Decomposable log-linear models

    DEFF Research Database (Denmark)

    Eriksen, Poul Svante

    The present paper considers discrete probability models with exact computational properties. In relation to contingency tables this means closed-form expressions of the maximum likelihood estimate and its distribution. The model class includes what is known as decomposable graphical models, which can be characterized by a structured set of conditional independencies between some variables given some other variables. We term the new model class decomposable log-linear models, which is illustrated to be a much richer class than decomposable graphical models. It covers a wide range of non-hierarchical models, models with structural zeroes, models described by quasi independence, and models for level merging. Also, they have a very natural interpretation as they may be formulated by a structured set of conditional independencies between two events given some other event. In relation to contingency...

  5. The Little Ice Age signature and subsequent warming seen in borehole temperature logs versus solar forcing model

    Czech Academy of Sciences Publication Activity Database

    Majorowicz, J.; Šafanda, Jan; Przybylak, R.

    2014-01-01

    Roč. 103, č. 4 (2014), s. 1163-1173 ISSN 1437-3254 Institutional support: RVO:67985530 Keywords: surface processes * borehole temperatures * climatic warming * Little Ice Age * solar irradiation Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 2.093, year: 2014

  6. Log-binomial models: exploring failed convergence.

    Science.gov (United States)

    Williamson, Tyler; Eliasziw, Misha; Fick, Gordon Hilton

    2013-12-13

    Relative risk is a summary metric that is commonly used in epidemiological investigations. Increasingly, epidemiologists are using log-binomial models to study the impact of a set of predictor variables on a single binary outcome, as they naturally offer relative risks. However, standard statistical software may report failed convergence when attempting to fit log-binomial models in certain settings. The methods that have been proposed in the literature for dealing with failed convergence use approximate solutions to avoid the issue. This research looks directly at the log-likelihood function for the simplest log-binomial model where failed convergence has been observed, a model with a single linear predictor with three levels. The possible causes of failed convergence are explored and potential solutions are presented for some cases. Among the principal causes is a failure of the fitting algorithm to converge despite the log-likelihood function having a single finite maximum. Despite these limitations, log-binomial models are a viable option for epidemiologists wishing to describe the relationship between a set of predictors and a binary outcome where relative risk is the desired summary measure. Epidemiologists are encouraged to continue to use log-binomial models and advocate for improvements to the fitting algorithms to promote the widespread use of log-binomial models.
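
The divergence between odds ratios and relative risks at high outcome prevalence, which motivates the use of log-binomial models, can be seen directly from a 2x2 table; the counts below are hypothetical:

```python
def risk_ratio(a, b, c, d):
    """2x2 table: a/b = events/non-events among exposed, c/d among unexposed."""
    return (a / (a + b)) / (c / (c + d))

def odds_ratio(a, b, c, d):
    return (a * d) / (b * c)

# Hypothetical high-prevalence outcome: 60% risk vs 40% risk.
a, b, c, d = 60, 40, 40, 60
print(round(risk_ratio(a, b, c, d), 2))  # 1.5
print(odds_ratio(a, b, c, d))            # 2.25
```

With a common outcome the odds ratio (2.25) markedly overstates the relative risk (1.5), which is why a model that estimates relative risks directly is attractive.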

  7. Model wells for nuclear well logging

    International Nuclear Information System (INIS)

    Tittle, C.W.

    1989-01-01

    Considerations needed in the design and construction of model wells for nuclear log calibration are covered, with special attention to neutron porosity logging and total γ-ray logging. Pulsed neutron decay-time and spectral γ-ray logging are discussed briefly. The American Petroleum Institute calibration facility for nuclear logs is a good starting point for similar or expanded facilities. A few of its shortcomings are mentioned; they are minor. The problem of fluid saturation is emphasized. Attention is given to models made of consolidated rock and those containing unconsolidated material such as Ottawa sand. Needed precautions are listed. A similarity method is presented for estimating the porosity index of formations that are not fully saturated. (author)

  8. Modeling Precipitation Extremes using Log-Histospline

    Science.gov (United States)

    Huang, W. K.; Nychka, D. W.; Zhang, H.

    2017-12-01

    One of the commonly used approaches to modeling univariate extremes is the peaks-over-threshold (POT) method. The POT method models exceedances over a (sufficiently high/low) threshold as a generalized Pareto distribution (GPD). To apply this method, a threshold has to be chosen, and the estimates might be sensitive to the chosen threshold. Here we propose an alternative, the "Log-Histospline", to explore modeling the tail behavior and the remainder of the density in one step using the full range of the data. The Log-Histospline applies a smoothing spline model to a finely binned histogram of the log-transformed data to estimate its log density. By construction, we are able to preserve the polynomial upper tail behavior, a feature commonly observed in geophysical observations. The Log-Histospline can be extended to the spatial setting by treating the marginal (log) density at each location as spatially indexed functional data, and performing dimension reduction and spatial smoothing. We illustrate the proposed method by analyzing precipitation data from regional climate model output (North American Regional Climate Change and Assessment Program, NARCCAP).
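
The core idea, estimating the log density from a finely binned histogram of log-transformed data so that a polynomial tail becomes (near-)linear, can be sketched without the spline machinery. The sketch below substitutes a least-squares line for the smoothing spline and uses synthetic Pareto data, so it only illustrates the binning and tail-slope idea, not the authors' estimator:

```python
import math
import random

random.seed(42)

# Synthetic heavy-tailed sample: Pareto(alpha = 2, x_m = 1) via inverse CDF.
# Its density f(x) = 2 * x**-3 is a straight line of slope -3 in log-log.
alpha, n = 2.0, 200_000
data = [(1.0 - random.random()) ** (-1.0 / alpha) for _ in range(n)]

# Bin the *log* of the data (the Log-Histospline idea of working on log(x)),
# then estimate the log density per bin.
lo, hi, nbins = 0.0, 2.0, 20
width = (hi - lo) / nbins
counts = [0] * nbins
for x in data:
    i = int((math.log(x) - lo) / width)
    if 0 <= i < nbins:
        counts[i] += 1

xs, ys = [], []
for i, c in enumerate(counts):
    if c == 0:
        continue
    mid = lo + (i + 0.5) * width                  # bin midpoint in log(x)
    dx = math.exp(lo + (i + 1) * width) - math.exp(lo + i * width)
    xs.append(mid)                                # log(x)
    ys.append(math.log(c / (n * dx)))             # estimated log f(x)

# Least-squares slope of log f versus log x (a stand-in for the spline fit).
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = (sum((u - mx) * (v - my) for u, v in zip(xs, ys))
         / sum((u - mx) ** 2 for u in xs))
print(round(slope, 1))  # close to the theoretical tail slope -3
```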

  9. Effect of Temperature on Acoustic Evaluation of Standing trees and logs: Part 1-Laboratory investigation

    Science.gov (United States)

    Shan Gao; Xiping Wang; Lihai Wang; R. Bruce Allison

    2012-01-01

    The goals of this study were to investigate the effect of environment temperature on acoustic velocity of standing trees and green logs and to develop workable models for compensating temperature differences as acoustic measurements are performed in different climates and seasons. The objective of Part 1 was to investigate interactive effects of temperature and...

  10. CS model coil experimental log book

    International Nuclear Information System (INIS)

    Nishijima, Gen; Sugimoto, Makoto; Nunoya, Yoshihiko; Wakabayashi, Hiroshi; Tsuji, Hiroshi

    2001-02-01

    Charging tests of the ITER CS Model Coil, which is the world's largest superconducting pulse coil, and of the CS Insert Coil started on April 11, 2000 and were completed on August 18, 2000. In the campaign, the total number of shots was 356 and the size of the data file in the DAS (Data Acquisition System) was over 20 GB. This report is a database that consists of the log list and the log sheets of every shot. One can access the database, run searches, and browse results via the Internet (http://1ogwww.naka.jaeri.go.jp). The database will be useful for quickly finding the shots of interest. (author)

  11. Modelling tropical forests response to logging

    Science.gov (United States)

    Cazzolla Gatti, Roberto; Di Paola, Arianna; Valentini, Riccardo; Paparella, Francesco

    2013-04-01

    Tropical rainforests are among the ecosystems most threatened by large-scale fragmentation due to human activity such as heavy logging and agricultural clearance, even though they provide crucial ecosystem goods and services, such as sequestering carbon from the atmosphere, protecting watersheds and conserving biodiversity. In several countries forest resource extraction has experienced a shift from clearcutting to selective logging to maintain a significant forest cover and stock of living biomass. However, knowledge of the short- and long-term effects of removing selected species in tropical rainforests is scarce and needs to be further investigated. One of the main effects of selective logging on forest dynamics seems to be the local disturbance involving the invasion of open space by weeds, vines and climbers at the expense of the late-successional cenosis. We present a simple deterministic model that describes the dynamics of a tropical rainforest subject to selective logging, to understand how and why weeds displace native species. We argue that the selective removal of the tallest tropical trees opens gaps of light that allow weeds, vines and climbers to prevail over native species, inhibiting the recovery of the original vegetation. Our results show that different regime shifts may occur depending on the type of forest management adopted. This hypothesis is supported by a dataset of tree heights and weed/vine cover that we collected from 9 plots located in Central and West Africa, in both untouched and managed areas.

  12. Effect of temperature on Acoustic Evaluation of standing trees and logs: Part 2: Field Investigation

    Science.gov (United States)

    Shan Gao; Xiping Wang; Lihai Wang; R. Bruce Allison

    2013-01-01

    The objectives of this study were to investigate the effect of seasonal temperature changes on acoustic velocity measured on standing trees and green logs and to develop models for compensating temperature differences because acoustic measurements are performed in different climates and seasons. Field testing was conducted on 20 red pine (Pinus resinosa...

  13. CS model coil experimental log book

    Energy Technology Data Exchange (ETDEWEB)

    Nishijima, Gen; Sugimoto, Makoto; Nunoya, Yoshihiko; Wakabayashi, Hiroshi; Tsuji, Hiroshi [Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment

    2001-02-01

    Charging tests of the ITER CS Model Coil, which is the world's largest superconducting pulse coil, and of the CS Insert Coil started on April 11, 2000 and were completed on August 18, 2000. In the campaign, the total number of shots was 356 and the size of the data file in the DAS (Data Acquisition System) was over 20 GB. This report is a database that consists of the log list and the log sheets of every shot. One can access the database, run searches, and browse results via the Internet (http://1ogwww.naka.jaeri.go.jp). The database will be useful for quickly finding the shots of interest. (author)

  14. Analysis of RIA standard curve by log-logistic and cubic log-logit models

    International Nuclear Information System (INIS)

    Yamada, Hideo; Kuroda, Akira; Yatabe, Tami; Inaba, Taeko; Chiba, Kazuo

    1981-01-01

    In order to improve goodness-of-fit in RIA standard curve analysis, programs for computing log-logistic and cubic log-logit fits were written in BASIC using a personal computer P-6060 (Olivetti). An iterative least squares method based on Taylor series was applied for non-linear estimation of the logistic and log-logistic models. Here "log-logistic" represents Y = (a − d)/(1 + (log(X)/c)^b) + d. As weights, either 1, 1/var(Y) or 1/σ² were used in the logistic or log-logistic fits, and either Y²(1 − Y)², Y²(1 − Y)²/var(Y), or Y²(1 − Y)²/σ² were used in the quadratic or cubic log-logit fits. The term var(Y) represents the squares of pure error and σ² represents the estimated variance calculated using the equation log(σ² + 1) = log(A) + J log(Y). As indicators of goodness-of-fit, MSL/Se², CMD% and WRV (see text) were used. Better regression was obtained for alpha-fetoprotein by log-logistic than by logistic. The cortisol standard curve was much better fitted with cubic log-logit than with quadratic log-logit. The predicted precision of the AFP standard curve was below 5% with log-logistic instead of 8% with logistic analysis. The predicted precision obtained using cubic log-logit was about five times lower than that with quadratic log-logit. The importance of selecting good models in RIA data processing is stressed in conjunction with the intrinsic precision of the radioimmunoassay system indicated by the predicted precision. (author)
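
The quoted four-parameter log-logistic form can be evaluated directly; the parameter values below are hypothetical, chosen so the midpoint is easy to verify:

```python
import math

def log_logistic(X, a, b, c, d):
    """Y = (a - d) / (1 + (log(X) / c)**b) + d, as quoted in the abstract."""
    return (a - d) / (1.0 + (math.log(X) / c) ** b) + d

# Hypothetical parameters: a and d are the two asymptotes of the standard
# curve, c locates the midpoint, b controls the steepness.
y_mid = log_logistic(math.e, a=1.0, b=2.0, c=1.0, d=0.0)
print(round(y_mid, 3))  # 0.5: halfway between a and d when log(X)/c = 1
```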

  15. Analysis of artificial fireplace logs by high temperature gas chromatography.

    Science.gov (United States)

    Kuk, Raymond J

    2002-11-01

    High temperature gas chromatography is used to analyze the wax of artificial fireplace logs (firelogs). Firelogs from several different manufacturers are studied and compared. This study shows that the wax within a single firelog is homogeneous and that the wax is also uniform throughout a multi-firelog package. Different brands are shown to have different wax compositions. Firelogs of the same brand, but purchased in different locations, also have different wax compositions. With this information it may be possible to associate an unknown firelog sample to a known sample, but a definitive statement of the origin cannot be made.

  16. DOE-Grand Junction logging model data synopsis

    International Nuclear Information System (INIS)

    Mathews, M.A.; Koizumi, C.J.; Evans, H.B.

    1978-05-01

    This synopsis provides the available data concerning the logging models at the DOE-Grand Junction facility, to date (1976). Because gamma-ray logs are used in uranium exploration to estimate the grade (percent U3O8) and the thickness of uranium ore zones in exploration drill holes, logging models are required to calibrate the gamma-ray logging equipment in order to obtain accuracy, uniformity, standardization, and repeatability during logging. This quality control is essential for accurate ore reserve calculations and for estimates of ore potential. The logging models at the DOE-Grand Junction facility are available for use by private industry in calibrating their gamma-ray logging equipment. 21 figures, 26 tables
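
A sketch of why calibration models matter: once a calibration factor K has been determined by logging a model of known grade, the average grade of a zone follows from the area under the gamma-ray anomaly. The relation and the K value below are a simplified, hypothetical illustration of this kind of calibration, not the facility's actual procedure:

```python
def grade_from_gamma(area_cps_ft, thickness_ft, k_factor):
    """Average grade over an ore zone from the area under the gamma-ray
    anomaly, via the calibration relation grade * thickness = K * area."""
    return k_factor * area_cps_ft / thickness_ft

# Hypothetical calibration factor and anomaly for a 5 ft zone.
g = grade_from_gamma(area_cps_ft=50_000, thickness_ft=5.0, k_factor=2.0e-5)
print(round(g, 3))  # 0.2 (percent U3O8)
```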

  17. Predictive models of forest logging residues of Triplochiton ...

    African Journals Online (AJOL)

    The model developed indicated that logarithmic functions performed better than other forms of equations. The findings of this study revealed that there are significant logging residues left to waste in the forest after timber harvest, and quantifying these logging residues in terms of a biomass model can serve as a management tool in ...

  18. Development of interpretation models for PFN uranium log analysis

    International Nuclear Information System (INIS)

    Barnard, R.W.

    1980-11-01

    This report presents the models for interpretation of borehole logs for the PFN (Prompt Fission Neutron) uranium logging system. Two models have been developed, the counts-ratio model and the counts/dieaway model. Both are empirically developed, but can be related to the theoretical bases for PFN analysis. The models try to correct for the effects of external factors (such as probe or formation parameters) in the calculation of uranium grade. The theoretical bases and calculational techniques for estimating uranium concentration from raw PFN data and other parameters are discussed. Examples and discussions of borehole logs are included

  19. Latent log-linear models for handwritten digit classification.

    Science.gov (United States)

    Deselaers, Thomas; Gass, Tobias; Heigold, Georg; Ney, Hermann

    2012-06-01

    We present latent log-linear models, an extension of log-linear models incorporating latent variables, and we propose two applications thereof: log-linear mixture models and image deformation-aware log-linear models. The resulting models are fully discriminative, can be trained efficiently, and the model complexity can be controlled. Log-linear mixture models offer additional flexibility within the log-linear modeling framework. Unlike previous approaches, the image deformation-aware model directly considers image deformations and allows for a discriminative training of the deformation parameters. Both are trained using alternating optimization. For certain variants, convergence to a stationary point is guaranteed and, in practice, even variants without this guarantee converge and find models that perform well. We tune the methods on the USPS data set and evaluate on the MNIST data set, demonstrating the generalization capabilities of our proposed models. Our models, although using significantly fewer parameters, are able to obtain competitive results with models proposed in the literature.

  20. Ordinal Log-Linear Models for Contingency Tables

    Directory of Open Access Journals (Sweden)

    Brzezińska Justyna

    2016-12-01

    A log-linear analysis is a method providing a comprehensive scheme to describe the association between categorical variables in a contingency table. The log-linear model specifies how the expected counts depend on the levels of the categorical variables for these cells and provides detailed information on the associations. The aim of this paper is to present theoretical, as well as empirical, aspects of ordinal log-linear models used for contingency tables with ordinal variables. We introduce log-linear models for ordinal variables: linear-by-linear association, the row effect model, the column effect model and Goodman's RC model. The algorithm, advantages and disadvantages will be discussed in the paper. An empirical analysis will be conducted with the use of R.
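
The linear-by-linear association model mentioned above can be sketched numerically: with unit-spaced ordinal scores, every local 2x2 odds ratio of the fitted counts equals exp(beta). The parameter values below are hypothetical:

```python
import math

def expected_counts(lam, row, col, beta, u, v):
    """Linear-by-linear association: log mu_ij = lam + row[i] + col[j]
    + beta * u[i] * v[j], with ordinal scores u and v."""
    return [[math.exp(lam + row[i] + col[j] + beta * u[i] * v[j])
             for j in range(len(col))] for i in range(len(row))]

# Hypothetical 3x3 table with unit-spaced scores 1, 2, 3.
mu = expected_counts(lam=1.0, row=[0.0, 0.2, 0.4], col=[0.0, 0.1, 0.3],
                     beta=0.5, u=[1, 2, 3], v=[1, 2, 3])
# For unit-spaced scores every local 2x2 odds ratio equals exp(beta).
local_or = (mu[0][0] * mu[1][1]) / (mu[0][1] * mu[1][0])
print(round(local_or, 4) == round(math.exp(0.5), 4))  # True
```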

  1. Modelling discontinuous well log signal to identify lithological ...

    Indian Academy of Sciences (India)

    Indian School of Mines (ISM), Dhanbad 826 004, India. ... A new wavelet transform-based algorithm to model the abrupt discontinuous changes from well log data by taking care of ...

  2. A portable borehole temperature logging system using the four-wire resistance method

    Science.gov (United States)

    Erkan, Kamil; Akkoyunlu, Bülent; Balkan, Elif; Tayanç, Mete

    2017-12-01

    High-quality temperature-depth information from boreholes with a depth of 100 m or more is used in geothermal studies and in studies of climate change. Electrical wireline tools with thermistor sensors are capable of measuring borehole temperatures with millikelvin resolution. The use of a surface readout mode allows analysis of the thermally conductive state of a borehole, which is especially important for climatic and regional heat flow studies. In this study we describe the design of a portable temperature logging tool that uses the four-wire resistance measurement method. The four-wire method enables the elimination of cable resistance effects, thus allowing millikelvin resolution of temperature data at depth. A preliminary two-wire model of the system is also described. The portability of the tool enables one to collect data from boreholes down to 300 m, even in locations with limited accessibility.
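
The benefit of the four-wire method described above can be shown with simple arithmetic: a two-wire reading includes both lead resistances, while a four-wire reading does not. The sensor and cable values below are hypothetical:

```python
def two_wire_reading(r_sensor, r_lead):
    """Two-wire hookup: the excitation current drops voltage across both
    cable conductors, so their resistance adds to the reading."""
    return r_sensor + 2.0 * r_lead

def four_wire_reading(r_sensor, r_lead):
    """Four-wire hookup: separate sense leads carry (almost) no current,
    so V / I gives the sensor resistance alone."""
    return r_sensor

# Hypothetical thermistor and cable: 10 kOhm sensor, 5 Ohm per conductor
# over a few hundred metres, and ~4 mK of apparent temperature per Ohm.
r_sensor, r_lead, mk_per_ohm = 10_000.0, 5.0, 4.0
err_ohm = two_wire_reading(r_sensor, r_lead) - four_wire_reading(r_sensor, r_lead)
print(err_ohm, err_ohm * mk_per_ohm)  # 10 Ohm of lead error -> tens of mK
```

An uncorrected lead resistance of this size would swamp the millikelvin resolution the tool is designed for, which is why the four-wire scheme matters for climatic and heat flow studies.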

  3. High Temperature Logging and Monitoring Instruments to Explore and Drill Deep into Hot Oceanic Crust.

    Science.gov (United States)

    Denchik, N.; Pezard, P. A.; Ragnar, A.; Jean-Luc, D.; Jan, H.

    2014-12-01

    Drilling an entire section of the oceanic crust and through the Moho has been a goal of the scientific community for more than half a century. On the basis of ODP and IODP experience and data, this will require instruments and strategies working at temperatures far above 200 °C (reached, for example, at the bottom of DSDP/ODP Hole 504B), and possibly beyond 300 °C. Concerning logging and monitoring instruments, progress was made over the past ten years in the context of the HiTI ("High Temperature Instruments") project, funded by the European Community for deep drilling in hot Icelandic geothermal holes where supercritical conditions and a highly corrosive environment are expected at depth (with temperatures above 374 °C and pressures exceeding 22 MPa). For example, a slickline tool (memory tool) tolerating up to 400 °C and wireline tools up to 300 °C were developed and tested in Icelandic high-temperature geothermal fields. The temperature limitation of the logging tools was defined to comply with the present limitation of wireline cables (320 °C). As part of this new set of downhole tools, temperature, pressure, fluid flow and casing collar location can be measured up to 400 °C with a single multisensor tool. Natural gamma radiation spectrum and borehole wall ultrasonic image tools, as well as fiber optic cables (using distributed temperature sensing methods), were also developed for wireline deployment up to 300 °C and tested in the field. A wireline, dual laterolog electrical resistivity tool was also developed but could not be field tested as part of HiTI. This new set of tools constitutes a basis for the deep exploration of the oceanic crust in the future. In addition, new strategies could be developed, including the real-time integration of drilling parameters with modeling of the thermo-mechanical status of the borehole, using time-lapse logging of temperature (for heat flow determination) and borehole wall images (for hole stability and in-situ stress determination).

  4. [Using log-binomial model for estimating the prevalence ratio].

    Science.gov (United States)

    Ye, Rong; Gao, Yan-hui; Yang, Yi; Chen, Yue

    2010-05-01

    The aim was to estimate prevalence ratios using a log-binomial model with or without continuous covariates. Prevalence ratios for individuals' attitude towards smoking-ban legislation associated with smoking status, estimated using a log-binomial model, were compared with odds ratios estimated by a logistic regression model. In the log-binomial modelling, the maximum likelihood method was used when there were no continuous covariates, and the COPY approach was used if the model did not converge, for example due to the existence of continuous covariates. We examined the association between individuals' attitude towards smoking-ban legislation and smoking status in men and women. Prevalence ratio and odds ratio estimation provided similar results for the association in women, since smoking was not common. In men, however, the odds ratio estimates were markedly larger than the prevalence ratios due to a higher prevalence of the outcome. The log-binomial model did not converge when age was included as a continuous covariate, and the COPY method was used to deal with this situation. All analyses were performed in SAS. The prevalence ratio seemed to be a better measure of the association than the odds ratio when prevalence is high. SAS programs were provided to calculate the prevalence ratios with or without continuous covariates in the log-binomial regression analysis.

  5. Log-Normal Turbulence Dissipation in Global Ocean Models

    Science.gov (United States)

    Pearson, Brodie; Fox-Kemper, Baylor

    2018-03-01

    Data from turbulent numerical simulations of the global ocean demonstrate that the dissipation of kinetic energy obeys a nearly log-normal distribution even at large horizontal scales O(10 km). As the horizontal scales of resolved turbulence are larger than the ocean is deep, the Kolmogorov-Yaglom theory for intermittency in 3D homogeneous, isotropic turbulence cannot apply; instead, the down-scale potential enstrophy cascade of quasigeostrophic turbulence should. Yet, energy dissipation obeys approximate log-normality—robustly across depths, seasons, regions, and subgrid schemes. The distribution parameters, skewness and kurtosis, show small systematic departures from log-normality with depth and subgrid friction schemes. Log-normality suggests that a few high-dissipation locations dominate the integrated energy and enstrophy budgets, which should be taken into account when making inferences from simplified models and inferring global energy budgets from sparse observations.
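
Log-normality with a growing log-standard-deviation implies rapidly growing skewness and kurtosis, i.e. a few locations dominating the budget. The closed-form moments of a log-normal distribution illustrate this; the sigma values below are arbitrary, not taken from the study:

```python
import math

def lognormal_skew_exkurt(sigma):
    """Closed-form skewness and excess kurtosis of a log-normal
    distribution whose log has standard deviation sigma."""
    w = math.exp(sigma ** 2)
    skew = (w + 2.0) * math.sqrt(w - 1.0)
    exkurt = w ** 4 + 2.0 * w ** 3 + 3.0 * w ** 2 - 6.0
    return skew, exkurt

# Arbitrary sigma values: both moments grow explosively with sigma.
for sigma in (0.5, 1.0, 1.5):
    s, k = lognormal_skew_exkurt(sigma)
    print(sigma, round(s, 2), round(k, 1))
```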

  6. Monte Carlo Numerical Models for Nuclear Logging Applications

    Directory of Open Access Journals (Sweden)

    Fusheng Li

    2012-06-01

    Nuclear logging is one of the most important logging services provided by many oil service companies. The main parameters of interest are formation porosity, bulk density, and natural radiation. Other services are also provided using complex nuclear logging tools, such as formation lithology/mineralogy, etc. Some parameters can be measured using neutron logging tools and some can only be measured using a gamma ray tool. To understand the response of nuclear logging tools, neutron transport/diffusion theory and photon diffusion theory are needed. Unfortunately, for most cases there are no analytical answers if complex tool geometry is involved. For many years, Monte Carlo numerical models have been used by nuclear scientists in the well logging industry to address these challenges. The models have been widely employed in the optimization of nuclear logging tool design, and the development of interpretation methods for nuclear logs. They have also been used to predict the response of nuclear logging systems for forward simulation problems. In this case, the system parameters including geometry, materials and nuclear sources, etc., are pre-defined, and the transport and interactions of nuclear particles (such as neutrons, photons and/or electrons) in the regions of interest are simulated according to detailed nuclear physics theory and their nuclear cross-section data (probability of interacting). Then the deposited energies of particles entering the detectors are recorded and tallied, and the tool responses to such a scenario are generated. A general-purpose code named Monte Carlo N-Particle (MCNP) has been the industry standard for some time. In this paper, we briefly introduce the fundamental principles of Monte Carlo numerical modeling and review the physics of MCNP. Some of the latest developments of Monte Carlo models are also reviewed. A variety of examples are presented to illustrate the uses of Monte Carlo numerical models
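
The flavor of such forward simulation can be conveyed by a toy example: estimating uncollided transmission through a slab by sampling exponential free paths, for which the analytic answer exp(-mu*x) is available as a check. This is a deliberately minimal sketch of the Monte Carlo idea, not MCNP:

```python
import math
import random

random.seed(1)

def mc_transmission(mu, thickness, n):
    """Estimate the uncollided transmission through a slab by sampling
    exponential free paths with attenuation coefficient mu."""
    passed = sum(1 for _ in range(n) if random.expovariate(mu) > thickness)
    return passed / n

mu, x = 1.0, 1.0                      # hypothetical coefficient (1/cm), 1 cm slab
est = mc_transmission(mu, x, 100_000)
print(round(est, 2), round(math.exp(-mu * x), 2))  # both near 0.37
```

A production code tracks scattering, energy deposition and detector tallies from cross-section libraries; the statistical machinery, however, is the same sampling-and-tallying loop.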

  7. TENSOR DECOMPOSITIONS AND SPARSE LOG-LINEAR MODELS

    Science.gov (United States)

    Johndrow, James E.; Bhattacharya, Anirban; Dunson, David B.

    2017-01-01

    Contingency table analysis routinely relies on log-linear models, with latent structure analysis providing a common alternative. Latent structure models lead to a reduced rank tensor factorization of the probability mass function for multivariate categorical data, while log-linear models achieve dimensionality reduction through sparsity. Little is known about the relationship between these notions of dimensionality reduction in the two paradigms. We derive several results relating the support of a log-linear model to nonnegative ranks of the associated probability tensor. Motivated by these findings, we propose a new collapsed Tucker class of tensor decompositions, which bridge existing PARAFAC and Tucker decompositions, providing a more flexible framework for parsimoniously characterizing multivariate categorical data. Taking a Bayesian approach to inference, we illustrate empirical advantages of the new decompositions. PMID:29332971

  8. Estimation of oil reservoir thermal properties through temperature log data using inversion method

    International Nuclear Information System (INIS)

    Cheng, Wen-Long; Nian, Yong-Le; Li, Tong-Tong; Wang, Chang-Long

    2013-01-01

    Oil reservoir thermal properties not only play an important role in steam injection well heat transfer, but also are the basic parameters for evaluating the oil saturation in the reservoir. In this study, for estimating reservoir thermal properties, a novel heat and mass transfer model of a steam injection well was established first; this model gives a full analysis of the wellbore-reservoir heat and mass transfer as well as the wellbore-formation, and the results simulated by the model were quite consistent with the log data. The study then presents an effective inversion method for estimating the reservoir thermal properties from temperature log data. This method is based on the heat transfer model in steam injection wells, and can be used to predict the thermal properties as a stochastic approximation method. The inversion method was applied to estimate the reservoir thermal properties of two steam injection wells; it was found that the relative errors of thermal conductivity for the two wells were 2.9% and 6.5%, and the relative errors of volumetric specific heat capacity were 6.7% and 7.0%, which demonstrates the feasibility of the proposed method for estimating the reservoir thermal properties. - Highlights: • An effective inversion method for predicting the oil reservoir thermal properties is presented. • A novel model for steam injection wells gives a full account of the wellbore-reservoir heat and mass transfer. • The wellbore temperature field and steam parameters can be simulated efficiently by the model. • Both reservoir and formation thermal properties can be estimated simultaneously by the proposed method. • The estimated steam temperature was quite consistent with the field data
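
The inversion idea, adjusting a thermal property until a forward heat-transfer model reproduces the temperature log, can be sketched with a toy steady-conduction model and a grid search in place of the paper's stochastic approximation; all values below are synthetic:

```python
import random

random.seed(7)

# Synthetic steady-state conduction "log": T(z) = T0 + (q / k) * z.
t0, q, k_true = 20.0, 0.06, 2.0            # degC, W/m2, W/(m K); hypothetical
zs = [50.0 * i for i in range(1, 21)]      # depths 50..1000 m
obs = [t0 + q / k_true * z + random.gauss(0.0, 0.05) for z in zs]

def sse(k):
    """Misfit between the forward model with conductivity k and the log."""
    return sum((t0 + q / k * z - t) ** 2 for z, t in zip(zs, obs))

# Grid search over 1.0..3.0 W/(m K) stands in for stochastic approximation.
k_grid = [1.0 + 0.01 * i for i in range(201)]
k_est = min(k_grid, key=sse)
print(round(k_est, 2))  # recovers a value close to k_true = 2.0
```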

  9. Testing and analysis of internal hardwood log defect prediction models

    Science.gov (United States)

    R. Edward Thomas

    2011-01-01

    The severity and location of internal defects determine the quality and value of lumber sawn from hardwood logs. Models have been developed to predict the size and position of internal defects based on external defect indicator measurements. These models were shown to predict approximately 80% of all internal knots based on external knot indicators. However, the size...

  10. Preliminary report on NTS spectral gamma logging and calibration models

    International Nuclear Information System (INIS)

    Mathews, M.A.; Warren, R.G.; Garcia, S.R.; Lavelle, M.J.

    1985-01-01

Facilities are now available at the Nevada Test Site (NTS) in Building 2201 to calibrate spectral gamma logging equipment in environments of low radioactivity. Such environments are routinely encountered during logging of holes at the NTS. Four calibration models were delivered to Building 2201 in January 1985. Each model, or test pit, consists of a stone block with a 12-inch-diameter cored borehole. Preliminary radioelement values from the core for the test pits range from 0.58 to 3.83% potassium (K), 0.48 to 29.11 ppm thorium (Th), and 0.62 to 40.42 ppm uranium (U). Two satellite holes, U19ab #2 and U19ab #3, were logged during the winter of 1984-1985. The response of these logs correlates with the contents of the naturally radioactive elements K, Th, and U determined in samples from petrologic zones that occur within these holes. Based on these comparisons, the spectral gamma log aids in the recognition and mapping of subsurface stratigraphic units and of alteration features associated with unusual concentrations of these radioactive elements, such as clay-rich zones

  11. Modeling and validating the grabbing forces of hydraulic log grapples used in forest operations

    Science.gov (United States)

    Jingxin Wang; Chris B. LeDoux; Lihai Wang

    2003-01-01

    The grabbing forces of log grapples were modeled and analyzed mathematically under operating conditions when grabbing logs from compact log piles and from bunch-like log piles. The grabbing forces are closely related to the structural parameters of the grapple, the weight of the grapple, and the weight of the log grabbed. An operational model grapple was designed and...

  12. An ecosystem model for tropical forest disturbance and selective logging

    Science.gov (United States)

    Maoyi Huang; Gregory P. Asner; Michael Keller; Joseph A. Berry

    2008-01-01

A new three-dimensional version of the Carnegie-Ames-Stanford Approach (CASA) ecosystem model (CASA-3D) was developed to simulate regional carbon cycling in tropical forest ecosystems after disturbances such as logging. CASA-3D has the following new features: (1) an alternative approach for calculating absorbed photosynthetically active radiation (APAR) using new...

  13. Validation of an internal hardwood log defect prediction model

    Science.gov (United States)

    R. Edward. Thomas

    2011-01-01

    The type, size, and location of internal defects dictate the grade and value of lumber sawn from hardwood logs. However, acquiring internal defect knowledge with x-ray/computed-tomography or magnetic-resonance imaging technology can be expensive both in time and cost. An alternative approach uses prediction models based on correlations among external defect indicators...

  14. Combined Log Inventory and Process Simulation Models for the Planning and Control of Sawmill Operations

    Science.gov (United States)

    Guillermo A. Mendoza; Roger J. Meimban; Philip A. Araman; William G. Luppold

    1991-01-01

    A log inventory model and a real-time hardwood process simulation model were developed and combined into an integrated production planning and control system for hardwood sawmills. The log inventory model was designed to monitor and periodically update the status of the logs in the log yard. The process simulation model was designed to estimate various sawmill...

  15. Defining a Family of Cognitive Diagnosis Models Using Log-Linear Models with Latent Variables

    Science.gov (United States)

    Henson, Robert A.; Templin, Jonathan L.; Willse, John T.

    2009-01-01

    This paper uses log-linear models with latent variables (Hagenaars, in "Loglinear Models with Latent Variables," 1993) to define a family of cognitive diagnosis models. In doing so, the relationship between many common models is explicitly defined and discussed. In addition, because the log-linear model with latent variables is a general model for…

  16. Combination of Well-Logging Temperature and Thermal Remote Sensing for Characterization of Geothermal Resources in Hokkaido, Northern Japan

    Directory of Open Access Journals (Sweden)

    Bingwei Tian

    2015-03-01

Geothermal resources have become an increasingly important source of renewable energy for electrical power generation worldwide. Combined three-dimensional (3D) subsurface temperature (SST) and land surface temperature (LST) measurements are essential for accurate assessment of geothermal resources. In this study, subsurface and surface temperature distributions were combined using a dataset comprising well logs and thermal infrared remote sensing (TIR) images from Hokkaido island, northern Japan. Using 28,476 temperature data points from 433 borehole sites and the method of kriging with external drift (KED), an SST distribution model for depths of 100 to 1500 m was produced. Regional LST was estimated from 13 scenes of Landsat 8 imagery. The resulting SST ranged from around 50 °C to 300 °C at a depth of 1500 m. Most of western Hokkaido and part of eastern Hokkaido are characterized by high temperature gradients, while low temperatures were found in the central region. Higher temperatures in the shallower crust imply that the western region and part of the eastern region have high geothermal potential. Moreover, several LST zones considered to have high geothermal potential were identified once the underground heat distribution was clarified by the 3D SST; LST in these zones showed anomalies 3 to 9 °C higher than the surrounding areas. These results demonstrate that combining TIR imagery with 3D temperature modeling based on well logging and geostatistics is an efficient and promising approach to geothermal resource exploration.
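The two ingredients this record combines, extrapolating subsurface temperature from a gradient and flagging surface-temperature anomalies against a background, can be sketched very simply. This is an illustration of the arithmetic only (not the KED interpolation); the gradient, background mean, and 3 °C threshold are assumptions loosely echoing the abstract.

```python
# Sketch: extrapolating subsurface temperature from a surface temperature
# and a well-log-derived gradient, and flagging LST pixels that sit well
# above a background mean. All values are illustrative assumptions.

def subsurface_temperature(t_surface_c, gradient_c_per_km, depth_m):
    return t_surface_c + gradient_c_per_km * depth_m / 1000.0

def lst_anomalies(lst_values, background_mean, threshold_c=3.0):
    """Return indices of pixels at least `threshold_c` above background."""
    return [i for i, t in enumerate(lst_values)
            if t - background_mean >= threshold_c]

t_1500 = subsurface_temperature(10.0, 60.0, 1500.0)  # degC at 1500 m
hot = lst_anomalies([18.0, 22.5, 19.0, 27.0], background_mean=19.0)
```

A 60 °C/km gradient from a 10 °C surface gives 100 °C at 1500 m, and two of the four example pixels clear the anomaly threshold.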

  17. Modeling the dielectric logging tool at high frequency

    International Nuclear Information System (INIS)

    Chew, W.C.

    1987-01-01

The high-frequency dielectric logging tool has been used widely in electromagnetic well logging because, by measuring the dielectric constant at high frequency (1 GHz), the water saturation of rocks can be determined without measuring the water salinity. As such, it can be used to delineate fresh-water-bearing zones, since the dielectric constant of fresh water is much higher than that of oil even when the two have the same resistivity. The authors present a computer model which, through electromagnetic field analysis, simulates the response of such a measurement tool in a well logging environment. As the measurement is performed at high frequency, usually with small separation between the transmitter and receivers, small geological features can be measured by such a tool. The computer model is used to study the behavior of the tool across geological bed boundaries and across thin geological beds; such a study can be very useful in understanding the limits on the resolution of the tool. Furthermore, the standoff effect and the depth of investigation of the tool can be studied, delineating the range of usefulness of the measurement

  18. Headwater stream temperature: interpreting response after logging, with and without riparian buffers, Washington, USA

    Science.gov (United States)

    Jack E. Janisch; Steven M. Wondzell; William J. Ehinger

    2012-01-01

    We examined stream temperature response to forest harvest in small forested headwater catchments in western Washington, USA over a seven year period (2002-2008). These streams have very low discharge in late summer and many become spatially intermittent. We used a before-after, control-impact (BACl) study design to contrast the effect of clearcut logging with two...

  19. Estimation of geothermal gradients from single temperature log-field cases

    International Nuclear Information System (INIS)

    Kutasov, I M; Eppelbaum, L V

    2009-01-01

The geothermal gradient is one of the most frequently used parameters in logging geophysics. However, the drilling process greatly disturbs the temperature of the formations around the wellbore, so determining formation temperatures and geothermal gradients with the required accuracy demands a certain length of shut-in time. It was shown earlier (Kutasov 1968 Freiberger Forschungshefte C 238 55–61, 1987 Geothermics 16 467–72) that at least two transient temperature surveys are needed to determine the geothermal gradient with adequate accuracy. In many cases, however, only one temperature log is conducted in a shut-in borehole. For these cases, we propose an approximate method for estimating the geothermal gradient. Its use is demonstrated on four field examples
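For context, the quantity being estimated is simply the slope of temperature versus depth. The sketch below fits that slope by ordinary least squares on synthetic depth-temperature pairs; it illustrates the target quantity, not Kutasov and Eppelbaum's single-log correction method, and the data are made up.

```python
# Sketch: estimating a geothermal gradient from depth-temperature pairs
# by ordinary least squares. The "log" below is synthetic.

def fit_gradient(depths_m, temps_c):
    n = len(depths_m)
    mean_z = sum(depths_m) / n
    mean_t = sum(temps_c) / n
    num = sum((z - mean_z) * (t - mean_t) for z, t in zip(depths_m, temps_c))
    den = sum((z - mean_z) ** 2 for z in depths_m)
    slope = num / den                  # degC per metre
    intercept = mean_t - slope * mean_z
    return slope * 1000.0, intercept   # gradient in degC/km, surface temp

depths = [200.0, 400.0, 600.0, 800.0]
temps = [14.0, 19.0, 24.0, 29.0]       # a perfect 25 degC/km trend
gradient, t0 = fit_gradient(depths, temps)
```

On the noise-free trend the fit returns 25 °C/km with a 9 °C surface intercept.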

  20. Repeated temperature logs from Czech, Slovenian and Portuguese borehole climate observatories

    Czech Academy of Sciences Publication Activity Database

    Šafanda, Jan; Rajver, D.; Correia, A.; Dědeček, Petr

    2007-01-01

Vol. 3, No. 3 (2007), pp. 453-462 ISSN 1814-9324 R&D Projects: GA AV ČR(CZ) IAA300120603 Grant - others: NATO(US) PDD(CP)-(EST.CLG 980 152) Institutional research plan: CEZ:AV0Z30120515 Source of funding: V - other public sources Keywords: borehole temperatures * temperature logs * borehole climate observatories Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 1.450, year: 2007

  1. High temperature color conductivity at next-to-leading log order

    International Nuclear Information System (INIS)

    Arnold, Peter; Yaffe, Laurence G.

    2000-01-01

    The non-Abelian analogue of electrical conductivity at high temperature has previously been known only at leading logarithmic order -- that is, neglecting effects suppressed only by an inverse logarithm of the gauge coupling. We calculate the first sub-leading correction. This has immediate application to improving, to next-to-leading log order, both effective theories of non-perturbative color dynamics, and calculations of the hot electroweak baryon number violation rate

  2. Temperature logging of groundwater in bedrock wells for geothermal gradient characterization in New Hampshire, 2012

    Science.gov (United States)

    Degnan, James; Barker, Gregory; Olson, Neil; Wilder, Leland

    2012-01-01

    The U.S. Geological Survey, in cooperation with the New Hampshire Geological Survey, measured the fluid temperature of groundwater in deep bedrock wells in the State of New Hampshire in order to characterize geothermal gradients in bedrock. All wells selected for the study had low water yields, which correspond to low groundwater flow from fractures. This reduced the potential for flow-induced temperature changes that would mask the natural geothermal gradient in the bedrock. All the wells included in this study were privately owned, and permission to use the wells was obtained from homeowners before logging.

  3. Design A Prototype of Temperature Logging Tools for Geothermal Prospecting Areas

    Directory of Open Access Journals (Sweden)

    Supriyanto

    2013-08-01

The costs of geothermal exploration are very high because the technology is still imported from other countries, and local players in the geothermal sector cannot compete with global companies. To reduce costs, we need to develop our own equipment at competitive prices. Here in Indonesia, we have started to design a prototype of temperature logging tools for geothermal prospecting areas. The equipment detects temperature variation with depth: to measure the thermal gradient, a platinum resistance temperature sensor is moved slowly down the borehole, and its displacement along the borehole is measured by a rotary encoder. The system is controlled by a 16-bit H8/3069F microcontroller, and the acquired temperature data are displayed on a PC monitor using a Python graphical user interface. The system has already been tested in the Gunung Pancar geothermal prospect area in Bogor.

  4. Geothermal regime of Tarim basin, NW China: insights from borehole temperature logging

    Science.gov (United States)

    Liu, S.; Lei, X.

    2013-12-01

The geothermal regime of a sedimentary basin is vital for understanding basin (de)formation processes and hydrocarbon generation status, and for assessing resource potential. Located on a Precambrian craton block, the Tarim basin is the largest intermountain basin in China and an ongoing target of oil and gas exploration. Previous knowledge of the basin's thermal regime came from limited oil-exploration borehole testing temperatures, and the inherent deficiencies of data of this type made accurate understanding of the thermal regime impossible. Here we report our latest steady-state temperature logging results in this basin and analyze its thermal regime. In this study, 10 temperature logs were run in the northern Tarim basin, where the major oil and gas fields have been discovered. All the logged boreholes are non-production wells that had been shut in for at least 2-3 years, ensuring thermal equilibrium after drilling. The derived geothermal gradients vary from 20.2 to 26.1 °C/km, with a mean of 22.0 °C/km. Some previously reported gradients in this area are markedly lower than our results; for example, the earlier gradient for the THN2 well is 13.2 °C/km versus 23.2 °C/km in this study, a discrepancy attributable to insufficient equilibration time before the earlier logging. More importantly, high gradients usually occur in the gas fields, where they exceed those of the oil fields, indicating a higher thermal regime in the gas fields. The cause of this phenomenon is unclear; upward migration of hot fluid along fault conduits is speculated to be a possible mechanism for this geothermal anomaly in the oil and gas fields. Combined with measured thermal conductivity data, 10 new heat flow values were also obtained: the heat flow of the Tarim basin is between 38 mW/m2 and 52 mW/m2, with a mean of 43 mW/m2. This relatively low heat flow is coincident with that of typical
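The heat flow values in this record combine a measured gradient with a measured conductivity via Fourier's law, q = k·dT/dz. A minimal sketch of that unit-conversion arithmetic, with a conductivity value chosen as an illustrative assumption near the ranges the abstract reports:

```python
# Sketch: Fourier's law gives conductive heat flow as conductivity times
# gradient, q = k * dT/dz. The inputs below are illustrative assumptions
# chosen near the ranges reported for the Tarim basin.

def heat_flow_mw_m2(conductivity_w_mk, gradient_c_per_km):
    gradient_c_per_m = gradient_c_per_km / 1000.0
    return conductivity_w_mk * gradient_c_per_m * 1000.0  # W/m^2 -> mW/m^2

q = heat_flow_mw_m2(2.0, 22.0)   # ~44 mW/m^2, inside the reported range
```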

  5. Production-log base model for carbonate permeability distribution and steam flood optimization

    Energy Technology Data Exchange (ETDEWEB)

    Ahamed, S.F.; Choudhry, M.A.; Abdulbaqi, J.B. [Kuwait Gulf Oil Co. (Kuwait)

    2008-10-15

This paper presented a model for the effective management of primary and thermal oil recovery operations in the Wafra Field in Kuwait, where a small huff-and-puff project was carried out in 1998 to determine whether steam injection was a feasible recovery option for the field. The Eocene heavy-oil reservoirs of the Wafra Field are carbonate rock admixtures with gypsum and anhydrite. They are the shallowest of the field's productive horizons and exhibit a high degree of fluid-flow heterogeneity, so assessment of vertical and lateral permeability variation is a key factor in the success of the reservoir development plan. Steam injection began in 2006 in a small scale test (SST) to determine whether the innovative technology could produce steam from effluent water and to test the viability of steam injection in carbonate reservoirs. Following the success of the SST, a large scale pilot (LSP) was scheduled to start in 2009. The model showed that the productivity of the Eocene wells could be correlated with commonly available logs to develop a log-based permeability model. A series of cross-plots for the perforated intervals of high- and low-productivity wells was constructed to relate well productivity to the location of log parameters on the plots, establishing a relationship between rock quality, productivity, and conventional log parameters. It was concluded that the vertical permeability and interwell continuity in the Eocene wells can be used to optimize new well placement for horizontal and vertical infill drilling, as well as the completion strategies of injectors and producers in steam injection. The model is also an effective tool for predicting the steam injectivity profile and understanding anomalies in the temperature-depth distribution, and it can be used to improve the efficiency of formation heating by optimizing the steam flood process and steam-pattern well completion. 16 refs.

  6. Paleoclimatic reconstructions in western Canada from borehole temperature logs: surface air temperature forcing and groundwater flow

    Czech Academy of Sciences Publication Activity Database

    Majorowicz, J.; Grasby, S. E.; Ferguson, G.; Šafanda, Jan; Skinner, W.

    2006-01-01

    Roč. 2, č. 1 (2006), s. 1-10 ISSN 1814-9324 Institutional research plan: CEZ:AV0Z30120515 Keywords : palaeoclimatic reconstructions * Canada * borehole temperatures Subject RIV: DC - Siesmology, Volcanology, Earth Structure

  7. Calibration models for density borehole logging - construction report

    International Nuclear Information System (INIS)

    Engelmann, R.E.; Lewis, R.E.; Stromswold, D.C.

    1995-10-01

Two machined blocks of magnesium and aluminum alloys form the basis for Hanford's density models. The blocks provide known densities of 1.780 ± 0.002 g/cm³ and 2.804 ± 0.002 g/cm³ for calibrating borehole logging tools that measure density based on gamma-ray scattering from a source in the tool. Each block is approximately 33 x 58 x 91 cm (13 x 23 x 36 in.) with cylindrical grooves cut into the sides of the blocks to hold steel casings of inner diameter 15 cm (6 in.) and 20 cm (8 in.). Spacers inserted between the blocks and casings can create air gaps of thickness 0.64, 1.3, 1.9, and 2.5 cm (0.25, 0.5, 0.75, and 1.0 in.), simulating air gaps that can occur in actual wells from hole enlargements behind the casing

  8. Application of computer mathematical modeling in nuclear well-logging industry

    International Nuclear Information System (INIS)

    Cai Shaohui

    1994-01-01

Nuclear well logging techniques have made rapid progress since the first well-log calibration facility (the API pits) was dedicated in 1959, and the first computer mathematical models followed in the late 1970s. Mathematical modeling can now minimize design and experiment time, as well as provide new information and ideas on tool design, environmental effects, and result interpretation. The author gives a brief review of the achievements of mathematical modeling on nuclear logging problems

  9. Computer model for calculating gamma-ray pulse-height spectra for logging applications

    International Nuclear Information System (INIS)

    Evans, M.L.

    1981-01-01

    A generalized computer model has been devised to simulate the emission, transport, and detection of natural gamma radiation from various logging environments. The model yields high-resolution gamma-ray pulse-height spectra that can be used to correct both gross gamma and spectral gamma-ray logs. The technique can help provide corrections to airborne and surface radiometric survey logs for the effects of varying altitude, formation composition, and overburden. Applied to borehole logging, the model can yield estimates of the effects of varying borehole fluid and casing attenuations, as well as varying formation porosity and saturation

  10. Estimation of geological formation thermal conductivity by using stochastic approximation method based on well-log temperature data

    International Nuclear Information System (INIS)

    Cheng, Wen-Long; Huang, Yong-Hua; Liu, Na; Ma, Ran

    2012-01-01

Thermal conductivity is a key parameter for evaluating wellbore heat losses, which play an important role in determining the efficiency of steam injection processes. In this study, an unsteady formation heat-transfer model was established and a cost-effective in situ method using stochastic approximation based on well-log temperature data is presented. The proposed method estimates the thermal conductivity and the volumetric heat capacity of the geological formation simultaneously under in situ conditions. The feasibility of the method was assessed by a sample test, the results of which showed that the thermal conductivity and the volumetric heat capacity could be obtained with relative errors of −0.21% and −0.32%, respectively. In addition, three field tests were conducted based on easily obtainable well-log temperature data from steam injection wells. The relative errors of thermal conductivity for the three field tests were within ±0.6%, demonstrating the excellent performance of the proposed method for calculating thermal conductivity. The relative errors of volumetric heat capacity ranged from −6.1% to −14.2% for the three field tests; sensitivity analysis indicated that this was due to the low correlation between the volumetric heat capacity and the wellbore temperature, which was used to generate the judgment criterion. -- Highlights: • A cost-effective in situ method for estimating formation thermal properties is presented. • Thermal conductivity and volumetric heat capacity can be estimated simultaneously by the proposed method. • The relative error of the estimated thermal conductivity was within ±0.6%. • Sensitivity analysis was conducted to study the estimated thermal properties.
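The flavor of a stochastic-approximation estimator can be sketched with a Robbins-Monro-style iteration: repeatedly nudge the parameter with a decaying gain until the model-data residual averages to zero. The toy below fits the slope u = q/k of a linear temperature model and recovers k = q/u; the forward model, data, heat flux, and gain schedule are all illustrative assumptions, not the paper's formulation.

```python
# Sketch of a Robbins-Monro-style update: estimate the gradient u = q/k
# of a linear well-log temperature model T(z) = T0 + u*z by driving the
# mean residual to zero with a decaying gain, then recover k = q/u.
# Data, heat flux q, and the gain schedule are illustrative assumptions.

def estimate_conductivity(depths, temps, t0, q, u_init=0.05, n_iter=200):
    u = u_init
    m = len(depths)
    for n in range(1, n_iter + 1):
        residual = sum((t0 + u * z) - t for z, t in zip(depths, temps)) / m
        u -= (0.001 / n) * residual    # decaying Robbins-Monro gain
    return q / u                       # conductivity k = q / u

depths = [300.0, 600.0, 900.0]         # m
q, t0, true_k = 0.06, 12.0, 2.0        # W/m^2, degC, W/(m K)
temps = [t0 + (q / true_k) * z for z in depths]
k_est = estimate_conductivity(depths, temps, t0, q)
```

With the 1/n gain the iterate contracts toward the true slope, so the recovered conductivity lands close to the value used to generate the synthetic log.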

  11. Progress in nuclear well logging modeling using deterministic transport codes

    International Nuclear Information System (INIS)

    Kodeli, I.; Aldama, D.L.; Maucec, M.; Trkov, A.

    2002-01-01

Further studies, continuing the work presented in 2001 in Portorož, were performed to study and improve the performance, precision and domain of application of deterministic transport codes with respect to oil well logging analysis. These codes are in particular expected to complement Monte Carlo solutions, since they can provide a detailed particle flux distribution over the whole geometry in very reasonable CPU time; real-time calculation can be envisaged. The performance of deterministic transport methods was compared to that of the Monte Carlo method: the IRTMBA generic benchmark was analysed using the codes MCNP-4C and DORT/TORT. Centered as well as eccentric casings were considered, using a 14 MeV point neutron source and NaI scintillation detectors. Neutron and gamma spectra were compared at two detector positions. (author)

  12. Effects of post-fire logging on forest surface air temperatures in the Siskiyou Mountains, Oregon, USA

    Science.gov (United States)

    Joseph B. Fontaine; Daniel C. Donato; John L. Campbell; Jonathan G. Martin; Beverley E. Law

    2010-01-01

    Following stand-replacing wildfire, post-fire (salvage) logging of fire-killed trees is a widely implemented management practice in many forest types. A common hypothesis is that removal of fire-killed trees increases surface temperatures due to loss of shade and increased solar radiation, thereby influencing vegetation establishment and possibly stand development. Six...

  13. [Evaluation of estimation of prevalence ratio using bayesian log-binomial regression model].

    Science.gov (United States)

    Gao, W L; Lin, H; Liu, X N; Ren, X W; Li, J S; Shen, X P; Zhu, S L

    2017-03-10

To evaluate the estimation of the prevalence ratio (PR) using a Bayesian log-binomial regression model and its application, we estimated the PR of medical care-seeking prevalence with respect to caregivers' recognition of risk signs of diarrhea in their infants, using a Bayesian log-binomial regression model in OpenBUGS software. The results showed that caregivers' recognition of an infant's risk signs of diarrhea was associated with a significant 13% increase in medical care-seeking. We then compared the point and interval estimates of the PR, and the convergence, of three models (model 1: not adjusting for covariates; model 2: adjusting for duration of caregivers' education; model 3: adjusting for distance between village and township and child age in months, based on model 2) between the Bayesian log-binomial regression model and the conventional log-binomial regression model. All three Bayesian log-binomial regression models converged, with estimated PRs of 1.130 (95%CI: 1.005-1.265), 1.128 (95%CI: 1.001-1.264) and 1.132 (95%CI: 1.004-1.267), respectively. Conventional log-binomial regression models 1 and 2 converged, with PRs of 1.130 (95%CI: 1.055-1.206) and 1.126 (95%CI: 1.051-1.203), respectively, but model 3 failed to converge, so the COPY method was used to estimate the PR, which was 1.125 (95%CI: 1.051-1.200). The point and interval estimates of the PRs from the three Bayesian log-binomial regression models differed only slightly from those of the conventional log-binomial regression model, showing good consistency in estimating the PR. Therefore, the Bayesian log-binomial regression model can effectively estimate the PR with less misconvergence and has advantages in application over the conventional log-binomial regression model.
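The quantity a log-binomial model targets, the prevalence ratio, reduces in the unadjusted two-group case to a ratio of two proportions. A minimal sketch with made-up counts chosen so the PR echoes the 13% increase reported above:

```python
# Sketch: the prevalence ratio (PR) estimated by an unadjusted
# log-binomial model, computed directly from a 2x2 table.
# The counts below are made-up assumptions.

def prevalence_ratio(exposed_cases, exposed_total,
                     unexposed_cases, unexposed_total):
    p_exposed = exposed_cases / exposed_total
    p_unexposed = unexposed_cases / unexposed_total
    return p_exposed / p_unexposed

# e.g. care-seeking among caregivers who do / do not recognize risk signs
pr = prevalence_ratio(113, 200, 100, 200)   # 0.565 / 0.500 = 1.13
```

Unlike the odds ratio, this ratio of prevalences is directly interpretable as "13% more likely to seek care".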

  14. Modelling research on determining shape coefficients for subdivision interpretation in γ-ray spectral logging

    International Nuclear Information System (INIS)

    Yin Wangming; She Guanjun; Tang Bin

    2011-01-01

This paper first describes the physical meaning of the shape coefficients in the subdivision interpretation of γ-ray logging, then discusses the theory and method for determining practical shape coefficients with a logging model, and defines a formula to approximately calculate the coefficients. A great deal of experimental work has been performed with an HPGe γ-ray spectrometer, reaching satisfactory results that validate the efficiency of the modelling method. (authors)

  15. Determination of Transport Properties From Flowing Fluid Temperature Logging In Unsaturated Fractured Rocks: Theory And Semi-Analytical Solution

    International Nuclear Information System (INIS)

    Mukhopadhyay, Sumit; Tsang, Yvonne W.

    2008-01-01

Flowing fluid temperature logging (FFTL) has been recently proposed as a method to locate flowing fractures. We argue that FFTL, backed up by data from high-precision distributed temperature sensors, can be a useful tool in locating flowing fractures and in estimating the transport properties of unsaturated fractured rocks. We have developed the theoretical background needed to analyze data from FFTL. In this paper, we present a simplified conceptualization of FFTL in unsaturated fractured rock, and develop a semi-analytical solution for spatial and temporal variations of pressure and temperature inside a borehole in response to an applied perturbation (pumping of air from the borehole). We compare the semi-analytical solution with predictions from the TOUGH2 numerical simulator. Based on the semi-analytical solution, we propose a method to estimate the permeability of the fracture continuum surrounding the borehole. Using this proposed method, we estimated the effective fracture continuum permeability of the unsaturated rock hosting the Drift Scale Test (DST) at Yucca Mountain, Nevada. Our estimate compares well with previous independent estimates for fracture permeability of the DST host rock. The conceptual model of FFTL presented in this paper is based on the assumptions of single-phase flow, convection-only heat transfer, and negligible change in system state of the rock formation. In a sequel paper (Mukhopadhyay et al., 2008), we extend the conceptual model to evaluate some of these assumptions. We also perform inverse modeling of FFTL data to estimate, in addition to permeability, other transport parameters (such as porosity and thermal conductivity) of unsaturated fractured rocks

  16. Biomass yield and modeling of logging residues of Terminalia ...

    African Journals Online (AJOL)

The use of Dbh as an independent variable in models for predicting the biomass of logging residues of the tree species was adjudged best because it performed well. The validation results showed that the selected models satisfied the assumptions of regression analysis. The practical implication of the models is that ...

  17. Bladder cancer mapping in Libya based on standardized morbidity ratio and log-normal model

    Science.gov (United States)

    Alhdiri, Maryam Ahmed; Samat, Nor Azah; Mohamed, Zulkifley

    2017-05-01

Disease mapping comprises a set of statistical techniques for producing maps of rates based on estimated mortality, morbidity, and prevalence. A traditional measure of the relative risk of a disease is the standardized morbidity ratio (SMR), the ratio of the observed to the expected number of cases in an area; it has the greatest uncertainty when the disease is rare or the geographical area is small. Bayesian models, or statistical smoothing based on a log-normal model, have therefore been introduced to address this problem with the SMR. This study estimates the relative risk of bladder cancer incidence in Libya from 2006 to 2007 based on the SMR and the log-normal model, fitted to the data using WinBUGS software. The study starts with a brief review of these models, first the SMR method and then the log-normal model, which is then applied to bladder cancer incidence in Libya. All results are compared using maps and tables. The study concludes that the log-normal model gives better relative risk estimates than the classical method and can overcome the SMR's problem when no bladder cancer is observed in an area.
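The SMR itself is a one-line computation, and its instability is visible directly in a toy example: areas with small expected counts swing to extreme or degenerate values, which is what motivates the log-normal smoothing. The counts below are made-up assumptions.

```python
# Sketch: the standardized morbidity ratio (SMR) per area is the observed
# case count divided by the expected count. Small expected counts make it
# unstable (or zero), motivating model-based smoothing.
# All counts below are illustrative assumptions.

def smr(observed, expected):
    return [o / e for o, e in zip(observed, expected)]

observed = [12, 3, 0]          # cases seen in each area
expected = [10.0, 1.5, 2.0]    # cases expected from reference rates
ratios = smr(observed, expected)
```

The third area's SMR is exactly zero despite a nonzero expected count, precisely the degenerate case the abstract notes the log-normal model can handle.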

  18. Process mining using BPMN: relating event logs and process models

    NARCIS (Netherlands)

    Kalenkova, A.A.; van der Aalst, W.M.P.; Lomazova, I.A.; Rubin, V.A.

    2017-01-01

    Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring processes within PAIS. These approaches include process mining

  19. Process mining using BPMN : relating event logs and process models

    NARCIS (Netherlands)

    Kalenkova, A.A.; Aalst, van der W.M.P.; Lomazova, I.A.; Rubin, V.A.

    2015-01-01

    Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring processes within PAIS. These approaches include process mining

  20. Log-normal frailty models fitted as Poisson generalized linear mixed models.

    Science.gov (United States)

    Hirsch, Katharina; Wienke, Andreas; Kuss, Oliver

    2016-12-01

The equivalence of a survival model with a piecewise constant baseline hazard function and a Poisson regression model has been known for decades. As shown in recent studies, this equivalence carries over to clustered survival data: a frailty model with a log-normal frailty term can be interpreted and estimated as a generalized linear mixed model with a binary response, a Poisson likelihood, and a specific offset. Proceeding this way, statistical theory and software for generalized linear mixed models are readily available for fitting frailty models. This gain in flexibility comes at the small price of (1) having to fix the number of pieces for the baseline hazard in advance and (2) having to "explode" the data set by the number of pieces. In this paper we extend the simulations of former studies by using a more realistic baseline hazard (Gompertz) and by comparing the model under consideration with competing models. Furthermore, the SAS macro %PCFrailty is introduced to apply the Poisson generalized linear mixed approach to frailty models. The simulations show good results for the shared frailty model; the new %PCFrailty macro provides proper estimates, especially in the case of 4 events per piece. The suggested Poisson generalized linear mixed approach for log-normal frailty models based on the %PCFrailty macro provides several advantages in the analysis of clustered survival data: more flexible modelling of fixed and random effects, exact (in the sense of non-approximate) maximum likelihood estimation, and standard errors and different types of confidence intervals for all variance parameters. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
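The "exploding" step the abstract mentions can be sketched concretely: each survival record is split over the hazard pieces, and each piece contributes a Poisson response (the event indicator) with log(exposure time) as offset. The cut points and the record below are illustrative assumptions, not from the paper.

```python
# Sketch: "exploding" one survival record into piecewise-constant-hazard
# pieces for the Poisson GLMM formulation. Each piece carries an event
# indicator (the Poisson response) and log(exposure) as its offset.
# Cut points and the record are illustrative assumptions.

import math

def explode(time, event, cuts):
    """Split (time, event) over intervals [cuts[i], cuts[i+1])."""
    rows = []
    for i in range(len(cuts) - 1):
        start, end = cuts[i], cuts[i + 1]
        if time <= start:
            break                               # subject already gone
        exposure = min(time, end) - start       # time at risk in this piece
        died_here = 1 if (event and time <= end) else 0
        rows.append({"piece": i, "y": died_here,
                     "offset": math.log(exposure)})
    return rows

# one subject, event at t = 2.5, three hazard pieces on [0, 3)
rows = explode(time=2.5, event=1, cuts=[0.0, 1.0, 2.0, 3.0])
```

The subject contributes three rows: full exposure with y = 0 in the first two pieces, and half a unit of exposure with y = 1 in the piece where the event falls.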

  1. A Log Logistic Survival Model Applied to Hypobaric Decompression Sickness

    Science.gov (United States)

    Conkin, Johnny

    2001-01-01

    Decompression sickness (DCS) is a complex, multivariable problem. A mathematical description or model of the likelihood of DCS requires a large amount of quality research data, ideas on how to define a decompression dose using physical and physiological variables, and an appropriate analytical approach. It also requires a high-performance computer with specialized software. I have used published DCS data to develop my decompression doses, which are variants of equilibrium expressions for evolved gas plus other explanatory variables. My analytical approach is survival analysis, in which the time of DCS occurrence is modeled. My conclusions can be applied to simple hypobaric decompressions - ascents lasting from 5 to 30 minutes - preceded by minutes to hours of denitrogenation (prebreathing). They are also applicable to long or short exposures, and can be used whether the sufferer of DCS is at rest or exercising at altitude. Ultimately I would like my models to be applied to astronauts to reduce the risk of DCS during spacewalks, as well as to future spaceflight crews on the Moon and Mars.
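For readers unfamiliar with the model family named in the title, a log-logistic survival function has the form S(t) = 1 / (1 + (t/α)^β). The scale and shape values below are illustrative placeholders, not Conkin's fitted parameters:

```python
def loglogistic_survival(t, alpha, beta):
    """P(no DCS by time t) under a log-logistic model: 1 / (1 + (t/alpha)**beta)."""
    return 1.0 / (1.0 + (t / alpha) ** beta)

# Hypothetical scale (minutes) and shape parameters, for illustration only
alpha, beta = 120.0, 2.5
for t in (30, 60, 120, 240):
    print(t, round(loglogistic_survival(t, alpha, beta), 3))
```

By construction, survival is exactly 0.5 at t = α, and β controls how sharply risk accumulates around that time.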

  2. TF insert experiment log book. 2nd Experiment of CS model coil

    International Nuclear Information System (INIS)

    Sugimoto, Makoto; Isono, Takaaki; Matsui, Kunihiro

    2001-12-01

    The cool down of the CS model coil and TF insert started on August 20, 2001. It took almost one month, and coil charging started immediately on September 17, 2001. The charge test of the TF insert and CS model coil was completed on October 19, 2001. In this campaign, the total number of shots was 88 and the size of the data file in the DAS (Data Acquisition System) was about 4 GB. This report is a database that consists of the log list and the log sheets of every shot. It serves as the experiment logbook for the 2nd charge test experiment of the CS model coil and TF insert. (author)

  3. Bayesian log-periodic model for financial crashes

    DEFF Research Database (Denmark)

    Rodríguez-Caballero, Carlos Vladimir; Knapik, Oskar

    2014-01-01

    This paper introduces a Bayesian approach into the econophysics literature on financial bubbles in order to estimate the most probable time for a financial crash to occur. To this end, we propose using noninformative prior distributions to obtain posterior distributions. Since these distributions...... cannot be performed analytically, we develop a Markov Chain Monte Carlo algorithm to draw from posterior distributions. We consider three Bayesian models that involve normal and Student’s t-distributions in the disturbances and an AR(1)-GARCH(1,1) structure only within the first case. In the empirical...... part of the study, we analyze a well-known example of financial bubble – the S&P 500 1987 crash – to show the usefulness of the three methods under consideration and crashes of Merval-94, Bovespa-97, IPCMX-94, Hang Seng-97 using the simplest method. The novelty of this research is that the Bayesian...

  4. Discovering block-structured process models from event logs containing infrequent behaviour

    NARCIS (Netherlands)

    Leemans, S.J.J.; Fahland, D.; Aalst, van der W.M.P.; Lohmann, N.; Song, M.; Wohed, P.

    2014-01-01

    Given an event log describing observed behaviour, process discovery aims to find a process model that ‘best’ describes this behaviour. A large variety of process discovery algorithms has been proposed. However, no existing algorithm returns a sound model in all cases (free of deadlocks and other

  5. Nb3Al insert experiment log book. 3rd experiment of CS model coil

    International Nuclear Information System (INIS)

    Sugimoto, Makoto; Koizumi, Norikiyo; Isono, Takaaki

    2002-10-01

    The cool down of the CS model coil and Nb3Al insert started on March 4, 2002. It took almost one month, and coil charging started immediately on April 3, 2002. The charge test of the Nb3Al insert and CS model coil was completed on May 2, 2002. All of the experiments, including the warm up, were completed on May 30, 2002. In this campaign, the total number of shots was 102 and the size of the data file in the DAS (Data Acquisition System) was about 5.2 GB. This report is a database that consists of the log list and the log sheets of every shot. (author)

  6. Bayesian Hierarchical Scale Mixtures of Log-Normal Models for Inference in Reliability with Stochastic Constraint

    Directory of Open Access Journals (Sweden)

    Hea-Jung Kim

    2017-06-01

    Full Text Available This paper develops Bayesian inference in reliability for a class of scale mixtures of log-normal failure time (SMLNFT) models with stochastic (or uncertain) constraints in their reliability measures. The class is comprehensive and includes existing failure time (FT) models (such as log-normal, log-Cauchy, and log-logistic FT models) as well as new models that are robust in terms of heavy-tailed FT observations. Since classical frequentist approaches to reliability analysis based on the SMLNFT model with stochastic constraint are intractable, the Bayesian method is pursued utilizing a Markov chain Monte Carlo (MCMC) sampling based approach. This paper introduces a two-stage maximum entropy (MaxEnt) prior, which elicits the a priori uncertain constraint, and develops a Bayesian hierarchical SMLNFT model by using the prior. The paper also proposes an MCMC method for Bayesian inference in the SMLNFT model reliability and calls attention to properties of the MaxEnt prior that are useful for method development. Finally, two data sets are used to illustrate how the proposed methodology works.

  7. Pricing FX Options in the Heston/CIR Jump-Diffusion Model with Log-Normal and Log-Uniform Jump Amplitudes

    Directory of Open Access Journals (Sweden)

    Rehez Ahlip

    2015-01-01

    model for the exchange rate with log-normal jump amplitudes and the volatility model with log-uniformly distributed jump amplitudes. We assume that the domestic and foreign stochastic interest rates are governed by the CIR dynamics. The instantaneous volatility is correlated with the dynamics of the exchange rate return, whereas the domestic and foreign short-term rates are assumed to be independent of the dynamics of the exchange rate and its volatility. The main result furnishes a semianalytical formula for the price of the foreign exchange European call option.

  8. Accounting for measurement error in log regression models with applications to accelerated testing.

    Directory of Open Access Journals (Sweden)

    Robert Richardson

    Full Text Available In regression settings, parameter estimates will be biased when the explanatory variables are measured with error. This bias can significantly affect modeling goals. In particular, accelerated lifetime testing involves an extrapolation of the fitted model, and a small amount of bias in parameter estimates may result in a significant increase in the bias of the extrapolated predictions. Additionally, bias may arise when the stochastic component of a log regression model is assumed to be multiplicative when the actual underlying stochastic component is additive. To account for these possible sources of bias, a log regression model with measurement error and additive error is approximated by a weighted regression model which can be estimated using Iteratively Re-weighted Least Squares. Using the reduced Eyring equation in an accelerated testing setting, the model is compared to previously accepted approaches to modeling accelerated testing data with both simulations and real data.

  9. Accounting for measurement error in log regression models with applications to accelerated testing.

    Science.gov (United States)

    Richardson, Robert; Tolley, H Dennis; Evenson, William E; Lunt, Barry M

    2018-01-01

    In regression settings, parameter estimates will be biased when the explanatory variables are measured with error. This bias can significantly affect modeling goals. In particular, accelerated lifetime testing involves an extrapolation of the fitted model, and a small amount of bias in parameter estimates may result in a significant increase in the bias of the extrapolated predictions. Additionally, bias may arise when the stochastic component of a log regression model is assumed to be multiplicative when the actual underlying stochastic component is additive. To account for these possible sources of bias, a log regression model with measurement error and additive error is approximated by a weighted regression model which can be estimated using Iteratively Re-weighted Least Squares. Using the reduced Eyring equation in an accelerated testing setting, the model is compared to previously accepted approaches to modeling accelerated testing data with both simulations and real data.
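The weighted-regression approximation fitted by Iteratively Re-weighted Least Squares can be sketched as follows. The weight choice w_i = μ_i² (a common choice when additive error on the original scale makes Var(log y_i) ≈ σ²/μ_i²) and the toy data are illustrative assumptions, not the paper's exact derivation:

```python
import math

def wls(x, z, w):
    """Closed-form weighted least squares for z = a + b*x."""
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    zbar = sum(wi * zi for wi, zi in zip(w, z)) / sw
    num = sum(wi * (xi - xbar) * (zi - zbar) for wi, xi, zi in zip(w, x, z))
    den = sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x))
    b = num / den
    return zbar - b * xbar, b

# Toy data: roughly exponential growth in x, measured on the original scale
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.8, 7.2, 20.5, 54.0, 150.0]
z = [math.log(v) for v in y]

a, b = wls(x, z, [1.0] * len(x))             # start from an unweighted fit
for _ in range(20):                          # iterate weights to convergence
    mu = [math.exp(a + b * xi) for xi in x]  # current fitted means
    a, b = wls(x, z, [m * m for m in mu])    # reweight by mu_i^2
print(round(a, 3), round(b, 3))
```

The loop alternates between fitting on the log scale and updating the weights from the fitted means, which is the IRLS pattern the abstract refers to.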

  10. Minimizing bias in biomass allometry: Model selection and log transformation of data

    Science.gov (United States)

    Joseph Mascaro; undefined undefined; Flint Hughes; Amanda Uowolo; Stefan A. Schnitzer

    2011-01-01

    Nonlinear regression is increasingly used to develop allometric equations for forest biomass estimation (i.e., as opposed to the traditional approach of log-transformation followed by linear regression). Most statistical software packages, however, assume additive errors by default, violating a key assumption of allometric theory and possibly producing spurious models....
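The bias at stake can be demonstrated numerically: naively back-transforming a log-scale fit estimates the median, not the mean, and a common correction (often attributed to Baskerville) multiplies by exp(MSE/2). The σ and μ below are arbitrary illustrative values, not fitted allometric parameters:

```python
import math, random

random.seed(1)
sigma = 0.4                     # residual SD on the log scale (illustrative)
mu = 2.0                        # log-scale prediction (illustrative)

# Simulated "biomass" with multiplicative log-normal error around exp(mu)
sample = [math.exp(mu + random.gauss(0.0, sigma)) for _ in range(200000)]

naive = math.exp(mu)                           # back-transform only: the median
corrected = naive * math.exp(sigma ** 2 / 2)   # correction factor exp(MSE/2)
mean = sum(sample) / len(sample)
print(round(naive, 2), round(corrected, 2), round(mean, 2))
```

The naive back-transform systematically underestimates the arithmetic mean, while the corrected value recovers it closely.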

  11. High-Resolution Wellbore Temperature Logging Combined with a Borehole-Scale Heat Budget: Conceptual and Analytical Approaches to Characterize Hydraulically Active Fractures and Groundwater Origin

    Directory of Open Access Journals (Sweden)

    Guillaume Meyzonnat

    2018-01-01

    Full Text Available This work aims to provide an overview of the thermal processes that shape wellbore temperature profiles under static and dynamic conditions. Understanding of the respective influences of advection and conduction heat fluxes is improved through the use of a new heat budget at the borehole scale. Keeping in mind the thermal processes involved, a qualitative interpretation of the temperature profiles allows the occurrence, the position, and the origin of groundwater flowing into wellbores from hydraulically active fractures to be constrained. With the use of a heat budget developed at the borehole scale, temperature logging efficiency has been quantitatively enhanced and allows inflow temperatures to be calculated through the simultaneous use of a flowmeter. Under certain hydraulic or pumping conditions, both inflow intensities and associated temperatures can also be directly modelled from temperature data and the use of the heat budget. Theoretical and applied examples of the heat budget application are provided. Applied examples are shown using high-resolution temperature logging, spinner flow metering, and televiewing for three wells installed in fractured bedrock aquifers in the St-Lawrence Lowlands, Quebec, Canada. Through relatively rapid manipulations, thermal measurements in such cases can be used to detect the intervals or discrete positions of hydraulically active fractures in wellbores, as well as the existence of ambient flows with a high degree of sensitivity, even at very low flows. Heat budget calculations at the borehole scale during pumping indicate that heat advection fluxes rapidly dominate over heat conduction fluxes with the borehole wall. The full characterization of inflow intensities provides information about the distribution of hydraulic properties with depth. The full knowledge of inflow temperatures indicates horizons that are drained from within the aquifer, providing advantageous information on the depth from which

  12. Encyclopedia of well logging

    International Nuclear Information System (INIS)

    Desbrandes, R.

    1985-01-01

    The 16 chapters of this book aim to provide students, trainees and engineers with a manual covering all well-logging measurements, ranging from drilling to production and from oil to minerals by way of geothermal energy. Each chapter is a summary, with a bibliography given at its end. Topics covered include well-logging during drilling, wireline logging equipment and techniques, petroleum logging, data processing of borehole data, interpretation of well-logging, sampling tools, completion and production logging, logging in relief wells to kill off uncontrolled blowouts, techniques for high-temperature geothermal energy, small-scale mining and hydrology, logging with oil-base mud, and finally recommended logging programs. There is one chapter on nuclear well-logging which is indexed separately. (UK)

  13. A Hierarchical Poisson Log-Normal Model for Network Inference from RNA Sequencing Data

    Science.gov (United States)

    Gallopin, Mélina; Rau, Andrea; Jaffrézic, Florence

    2013-01-01

    Gene network inference from transcriptomic data is an important methodological challenge and a key aspect of systems biology. Although several methods have been proposed to infer networks from microarray data, there is a need for inference methods able to model RNA-seq data, which are count-based and highly variable. In this work we propose a hierarchical Poisson log-normal model with a Lasso penalty to infer gene networks from RNA-seq data; this model has the advantage of directly modelling discrete data and accounting for inter-sample variance larger than the sample mean. Using real microRNA-seq data from breast cancer tumors and simulations, we compare this method to a regularized Gaussian graphical model on log-transformed data, and a Poisson log-linear graphical model with a Lasso penalty on power-transformed data. For data simulated with large inter-sample dispersion, the proposed model performs better than the other methods in terms of sensitivity, specificity and area under the ROC curve. These results show the necessity of methods specifically designed for gene network inference from RNA-seq data. PMID:24147011

  14. Large ground warming in the Canadian Arctic inferred from inversions of temperature logs

    Czech Academy of Sciences Publication Activity Database

    Majorowicz, J. A.; Skinner, W. R.; Šafanda, Jan

    2004-01-01

    Roč. 221, č. 1 (2004), s. 15-25 ISSN 0012-821X Institutional research plan: CEZ:AV0Z3012916 Keywords: global warming * borehole temperatures * ground temperatures Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 3.499, year: 2004

  15. Efficient and accurate log-Lévy approximations to Lévy driven LIBOR models

    DEFF Research Database (Denmark)

    Papapantoleon, Antonis; Schoenmakers, John; Skovmand, David

    2011-01-01

    The LIBOR market model is very popular for pricing interest rate derivatives, but is known to have several pitfalls. In addition, if the model is driven by a jump process, then the complexity of the drift term is growing exponentially fast (as a function of the tenor length). In this work, we con...... ratchet caps show that the approximations perform very well. In addition, we also consider the log-L\\'evy approximation of annuities, which offers good approximations for high volatility regimes....

  16. Computer-modeling codes to improve exploration nuclear-logging methods. National Uranium Resource Evaluation

    International Nuclear Information System (INIS)

    Wilson, R.D.; Price, R.K.; Kosanke, K.L.

    1983-03-01

    As part of the Department of Energy's National Uranium Resource Evaluation (NURE) project's Technology Development effort, a number of computer codes and accompanying data bases were assembled for use in modeling responses of nuclear borehole logging sondes. The logging methods include fission neutron, active and passive gamma-ray, and gamma-gamma. These CDC-compatible computer codes and data bases are available on magnetic tape from the DOE Technical Library at its Grand Junction Area Office. Some of the computer codes are standard radiation-transport programs that have been available to the radiation shielding community for several years. Other codes were specifically written to model the response of borehole radiation detectors or are specialized borehole modeling versions of existing Monte Carlo transport programs. Results from several radiation modeling studies are available as two large data bases (neutron and gamma-ray). These data bases are accompanied by appropriate processing programs that permit the user to model a wide range of borehole and formation-parameter combinations for fission-neutron, neutron-activation, and gamma-gamma logs. The first part of this report consists of a brief abstract for each code or data base. The abstract gives the code name and title, short description, auxiliary requirements, typical running time (CDC 6600), and a list of references. The next section gives format specifications and/or directory for the tapes. The final section of the report presents listings for programs used to convert data bases between machine floating-point and EBCDIC

  17. A tool for aligning event logs and prescriptive process models through automated planning

    OpenAIRE

    De Leoni, M.; Lanciano, G.; Marrella, A.

    2017-01-01

    In Conformance Checking, alignment is the problem of detecting and repairing nonconformity between the actual execution of a business process, as recorded in an event log, and the model of the same process. Literature proposes solutions for the alignment problem that are implementations of planning algorithms built ad-hoc for the specific problem. Unfortunately, in the era of big data, these ad-hoc implementations do not scale sufficiently compared with well-established planning systems. In th...

  18. Source rock formation evaluation using TOC & Ro log model based on well-log data processing: case study of the Ngimbang formation, North East Java basin

    Directory of Open Access Journals (Sweden)

    Fatahillah Yosar

    2017-01-01

    Full Text Available The Ngimbang Formation is known as one major source of hydrocarbon supply in the North Eastern Java Basin. Of Mid-Eocene age, Ngimbang is dominated by clastic sedimentary rocks, mostly shale, shaly sandstone, and thick layers of limestone (CD Limestone), with thin layers of coal. Although laboratory analyses show the Ngimbang Formation to be a relatively rich source rock, such data are typically too limited to quantify the distribution of organic matter regionally. To adequately sample the formation both horizontally and vertically on a basin-wide scale, a large number of costly and time-consuming laboratory analyses would be required. Such analyses are prone to errors from a number of sources, and core data are frequently not available at key locations. In this paper, the authors established four TOC (Total Organic Carbon) log calculation models: Passey, Schmoker-Hester, Meyer-Nederloff, and Decker/Density. Well data, along with the available core data, were used to determine the most suitable model to be applied in the well AFA-1, as well as to compare the accuracy of these TOC model values. The results show good correlation using the Decker TOC model and the Mallick-Raju Ro (vitrinite reflectance) model. Two potential source-rock zones were detected by these log models.

  19. Efficient and Accurate Log-Levy Approximations of Levy-Driven LIBOR Models

    DEFF Research Database (Denmark)

    Papapantoleon, Antonis; Schoenmakers, John; Skovmand, David

    2012-01-01

    The LIBOR market model is very popular for pricing interest rate derivatives but is known to have several pitfalls. In addition, if the model is driven by a jump process, then the complexity of the drift term grows exponentially fast (as a function of the tenor length). We consider a Lévy-driven ...... ratchet caps show that the approximations perform very well. In addition, we also consider the log-Lévy approximation of annuities, which offers good approximations for high-volatility regimes....

  20. STOCHASTIC PRICING MODEL FOR THE REAL ESTATE MARKET: FORMATION OF LOG-NORMAL GENERAL POPULATION

    Directory of Open Access Journals (Sweden)

    Oleg V. Rusakov

    2015-01-01

    Full Text Available We construct a stochastic model of real estate pricing. The method of price construction is based on a sequential comparison of the supply prices. We prove that, under standard assumptions imposed upon the comparison coefficients, there exists a unique non-degenerate limit in distribution, and this limit has a log-normal distribution. We verify the accordance of empirical price distributions with the theoretically obtained log-normal distribution against extensive statistical data on real estate prices from Saint Petersburg (Russia). For establishing this accordance we apply the efficient and sensitive Kolmogorov-Smirnov goodness-of-fit test. Basing on “The Russian Federal Estimation Standard N2”, we conclude that the most probable price, i.e. the mode of the distribution, is correctly and uniquely defined under the log-normal approximation. Since the mean of a log-normal distribution exceeds the mode (the most probable value), prices estimated by the mathematical expectation are systematically overstated.
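A minimal sketch of the kind of check the paper performs, assuming synthetic log-normal prices in place of the Saint Petersburg data: fit a normal to the log prices, compute a Kolmogorov-Smirnov distance, and confirm that the fitted mode lies below the mean:

```python
import math, random

def normal_cdf(x, m, s):
    return 0.5 * (1.0 + math.erf((x - m) / (s * math.sqrt(2.0))))

random.seed(7)
# Synthetic prices standing in for real listings: log-normal by construction
prices = [math.exp(random.gauss(15.0, 0.5)) for _ in range(1000)]

logs = sorted(math.log(p) for p in prices)
n = len(logs)
m = sum(logs) / n
s = math.sqrt(sum((v - m) ** 2 for v in logs) / (n - 1))

# Kolmogorov-Smirnov distance between the empirical CDF of log prices
# and the fitted normal CDF
d = max(max(abs((i + 1) / n - normal_cdf(v, m, s)),
            abs(i / n - normal_cdf(v, m, s)))
        for i, v in enumerate(logs))
print("KS distance:", round(d, 4))

# The fitted log-normal's mode exp(m - s^2) lies below its mean exp(m + s^2/2),
# so pricing by the expectation overstates the most probable price.
mode_below_mean = math.exp(m - s * s) < math.exp(m + s * s / 2)
print(mode_below_mean)
```

Note that with parameters estimated from the same sample, the usual KS critical values are conservative (the Lilliefors correction applies); the sketch only illustrates the mechanics.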

  1. Effect of log soaking and the temperature of peeling on the properties of rotary-cut birch (Betula pendula Roth) veneer bonded with phenol-formaldehyde adhesive

    Science.gov (United States)

    Anti Rohumaa; Akio Yamamoto; Christopher G. Hunt; Charles R. Frihart; Mark Hughes; Jaan Kers

    2016-01-01

    Heating logs prior to peeling positively affects the surface properties of veneer as well as the wood-adhesive bond strength. However, the mechanism behind this increase in strength is not fully understood. The aim of the present study was to separate the influence of soaking temperature and peeling temperature on the physical surface properties and bonding quality....

  2. Comparing Multiple-Group Multinomial Log-Linear Models for Multidimensional Skill Distributions in the General Diagnostic Model. Research Report. ETS RR-08-35

    Science.gov (United States)

    Xu, Xueli; von Davier, Matthias

    2008-01-01

    The general diagnostic model (GDM) utilizes located latent classes for modeling a multidimensional proficiency variable. In this paper, the GDM is extended by employing a log-linear model for multiple populations that assumes constraints on parameters across multiple groups. This constrained model is compared to log-linear models that assume…

  3. Log-layer mismatch and modeling of the fluctuating wall stress in wall-modeled large-eddy simulations

    Science.gov (United States)

    Yang, Xiang I. A.; Park, George Ilhwan; Moin, Parviz

    2017-10-01

    Log-layer mismatch (LLM) refers to a chronic problem found in wall-modeled large-eddy simulation (WMLES) or detached-eddy simulation, where the modeled wall-shear stress deviates from the true one by approximately 15%. Many efforts have been made to resolve this mismatch. The often-used fixes, which are generally ad hoc, include modifying subgrid-scale stress models, adding a stochastic forcing, and moving the LES-wall-model matching location away from the wall. An analysis motivated by the integral wall-model formalism suggests that log-layer mismatch is resolved by the built-in physics-based temporal filtering. In this work we investigate in detail the effects of local filtering on log-layer mismatch. We show that both local temporal filtering and local wall-parallel filtering resolve log-layer mismatch without moving the LES-wall-model matching location away from the wall. Additionally, we look into the momentum balance in the near-wall region to provide an alternative explanation of how LLM occurs, one that does not necessarily rely on the numerical-error argument. While filtering resolves log-layer mismatch, the quality of the wall-shear stress fluctuations predicted by WMLES does not improve with our remedy. The wall-shear stress fluctuations are highly underpredicted due to the implied use of LES filtering. However, good agreement can be found when the WMLES data are compared to direct numerical simulation data filtered at the corresponding WMLES resolutions.
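The local temporal filtering discussed above is, in essence, an exponential time average of the LES input fed to the wall model. The following toy sketch (synthetic noisy signal, illustrative time scale, not the paper's actual filter implementation) shows how such a filter damps the resolved fluctuations:

```python
import random

def ewma(signal, dt, tau):
    """Exponential moving average with relaxation time tau."""
    a = dt / (dt + tau)
    out, s = [], signal[0]
    for v in signal:
        s = (1 - a) * s + a * v
        out.append(s)
    return out

random.seed(3)
# Synthetic stand-in for a noisy LES velocity sampled at the matching location
raw = [1.0 + 0.3 * random.gauss(0.0, 1.0) for _ in range(5000)]
smooth = ewma(raw, dt=0.01, tau=0.5)          # tau >> dt: strong time filtering

var_raw = sum((v - 1.0) ** 2 for v in raw) / len(raw)
var_smooth = sum((v - 1.0) ** 2 for v in smooth) / len(smooth)
print(var_smooth < var_raw)                   # filtering damps the fluctuations
```

This variance damping is also why, as the abstract notes, filtered wall-model inputs underpredict the wall-shear stress fluctuations even while fixing the mean mismatch.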

  4. The Meaning of Logs

    NARCIS (Netherlands)

    Etalle, Sandro; Massacci, Fabio; Yautsiukhin, Artsiom; Lambrinoudakis, Costas; Pernul, Günther; Tjoa, A Min

    While logging events is becoming increasingly common in computing, in communication and in collaborative environments, log systems need to satisfy increasingly challenging (if not conflicting) requirements. In this paper we propose a high-level framework for modeling log systems, and reasoning about

  5. Operator product expansion in Liouville field theory and Seiberg-type transitions in log-correlated random energy models

    Science.gov (United States)

    Cao, Xiangyu; Le Doussal, Pierre; Rosso, Alberto; Santachiara, Raoul

    2018-04-01

    We study transitions in log-correlated random energy models (logREMs) that are related to the violation of a Seiberg bound in Liouville field theory (LFT): the binding transition and the termination point transition (a.k.a. pre-freezing). By means of the LFT-logREM mapping, replica symmetry breaking and traveling-wave equation techniques, we unify both transitions in a two-parameter diagram, which describes the free-energy large deviations of logREMs with a deterministic background log potential, or equivalently, the joint moments of the free energy and Gibbs measure in logREMs without background potential. Under the LFT-logREM mapping, the transitions correspond to the competition of discrete and continuous terms in a four-point correlation function. Our results provide a statistical interpretation of a peculiar nonlocality of the operator product expansion in LFT. The results are rederived by a traveling-wave equation calculation, which shows that the features of LFT responsible for the transitions are reproduced in a simple model of diffusion with absorption. We also examine the problem by a replica symmetry breaking analysis. It complements the previous methods and reveals a rich large deviation structure of the free energy of logREMs with a deterministic background log potential. Many results are verified in the integrable circular logREM, by a replica-Coulomb gas integral approach. The related problem of common length (overlap) distribution is also considered. We provide a traveling-wave equation derivation of the LFT predictions announced in a preceding work.

  6. Borehole logging

    International Nuclear Information System (INIS)

    Olsen, H.

    1995-01-01

    Numerous ground water investigations have been accomplished by means of borehole logging. Borehole logging can be applied to establish new water recovery wells, to control the existing water producing wells and source areas and to estimate ground water quality. (EG)

  7. Log-linear model based behavior selection method for artificial fish swarm algorithm.

    Science.gov (United States)

    Huang, Zhehuang; Chen, Yidong

    2015-01-01

    Artificial fish swarm algorithm (AFSA) is a population-based optimization technique inspired by the social behavior of fishes. In the past several years, AFSA has been successfully applied in many research and application areas. The behavior of the fishes has a crucial impact on the performance of AFSA, such as its global exploration ability and convergence speed. How to construct and select the behaviors of the fishes is therefore an important task. To solve these problems, an improved artificial fish swarm algorithm based on a log-linear model is proposed and implemented in this paper. There are three main contributions. Firstly, we propose a new behavior selection algorithm based on a log-linear model which can enhance the decision-making ability of behavior selection. Secondly, an adaptive movement behavior based on adaptive weights is presented, which can adjust dynamically according to the diversity of the fishes. Finally, some new behaviors are defined and introduced into the artificial fish swarm algorithm for the first time to improve its global optimization capability. Experiments on high-dimensional function optimization showed that the improved algorithm has more powerful global exploration ability and reasonable convergence speed compared with the standard artificial fish swarm algorithm.
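A log-linear behavior selection rule of the kind proposed here can be sketched as a softmax over behavior scores: each candidate behavior is chosen with probability proportional to exp(score). The behavior names and score values below are hypothetical, not the paper's feature set:

```python
import math, random

def select_behavior(scores, rng):
    """Sample a behavior with probability proportional to exp(score)."""
    mx = max(scores.values())
    expo = {b: math.exp(s - mx) for b, s in scores.items()}  # stable softmax
    r, acc = rng.random() * sum(expo.values()), 0.0
    for b, e in expo.items():
        acc += e
        if r <= acc:
            return b
    return b                                  # guard against float round-off

rng = random.Random(0)
scores = {"prey": 2.0, "swarm": 1.0, "follow": 0.5, "random": 0.0}
counts = {b: 0 for b in scores}
for _ in range(10000):
    counts[select_behavior(rng=rng, scores=scores)] += 1
print(counts)   # higher-scoring behaviors are chosen more often
```

Subtracting the maximum score before exponentiating keeps the softmax numerically stable even for large score differences.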

  8. Log-Linear Model Based Behavior Selection Method for Artificial Fish Swarm Algorithm

    Directory of Open Access Journals (Sweden)

    Zhehuang Huang

    2015-01-01

    Full Text Available Artificial fish swarm algorithm (AFSA) is a population-based optimization technique inspired by the social behavior of fishes. In the past several years, AFSA has been successfully applied in many research and application areas. The behavior of the fishes has a crucial impact on the performance of AFSA, such as its global exploration ability and convergence speed. How to construct and select the behaviors of the fishes is therefore an important task. To solve these problems, an improved artificial fish swarm algorithm based on a log-linear model is proposed and implemented in this paper. There are three main contributions. Firstly, we propose a new behavior selection algorithm based on a log-linear model which can enhance the decision-making ability of behavior selection. Secondly, an adaptive movement behavior based on adaptive weights is presented, which can adjust dynamically according to the diversity of the fishes. Finally, some new behaviors are defined and introduced into the artificial fish swarm algorithm for the first time to improve its global optimization capability. Experiments on high-dimensional function optimization showed that the improved algorithm has more powerful global exploration ability and reasonable convergence speed compared with the standard artificial fish swarm algorithm.

  9. Forward modelling of multi-component induction logging tools in layered anisotropic dipping formations

    International Nuclear Information System (INIS)

    Gao, Jie; Xu, Chenhao; Xiao, Jiaqi

    2013-01-01

    Multi-component induction logging provides great assistance in the exploration of thinly laminated reservoirs. The 1D parametric inversion following an adaptive borehole correction is the key step in the data processing of multi-component induction logging responses. To make the inversion process reasonably fast, an efficient forward modelling method is necessary. In this paper, a modelling method has been developed to simulate multi-component induction tools in deviated wells drilled in layered anisotropic formations. With the introduction of generalized reflection coefficients, analytic expressions for the magnetic field in the form of a Sommerfeld integral were derived. Fast numerical computation of the integral is accomplished using the fast Fourier–Hankel transform and fast Hankel transform methods. The latter is so time-efficient that it is well suited to real-time multi-parameter inversion. Some simulated results are presented; they are in excellent agreement with the solution of a finite difference method code. (paper)

  10. Use of critical pathway models and log-normal frequency distributions for siting nuclear facilities

    International Nuclear Information System (INIS)

    Waite, D.A.; Denham, D.H.

    1975-01-01

    The advantages and disadvantages of potential sites for nuclear facilities are evaluated through the use of environmental pathway and log-normal distribution analysis. Environmental considerations of nuclear facility siting are necessarily geared to the identification of media believed to be significant in terms of dose to man or to be potential centres for long-term accumulation of contaminants. To aid in meeting the scope and purpose of this identification, an exposure pathway diagram must be developed. This type of diagram helps to locate pertinent environmental media, points of expected long-term contaminant accumulation, and points of population/contaminant interface for both radioactive and non-radioactive contaminants. Confirmation of facility siting conclusions drawn from pathway considerations must usually be derived from an investigatory environmental surveillance programme. Battelle's experience with environmental surveillance data interpretation using log-normal techniques indicates that this distribution has much to offer in the planning, execution and analysis phases of such a programme. How these basic principles apply to the actual siting of a nuclear facility is demonstrated for a centrifuge-type uranium enrichment facility as an example. A model facility is examined to the extent of available data in terms of potential contaminants and facility general environmental needs. A critical exposure pathway diagram is developed to the point of prescribing the characteristics of an optimum site for such a facility. Possible necessary deviations from climatic constraints are reviewed and reconciled with conclusions drawn from the exposure pathway analysis. Details of log-normal distribution analysis techniques are presented, with examples of environmental surveillance data to illustrate data manipulation techniques and interpretation procedures as they affect the investigatory environmental surveillance programme. Appropriate consideration is given these

  11. Characterization of Rock Mechanical Properties Using Lab Tests and Numerical Interpretation Model of Well Logs

    Directory of Open Access Journals (Sweden)

    Hao Xu

    2016-01-01

    The tight gas reservoir in the fifth member of the Xujiahe formation contains heterogeneous interlayers of sandstone and shale that are low in both porosity and permeability. Elastic characteristics of sandstone and shale are analyzed in this study based on petrophysical tests. The tests indicate that sandstone and mudstone samples have different stress-strain relationships, and the rock tends to exhibit elastic-plastic deformation. The compressive strength correlates with confinement pressure and elastic modulus. The results of thin-bed log interpretation match the dynamic Young's modulus and Poisson's ratio predicted by theory. The compressive strength is calculated from density, elastic impedance, and clay content. The tensile strength is calibrated using the compressive strength, and the shear strength is calculated with an empirical formula. Finally, log interpretation of rock mechanical properties is performed on the fifth member of the Xujiahe formation. Natural fractures in downhole cores and microscopic rock failure in cross-sectioned samples demonstrate that tensile fractures were primarily observed in sandstone, whereas shear fractures can be observed in both mudstone and sandstone. Based on the different elasticity and plasticity of the rocks, as well as the characteristics of the natural fractures, a fracture propagation model was built.

  12. Developing EHR-driven heart failure risk prediction models using CPXR(Log) with the probabilistic loss function.

    Science.gov (United States)

    Taslimitehrani, Vahid; Dong, Guozhu; Pereira, Naveen L; Panahiazar, Maryam; Pathak, Jyotishman

    2016-04-01

    Computerized survival prediction in healthcare, which identifies the risk of disease mortality, helps healthcare providers manage their patients effectively by informing appropriate treatment options. In this study, we apply a classification algorithm, Contrast Pattern Aided Logistic Regression with the probabilistic loss function (CPXR(Log)), to develop and validate prognostic risk models predicting 1-, 2-, and 5-year survival in heart failure (HF) using data from electronic health records (EHRs) at Mayo Clinic. CPXR(Log) constructs a pattern-aided logistic regression model defined by several patterns and corresponding local logistic regression models. One of the models generated by CPXR(Log) achieved an AUC and accuracy of 0.94 and 0.91, respectively, and significantly outperformed prognostic models reported in prior studies. Data extracted from EHRs allowed the incorporation of patient co-morbidities into our models, which helped improve the performance of the CPXR(Log) models (15.9% AUC improvement), although it did not improve the accuracy of the models built by other classifiers. We also propose a probabilistic loss function to determine the large-error and small-error instances. The new loss function used in the algorithm outperforms the functions used in previous studies by a 1% improvement in the AUC. This study revealed that building prediction models from EHR data can be very challenging for existing classification methods owing to the high dimensionality and complexity of EHR data. The risk models developed by CPXR(Log) also reveal that HF is a highly heterogeneous disease, i.e., different subgroups of HF patients require different types of considerations in their diagnosis and treatment. Our risk models provided two valuable insights for the application of predictive modeling techniques in biomedicine: logistic risk models often make systematic prediction errors, and it is prudent to use subgroup-based prediction models such as those given by CPXR(Log).

  13. Optimum parameters in a model for tumour control probability, including interpatient heterogeneity: evaluation of the log-normal distribution

    International Nuclear Information System (INIS)

    Keall, P J; Webb, S

    2007-01-01

    The heterogeneity of human tumour radiation response is well known. Researchers have used the normal distribution to describe interpatient tumour radiosensitivity. However, many natural phenomena show a log-normal distribution. Log-normal distributions are common when mean values are low, variances are large and values cannot be negative. These conditions apply to radiosensitivity. The aim of this work was to evaluate the log-normal distribution to predict clinical tumour control probability (TCP) data and to compare the results with the homogeneous (δ-function with single α-value) and normal distributions. The clinically derived TCP data for four tumour types (melanoma, breast, squamous cell carcinoma and nodes) were used to fit the TCP models. Three forms of interpatient tumour radiosensitivity were considered: the log-normal, normal and δ-function. The free parameters in the models were the radiosensitivity mean, standard deviation and clonogenic cell density. The evaluation metric was the deviance of the maximum likelihood estimation of the fit of the TCP calculated using the predicted parameters to the clinical data. We conclude that (1) the log-normal and normal distributions of interpatient tumour radiosensitivity heterogeneity more closely describe clinical TCP data than a single radiosensitivity value and (2) the log-normal distribution has some theoretical and practical advantages over the normal distribution. Further work is needed to test these models on higher quality clinical outcome datasets.

  14. Design and Development of a Relative Humidity and Room Temperature Measurement System with On Line Data Logging Feature for Monitoring the Fermentation Room of Tea Factory

    Directory of Open Access Journals (Sweden)

    Utpal SARMA

    2011-12-01

    The design and development of a Relative Humidity (RH) and Room Temperature (RT) monitoring system with an online data logging feature for monitoring the fermentation room of a tea factory is presented in this paper. A capacitive RH sensor with an on-chip signal conditioner is used for RH measurement, and a temperature-to-digital converter (TDC) is used for ambient temperature monitoring. An 8051-core microcontroller is the heart of the system: it reads the digital equivalent of the RH data with the help of a 12-bit analog-to-digital (A/D) converter and synchronizes the TDC to obtain the ambient temperature. Online data logging is achieved via RS-232C communication. Field performance was also studied by installing the system in the fermentation room of a tea factory.

  15. Zero temperature landscape of the random sine-Gordon model

    International Nuclear Information System (INIS)

    Sanchez, A.; Bishop, A.R.; Cai, D.

    1997-01-01

    We present a preliminary summary of the zero temperature properties of the two-dimensional random sine-Gordon model of surface growth on disordered substrates. We found that the properties of this model can be computed accurately using lattices of moderate size, as the behavior of the model turns out to be independent of the size above a certain length (∼ 128 x 128 lattices). Subsequently, we show that the behavior of the height difference correlation function is of (log r)² type up to a certain correlation length (ξ ∼ 20), which rules out predictions of log r behavior for all temperatures obtained by replica-variational techniques. Our results open the way to a better understanding of the complex landscape presented by this system, which has been the subject of many (contradictory) analyses.

  16. Standardizing effect size from linear regression models with log-transformed variables for meta-analysis.

    Science.gov (United States)

    Rodríguez-Barranco, Miguel; Tobías, Aurelio; Redondo, Daniel; Molina-Portillo, Elena; Sánchez, María José

    2017-03-17

    Meta-analysis is very useful to summarize the effect of a treatment or a risk factor for a given disease. Often studies report results based on log-transformed variables in order to achieve the principal assumptions of a linear regression model. If this is the case for some, but not all, studies, the effects need to be homogenized. We derived a set of formulae to transform absolute changes into relative ones, and vice versa, to allow including all results in a meta-analysis. We applied our procedure to all possible combinations of log-transformed independent or dependent variables. We also evaluated it in a simulation based on two variables either normally or asymmetrically distributed. In all the scenarios, and based on different change criteria, the effect size estimated by the derived set of formulae was equivalent to the real effect size. To avoid biased estimates of the effect, this procedure should be used with caution in the case of independent variables with asymmetric distributions that significantly differ from the normal distribution. We illustrate this procedure with an application to a meta-analysis on the potential effects on neurodevelopment in children exposed to arsenic and manganese. The procedure proposed has been shown to be valid and capable of expressing the effect size of a linear regression model based on different change criteria in the variables. Homogenizing the results from different studies beforehand allows them to be combined in a meta-analysis, independently of whether the transformations had been performed on the dependent and/or independent variables.
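
The paper derives its own set of transformation formulae; the standard textbook conversions below, covering the two simplest cases (log-transformed dependent or independent variable), illustrate the kind of homogenization involved. Function names and default values are illustrative, not taken from the paper:

```python
import math

def pct_change_in_y(beta, delta_x=1.0):
    """Model log(y) = a + beta*x: percent change in y when x rises by delta_x."""
    return 100.0 * (math.exp(beta * delta_x) - 1.0)

def change_in_y(beta, pct_x=10.0):
    """Model y = a + beta*log(x): absolute change in y when x rises by pct_x %."""
    return beta * math.log(1.0 + pct_x / 100.0)
```

For example, a slope of 0.05 on a log-transformed outcome corresponds to roughly a 5.1% relative increase per unit of x, which can then be pooled with studies reporting relative effects directly.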

  17. Dose estimative in operators during petroleum wells logging with nuclear wireless probes through computer modelling

    International Nuclear Information System (INIS)

    Souza, Edmilson Monteiro de; Silva, Ademir Xavier da; Lopes, Ricardo T.; Correa, Samanda Cristine Arruda; Rocha, Paula L.F.

    2011-01-01

    This paper evaluates the absorbed dose and the effective dose received by operators during petroleum well logging with nuclear wireline probes that use gamma radiation sources. To obtain the data, a typical logging scenario was simulated with the MCNPX Monte Carlo code. The simulated logging probe was the Density Gamma Probe TRISOND, produced by Robertson Geologging. The absorbed dose values were estimated using the male voxel anthropomorphic phantom MAX. The effective dose values were obtained using the ICRP 103 recommendations.

  18. TESTING THE ASSUMPTIONS AND INTERPRETING THE RESULTS OF THE RASCH MODEL USING LOG-LINEAR PROCEDURES IN SPSS

    NARCIS (Netherlands)

    TENVERGERT, E; GILLESPIE, M; KINGMA, J

    This paper shows how to use the log-linear subroutine of SPSS to fit the Rasch model. It also shows how to fit less restrictive models obtained by relaxing specific assumptions of the Rasch model. Conditional maximum likelihood estimation was achieved by including dummy variables for the total

  19. The thermal regime in the resurgent dome of Long Valley Caldera, California: Inferences from precision temperature logs in deep wells

    Science.gov (United States)

    Hurwitz, S.; Farrar, C.D.; Williams, C.F.

    2010-01-01

    Long Valley Caldera in eastern California formed 0.76 Ma ago in a cataclysmic eruption that resulted in the deposition of 600 km³ of Bishop Tuff. The total current heat flow from the caldera floor is estimated to be ~290 MW, and a geothermal power plant in Casa Diablo on the flanks of the resurgent dome (RD) generates ~40 MWe. The RD in the center of the caldera was uplifted by ~80 cm between 1980 and 1999, which most models explain as a response to magma intrusion into the shallow crust. This unrest has led to extensive research on geothermal resources and volcanic hazards in the caldera. Here we present results from precise, high-resolution temperature-depth profiles in five deep boreholes (327-1,158 m) on the RD to assess its thermal state, and more specifically 1) to provide bounds on the advective heat transport as a guide for future geothermal exploration, 2) to provide constraints on the occurrence of magma at shallow crustal depths, and 3) to provide a baseline for future transient thermal phenomena in response to large earthquakes, volcanic activity, or geothermal production. The temperature profiles display substantial non-linearity within each profile and variability between the different profiles. All profiles display significant temperature reversals with depth and temperature gradients <50 °C/km at their bottom. The maximum temperature in the individual boreholes ranges between 124.7 °C and 129.5 °C, and bottom-hole temperatures range between 99.4 °C and 129.5 °C. The high-temperature units in the three Fumarole Valley boreholes are at approximately the same elevation as the high-temperature unit in borehole M-1 in Casa Diablo, indicating lateral or sub-lateral hydrothermal flow through the resurgent dome. Small differences in temperature between measurements in consecutive years in three of the wells suggest slow cooling of the shallow hydrothermal flow system. 
By matching theoretical curves to segments of the measured temperature profiles, we calculate
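
For purely conductive segments of such logs, interpretation reduces to Fourier's law, q = k·dT/dz. The sketch below fits a gradient to a synthetic linear profile (the thermal conductivity value and data are illustrative, not the authors' inversion):

```python
import numpy as np

def gradient_and_heat_flow(depth_m, temp_c, conductivity_w_mk=2.0):
    """Least-squares geothermal gradient (°C/km) and conductive heat flow
    (mW/m², Fourier's law q = k * dT/dz) for a linear log segment."""
    slope_c_per_m = np.polyfit(depth_m, temp_c, 1)[0]
    grad_c_per_km = slope_c_per_m * 1000.0
    heat_flow_mw_m2 = conductivity_w_mk * slope_c_per_m * 1000.0
    return grad_c_per_km, heat_flow_mw_m2

# Synthetic conductive profile with a 25 °C/km gradient
z = np.linspace(300.0, 1000.0, 50)
t = 10.0 + 0.025 * z
grad, q = gradient_and_heat_flow(z, t)
```

Departures of a measured profile from such straight-line segments (reversals, isothermal intervals) are the signature of the advective flow discussed in the abstract.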

  20. Single and multiple object tracking using log-euclidean Riemannian subspace and block-division appearance model.

    Science.gov (United States)

    Hu, Weiming; Li, Xi; Luo, Wenhan; Zhang, Xiaoqin; Maybank, Stephen; Zhang, Zhongfei

    2012-12-01

    Object appearance modeling is crucial for tracking objects, especially in videos captured by nonstationary cameras and for reasoning about occlusions between multiple moving objects. Based on the log-euclidean Riemannian metric on symmetric positive definite matrices, we propose an incremental log-euclidean Riemannian subspace learning algorithm in which covariance matrices of image features are mapped into a vector space with the log-euclidean Riemannian metric. Based on the subspace learning algorithm, we develop a log-euclidean block-division appearance model which captures both the global and local spatial layout information about object appearances. Single object tracking and multi-object tracking with occlusion reasoning are then achieved by particle filtering-based Bayesian state inference. During tracking, incremental updating of the log-euclidean block-division appearance model captures changes in object appearance. For multi-object tracking, the appearance models of the objects can be updated even in the presence of occlusions. Experimental results demonstrate that the proposed tracking algorithm obtains more accurate results than six state-of-the-art tracking algorithms.
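
The key step, mapping symmetric positive definite (SPD) covariance matrices through the matrix logarithm so that ordinary vector-space operations apply, can be sketched with SciPy. This is a toy log-Euclidean mean, not the paper's incremental subspace learner:

```python
import numpy as np
from scipy.linalg import expm, logm

def log_euclidean_mean(spd_matrices):
    """Map each SPD matrix through the matrix logarithm (a vector space under
    the log-Euclidean metric), average there, and map back with expm."""
    logs = [logm(m) for m in spd_matrices]
    return expm(np.mean(logs, axis=0))

# Two toy 2x2 feature covariances; their log-Euclidean mean is the identity
a = np.array([[2.0, 0.0], [0.0, 0.5]])
b = np.array([[0.5, 0.0], [0.0, 2.0]])
mean_cov = log_euclidean_mean([a, b])
```

Once the matrices live in the log-domain vector space, standard incremental subspace learning (e.g. PCA updates) can be applied to them, which is the idea the tracker builds on.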

  1. Taming Log Files from Game/Simulation-Based Assessments: Data Models and Data Analysis Tools. Research Report. ETS RR-16-10

    Science.gov (United States)

    Hao, Jiangang; Smith, Lawrence; Mislevy, Robert; von Davier, Alina; Bauer, Malcolm

    2016-01-01

    Extracting information efficiently from game/simulation-based assessment (G/SBA) logs requires two things: a well-structured log file and a set of analysis methods. In this report, we propose a generic data model specified as an extensible markup language (XML) schema for the log files of G/SBAs. We also propose a set of analysis methods for…

  2. Experimental and finite element study of the effect of temperature and moisture on the tangential tensile strength and fracture behavior in timber logs

    DEFF Research Database (Denmark)

    Larsen, Finn; Ormarsson, Sigurdur

    2014-01-01

    Timber is normally dried by kiln drying, in the course of which moisture-induced stresses and fractures can occur. Cracks occur primarily in the radial direction, due to tangential tensile stresses that exceed the tangential tensile strength (TSt) of the material. The present article reports on experiments and numerical simulations by finite element modeling (FEM) concerning the TSt and fracture behavior of Norway spruce under various climatic conditions. Thin log disc specimens were studied to simplify the description of the moisture flow in the samples. The specimens designed for the TSt tests were acclimatized to a moisture content (MC) of 18% before the tests at 20°C, 60°C, and 90°C were carried out. The maximum stress results of the disc simulations by FEM were compared with the experimental strength results at the same temperature levels. There is a rather good agreement between the results of modeling…

  3. Impact of temperature, pH, and salinity changes on the physico-chemical properties of model naphthenic acids.

    Science.gov (United States)

    Celsie, Alena; Parnis, J Mark; Mackay, Donald

    2016-03-01

    The effects of temperature, pH, and salinity change on naphthenic acids (NAs) present in oil-sands process wastewater were modeled for 55 representative NAs. COSMO-RS was used to estimate octanol-water (KOW) and octanol-air (KOA) partition ratios and Henry's law constants (H). Validation with experimental carboxylic acid data yielded log KOW and log H RMS errors of 0.45 and 0.55, respectively. Calculations of log KOW (or log D, for pH dependence), log KOA and log H (or log HD, for pH dependence) were made for model NAs between -20 °C and 40 °C, pH between 0 and 14, and salinity between 0 and 3 g NaCl L⁻¹. A temperature increase of 60 °C resulted in a 3-5 log unit increase in H and a decrease of similar magnitude in KOA. A pH increase above the NA pKa resulted in a dramatic decrease in both log D and log HD. A salinity increase over the 0-3 g NaCl L⁻¹ range resulted in a 0.3 log unit increase on average for KOW and H values. Log KOW values of the sodium salt and the anion of the conjugate base were also estimated to examine their potential contribution to the overall partitioning of NAs. Sodium salts and anions of naphthenic acids are predicted to have log KOW values on average 4 and 6 log units lower, respectively, than the corresponding neutral NA. Partitioning properties are profoundly influenced by the prevailing pH relative to the substance's pKa at the relevant temperature. Copyright © 2015 Elsevier Ltd. All rights reserved.
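
The pH dependence reported above follows the standard ionization correction for a monoprotic acid. A minimal sketch with illustrative parameter values (not the paper's COSMO-RS estimates), neglecting any partitioning of the anion:

```python
import math

def log_d(log_kow_neutral, pka, ph):
    """Ionization-corrected octanol-water distribution ratio of a monoprotic
    acid, neglecting partitioning of the anion:
    log D = log Kow - log10(1 + 10**(pH - pKa))."""
    return log_kow_neutral - math.log10(1.0 + 10.0 ** (ph - pka))

# A model naphthenic acid with log Kow = 4 and pKa = 5 (illustrative values)
neutral = log_d(4.0, 5.0, 3.0)   # pH well below pKa: log D ~ log Kow
ionized = log_d(4.0, 5.0, 9.0)   # pH well above pKa: log D collapses
```

Above the pKa, log D falls by roughly one log unit per pH unit, which reproduces the "dramatic decrease" in log D and log HD described in the abstract.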

  4. Modelo de Gerenciamento da Logística Reversa (Reverse Logistics Management Model)

    Directory of Open Access Journals (Sweden)

    Cecilia Toledo Hernández

    2012-01-01

    The increased number of products with shorter useful lives, the intensified use of e-commerce, and increasingly demanding laws on responsibility for product disposal, together with growing environmental awareness, have generated a large volume of returns, increasing the importance of Reverse Logistics for companies and for society in general. However, the literature shows that this is still a little-explored area, and there are therefore no concrete data with which to work and to explore opportunities for improvement. To mitigate this gap, a literature review on the relationship between Reverse Logistics and business performance was conducted, together with a survey of companies to verify how this relationship occurs. The main result, directly related to the objective of the work, is a conceptual model that broadens the managerial view of the Reverse Logistics process; the model includes performance indicators that allow the activity to be evaluated. The work also proposes the use of Multiple-Criteria Decision-Making methods, a tool that facilitates the selection of indicators according to company strategies.

  5. Comparison of robustness to outliers between robust poisson models and log-binomial models when estimating relative risks for common binary outcomes: a simulation study.

    Science.gov (United States)

    Chen, Wansu; Shi, Jiaxiao; Qian, Lei; Azen, Stanley P

    2014-06-26

    To estimate relative risks or risk ratios for common binary outcomes, the most popular model-based methods are the robust (also known as modified) Poisson and the log-binomial regression. Of the two methods, it is believed that the log-binomial regression yields more efficient estimators because it is maximum likelihood based, while the robust Poisson model may be less affected by outliers. Evidence to support the robustness of robust Poisson models in comparison with log-binomial models is very limited. In this study a simulation was conducted to evaluate the performance of the two methods in several scenarios where outliers existed. The findings indicate that for data coming from a population where the relationship between the outcome and the covariate was in a simple form (e.g. log-linear), the two models yielded comparable biases and mean square errors. However, if the true relationship contained a higher order term, the robust Poisson models consistently outperformed the log-binomial models even when the level of contamination is low. The robust Poisson models are more robust (or less sensitive) to outliers compared to the log-binomial models when estimating relative risks or risk ratios for common binary outcomes. Users should be aware of the limitations when choosing appropriate models to estimate relative risks or risk ratios.
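
For the saturated case of a single binary exposure, both models' point estimates reduce to the simple ratio of proportions, which makes the comparison concrete. A minimal sketch with a delta-method confidence interval and illustrative counts (not the simulation settings of the paper):

```python
import math

def risk_ratio_2x2(a, n1, c, n0):
    """Risk ratio for a single binary exposure (a events among n1 exposed,
    c events among n0 unexposed) with a delta-method 95% CI on the log scale.
    In this saturated case the log-binomial MLE and the robust Poisson point
    estimate both reduce to this ratio of proportions."""
    rr = (a / n1) / (c / n0)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n0)
    return rr, (rr * math.exp(-1.96 * se), rr * math.exp(1.96 * se))

rr, (lo, hi) = risk_ratio_2x2(30, 100, 15, 100)  # RR = 0.30 / 0.15 = 2.0
```

The two methods differ once continuous covariates enter the model, which is the regime the simulation study examines.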

  6. The influence of log soaking temperature on surface quality and integrity performance of birch (Betula pendula Roth) veneer

    Science.gov (United States)

    Anti Rohumaa; Toni Antikainen; Christopher G. Hunt; Charles R. Frihart; Mark Hughes

    2016-01-01

    Wood material surface properties play an important role in adhesive bond formation and performance. In the present study, a test method was developed to evaluate the integrity of the wood surface, and the results were used to understand bond performance. Materials used were rotary cut birch (Betula pendula Roth) veneers, produced from logs soaked at 20 or 70 °C prior...

  7. Development of self-learning Monte Carlo technique for more efficient modeling of nuclear logging measurements

    International Nuclear Information System (INIS)

    Zazula, J.M.

    1988-01-01

    The self-learning Monte Carlo technique has been implemented in the commonly used general-purpose neutron transport code MORSE, in order to enhance sampling of the particle histories that contribute to a detector response. The parameters of all the biasing techniques available in MORSE, i.e. splitting, Russian roulette, source and collision outgoing-energy importance sampling, path-length transformation, and additional biasing of the source angular distribution, are optimized. The learning process is performed iteratively after each batch of particles, by retrieving the data concerning the subset of histories that passed through the detector region and energy range in the previous batches. This procedure has been tested on two sample problems in nuclear geophysics, where an unoptimized Monte Carlo calculation is particularly inefficient. The results are encouraging, although the presented method does not directly minimize the variance and the convergence of our algorithm is restricted by the statistics of successful histories from the previous random walks. Further applications to the modeling of nuclear logging measurements seem promising. 11 refs., 2 figs., 3 tabs. (author)
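
Of the biasing techniques listed, Russian roulette is the simplest to illustrate: low-weight histories are terminated probabilistically while survivors are reweighted so the estimator stays unbiased. A minimal sketch (threshold and survival probability are illustrative, not MORSE parameters):

```python
import random

random.seed(1)

def russian_roulette(weight, threshold=0.1, survival=0.5):
    """Russian roulette: a particle below the weight threshold is killed with
    probability 1 - survival; survivors carry weight/survival, so the expected
    returned weight equals the input weight (unbiased)."""
    if weight >= threshold:
        return weight
    if random.random() < survival:
        return weight / survival
    return 0.0  # history terminated

# Unbiasedness check: the mean returned weight approximates the input weight
mean_weight = sum(russian_roulette(0.05) for _ in range(100_000)) / 100_000
```

Splitting is the mirror image (one heavy particle becomes several lighter ones); the self-learning step in the paper tunes the parameters of such games from previous batches.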

  8. A Poisson Log-Normal Model for Constructing Gene Covariation Network Using RNA-seq Data.

    Science.gov (United States)

    Choi, Yoonha; Coram, Marc; Peng, Jie; Tang, Hua

    2017-07-01

    Constructing expression networks using transcriptomic data is an effective approach for studying gene regulation. A popular approach for constructing such a network is based on the Gaussian graphical model (GGM), in which an edge between a pair of genes indicates that the expression levels of these two genes are conditionally dependent, given the expression levels of all other genes. However, GGMs are not appropriate for non-Gaussian data, such as those generated in RNA-seq experiments. We propose a novel statistical framework that maximizes a penalized likelihood in which the observed count data follow a Poisson log-normal distribution. To overcome the computational challenges, we use Laplace's method to approximate the likelihood and its gradients, and apply the alternating direction method of multipliers to find the penalized maximum likelihood estimates. The proposed method is evaluated and compared with GGMs using both simulated and real RNA-seq data. The proposed method shows improved performance in detecting edges that represent covarying pairs of genes, particularly for edges connecting low-abundance genes and edges around regulatory hubs.
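
The Poisson log-normal distribution itself is easy to simulate, which shows why it suits RNA-seq counts: the latent Gaussian layer induces overdispersion relative to a plain Poisson. A minimal sketch (parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_poisson_lognormal(mu, sigma, size):
    """Counts y ~ Poisson(exp(z)) with z ~ Normal(mu, sigma): the marginal of
    y is the overdispersed Poisson log-normal distribution."""
    return rng.poisson(np.exp(rng.normal(mu, sigma, size)))

y = sample_poisson_lognormal(mu=2.0, sigma=0.5, size=200_000)
# Known moments: E[y] = exp(mu + sigma^2/2); Var[y] exceeds E[y] (overdispersion)
```

In the multivariate version used by the paper, z is drawn from a multivariate normal whose inverse covariance encodes the gene network.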

  9. The optimization model of the logging machinery usage in forestry practice

    Directory of Open Access Journals (Sweden)

    Jitka Janová

    2009-01-01

    The decision support systems commonly used in industrial and business managerial practice for optimizing processes are based on the algorithmization of typical decision problems. In the Czech forestry business there is a lack of developed decision support systems that could be easily used in daily practice. This stems from the fact that the application of optimization methods is less successful in forestry decision making than in industry or economics, owing to the inherent complexity of forestry decision problems. There is ongoing research worldwide on optimization models applicable to forestry decision making, but the results are not globally applicable, and the costs of the software tools that may be required are considerable. Small and medium forestry companies in the Czech Republic, in particular, cannot afford such additional costs, although the results of optimization could positively influence not only the business itself but also the impact of the forestry business on the environment. Hence there is a need for user-friendly optimization models for forestry decision making in the Czech Republic that could be easily solved in commonly available software and whose results would be both realistic and easily applicable in daily decision making. The aim of this paper is to develop an optimization model for machinery-use planning in a Czech logging firm such that the results can be obtained using MS EXCEL. The goal is to identify the integer number of particular machines that should be outsourced for the next period when total cost minimization is required. A linear programming model is designed covering the typical restrictions on available machinery and the total volume of trees to be cut and transported. The model offers an additional result in the form of the optimal employment of particular machines. The solution procedure is described in detail and the results obtained are discussed with respect to their applicability in
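
The paper's MS EXCEL model is not reproduced here, but a structurally similar integer program (all data hypothetical) shows the shape of the formulation: minimize outsourcing cost subject to a total-volume constraint and machine availability bounds, with integer machine counts:

```python
import numpy as np
from scipy.optimize import Bounds, LinearConstraint, milp

# Hypothetical data: cost of outsourcing one machine for the season and the
# volume (m^3) it can process; none of these figures come from the paper.
cost = np.array([1500.0, 800.0])      # harvester, forwarder
capacity = np.array([1000.0, 500.0])  # m^3 per machine per season
volume = 5000.0                       # total volume of trees to cut and transport
available = np.array([8, 10])         # machines offered on the market

# minimize cost @ x  s.t.  capacity @ x >= volume,  0 <= x <= available, x integer
res = milp(
    c=cost,
    constraints=LinearConstraint(capacity, lb=volume, ub=np.inf),
    bounds=Bounds(0, available),
    integrality=np.ones_like(cost),
)
machines = np.round(res.x).astype(int)  # optimal integer machine counts
```

With these numbers the harvester is cheaper per cubic metre, so the solver selects five harvesters and no forwarders; a spreadsheet solver reaches the same optimum, which is the point of the paper's MS EXCEL implementation.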

  10. How to Obtain a Binary Logistic Regression Model with SPSS (Com obtenir un Model de Regressió Logística Binària amb SPSS)

    Directory of Open Access Journals (Sweden)

    Vanesa Berlanga-Silvente

    2014-04-01

    Logistic regression models are statistical models in which one wishes to know the relationship between a dichotomous qualitative dependent variable (binary or binomial logistic regression) and one or more independent explanatory variables, or covariates, whether qualitative or quantitative. A qualitative dependent variable with more than two values is also possible (multinomial logistic regression), although this article focuses on binary logistic regression. In any case, the initial equation of the model is of exponential type, although its logarithmic transformation (logit) allows its use as a linear function. The primary purpose of this technique is to model how the presence or absence of various factors, and their values or levels, influence the probability of occurrence of an event, which is usually dichotomous. This article on binary logistic regression explains the options offered by the SPSS statistical package (automatic "stepwise" methods) and the interpretation of the main results.
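
The logit transformation and the odds-ratio reading of SPSS's Exp(B) column can be illustrated in a few lines. The coefficients below are hypothetical, not taken from any fitted model:

```python
import math

def logistic(eta):
    """Inverse logit link: P(Y = 1) = 1 / (1 + exp(-eta))."""
    return 1.0 / (1.0 + math.exp(-eta))

# Hypothetical coefficients, as SPSS would report them in the
# "Variables in the Equation" table: logit(p) = b0 + b1*x1 + b2*x2
b0, b1, b2 = -2.0, 0.8, 1.1
p = logistic(b0 + b1 * 1 + b2 * 0)  # predicted probability for x1=1, x2=0
exp_b1 = math.exp(b1)               # the Exp(B) column: odds ratio per unit x1
```

Exponentiating a coefficient gives the multiplicative change in the odds per unit increase of that covariate, which is the interpretation emphasized in the article.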

  11. Modeling and Inversion Methods for the Interpretation of Resistivity Logging Tool Response

    NARCIS (Netherlands)

    Anderson, B.I.

    2001-01-01

    The electrical resistivity measured by well logging tools is one of the most important rock parameters for indicating the amount of hydrocarbons present in a reservoir. The main interpretation challenge is to invert the measured data, solving for the true resistivity values in each zone of a

  12. Modeling nest survival of cavity-nesting birds in relation to postfire salvage logging

    Science.gov (United States)

    Vicki Saab; Robin E. Russell; Jay Rotella; Jonathan G. Dudley

    2011-01-01

    Salvage logging practices in recently burned forests often have direct effects on species associated with dead trees, particularly cavity-nesting birds. As such, evaluation of postfire management practices on nest survival rates of cavity nesters is necessary for determining conservation strategies. We monitored 1,797 nests of 6 cavity-nesting bird species: Lewis'...

  13. Integration of seismic and well log data for petrophysical modeling of ...

    African Journals Online (AJOL)

    For accurate reservoir property determination, four well logs and seismic data covering Xlines 5500 to 5900 and Inlines 1480 to 1720 were used to delineate the hydraulic zones of two reservoirs of interest and to determine the average petrophysical properties of the reservoirs. All the wells contained GR, resistivity, sonic and ...

  14. Reading Logs and Literature Teaching Models in English Language Teacher Education

    Science.gov (United States)

    Ochoa Delarriva, Ornella; Basabe, Enrique Alejandro

    2016-01-01

    Reading logs are regularly used in foreign language education since they are not only critical in the development of reading comprehension but may also be instrumental in taking readers beyond the referential into the representational realms of language. In this paper we offer the results of a qualitative analysis of a series of reading logs…

  15. An inexact log-normal distribution-based stochastic chance-constrained model for agricultural water quality management

    Science.gov (United States)

    Wang, Yu; Fan, Jie; Xu, Ye; Sun, Wei; Chen, Dong

    2018-05-01

    In this study, an inexact log-normal-based stochastic chance-constrained programming model was developed for solving the non-point source pollution issues caused by agricultural activities. Compared to the general stochastic chance-constrained programming model, the main advantage of the proposed model is that it allows random variables to be expressed as a log-normal distribution, rather than a general normal distribution. Possible deviations in solutions caused by irrational parameter assumptions were avoided. The agricultural system management in the Erhai Lake watershed was used as a case study, where critical system factors, including rainfall and runoff amounts, show characteristics of a log-normal distribution. Several interval solutions were obtained under different constraint-satisfaction levels, which were useful in evaluating the trade-off between system economy and reliability. The applied results show that the proposed model could help decision makers to design optimal production patterns under complex uncertainties. The successful application of this model is expected to provide a good example for agricultural management in many other watersheds.
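
A log-normal chance constraint admits a simple deterministic equivalent through the quantile function, which is the mechanism such models exploit. A minimal sketch with illustrative parameters (the watershed data of the paper are not reproduced):

```python
import math
from statistics import NormalDist

def lognormal_quantile(mu, sigma, q):
    """q-quantile of a log-normal variable exp(Normal(mu, sigma))."""
    return math.exp(mu + sigma * NormalDist().inv_cdf(q))

def max_allowed_load(mu, sigma, satisfaction=0.90):
    """Deterministic equivalent of the chance constraint
    P(load <= capacity) >= satisfaction, with log-normal capacity: the planned
    load may not exceed the (1 - satisfaction) quantile of capacity."""
    return lognormal_quantile(mu, sigma, 1.0 - satisfaction)

safe_90 = max_allowed_load(1.0, 0.5, 0.90)
safe_95 = max_allowed_load(1.0, 0.5, 0.95)  # stricter reliability, smaller load
```

Solving the model at several satisfaction levels yields the interval solutions the abstract mentions, tracing the trade-off between system economy and reliability.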

  16. QSPR Models for Predicting Log Pliver Values for Volatile Organic Compounds Combining Statistical Methods and Domain Knowledge

    Directory of Open Access Journals (Sweden)

    Mónica F. Díaz

    2012-12-01

    Full Text Available Volatile organic compounds (VOCs) are contained in a variety of chemicals that can be found in household products and may have undesirable effects on health. Therefore, it is important to model blood-to-liver partition coefficients (log Pliver) for VOCs in a fast and inexpensive way. In this paper, we present two new quantitative structure-property relationship (QSPR) models for the prediction of log Pliver, where we also propose a hybrid approach for the selection of the descriptors. This hybrid methodology combines a machine learning method with a manual selection based on expert knowledge, which makes it possible to obtain a set of descriptors that is interpretable in physicochemical terms. Our regression models were trained using decision trees and neural networks and validated using an external test set. Results show high prediction accuracy compared to previous log Pliver models, and the descriptor selection approach provides a means to obtain a small set of descriptors that is in agreement with theoretical understanding of the target property.

  17. Reducing Monte Carlo error in the Bayesian estimation of risk ratios using log-binomial regression models.

    Science.gov (United States)

    Salmerón, Diego; Cano, Juan A; Chirlaque, María D

    2015-08-30

    In cohort studies, binary outcomes are very often analyzed by logistic regression. However, it is well known that when the goal is to estimate a risk ratio, logistic regression is inappropriate if the outcome is common. In these cases, a log-binomial regression model is preferable. On the other hand, the estimation of the regression coefficients of the log-binomial model is difficult owing to the constraints that must be imposed on these coefficients. Bayesian methods allow a straightforward approach for log-binomial regression models and produce smaller mean squared errors in the estimation of risk ratios than frequentist methods, and the posterior inferences can be obtained using the software WinBUGS. However, Markov chain Monte Carlo methods implemented in WinBUGS can lead to large Monte Carlo errors in the approximations to the posterior inferences because they produce correlated simulations, and the accuracy of the approximations is inversely related to this correlation. To reduce correlation and to improve accuracy, we propose a reparameterization based on a Poisson model and a sampling algorithm coded in R. Copyright © 2015 John Wiley & Sons, Ltd.
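The distinction motivating the paper — risk ratios versus odds ratios when the outcome is common — is easy to see on a toy 2×2 cohort table (all counts hypothetical). With a single binary exposure, the exposure coefficient of a log-binomial model equals log(RR):

```python
import math

# Hypothetical cohort counts: rows = exposed/unexposed, cols = event/no event
a, b = 60, 40   # exposed:   60 events, 40 non-events
c, d = 30, 70   # unexposed: 30 events, 70 non-events

risk_exposed   = a / (a + b)          # 0.60
risk_unexposed = c / (c + d)          # 0.30
rr  = risk_exposed / risk_unexposed   # risk ratio: 2.0
or_ = (a * d) / (b * c)               # odds ratio: 3.5 -- overstates the RR here

# With one binary covariate, the fitted log-binomial coefficient is log(RR):
beta_logbin = math.log(rr)
```

The gap between 2.0 and 3.5 is exactly the "common outcome" problem the abstract refers to: the odds ratio only approximates the risk ratio when events are rare.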

  18. Modeling the relationships among internal defect features and external Appalachian hardwood log defect indicators

    Science.gov (United States)

    R. Edward. Thomas

    2009-01-01

    As a hardwood tree grows and develops, surface defects such as branch stubs and wounds are overgrown. Evidence of these defects remains on the log surface for decades, and in many instances for the life of the tree. As the tree grows, the defect is encapsulated or grown over by new wood. During this process the appearance of the defect in the tree's bark changes. The...

  19. Weighted log-linear models for service delivery points in Ethiopia: a case of modern contraceptive users at health facilities.

    Science.gov (United States)

    Workie, Demeke Lakew; Zike, Dereje Tesfaye; Fenta, Haile Mekonnen; Mekonnen, Mulusew Admasu

    2018-05-10

    Ethiopia is among the countries with a low prevalence of contraceptive use, resulting in a high total fertility rate and unwanted pregnancies, which in turn affect maternal and child health. This study aimed to investigate the major factors that affect the number of modern contraceptive users at service delivery points in Ethiopia. The Performance Monitoring and Accountability 2020/Ethiopia data, collected between March and April 2016 at round 4 from 461 eligible service delivery points, were used in this study. A weighted log-linear negative binomial model was applied to analyze the service delivery point data. Half of the service delivery points in Ethiopia served 61 modern contraceptive users, with an interquartile range of 0.62. The expected log number of modern contraceptive users in rural areas was 1.05 (95% Wald CI: -1.42 to -0.68) lower than in urban areas. In addition, the expected log count of modern contraceptive users at other facility types was 0.58 lower than at health centers. The number of nurses/midwives also affected the number of modern contraceptive users: the incidence rate of modern contraceptive users increased with each additional nurse at the delivery point. Among the factors considered in this study, residence, region, facility type, the number of days per week family planning is offered, the number of nurses/midwives and the number of medical assistants were found to be associated with the number of modern contraceptive users. The Government of Ethiopia should therefore take immediate steps to address these factors affecting the number of modern contraceptive users in Ethiopia.
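The "expected log count" differences reported above convert to multiplicative rate ratios by exponentiation; a one-line illustration using the rural-vs-urban coefficient quoted in the abstract:

```python
import math

beta_rural = -1.05                  # rural-vs-urban difference in expected log count
rate_ratio = math.exp(beta_rural)   # rural SDPs see roughly 35% of the urban rate
```

This is the standard interpretation of coefficients in a log-linear (log-link) count model: additive on the log scale, multiplicative on the count scale.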

  20. Revisiting maximum-a-posteriori estimation in log-concave models: from differential geometry to decision theory

    OpenAIRE

    Pereyra, Marcelo

    2016-01-01

    Maximum-a-posteriori (MAP) estimation is the main Bayesian estimation methodology in many areas of data science such as mathematical imaging and machine learning, where high dimensionality is addressed by using models that are log-concave and whose posterior mode can be computed efficiently by using convex optimisation algorithms. However, despite its success and rapid adoption, MAP estimation is not theoretically well understood yet, and the prevalent view is that it is generally not proper ...

  1. Correlation Models for Temperature Fields

    KAUST Repository

    North, Gerald R.

    2011-05-16

    This paper presents derivations of some analytical forms for spatial correlations of evolving random fields governed by a white-noise-driven damped diffusion equation that is the analog of autoregressive order 1 in time and autoregressive order 2 in space. The study considers the two-dimensional plane and the surface of a sphere, both of which have been studied before, but here time is introduced to the problem. Such models have a finite characteristic length (roughly the separation at which the autocorrelation falls to 1/e) and a relaxation time scale. In particular, the characteristic length of a particular temporal Fourier component of the field increases to a finite value as the frequency of the particular component decreases. Some near-analytical formulas are provided for the results. A potential application is to the correlation structure of surface temperature fields and to the estimation of large area averages, depending on how the original datastream is filtered into a distribution of Fourier frequencies (e.g., moving average, low pass, or narrow band). The form of the governing equation is just that of the simple energy balance climate models, which have a long history in climate studies. The physical motivation provided by the derivation from a climate model provides some heuristic appeal to the approach and suggests extensions of the work to nonuniform cases.

  2. Correlation Models for Temperature Fields

    KAUST Repository

    North, Gerald R.; Wang, Jue; Genton, Marc G.

    2011-01-01

    This paper presents derivations of some analytical forms for spatial correlations of evolving random fields governed by a white-noise-driven damped diffusion equation that is the analog of autoregressive order 1 in time and autoregressive order 2 in space. The study considers the two-dimensional plane and the surface of a sphere, both of which have been studied before, but here time is introduced to the problem. Such models have a finite characteristic length (roughly the separation at which the autocorrelation falls to 1/e) and a relaxation time scale. In particular, the characteristic length of a particular temporal Fourier component of the field increases to a finite value as the frequency of the particular component decreases. Some near-analytical formulas are provided for the results. A potential application is to the correlation structure of surface temperature fields and to the estimation of large area averages, depending on how the original datastream is filtered into a distribution of Fourier frequencies (e.g., moving average, low pass, or narrow band). The form of the governing equation is just that of the simple energy balance climate models, which have a long history in climate studies. The physical motivation provided by the derivation from a climate model provides some heuristic appeal to the approach and suggests extensions of the work to nonuniform cases.
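The "characteristic length" used in both records above (the separation at which the autocorrelation falls to 1/e) can be found numerically for any correlation function. A minimal sketch with an assumed exponential correlation form, for which the answer is the length parameter itself:

```python
import math

def efolding_length(rho, r_max=100.0, step=1e-3):
    """Numerically find the separation where the autocorrelation rho(r)
    first falls to 1/e -- the 'characteristic length' described above."""
    target = 1.0 / math.e
    r = 0.0
    while r < r_max and rho(r) > target:
        r += step
    return r

# For an exponential correlation rho(r) = exp(-r/L), the e-folding
# separation recovers L itself:
L = 15.0
lam = efolding_length(lambda r: math.exp(-r / L))
```

For the frequency-dependent correlations derived in the paper, the same routine would be applied to each temporal Fourier component's spatial autocorrelation.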

  3. APPLICATION OF GIS AND GROUNDWATER MODELLING TECHNIQUES TO IDENTIFY THE PERCHED AQUIFERS TO DEMARCATE WATER LOGGING CONDITIONS IN PARTS OF MEHSANA

    Directory of Open Access Journals (Sweden)

    D. Rawal

    2016-06-01

    The study highlights the application of GIS in establishing the basic parameters of soil, land use and the distribution of water logging over a period of time. The groundwater modelling identifies the groundwater regime of the area, estimates the total recharge to the area due to surface water irrigation and rainfall, and suggests suitable methods to control water logging in the area.

  4. Evaluation of the Weibull and log-normal distribution functions as survival models of Escherichia coli under isothermal and non-isothermal conditions.

    Science.gov (United States)

    Aragao, Glaucia M F; Corradini, Maria G; Normand, Mark D; Peleg, Micha

    2007-11-01

    Published survival curves of Escherichia coli in two growth media, with and without the presence of salt, at various temperatures and in a Greek eggplant salad having various levels of essential oil, all had a characteristic downward concavity when plotted on semilogarithmic coordinates. Some also exhibited what appeared to be a 'shoulder' of considerable length. Regardless of whether a shoulder was noticed, the survival pattern could be considered a manifestation of an underlying unimodal distribution of the cells' death times. Mathematically, the data could be described equally well by the Weibull and log-normal distribution functions, which had similar modes, means, standard deviations and coefficients of skewness. When plotted in their probability density function (PDF) form, the curves also appeared very similar visually. This enabled us to quantify and compare the effect of temperature or essential oil concentration on the organism's survival in terms of these temporal distributions' characteristics. Increased lethality was generally expressed in a shorter mean and mode, a smaller standard deviation and increased overall symmetry as judged by the distributions' degree of skewness. The 'shoulder', as expected, simply indicated that the distribution's standard deviation was much smaller than its mode. Rate models based on the two distribution functions could be used to predict non-isothermal survival patterns. They were derived on the assumption that the momentary inactivation rate is the isothermal rate at the momentary temperature at a time that corresponds to the momentary survival ratio. In this application, however, the Weibullian model with a fixed power was not only simpler and more convenient mathematically than the one based on the log-normal distribution, but it also provided more accurate estimates of the dynamic inactivation patterns.
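The central quantity in this line of work — the time t_xD for x log cycles of reduction under a Weibullian (power-law) survival model — follows directly from inverting log10 S(t) = -b·t^n. A minimal sketch; the parameter values are illustrative, not fitted values from the paper:

```python
def log10_survival(t, b, n):
    """Weibullian survival model: log10(N(t)/N0) = -b * t**n."""
    return -b * t ** n

def t_x_reductions(x, b, n):
    """Time to achieve x log10 reductions, t_xD = (x/b)**(1/n)."""
    return (x / b) ** (1.0 / n)

# Illustrative parameters: n > 1 gives the downward concavity on
# semilogarithmic coordinates described in the abstract.
t5 = t_x_reductions(5.0, b=0.01, n=2.0)   # time for a 5-log reduction
```

Note that for n = 1 the model collapses to classical log-linear kinetics, where t_1D is the familiar D-value.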

  5. East to west retardation in the onset of the recent warming across Canada inferred from inversions of temperature logs

    Czech Academy of Sciences Publication Activity Database

    Majorowicz, J.; Šafanda, Jan; Skinner, W.

    2002-01-01

    Roč. 107, B10 (2002), s. ETG6 1-12 ISSN 0148-0227 Institutional research plan: CEZ:AV0Z3012916 Keywords : Canada climate warming * borehole temperature * geothermics Subject RIV: DB - Geology ; Mineralogy Impact factor: 2.245, year: 2002

  6. Ground surface warming history in northern Canada inferred from inversions of temperature logs and comparison with other proxy climate reconstructions

    Czech Academy of Sciences Publication Activity Database

    Majorowicz, J. A.; Skinner, W. R.; Šafanda, Jan

    2005-01-01

    Roč. 162, č. 2 (2005), s. 109-128 ISSN 0033-4553 Institutional research plan: CEZ:AV0Z30120515 Keywords: global warming * regional climate variability and change * borehole temperatures Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 0.975, year: 2005

  7. Effect of postglacial warming seen in high precision temperature log deep into the granites in NE Alberta

    Czech Academy of Sciences Publication Activity Database

    Majorowicz, J.; Šafanda, Jan

    2015-01-01

    Roč. 104, č. 6 (2015), s. 1563-1571 ISSN 1437-3254 R&D Projects: GA ČR(CZ) GAP210/11/0183 Institutional support: RVO:67985530 Keywords: surface processes * borehole temperatures * climatic warming * Ice Age * heat flow Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 2.133, year: 2015

  8. Structural equation and log-linear modeling: a comparison of methods in the analysis of a study on caregivers' health

    Directory of Open Access Journals (Sweden)

    Rosenbaum Peter L

    2006-10-01

    Full Text Available Abstract Background: In this paper we compare the results of an analysis of determinants of caregivers' health derived from two approaches, a structural equation model and a log-linear model, using the same data set. Methods: The data were collected from a cross-sectional population-based sample of 468 families in Ontario, Canada who had a child with cerebral palsy (CP). The self-completed questionnaires and the home-based interviews used in this study included scales reflecting socio-economic status, child and caregiver characteristics, and the physical and psychological well-being of the caregivers. Both analytic models were used to evaluate the relationships between child behaviour, caregiving demands, coping factors, and the well-being of primary caregivers of children with CP. Results: The results were compared, together with an assessment of the positive and negative aspects of each approach, including their practical and conceptual implications. Conclusion: No important differences were found in the substantive conclusions of the two analyses. The broad confirmation of the Structural Equation Modeling (SEM) results by the Log-linear Modeling (LLM) provided some reassurance that the SEM had been adequately specified, and that it broadly fitted the data.

  9. Detection and quantification of local anthropogenic and regional climatic transient signals in temperature logs from Czechia and Slovenia

    Czech Academy of Sciences Publication Activity Database

    Dědeček, Petr; Šafanda, Jan; Rajver, D.

    2012-01-01

    Roč. 113, č. 3-4 (2012), s. 787-801 ISSN 0165-0009 R&D Projects: GA ČR(CZ) GAP210/11/0183; GA AV ČR KSK3046108; GA ČR GETOP/08/E014 Institutional research plan: CEZ:AV0Z30120515 Keywords: subsurface temperature * thermal conductivity * urbanization Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 3.634, year: 2012

  10. Log-correlated random-energy models with extensive free-energy fluctuations: Pathologies caused by rare events as signatures of phase transitions

    Science.gov (United States)

    Cao, Xiangyu; Fyodorov, Yan V.; Le Doussal, Pierre

    2018-02-01

    We address systematically an apparently nonphysical behavior of the free-energy moment generating function for several instances of logarithmically correlated models: the fractional Brownian motion with Hurst index H = 0 (fBm0) (and its bridge version), a one-dimensional model appearing in decaying Burgers turbulence with log-correlated initial conditions and, finally, the two-dimensional log-correlated random-energy model (logREM) introduced in Cao et al. [Phys. Rev. Lett. 118, 090601 (2017), 10.1103/PhysRevLett.118.090601], based on the two-dimensional Gaussian free field with background charges and directly related to the Liouville field theory. All these models share anomalously large fluctuations of the associated free energy, with a variance proportional to the log of the system size. We argue that a seemingly nonphysical vanishing of the moment generating function for some values of parameters is related to the termination point transition (i.e., prefreezing). We study the associated universal log corrections in the frozen phase, both for logREMs and for the standard REM, filling a gap in the literature. For the above-mentioned integrable instances of logREMs, we predict the nontrivial free-energy cumulants describing non-Gaussian fluctuations on top of the Gaussian with extensive variance. Some of the predictions are tested numerically.

  11. Geothermal well log interpretation midterm report

    Energy Technology Data Exchange (ETDEWEB)

    Sanyal, S.K.; Wells, L.E.; Bickham, R.E.

    1979-02-01

    Reservoir types are defined according to fluid phase and temperature, lithology, geologic province, pore geometry, and salinity and fluid chemistry. Improvements are needed in lithology and porosity definition, fracture detection, and thermal evaluation for more accurate interpretation. Further efforts are directed toward improving diagnostic techniques for relating rock characteristics and log response, developing petrophysical models for geothermal systems, and developing thermal evaluation techniques. The Geothermal Well Log Interpretation study and report have concentrated only on hydrothermal geothermal reservoirs; other geothermal reservoirs (hot dry rock, geopressured, etc.) are not considered.

  12. A DDES model with a Smagorinsky-type eddy viscosity formulation and log-layer mismatch correction

    International Nuclear Information System (INIS)

    Reddy, K.R.; Ryon, J.A.; Durbin, P.A.

    2014-01-01

    Highlights: • An alternate DDES formulation is proposed via the eddy viscosity definition. • Eddy viscosity is expressed as a Smagorinsky-type formula. • Log-layer mismatch is corrected by changing the length scale definition. • Model is validated for 2D as well as 3D flows. - Abstract: The current work develops a variant of delayed detached eddy simulation (DDES) that could be characterized as limiting the production term. Previous formulations have been based on limiting the dissipation rate (Spalart et al., 2006). A clipped length scale is applied directly to the eddy viscosity, yielding a Smagorinsky-like formulation when the model is on the eddy simulation branch. That clipped eddy viscosity limits the production rate. The length scale is modified in order to account for the log-layer mismatch (a well-known issue with DDES), without using additional blending functions. Another view of our approach is that the subgrid eddy viscosity is represented by a mixing-length formula l²ω; in the eddy field, ω acts like a filtered rate of strain. Our model is validated for channel flow as well as separated flows (backward-facing step, 2D periodic hills) and illustrated via an air-blast atomizer.
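The two ingredients named in the highlights — a clipped length scale applied directly to a mixing-length eddy viscosity of the form ν_t = l²ω — can be sketched schematically. All constants and values below are illustrative placeholders, not the paper's calibrated model:

```python
def clipped_length(l_rans, delta, c_des=0.65):
    """DDES-style clipping: on the eddy-simulation branch the grid
    scale limits the length scale (c_des value is illustrative)."""
    return min(l_rans, c_des * delta)

def eddy_viscosity(l, omega):
    """Mixing-length form nu_t = l**2 * omega; per the abstract, omega
    acts like a filtered rate of strain in the eddy field."""
    return l * l * omega

l = clipped_length(l_rans=1.0, delta=0.1)   # grid-limited (LES) branch
nu_t = eddy_viscosity(l, omega=100.0)
```

Because ν_t multiplies the strain in the production term, clipping l caps production directly, which is the contrast the abstract draws with dissipation-limiting formulations.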

  13. Markovian Model in High Order Sequence Prediction From Log-Motif Patterns in Agbada Paralic Section, Niger Delta, Nigeria

    International Nuclear Information System (INIS)

    Olabode, S. O.; Adekoya, J. A.

    2002-01-01

    Markovian modelling of high-order sequences was applied to repetitive events of regressive and transgressive phases in the Agbada paralic section, Niger Delta. The repetitive events are made up of delta front, delta topset and fluvio-deltaic sediments. The sediments consist of sands, sandstones, siltstones and shales in various proportions. Five wells, MN1, AA1, NP2, NP6 and NP8, were studied. A summary of the biostratigraphic report and well log-motif patterns was used to delineate the third-order depositional sequences in the wells. Various Markovian properties - the observed transition frequency matrix, observed transition probability matrix, fixed probability vector, expected random matrix (randomised transition matrix) and difference matrix - were determined for stacked high-order sequences (high-frequency cyclic events) nested within the third-order sequences, using the log-motif patterns for the various sand bodies and shales. Flow diagrams were constructed for each of the depositional sequences to determine the likely number of cycles. The upward transition matrix between the log-motif patterns and the flow diagrams constructed to elucidate cyclicity show that the overall regressive sequence of the Niger Delta has been modified by deltaic depositional elements and fluctuations in sea level. The prediction of higher-order sequences within third-order sequences from Markovian properties provides a good basis for correlation within the depositional sequences. The model has also been used to decipher the dominant depositional processes during the formation of the sequences. Discrete reservoir intervals and seal potentials within the sequences were also predicted from the flow diagrams constructed.
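The first two Markovian properties listed — the observed transition frequency matrix and the transition probability matrix — are straightforward to compute from an upward succession of log-motif states. A toy sketch; the facies sequence below is hypothetical, not from the studied wells:

```python
def transition_matrices(sequence):
    """Observed transition frequency and row-normalised probability
    matrices for an ordered sequence of states (e.g. upward facies)."""
    states = sorted(set(sequence))
    idx = {s: i for i, s in enumerate(states)}
    n = len(states)
    freq = [[0] * n for _ in range(n)]
    for a, b in zip(sequence, sequence[1:]):   # consecutive (from, to) pairs
        freq[idx[a]][idx[b]] += 1
    prob = []
    for row in freq:
        total = sum(row)
        prob.append([c / total if total else 0.0 for c in row])
    return states, freq, prob

# hypothetical upward log-motif succession in one well
section = ["shale", "sand", "shale", "silt", "sand", "shale", "sand"]
states, freq, prob = transition_matrices(section)
```

Comparing `prob` against the expected random (randomised) matrix, entry by entry, yields the difference matrix used to flag preferred upward transitions, i.e. cyclicity.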

  14. Competency-based residency training and the web log: modeling practice-based learning and enhancing medical knowledge

    Directory of Open Access Journals (Sweden)

    Matthew F. Hollon

    2015-12-01

    Full Text Available Background: By using web-based tools in medical education, there are opportunities to innovatively teach important principles from the general competencies of graduate medical education. Objectives: Postulating that faculty transparency in learning from uncertainties in clinical work could help residents to incorporate the principles of practice-based learning and improvement (PBLI in their professional development, faculty in this community-based residency program modeled the steps of PBLI on a weekly basis through the use of a web log. Method: The program confidentially surveyed residents before and after this project about actions consistent with PBLI and knowledge acquired through reading the web log. Results: The frequency that residents encountered clinical situations where they felt uncertain declined over the course of the 24 weeks of the project from a mean frequency of uncertainty of 36% to 28% (Wilcoxon signed rank test, p=0.008; however, the frequency with which residents sought answers when faced with uncertainty did not change (Wilcoxon signed rank test, p=0.39, remaining high at approximately 80%. Residents answered a mean of 52% of knowledge questions correct when tested prior to faculty posts to the blog, rising to a mean of 65% of questions correct when tested at the end of the project (paired t-test, p=0.001. Conclusions: Faculty role modeling of PBLI behaviors and posting clinical questions and answers to a web log led to modest improvements in medical knowledge but did not alter behavior that was already taking place frequently among residents.

  15. Past surface temperature changes as derived from continental temperature logs - Canadian and some global examples of application of a new tool in climate change studies

    Czech Academy of Sciences Publication Activity Database

    Majorowicz, J.; Šafanda, Jan; Skinner, W.

    2004-01-01

    Roč. 47, - (2004), s. 113-174 ISSN 0065-2687 R&D Projects: GA AV ČR KSK3046108 Institutional research plan: CEZ:AV0Z3012916 Keywords: well temperature * global warming * surface temperature Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 1.667, year: 2004

  16. Data logging of body temperatures provides precise information on phenology of reproductive events in a free-living arctic hibernator

    Science.gov (United States)

    Williams, C.T.; Sheriff, M.J.; Schmutz, J.A.; Kohl, F.; Toien, O.; Buck, C.L.; Barnes, B.M.

    2011-01-01

    Precise measures of phenology are critical to understanding how animals organize their annual cycles and how individuals and populations respond to climate-induced changes in physical and ecological stressors. We show that patterns of core body temperature (Tb) can be used to precisely determine the timing of key seasonal events including hibernation, mating and parturition, and immergence and emergence from the hibernacula in free-living arctic ground squirrels (Urocitellus parryii). Using temperature loggers that recorded Tb every 20 min for up to 18 months, we monitored core Tb from three females that subsequently gave birth in captivity and from 66 female and 57 male ground squirrels free-living in the northern foothills of the Brooks Range, Alaska. In addition, dates of emergence from hibernation were visually confirmed for four free-living male squirrels. Average Tb in captive females decreased by 0.5–1.0°C during gestation and abruptly increased by 1–1.5°C on the day of parturition. In free-living females, similar shifts in Tb were observed in 78% (n = 9) of yearlings and 94% (n = 31) of adults; females without the shift are assumed not to have given birth. Three of four ground squirrels for which dates of emergence from hibernation were visually confirmed did not exhibit obvious diurnal rhythms in Tb until they first emerged onto the surface, when Tb patterns became diurnal. In free-living males undergoing reproductive maturation, this pre-emergence euthermic interval averaged 20.4 days (n = 56). Tb-loggers represent a cost-effective and logistically feasible method to precisely investigate the phenology of reproduction and hibernation in ground squirrels.
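A crude sketch of how the parturition signature described above — an abrupt day-over-day rise in mean Tb after the gestational dip — could be flagged programmatically. The series and threshold are hypothetical, not the study's detection procedure:

```python
def detect_abrupt_rise(daily_tb, threshold=1.0):
    """Return the first index where mean daily Tb rises by >= threshold
    degrees C relative to the previous day, else None."""
    for i in range(1, len(daily_tb)):
        if daily_tb[i] - daily_tb[i - 1] >= threshold:
            return i
    return None

# hypothetical daily means: gradual gestational dip, then a 1.3 C jump
tb = [36.8, 36.7, 36.3, 36.2, 37.5, 37.4]
day = detect_abrupt_rise(tb)   # flags the jump at index 4
```

In practice the 20-min logger records would first be averaged to daily means, and animals with no qualifying rise would be classified as not having given birth.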

  17. Dynamic Model of High Temperature PEM Fuel Cell Stack Temperature

    DEFF Research Database (Denmark)

    Andreasen, Søren Juhl; Kær, Søren Knudsen

    2007-01-01

    The present work involves the development of a model for predicting the dynamic temperature of a high temperature PEM (HTPEM) fuel cell stack. The model is developed to test different thermal control strategies before implementing them in the actual system. The test system consists of a prototype cathode air cooled 30 cell HTPEM fuel cell stack developed at the Institute of Energy Technology at Aalborg University. This fuel cell stack uses PEMEAS Celtec P-1000 membranes and runs on pure hydrogen in a dead end anode configuration with a purge valve. The cooling of the stack is managed by running the stack at a high stoichiometric air flow. This is possible because of the PBI fuel cell membranes used, and the very low pressure drop in the stack. The model consists of a discrete thermal model dividing the stack into three parts: inlet, middle and end, and predicting the temperatures in these three parts.
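The "discrete thermal model dividing the stack into three parts" could be sketched as a lumped three-node chain with conduction between neighbours and convective loss to the cooling air. All coefficients below are illustrative placeholders, not the model's identified parameters:

```python
def step_three_node(T, t_cool, q, dt=1.0, c=200.0, k=5.0, h=2.0):
    """One explicit Euler step of a 3-node (inlet, middle, end) chain:
    c * dT_i/dt = q_i + k * sum(T_neighbour - T_i) - h * (T_i - t_cool)."""
    out = []
    for i, Ti in enumerate(T):
        cond = 0.0
        if i > 0:
            cond += k * (T[i - 1] - Ti)   # conduction from upstream node
        if i < len(T) - 1:
            cond += k * (T[i + 1] - Ti)   # conduction from downstream node
        out.append(Ti + dt * (q[i] + cond - h * (Ti - t_cool)) / c)
    return out

# cooling from a hot uniform state with no heat generation
temps = step_three_node([160.0, 160.0, 160.0], t_cool=25.0, q=[0.0, 0.0, 0.0])
```

Iterating this step with the heat-generation terms q_i driven by the stack current would reproduce the kind of dynamic temperature response the model is meant to predict.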

  18. Temperature Calculations in the Coastal Modeling System

    Science.gov (United States)

    2017-04-01

    ERDC/CHL CHETN-IV-110, April 2017. Approved for public release; distribution is unlimited. Temperature Calculations in the Coastal Modeling System ... (tide) and river discharge at model boundaries, wave radiation stress, and wind forcing over a model computational domain. Physical processes calculated ... Temperature is calculated in the CMS using the following meteorological parameters: solar radiation, cloud cover, air temperature, wind speed, and surface water temperature.

  19. Log-gamma linear-mixed effects models for multiple outcomes with application to a longitudinal glaucoma study

    Science.gov (United States)

    Zhang, Peng; Luo, Dandan; Li, Pengfei; Sharpsten, Lucie; Medeiros, Felipe A.

    2015-01-01

    Glaucoma is a progressive disease due to damage in the optic nerve with associated functional losses. Although the relationship between structural and functional progression in glaucoma is well established, there is disagreement on how this association evolves over time. In addressing this issue, we propose a new class of non-Gaussian linear-mixed models to estimate the correlations among subject-specific effects in multivariate longitudinal studies with a skewed distribution of random effects, to be used in a study of glaucoma. This class provides an efficient estimation of subject-specific effects by modeling the skewed random effects through the log-gamma distribution. It also provides more reliable estimates of the correlations between the random effects. To validate the log-gamma assumption against the usual normality assumption of the random effects, we propose a lack-of-fit test using the profile likelihood function of the shape parameter. We apply this method to data from a prospective observation study, the Diagnostic Innovations in Glaucoma Study, to present a statistically significant association between structural and functional change rates that leads to a better understanding of the progression of glaucoma over time. PMID:26075565

  20. Patterns for a log-based strengthening of declarative compliance models

    NARCIS (Netherlands)

    Schunselaar, Dennis M.M.; Maggi, Fabrizio M.; Sidorova, Natalia

    2012-01-01

    LTL-based declarative process models are very effective when modelling loosely structured processes or working in environments with a lot of variability. A process model is represented by a set of constraints that must be satisfied during the process execution. An important application of such

  1. Log Linear Models for Religious and Social Factors affecting the practice of Family Planning Methods in Lahore, Pakistan

    Directory of Open Access Journals (Sweden)

    Farooq Ahmad

    2006-01-01

    Full Text Available This is a cross-sectional study based on 304 households (couples with wives aged under 48 years) chosen from an urban locality (Lahore city). Fourteen religious, demographic and socio-economic factors of a categorical nature, such as husband's education, wife's education, husband's monthly income, husband's occupation, household size, husband-wife discussion, number of living children, desire for more children, duration of marriage, present age of wife, age of wife at marriage, offering of prayers, political view, and religious decisions, were taken to understand acceptance of family planning. Multivariate log-linear analysis was applied to identify association patterns and interrelationships among the factors. A logit model was applied to explore the relationship between the predictor factors and the dependent factor, and to determine which factors most strongly influence the acceptance of family planning. The log-linear analysis demonstrated that contraceptive use was consistently associated with husband-wife discussion, desire for more children, number of children, political view and duration of married life, while husband's monthly income, husband's occupation, age of wife at marriage and offering of prayers provided no statistical explanation for the adoption of family planning methods.

  2. Weather Derivatives and Stochastic Modelling of Temperature

    Directory of Open Access Journals (Sweden)

    Fred Espen Benth

    2011-01-01

    Full Text Available We propose a continuous-time autoregressive model for the temperature dynamics with volatility being the product of a seasonal function and a stochastic process. We use the Barndorff-Nielsen and Shephard model for the stochastic volatility. The proposed temperature dynamics is flexible enough to model temperature data accurately, while remaining analytically tractable. Futures prices for commonly traded contracts at the Chicago Mercantile Exchange on indices like cooling- and heating-degree days and cumulative average temperatures are computed, as well as option prices on them.
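A toy discretisation of this kind of temperature model — mean reversion toward a seasonal level with a seasonal volatility — can be simulated in a few lines. The model below is a simplified Euler sketch with constant (not stochastic) seasonal volatility and invented parameters, not the paper's CAR/Barndorff-Nielsen–Shephard specification:

```python
import math
import random

def simulate_temperature(days=365, kappa=0.2, seed=42):
    """Euler steps of a mean-reverting temperature path:
    dT = kappa*(s(t) - T) dt + sigma(t) dW, with seasonal mean s(t)
    and seasonal volatility sigma(t). All parameters are illustrative."""
    rng = random.Random(seed)
    s = lambda t: 10.0 + 10.0 * math.sin(2 * math.pi * t / 365.0)      # seasonal mean
    sigma = lambda t: 2.0 + 1.0 * math.cos(2 * math.pi * t / 365.0)    # seasonal vol
    T = s(0)
    path = [T]
    for t in range(1, days):
        T += kappa * (s(t) - T) + sigma(t) * rng.gauss(0.0, 1.0)
        path.append(T)
    return path

path = simulate_temperature()
```

Degree-day indices (HDD/CDD) and hence futures payoffs could then be accumulated directly from such simulated paths, which is the Monte Carlo counterpart of the paper's analytical pricing.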

  3. Modeling maximum daily temperature using a varying coefficient regression model

    Science.gov (United States)

    Han Li; Xinwei Deng; Dong-Yum Kim; Eric P. Smith

    2014-01-01

    Relationships between stream water and air temperatures are often modeled using linear or nonlinear regression methods. Despite a strong relationship between water and air temperatures and a variety of models that are effective for data summarized on a weekly basis, such models did not yield consistently good predictions for summaries such as daily maximum temperature...

  4. An automated method to build groundwater model hydrostratigraphy from airborne electromagnetic data and lithological borehole logs

    DEFF Research Database (Denmark)

    Marker, Pernille Aabye; Foged, N.; He, X.

    2015-01-01

    Large-scale integrated hydrological models are important decision support tools in water resources management. The largest source of uncertainty in such models is the hydrostratigraphic model. Geometry and configuration of hydrogeological units are often poorly determined from hydrogeological data......-scale groundwater models. We present a novel method to automatically integrate large AEM data-sets and lithological information into large-scale hydrological models. Clay-fraction maps are produced by translating geophysical resistivity into clay-fraction values using lithological borehole information. Voxel models of electrical resistivity and clay fraction are classified into hydrostratigraphic zones using k-means clustering. Hydraulic conductivity values of the zones are estimated by hydrological calibration using hydraulic head and stream discharge observations. The method is applied to a Danish case study......
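
    The clustering step described in this record can be sketched in a few lines. This is a toy illustration, not the authors' workflow: a from-scratch k-means (Lloyd's algorithm with deterministic farthest-point initialization) applied to synthetic log-resistivity/clay-fraction pairs for three hypothetical units.

```python
import numpy as np

def kmeans(X, k=3, iters=50):
    """Lloyd's k-means with deterministic farthest-point initialization."""
    centroids = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centroids], axis=0)
        centroids.append(X[d.argmax()])          # farthest point so far
    centroids = np.array(centroids, dtype=float)
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)                # assign to nearest centroid
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# Synthetic stand-ins for log10(resistivity) and clay fraction in three units
rng = np.random.default_rng(1)
sand = np.column_stack([rng.normal(2.0, 0.1, 100), rng.normal(0.1, 0.03, 100)])
till = np.column_stack([rng.normal(1.5, 0.1, 100), rng.normal(0.5, 0.05, 100)])
clay = np.column_stack([rng.normal(1.0, 0.1, 100), rng.normal(0.9, 0.03, 100)])
X = np.vstack([sand, till, clay])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize mixed-unit features
labels, _ = kmeans(Xs, k=3)
```

    In the papers' setting each zone would then receive a single hydraulic conductivity, estimated by calibration against heads and discharge.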

  5. Long-term impacts of selective logging on two Amazonian tree species with contrasting ecological and reproductive characteristics: inferences from Eco-gene model simulations.

    Science.gov (United States)

    Vinson, C C; Kanashiro, M; Sebbenn, A M; Williams, T C R; Harris, S A; Boshier, D H

    2015-08-01

    The impact of logging and subsequent recovery after logging is predicted to vary depending on specific life history traits of the logged species. The Eco-gene simulation model was used to evaluate the long-term impacts of selective logging over 300 years on two contrasting Brazilian Amazon tree species, Dipteryx odorata and Jacaranda copaia. D. odorata (Leguminosae), a slow growing climax tree, occurs at very low densities, whereas J. copaia (Bignoniaceae) is a fast growing pioneer tree that occurs at high densities. Microsatellite multilocus genotypes of the pre-logging populations were used as data inputs for the Eco-gene model and post-logging genetic data was used to verify the output from the simulations. Overall, under current Brazilian forest management regulations, there were neither short nor long-term impacts on J. copaia. By contrast, D. odorata cannot be sustainably logged under current regulations, a sustainable scenario was achieved by increasing the minimum cutting diameter at breast height from 50 to 100 cm over 30-year logging cycles. Genetic parameters were only slightly affected by selective logging, with reductions in the numbers of alleles and single genotypes. In the short term, the loss of alleles seen in J. copaia simulations was the same as in real data, whereas fewer alleles were lost in D. odorata simulations than in the field. The different impacts and periods of recovery for each species support the idea that ecological and genetic information are essential at species, ecological guild or reproductive group levels to help derive sustainable management scenarios for tropical forests.

  6. Interpretation of horizontal well production logs: influence of logging tool

    Energy Technology Data Exchange (ETDEWEB)

    Ozkan, E. [Colorado School of Mines, Boulder, CO (United States); Sarica, C. [Pennsylvania State Univ., College Park, PA (United States); Haci, M. [Drilling Measurements, Inc (United States)

    1998-12-31

    The influence of a production-logging tool on wellbore flow rate and pressure measurements was investigated, focusing on the disturbance caused by the production-logging tool and the coiled tubing on the original flow conditions in the wellbore. The investigation was carried out using an analytical model, and single-phase liquid flow was assumed. Results showed that the production-logging tool influenced the measurements, as shown by deviations from the original flow-rate and pressure profiles, particularly in low-conductivity wellbores. High production rates increase the effect of the production-logging tool. Recovering or inferring the original flow conditions in the wellbore from the production-logging data is a very complex process which cannot be solved easily. For this reason, the conditions under which the information obtained by production logging is meaningful are of considerable practical interest. 7 refs., 2 tabs., 15 figs.

  7. Performance evaluation of groundwater model hydrostratigraphy from airborne electromagnetic data and lithological borehole logs

    DEFF Research Database (Denmark)

    Marker, Pernille Aabye; Foged, N.; He, X.

    2015-01-01

    Large-scale hydrological models are important decision support tools in water resources management. The largest source of uncertainty in such models is the hydrostratigraphic model. Geometry and configuration of hydrogeological units are often poorly determined from hydrogeological data alone. Due...... present a novel method to automatically integrate large AEM data sets and lithological information into large-scale hydrological models. Clay-fraction maps are produced by translating geophysical resistivity into clay-fraction values using lithological borehole information. Voxel models of electrical resistivity and clay fraction are classified into hydrostratigraphic zones using k-means clustering. Hydraulic conductivity values of the zones are estimated by hydrological calibration using hydraulic head and stream discharge observations. The method is applied to a Danish case study. Benchmarking......

  8. Analysis of the Factors Affecting the Interval between Blood Donations Using Log-Normal Hazard Model with Gamma Correlated Frailties.

    Science.gov (United States)

    Tavakol, Najmeh; Kheiri, Soleiman; Sedehi, Morteza

    2016-01-01

    Time to donating blood plays a major role in a first-time donor becoming a regular one. The aim of this study was to determine the factors affecting the interval between blood donations. In a longitudinal study in 2008, 864 first-time donors at the Shahrekord Blood Transfusion Center, capital city of Chaharmahal and Bakhtiari Province, Iran, were selected by systematic sampling and followed up for five years. Among these, a subset of 424 donors who had at least two successful blood donations was chosen for this study, and the time intervals between their donations were measured as the response variable. Sex, body weight, age, marital status, education, place of residence and job were recorded as independent variables. Data analysis was performed using a log-normal hazard model with gamma correlated frailty, in which the frailties are the sum of two independent components assumed to follow a gamma distribution. The analysis was done via a Bayesian approach using a Markov chain Monte Carlo algorithm in OpenBUGS. Convergence was checked via the Gelman-Rubin criterion using the BOA package in R. Age, job and education had a significant effect on the chance of donating blood: the chance of donation was higher for older donors, clerical workers, labourers, the self-employed, students and educated donors, and correspondingly the time intervals between their blood donations were shorter. Given the significant effect of some variables in the log-normal correlated frailty model, it is necessary to plan educational and cultural programs to encourage people with longer inter-donation intervals to donate more frequently.

  9. Bayesian Poisson log-bilinear models for mortality projections with multiple populations

    NARCIS (Netherlands)

    Antonio, K.; Bardoutsos, A.; Ouburg, W.

    2015-01-01

    Life insurers, pension funds, health care providers and social security institutions face increasing expenses due to continuing improvements of mortality rates. The actuarial and demographic literature has introduced a myriad of (deterministic and stochastic) models to forecast mortality rates of
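
    The abstract is truncated, but the Poisson log-bilinear family it refers to extends the classical Lee-Carter model, log m(x,t) = a_x + b_x k_t. The sketch below shows the classical SVD-based estimator on invented data; the paper's Bayesian Poisson fitting for multiple populations is more involved.

```python
import numpy as np

# Invented log mortality surface: log m(x,t) = a_x + b_x * k_t + noise
rng = np.random.default_rng(0)
ages, years = 10, 30
a = np.linspace(-8.0, -2.0, ages)       # age profile of log mortality
b = np.linspace(0.05, 0.15, ages)       # age-specific sensitivity (sums to 1)
k = -0.5 * np.arange(years)             # downward period trend (improvement)
logm = a[:, None] + b[:, None] * k[None, :] + rng.normal(0, 0.01, (ages, years))

# Lee-Carter estimation: a_x = row means, (b_x, k_t) from the rank-1 SVD
a_hat = logm.mean(axis=1)
U, s, Vt = np.linalg.svd(logm - a_hat[:, None], full_matrices=False)
b_hat, k_hat = U[:, 0], s[0] * Vt[0]
b_sum = b_hat.sum()
b_hat, k_hat = b_hat / b_sum, k_hat * b_sum           # constraint: sum(b) = 1
shift = k_hat.mean()
k_hat, a_hat = k_hat - shift, a_hat + b_hat * shift   # constraint: sum(k) = 0
fit = a_hat[:, None] + b_hat[:, None] * k_hat[None, :]
```

    Forecasting then reduces to extrapolating the single time index k_t, e.g. with a random walk with drift.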

  10. Design Thinking and Cloud Manufacturing: A Study of Cloud Model Sharing Platform Based on Separated Data Log

    Directory of Open Access Journals (Sweden)

    Zhe Wei

    2013-01-01

    Full Text Available To solve the product data consistency problem caused by portable systems that cannot update product data in real time in a mobile environment under the mass-customization production mode, a new log-based optimistic replication method for product data is presented. This paper focuses on the design-thinking provider, probing into a manufacturing-resource design-thinking cloud platform based on manufacturing resource-locating technologies, and also discusses several application scenarios of cloud locating technologies in the manufacturing environment. The actual demand of manufacturing creates a new mode which is service-oriented and has high efficiency and low consumption. Finally, this differs from the crowd-sourcing application model of Local-Motors. The sharing platform operator is responsible for the master plan of the platform, proposing an open interface standard and establishing a service operation mode.

  11. Informing groundwater model hydrostratigraphy with airborne time-domain electromagnetic data and borehole logs

    DEFF Research Database (Denmark)

    Marker, Pernille Aabye; Bauer-Gottwein, Peter; Mosegaard, Klaus

    lithological information directly into groundwater models is proposed. The approach builds on a clay-fraction inversion, which is a spatially variable translation of resistivity values from EM data into clay-fraction values using borehole lithological information. Hydrostratigraphical units are obtained through a k-means cluster analysis of the principal components of resistivity and clay-fraction values. Under the assumption that the units have uniform hydrological properties, the units constitute the hydrostratigraphy for a groundwater model. Only aquifer structures are obtained from geophysical and lithological data, while the estimation of the hydrological properties of the units is inversely derived from the groundwater model and hydrological data. A synthetic analysis was performed to investigate the principles underlying the clustering approach using three petrophysical relationships between......

  12. Validation of the OpCost logging cost model using contractor surveys

    Science.gov (United States)

    Conor K. Bell; Robert F. Keefe; Jeremy S. Fried

    2017-01-01

    OpCost is a harvest and fuel treatment operations cost model developed to function as both a standalone tool and an integrated component of the Bioregional Inventory Originated Simulation Under Management (BioSum) analytical framework for landscape-level analysis of forest management alternatives. OpCost is an updated implementation of the Fuel Reduction Cost Simulator...

  13. 2d forward modelling of marine CSEM survey geometry for seabed logging

    International Nuclear Information System (INIS)

    Hussain, N.; Noh, M.; Yahya, N.B.

    2011-01-01

    Hydrocarbon reserve exploration in deep water is done by geophysical surveys. Previously, seismic surveys were used almost exclusively, but they give ambiguous results when distinguishing water-saturated from hydrocarbon-saturated reservoirs. A recent development for the detection of hydrocarbon reservoirs in deeper water is the marine controlled-source electromagnetic (MCSEM) geophysical survey. MCSEM is sensitive to the electrical conductivity of rocks, by which it can differentiate between hydrocarbon and water-saturated reservoirs. MCSEM survey geometry plays a vital role and may cause anomalies in synthetic data; the method is sensitive to survey geometry (e.g. source dipping, rotation and speed, receiver orientation), which causes anomalies. Interpreting subsurface structure from survey data requires a good understanding of the effects of survey-geometry anomalies. Forward modelling is an alternative to a real-time survey for studying the aforementioned anomalies. In this paper, the finite difference method (FDM) is implemented for 2D forward modelling to gain a qualitative understanding of how the induced electromagnetic (EM) signal changes its overall pattern while interacting with physical earth properties. A stratified earth structure is developed and modelled in MatLabTM software to study the behaviour of the EM field with physical earth properties. Results obtained from the 2D geological models are also discussed in this paper. (author)
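
    As a much-simplified analogue of the 2D finite-difference modelling described here (a DC potential problem rather than the full frequency-domain CSEM computation, with invented layer conductivities), the sketch below solves div(σ∇u) = 0 over a stratified model and shows that nearly the entire potential drop occurs across the resistive layer, which is the physical contrast MCSEM exploits.

```python
import numpy as np

def solve_potential(sigma, top=1.0, bottom=0.0, iters=20000):
    """Jacobi-style finite-difference solution of div(sigma grad u) = 0
    on a uniform 2D grid: fixed potentials on the top and bottom rows,
    insulating (zero-flux) side walls."""
    ny, nx = sigma.shape
    u = np.linspace(top, bottom, ny)[:, None] * np.ones((1, nx))
    for _ in range(iters):
        # Arithmetic averaging of conductivity on the four cell faces
        sN = 0.5 * (sigma[1:-1, 1:-1] + sigma[:-2, 1:-1])
        sS = 0.5 * (sigma[1:-1, 1:-1] + sigma[2:, 1:-1])
        sW = 0.5 * (sigma[1:-1, 1:-1] + sigma[1:-1, :-2])
        sE = 0.5 * (sigma[1:-1, 1:-1] + sigma[1:-1, 2:])
        u[1:-1, 1:-1] = (sN * u[:-2, 1:-1] + sS * u[2:, 1:-1]
                         + sW * u[1:-1, :-2] + sE * u[1:-1, 2:]) / (sN + sS + sW + sE)
        u[:, 0], u[:, -1] = u[:, 1], u[:, -2]   # zero-flux side walls
        u[0, :], u[-1, :] = top, bottom         # Dirichlet top/bottom
    return u

# Two-layer model: conductive overburden over a resistive layer
sigma = np.ones((21, 11))
sigma[10:, :] = 0.01        # resistive layer (e.g. hydrocarbon-bearing)
u = solve_potential(sigma)
print(round(u[10, 5], 3))   # potential at the layer interface
```

    A series-resistor estimate predicts the interface potential stays close to the top value, since the resistive layer carries almost all of the drop.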

  14. Low-field NMR logging sensor for measuring hydraulic parameters of model soils

    Science.gov (United States)

    Sucre, Oscar; Pohlmeier, Andreas; Minière, Adrien; Blümich, Bernhard

    2011-08-01

    Knowing the exact hydraulic parameters of soils is very important for improving water management in agriculture and for the refinement of climate models. Up to now, however, the investigation of such parameters has required applying two techniques simultaneously, which is time-consuming and invasive. The objective of this study is therefore to present a single, non-invasive method to measure hydraulic parameters of model soils using low-field nuclear magnetic resonance (NMR). Two model soils, a clay and a sand, were each filled into a 2 m-long acetate column with an integrated PVC tube. After a soil was completely saturated with water, a low-field NMR sensor was moved up and down inside the PVC tube to quantitatively measure the initial water content along the whole column. Thereafter, both columns were allowed to drain, while the NMR sensor was set at a fixed depth to measure the water content of that soil slice. Once hydraulic equilibrium was reached in each column, a final moisture profile was taken along the whole column. Three curves were thus generated: (1) the initial moisture profile, (2) the evolution of the moisture depletion at the fixed depth, and (3) the final moisture profile. All three curves were then inversely analysed with a MATLAB code against numerical data produced with the van Genuchten-Mualem model, yielding a set of values (α, n, θr and θs) for the hydraulic parameters of the soils under investigation. Additionally, the complete decaying NMR signal could be analysed through inverse Laplace transformation and averaged in 1/T2 space. By measuring the decay in pure water, the effect of the sample on the relaxation could be estimated from the obtained spectra. The shift of the sample-related average with decreasing saturation indicates an enhancement of surface relaxation as the soil dries.
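
    The van Genuchten-Mualem retention model used in the inverse analysis has a closed form for the water content θ(h). A short sketch with textbook-style sandy-loam parameter values (illustrative only, not the values fitted in the study):

```python
import numpy as np

def van_genuchten_theta(h, alpha, n, theta_r, theta_s):
    """Water content theta(h) from the van Genuchten retention model.
    h is matric suction (positive); m = 1 - 1/n (Mualem constraint)."""
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * np.abs(h)) ** n) ** (-m)   # effective saturation
    return theta_r + (theta_s - theta_r) * se

# Illustrative sandy-loam-like parameters (alpha in 1/cm, h in cm)
h = np.logspace(-1, 3, 50)          # suction from 0.1 to 1000 cm
theta = van_genuchten_theta(h, alpha=0.075, n=1.89, theta_r=0.065, theta_s=0.41)
```

    The inverse analysis described in the abstract amounts to adjusting (α, n, θr, θs) until curves generated by this model reproduce the three measured moisture curves.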

  15. Temperature Dependent Models of Semiconductor Devices for ...

    African Journals Online (AJOL)

    The paper presents an investigation of the temperature-dependent models of a diode and a bipolar transistor built into the NAP-2 program and a comparison of these models with experimentally measured characteristics of the BA 100 diode and BC 109 transistor. The details of the modelling technique have been discussed and ...

  16. Temperature Modelling of the Biomass Pretreatment Process

    DEFF Research Database (Denmark)

    Prunescu, Remus Mihail; Blanke, Mogens; Jensen, Jakob M.

    2012-01-01

    In a second generation biorefinery, the biomass pretreatment stage has an important contribution to the efficiency of the downstream processing units involved in biofuel production. Most of the pretreatment process occurs in a large pressurized thermal reactor that presents an irregular temperature...... that captures the environmental temperature differences inside the reactor using distributed parameters. A Kalman filter is then added to account for any missing dynamics and the overall model is embedded into a temperature soft sensor. The operator of the plant will be able to observe the temperature in any...

  17. Multiple Temperature Model for Near Continuum Flows

    International Nuclear Information System (INIS)

    XU, Kun; Liu, Hongwei; Jiang, Jianzheng

    2007-01-01

    In the near continuum flow regime, the flow may have different translational temperatures in different directions. It is well known that for increasingly rarefied flow fields, the predictions from continuum formulation, such as the Navier-Stokes equations, lose accuracy. These inaccuracies may be partially due to the single temperature assumption in the Navier-Stokes equations. Here, based on the gas-kinetic Bhatnagar-Gross-Krook (BGK) equation, a multitranslational temperature model is proposed and used in the flow calculations. In order to fix all three translational temperatures, two constraints are additionally proposed to model the energy exchange in different directions. Based on the multiple temperature assumption, the Navier-Stokes relation between the stress and strain is replaced by the temperature relaxation term, and the Navier-Stokes assumption is recovered only in the limiting case when the flow is close to the equilibrium with the same temperature in different directions. In order to validate the current model, both the Couette and Poiseuille flows are studied in the transition flow regime

  18. Data Logging and Data Modelling: Using seismology and seismic data to create challenge in the academic classroom.

    Science.gov (United States)

    Neighbour, Gordon

    2013-04-01

    In 2012 Computing and Information Technology was disapplied from the English National Curriculum and therefore no longer has a compulsory programme of study. Data logging and data modelling are still essential components of the curriculum in the Computing and Information Technology classroom. Once the students have mastered the basics of both spreadsheet and information handling software they need to be further challenged. All too often the data used in relation to data logging and data handling are not realistic enough to really challenge very able students. However, using data from seismology allows students to manipulate "real" data and enhances their experience of geo-science, developing their skills and then allowing them to build on this work in both the science and geography classroom. This new scheme of work "Seismology at School" has allowed the students to work and develop skills beyond those normally expected for their age group and has allowed them to better appreciate their learning experience of "Natural Hazards" in the science and geography classroom in later years. The students undertake research to help them develop their understanding of earthquakes. This includes using materials from other nations within the European Economic Area, to also develop and challenge their use of Modern Foreign Languages. They are then challenged to create their own seismometers using simple kits and 'free' software - this "problem-solving" approach to their work is designed to enhance team-work and to extend the challenge they experience in the classroom. The students are then asked to manipulate a "real" set of data using international earthquake data from the most recent whole year. This allows the students to make use of many of the analytical and statistical functions of both spreadsheet software and information handling software in a meaningful way.
The students will need to have developed a hypothesis which their work should have provided either validation

  19. Modeling of concrete response at high temperature

    International Nuclear Information System (INIS)

    Pfeiffer, P.; Marchertas, A.

    1984-01-01

    A rate-type creep law is implemented into the computer code TEMP-STRESS for high temperature concrete analysis. The disposition of temperature, pore pressure and moisture for the particular structure in question is provided as input for the thermo-mechanical code. The loss of moisture from concrete also induces material shrinkage which is accounted for in the analytical model. Examples are given to illustrate the numerical results

  20. Geophysical constraints on Rio Grande rift structure and stratigraphy from magnetotelluric models and borehole resistivity logs, northern New Mexico

    Science.gov (United States)

    Rodriguez, Brian D.; Sawyer, David A.; Hudson, Mark R.; Grauch, V.J.S.

    2013-01-01

    Two- and three-dimensional electrical resistivity models derived from the magnetotelluric method were interpreted to provide more accurate hydrogeologic parameters for the Albuquerque and Española Basins. Analysis and interpretation of the resistivity models are aided by regional borehole resistivity data. Examination of the magnetotelluric response of hypothetical stratigraphic cases using resistivity characterizations from the borehole data elucidates two scenarios where the magnetotelluric method provides the strongest constraints. In the first scenario, the magnetotelluric method constrains the thickness of extensive volcanic cover, the underlying thickness of coarser-grained facies of buried Santa Fe Group sediments, and the depth to Precambrian basement or overlying Pennsylvanian limestones. In the second scenario, in the absence of volcanic cover, the magnetotelluric method constrains the thickness of coarser-grained facies of buried Santa Fe Group sediments and the depth to Precambrian basement or overlying Pennsylvanian limestones. Magnetotelluric surveys provide additional constraints on the relative positions of basement rocks and the thicknesses of Paleozoic, Mesozoic, and Tertiary sedimentary rocks in the region of the Albuquerque and Española Basins. The northern extent of a basement high beneath the Cerros del Rio volcanic field is delineated. Our results also reveal that the largest offset of the Hubbell Spring fault zone is located 5 km west of the exposed scarp. By correlating our resistivity models with surface geology and the deeper stratigraphic horizons using deep well log data, we are able to identify which of the resistivity variations in the upper 2 km belong to the upper Santa Fe Group sediment

  1. Comparison of Different Fuel Temperature Models

    Energy Technology Data Exchange (ETDEWEB)

    Weddig, Beatrice

    2003-02-01

    The purpose of this work is to improve the performance of the core calculation system used in Ringhals for in-core fuel management. It has been observed that, whereas the codes yield results that are in good agreement with measurements when the core operates at full nominal power, this agreement deteriorates noticeably when the reactor is running at reduced power. This deficiency of the code system was observed by comparing the calculated and measured boron concentrations in the moderator of the PWR. From the neutronic point of view, the difference between full power and reduced power in the same core is the different temperature of the fuel and the moderator. Whereas the coolant temperature can be measured and is thus relatively well known, the fuel temperature is only inferred from the moderator temperature as well as neutron physics and heat transfer calculations. The most likely reason for the above mentioned discrepancy is therefore the uncertainty of the fuel temperature at low power, and hence the incorrect calculation of the fuel temperature reactivity feedback through the so called Doppler effect. To obtain the fuel temperature at low power, usually some semi-empirical relations, sometimes called correlations, are used. The above-mentioned inaccuracy of the core calculation procedures can thus be tracked down to the insufficiency of these correlations. Therefore, the suggestion is that the above mentioned deficiency of the core calculation codes can be eliminated or reduced if the fuel temperature correlations are improved. An improved model, called the 30% model, is implemented in SIMULATE-3, the core calculation code used at Ringhals. The accuracy of the 30% model was compared to that of the present model by considering a number of cases, where measured values of the boron concentration at low power were available, and comparing them with calculated values using both the present and the new model. 
It was found that on the whole, the new fuel temperature

  2. Comparison of Different Fuel Temperature Models

    International Nuclear Information System (INIS)

    Weddig, Beatrice

    2003-02-01

    The purpose of this work is to improve the performance of the core calculation system used in Ringhals for in-core fuel management. It has been observed that, whereas the codes yield results that are in good agreement with measurements when the core operates at full nominal power, this agreement deteriorates noticeably when the reactor is running at reduced power. This deficiency of the code system was observed by comparing the calculated and measured boron concentrations in the moderator of the PWR. From the neutronic point of view, the difference between full power and reduced power in the same core is the different temperature of the fuel and the moderator. Whereas the coolant temperature can be measured and is thus relatively well known, the fuel temperature is only inferred from the moderator temperature as well as neutron physics and heat transfer calculations. The most likely reason for the above mentioned discrepancy is therefore the uncertainty of the fuel temperature at low power, and hence the incorrect calculation of the fuel temperature reactivity feedback through the so called Doppler effect. To obtain the fuel temperature at low power, usually some semi-empirical relations, sometimes called correlations, are used. The above-mentioned inaccuracy of the core calculation procedures can thus be tracked down to the insufficiency of these correlations. Therefore, the suggestion is that the above mentioned deficiency of the core calculation codes can be eliminated or reduced if the fuel temperature correlations are improved. An improved model, called the 30% model, is implemented in SIMULATE-3, the core calculation code used at Ringhals. The accuracy of the 30% model was compared to that of the present model by considering a number of cases, where measured values of the boron concentration at low power were available, and comparing them with calculated values using both the present and the new model. 
It was found that on the whole, the new fuel temperature

  3. FY 1999 report on the geothermal development promotion survey - Akinomiya area survey. Temperature/pressure logging before the long-term jetting test; 1999 nendo chinetsu kaihatsu sokushin chosa Akinomiya chiiki chosa hokokusho. Choki funshutsu shikenmae no ondo atsuryoku kenso

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-06-01

    As part of the FY 1999 geothermal development promotion survey of the Akinomiya area, the fluid rising speed and vertical permeability associated with the natural flow were calculated by determining the formation temperature and reservoir pressure in the survey area and the temperature distribution with depth. In the boreholes N9-AY-3, N10-AY-6, N10-AY-7 and N10-AY-8 in the Akinomiya area, temperature/pressure logging was conducted in the stationary state, a long time after drilling, water-filling tests and jetting tests had been completed. Continuous measurements were made while lowering a PTS logging tool, which simultaneously measures temperature, pressure and impeller revolution rate. As a result of the survey, it was inferred that borehole fluid flow may occur around depths of 980-1290 m and 1320-1540 m in N9-AY-3 and around 880-1090 m in N10-AY-8. The rising speed and permeability of fluid from each well were of the same order at the three wells. (NEDO)

  4. Objective classification of latent behavioral states in bio-logging data using multivariate-normal hidden Markov models.

    Science.gov (United States)

    Phillips, Joe Scutt; Patterson, Toby A; Leroy, Bruno; Pilling, Graham M; Nicol, Simon J

    2015-07-01

    many different types of noisy autocorrelated data, as typically found across a range of ecological systems. Summarizing time-series data into a multivariate assemblage of dimensions relevant to the desired classification provides a means to examine these data in an appropriate behavioral space. We discuss how outputs of these models can be applied to bio-logging and other imperfect behavioral data, providing easily interpretable models for hypothesis testing.
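
    For the decoding step of such a hidden Markov model, here is a minimal from-scratch sketch with known parameters and two invented behavioral states in a 2-D feature space. Real analyses, including the one summarized above, would also estimate the parameters (e.g. by EM or Bayesian methods) rather than assume them.

```python
import numpy as np

def viterbi(obs, means, covs, trans, start):
    """Most likely state path for a multivariate-normal HMM (log domain)."""
    def log_gauss(x, mu, cov):
        d, diff, inv = len(mu), x - mu, np.linalg.inv(cov)
        return -0.5 * (d * np.log(2 * np.pi) + np.log(np.linalg.det(cov))
                       + diff @ inv @ diff)

    T, K = len(obs), len(means)
    logB = np.array([[log_gauss(obs[t], means[k], covs[k]) for k in range(K)]
                     for t in range(T)])
    logA, delta = np.log(trans), np.log(start) + logB[0]
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + logA       # scores[i, j]: state i -> j
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + logB[t]
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):            # backtrack best predecessors
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Simulate two behavioral states ("resting" vs "travelling") in 2-D features
rng = np.random.default_rng(0)
means = [np.array([0.0, 0.0]), np.array([4.0, 4.0])]
covs = [np.eye(2) * 0.5, np.eye(2) * 0.5]
trans = np.array([[0.95, 0.05], [0.05, 0.95]])
states, s = [], 0
for _ in range(200):
    states.append(s)
    s = rng.choice(2, p=trans[s])
obs = np.array([rng.multivariate_normal(means[s], covs[s]) for s in states])
decoded = viterbi(obs, means, covs, trans, np.array([0.5, 0.5]))
accuracy = float(np.mean(np.array(decoded) == np.array(states)))
```

    With well-separated emission distributions the decoded path recovers the latent behavioral sequence almost exactly.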

  5. Effective model for deconfinement at high temperature

    International Nuclear Information System (INIS)

    Skokov, Vladimir

    2013-01-01

    In this talk I consider the deconfining phase transition at nonzero temperature in an SU(N) gauge theory, using a matrix model. I present some results, including the position of the deconfining critical endpoint, where the first-order transition for deconfinement is washed out by the presence of massive dynamical quarks, and properties of the phase transition in the limit of large N. I show that the model is soluble at infinite N, and exhibits a Gross-Witten-Wadia transition.

  6. Enhanced battery model including temperature effects

    NARCIS (Netherlands)

    Rosca, B.; Wilkins, S.

    2013-01-01

    Within electric and hybrid vehicles, batteries are used to provide/buffer the energy required for driving. However, battery performance varies throughout the temperature range specific to automotive applications, and as such, models that describe this behaviour are required. This paper presents a

  7. The interpretation of geochemical logs from the oceanic basement: mineral modelling in Ocean Drilling Program (ODP) Hole 735B

    International Nuclear Information System (INIS)

    Harvey, P.K.; Lovell, M.A.; Bristow, J.F.

    1991-01-01

    Leg 118 of the Ocean Drilling Program was carried out in the vicinity of the Southwest Indian Ridge. Of the boreholes drilled, by far the most important and scientifically spectacular is Hole 735B which was located on a shallow platform adjacent to the Atlantis II Transform. This hole penetrates some 500 m of gabbroic rocks representing Layer 3 of the oceanic crust. The recovered gabbros show considerable variation both in mineralogy and in the degree of deformation. Core recovery averages 87% and there is excellent control and correlation between the core and the wide range of logs obtained. Mineralogy logs are derived and presented using both core sample data and downhole geochemical logs for Hole 735B. The problems of transforming these data for the particular mineralogy encountered are discussed. (Author)

  8. A Novel Approach for Analysis of the Log-Linear Age-Period-Cohort Model: Application to Lung Cancer Incidence

    Directory of Open Access Journals (Sweden)

    Tengiz Mdzinarishvili

    2009-12-01

    Full Text Available A simple, computationally efficient procedure for analyses of the time period and birth cohort effects on the distribution of the age-specific incidence rates of cancers is proposed. Assuming that cohort effects for neighboring cohorts are almost equal and using the Log-Linear Age-Period-Cohort Model, this procedure allows one to evaluate temporal trends and birth cohort variations of any type of cancer without prior knowledge of the hazard function. This procedure was used to estimate the influence of time period and birth cohort effects on the distribution of the age-specific incidence rates of first primary, microscopically confirmed lung cancer (LC cases from the SEER9 database. It was shown that since 1975, the time period effect coefficients for men increase up to 1980 and then decrease until 2004. For women, these coefficients increase from 1975 up to 1990 and then remain nearly constant. The LC birth cohort effect coefficients for men and women increase from the cohort of 1890–94 until the cohort of 1925–29, then decrease until the cohort of 1950–54 and then remain almost unchanged. Overall, LC incidence rates, adjusted by period and cohort effects, increase up to the age of about 72–75, turn over, and then fall after the age of 75–78. The peak of the adjusted rates in men is around the age of 77–78, while in women, it is around the age of 72–73. Therefore, these results suggest that the age distribution of the incidence rates in men and women fall at old ages.
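
    As a stripped-down illustration of log-linear modelling of an incidence table (age and period effects only, no cohort term, invented counts; the paper's procedure for handling cohort effects is more elaborate), the maximum-likelihood fit of the independence model can be obtained by iterative proportional fitting:

```python
import numpy as np

# Simulated age x period table of case counts from a log-linear model
# with no cohort term: log(mu_ij) = age_i + period_j (values invented)
rng = np.random.default_rng(0)
age_eff = np.exp(np.linspace(0.0, 3.0, 8))     # incidence rising with age
per_eff = np.exp(np.linspace(0.0, 0.5, 6))     # mild period trend
counts = rng.poisson(10.0 * np.outer(age_eff, per_eff))

# Poisson MLE of the additive (age + period) log-linear model via
# iterative proportional fitting: fitted margins must match observed ones.
fit = np.ones_like(counts, dtype=float)
for _ in range(25):
    fit *= counts.sum(axis=1, keepdims=True) / fit.sum(axis=1, keepdims=True)
    fit *= counts.sum(axis=0, keepdims=True) / fit.sum(axis=0, keepdims=True)
```

    Deviations of the observed counts from this additive fit are what an age-period-cohort analysis attributes to birth-cohort effects.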

  9. Use of multispecies occupancy models to evaluate the response of bird communities to forest degradation associated with logging.

    Science.gov (United States)

    Carrillo-Rubio, Eduardo; Kéry, Marc; Morreale, Stephen J; Sullivan, Patrick J; Gardner, Beth; Cooch, Evan G; Lassoie, James P

    2014-08-01

    Forest degradation is arguably the greatest threat to biodiversity, ecosystem services, and rural livelihoods. Therefore, increasing understanding of how organisms respond to degradation is essential for management and conservation planning. We were motivated by the need for rapid and practical analytical tools to assess the influence of management and degradation on biodiversity and system state in areas subject to rapid environmental change. We compared bird community composition and size in managed (ejido, i.e., communally owned lands) and unmanaged (national park) forests in the Sierra Tarahumara region, Mexico, using multispecies occupancy models and data from a 2-year breeding bird survey. Unmanaged sites had on average higher species occupancy and richness than managed sites. Most species were present in low numbers as indicated by lower values of detection and occupancy associated with logging-induced degradation. Less than 10% of species had occupancy probabilities >0.5, and degradation had no positive effects on occupancy. The estimated metacommunity size of 125 exceeded previous estimates for the region, and sites with mature trees and uneven-aged forest stand characteristics contained the highest species richness. Higher estimation uncertainty and decreases in richness and occupancy for all species, including habitat generalists, were associated with degraded young, even-aged stands. Our findings show that multispecies occupancy methods provide tractable measures of biodiversity and system state and valuable decision support for landholders and managers. These techniques can be used to rapidly address gaps in biodiversity information, threats to biodiversity, and vulnerabilities of species of interest on a landscape level, even in degraded or fast-changing environments. Moreover, such tools may be particularly relevant in the assessment of species richness and distribution in a wide array of habitats. © 2014 Society for Conservation Biology.

  10. Predicting student satisfaction with courses based on log data from a virtual learning environment – a neural network and classification tree model

    Directory of Open Access Journals (Sweden)

    Ivana Đurđević Babić

    2015-03-01

Full Text Available Student satisfaction with courses in academic institutions is an important issue and is recognized as a form of support in ensuring effective and quality education, as well as enhancing student course experience. This paper investigates whether there is a connection between student satisfaction with courses and log data on student courses in a virtual learning environment. Furthermore, it explores whether a successful classification model for predicting student satisfaction with a course can be developed based on course log data, and compares the results obtained from the implemented methods. The research was conducted at the Faculty of Education in Osijek and included analysis of log data and course satisfaction on a sample of third and fourth year students. Multilayer Perceptron (MLP) neural networks with different activation functions, Radial Basis Function (RBF) neural networks, and classification tree models were developed, trained and tested in order to classify students into one of two categories of course satisfaction. Type I and type II errors, classification accuracy, and input variable importance were used for model comparison. The results indicate that a successful classification model using the tested methods can be created. The MLP model provides the highest average classification accuracy and the lowest preference for misclassifying students with a low level of course satisfaction, although a t-test for the difference in proportions showed that the difference in performance between the compared models is not statistically significant. Student involvement in forum discussions is recognized as a valuable predictor of student satisfaction with courses in all observed models.
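As a rough illustration of the classification setup (not the paper's actual data or tuned models), a minimal one-hidden-layer MLP can be trained on synthetic "course log" features to predict a binary satisfaction label:

```python
import numpy as np

# Synthetic stand-in for standardized course-log features (e.g. forum
# posts, logins) and a binary satisfaction label; purely illustrative.
rng = np.random.default_rng(42)
n = 400
X = rng.normal(size=(n, 2))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

# One hidden layer of 8 tanh units, sigmoid output, cross-entropy loss.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return h, p.ravel()

lr = 0.5
for _ in range(500):
    h, p = forward(X)
    g = (p - y)[:, None] / n          # gradient of mean cross-entropy wrt logits
    dW2 = h.T @ g; db2 = g.sum(0)
    dh = g @ W2.T * (1 - h ** 2)      # backprop through tanh
    dW1 = X.T @ dh; db1 = dh.sum(0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

_, p = forward(X)
acc = ((p > 0.5) == (y > 0.5)).mean()
print(f"training accuracy: {acc:.2f}")
```

In practice one would hold out a test set and compare type I/type II error rates across models, as the paper does.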

  11. Operational Modelling of High Temperature Electrolysis (HTE)

    International Nuclear Information System (INIS)

    Patrick Lovera; Franck Blein; Julien Vulliet

    2006-01-01

Solid Oxide Fuel Cells (SOFC) and High Temperature Electrolysis (HTE) are based on two opposite processes. The basic equations (the Nernst equation, corrected by an over-voltage term) are thus very similar; only a few signs differ. An operational model, based on measurable quantities, was developed for the HTE process and adapted to SOFCs. The model is analytical, which requires some complementary assumptions (proportionality of over-voltages to the current density, linearization of the logarithmic term in the Nernst equation). It allows hydrogen production by HTE to be determined using a limited number of parameters. At a given temperature, only one macroscopic parameter, related to over-voltages, is needed to adjust the model to the experimental results (SOFC) over a wide range of hydrogen flow-rates. For a given cell, this parameter follows an Arrhenius law with satisfactory precision. The prediction for the HTE process is compared to the available experimental results. (authors)
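The kind of operational model described, a Nernst open-circuit voltage plus an over-voltage assumed proportional to current density, can be sketched as follows. The standard-potential fit and all operating values are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

F = 96485.0   # Faraday constant, C/mol
R = 8.314     # gas constant, J/(mol K)

def cell_voltage(T, p_H2, p_H2O, p_O2, j, asr):
    """Electrolysis cell voltage: Nernst term + linear over-voltage.

    T in K, partial pressures in bar, j in A/cm^2, asr in ohm*cm^2.
    The standard-potential fit below is an approximate textbook fit for
    steam electrolysis, not a value taken from the paper.
    """
    E0 = 1.253 - 2.4516e-4 * T
    E_nernst = E0 + (R * T / (2 * F)) * np.log(p_H2 * np.sqrt(p_O2) / p_H2O)
    return E_nernst + asr * j   # single macroscopic over-voltage parameter

def h2_production(I):
    """Hydrogen molar flow (mol/s) from Faraday's law for current I (A)."""
    return I / (2 * F)

V = cell_voltage(T=1073.0, p_H2=0.5, p_H2O=0.5, p_O2=0.21, j=0.3, asr=0.5)
print(V, h2_production(10.0))
```

The single adjustable parameter here is `asr` (area-specific resistance), playing the role of the macroscopic over-voltage parameter the abstract refers to.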

  12. Face logging in Copenhagen Limestone, Denmark

    DEFF Research Database (Denmark)

    Jakobsen, Lisa; Foged, Niels Nielsen; Erichsen, Lars

    2015-01-01

The requirement for excavation support can be assessed from face logging. Face logs can also improve our knowledge of lithological and structural conditions within bedrock and supplement information from boreholes and geophysical logs. During the construction of 8 km of metro tunnel and 4 km of heating tunnel in Copenhagen, more than 2.5 km of face logs were made in 467 locations at underground stations, shafts, caverns and along bored tunnels. Over 160 geotechnical boreholes, many with geophysical logging, were executed prior to construction works. The bedrock consists of Paleogene "Copenhagen limestone". The induration degrees recorded in face logs and boreholes are compared and correlated. Distinct geophysical log markers are used to divide the limestone into three units. These marker horizons are correlated between face logs and geotechnical boreholes. A 3D model of the strength variations recorded within...

  13. http Log Analysis

    DEFF Research Database (Denmark)

    Bøving, Kristian Billeskov; Simonsen, Jesper

    2004-01-01

This article documents how log analysis can inform qualitative studies concerning the usage of web-based information systems (WIS). No prior research has used http log files as data to study collaboration between multiple users in organisational settings. We investigate how to perform http log analysis; what http log analysis says about the nature of collaborative WIS use; and how results from http log analysis may support other data collection methods such as surveys, interviews, and observation. The analysis of log files initially lends itself to research designs which serve to test hypotheses using a quantitative methodology. We show that http log analysis can also be valuable in qualitative research such as case studies. The results from http log analysis can be triangulated with other data sources and, for example, serve as a means of supporting the interpretation of interview data...

  14. Log N-log S is inconclusive

    Science.gov (United States)

    Klebesadel, R. W.; Fenimore, E. E.; Laros, J.

    1983-01-01

The log N-log S data acquired by the Pioneer Venus Orbiter Gamma Burst Detector (PVO) are presented and compared to similar data from the Soviet KONUS experiment. Although the PVO data are consistent with and suggestive of a -3/2 power law distribution, the results are not adequate at this stage of observations to differentiate between a -3/2 and a -1 power law slope.

  15. The Simulation of Temperature Field Based on 3D Modeling and Its Comparison versus Measured Temperature Distribution of Daqing Oilfield, NE China

    Science.gov (United States)

    Shi, Y.; Jiang, G.; Hu, S.

    2017-12-01

Daqing is the largest oil field of China, with more than 50 years of exploration and production history for oil and gas; its geothermal energy utilization started in 2000, with a main focus on district heating and direct use. In our ongoing study, data from multiple sources are collected, including BHT, DST, steady-state temperature measurements in deep wells and thermophysical properties of formations. Based on these measurements, an elaborate investigation of the temperature field of Daqing Oilfield is made. Moreover, through exploration for oil and gas, subsurface geometry, depth, thickness and properties of the stratigraphic layers have been extensively delineated by well logs and seismic profiles. A 3D model of the study area is developed incorporating the information of structure, stratigraphy, basal heat flow, and petrophysical and thermophysical properties of strata. Based on the model, a simulation of the temperature field of Daqing Oilfield is generated. A purely conductive regime is presumed, as demonstrated by measured temperature logs in deep wells. Wells W1, W2 and SK2 are used as key wells for model calibration. Among them, SK2, as part of the International Continental Deep Drilling Program, has a designed depth of 6400 m; the steady-state temperature measurement in the borehole has reached the depth of 4000 m. The results of temperature distribution generated from simulation and investigation are compared, in order to evaluate the potential of applying the method to other sedimentary basins with limited borehole temperature measurements but available structural, stratigraphic and thermal regime information.
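Under a purely conductive regime like the one assumed above, a 1D steady-state temperature profile follows directly from surface temperature, basal heat flow and layer conductivities (temperature increases by q·Δz/k across each layer, neglecting radiogenic heat production). The layer values below are illustrative, not Daqing data:

```python
# Layer-by-layer conductive geotherm: T at the base of each layer is
# T_top + q * thickness / conductivity (steady state, no heat production).
def conductive_profile(T0, q, layers):
    """T0: surface temperature (deg C); q: heat flow (W/m^2);
    layers: list of (thickness_m, conductivity_W_per_mK)."""
    temps = [T0]
    for dz, k in layers:
        temps.append(temps[-1] + q * dz / k)
    return temps

# e.g. 80 mW/m^2 through three hypothetical sedimentary layers
profile = conductive_profile(T0=5.0, q=0.080,
                             layers=[(1000, 1.5), (1500, 2.0), (1500, 2.5)])
print(profile)
```

Calibration against equilibrium temperature logs (as done with wells W1, W2 and SK2) then constrains the heat flow and conductivity values for the real basin.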

  16. The Meaning of Logs

    NARCIS (Netherlands)

    Etalle, Sandro; Massacci, Fabio; Yautsiukhin, Artsiom

    2007-01-01

While logging events is becoming increasingly common in computing, in communication and in collaborative work, log systems need to satisfy increasingly challenging (if not conflicting) requirements. Despite the growing pervasiveness of log systems, to date there is no high-level framework which...

  17. Modeling quantum fluid dynamics at nonzero temperatures

    Science.gov (United States)

    Berloff, Natalia G.; Brachet, Marc; Proukakis, Nick P.

    2014-01-01

    The detailed understanding of the intricate dynamics of quantum fluids, in particular in the rapidly growing subfield of quantum turbulence which elucidates the evolution of a vortex tangle in a superfluid, requires an in-depth understanding of the role of finite temperature in such systems. The Landau two-fluid model is the most successful hydrodynamical theory of superfluid helium, but by the nature of the scale separations it cannot give an adequate description of the processes involving vortex dynamics and interactions. In our contribution we introduce a framework based on a nonlinear classical-field equation that is mathematically identical to the Landau model and provides a mechanism for severing and coalescence of vortex lines, so that the questions related to the behavior of quantized vortices can be addressed self-consistently. The correct equation of state as well as nonlocality of interactions that leads to the existence of the roton minimum can also be introduced in such description. We review and apply the ideas developed for finite-temperature description of weakly interacting Bose gases as possible extensions and numerical refinements of the proposed method. We apply this method to elucidate the behavior of the vortices during expansion and contraction following the change in applied pressure. We show that at low temperatures, during the contraction of the vortex core as the negative pressure grows back to positive values, the vortex line density grows through a mechanism of vortex multiplication. This mechanism is suppressed at high temperatures. PMID:24704874

  18. POPULATION STRUCTURES OF FOUR TREE SPECIES IN LOGGED-OVER TROPICAL FOREST IN SOUTH PAPUA, INDONESIA: AN INTEGRAL PROJECTION MODEL APPROACH

    Directory of Open Access Journals (Sweden)

Relawan Kuswandi

    2015-12-01

Full Text Available Selective logging has been taking place in Papua for several decades. In contrast, very little is known about the stand structure of post-logged forest. Hence, this paper investigates stand structures in a logged-over area of tropical forest in South Papua. Four species were selected in three one-hectare permanent sample plots (PSPs): Vatica rassak, Syzygium sp., Litsea timoriana and Canarium asperum. PSPs were located in the forest concession area of PT. Tunas Sawaerma in Assiki, Boven Digul, in South Papua. Data sets comprised measurements made in 2005 and 2012, consisting of species, diameter at breast height (DBH), mortality and number of trees of each species. Integral Projection Models (IPMs) were developed, taking into account mortality, growth, recruitment and fecundity. Results show that the stand-structure patterns of the four species were broadly similar, i.e. more individual trees were present in the small diameter classes than in the larger diameter classes. The general pattern of the individual distribution of the four species is the typical reverse-J shape. Syzygium sp. has a greater number of individuals in the small diameter classes than the other three species. Population growth rates (λ) are above one, indicating that the populations of the four species are recuperating. In conclusion, these results suggest that species composition and population structure in these logged-over forests are progressively recovering.
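The IPM machinery can be sketched generically: discretize size into bins, build a kernel that combines survival, growth and fecundity, and take the population growth rate λ as the dominant eigenvalue of the discretized kernel. All vital-rate functions below are made up for illustration, so the resulting λ says nothing about the Papua plots:

```python
import numpy as np

# Discretize a DBH range into n midpoints (the "size" variable).
n = 100
edges = np.linspace(0.0, 50.0, n + 1)
x = 0.5 * (edges[:-1] + edges[1:])
h = edges[1] - edges[0]

def survival(s):                 # hypothetical: survival rises with size
    return 0.98 / (1.0 + np.exp(-(s - 10.0) / 5.0))

def growth(y, s):                # Gaussian growth density around s + increment
    mu, sd = s + 0.8, 0.5
    return np.exp(-0.5 * ((y - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

def fecundity(s):                # hypothetical recruits per tree per year
    return np.where(s > 20.0, 0.02 * (s - 20.0), 0.0)

def recruit_size(y):             # size distribution of new recruits
    sd = 1.0
    return np.exp(-0.5 * ((y - 2.0) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

# Kernel K(y, x) = survival(x) * growth(y|x) + fecundity(x) * recruit_size(y)
Y, S = np.meshgrid(x, x, indexing="ij")   # Y: destination size, S: origin size
K = h * (survival(S) * growth(Y, S) + fecundity(S) * recruit_size(Y))

lam = np.max(np.abs(np.linalg.eigvals(K)))  # dominant eigenvalue = lambda
print(f"lambda = {lam:.3f}")
```

λ > 1 indicates a growing population; with fitted vital rates, as in the paper, the eigenvalue becomes a meaningful diagnostic of post-logging recovery.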

  19. A probit/log-skew-normal mixture model for repeated measures data with excess zeros, with application to a cohort study of paediatric respiratory symptoms

    Directory of Open Access Journals (Sweden)

    Johnston Neil W

    2010-06-01

    Full Text Available Abstract Background A zero-inflated continuous outcome is characterized by occurrence of "excess" zeros that more than a single distribution can explain, with the positive observations forming a skewed distribution. Mixture models are employed for regression analysis of zero-inflated data. Moreover, for repeated measures zero-inflated data the clustering structure should also be modeled for an adequate analysis. Methods Diary of Asthma and Viral Infections Study (DAVIS was a one year (2004 cohort study conducted at McMaster University to monitor viral infection and respiratory symptoms in children aged 5-11 years with and without asthma. Respiratory symptoms were recorded daily using either an Internet or paper-based diary. Changes in symptoms were assessed by study staff and led to collection of nasal fluid specimens for virological testing. The study objectives included investigating the response of respiratory symptoms to respiratory viral infection in children with and without asthma over a one year period. Due to sparse data daily respiratory symptom scores were aggregated into weekly average scores. More than 70% of the weekly average scores were zero, with the positive scores forming a skewed distribution. We propose a random effects probit/log-skew-normal mixture model to analyze the DAVIS data. The model parameters were estimated using a maximum marginal likelihood approach. A simulation study was conducted to assess the performance of the proposed mixture model if the underlying distribution of the positive response is different from log-skew normal. Results Viral infection status was highly significant in both probit and log-skew normal model components respectively. The probability of being symptom free was much lower for the week a child was viral positive relative to the week she/he was viral negative. The severity of the symptoms was also greater for the week a child was viral positive. The probability of being symptom free was
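The zero-inflated structure can be illustrated by simulation: a Bernoulli zero/positive part (standing in for the probit component) and a lognormal stand-in for the log-skew-normal positive part. The zero proportion is chosen to mimic the >70% zeros reported; nothing here is fitted to the DAVIS data:

```python
import numpy as np

# Simulate zero-inflated weekly symptom scores: a point mass at zero
# plus a skewed positive distribution (lognormal as a simple stand-in
# for the log-skew-normal used in the paper).
rng = np.random.default_rng(7)
n = 10000
p_zero = 0.7                                  # assumed zero-inflation level
positive = rng.lognormal(mean=0.0, sigma=0.5, size=n)
is_zero = rng.random(n) < p_zero
scores = np.where(is_zero, 0.0, positive)

frac_zero = (scores == 0).mean()
print(frac_zero, scores[scores > 0].mean())
```

A mixture-model fit would estimate the zero probability (probit part, possibly covariate-dependent, e.g. on viral status) jointly with the parameters of the positive component.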

  20. A sub-circuit MOSFET model with a wide temperature range including cryogenic temperature

    Energy Technology Data Exchange (ETDEWEB)

    Jia Kan; Sun Weifeng; Shi Longxing, E-mail: jiakan.01@gmail.com [National ASIC System Engineering Research Center, Southeast University, Nanjing 210096 (China)

    2011-06-15

    A sub-circuit SPICE model of a MOSFET for low temperature operation is presented. Two resistors are introduced for the freeze-out effect, and the explicit behavioral models are developed for them. The model can be used in a wide temperature range covering both cryogenic temperature and regular temperatures. (semiconductor devices)

  1. Modeling forces in high-temperature superconductors

    International Nuclear Information System (INIS)

    Turner, L. R.; Foster, M. W.

    1997-01-01

    We have developed a simple model that uses computed shielding currents to determine the forces acting on a high-temperature superconductor (HTS). The model has been applied to measurements of the force between HTS and permanent magnets (PM). Results show the expected hysteretic variation of force as the HTS moves first toward and then away from a permanent magnet, including the reversal of the sign of the force. Optimization of the shielding currents is carried out through a simulated annealing algorithm in a C++ program that repeatedly calls a commercial electromagnetic software code. Agreement with measured forces is encouraging
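A generic simulated-annealing loop of the kind described (here minimizing a stand-in quadratic misfit rather than calling an electromagnetic solver, so the "energy" and target values are purely illustrative) looks like:

```python
import math
import random

random.seed(1)

def energy(currents, target):
    """Stand-in objective: squared misfit between trial and target currents."""
    return sum((c - t) ** 2 for c, t in zip(currents, target))

target = [1.0, -2.0, 0.5]        # hypothetical optimal shielding currents
state = [0.0, 0.0, 0.0]
best = list(state)
T = 1.0                          # annealing "temperature"

for step in range(20000):
    # Perturb one randomly chosen current by a small random amount.
    i = random.randrange(len(state))
    candidate = list(state)
    candidate[i] += random.uniform(-0.1, 0.1)
    dE = energy(candidate, target) - energy(state, target)
    # Accept improvements always; accept worsenings with Boltzmann probability.
    if dE < 0 or random.random() < math.exp(-dE / T):
        state = candidate
        if energy(state, target) < energy(best, target):
            best = list(state)
    T = max(1e-4, T * 0.9995)    # geometric cooling schedule with a floor

print(best, energy(best, target))
```

In the paper's setup, the energy evaluation is replaced by a call to the electromagnetic code computing the field mismatch for the trial shielding currents, which is what makes annealing attractive: it needs only objective values, not gradients.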

  2. The influence of felling season and log-soaking temperature on the wetting and phenol formaldehyde adhesive bonding characteristics of birch veneer

    Science.gov (United States)

    Anti Rohumaa; Christopher G. Hunt; Charles R. Frihart; Pekka Saranpää; Martin Ohlmeyer; Mark Hughes

    2014-01-01

    Most adhesive studies employing wood veneer as the substrate assume that it is a relatively uniform material if wood species and veneer thickness are constant. In the present study, veneers from rotary cut birch (Betula pendula Roth) were produced from logs harvested in spring, autumn and winter, and soaked at 20°C and 70°C prior to peeling. Firstly...

  3. LogScope

    Science.gov (United States)

    Havelund, Klaus; Smith, Margaret H.; Barringer, Howard; Groce, Alex

    2012-01-01

LogScope is a software package for analyzing log files. The intended use is for offline post-processing of such logs, after the execution of the system under test. LogScope can, however, in principle also be used to monitor systems online during their execution. Logs are checked against requirements formulated as monitors expressed in a rule-based specification language. This language has similarities to a state machine language, but is more expressive, for example, in its handling of data parameters. The specification language is user friendly, simple, and yet expressive enough for many practical scenarios. The LogScope software was initially developed specifically to assist in testing JPL's Mars Science Laboratory (MSL) flight software, but it is very generic in nature and can be applied to any application that produces some form of logging information (which almost any software does).
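As a toy illustration of rule-based log checking in the spirit of LogScope (this is not LogScope's actual specification language or API), a monitor can scan a command/success event log for commands that never complete:

```python
# Rule: every ("COMMAND", name) event must eventually be followed by a
# ("SUCCESS", name) event with the same name.
def check_commands_succeed(events):
    pending = set()
    for kind, name in events:
        if kind == "COMMAND":
            pending.add(name)
        elif kind == "SUCCESS":
            pending.discard(name)
    return sorted(pending)   # commands that never succeeded

log = [("COMMAND", "TURN"), ("SUCCESS", "TURN"),
       ("COMMAND", "DRIVE"), ("COMMAND", "STOP"), ("SUCCESS", "STOP")]
print(check_commands_succeed(log))
```

A real specification language like LogScope's generalizes this to parameterized state machines, so that rules over data-carrying events can be written declaratively rather than hand-coded per rule.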

  4. Alloy model for high temperature superconductors

    International Nuclear Information System (INIS)

    Weissmann, M.; Saul, A.

    1991-07-01

An alloy model is proposed for the electronic structure of high temperature superconductors. It is based on the assumption that holes and extra electrons are localized in small copper-oxygen clusters, which would be the components of such an alloy. This model, when used together with quantum chemical calculations on small clusters, can explain the structure observed in the experimental densities of states of both hole and electron superconductors close to the Fermi energy. The main point is the strong dependence of the energy level distribution and composition on the number of electrons in a cluster. The alloy model also suggests a way to correlate Tc with the number of holes, or extra electrons, and the number of adequate clusters to locate them. (author). 21 refs, 4 figs, 1 tab

  5. 40 CFR 146.66 - Logging, sampling, and testing prior to new well operation.

    Science.gov (United States)

    2010-07-01

    ... cement bond and variable density log, and a temperature log after the casing is set and cemented. (ii..., gamma ray, and fracture finder logs before the casing is installed; and (B) A cement bond and variable density log, and a temperature log after the casing is set and cemented. (iii) The Director may allow the...

  6. California-Nevada uranium logging. Final report

    International Nuclear Information System (INIS)

    1981-04-01

    The purpose of this project was to obtain geophysical logs of industry drill holes to assess the uranium resource potential of geologic formations of interest. The work was part of the US Department of Energy's National Uranium Resource Evaluation (NURE) Program. The principal objective of the logging program was to determine radioelement grade of formations through natural gamma ray detectors. Supplementary information was obtained from resistivity (R), self-potential (SP), point resistance (RE), and neutron density (NN) logs for formation interpretation. Additional data for log interpretation was obtained from caliper logs, casing schedules, and downhole temperature. This data was obtained from well operators when available, with new logs obtained where not formerly available. This report contains a summary of the project and data obtained to date

  7. Querying Workflow Logs

    Directory of Open Access Journals (Sweden)

    Yan Tang

    2018-01-01

Full Text Available A business process or workflow is an assembly of tasks that accomplishes a business goal. Business process management is the study of the design, configuration/implementation, enactment and monitoring, analysis, and re-design of workflows. The traditional methodology for the re-design and improvement of workflows relies on the well-known sequence of extract, transform, and load (ETL), data/process warehousing, and online analytical processing (OLAP) tools. In this paper, we study the ad hoc querying of process enactments for (data-centric) business processes, bypassing the traditional methodology for more flexibility in querying. We develop an algebraic query language based on "incident patterns" with four operators inspired by the Business Process Model and Notation (BPMN) representation, allowing the user to formulate ad hoc queries directly over workflow logs. A formal semantics of this query language, a preliminary query evaluation algorithm, and a group of elementary properties of the operators are provided.

  8. Processing well logging data, for example for verification and calibration of well logs

    International Nuclear Information System (INIS)

    Suau, J.; Boutemy, Y.

    1981-01-01

    A method is described of machine processing well logging data derived from borehole exploring devices which investigate earth formations traversed by boreholes. The method can be used for verifying and recalibrating logs, reconstructing missing logs and combining the data to form a statistical model of the traversed earth formations. (U.K.)

  9. Temperature Buffer Test. Final THM modelling

    International Nuclear Information System (INIS)

    Aakesson, Mattias; Malmberg, Daniel; Boergesson, Lennart; Hernelind, Jan; Ledesma, Alberto; Jacinto, Abel

    2012-01-01

The Temperature Buffer Test (TBT) is a joint project between SKB/ANDRA and supported by ENRESA (modelling) and DBE (instrumentation), which aims at improving the understanding and modelling of the thermo-hydro-mechanical behavior of buffers made of swelling clay submitted to high temperatures (over 100 deg C) during the water saturation process. The test has been carried out in a KBS-3 deposition hole at Aespoe HRL. It was installed during the spring of 2003. Two heaters (3 m long, 0.6 m diameter) and two buffer arrangements have been investigated: the lower heater was surrounded by bentonite only, whereas the upper heater was surrounded by a composite barrier, with a sand shield between the heater and the bentonite. The test was dismantled and sampled during the winter of 2009/2010. This report presents the final THM modelling which was resumed subsequent to the dismantling operation. The main part of this work has been numerical modelling of the field test. Three different modelling teams have presented several model cases for different geometries and different degrees of process complexity. Two different numerical codes, Code_Bright and Abaqus, have been used. The modelling performed by UPC-Cimne using Code_Bright has been divided in three subtasks: i) analysis of the response observed in the lower part of the test, by inclusion of a number of considerations: (a) the use of the Barcelona Expansive Model for MX-80 bentonite; (b) updated parameters in the vapour diffusive flow term; (c) the use of a non-conventional water retention curve for MX-80 at high temperature; ii) assessment of a possible relation between the cracks observed in the bentonite blocks in the upper part of TBT, and the cycles of suction and stresses registered in that zone at the start of the experiment; and iii) analysis of the performance, observations and interpretation of the entire test. It was however not possible to carry out a full THM analysis until the end of the test due to...

  11. Analytic Models of High-Temperature Hohlraums

    International Nuclear Information System (INIS)

    Stygar, W.A.; Olson, R.E.; Spielman, R.B.; Leeper, R.J.

    2000-01-01

A unified set of high-temperature-hohlraum models has been developed. For a simple hohlraum, P_S = (A_S + (1 − α_W)A_W + A_H)σT_R^4 + (4Vσ/c)(dT_R^4/dt), where P_S is the total power radiated by the source, A_S is the source area, A_W is the area of the cavity wall excluding the source and holes in the wall, A_H is the area of the holes, σ is the Stefan-Boltzmann constant, T_R is the radiation brightness temperature, V is the hohlraum volume, and c is the speed of light. The wall albedo α_W ≡ (T_W/T_R)^4, where T_W is the brightness temperature of area A_W. The net power radiated by the source is P_N = P_S − A_S σT_R^4, which suggests that for laser-driven hohlraums the conversion efficiency η_CE be defined as P_N/P_LASER. The characteristic time required to change T_R^4 in response to a change in P_N is 4V/[c((1 − α_W)A_W + A_H)]. Using this model, T_R, α_W, and η_CE can be expressed in terms of quantities directly measurable in a hohlraum experiment. For a steady-state hohlraum that encloses a convex capsule, P_N = {(1 − α_W)A_W + A_H + (1 − α_C)(A_S + A_W α_W)A_C/A_T}σT_RC^4, where α_C is the capsule albedo, A_C is the capsule area, A_T ≡ (A_S + A_W + A_H), and T_RC is the brightness temperature of the radiation that drives the capsule. According to this relation, the capsule-coupling efficiency of the baseline National-Ignition-Facility (NIF) hohlraum is 15% higher than predicted by previous analytic expressions. A model of a hohlraum that encloses a z pinch is also presented
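At steady state (dT_R^4/dt = 0) the source power balance P_S = (A_S + (1 − α_W)A_W + A_H)σT_R^4 can be inverted for the radiation brightness temperature. A small sketch with purely illustrative numbers (not actual NIF parameters):

```python
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def brightness_temperature(P_S, A_S, A_W, A_H, albedo_W):
    """Solve P_S = (A_S + (1 - albedo_W) A_W + A_H) sigma T_R^4 for T_R (K)."""
    denom = SIGMA * (A_S + (1.0 - albedo_W) * A_W + A_H)
    return (P_S / denom) ** 0.25

# e.g. a 100 TW source, cm^2-scale areas (in m^2), wall albedo 0.8 (all assumed)
T_R = brightness_temperature(P_S=1.0e14, A_S=1.0e-4, A_W=30.0e-4,
                             A_H=2.0e-4, albedo_W=0.8)
print(f"T_R = {T_R:.3e} K ({T_R / 11604.5:.0f} eV)")
```

The same balance, read the other way, lets the wall albedo be inferred from measured P_S and T_R, which is the point the abstract makes about expressing α_W in terms of measurable quantities.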

  12. Fuzzy model to predict porosity through conventional well logs; Modelo Fuzzy para predicao de porosidade via perfis convencionais de poco

    Energy Technology Data Exchange (ETDEWEB)

    Mimbela, Renzo R.F.; Silva, Jadir C. [Universidade Estadual do Norte Fluminense (UENF), Macae, RJ (Brazil). Lab. de Engenharia e Exploracao do Petroleo (LENEP)

    2004-07-01

Well logs have great applicability in the search for and evaluation of hydrocarbons. In this work we calculate porosities of the Namorado field with the help of a fuzzy-rule approach. This is done by jointly segmenting the neutron (φ_N) and density (φ_D) porosity logs into groups with a better internal linear relation. The grouping is processed keeping the best number of groups, which is efficiently chosen by a criterion related to the minimum value of a fuzzy-validity measure. As a first step, we choose the φ_N and φ_D values only at those depths where cores exist. To prevent picking measurement errors, a prior data filtering is performed by selecting only the values whose discrepancy with core porosity (φ_C) is at most around 5 p.u. (porosity units). A conventional average porosity φ_MED, mixing φ_N and φ_D, is calculated at each point, accounting for its own lithological and fluid characteristics. Finally, an inversion algorithm is applied to find the best linear fits of φ_C vs. φ_MED, φ_C vs. φ_D and φ_C vs. φ_N, and at the same time to determine the values of the constants to be extrapolated in order to calculate the porosity of the whole field. (author)
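The clustering step can be sketched with a compact fuzzy c-means implementation on synthetic neutron/density porosity pairs (made-up values in porosity units, not the Namorado data):

```python
import numpy as np

# Synthetic (phi_N, phi_D) pairs: two hypothetical porosity groups, in p.u.
rng = np.random.default_rng(0)
a = rng.normal([10, 12], 1.0, size=(50, 2))   # low-porosity group
b = rng.normal([25, 22], 1.0, size=(50, 2))   # high-porosity group
X = np.vstack([a, b])

def fuzzy_cmeans(X, c=2, m=2.0, iters=100):
    """Standard fuzzy c-means: alternate center and membership updates."""
    n = len(X)
    U = rng.random((c, n))
    U /= U.sum(axis=0)                         # memberships sum to 1 per point
    for _ in range(iters):
        W = U ** m
        centers = (W @ X) / W.sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=0)
    return centers, U

centers, U = fuzzy_cmeans(X)
print(np.sort(centers[:, 0]))
```

In the workflow described above, a fuzzy-validity index would be evaluated over several candidate cluster counts `c` to pick the best number of groups before fitting the per-group linear porosity relations.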

  13. Pulsed neutron generator for logging

    International Nuclear Information System (INIS)

    Thibideau, F.D.

    1977-01-01

A pulsed neutron generator for uranium logging is described. This generator is one component of a prototype uranium logging probe which is being developed by SLA to detect, and assay, uranium by borehole logging. The logging method is based on the measurement of epithermal neutrons resulting from the prompt fissioning of uranium from a pulsed source of 17.6 MeV neutrons. An objective of the prototype probe was that its diameter not exceed 2.75 inches, which would allow its use in conventional rotary drill holes of 4.75-inch diameter. This restriction limited the generator to a maximum 2.375-inch diameter. The performance requirements for the neutron generator specified that it operate with a nominal output of 5 × 10^6 neutrons/pulse at up to 100 pulses/second for a one-hour period. The development of a neutron generator meeting the preliminary design goals was completed and two prototype models were delivered to SLA. These two generators have been used by SLA to log a number of boreholes in field evaluation of the probe. The results of the field evaluations have led to the recommendation of several changes to improve the probe's operation. Some of these changes will require additional development effort on the neutron generator. It is expected that this work will be performed during 1977. The design and operation of the first prototype neutron generators is described

  14. Integrated flow and temperature modeling at the catchment scale

    DEFF Research Database (Denmark)

    Loinaz, Maria Christina; Davidsen, Hasse Kampp; Butts, Michael

    2013-01-01

    –groundwater dynamics affect stream temperature. A coupled surface water–groundwater and temperature model has therefore been developed to quantify the impacts of land management and water use on stream flow and temperatures. The model is applied to the simulation of stream temperature levels in a spring-fed stream...

  15. Radiometric well logging instruments

    International Nuclear Information System (INIS)

    Davydov, A.V.

    1975-01-01

    The technical properties of well instruments for radioactive logging used in the radiometric logging complexes PKS-1000-1 (''Sond-1'') and PRKS-2 (''Vitok-2'') are described. The main features of the electric circuit of the measuring channels are given

  16. Power to the logs!

    CERN Multimedia

    CERN. Geneva; MACMAHON, Joseph

    2015-01-01

    Are you tired of using grep, vi and emacs to read your logs? Do you feel like you’re missing the big picture? Does the word "statistics" put a smile on your face? Then it’s time to give power to the logs!

  17. An investigation into the reduction of log-layer mismatch in wall-modeled LES with a hybrid RANS/LES approach

    Science.gov (United States)

    Balin, Riccardo; Spalart, Philippe R.; Jansen, Kenneth E.

    2017-11-01

    Hybrid RANS/LES modeling approaches used in the context of wall-modeled LES (WMLES) of channel flows and boundary layers often suffer from a mismatch in the RANS and LES log-layer intercepts of the mean velocity profile. In the vicinity of the interface between the RANS and LES regions, the mean velocity gradient is too steep, causing a departure from the log law, an over-prediction of the velocity in the outer layer, and an under-prediction of the skin friction. This steep gradient is attributed to inadequate modeled Reynolds stresses in the upper portion of the RANS layer and at the interface. Channel flow computations were carried out with the IDDES approach of Shur et al. in WMLES mode based on the Spalart-Allmaras RANS model. This talk investigates the robustness of this approach for unstructured grids and explores the changes required for grids where insufficient elevation of the Reynolds stresses is observed. Awards of computer time were provided by Innovative and Novel Computational Impact on Theory and Experiment (INCITE) and Early Science programs. Resources of the Argonne Leadership Computing Facility, a DOE Office of Science User Facility, were used.
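The log-layer intercept mismatch the abstract refers to can be quantified against the standard log law u+ = (1/κ) ln y+ + B. The sketch below uses the conventional constants κ = 0.41 and B = 5.2 (not values from the paper) and estimates the intercept of a given profile assuming the 1/κ slope holds; a shift of this intercept relative to the RANS value is the mismatch:

```python
import numpy as np

KAPPA, B = 0.41, 5.2  # conventional log-law constants (assumed, not from the paper)

def log_law(y_plus):
    """Mean velocity u+ from the classical log law of the wall."""
    return np.log(y_plus) / KAPPA + B

def log_layer_intercept(y_plus, u_plus):
    """Estimate the additive intercept of a velocity profile, assuming
    the 1/kappa slope; comparing RANS- and LES-side estimates exposes
    a log-layer mismatch."""
    return float(np.mean(u_plus - np.log(y_plus) / KAPPA))
```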

  18. Modeling the wafer temperature profile in a multiwafer LPCVD furnace

    Energy Technology Data Exchange (ETDEWEB)

    Badgwell, T.A. [Rice Univ., Houston, TX (United States). Dept. of Chemical Engineering; Trachtenberg, I.; Edgar, T.F. [Univ. of Texas, Austin, TX (United States). Dept. of Chemical Engineering

    1994-01-01

    A mathematical model has been developed to predict wafer temperatures within a hot-wall multiwafer low pressure chemical vapor deposition (LPCVD) reactor. The model predicts both axial (wafer-to-wafer) and radial (across-wafer) temperature profiles. Model predictions compare favorably with in situ wafer temperature measurements described in an earlier paper. Measured axial and radial temperature nonuniformities are explained in terms of radiative heat-transfer effects. A simulation study demonstrates how changes in the outer tube temperature profile and reactor geometry affect wafer temperatures. Reactor design changes which could improve the wafer temperature profile are discussed.

  19. Digital archive of drilling mud weight pressures and wellbore temperatures from 49 regional cross sections of 967 well logs in Louisiana and Texas, onshore Gulf of Mexico basin

    Science.gov (United States)

    Burke, Lauri A.; Kinney, Scott A.; Kola-Kehinde, Temidayo B.

    2011-01-01

    This document provides the digital archive of in-situ temperature and drilling mud weight pressure data that were compiled from several historical sources. The data coverage includes the states of Texas and Louisiana in the Gulf of Mexico basin. Data are also provided graphically, for both Texas and Louisiana, as plots of temperature as a function of depth and pressure as a function of depth. The minimum, arithmetic average, and maximum values are tabulated for each 1,000-foot depth increment for temperature as well as pressure in the Texas and Louisiana data.
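The tabulation described (minimum, arithmetic average, and maximum per 1,000-foot depth increment) can be sketched as follows; `depth_increment_stats` is a hypothetical helper name and the records are illustrative:

```python
from collections import defaultdict

def depth_increment_stats(records, bin_ft=1000):
    """Group (depth_ft, value) pairs into depth increments of bin_ft feet
    and return {bin_start_ft: (min, arithmetic average, max)}."""
    bins = defaultdict(list)
    for depth, value in records:
        bins[int(depth // bin_ft) * bin_ft].append(value)
    return {b: (min(v), sum(v) / len(v), max(v)) for b, v in bins.items()}
```

The same routine applies unchanged to either the temperature or the pressure column of the archive.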

  20. 3D Reservoir Modeling of Semutang Gas Field: A lonely Gas field in Chittagong-Tripura Fold Belt, with Integrated Well Log, 2D Seismic Reflectivity and Attributes.

    Science.gov (United States)

    Salehin, Z.; Woobaidullah, A. S. M.; Snigdha, S. S.

    2015-12-01

    The Bengal Basin, with its prolific gas-rich province, provides needed energy to Bangladesh, and the present energy situation demands more hydrocarbon exploration. Only 'Semutang' has been discovered in the high-amplitude structures, while the rest lie in the gentle to moderate structures of the western part of the Chittagong-Tripura Fold Belt. It has some major thrust faults, however, which have strongly breached the reservoir zone. The major objectives of this research are the interpretation of gas horizons and faults, followed by velocity modeling and structural and property modeling to obtain reservoir properties. The faults and reservoir heterogeneities need to be properly identified. 3D modeling is widely used to reveal the subsurface structure in faulted zones where planning and development drilling are a major challenge. Thirteen 2D seismic lines and six well logs have been used to identify six gas-bearing horizons and a network of faults, and to map the structure at reservoir level. Variance attributes were used to identify faults. A velocity model was built for domain conversion. Synthetics were prepared from the two wells where sonic and density logs are available. The well-to-seismic tie at the reservoir zone shows a good match with the Direct Hydrocarbon Indicator on the seismic section. Vsh, porosity, water saturation and permeability have been calculated, and various cross-plots among the porosity logs are shown. Structural modeling was used to build zones and layering in accordance with the minimum sand thickness. The fault model shows the possible fault network responsible for several dry wells. Facies models have been constrained with the Sequential Indicator Simulation method to show the facies distribution along the depth surfaces. Petrophysical models have been prepared with Sequential Gaussian Simulation to estimate petrophysical parameters away from the existing wells in other parts of the field and to observe heterogeneities in the reservoir. An average porosity map was constructed for each gas zone.
The outcomes of the research

  1. Digital mineral logging system

    International Nuclear Information System (INIS)

    West, J.B.

    1980-01-01

    A digital mineral logging system acquires data from a mineral logging tool passing through a borehole and transmits the data uphole to an electronic digital signal processor. A predetermined combination of sensors, including a deviometer, is located in the logging tool for the acquisition of the desired data as the logging tool is raised from the borehole. Sensor data in analog format are converted in the logging tool to a digital format and periodically batch-transmitted to the surface at a predetermined sampling rate. An identification code is provided for each mineral logging tool, and the code is transmitted to the surface along with the sensor data. The self-identifying tool code is transmitted to the digital signal processor, which checks it against a stored list of the range of numbers assigned to that type of tool. The data are transmitted up the d-c power lines of the tool by a frequency-shift-keying transmission technique. At the surface, a frequency-shift-keying demodulation unit transmits the decoupled data to an asynchronous receiver interfaced to the electronic digital signal processor. During a recording phase, the signals from the logging tool are read by the electronic digital signal processor and stored for later processing. During a calculating phase, the stored data are processed by the digital signal processor and the results are output to a printer or plotter, or both
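The self-identifying tool-code check against a stored list of assigned ID ranges might look like the following sketch; the tool types and number ranges here are invented for illustration:

```python
# Hypothetical assignment of ID ranges to tool types; the real list
# would be the stored table the patent describes.
TOOL_ID_RANGES = {
    "deviometer": range(1000, 2000),
    "gamma": range(2000, 3000),
}

def identify_tool(tool_id):
    """Return the tool type whose assigned ID range contains tool_id,
    mirroring the check of the transmitted code against the stored list."""
    for tool_type, ids in TOOL_ID_RANGES.items():
        if tool_id in ids:
            return tool_type
    raise ValueError(f"unknown tool id {tool_id}")
```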

  2. Subsurface temperature of the onshore Netherlands: new temperature dataset and modelling

    NARCIS (Netherlands)

    Bonté, D.; Wees, J.-D. van; Verweij, J.M.

    2012-01-01

    Subsurface temperature is a key parameter for geothermal energy prospection in sedimentary basins. Here, we present the results of a 3D temperature modelling using a thermal-tectonic forward modelling method, calibrated with subsurface temperature measurements in the Netherlands. The first step

  3. Modelling global fresh surface water temperature

    NARCIS (Netherlands)

    Beek, L.P.H. van; Eikelboom, T.; Vliet, M.T.H. van; Bierkens, M.F.P.

    2011-01-01

    Temperature directly determines a range of water physical properties including vapour pressure, surface tension, density and viscosity, and the solubility of oxygen and other gases. Indirectly water temperature acts as a strong control on fresh water biogeochemistry, influencing sediment

  4. Mariners Weather Log

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Mariners Weather Log (MWL) is a publication containing articles, news and information about marine weather events and phenomena, worldwide environmental impact...

  5. Log-inject-log in sand consolidation

    International Nuclear Information System (INIS)

    Murphy, R.P.; Spurlock, J.W.

    1977-01-01

    A method is described for gathering information for the determination of the adequacy of placement of sand consolidating plastic for sand control in oil and gas wells. The method uses a high neutron cross-section tracer which becomes part of the plastic and uses pulsed neutron logging before and after injection of the plastic. Preferably, the method uses lithium, boron, indium, and/or cadmium tracers. Boron oxide is especially useful and can be dissolved in alcohol and mixed with the plastic ingredients

  6. Elephant logging and environment

    International Nuclear Information System (INIS)

    Tin-Aung-Hla

    1995-01-01

    The natural environment comprises non-biological elements such as air, water, light and heat, and biological elements of animal and plant life; all interact with each other to create an ecosystem. Human activities like the over-exploitation of forests result in deforestation and desertification, which consequently change the ecological balance. Topics on: (1) history of elephant utilization; (2) elephant logging; (3) classification of elephants; (4) dragging gear; (5) elephant power; (6) elephant logging and the environment, are discussed

  7. A temperature dependent slip factor based thermal model for friction

    Indian Academy of Sciences (India)

    This paper proposes a new slip factor based three-dimensional thermal model to predict the temperature distribution during friction stir welding of 304L stainless steel plates. The proposed model employs temperature and radius dependent heat source to study the thermal cycle, temperature distribution, power required, the ...

  8. Modeling of AlMg Sheet Forming at Elevated Temperatures

    NARCIS (Netherlands)

    van den Boogaard, Antonius H.; Bolt, P.; Werkhoven, R.

    2001-01-01

    The process limits of aluminum sheet forming processes can be improved by control-ling local flow behavior by means of elevated temperatures and temperature gradients. In order to accurately model the deep drawing or stretching of aluminum sheet at elevated temperatures, a model is required that

  9. Modeling shoot-tip temperature in the greenhouse environment

    International Nuclear Information System (INIS)

    Faust, J.E.; Heins, R.D.

    1998-01-01

    An energy-balance model is described that predicts vinca (Catharanthus roseus L.) shoot-tip temperature using four environmental measurements: solar radiation and dry bulb, wet bulb, and glazing material temperature. The time and magnitude of the differences between shoot-tip and air temperature were determined in greenhouses maintained at air temperatures of 15, 20, 25, 30, or 35 °C. At night, shoot-tip temperature was always below air temperature. Shoot-tip temperature decreased from 0.5 to 5 °C below air temperature as greenhouse glass temperature decreased from 2 to 15 °C below air temperature. During the photoperiod under low vapor-pressure deficit (VPD) and low air temperature, shoot-tip temperature increased ≈4 °C as solar radiation increased from 0 to 600 W·m^-2. Under high VPD and high air temperature, shoot-tip temperature initially decreased 1 to 2 °C at sunrise, then increased later in the morning as solar radiation increased. The model predicted shoot-tip temperatures within ±1 °C for 81% of the observed 1-hour average shoot-tip temperatures. The model was used to simulate shoot-tip temperatures under different VPD, solar radiation, and air temperatures. Since the rates of leaf and flower development are influenced by the temperature of the meristematic tissues, a model of shoot-tip temperature will be a valuable tool to predict plant development in greenhouses and to control the greenhouse environment based on a plant-temperature setpoint. (author)

  10. Dynamic Model of the High Temperature Proton Exchange Membrane Fuel Cell Stack Temperature

    DEFF Research Database (Denmark)

    Andreasen, Søren Juhl; Kær, Søren Knudsen

    2009-01-01

    The present work involves the development of a model for predicting the dynamic temperature of a high temperature proton exchange membrane (HTPEM) fuel cell stack. The model is developed to test different thermal control strategies before implementing them in the actual system. The test system co...... elements for start-up, heat conduction through stack insulation, cathode air convection, and heating of the inlet gases in the manifold. Various measurements are presented to validate the model predictions of the stack temperatures....

  11. Modelling of temperature distribution and temperature pulsations in elements of fast breeder reactor

    International Nuclear Information System (INIS)

    Sorokin, A.P.; Bogoslovskaia, G.P.; Ushakov, P.A.; Zhukov, A.V.; Ivanov, Eu.F.; Matjukhin, N.M.

    2004-01-01

    From a thermophysical point of view, the integrated configuration of a liquid metal cooled reactor has some limitations. The large volume of the mixing chamber causes complex behavior of the thermal hydraulic characteristics in such facilities. This volume is also responsible for large-scale eddies in the coolant, the existence of stagnant areas and flow stratification, and the occurrence of temperature non-uniformities and pulsations of the coolant and structure temperatures. Temperature non-uniformities and temperature pulsations depend heavily on even small variations in reactor core design. The paper presents some results on the modeling of thermal hydraulic processes occurring in a liquid metal cooled reactor. The behavior of the following parameters is discussed: temperature non-uniformities at the core outlet and the related temperature pulsations; temperature pulsations due to the mixing of sodium jets at different temperatures; temperature pulsations arising if a part of a loop (circuit) is shut off; and temperature non-uniformities and pulsations during transients and during the transition to natural convection cooling. The paper also considers the modeling of temperature behavior in a compact arrangement of fast reactor fuel pins using water as the modeling liquid. A further discussion concerns an experimental method for modeling liquid metal mixing with the use of air, based on a freon tracer technique. The results of the simulations of the thermal hydraulic processes mentioned above have been analyzed, which will allow the main lines of the study to be determined and conclusions to be drawn regarding the temperature behavior in fast reactor units. (author)

  12. Low-temperature plasma modelling and simulation

    NARCIS (Netherlands)

    Dijk, van J.

    2011-01-01

    Since its inception in the beginning of the twentieth century, low-temperature plasma science has become a major field of science. Low-temperature plasma sources and gas discharges are found in domestic, industrial, atmospheric and extra-terrestrial settings. Examples of domestic discharges are those

  13. Uranium logging in earth formations

    International Nuclear Information System (INIS)

    Givens, W.W.

    1979-01-01

    A technique is provided for assaying the formations surrounding a borehole for uranium. A borehole logging tool cyclically irradiates the formations with neutrons and responds to the neutron fluxes produced during the period of time in which prompt neutrons are being produced by the neutron fission of uranium in the formations. A borehole calibration tool employs a steady-state (continuous output) neutron source, firstly, to produce a response to neutron fluxes in models having known concentrations of uranium and, secondly, to produce a response to neutron fluxes in the formations surrounding the borehole. The neutron flux responses of the borehole calibration tool in both the models and the formations surrounding the borehole are utilized to correct the neutron flux response of the borehole logging tool for the effects of epithermal/thermal neutron moderation, scattering, and absorption within the borehole itself

  14. Temperature bounds in a model of laminar flames

    International Nuclear Information System (INIS)

    Kirane, M.; Badraoui, S.

    1994-06-01

    We consider reaction-diffusion equations coupling temperature and mass fraction in a one-step-reaction model of combustion in R^N. Uniform temperature bounds are derived when the Lewis number is less than one. This result completes the case of a Lewis number greater than one studied by J.D. Avrin and M. Kirane, ''Temperature growth and temperature bounds in special cases of combustion models'' (to appear in Applicable Analysis). (author). 5 refs
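The abstract does not state the equations explicitly; a standard one-step-reaction thermo-diffusive system of the kind described (an assumed generic form, not necessarily the authors' exact one) is

```latex
\begin{aligned}
\partial_t T &= \Delta T + Y\, f(T), \\
\partial_t Y &= \frac{1}{\mathrm{Le}}\,\Delta Y - Y\, f(T), \qquad x \in \mathbb{R}^N,
\end{aligned}
```

where T is the temperature, Y the mass fraction, Le the Lewis number, and f an Arrhenius-type reaction rate; the result concerns uniform bounds on T when Le < 1.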

  15. Grid for the calculation of log g and T/sub eff/ from Stroemgren four-color indices

    International Nuclear Information System (INIS)

    Philip, A.G.D.; Relyea, L.J.

    1979-01-01

    Large-scale grids of the four-color indices (c/sub 1/)/sub 0/, (b-y)/sub 0/ and log g, T/sub eff/ are presented for use in computing log g and T/sub eff/ values derived from the dereddened Stroemgren four-color indices of Population I and II stars in the ranges 10,000>T/sub eff/>5,500, 4.5>log g>2.0 and -2.0 to 0.0. Log g values have a rms error of ±0.2. Relative to the latest Kurucz (1979) models, the zero point of the log g values for F-type stars now agrees with the models. Colors predicted from temperatures are within 0/sup m/.01 of those predicted by the older models
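Reading log g and T/sub eff/ off such a grid amounts to two-dimensional interpolation in the dereddened indices. A minimal bilinear-interpolation sketch (an illustrative technique, not the authors' procedure) is:

```python
import numpy as np

def bilinear(x, y, xs, ys, table):
    """Bilinearly interpolate table[i, j] defined on ascending grid points
    xs[i], ys[j]; here x, y would be the dereddened indices and the table
    would hold log g or T_eff values."""
    i = min(max(int(np.searchsorted(xs, x)) - 1, 0), len(xs) - 2)
    j = min(max(int(np.searchsorted(ys, y)) - 1, 0), len(ys) - 2)
    tx = (x - xs[i]) / (xs[i + 1] - xs[i])
    ty = (y - ys[j]) / (ys[j + 1] - ys[j])
    return ((1 - tx) * (1 - ty) * table[i, j] + tx * (1 - ty) * table[i + 1, j]
            + (1 - tx) * ty * table[i, j + 1] + tx * ty * table[i + 1, j + 1])
```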

  16. Minimum dimensions of rock models for calibration of radiometric probes for the neutron-gamma well logging

    International Nuclear Information System (INIS)

    Czubek, J.A.; Lenda, A.

    1979-01-01

    The minimum dimensions have been calculated that assure 91, 96 and 98 % of the probe response with respect to an infinite medium. The models are of cylindrical form, with the probe (source-to-detector distance equal to 60 or 90 cm) placed on the model axis, symmetrically with respect to the two end-faces. All the models are ''embedded'' in various media, such as: air, sand of 40 % porosity completely saturated with water, sand of 30 % porosity and moisture content equal to 10 %, and water. The models are of three types of material: sandstone, limestone and dolomite, with various porosities ranging from 0 to 100 %. The probe response is due to gamma rays arising from the radiative capture of thermal neutrons. The calculations were carried out for the highest-energy line of the gamma rays arising in the given lithology. The gamma-ray flux from neutron radiative capture has been calculated versus rock porosity and model dimensions, with the radiation migration lengths determined for the given lithologies. The minimum dimensions of the cylindrical models are given as functions of: porosity, probe length (source-to-detector distance), lithology of the model, and type of medium surrounding the model. (author)

  17. Replenishment policy for Entropic Order Quantity (EnOQ model with two component demand and partial back-logging under inflation

    Directory of Open Access Journals (Sweden)

    Bhanupriya Dash

    2017-09-01

    Full Text Available Background: The replenishment policy for an entropic order quantity (EnOQ) model with two-component demand and partial backlogging under inflation is an important subject in stock management. Methods: In this paper an inventory model is developed for non-instantaneous deteriorating items with a stock-dependent consumption rate and partial backlogging, including the effects of inflation and the time value of money on the replacement policy with zero lead time. A profit-maximization model is formulated by considering the effects of partial backlogging under inflation with cash discounts. A numerical example is presented to evaluate the relative performance of the entropic order quantity and EOQ models separately, to demonstrate the developed model, and to illustrate the procedure. Lingo 13.0 software was used to derive the optimal order quantity and total inventory cost. Finally, a sensitivity analysis of the optimal solution with respect to different parameters of the system is carried out. Results and conclusions: The obtained inventory model is very useful in retail business. This model can be extended to total backordering.
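For orientation, the classical EOQ model that the paper compares against has the closed form Q* = sqrt(2DK/h). The sketch below is this baseline only; the EnOQ adds entropy-cost, backlogging and inflation terms that are omitted here:

```python
import math

def eoq(demand_rate, order_cost, holding_cost):
    """Classical economic order quantity Q* = sqrt(2 D K / h), where D is
    the demand rate, K the fixed cost per order and h the holding cost
    per unit per period. Baseline only; the paper's EnOQ model extends it."""
    return math.sqrt(2.0 * demand_rate * order_cost / holding_cost)
```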

  18. Log4J

    CERN Document Server

    Perry, Steven

    2009-01-01

    Log4j has been around for a while now, and it seems like so many applications use it. I've used it in my applications for years now, and I'll bet you have too. But every time I need to do something with log4j I've never done before I find myself searching for examples of how to do whatever that is, and I don't usually have much luck. I believe the reason for this is that there is not a great deal of useful information about log4j, either in print or on the Internet. The information is too simple to be of real-world use, too complicated to be distilled quickly (which is what most developers

  19. A material model for aluminium sheet forming at elevated temperatures

    NARCIS (Netherlands)

    van den Boogaard, Antonius H.; Werkhoven, R.J.; Bolt, P.J.

    2001-01-01

    In order to accurately simulate the deep drawing or stretching of aluminum sheet at elevated temperatures, a model is required that incorporates the temperature and strain-rate dependency of the material. In this paper two models are compared: a phenomenological material model in which the

  20. Mechanics of log calibration

    International Nuclear Information System (INIS)

    Waller, W.C.; Cram, M.E.; Hall, J.E.

    1975-01-01

    For any measurement to have meaning, it must be related to generally accepted standard units by a valid and specified system of comparison. To calibrate well-logging tools, sensing systems are designed which produce consistent and repeatable indications over the range for which the tool was intended. The basics of calibration theory, procedures, and calibration record presentations are reviewed. Calibrations for induction, electrical, radioactivity, and sonic logging tools will be discussed. The authors' intent is to provide an understanding of the sources of errors, of the way errors are minimized in the calibration process, and of the significance of changes in recorded calibration data

  1. Site-response Estimation by 1D Heterogeneous Velocity Model using Borehole Log and its Relationship to Damping Factor

    International Nuclear Information System (INIS)

    Sato, Hiroaki

    2014-01-01

    In the Niigata area, which suffered several large earthquakes such as the 2007 Chuetsu-oki earthquake, geophysical observation that elucidates the S-wave structure of the underground is advancing. Modeling of the S-wave velocity structure in the subsurface is underway to enable simulation of long-period ground motion. The one-dimensional velocity model obtained by inverse analysis of micro-tremors is sufficiently appropriate for the long-period site response but not for the short-period response, which is important for ground motion evaluation at NPP sites. The high-frequency site responses may be controlled by the strength of the heterogeneity of the underground structure, because the heterogeneity of the 1D model plays an important role in estimating high-frequency site responses and is strongly related to the damping factor of the 1D layered velocity model. (author)

  2. A Temperature-Dependent Hysteresis Model for Relaxor Ferroelectric Compounds

    National Research Council Canada - National Science Library

    Raye, Julie K; Smith, Ralph C

    2004-01-01

    This paper summarizes the development of a homogenized free energy model which characterizes the temperature-dependent hysteresis and constitutive nonlinearities inherent to relaxor ferroelectric materials...

  3. Logging Concessions Enable Illegal Logging Crisis in the Peruvian Amazon

    OpenAIRE

    Finer, Matt; Jenkins, Clinton N.; Sky, Melissa A. Blue; Pine, Justin

    2014-01-01

    The Peruvian Amazon is an important arena in global efforts to promote sustainable logging in the tropics. Despite recent efforts to achieve sustainability, such as provisions in the US-Peru Trade Promotion Agreement, illegal logging continues to plague the region. We present evidence that Peru's legal logging concession system is enabling widespread illegal logging via the regulatory documents designed to ensure sustainable logging. Analyzing official government data, we found that 68.3%...

  4. Log of Apollo 11.

    Science.gov (United States)

    National Aeronautics and Space Administration, Washington, DC.

    The major events of the first manned moon landing mission, Apollo 11, are presented in chronological order from launch time until arrival of the astronauts aboard the U.S.S. Hornet. The log is descriptive, non-technical, and includes numerous color photographs of the astronauts on the moon. (PR)

  5. Borehole logging system

    International Nuclear Information System (INIS)

    Allen, L.S.

    1988-01-01

    A radioactive borehole logging tool employs an epithermal neutron detector having a neutron counter surrounded by an inner thermal neutron filter and an outer thermal neutron filter. Located between the inner and outer filters is a neutron moderating material for extending the lifetime of epithermal neutrons to enhance the counting rate of such epithermal neutrons by the neutron counter

  6. A Two-Tiered Model for Analyzing Library Web Site Usage Statistics, Part 1: Web Server Logs.

    Science.gov (United States)

    Cohen, Laura B.

    2003-01-01

    Proposes a two-tiered model for analyzing web site usage statistics for academic libraries: one tier for library administrators that analyzes measures indicating library use, and a second tier for web site managers that analyzes measures aiding in server maintenance and site design. Discusses the technology of web site usage statistics, and…
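A minimal first-tier analysis of the kind proposed, counting requests per page from web server access logs, could look like this sketch; the regex assumes the standard Common Log Format, which is an assumption about the server's configuration:

```python
import re
from collections import Counter

# Common Log Format: host ident authuser [time] "request" status bytes
LOG_RE = re.compile(r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
                    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+')

def page_hits(lines):
    """Count requests per path from CLF access-log lines, skipping
    lines that do not parse."""
    hits = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if m:
            hits[m.group("path")] += 1
    return hits
```

Per-path counts feed the administrator tier (which pages indicate library use); the same parsed fields (status codes, hosts) could feed the web-manager tier for server maintenance.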

  7. Nuclear borehole logging for oil exploration

    International Nuclear Information System (INIS)

    Oelgaard, P.L.

    1989-01-01

    Reactor physics can be applied to the logging of boreholes for the exploration of oil and gas and the results obtained can be interpreted more correctly by use of reactor physics models, e.g. one-dimensional multi-group diffusion theory adapted for gamma quanta. The standard nuclear logging tools are: natural gamma, gamma density, neutron porosity and the pulsed-neutron tool. The models and interpretation procedures are discussed. 1 fig

  8. Well logging. Acquisition and applications; Diagraphies. Acquisition et applications

    Energy Technology Data Exchange (ETDEWEB)

    Serra, O.; Serra, L.

    2001-07-01

    This reference book on wire-line and LWD well logging covers all geophysical methods of underground survey in a synthetic, visual and dynamical way. It treats of: the physical principle of well logging measurements, the different types of existing probes, the factors that can influence the measurements, and the applications of these measurements. The following well-logging methods are reviewed: resistivity; electromagnetic wave propagation; magnetic susceptibility and magnetic field; spontaneous potential; nuclear logging: natural gamma radioactivity, density logging, photoelectric index, neutron emission probes, hydrogen index or neutron porosity, neutron induced gamma spectroscopy, neutron relaxation time, NMR; acoustic measurements: sonic logging, seismic profiles; texture, structure and stratigraphy data acquisition; borehole diameter measurement; temperature measurement; wire sampling methods; place and role of well-logging in petroleum exploration; well-logging programs. (J.S.)

  9. MODELS OF HOURLY DRY BULB TEMPERATURE AND ...

    African Journals Online (AJOL)

    Hourly meteorological data of both dry bulb temperature and relative humidity for 18 locations in Nigeria for the period 1995 to 2009 were analysed to obtain the mean monthly average and monthly hourly average of each of the two meteorological variables for each month for each location. The difference between the ...

  10. Modelling of tandem cell temperature coefficients

    Energy Technology Data Exchange (ETDEWEB)

    Friedman, D.J. [National Renewable Energy Lab., Golden, CO (United States)

    1996-05-01

    This paper discusses the temperature dependence of the basic solar-cell operating parameters for a GaInP/GaAs series-connected two-terminal tandem cell. The effects of series resistance and of different incident solar spectra are also discussed.

  11. A concise wall temperature model for DI Diesel engines

    Energy Technology Data Exchange (ETDEWEB)

    Torregrosa, A.; Olmeda, P.; Degraeuwe, B. [CMT-Motores Termicos, Universidad Politecnica de Valencia (Spain); Reyes, M. [Centro de Mecanica de Fluidos y Aplicaciones, Universidad Simon Bolivar (Venezuela)

    2006-08-15

    A concise resistor model for wall temperature prediction in diesel engines with piston cooling is presented here. The model uses the instantaneous in-cylinder pressure and some usually measured operational parameters to predict the temperature of the structural elements of the engine. The resistor model was adjusted by means of temperature measurements in the cylinder head, the liner and the piston. For each model parameter, an expression as a function of the engine geometry, operational parameters and material properties was derived to make the model applicable to other similar engines. The model predicts the cylinder head, liner and piston temperatures well and is sensitive to variations of operational parameters such as the start of injection, coolant and oil temperature, and engine speed and load. (author)
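The idea of a resistor (lumped thermal network) model can be illustrated with a single wall node connected through two thermal resistances to the gas side and the coolant side: at steady state the node temperature is the conductance-weighted average of the boundary temperatures. This is a toy two-resistor sketch, not the paper's full network:

```python
def node_temperature(t_gas, r_gas, t_coolant, r_coolant):
    """Steady-state temperature of a wall node between two thermal
    resistances (K/W): zero net heat flow into the node gives a
    conductance-weighted average of the two boundary temperatures."""
    g_gas, g_cool = 1.0 / r_gas, 1.0 / r_coolant
    return (g_gas * t_gas + g_cool * t_coolant) / (g_gas + g_cool)
```

In the paper's model, many such nodes (head, liner, piston) are linked, and each resistance is expressed as a function of geometry, operating parameters and material properties.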

  12. Modeling of the Temperature Field Recovery in the Oil Pool

    Science.gov (United States)

    Khabibullin, I. L.; Davtetbaev, A. Ya.; Mar'in, D. F.; Khisamov, A. A.

    2018-05-01

    This paper considers the problem of the mathematical modeling of the temperature field recovery in an oil pool upon termination of the injection of water into the pool. The problem is broken down into two stages: injection of water, and temperature and pressure recovery upon termination of injection. A review of the existing mathematical models is presented, analytical solutions for a number of cases have been constructed, and a comparison of the analytical solutions of the different models has been made. In general form, an expression has been obtained that permits determining the temperature change in the oil pool upon termination of the injection of water (recovery of the temperature field).

  13. Nuclear log interpretation by first principle

    International Nuclear Information System (INIS)

    Oelgaard, P.L.

    1988-07-01

    A weakness connected to the present interpretation of nuclear borehole logs is that the interdependence of the various logs and physical effects of importance for the tools are not always taken into account in a correct way. Therefore a new approach to the interpretation of nuclear borehole logs is considered. It is based on the logs obtained with the natural gamma, the neutron porosity, the gamma density, and the pulsed neutron tools. For each of these tools a model, taking into account the important physical effects, is established. These models are incorporated into a computer programme which from the tool signals calculates, by use of iteration, a consistent set of the corresponding formation properties. In the paper the models developed for the four tools and the interpretation programme are briefly described. An example of the use of the interpretation programme is given and compared with a conventional interpretation. (author)

  14. New Temperature-based Models for Predicting Global Solar Radiation

    International Nuclear Information System (INIS)

    Hassan, Gasser E.; Youssef, M. Elsayed; Mohamed, Zahraa E.; Ali, Mohamed A.; Hanafy, Ahmed A.

    2016-01-01

    Highlights: • New temperature-based models for estimating solar radiation are investigated. • The models are validated against 20 years of measured global solar radiation data. • The new temperature-based model shows the best performance for coastal sites. • The new temperature-based model is more accurate than the sunshine-based models. • The new model is highly applicable with weather temperature forecast techniques. - Abstract: This study presents new ambient-temperature-based models for estimating global solar radiation as alternatives to the widely used sunshine-based models, owing to the unavailability of sunshine data at all locations around the world. Seventeen new temperature-based models are established, validated and compared with three other models proposed in the literature (the Annandale, Allen and Goodin models) to estimate the monthly average daily global solar radiation on a horizontal surface. These models are developed using a 20-year measured dataset of global solar radiation for the case-study location (Lat. 30°51′N and Long. 29°34′E), and then the general formulae of the newly suggested models are examined for ten different locations around Egypt. Moreover, the local formulae for the models are established and validated for two coastal locations where the general formulae give inaccurate predictions. The most common statistical error measures are utilized to evaluate the performance of these models and to identify the most accurate model. The obtained results show that the local formula for the most accurate new model provides good predictions for global solar radiation at different locations, especially at coastal sites. Moreover, the local and general formulae of the most accurate temperature-based model also perform better than the two most accurate sunshine-based models from the literature. The quick and accurate estimations of the global solar radiation using this approach can be employed in the design and evaluation of performance for
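
    The seventeen formulae themselves are not reproduced in the abstract. As a generic member of the temperature-based family it describes, a Hargreaves-Samani type estimate (a standard form in this literature; the coefficient below is an assumed textbook default, not a value fitted in the paper) can be sketched as:

```python
import math

def hargreaves_radiation(t_max, t_min, ra, k_rs=0.16):
    """Hargreaves-Samani type estimate: Rs = k_rs * sqrt(Tmax - Tmin) * Ra.

    t_max, t_min: daily air temperatures [deg C]
    ra: extraterrestrial radiation [MJ m-2 day-1]
    k_rs: empirical coefficient (~0.16 inland, ~0.19 coastal are commonly
          quoted defaults; site-specific fitting is what such papers do).
    """
    return k_rs * math.sqrt(t_max - t_min) * ra

print(round(hargreaves_radiation(30.0, 14.0, 35.0), 2))  # 22.4
```

    A larger diurnal temperature range (clearer skies) yields a higher radiation estimate, which is the physical basis of all models in this family.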

  15. Stability of the Hartree-Fock model with temperature

    OpenAIRE

    Dolbeault, Jean; Felmer, Patricio; Lewin, Mathieu

    2008-01-01

    This paper is devoted to the Hartree-Fock model with temperature in the euclidean space. For large classes of free energy functionals, minimizers are obtained as long as the total charge of the system does not exceed a threshold which depends on the temperature. The usual Hartree-Fock model is recovered in the zero temperature limit. An orbital stability result for the Cauchy problem is deduced from the variational approach.

  16. Modelling of temperature in deep boreholes and evaluation of geothermal heat flow at Forsmark and Laxemar

    Energy Technology Data Exchange (ETDEWEB)

    Sundberg, Jan; Back, Paer-Erik; Laendell, Maerta; Sundberg, Anders (GEO INNOVA AB, Linkoeping (Sweden))

    2009-06-15

    This report presents modelling of temperature and temperature gradients in boreholes in Laxemar and Forsmark and fitting to measured temperature data. The modelling is performed with an analytical expression including thermal conductivity, thermal diffusivity, heat flow, internal heat generation and climate events in the past. As a result of the fitting procedure it is also possible to evaluate local heat flow values for the two sites. However, since there is no independent evaluation of the heat flow, uncertainties in for example thermal conductivity, diffusivity and the palaeoclimate temperature curve are transferred into uncertainties in the heat flow. Both for Forsmark and Laxemar, reasonably good fits were achieved between models and data on borehole temperatures. However, none of the general models achieved a fit within the 95% confidence intervals of the measurements. This was achieved in some cases for the additional optimised models. Several of the model parameters are uncertain. A good model fit does not automatically imply that 'correct' values have been used for these parameters. Similar model fits can be expected with different sets of parameter values. The palaeoclimatically corrected surface mean heat flow at Forsmark and Laxemar is suggested to be 61 and 56 mW/m2 respectively. If all uncertainties are combined, including data uncertainties, the total uncertainty in the heat flow determination is judged to be within +12% to -14% for both sites. The corrections for palaeoclimate are quite large and verify the need of site-specific climate descriptions. Estimations of the current ground surface temperature have been made by extrapolations from measured temperature logging. The mean extrapolated ground surface temperature in Forsmark and Laxemar is estimated to 6.5 deg and 7.3 deg C respectively. This is approximately 1.7 deg C higher for Forsmark, and 1.6 deg C higher for Laxemar compared to data in the report SKB-TR-06-23. Comparison with
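
    The analytical expression is not given in the abstract. Borehole temperature models of this kind typically build on the 1-D steady-state conductive geotherm with uniform heat production, T(z) = T0 + (q0/k)z - (A/(2k))z^2, before palaeoclimate corrections are added. A sketch using the report's surface values plus assumed rock properties:

```python
def geotherm(z, t_surface, q_surface, k, heat_production):
    """T(z) = T0 + (q0/k)*z - (A/(2k))*z**2 for a 1-D steady conductive
    crust with uniform radiogenic heat production.

    z [m], t_surface [deg C], q_surface [W/m2], k [W/(m K)], A [W/m3].
    """
    return (t_surface
            + (q_surface / k) * z
            - (heat_production / (2.0 * k)) * z ** 2)

# Surface values from the report (6.5 deg C, 61 mW/m2); k and A are
# assumed generic granite-like values, not the site data.
print(round(geotherm(1000.0, 6.5, 0.061, 3.5, 2e-6), 2))  # 23.64
```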

  17. Modelling of temperature in deep boreholes and evaluation of geothermal heat flow at Forsmark and Laxemar

    International Nuclear Information System (INIS)

    Sundberg, Jan; Back, Paer-Erik; Laendell, Maerta; Sundberg, Anders

    2009-05-01

    This report presents modelling of temperature and temperature gradients in boreholes in Laxemar and Forsmark and fitting to measured temperature data. The modelling is performed with an analytical expression including thermal conductivity, thermal diffusivity, heat flow, internal heat generation and climate events in the past. As a result of the fitting procedure it is also possible to evaluate local heat flow values for the two sites. However, since there is no independent evaluation of the heat flow, uncertainties in for example thermal conductivity, diffusivity and the palaeoclimate temperature curve are transferred into uncertainties in the heat flow. Both for Forsmark and Laxemar, reasonably good fits were achieved between models and data on borehole temperatures. However, none of the general models achieved a fit within the 95% confidence intervals of the measurements. This was achieved in some cases for the additional optimised models. Several of the model parameters are uncertain. A good model fit does not automatically imply that 'correct' values have been used for these parameters. Similar model fits can be expected with different sets of parameter values. The palaeoclimatically corrected surface mean heat flow at Forsmark and Laxemar is suggested to be 61 and 56 mW/m2 respectively. If all uncertainties are combined, including data uncertainties, the total uncertainty in the heat flow determination is judged to be within +12% to -14% for both sites. The corrections for palaeoclimate are quite large and verify the need of site-specific climate descriptions. Estimations of the current ground surface temperature have been made by extrapolations from measured temperature logging. The mean extrapolated ground surface temperature in Forsmark and Laxemar is estimated to 6.5 deg and 7.3 deg C respectively. This is approximately 1.7 deg C higher for Forsmark, and 1.6 deg C higher for Laxemar compared to data in the report SKB-TR-06-23. Comparison with air

  18. A physically based analytical spatial air temperature and humidity model

    Science.gov (United States)

    Yang Yang; Theodore A. Endreny; David J. Nowak

    2013-01-01

    Spatial variation of urban surface air temperature and humidity influences human thermal comfort, the settling rate of atmospheric pollutants, and plant physiology and growth. Given the lack of observations, we developed a Physically based Analytical Spatial Air Temperature and Humidity (PASATH) model. The PASATH model calculates spatial solar radiation and heat...

  19. A mathematical model for transducer working at high temperature

    International Nuclear Information System (INIS)

    Fabre, J.P.

    1974-01-01

    A mathematical model is proposed for a lithium niobate piezoelectric transducer working at high temperature in liquid sodium. The proposed model suitably describes the operation of the high-temperature transducer presented; it allows optimization of the efficiency and band-pass [fr]

  20. Enhancement of subsurface geologic structure model based on gravity, magnetotelluric, and well log data in Kamojang geothermal field

    Science.gov (United States)

    Yustin Kamah, Muhammad; Armando, Adilla; Larasati Rahmani, Dinda; Paramitha, Shabrina

    2017-12-01

    Geophysical methods such as the gravity and magnetotelluric methods are commonly used in conventional and unconventional energy exploration, notably for exploring geothermal prospects. They are used to identify subsurface geological structures that are interpreted as fluid-flow paths. This study was conducted in the Kamojang Geothermal Field with the aim of highlighting the volcanic lineament in West Java, precisely in the Guntur-Papandayan chain, where there are three geothermal systems. The Kendang Fault, with a predominant NE-SW direction, was identified by magnetotelluric and gravity data processing techniques. Gravity techniques such as spectral analysis, derivative solutions and Euler deconvolution indicate the type and geometry of the anomaly. Magnetotelluric techniques such as inverse modeling and polar diagrams are required to determine subsurface resistivity characteristics and the major orientation. Furthermore, the results from those methods are compared with geological information and several well-data sections, with which they agree reasonably well. This research is very useful for tracing out other potential development areas.

  1. Modeling, Prediction, and Control of Heating Temperature for Tube Billet

    Directory of Open Access Journals (Sweden)

    Yachun Mao

    2015-01-01

    Full Text Available Annular furnaces have multivariate, nonlinear, large-time-lag and cross-coupling characteristics. The prediction and control of the exit temperature of a tube billet are important but difficult. We establish a prediction model for the final temperature of a tube billet through the OS-ELM-DRPLS method. We address the complex production characteristics, integrate the advantages of the PLS and ELM algorithms in establishing linear and nonlinear models, and consider model updating and data lag. Based on the proposed model, we design a prediction control algorithm for the tube billet temperature. The algorithm is validated using practical production data of Baosteel Co., Ltd. Results show that the model achieves the precision required in industrial applications. The temperature of the tube billet can be controlled within the required temperature range through a compensation control method.

  2. Mathematical modeling of large floating roof reservoir temperature arena

    Directory of Open Access Journals (Sweden)

    Liu Yang

    2018-03-01

    Full Text Available The current study simplifies the relevant components of a large floating roof tank and models its three-dimensional temperature field. The heat transfer involves exchange within the hot fluid in the oil tank, between the hot fluid and the tank wall, and between the tank wall and the external environment. A mathematical model of heat transfer and oil flow in the tank simulates the temperature field of the stored oil. The oil temperature field of the large floating roof tank is obtained by numerical simulation; the dynamics of the central temperature are plotted over time, and the axial and radial temperature distributions of the tank are analyzed. The location of the low-temperature region in the tank is determined from the thickness of the thermal boundary layer. Finally, the calculated results are compared with field test data, and the comparison validates the calculation.

  3. Experiences in the use of an electronic tool to measure pressure, temperature and spinner logs in the Mexican geothermal fields; Experiencias en el uso de sondas electronicas de presion, temperatura y flujo en campos geotermicos de Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Flores Armenta, Magaly; Jaimes Maldonado, Guillermo [Gerencia de Proyectos Geotermoelectricos, Comision Federal de Electricidad, Morelia, Michoacan (Mexico)

    1999-08-01

    This article presents the results obtained with an electronic tool for measuring pressure-temperature and spinner profiles in the geothermal wells of Mexico, used to identify phenomena that are not observable with traditional Kuster-type pressure and temperature logs. Examples of the applications include the identification of production zones, the interaction between two or more contributing zones under several operating conditions, casing damage, and the appearance of thief zones (intervals where flow is lost into the formation) in producing wells. The quantitative method used to calculate the mass contribution of the intervals of interest is also presented briefly.

  4. Temperature field and heat flow of the Danish-German border region − borehole measurements and numerical modelling

    DEFF Research Database (Denmark)

    Fuchs, Sven; Balling, Niels

    We present a regional 3D numerical crustal temperature model and analyze the present-day conductive thermal field of the Danish-German border region located in the North German Basin. A comprehensive analysis of borehole and well-log data on a regional scale is conducted to derive both the model......W/m² higher than low values reported in some previous studies for this region. Heat flow from the mantle is estimated to be between 33 and 40 mW/m² (q1–q3; mean of 37 ± 7 mW/m²). Pronounced lateral temperature variations are caused mainly by complex geological structures, including a large amount of salt...... structures and marked lateral variations in the thickness of basin sediments. The associated variations in rock thermal conductivity generate significant variations in model heat flow and large variations in temperature gradients. Major geothermal sandstone reservoirs (e.g. Rhaetian and Middle Buntsandstein...

  5. Neutron--neutron logging

    International Nuclear Information System (INIS)

    Allen, L.S.

    1977-01-01

    A borehole logging tool includes a steady-state source of fast neutrons, two epithermal neutron detectors, and two thermal neutron detectors. A count rate meter is connected to each neutron detector. A first ratio detector provides an indication of the porosity of the formation surrounding the borehole by determining the ratio of the outputs of the two count rate meters connected to the two epithermal neutron detectors. A second ratio detector provides an indication of both porosity and macroscopic absorption cross section of the formation surrounding the borehole by determining the ratio of the outputs of the two count rate meters connected to the two thermal neutron detectors. By comparing the signals of the two ratio detectors, oil bearing zones and salt water bearing zones within the formation being logged can be distinguished and the amount of oil saturation can be determined. 6 claims, 2 figures
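
    A toy version of the ratio measurement described above (the counts, live time and any calibration from ratio to porosity are tool-specific and made up here):

```python
def detector_ratio(near_counts, far_counts, live_time_s):
    """Near/far count-rate ratio from a two-detector sonde.

    Higher hydrogen content (porosity) shortens the neutron slowing-down
    length, so relatively more neutrons reach the near detector and the
    ratio rises. Converting the ratio to porosity units requires a
    tool-specific calibration, which is not reproduced here.
    """
    near_rate = near_counts / live_time_s   # counts per second
    far_rate = far_counts / live_time_s
    return near_rate / far_rate

print(detector_ratio(12000, 3000, 10.0))  # 4.0
```

    Comparing the epithermal ratio with the thermal ratio, as in the claims above, separates the porosity effect from the absorption cross-section effect.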

  6. Pulsed radiation decay logging

    International Nuclear Information System (INIS)

    Mills, W.R. Jr.

    1983-01-01

    There are provided new and improved well logging processes and systems wherein the detection of secondary radiation is accomplished during a plurality of time windows in a manner to accurately characterize the decay rate of the secondary radiation. The system comprises a well logging tool having a primary pulsed radiation source which emits repetitive time-spaced bursts of primary radiation and detector means for detecting secondary radiation resulting from the primary radiation and producing output signals in response to the detected radiation. A plurality of measuring channels are provided, each of which produces a count rate function representative of signals received from the detector means during successive time windows occurring between the primary radiation bursts. The logging system further comprises means responsive to the measuring channels for producing a plurality of functions representative of the ratios of the radiation count rates measured during adjacent pairs of the time windows. Comparator means function to compare the ratio functions and select at least one of the ratio functions to generate a signal representative of the decay rate of the secondary radiation
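
    The window-ratio principle can be sketched for a single exponential decay; a real tool compares several window pairs and handles background, which this sketch omits:

```python
import math

def decay_constant_from_windows(counts_a, counts_b, window_width_s):
    """Decay constant from counts in two equal, adjacent time windows.

    For N(t) = N0 * exp(-lam * t), the counts in adjacent windows of
    width w satisfy C_a / C_b = exp(lam * w), so lam = ln(C_a/C_b) / w.
    """
    return math.log(counts_a / counts_b) / window_width_s

# Synthetic check: a ratio of e over a 0.5 s window implies lam = 2 s^-1.
print(round(decay_constant_from_windows(math.e, 1.0, 0.5), 6))  # 2.0
```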

  7. Geophysical borehole logging

    International Nuclear Information System (INIS)

    McCann, D.; Barton, K.J.; Hearn, K.

    1981-08-01

    Most of the available literature on geophysical borehole logging refers to studies carried out in sedimentary rocks. It is only in recent years that any great interest has been shown in geophysical logging in boreholes in metamorphic and igneous rocks following the development of research programmes associated with geothermal energy and nuclear waste disposal. This report is concerned with the programme of geophysical logging carried out on the three deep boreholes at Altnabreac, Caithness, to examine the effectiveness of these methods in crystalline rock. Of particular importance is the assessment of the performance of the various geophysical sondes run in the boreholes in relation to the rock mass properties. The geophysical data can be used to provide additional in-situ information on the geological, hydrogeological and engineering properties of the rock mass. Fracturing and weathering in the rock mass have a considerable effect on both the design parameters for an engineering structure and the flow of water through the rock mass; hence, the relation between the geophysical properties and the degree of fracturing and weathering is examined in some detail. (author)

  8. Finite temperature CP^(N-1) model and long range Neel order

    International Nuclear Information System (INIS)

    Ichinose, Ikuo; Yamamoto, Hisashi.

    1989-09-01

    We study in d space-dimensions the finite-temperature behavior of long range Neel order (LRNO) in the CP^(N-1) model as a low-energy effective field theory of the antiferromagnetic Heisenberg model. For d≤1, or d≤2 at any nonzero temperature, LRNO disappears, in agreement with the Mermin-Wagner-Coleman theorem. For d=3 in the weak coupling region, LRNO exists below the critical temperature T_N (the Neel temperature). T_N decreases as the interlayer coupling becomes relatively weak compared with that within the Cu-O layers. (author)

  9. Energy based model for temperature dependent behavior of ferromagnetic materials

    International Nuclear Information System (INIS)

    Sah, Sanjay; Atulasimha, Jayasimha

    2017-01-01

    An energy based model for temperature dependent anhysteretic magnetization curves of ferromagnetic materials is proposed and benchmarked against experimental data. This is based on the calculation of macroscopic magnetic properties by performing an energy weighted average over all possible orientations of the magnetization vector. Most prior approaches that employ this method are unable to independently account for the effect of both inhomogeneity and temperature in performing the averaging necessary to model experimental data. Here we propose a way to account for both effects simultaneously and benchmark the model against experimental data from ~5 K to ~300 K for two different materials in both annealed (fewer inhomogeneities) and deformed (more inhomogeneities) samples. This demonstrates that this framework is well suited to simulate temperature dependent experimental magnetic behavior. - Highlights: • Energy based model for temperature dependent ferromagnetic behavior. • Simultaneously accounts for effect of temperature and inhomogeneities. • Benchmarked against experimental data from 5 K to 300 K.
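
    In the limiting case of non-interacting moments with no inhomogeneity, the energy-weighted orientation average reduces to the classical Langevin function. This textbook limit (not the authors' full model; all numerical values below are illustrative) can be sketched as:

```python
import math

MU0 = 4e-7 * math.pi     # vacuum permeability [H/m]
KB = 1.380649e-23        # Boltzmann constant [J/K]

def langevin(x):
    """L(x) = coth(x) - 1/x: the Boltzmann-weighted average of cos(theta)."""
    if abs(x) < 1e-6:
        return x / 3.0   # series limit avoids cancellation near zero
    return 1.0 / math.tanh(x) - 1.0 / x

def anhysteretic_magnetization(h_field, t_kelvin, m_sat, moment):
    """M_an = Ms * L(mu0 * m * H / (kB * T)) for independent moments.

    h_field [A/m], m_sat [A/m], moment [A m2] -- illustrative values only.
    """
    return m_sat * langevin(MU0 * moment * h_field / (KB * t_kelvin))

m = anhysteretic_magnetization(5e4, 300.0, 1.7e6, 1e-21)
print(0.0 < m < 1.7e6)  # True: partial alignment at room temperature
```

    Lowering the temperature increases the alignment, reproducing the qualitative temperature dependence the abstract benchmarks from ~5 K to ~300 K.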

  10. Quantification Model for Estimating Temperature Field Distributions of Apple Fruit

    OpenAIRE

    Zhang, Min; Yang, Le; Zhao, Huizhong; Zhang, Leijie; Zhong, Zhiyou; Liu, Yanling; Chen, Jianhua

    2009-01-01

    A quantification model of transient heat conduction was provided to simulate the apple fruit temperature distribution in the cooling process. The model was based on the energy variation at different points of the apple fruit. It took into account the heat exchange of a representative elemental volume, metabolic heat and external heat. The following conclusions could be obtained: first, the quantification model can satisfactorily describe the tendency of apple fruit temperature dis...

  11. Modelling of a multi-temperature plasma composition

    International Nuclear Information System (INIS)

    Liani, B.; Benallal, R.; Bentalha, Z.

    2005-01-01

    Knowledge of plasma composition is very important for various plasma applications and for the prediction of plasma properties. The authors use the Saha equation and the Debye length equation to calculate the composition of a plasma not in local thermodynamic equilibrium. It is shown that the two-temperature (2T) model described by Chen and Han [J. Phys. D 32 (1999) 1711], with separate electron and heavy-particle temperatures, can be applied to a mixture of gases in which each atomic species has its own temperature; the 4T model, however, is more general, because it remains applicable when species temperatures differ markedly from that of the heavy particles. This can occur in a plasma composed of large molecules or macromolecules. The electron temperature T_e varies in the range 8000~20000 K at atmospheric pressure. (authors)
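
    A minimal single-ionization Saha calculation (one species, statistical-weight ratio taken as 1, no Debye correction, hence far simpler than the 2T/4T models discussed above) can be sketched as:

```python
import math

KB = 1.380649e-23        # Boltzmann constant [J/K]
ME = 9.1093837015e-31    # electron mass [kg]
H = 6.62607015e-34       # Planck constant [J s]
EV = 1.602176634e-19     # electron volt [J]

def saha_rhs(t_e, ionization_energy_ev, g_ratio=1.0):
    """Saha right-hand side S = n_e * n_i / n_0 [m^-3] at temperature t_e [K]."""
    thermal = (2.0 * math.pi * ME * KB * t_e) ** 1.5 / H ** 3
    return 2.0 * g_ratio * thermal * math.exp(-ionization_energy_ev * EV / (KB * t_e))

def ionization_fraction(t_e, n_total, ionization_energy_ev):
    """Solve x**2 / (1 - x) = S / n_total for the ionization fraction x."""
    s = saha_rhs(t_e, ionization_energy_ev) / n_total
    return (-s + math.sqrt(s * s + 4.0 * s)) / 2.0

# Hydrogen-like gas (13.6 eV) at an atmospheric-pressure-like density.
x = ionization_fraction(15000.0, 1e23, 13.6)
print(0.0 < x < 1.0)  # True
```

    In a multi-temperature model each species would carry its own temperature in the exponential and thermal terms, coupled through quasi-neutrality.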

  12. Modeling the temperature dependence of thermophysical properties: Study on the effect of temperature dependence for RFA.

    Science.gov (United States)

    Watanabe, Hiroki; Kobayashi, Yo; Hashizume, Makoto; Fujie, Masakatsu G

    2009-01-01

    Radio frequency ablation (RFA) has increasingly been used over the past few years, and RFA treatment is minimally invasive for patients. However, it is difficult for operators to control the precise formation of coagulation zones because of inadequate imaging modalities. With this in mind, an ablation system using numerical simulation to analyze the temperature distribution of the organ is needed to overcome this deficiency. The objective of our work is to develop a temperature-dependent thermophysical liver model. First, an overview is given of the development of the thermophysical liver model. Second, a simulation to evaluate the effect of the temperature dependence of the thermophysical properties of the liver is explained. Finally, the simulation results are described; they indicate that the temperature dependence of the thermophysical properties accounts for temperature differences that influence the accuracy of RFA treatment.

  13. Univariate models for air temperature time series

    International Nuclear Information System (INIS)

    Leon Aristizabal, Gloria Esperanza

    2000-01-01

    The theoretical framework for the study of air temperature time series is the theory of stochastic processes, particularly the models known as ARIMA, which make it possible to carry out a univariate analysis. ARIMA models are built in order to explain the structure of the monthly temperatures corresponding to the mean, absolute maximum, absolute minimum, maximum mean and minimum mean temperatures for four stations in Colombia. By means of these models, the possible evolution of the latter variables is estimated with predictive aims in mind. The application and utility of the models are discussed.
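
    ARIMA fitting is normally delegated to a statistics package; as a self-contained illustration, the simplest member of the family, an AR(1), i.e. ARIMA(1,0,0), can be fitted by least squares. The series below is synthetic, not the Colombian station data:

```python
def fit_ar1(series):
    """Least-squares fit of x_t = c + phi * x_(t-1) + e_t (an ARIMA(1,0,0))."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var = sum((a - mean_x) ** 2 for a in x)
    phi = cov / var
    return mean_y - phi * mean_x, phi

# Twelve "monthly" values generated by x_t = 5 + 0.6 * x_(t-1);
# the fit recovers the generating coefficients.
series = [10.0]
for _ in range(11):
    series.append(5.0 + 0.6 * series[-1])
c, phi = fit_ar1(series)
print(round(c, 3), round(phi, 3))  # 5.0 0.6
```

    Forecasts then follow by iterating x_(t+1) = c + phi * x_t from the last observation.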

  14. A physically based model of global freshwater surface temperature

    Science.gov (United States)

    van Beek, Ludovicus P. H.; Eikelboom, Tessa; van Vliet, Michelle T. H.; Bierkens, Marc F. P.

    2012-09-01

    Temperature determines a range of physical properties of water and exerts a strong control on surface water biogeochemistry. Thus, in freshwater ecosystems the thermal regime directly affects the geographical distribution of aquatic species through their growth and metabolism and indirectly through their tolerance to parasites and diseases. Models used to predict surface water temperature range between physically based deterministic models and statistical approaches. Here we present the initial results of a physically based deterministic model of global freshwater surface temperature. The model adds a surface water energy balance to river discharge modeled by the global hydrological model PCR-GLOBWB. In addition to advection of energy from direct precipitation, runoff, and lateral exchange along the drainage network, energy is exchanged between the water body and the atmosphere by shortwave and longwave radiation and sensible and latent heat fluxes. Also included are ice formation and its effect on heat storage and river hydraulics. We use the coupled surface water and energy balance model to simulate global freshwater surface temperature at daily time steps with a spatial resolution of 0.5° on a regular grid for the period 1976-2000. We opt to parameterize the model with globally available data and apply it without calibration in order to preserve its physical basis with the outlook of evaluating the effects of atmospheric warming on freshwater surface temperature. We validate our simulation results with daily temperature data from rivers and lakes (U.S. Geological Survey (USGS), limited to the USA) and compare mean monthly temperatures with those recorded in the Global Environment Monitoring System (GEMS) data set. Results show that the model is able to capture the mean monthly surface temperature for the majority of the GEMS stations, while the interannual variability as derived from the USGS and NOAA data was captured reasonably well. Results are poorest for
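
    The surface energy balance described above can be caricatured by a single well-mixed node with an explicit daily step. The bulk exchange coefficient below lumps longwave, sensible and latent fluxes into one assumed number, unlike the separate flux terms of the full model:

```python
def step_water_temperature(t_water, t_air, shortwave, depth_m,
                           dt_s=86400.0, albedo=0.06, bulk_coeff=20.0):
    """One explicit daily update of a well-mixed water column [deg C].

    dT/dt = ((1 - albedo) * shortwave + bulk_coeff * (T_air - T_water))
            / (rho * c_p * depth)
    bulk_coeff [W m-2 K-1] is an assumed lumped exchange coefficient.
    """
    rho_cp = 1000.0 * 4186.0  # volumetric heat capacity of water [J m-3 K-1]
    net_flux = (1.0 - albedo) * shortwave + bulk_coeff * (t_air - t_water)
    return t_water + net_flux * dt_s / (rho_cp * depth_m)

# Cool water under warmer air and 200 W/m2 of sun warms by ~1.2 K per day.
print(round(step_water_temperature(10.0, 15.0, 200.0, depth_m=5.0), 3))  # 11.189
```

    The full model adds advection along the drainage network and ice formation on top of this local balance.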

  15. Modelling of temperature distribution and pulsations in fast reactor units

    International Nuclear Information System (INIS)

    Ushakov, P.A.; Sorokin, A.P.

    1994-01-01

    The reasons for the occurrence of thermal stresses in reactor units are analyzed. The main ones are: temperature non-uniformity at the outlet of the reactor core and breeder, and the ensuing temperature pulsations; temperature pulsations due to the mixing of sodium jets of different temperatures; temperature non-uniformity and pulsations resulting from the switching-off of part of the loops (circuits); and temperature non-uniformity and fluctuations during transients and accidental shutdown of the reactor or transfer to cooling by natural circulation. The results of investigating the thermal-hydraulic characteristics are obtained by modelling the processes mentioned above. The analysis carried out allows the main lines of investigation to be defined, and conclusions can be drawn regarding the problem of temperature distribution and fluctuation in fast reactor units

  16. Modeling temperature noise in a fast-reactor pile

    International Nuclear Information System (INIS)

    Kebadze, B.V.; Pykhtina, T.V.; Tarasko, M.Z.

    1987-01-01

    To detect partial blockage of the coolant flow cross section in piles, which leads to local temperature rise or boiling of the sodium, provision is made for individual monitoring of the fuel assemblies with respect to the outlet temperature. Since the deviations of the mean flow rate through the pile and of the outlet temperature are slight with this anomaly, the temperature fluctuations may provide a more informative index. The change in noise characteristics under partial blockage of the cross section occurs because of strong distortion of the temperature profile in the blocked region. The turbulent flow in the upper part of the pile transforms this non-uniformity into temperature pulsations which may be recorded by a sensor at the pile outlet. In this paper the characteristics of temperature noise are studied for various pile conditions and sensor locations by statistical modeling

  17. Dynamic modeling of temperature change in outdoor operated tubular photobioreactors.

    Science.gov (United States)

    Androga, Dominic Deo; Uyar, Basar; Koku, Harun; Eroglu, Inci

    2017-07-01

    In this study, a one-dimensional transient model was developed to analyze the temperature variation of tubular photobioreactors operated outdoors and the validity of the model was tested by comparing the predictions of the model with the experimental data. The model included the effects of convection and radiative heat exchange on the reactor temperature throughout the day. The temperatures in the reactors increased with increasing solar radiation and air temperatures, and the predicted reactor temperatures corresponded well to the measured experimental values. The heat transferred to the reactor was mainly through radiation: the radiative heat absorbed by the reactor medium, ground radiation, air radiation, and solar (direct and diffuse) radiation, while heat loss was mainly through the heat transfer to the cooling water and forced convection. The amount of heat transferred by reflected radiation and metabolic activities of the bacteria and pump work was negligible. Counter-current cooling was more effective in controlling reactor temperature than co-current cooling. The model developed identifies major heat transfer mechanisms in outdoor operated tubular photobioreactors, and accurately predicts temperature changes in these systems. This is useful in determining cooling duty under transient conditions and scaling up photobioreactors. The photobioreactor design and the thermal modeling were carried out and experimental results obtained for the case study of photofermentative hydrogen production by Rhodobacter capsulatus, but the approach is applicable to photobiological systems that are to be operated under outdoor conditions with significant cooling demands.

  18. Geophysical well logging operations and log analysis in Geothermal Well Desert Peak No. B-23-1

    Energy Technology Data Exchange (ETDEWEB)

    Sethi, D.K.; Fertl, W.H.

    1980-03-01

    Geothermal Well Desert Peak No. B-23-1 was logged by Dresser Atlas during April/May 1979 to a total depth of 2939 m (9642 ft). A temperature of 209°C (408°F) was observed on the maximum thermometer run with one of the logging tools. Borehole tools rated to a maximum temperature of 204.4°C (400°F) were utilized for logging except for the Densilog tool, which was from the other set of borehole instruments, rated to a still higher temperature, i.e., 260°C (500°F). The quality of the logs recorded and the environmental effects on the log response have been considered. The log response in the unusual lithologies of igneous and metamorphic formations encountered in this well could be correlated with the drill cutting data. An empirical, statistical log interpretation approach has made it possible to obtain meaningful information on the rocks penetrated. Various crossplots/histograms of the corrected log data have been generated on the computer. These are found to provide good resolution between the lithological units in the rock sequence. The crossplotting techniques and the statistical approach were combined with the drill cutting descriptions in order to arrive at the lithological characteristics. The results of log analysis and recommendations for logging of future wells have been included.

  19. Pulse neutron logging technique

    International Nuclear Information System (INIS)

    Bespalov, D.F.; Dylyuk, A.A.

    1975-01-01

    A new method of pulsed-neutron logging is proposed, which consists of irradiating rocks with fast neutron bursts and recording the integrated flux of thermal and/or epithermal neutrons from the moment a burst is initiated until it is fully absorbed. The obtained value is representative of the rock properties (porosity, hydrogen content). The integrated thermal- and epithermal-neutron flux in a burst can be measured either by activating a reference sample of known chemical composition during the neutron burst, or by recording the induced-activity radiation of the sample in the interval between two bursts. The proposed method features high informative value, accuracy and efficiency.

  20. Well logging, atom and geology

    International Nuclear Information System (INIS)

    Serra, O.

    1994-01-01

    Well logging techniques exploit interactions of gamma photons and neutrons with atoms. Interactions of neutrons of different energies with atoms allow the detection and evaluation of the weight percentage of several elements composing the rocks (C, O, Si, Ca, Fe, S); spectrometry of the gamma rays produced by thermal neutron absorption allows the weight percentages of Si, Ca, Fe, S, Cl, H, Ti, Gd, etc. to be determined. High-resolution detectors (lithium-doped germanium at liquid-nitrogen temperature) allow more elements to be recognized. Other neutron techniques consist of determining the epithermal neutron population at a given distance from the neutron source (a measurement of the hydrogen index). By analyzing the intensity of the gamma flux produced by Compton scattering, the electronic and bulk densities of the rocks are measured. All these data lead to the detection and evaluation of ore deposits (uranium and potassium) and coal, and to the determination of the lithology, the main minerals composing the rocks, and petrophysical properties. 1 fig

  1. A generalized conditional heteroscedastic model for temperature downscaling

    Science.gov (United States)

    Modarres, R.; Ouarda, T. B. M. J.

    2014-11-01

    This study describes a method for deriving the time-varying second-order moment, or heteroscedasticity, of local daily temperature and its association with large-scale Canadian Coupled General Circulation Model predictors. This is carried out by applying a multivariate generalized autoregressive conditional heteroscedasticity (MGARCH) approach to construct the conditional variance-covariance structure between General Circulation Model (GCM) predictors and maximum and minimum temperature time series during 1980-2000. Two MGARCH specifications, namely diagonal VECH and dynamic conditional correlation (DCC), are applied, and 25 GCM predictors were selected for bivariate temperature heteroscedastic modeling. It is observed that the conditional covariance between predictors and temperature is not very strong and mostly depends on the interaction between the random processes governing the temporal variation of predictors and predictands. The DCC model reveals a time-varying conditional correlation between GCM predictors and temperature time series. No remarkable increasing or decreasing trend is observed in the correlation coefficients between GCM predictors and observed temperature during 1980-2000, while a weak winter-summer seasonality is clear for both the conditional covariance and the correlation. Furthermore, the Kwiatkowski-Phillips-Schmidt-Shin (KPSS) stationarity and Brock-Dechert-Scheinkman (BDS) nonlinearity tests showed that the GCM predictors, temperature and their conditional correlation time series are nonlinear but stationary during 1980-2000. However, the degree of nonlinearity of the temperature time series is higher than that of most of the GCM predictors.
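The idea of a time-varying conditional correlation can be illustrated with an exponentially weighted (EWMA) covariance recursion, a much simpler stand-in for the DCC machinery above; the data and the smoothing parameter are synthetic:

```python
import numpy as np

# EWMA conditional correlation between two series -- a lightweight,
# illustrative stand-in for MGARCH/DCC (not the paper's estimator).
def ewma_conditional_corr(x, y, lam=0.94):
    """Return the time-varying conditional correlation of x and y."""
    x = x - x.mean()
    y = y - y.mean()
    var_x = float((x ** 2).mean())   # initialize at sample moments
    var_y = float((y ** 2).mean())
    cov = float((x * y).mean())
    corr = np.empty(len(x))
    for t in range(len(x)):
        var_x = lam * var_x + (1 - lam) * x[t] ** 2
        var_y = lam * var_y + (1 - lam) * y[t] ** 2
        cov = lam * cov + (1 - lam) * x[t] * y[t]
        corr[t] = cov / np.sqrt(var_x * var_y)
    return corr

rng = np.random.default_rng(0)
z = rng.standard_normal(500)
x = z + 0.3 * rng.standard_normal(500)         # stand-in "GCM predictor"
y = 0.8 * z + 0.6 * rng.standard_normal(500)   # stand-in "temperature anomaly"
rho_t = ewma_conditional_corr(x, y)
```

Because the recursion updates a positive semi-definite covariance matrix, the resulting correlation stays within [-1, 1] at every time step.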

  2. Potential for shared log transport services

    Science.gov (United States)

    Tim McDonald; Steve Taylor; Jorge Valenzuela

    2001-01-01

    A simulation model of a log transport logistics network was developed. The model could be structured to either share truck capacity among a group of loggers, or to assign a fixed number of trucks to individual loggers. Another variation of the model allowed the use of a staging yard to set out loaded trailers and deliver them to destinations using dedicated shuttle...

  3. Relativistic finite-temperature Thomas-Fermi model

    Science.gov (United States)

    Faussurier, Gérald

    2017-11-01

    We investigate the relativistic finite-temperature Thomas-Fermi model, which has been proposed recently in an astrophysical context. Assuming a constant distribution of protons inside the nucleus of finite size avoids severe divergence of the electron density with respect to a point-like nucleus. A formula for the nuclear radius is chosen to treat any element. The relativistic finite-temperature Thomas-Fermi model matches the two asymptotic regimes, i.e., the non-relativistic and the ultra-relativistic finite-temperature Thomas-Fermi models. The equation of state is considered in detail. For each version of the finite-temperature Thomas-Fermi model, the pressure, the kinetic energy, and the entropy are calculated. The internal energy and free energy are also considered. The thermodynamic consistency of the three models is considered by working from the free energy. The virial question is also studied in the three cases as well as the relationship with the density functional theory. The relativistic finite-temperature Thomas-Fermi model is far more involved than the non-relativistic and ultra-relativistic finite-temperature Thomas-Fermi models that are very close to each other from a mathematical point of view.

  4. A physically based analytical spatial air temperature and humidity model

    Science.gov (United States)

    Yang, Yang; Endreny, Theodore A.; Nowak, David J.

    2013-09-01

    Spatial variation of urban surface air temperature and humidity influences human thermal comfort, the settling rate of atmospheric pollutants, and plant physiology and growth. Given the lack of observations, we developed a Physically based Analytical Spatial Air Temperature and Humidity (PASATH) model. The PASATH model calculates spatial solar radiation and heat storage based on semiempirical functions and generates spatially distributed estimates based on inputs of topography, land cover, and the weather data measured at a reference site. The model assumes that for all grids under the same mesoscale climate, grid air temperature and humidity are modified by local variation in absorbed solar radiation and the partitioning of sensible and latent heat. The model uses a reference grid site for time-series meteorological data, and the air temperature and humidity of any other grid can be obtained by solving the heat flux network equations. PASATH was coupled with the USDA iTree-Hydro water balance model to obtain evapotranspiration terms and run from 20 to 29 August 2010 at a 360 m by 360 m grid scale and hourly time step across a 285 km2 watershed including the urban area of Syracuse, NY. PASATH predictions were tested at nine urban weather stations representing variability in urban topography and land cover. The PASATH model predictive efficiency R² ranged from 0.81 to 0.99 for air temperature and 0.77 to 0.97 for dew point temperature. PASATH is expected to have broad applications in environmental and ecological modeling.

  5. Modelling characteristics of ferromagnetic cores with the influence of temperature

    International Nuclear Information System (INIS)

    Górecki, K; Rogalska, M; Zarȩbski, J; Detka, K

    2014-01-01

    The paper is devoted to modelling characteristics of ferromagnetic cores with the use of SPICE software. Some disadvantages of the selected literature models of such cores are discussed. A modified model of ferromagnetic cores taking into account the influence of temperature on the magnetizing characteristics and the core losses is proposed. The form of the elaborated model is presented and discussed. The correctness of this model is verified by comparing the calculated and the measured characteristics of the selected ferromagnetic cores.

  6. A high temperature interparticle potential for an alternative gauge model

    International Nuclear Information System (INIS)

    Doria, R.M.

    1984-01-01

    A thermal Wilson loop for a model with two gauge fields associated with the same gauge group is discussed. Deconfinement appears at high temperature. It is not possible, however, to specify the colour of the deconfined matter. (Author)

  7. Modelling and analysis of radial thermal stresses and temperature ...

    African Journals Online (AJOL)


    The temperature field, heat transfer rate and thermal stresses were investigated with numerical simulation models using FORTRAN FE (finite element) software. ...

  8. Evaluation of brightness temperature from a forward model of ...

    Indian Academy of Sciences (India)

    ... profile the temperature and humidity at high temporal and vertical resolution in the lower troposphere. The process of ... structure of the atmosphere in numerical weather prediction models. ... frequency channels that can be used in building ...

  9. The incidence of injuries in young people: II. Log-linear multivariable models for risk factors in a collaborative study in Brazil, Chile, Cuba and Venezuela.

    Science.gov (United States)

    Bangdiwala, S I; Anzola-Pérez, E

    1990-03-01

    Injuries and accidents are acknowledged as leading causes of morbidity and mortality among children and adolescents in the developing countries of the world. The Pan American Health Organization sponsored a collaborative study in four selected countries in Latin America to examine the extent of the problem as well as the potential risk factors associated with selected non-fatal injuries. The study subjects were injured children and adolescents (0-19 years of age) presenting at the study hospitals in chosen urban centres, as well as injured youths surveyed in households in the catchment areas of the hospitals. Study methods and descriptive frequency results were presented earlier. In this paper, log-linear multivariable regression models are used to examine, within each country, the potentiating effects of several measured variables on specific types of injuries. The significance of risk factors varied between countries, but some general patterns emerged. Falls were more likely in younger children and occurred at home; the main risk factor for home accidents was the age of the child. The education of the head of the household was an important risk factor for the type of injury suffered. The likelihood of traffic-accident injury varied with time of day and day of the week, but was also higher in more highly educated households. The results are consistent with those found in other studies in the developed world and suggest specific areas of concern for health planners to address.
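A log-linear (Poisson) model of the kind used above can be fitted by iteratively reweighted least squares; the sketch below uses synthetic data with hypothetical covariates, not the study's variables:

```python
import numpy as np

# Minimal Poisson log-linear regression fitted by IRLS -- an illustrative
# sketch of the model class, not the study's fitted models.
def fit_poisson_loglinear(X, y, n_iter=25):
    """Return beta maximizing the Poisson log-likelihood with a log link."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)            # fitted means under current beta
        z = X @ beta + (y - mu) / mu     # IRLS working response
        W = mu                           # IRLS weights
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

rng = np.random.default_rng(1)
n = 2000
X = np.column_stack([np.ones(n),                 # intercept
                     rng.integers(0, 2, n),      # hypothetical binary risk factor
                     rng.normal(size=n)])        # hypothetical continuous covariate
true_beta = np.array([0.5, 0.8, -0.3])
y = rng.poisson(np.exp(X @ true_beta))
beta_hat = fit_poisson_loglinear(X, y)
```

With 2000 observations the IRLS estimates recover the generating coefficients closely.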

  10. Applying Time Series Analysis Model to Temperature Data in Greenhouses

    Directory of Open Access Journals (Sweden)

    Abdelhafid Hasni

    2011-03-01

    The objective of the research is to find an appropriate Seasonal Auto-Regressive Integrated Moving Average (SARIMA) model for fitting the inside air temperature (Tin) of a naturally ventilated greenhouse under Mediterranean conditions by minimizing the Akaike Information Criterion (AIC). The best SARIMA model for fitting the greenhouse air temperature was found to be SARIMA (1,0,0)(1,0,2)24.
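Absent the paper's data, the seasonal autoregressive structure can be sketched with a least-squares seasonal AR fit on a synthetic hourly series (the seasonal MA terms of the full SARIMA (1,0,0)(1,0,2)24 specification are omitted for brevity):

```python
import numpy as np

# Seasonal autoregression T_t = c + phi*T_{t-1} + Phi*T_{t-s} + e_t, fitted
# by least squares -- a simplified stand-in for a SARIMA model with daily
# seasonality (s = 24 hours). Data are synthetic.
def fit_seasonal_ar(T, s=24):
    """Return (c, phi, Phi) estimated by ordinary least squares."""
    y = T[s:]
    X = np.column_stack([np.ones(len(y)), T[s - 1:-1], T[:-s]])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# Synthetic hourly greenhouse temperature: daily cycle plus AR(1) noise
rng = np.random.default_rng(2)
hours = np.arange(24 * 60)
T = 22 + 6 * np.sin(2 * np.pi * hours / 24)
noise = np.zeros_like(T)
for t in range(1, len(T)):
    noise[t] = 0.7 * noise[t - 1] + rng.normal(scale=0.5)
T = T + noise
c, phi, Phi = fit_seasonal_ar(T)
```

The hour-ago and day-ago lags together explain most of the variance of such a series, which is the intuition behind the (1,0,0)(…)24 structure.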

  11. Modeling the Temperature Effect of Orientations in Residential Buildings

    Directory of Open Access Journals (Sweden)

    Sabahat Arif

    2012-07-01

    Indoor thermal comfort has long been an important issue for environmental sustainability: in the modern architecture of the 20th and 21st centuries, building design and planning consume a great deal of energy. An appropriate orientation of a building can provide thermally comfortable indoor temperatures which otherwise require extra energy for conditioning the spaces through all the seasons. This experimental study investigates the potential effect of this passive solar design strategy on indoor temperatures, and a simple model is presented for predicting indoor temperatures from ambient temperatures.

  12. Artificial neural network modeling and cluster analysis for organic facies and burial history estimation using well log data: A case study of the South Pars Gas Field, Persian Gulf, Iran

    Science.gov (United States)

    Alizadeh, Bahram; Najjari, Saeid; Kadkhodaie-Ilkhchi, Ali

    2012-08-01

    Intelligent and statistical techniques were used to extract hidden organic facies from well log responses in the giant South Pars Gas Field, Persian Gulf, Iran. Data from the Mid-Cretaceous Kazhdomi Formation and the Permo-Triassic Kangan-Dalan Formations were used for this purpose. Initially, GR, SGR, CGR, THOR, POTA, NPHI and DT logs were applied to model the relationship between wireline logs and Total Organic Carbon (TOC) content using Artificial Neural Networks (ANN). The correlation coefficient (R²) between the measured and ANN-predicted TOC equals 89%. The performance of the model is measured by the mean squared error, which does not exceed 0.0073. Using cluster analysis and a binary hierarchical cluster tree, the constructed TOC column of each formation was clustered into 5 organic facies according to their geochemical similarity. A second ANN model, with an accuracy of 84%, was then created to determine the specified clusters (facies) directly from well logs for quick cluster recognition in other wells of the studied field. Each facies was correlated to its appropriate burial history curve, so each facies of a formation can be scrutinized separately and directly from its well logs, demonstrating the time and depth of oil or gas generation. Potential production zones of the Kazhdomi probable source rock and the Kangan-Dalan reservoir formation could therefore be identified while well logging operations (especially LWD) are in progress. This could reduce uncertainty, save considerable time and cost for the oil industry, and aid the successful implementation of exploration and exploitation plans.
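The log-to-TOC mapping step can be illustrated with a tiny one-hidden-layer network trained by gradient descent on synthetic "log" inputs; this is a toy sketch of the ANN workflow, not the authors' trained model, and the data-generating relation is invented:

```python
import numpy as np

# Toy MLP regression: synthetic well-log inputs -> synthetic TOC target.
rng = np.random.default_rng(3)
n, n_logs = 400, 4                       # samples x (stand-ins for GR, NPHI, DT, ...)
X = rng.normal(size=(n, n_logs))
toc = 2.0 + 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.3 * np.tanh(X[:, 2])  # invented relation

W1 = rng.normal(scale=0.5, size=(n_logs, 8))  # input -> hidden weights
b1 = np.zeros(8)
w2 = rng.normal(scale=0.5, size=8)            # hidden -> output weights
b2 = 0.0
lr = 0.05
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)             # hidden activations
    err = h @ w2 + b2 - toc              # prediction error (d loss / d pred)
    dh = np.outer(err, w2) * (1 - h ** 2)  # backprop through tanh
    W1 -= lr * (X.T @ dh) / n
    b1 -= lr * dh.mean(axis=0)
    w2 -= lr * (h.T @ err) / n
    b2 -= lr * err.mean()

pred_final = np.tanh(X @ W1 + b1) @ w2 + b2
mse = float(np.mean((pred_final - toc) ** 2))
```

After full-batch training the network fits the smooth synthetic relation with a small residual error.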

  13. Peltier cells as temperature control elements: Experimental characterization and modeling

    International Nuclear Information System (INIS)

    Mannella, Gianluca A.; La Carrubba, Vincenzo; Brucato, Valerio

    2014-01-01

    The use of Peltier cells to realize compact and precise temperature-controlled devices has expanded continuously in recent years. To support the design of temperature control systems, a simplified model of the heat transfer dynamics of thermoelectric devices is presented. Following a macroscopic approach, the heat flux removed at the cold side of a Peltier cell can be expressed as Q̇c = γ(Tc − Tc,eq), where γ is a coefficient dependent on the electric current, and Tc and Tc,eq are the actual and steady-state cold-side temperatures, respectively. A microscopic modelling approach was also pursued via finite-element analysis software packages. To validate the models, an experimental apparatus was designed and built, consisting of a sample vial with its surfaces in direct contact with Peltier cells. Both modelling approaches led to reliable predictions of the transient and steady-state sample temperature. -- Highlights: • Simplified modelling of heat transfer dynamics in Peltier cells. • Coupled macroscopic and microscopic approach. • Experimental apparatus: temperature control of a sample vial. • Both modelling approaches accurately predict the transient and steady-state sample temperature.
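The macroscopic law Q̇c = γ(Tc − Tc,eq), coupled to the thermal capacity of the cooled sample, gives a first-order relaxation toward the steady-state cold-side temperature. A minimal sketch (γ, C and the temperatures below are assumed illustration values):

```python
# Macroscopic Peltier model: C * dTc/dt = -gamma * (Tc - Tc_eq).
# Parameter values are assumed for illustration, not from the paper.
def simulate_cold_side(T0=298.0, Tc_eq=278.0, gamma=0.8, C=50.0,
                       dt=0.1, t_end=600.0):
    """Integrate the cold-side temperature (K) with forward Euler."""
    T = T0
    for _ in range(int(t_end / dt)):
        Q_c = gamma * (T - Tc_eq)   # heat removed at the cold side, W
        T -= dt * Q_c / C           # sample cools toward Tc_eq
    return T

T_final = simulate_cold_side()
```

With time constant C/γ = 62.5 s, the sample is essentially at Tc,eq after ten time constants.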

  14. Temperature modelling and prediction for activated sludge systems.

    Science.gov (United States)

    Lippi, S; Rosso, D; Lubello, C; Canziani, R; Stenstrom, M K

    2009-01-01

    Temperature is an important factor affecting biomass activity, which is critical to maintaining efficient biological wastewater treatment, and it also affects physicochemical properties of the mixed liquor, such as dissolved oxygen saturation and settling velocity. Controlling temperature is not normally possible for treatment systems, but incorporating factors that impact temperature into the design process, such as the aeration system, surface-to-volume ratio, and tank geometry, can reduce the range of temperature extremes and improve overall process performance. Determining how much these design or upgrade options affect tank temperature requires a temperature model that can be used with existing design methodologies. This paper presents a new steady-state temperature model developed by incorporating the best aspects of previously published models, introducing new functions for selected heat-exchange paths and improving the method for predicting the effects of covering aeration tanks. Numerical improvements with embedded reference data provide simpler formulation, faster execution, and easier sensitivity analyses using an ordinary spreadsheet. The paper presents several cases to validate the model.
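A steady-state tank temperature follows from setting the net heat flux to zero; the sketch below solves such a balance by bisection, with placeholder flux terms and coefficients rather than the published model's formulations:

```python
# Steady-state aeration-tank heat balance solved by bisection.
# Flux terms and coefficients are simplified placeholders (assumed).
def net_heat_flux(T, T_air=283.0, T_inf=278.0, Q_solar=150.0,
                  h_surf=25.0, h_aer=40.0, Q_bio=30.0):
    """Net heat gain (W/m^2) of the mixed liquor at temperature T (K)."""
    gain = Q_solar + Q_bio            # solar plus biological heat input
    loss = h_surf * (T - T_air)       # surface convection/evaporation
    loss += h_aer * (T - T_inf)       # aeration (stripping) heat loss
    return gain - loss

def solve_tank_temperature(lo=270.0, hi=320.0, tol=1e-6):
    """Find T where the net flux crosses zero (flux decreases with T)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if net_heat_flux(mid) > 0:
            lo = mid                  # still gaining heat: T must be higher
        else:
            hi = mid
    return 0.5 * (lo + hi)

T_ss = solve_tank_temperature()
```

Because each flux term is monotone in T, bisection converges to the unique balance temperature.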

  15. Better temperature predictions in geothermal modelling by improved quality of input parameters: a regional case study from the Danish-German border region

    Science.gov (United States)

    Fuchs, Sven; Bording, Thue S.; Balling, Niels

    2015-04-01

    Thermal modelling is used to examine the subsurface temperature field and geothermal conditions at various scales (e.g. sedimentary basins, deep crust) and in the framework of different problem settings (e.g. scientific or industrial use). In such models, knowledge of rock thermal properties is a prerequisite for the parameterisation of boundary conditions and layer properties. In contrast to hydrogeological groundwater models, where the major rock property (hydraulic conductivity) is generally parameterised with lateral variations within geological layers, thermal models (in particular thermal conductivity, but also radiogenic heat production and specific heat capacity) are in most cases parameterised with constant values for each modelled layer. Moreover, the initial values for such constant thermal parameters are normally obtained from rare core measurements and/or literature values, which raises questions about their representativeness. A few studies have considered lithological composition or well log information, but still keep the layer values constant. In the present thermal-modelling scenario analysis, we demonstrate how the parameter input type (from literature, well logs or lithology) and the parameter input style (constant or laterally varying layer values) affect temperature predictions in sedimentary basins. For this purpose, rock thermal properties are deduced from standard petrophysical well logs and lithological descriptions for several wells in a project area. Statistical values of thermal properties (mean, standard deviation, moments, etc.) are calculated at each borehole location for each geological formation and, moreover, for the entire dataset. Our case study is located at the Danish-German border region (model dimensions: 135 × 115 km, depth: 20 km). Results clearly show that (i) the use of location-specific well-log derived rock thermal properties and (i

  16. Estimation of the non records logs from existing logs using artificial neural networks

    Directory of Open Access Journals (Sweden)

    Mehdi Mohammad Salehi

    2017-12-01

    Extracting information about hydrocarbon reservoirs from well logs is one of the main objectives of reservoir engineers, but missing log records (due to broken instruments, unsuitable borehole conditions, etc.) are a major challenge. The main purpose of this study is the prediction of the Rt, DT and LLS logs from conventional wireline logs in one of the Iranian southwest oil fields. A multilayer neural network was applied to develop an intelligent predictive model for the logs. A total of 3000 data sets from 3 wells (A, B and C) of the studied field were used to construct and test the model. To evaluate its performance, the mean squared error (MSE) and the correlation coefficient (R²) on the test data were calculated. A comparison between the MSE of the proposed model and recent intelligent models shows that the proposed model is more accurate. Acceptable accuracy and the use of conventional well logging data are the highlighted advantages of the proposed model.
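A useful baseline for this kind of log reconstruction is multiple linear regression: fit the missing log against the available logs on training wells, then predict on a held-out well. The sketch below uses synthetic data with illustrative log names and coefficients:

```python
import numpy as np

# Reconstructing a "missing" sonic log from other logs by least squares --
# a linear baseline for the neural-network approach (synthetic data;
# log names and the generating relation are illustrative).
rng = np.random.default_rng(4)
n = 300
gr = rng.normal(60, 15, n)          # gamma ray, API units
nphi = rng.normal(0.2, 0.05, n)     # neutron porosity, v/v
rhob = rng.normal(2.45, 0.1, n)     # bulk density, g/cc
dt = 200 - 0.3 * gr - 120 * nphi + rng.normal(0, 2, n)  # synthetic sonic log

# First 200 samples play the role of training wells, the rest a test well
X = np.column_stack([np.ones(n), gr, nphi, rhob])
train, test = slice(0, 200), slice(200, None)
coef, *_ = np.linalg.lstsq(X[train], dt[train], rcond=None)
pred = X[test] @ coef
r2 = 1 - np.sum((dt[test] - pred) ** 2) / np.sum((dt[test] - dt[test].mean()) ** 2)
```

Comparing such a baseline R² against the network's R² shows how much nonlinearity the ANN actually captures.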

  17. Investigation of approximate models of experimental temperature characteristics of machines

    Science.gov (United States)

    Parfenov, I. V.; Polyakov, A. N.

    2018-05-01

    This work investigates various approaches to the approximation of experimental data and the creation of simulation mathematical models of thermal processes in machine tools, with the aim of reducing the duration of their field tests and the thermally induced machining error. The main research methods used in this work are: full-scale thermal testing of machine tools; various approaches to approximating the experimental temperature characteristics of machine tools by polynomial models; and analysis and evaluation of the modelling results (model quality) for the temperature characteristics of machines and their time derivatives up to third order. As a result of the performed research, rational methods, types, parameters and complexity of simulation mathematical models of thermal processes in machine tools are proposed.

  18. Temperature shifts in the Sinai model: static and dynamical effects

    International Nuclear Information System (INIS)

    Sales, Marta; Bouchaud, Jean-Philippe; Ritort, Felix

    2003-01-01

    We study analytically and numerically the role of temperature shifts in the simplest model where the energy landscape is explicitly hierarchical, namely the Sinai model. This model has attractive features (there are valleys within valleys in a strict self-similar sense) but also one important drawback: there is no phase transition, so that the model is, in the large-size limit, effectively at zero temperature. We compute various static chaos indicators, which are found to be trivial in the large-size limit but exhibit interesting features for finite sizes. Correspondingly, for finite times, some interesting rejuvenation effects, related to the self-similar nature of the potential, are observed. Still, the separation of time scales/length scales with temperature in this model is much weaker than in experimental spin glasses.

  19. A new weighted mean temperature model in China

    Science.gov (United States)

    Liu, Jinghong; Yao, Yibin; Sang, Jizhang

    2018-01-01

    The Global Positioning System (GPS) has been applied in meteorology to monitor changes of Precipitable Water Vapor (PWV) in the atmosphere, obtained by transforming the Zenith Wet Delay (ZWD). A key factor in converting ZWD into PWV is the weighted mean temperature (Tm), which has a direct impact on the accuracy of the transformation. A number of Bevis-type models, such as Tm-Ts and Tm-(Ts, Ps) models, have been developed by statistical approaches, but they are not able to clearly depict the relationship between Tm and the surface temperature Ts. A new model for Tm, called the weighted mean temperature norm model (abbreviated as the norm model), is derived as a function of Ts, the temperature lapse rate δ, the tropopause height htrop, and the radiosonde station height hs. It is found that Tm is better related to Ts through an intermediate temperature. The small effects of the lapse rate can be ignored and the tropopause height can be obtained from an empirical model, reducing the norm model to a simplified form that loses little accuracy and needs only two inputs, Ts and hs. In site-specific fittings, the norm model performs much better, with RMS values reduced on average by 0.45 K and the Mean of Absolute Differences (MAD) values by 0.2 K. The norm model is also found more appropriate than linear models for fitting Tm over a large area: the RMS value is reduced from 4.3 K to 3.80 K, the correlation coefficient R² increased from 0.84 to 0.88, the MAD decreased from 3.24 K to 2.90 K, and the distribution of simplified model values is more reasonable. The RMS and MAD values of the differences between reference and computed PWVs are reduced on average by 16.3% and 14.27%, respectively, when using the new norm model instead of the linear model.
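The ZWD-to-PWV conversion that motivates a Tm model can be sketched as follows; here the classic Bevis et al. (1992) regression Tm = 70.2 + 0.72·Ts stands in for the norm model, and the refractivity constants are standard literature values:

```python
# ZWD -> PWV conversion using a Bevis-type weighted mean temperature model.
# The linear Tm(Ts) regression is a stand-in for the paper's norm model;
# k2', k3, Rv and rho_w are standard literature values.
def pwv_from_zwd(zwd_m, Ts_K):
    """Convert zenith wet delay (m) to precipitable water vapor (m)."""
    Tm = 70.2 + 0.72 * Ts_K          # weighted mean temperature, K (Bevis-type)
    k2_prime = 22.1e-2               # K/Pa  (22.1 K/hPa)
    k3 = 3.739e3                     # K^2/Pa (3.739e5 K^2/hPa)
    Rv = 461.5                       # J/(kg K), specific gas constant, water vapor
    rho_w = 1000.0                   # kg/m^3, density of liquid water
    Pi = 1e6 / (rho_w * Rv * (k3 / Tm + k2_prime))  # dimensionless, ~0.15
    return Pi * zwd_m

pwv_mm = 1000 * pwv_from_zwd(0.20, 290.0)   # 20 cm of ZWD at Ts = 290 K
```

The dimensionless conversion factor Π is near 0.15-0.16, which is why PWV is roughly one sixth of the ZWD; errors in Tm propagate directly into Π.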

  20. Hourly predictive Levenberg-Marquardt ANN and multi linear regression models for predicting of dew point temperature

    Science.gov (United States)

    Zounemat-Kermani, Mohammad

    2012-08-01

    In this study, the ability of two models, multiple linear regression (MLR) and a Levenberg-Marquardt (LM) feed-forward neural network, to estimate the hourly dew point temperature was examined. The dew point temperature is the temperature at which water vapor in the air condenses into liquid. It is useful in estimating meteorological variables such as fog, rain, snow, dew, and evapotranspiration, and in investigating agronomical issues such as stomatal closure in plants. The availability of hourly records of climatic data (air temperature, relative humidity and pressure) which could be used to predict the dew point temperature initiated this modelling exercise. Additionally, the wind vector (wind speed magnitude and direction) and a conceptual input of weather condition were employed as further input variables. Three quantitative standard statistical performance measures, i.e. the root mean squared error, the mean absolute error, and the absolute logarithmic Nash-Sutcliffe efficiency coefficient (|Log(NS)|), were employed to evaluate the performance of the developed models. The results showed that applying the wind vector and weather condition as inputs along with the meteorological variables could slightly increase the predictive accuracy of the ANN and MLR models. The results also revealed that LM-NN was superior to the MLR model, and the best performance was obtained by considering all potential input variables, in terms of the different evaluation criteria.
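A common analytic baseline for dew point from temperature and relative humidity, against which such data-driven predictors can be compared, is the Magnus formula (the constants a, b below are the widely used Magnus coefficients):

```python
import math

# Magnus-formula dew point -- an analytic baseline, not the paper's model.
def dew_point(T_c, rh_pct, a=17.27, b=237.7):
    """Dew point (deg C) from air temperature (deg C) and relative humidity (%)."""
    gamma = math.log(rh_pct / 100.0) + a * T_c / (b + T_c)
    return b * gamma / (a - gamma)

td = dew_point(20.0, 50.0)   # ~9.3 deg C
```

At 100 % relative humidity the formula returns the air temperature exactly, which is a handy sanity check.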

  1. Nova Event Logging System

    International Nuclear Information System (INIS)

    Calliger, R.J.; Suski, G.J.

    1981-01-01

    Nova is a 200 terawatt, 10-beam high-energy glass laser currently under construction at LLNL. This facility, designed to demonstrate the feasibility of laser-driven inertial confinement fusion, contains over 5000 elements requiring coordinated control, data acquisition, and analysis functions. The large amounts of data that will be generated must be maintained over the life of the facility. Often the most useful but least accessible data is that related to time-dependent events associated with, for example, operator actions or experiment activity. We have developed an Event Logging System to synchronously record, maintain, and analyze, in part, this data. We see the system as being particularly useful to the physics and engineering staffs of medium and large facilities in that it is entirely separate from experimental apparatus and control devices. The design criteria, implementation, use, and benefits of such a system are discussed.

  2. Modelling the effect of temperature on seed germination in some ...

    African Journals Online (AJOL)

    The prediction of germination percentage (GP) and germination speed (GS) of the seeds of some cucurbits (watermelon, melon, cucumber, summer squash, pumpkin and winter squash) was investigated by a mathematical model based on temperature. The model, D = [a − (b × T) + (c × T²)] of Uzun et al. (2001), was adapted ...

  3. Geomicrobial Optical Logging Detectors (GOLD)

    Science.gov (United States)

    Bramall, N. E.; Stoker, C. R.; Price, P. B.; Coates, J. D.; Allamandola, L. J.; Mattioda, A. L.

    2008-12-01

    We will present concepts for downhole instrumentation that could be used in the Deep Underground Science and Engineering Laboratory (DUSEL). We envision optical borehole-logging instruments that could monitor bacterial concentration, mineralogy, aromatic organics, temperature and oxygen concentration, allowing for the in situ monitoring of time-dependent microbial and short-scale geologic processes and provide valuable in situ data on stratigraphy to supplement core analyses, especially where instances of missing or damaged core sections make such studies difficult. Incorporated into these instruments will be a sampling/inoculation tool to allow for the recovery and/or manipulation of particularly interesting sections of the borehole wall for further study, enabling a series of microbiological studies. The borehole tools we will develop revolve around key emerging technologies and methods, some of which are briefly described below: 1) Autofluorescence Spectroscopy: Building on past instruments, we will develop a new borehole logger that searches for microbial life and organics using fluorescence spectroscopy. Many important organic compounds (e.g. PAHs) and biomolecules (e.g. aromatic amino acids, proteins, methanogenic coenzymes) fluoresce when excited with ultraviolet and visible light. Through the careful selection of excitation wavelength(s) and temporal gating parameters, a borehole logging instrument can detect and differentiate between these different compounds and the mineral matrix in which they exist. 2) Raman Spectroscopy: Though less sensitive than fluorescence spectroscopy, Raman spectroscopy is more definitive: it can provide important mineral phase distribution/proportions and other chemical data enabling studies of mineralogy and microbe-mineral interactions (when combined with fluorescence). 
3) Borehole Camera: Imaging of the borehole wall with extended information in the UV, visible, and NIR for a more informative view can provide a lot of insight

  4. Mathematical modelling of steam generator and design of temperature regulator

    Energy Technology Data Exchange (ETDEWEB)

    Bogdanovic, S.S. [EE Institute Nikola Tesla, Belgrade (Yugoslavia)

    1999-07-01

    The paper considers the mathematical modelling of a once-through power station boiler and a numerical algorithm for simulation of the model. A fast and numerically stable algorithm based on the linearisation of the model equations and on the simultaneous solving of differential and algebraic equations is proposed. The paper also presents the design of a steam temperature regulator using the method of projective controls. The dynamic behaviour of the system closed with an optimal linear quadratic regulator is taken as the reference system. The desired properties of the reference system are retained and solutions for the superheated steam temperature regulator are determined. (author)

  5. A model of the ground surface temperature for micrometeorological analysis

    Science.gov (United States)

    Leaf, Julian S.; Erell, Evyatar

    2017-07-01

    Micrometeorological models at various scales require the ground surface temperature, which may not always be measured in sufficient spatial or temporal detail. There is thus a need for a model that can calculate the surface temperature using only widely available weather data, the thermal properties of the ground, and surface properties. The vegetated/permeable surface energy balance (VP-SEB) model introduced here requires no a priori knowledge of soil temperature or moisture at any depth. It combines a two-layer characterization of the soil column following the heat conservation law with a sinusoidal function to estimate the deep soil temperature, and a simplified procedure for calculating moisture content. A physically based solution is used for each of the energy balance components, making VP-SEB highly portable. VP-SEB was tested using field data from bare loess desert soil in dry weather and following rain events. Modeled hourly surface temperature correlated well with the measured data (r² = 0.95 for a whole year), with a root-mean-square error of 2.77 K. The model was used to generate input for a pedestrian thermal comfort study using the Index of Thermal Stress (ITS). The simulation shows that the thermal stress on a pedestrian standing in the sun on a fully paved surface, which may be over 500 W on a warm summer day, may be as much as 100 W lower on a grass surface exposed to the same meteorological conditions.
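The sinusoidal deep-soil term mentioned above is the textbook conduction solution for a periodic surface temperature wave: the amplitude decays exponentially with depth and the phase lags by the same factor. A sketch with illustrative parameter values (not the authors' exact formulation):

```python
import math

# Damped sinusoidal soil temperature for a periodic surface wave
# (classic conduction solution; parameter values are illustrative).
def soil_temperature(z, t_days, T_mean=288.0, A0=10.0,
                     period_days=365.0, alpha=5e-7):
    """Temperature (K) at depth z (m) and time t for an annual surface wave."""
    omega = 2 * math.pi / (period_days * 86400.0)
    d = math.sqrt(2 * alpha / omega)      # damping depth, m (~2.2 m here)
    return T_mean + A0 * math.exp(-z / d) * math.sin(
        omega * t_days * 86400.0 - z / d)

# Amplitude at the surface vs. at 3 m depth, a quarter year after t = 0
amp_surface = soil_temperature(0.0, 365.0 / 4) - 288.0
amp_3m = soil_temperature(3.0, 365.0 / 4) - 288.0
```

For this thermal diffusivity the damping depth is about 2.2 m, so the annual wave is strongly attenuated and phase-shifted a few metres down, which is why a fixed deep-soil sinusoid is a workable boundary condition.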

  6. Incorporation of the equilibrium temperature approach in a Soil and Water Assessment Tool hydroclimatological stream temperature model

    Science.gov (United States)

    Du, Xinzhong; Shrestha, Narayan Kumar; Ficklin, Darren L.; Wang, Junye

    2018-04-01

    Stream temperature is an important indicator for biodiversity and sustainability in aquatic ecosystems. The stream temperature model currently in the Soil and Water Assessment Tool (SWAT) only considers the impact of air temperature on stream temperature, while the hydroclimatological stream temperature model developed within the SWAT model considers hydrology and the impact of air temperature in simulating the water-air heat transfer process. In this study, we modified the hydroclimatological model by including the equilibrium temperature approach to model heat transfer processes at the water-air interface, which reflects the influences of air temperature, solar radiation, wind speed and streamflow conditions on the heat transfer process. The thermal capacity of the streamflow is modeled by the variation of the stream water depth. An advantage of this equilibrium temperature model is the simple parameterization, with only two parameters added to model the heat transfer processes. The equilibrium temperature model proposed in this study is applied and tested in the Athabasca River basin (ARB) in Alberta, Canada. The model is calibrated and validated at five stations throughout different parts of the ARB, where close to monthly samplings of stream temperatures are available. The results indicate that the equilibrium temperature model proposed in this study provided better and more consistent performance for the different regions of the ARB, with values of the Nash-Sutcliffe Efficiency coefficient (NSE) greater than those of the original SWAT model and the hydroclimatological model. To test the model performance for different hydrological and environmental conditions, the equilibrium temperature model was also applied to the North Fork Tolt River Watershed in Washington, United States. The results indicate a reasonable simulation of stream temperature using the model proposed in this study, with minimum relative error values compared to the other two models.
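
    The equilibrium temperature approach described above relaxes stream temperature toward an equilibrium value at a rate set by an exchange coefficient and the water depth (the thermal capacity term). A minimal explicit-Euler step, with placeholder coefficient values rather than the SWAT parameterization:

```python
def step_stream_temperature(t_water, t_equilibrium, k_exchange, depth,
                            dt_seconds, rho=1000.0, cp=4184.0):
    """One explicit-Euler step of dT/dt = K*(Te - T) / (rho * cp * d).

    k_exchange: bulk heat exchange coefficient (W m-2 K-1), illustrative;
    depth: stream water depth (m), which sets the thermal capacity.
    """
    dT_dt = k_exchange * (t_equilibrium - t_water) / (rho * cp * depth)
    return t_water + dT_dt * dt_seconds
```

Deeper water (larger `depth`) damps the response, which is how streamflow conditions enter the heat transfer in this formulation.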

  7. Temperature response functions introduce high uncertainty in modelled carbon stocks in cold temperature regimes

    Science.gov (United States)

    Portner, H.; Wolf, A.; Bugmann, H.

    2009-04-01

    Many biogeochemical models have been applied to study the response of the carbon cycle to changes in climate, whereby the process of carbon uptake (photosynthesis) has usually gained more attention than the equally important process of carbon release by respiration. The decomposition of soil organic matter is driven by a combination of factors, with a prominent one being soil temperature [Berg and Laskowski(2005)]. One uncertainty concerns the response function used to describe the sensitivity of soil organic matter decomposition to temperature. This relationship is often described by one out of a set of similar exponential functions, but it has not been investigated how uncertainties in the choice of the response function influence the long-term predictions of biogeochemical models. We built upon the well-established LPJ-GUESS model [Smith et al.(2001)]. We tested five candidate functions and calibrated them against eight datasets from different Ameriflux and CarboEuropeIP sites [Hibbard et al.(2006)]. We used a simple Exponential function with a constant Q10, the Arrhenius function, the Gaussian function [Tuomi et al.(2008), O'Connell(1990)], the Van't Hoff function [Van't Hoff(1901)] and the Lloyd&Taylor function [Lloyd and Taylor(1994)]. We assessed the impact of uncertainty in model formulation of temperature response on estimates of present and future long-term carbon storage in ecosystems and hence on the CO2 feedback potential to the atmosphere. We specifically investigated the relative importance of model formulation and the error introduced by using different data sets for the parameterization. Our results suggested that the Exponential and Arrhenius functions are inappropriate, as they overestimated the respiration rates at lower temperatures. The Gaussian, Van't Hoff and Lloyd&Taylor functions all fit the observed data better, whereby the Gaussian and Van't Hoff functions underestimated the response at higher temperatures. We suggest that the
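
    Three of the candidate response functions named in the abstract can be written down directly. The parameter values below (Q10 = 2, an illustrative activation energy, and the E0/T0 constants from Lloyd & Taylor 1994) are generic defaults, not the values calibrated in this study; all are normalised to a base rate at 10 °C:

```python
import math

R_GAS = 8.314  # universal gas constant, J mol-1 K-1

def q10_exponential(t_c, r10=1.0, q10=2.0):
    """Exponential response with a constant Q10, referenced to 10 degC."""
    return r10 * q10 ** ((t_c - 10.0) / 10.0)

def arrhenius(t_c, r10=1.0, ea=60000.0):
    """Arrhenius response normalised to r10 at 10 degC (ea in J/mol, illustrative)."""
    return r10 * math.exp((ea / R_GAS) * (1.0 / 283.15 - 1.0 / (t_c + 273.15)))

def lloyd_taylor(t_c, r10=1.0, e0=308.56, t0=227.13):
    """Lloyd & Taylor (1994) response; e0 and t0 are in kelvin."""
    tk = t_c + 273.15
    return r10 * math.exp(e0 * (1.0 / (283.15 - t0) - 1.0 / (tk - t0)))
```

Plotting these over, say, -5 to 30 °C makes the abstract's point visible: the constant-Q10 and Arrhenius forms stay relatively high at low temperatures, while the Lloyd&Taylor form drops off more steeply there.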

  8. Heat Transfer Modeling for Rigid High-Temperature Fibrous Insulation

    Science.gov (United States)

    Daryabeigi, Kamran; Cunnington, George R.; Knutson, Jeffrey R.

    2012-01-01

    Combined radiation and conduction heat transfer through a high-temperature, high-porosity, rigid multiple-fiber fibrous insulation was modeled using a thermal model previously used to model heat transfer in flexible single-fiber fibrous insulation. The rigid insulation studied was alumina enhanced thermal barrier (AETB) at densities between 130 and 260 kilograms per cubic meter. The model consists of using the diffusion approximation for radiation heat transfer, a semi-empirical solid conduction model, and a standard gas conduction model. The relevant parameters needed for the heat transfer model were estimated from steady-state thermal measurements in nitrogen gas at various temperatures and environmental pressures. The heat transfer modeling methodology was evaluated by comparison with standard thermal conductivity measurements, and steady-state thermal measurements in helium and carbon dioxide gases. The heat transfer model is applicable over the temperature range of 300 to 1360 K, pressure range of 0.133 to 101.3 × 10³ Pa, and over the insulation density range of 130 to 260 kilograms per cubic meter in various gaseous environments.
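
    The model structure described (diffusion approximation for radiation, plus solid and gas conduction) can be sketched as an effective-conductivity sum. The Rosseland-style radiative term and the specific-extinction parameterization below are a common textbook form under gray-medium assumptions, not necessarily the exact semi-empirical model used in the study:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m-2 K-4

def effective_conductivity(temp_k, k_solid, k_gas,
                           specific_extinction, density):
    """Effective conductivity = solid + gas + radiative (diffusion approx.).

    Radiative term: k_rad = 16*sigma*T^3 / (3*beta), with the extinction
    coefficient beta modeled as specific_extinction (m2/kg) * density (kg/m3).
    All numeric inputs are assumptions for illustration.
    """
    beta = specific_extinction * density
    k_rad = 16.0 * SIGMA * temp_k ** 3 / (3.0 * beta)
    return k_solid + k_gas + k_rad
```

The T³ dependence of the radiative term is why such insulations lose effectiveness rapidly at the upper end of the 300-1360 K range, and why denser (higher-extinction) material suppresses the radiative contribution.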

  9. Statistical Downscaling of Temperature with the Random Forest Model

    Directory of Open Access Journals (Sweden)

    Bo Pang

    2017-01-01

    Full Text Available The issues with downscaling the outputs of a global climate model (GCM to a regional scale that are appropriate to hydrological impact studies are investigated using the random forest (RF model, which has been shown to be superior for large dataset analysis and variable importance evaluation. The RF is proposed for downscaling daily mean temperature in the Pearl River basin in southern China. Four downscaling models were developed and validated by using the observed temperature series from 61 national stations and large-scale predictor variables derived from the National Center for Environmental Prediction–National Center for Atmospheric Research reanalysis dataset. The proposed RF downscaling model was compared to multiple linear regression, artificial neural network, and support vector machine models. Principal component analysis (PCA and partial correlation analysis (PAR were used in the predictor selection for the other models for a comprehensive study. It was shown that the model efficiency of the RF model was higher than that of the other models according to five selected criteria. By evaluating the predictor importance, the RF could choose the best predictor combination without using PCA and PAR. The results indicate that the RF is a feasible tool for the statistical downscaling of temperature.
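
    A minimal scikit-learn sketch of the RF downscaling step, using impurity-based feature importances for predictor screening as the abstract describes. The function name and the synthetic predictors in the usage below are illustrative, not the study's NCEP-NCAR predictor set:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def downscale_temperature(predictors, temperature, n_trees=200, seed=0):
    """Fit a random forest mapping large-scale predictors to station
    temperature; returns the model and its feature importances, which
    can replace PCA/partial-correlation predictor selection."""
    rf = RandomForestRegressor(n_estimators=n_trees, random_state=seed)
    rf.fit(predictors, temperature)
    return rf, rf.feature_importances_
```

Because the importances rank predictors directly, the RF variant needs no separate PCA or partial-correlation step, which is the workflow simplification the abstract reports.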

  10. Genetic Programming and Standardization in Water Temperature Modelling

    Directory of Open Access Journals (Sweden)

    Maritza Arganis

    2009-01-01

    Full Text Available An application of Genetic Programming (an evolutionary computational tool), without and with data standardization, is presented with the aim of modeling the behavior of the water temperature in a river in terms of meteorological variables that are easily measured, to explore their explanatory power and to emphasize the utility of the standardization of variables in order to reduce the effect of those with large variance. Recorded data corresponding to the water temperature behavior at the Ebro River, Spain, are used as the analysis case, showing a performance improvement on the developed model when data are standardized. This improvement is reflected in a reduction of the mean square error. Finally, the models obtained in this document were applied to estimate the water temperature in 2004, in order to provide evidence about their applicability for forecasting purposes.
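
    The standardization step whose benefit the study reports is plain z-scoring of each variable before the Genetic Programming run; a minimal NumPy version:

```python
import numpy as np

def standardize(x):
    """Z-score standardization: zero mean and unit variance per column,
    so high-variance meteorological variables no longer dominate the fit."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean(axis=0)) / x.std(axis=0)
```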

  11. Understanding and quantifying foliar temperature acclimation for Earth System Models

    Science.gov (United States)

    Smith, N. G.; Dukes, J.

    2015-12-01

    Photosynthesis and respiration on land are the two largest carbon fluxes between the atmosphere and Earth's surface. The parameterization of these processes represent major uncertainties in the terrestrial component of the Earth System Models used to project future climate change. Research has shown that much of this uncertainty is due to the parameterization of the temperature responses of leaf photosynthesis and autotrophic respiration, which are typically based on short-term empirical responses. Here, we show that including longer-term responses to temperature, such as temperature acclimation, can help to reduce this uncertainty and improve model performance, leading to drastic changes in future land-atmosphere carbon feedbacks across multiple models. However, these acclimation formulations have many flaws, including an underrepresentation of many important global flora. In addition, these parameterizations were done using multiple studies that employed differing methodology. As such, we used a consistent methodology to quantify the short- and long-term temperature responses of maximum Rubisco carboxylation (Vcmax), maximum rate of ribulose-1,5-bisphosphate regeneration (Jmax), and dark respiration (Rd) in multiple species representing each of the plant functional types used in global-scale land surface models. Short-term temperature responses of each process were measured in individuals acclimated for 7 days at one of 5 temperatures (15-35°C). The comparison of short-term curves in plants acclimated to different temperatures was used to evaluate long-term responses. Our analyses indicated that the instantaneous response of each parameter was highly sensitive to the temperature at which they were acclimated. However, we found that this sensitivity was larger in species whose leaves typically experience a greater range of temperatures over the course of their lifespan.
These data indicate that models using previous acclimation formulations are likely incorrectly

  12. Design and modeling of low temperature solar thermal power station

    International Nuclear Information System (INIS)

    Shankar Ganesh, N.; Srinivas, T.

    2012-01-01

    Highlights: ► The optimum conditions are different for efficiency and power conditions. ► The current model works up to a maximum separator temperature of 150 °C. ► The turbine concentration influences the high pressure. ► High solar beam radiation and optimized cycle conditions give low collector cost. -- Abstract: During the heat recovery in a Kalina cycle, a binary aqua-ammonia mixture changes its state from liquid to vapor; the more volatile ammonia vaporizes first and then the water starts vaporizing to match the temperature profile of the hot fluid. In the present work, a low temperature Kalina cycle has been investigated to optimize the heat recovery from solar thermal collectors. Hot fluid coming from a solar parabolic trough collector with vacuum tubes is used to generate ammonia-rich vapor in a boiler for power generation. The turbine inlet conditions are optimized to match the variable hot fluid temperature with the intermittent nature of the solar radiation. The key parameters discussed in this study are the strong solution concentration, the separator temperature (which affects the hot fluid inlet temperature) and the turbine ammonia concentration. A solar parabolic collector system with vacuum tubes has been designed at the optimized power plant conditions. This work can be used in the selection of boiler, separator and turbine conditions to maximize the power output as well as the efficiency of the power generation system. The current model yields a maximum separator temperature of 150 °C under Indian climatic conditions. A maximum specific power of 105 kW per kg/s of working fluid can be obtained at 80% strong solution concentration with a 140 °C separator temperature. The corresponding plant and cycle efficiencies are 5.25% and 13% respectively, but maximum efficiencies of 6% and 15% can be obtained respectively for the plant and the Kalina cycle at a separator temperature of 150 °C.

  13. Intelligent approaches for the synthesis of petrophysical logs

    International Nuclear Information System (INIS)

    Rezaee, M Reza; Kadkhodaie-Ilkhchi, Ali; Alizadeh, Pooya Mohammad

    2008-01-01

    Log data are of prime importance in acquiring petrophysical data from hydrocarbon reservoirs. Reliable log analysis in a hydrocarbon reservoir requires a complete set of logs. For many reasons, such as incomplete logging in old wells, destruction of logs due to inappropriate data storage and measurement errors due to problems with logging apparatus or hole conditions, log suites are either incomplete or unreliable. In this study, fuzzy logic and artificial neural networks were used as intelligent tools to synthesize petrophysical logs including neutron, density, sonic and deep resistivity. The petrophysical data from two wells were used for constructing intelligent models in the Fahlian limestone reservoir, Southern Iran. A third well from the field was used to evaluate the reliability of the models. The results showed that fuzzy logic and artificial neural networks were successful in synthesizing wireline logs. The combination of the results obtained from fuzzy logic and neural networks in a simple averaging committee machine (CM) showed a significant improvement in the accuracy of the estimations. This committee machine performed better than fuzzy logic or the neural network model in the problem of estimating petrophysical properties from well logs
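
    The simple averaging committee machine mentioned in the abstract just averages the expert model outputs; a minimal sketch, where the fuzzy-logic and neural-network estimates of a synthesized log would be supplied as the inputs:

```python
def committee_average(*expert_outputs):
    """Simple averaging committee machine: the combined prediction is the
    elementwise mean of the expert predictions (e.g. fuzzy-logic and
    neural-network estimates of a petrophysical log)."""
    n = len(expert_outputs)
    return [sum(values) / n for values in zip(*expert_outputs)]
```

Averaging tends to cancel uncorrelated errors of the individual experts, which is consistent with the reported accuracy gain over either model alone.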

  14. Cloud Impacts on Pavement Temperature in Energy Balance Models

    Science.gov (United States)

    Walker, C. L.

    2013-12-01

    Forecast systems provide decision support for end-users ranging from the solar energy industry to municipalities concerned with road safety. Pavement temperature is an important variable when considering vehicle response to various weather conditions. A complex, yet direct relationship exists between tire and pavement temperatures. Literature has shown that as tire temperature increases, friction decreases which affects vehicle performance. Many forecast systems suffer from inaccurate radiation forecasts resulting in part from the inability to model different types of clouds and their influence on radiation. This research focused on forecast improvement by determining how cloud type impacts the amount of shortwave radiation reaching the surface and subsequent pavement temperatures. The study region was the Great Plains where surface solar radiation data were obtained from the High Plains Regional Climate Center's Automated Weather Data Network stations. Road pavement temperature data were obtained from the Meteorological Assimilation Data Ingest System. Cloud properties and radiative transfer quantities were obtained from the Clouds and Earth's Radiant Energy System mission via Aqua and Terra Moderate Resolution Imaging Spectroradiometer satellite products. An additional cloud data set was incorporated from the Naval Research Laboratory Cloud Classification algorithm. Statistical analyses using a modified nearest neighbor approach were first performed relating shortwave radiation variability with road pavement temperature fluctuations. Then statistical associations were determined between the shortwave radiation and cloud property data sets. Preliminary results suggest that substantial pavement forecasting improvement is possible with the inclusion of cloud-specific information. Future model sensitivity testing seeks to quantify the magnitude of forecast improvement.

  15. Thermal modelling of PV module performance under high ambient temperatures

    Energy Technology Data Exchange (ETDEWEB)

    Diarra, D.C.; Harrison, S.J. [Queen's Univ., Kingston, ON (Canada). Dept. of Mechanical and Materials Engineering Solar Calorimetry Lab; Akuffo, F.O. [Kwame Nkrumah Univ. of Science and Technology, Kumasi (Ghana). Dept. of Mechanical Engineering

    2005-07-01

    When predicting the performance of photovoltaic (PV) generators, the actual performance is typically lower than test results conducted under standard test conditions because the radiant energy absorbed in the module under normal operation raises the temperature of the cell and other multilayer components. The increase in temperature translates to a lower conversion efficiency of the solar cells. In order to address these discrepancies, a thermal model of a characteristic PV module was developed to assess and predict its performance under real field conditions. The PV module consisted of monocrystalline silicon cells in EVA between a glass cover and a tedlar backing sheet. The EES program was used to compute the equilibrium temperature profile in the PV module. It was shown that heat is dissipated towards the bottom and the top of the module, and that its temperature can be much higher than the ambient temperature. Modelling results indicate that 70-75 per cent of the absorbed solar radiation is dissipated from the solar cells as heat, while 4.7 per cent of the solar energy is absorbed in the glass cover and the EVA. It was also shown that the operating temperature of the PV module decreases with increased wind speed. 2 refs.

  16. Logging Concessions Enable Illegal Logging Crisis in the Peruvian Amazon

    Science.gov (United States)

    Finer, Matt; Jenkins, Clinton N.; Sky, Melissa A. Blue; Pine, Justin

    2014-04-01

    The Peruvian Amazon is an important arena in global efforts to promote sustainable logging in the tropics. Despite recent efforts to achieve sustainability, such as provisions in the US-Peru Trade Promotion Agreement, illegal logging continues to plague the region. We present evidence that Peru's legal logging concession system is enabling widespread illegal logging via the regulatory documents designed to ensure sustainable logging. Analyzing official government data, we found that 68.3% of all concessions supervised by authorities were suspected of major violations. Of the 609 total concessions, nearly 30% have been cancelled for violations and we expect this percentage to increase as investigations continue. Moreover, the nature of the violations indicates that the permits associated with legal concessions are used to harvest trees in unauthorized areas, thus threatening all forested areas. Many of the violations pertain to the illegal extraction of CITES-listed timber species outside authorized areas. These findings highlight the need for additional reforms.

  18. Modeling Apple Surface Temperature Dynamics Based on Weather Data

    Directory of Open Access Journals (Sweden)

    Lei Li

    2014-10-01

    Full Text Available The exposure of fruit surfaces to direct sunlight during the summer months can result in sunburn damage. Losses due to sunburn damage are a major economic problem when marketing fresh apples. The objective of this study was to develop and validate a model for simulating fruit surface temperature (FST) dynamics based on energy balance and measured weather data. A series of weather data (air temperature, humidity, solar radiation, and wind speed) was recorded for seven hours between 11:00 and 18:00 for two months at fifteen-minute intervals. To validate the model, the FSTs of “Fuji” apples were monitored using an infrared camera in a natural orchard environment. The FST dynamics were measured using a series of thermal images. For the apples that were completely exposed to the sun, the RMSE of the model for estimating FST was less than 2.0 °C. A sensitivity analysis of the emissivity of the apple surface and the conductance of the fruit surface to water vapour showed that accurate estimations of the apple surface emissivity were important for the model. The validation results showed that the model was capable of accurately describing the thermal performances of apples under different solar radiation intensities. Thus, this model could be used to more accurately estimate the FST relative to estimates that only consider the air temperature. In addition, this model provides useful information for sunburn protection management.
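
    A steady-state energy-balance solve of the kind the abstract describes can be sketched with Newton iteration: absorbed shortwave is balanced against net longwave and convective losses. The absorptivity, emissivity and convective coefficient below are hypothetical round numbers, not the study's calibrated values, and transpiration is omitted:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m-2 K-4

def fruit_surface_temperature(t_air_k, solar_wm2, absorptivity=0.6,
                              emissivity=0.95, h_conv=15.0, iters=50):
    """Solve a*S = eps*sigma*(Ts^4 - Ta^4) + h*(Ts - Ta) for the surface
    temperature Ts (K) by Newton iteration; coefficients are illustrative."""
    ts = t_air_k  # start from air temperature
    for _ in range(iters):
        f = (absorptivity * solar_wm2
             - emissivity * SIGMA * (ts ** 4 - t_air_k ** 4)
             - h_conv * (ts - t_air_k))
        df = -4.0 * emissivity * SIGMA * ts ** 3 - h_conv
        ts -= f / df
    return ts
```

Under strong radiation the solved FST sits well above air temperature, which is why air temperature alone underestimates sunburn risk.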

  20. A complex autoregressive model and application to monthly temperature forecasts

    Directory of Open Access Journals (Sweden)

    X. Gu

    2005-11-01

    Full Text Available A complex autoregressive model was established based on a mathematical derivation of least squares in the complex number domain, referred to as the complex least squares. The model differs from the conventional approach, in which the real and imaginary parts are calculated separately. An application of this new model shows a better forecast than those from other conventional statistical models in predicting monthly temperature anomalies in July at 160 meteorological stations in mainland China. The conventional statistical models include an autoregressive model in which the real and imaginary parts are treated separately, an autoregressive model in the real number domain, and a persistence-forecast model.
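
    The core idea, solving least squares directly in the complex domain rather than splitting real and imaginary parts, can be sketched with NumPy, whose `lstsq` handles complex-valued design matrices natively. The function names are illustrative:

```python
import numpy as np

def fit_complex_ar(series, order):
    """Least-squares AR(p) fit performed directly in the complex domain:
    real and imaginary parts are estimated jointly, not separately."""
    z = np.asarray(series, dtype=complex)
    X = np.array([z[i:i + order] for i in range(len(z) - order)])
    y = z[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def predict_next(series, coef):
    """One-step-ahead forecast from the last `order` complex values."""
    order = len(coef)
    return complex(np.dot(np.asarray(series[-order:], dtype=complex), coef))
```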

  1. Can spatial statistical river temperature models be transferred between catchments?

    Science.gov (United States)

    Jackson, Faye L.; Fryer, Robert J.; Hannah, David M.; Malcolm, Iain A.

    2017-09-01

    There has been increasing use of spatial statistical models to understand and predict river temperature (Tw) from landscape covariates. However, it is not financially or logistically feasible to monitor all rivers and the transferability of such models has not been explored. This paper uses Tw data from four river catchments collected in August 2015 to assess how well spatial regression models predict the maximum 7-day rolling mean of daily maximum Tw (Twmax) within and between catchments. Models were fitted for each catchment separately using (1) landscape covariates only (LS models) and (2) landscape covariates and an air temperature (Ta) metric (LS_Ta models). All the LS models included upstream catchment area and three included a river network smoother (RNS) that accounted for unexplained spatial structure. The LS models transferred reasonably to other catchments, at least when predicting relative levels of Twmax. However, the predictions were biased when mean Twmax differed between catchments. The RNS was needed to characterise and predict finer-scale spatially correlated variation. Because the RNS was unique to each catchment and thus non-transferable, predictions were better within catchments than between catchments. A single model fitted to all catchments found no interactions between the landscape covariates and catchment, suggesting that the landscape relationships were transferable. The LS_Ta models transferred less well, with particularly poor performance when the relationship with the Ta metric was physically implausible or required extrapolation outside the range of the data. A single model fitted to all catchments found catchment-specific relationships between Twmax and the Ta metric, indicating that the Ta metric was not transferable. 
These findings improve our understanding of the transferability of spatial statistical river temperature models and provide a foundation for developing new approaches for predicting Tw at unmonitored locations across

  2. Temperature driven annealing of perforations in bicellar model membranes.

    Science.gov (United States)

    Nieh, Mu-Ping; Raghunathan, V A; Pabst, Georg; Harroun, Thad; Nagashima, Kazuomi; Morales, Hannah; Katsaras, John; Macdonald, Peter

    2011-04-19

    Bicellar model membranes composed of 1,2-dimyristoylphosphatidylcholine (DMPC) and 1,2-dihexanoylphosphatidylcholine (DHPC), with a DMPC/DHPC molar ratio of 5, and doped with the negatively charged lipid 1,2-dimyristoylphosphatidylglycerol (DMPG), at DMPG/DMPC molar ratios of 0.02 or 0.1, were examined using small angle neutron scattering (SANS), (31)P NMR, and (1)H pulsed field gradient (PFG) diffusion NMR with the goal of understanding temperature effects on the DHPC-dependent perforations in these self-assembled membrane mimetics. Over the temperature range studied via SANS (300-330 K), these bicellar lipid mixtures exhibited a well-ordered lamellar phase. The interlamellar spacing d increased with increasing temperature, in direct contrast to the decrease in d observed upon increasing temperature with otherwise identical lipid mixtures lacking DHPC. (31)P NMR measurements on magnetically aligned bicellar mixtures of identical composition indicated a progressive migration of DHPC from regions of high curvature into planar regions with increasing temperature, in accord with the "mixed bicelle model" (Triba, M. N.; Warschawski, D. E.; Devaux, P. E. Biophys. J. 2005, 88, 1887-1901). Parallel PFG diffusion NMR measurements of transbilayer water diffusion, where the observed diffusion is dependent on the fractional surface area of lamellar perforations, showed that transbilayer water diffusion decreased with increasing temperature. A model is proposed consistent with the SANS, (31)P NMR, and PFG diffusion NMR data, wherein increasing temperature drives the progressive migration of DHPC out of high-curvature regions, consequently decreasing the fractional volume of lamellar perforations, so that water occupying these perforations redistributes into the interlamellar volume, thereby increasing the interlamellar spacing. © 2011 American Chemical Society

  3. Modelling of the high temperature behaviour of metallic materials

    International Nuclear Information System (INIS)

    Mohr, R.

    1999-01-01

    The design of components of metallic high-temperature materials by the finite element method requires the application of phenomenological viscoplastic material models. The route from the choice of a suitable model, through the numerical integration of the equations and the parameter identification, to the design of components is described. The Chaboche model, whose evolution equations are integrated explicitly, is used. The parameters are determined by graphical and numerical methods in order to use the material model for describing the deformation behaviour of a chromium steel and an intermetallic titanium aluminide alloy. (orig.)

  4. Neuro-models for discharge air temperature system

    International Nuclear Information System (INIS)

    Zaheer-uddin, M.; Tudoroiu, N.

    2004-01-01

    Nonlinear neuro-models for a discharge air temperature (DAT) system are developed. Experimental data gathered in a heating ventilating and air conditioning (HVAC) test facility is used to develop multi-input multi-output (MIMO) and single-input single-output (SISO) neuro-models. Several different network architectures were explored to build the models. Results show that a three layer second order neural network structure is necessary to achieve good accuracy of the predictions. Results from the developed models are compared, and some observations on sensitivity and standard deviation errors are presented

  5. Modeling and Forecasting Average Temperature for Weather Derivative Pricing

    Directory of Open Access Journals (Sweden)

    Zhiliang Wang

    2015-01-01

    Full Text Available The main purpose of this paper is to present a feasible model for the daily average temperature in the area of Zhengzhou and apply it to weather derivatives pricing. We start by exploring the background of the weather derivatives market and then use 62 years of daily historical data to fit a mean-reverting Ornstein-Uhlenbeck process describing the evolution of the temperature. Finally, Monte Carlo simulations are used to price a heating degree day (HDD) call option for this city, and the slow convergence of the HDD call price is demonstrated using 100,000 simulations. The methods of the research will provide a framework for modeling temperature and pricing weather derivatives in other similar places in China.
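
    A minimal version of the simulation-and-pricing pipeline described (mean-reverting Ornstein-Uhlenbeck temperature paths, then Monte Carlo averaging of the HDD call payoff). The seasonal mean function, the 18 °C HDD base, and the strike and tick values are illustrative assumptions, not the paper's calibrated Zhengzhou parameters:

```python
import numpy as np

def simulate_ou_temperature(t0, mean_fn, kappa, sigma, n_days, n_paths, seed=0):
    """Euler simulation of dT = kappa*(theta(t) - T) dt + sigma dW, dt = 1 day.
    mean_fn(day) is the (possibly seasonal) long-run mean; returns an
    (n_paths, n_days) array of simulated daily temperatures."""
    rng = np.random.default_rng(seed)
    temps = np.empty((n_paths, n_days))
    t = np.full(n_paths, float(t0))
    for day in range(n_days):
        t = t + kappa * (mean_fn(day) - t) + sigma * rng.standard_normal(n_paths)
        temps[:, day] = t
    return temps

def hdd_call_price(temps, base=18.0, strike=100.0, tick=1.0, discount=1.0):
    """Monte Carlo HDD call: payoff = tick * max(sum(max(base - T, 0)) - K, 0),
    averaged over paths and discounted."""
    hdd = np.maximum(base - temps, 0.0).sum(axis=1)
    payoff = tick * np.maximum(hdd - strike, 0.0)
    return discount * payoff.mean()
```

The slow convergence the abstract mentions is typical of crude Monte Carlo: the standard error of the price estimate shrinks only as the inverse square root of the number of simulated paths.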

  6. Braking System Modeling and Brake Temperature Response to Repeated Cycle

    Directory of Open Access Journals (Sweden)

    Zaini Dalimus

    2014-12-01

    Full Text Available Braking safety is crucial when driving passenger or commercial vehicles. A large amount of kinetic energy is absorbed by the four brakes fitted to the vehicle. If the braking system fails to work, a road accident can happen and may result in death. This research aims to model the braking system together with the vehicle in Matlab/Simulink software and to measure actual brake temperature. First, a brake characteristic and vehicle dynamic model was generated to estimate friction force and dissipated heat. Next, an Arduino-based prototype for brake temperature monitoring was developed and tested on the road. From the experiment, it was found that brake temperature tends to increase steadily over a long repeated deceleration and acceleration cycle.
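
    The energy bookkeeping behind the measured temperature rise can be illustrated with a lumped-mass estimate. The brake thermal mass, the steel specific heat, and the assumption that all kinetic energy goes into the brakes (no aerodynamic or rolling losses, no convective cooling during the stop) are simplifications for illustration:

```python
def brake_temperature_rise(mass_vehicle, v_initial, v_final,
                           mass_brakes, c_brake=460.0,
                           fraction_to_brakes=1.0):
    """Lumped-mass estimate of the per-stop brake temperature rise (K):
    the kinetic energy dissipated, 0.5*m*(v1^2 - v2^2), split across the
    brake assemblies, raises their temperature by dE / (m_brakes * c).
    c_brake defaults to an approximate specific heat of steel (J kg-1 K-1)."""
    d_kinetic = 0.5 * mass_vehicle * (v_initial ** 2 - v_final ** 2)
    return fraction_to_brakes * d_kinetic / (mass_brakes * c_brake)
```

Repeating such stops faster than the brakes can shed heat between them accumulates these increments, which matches the steady temperature climb observed in the road test.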

  7. Information needs for increasing log transport efficiency

    Science.gov (United States)

    Timothy P. McDonald; Steven E. Taylor; Robert B. Rummer; Jorge Valenzuela

    2001-01-01

    Three methods of dispatching trucks to loggers were tested using a log transport simulation model: random allocation, fixed assignment of trucks to loggers, and dispatch based on knowledge of the current status of trucks and loggers within the system. This 'informed' dispatch algorithm attempted to minimize the difference in time between when a logger would...

  8. Mining process performance from event logs

    NARCIS (Netherlands)

    Adriansyah, A.; Buijs, J.C.A.M.; La Rosa, M.; Soffer, P.

    2013-01-01

    In systems where process executions are not strictly enforced by a predefined process model, obtaining reliable performance information is not trivial. In this paper, we analyzed an event log of a real-life process, taken from a Dutch financial institute, using process mining techniques. In

  9. Predicción de quiebras empresariales en economías emergentes: uso de un modelo logístico mixto || Bankruptcy Prediction in Emerging Economies: Use of a Mixed Logistic Model

    Directory of Open Access Journals (Sweden)

    Caro, Norma Patricia

    2013-01-01

    Full Text Available Este trabajo replica y adapta el modelo de Jones y Hensher (2004) a los datos de una economía emergente con el propósito de evaluar su validez externa. Se compara el desempeño del modelo logístico estándar en relación con el modelo logístico mixto para predecir el riesgo de crisis en el periodo 1993-2000, utilizando estados contables de empresas argentinas y ratios definidos en estudios de Altman y Jones y Hensher. Como en estudios anteriores, rentabilidad, rotación, endeudamiento y flujo de fondos operativos explican la probabilidad de crisis financiera. La contribución de esta nueva metodología reduce la tasa de error del tipo I a un 9 %. Se demuestra que el modelo logístico mixto, que tiene en cuenta la heterogeneidad no observada, supera ampliamente el desempeño del modelo logístico estándar. || This study replicates and adapts the Jones and Hensher (2004) model to data from an emerging economy with the purpose of testing its external validity. It compares the performance of the standard logistic model with the mixed logistic model in predicting the bankruptcy risk of Argentinean companies between 1993 and 2000, using financial statements and ratios defined in previous studies by Altman and by Jones and Hensher. As in previous studies, profitability, asset turnover, debt and cash flow from operations explain the probability of financial distress. The main contribution of this new methodology is the important reduction of the type I error rate to 9 %. This study shows that the mixed logistic model, which accounts for unobserved heterogeneity, significantly outperforms the standard logistic model.
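
The baseline of the comparison above, the standard logistic model P(distress) = 1/(1 + exp(-(b0 + b·x))), can be sketched with a plain gradient-ascent fit. The data below are synthetic stand-ins for the financial ratios (profitability, debt); the paper's mixed model would additionally include a firm-level random intercept to capture unobserved heterogeneity, which is not implemented here.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.1, epochs=500):
    """Stochastic gradient ascent on the log-likelihood.
    xs: list of feature lists, ys: 0/1 labels. Returns [b0, b1, ...]."""
    n_feat = len(xs[0])
    w = [0.0] * (n_feat + 1)
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = sigmoid(w[0] + sum(wi * xi for wi, xi in zip(w[1:], x)))
            err = y - p
            w[0] += lr * err
            for j in range(n_feat):
                w[j + 1] += lr * err * x[j]
    return w

# synthetic firms: distressed firms get lower profitability, higher debt
rng = random.Random(0)
xs, ys = [], []
for _ in range(200):
    distressed = rng.random() < 0.5
    profit = rng.gauss(-0.05 if distressed else 0.10, 0.05)
    debt = rng.gauss(0.80 if distressed else 0.40, 0.10)
    xs.append([profit, debt])
    ys.append(1 if distressed else 0)

w = fit_logistic(xs, ys)
accuracy = sum(
    (sigmoid(w[0] + w[1] * x[0] + w[2] * x[1]) > 0.5) == (y == 1)
    for x, y in zip(xs, ys)
) / len(ys)
```

On this synthetic sample the fitted signs match the study's findings: the profitability coefficient is negative and the debt coefficient positive with respect to distress probability.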

  10. The Model of Temperature Dynamics of Pulsed Fuel Assembly

    CERN Document Server

    Bondarchenko, E A; Popov, A K

    2002-01-01

    Heat exchange process differential equations are considered for a subcritical fuel assembly with an injector. The equations are obtained by means of Hermite polynomials. The model is created for modelling temperature transitional processes. The parameters and dynamics are estimated for a hypothetical fuel assembly consisting of real installations: the powerful proton accelerator and the reactor IBR-2 core in its subcritical state.

  11. Analytic regularization of the Yukawa model at finite temperature

    International Nuclear Information System (INIS)

    Malbouisson, A.P.C.; Svaiter, N.F.; Svaiter, B.F.

    1996-07-01

    The one-loop fermionic contribution to the scalar effective potential is analysed in the temperature-dependent Yukawa model. In order to regularize the model, a mix of dimensional and analytic regularization procedures is used. A general expression for the fermionic contribution in arbitrary spacetime dimension is found. It is also found that in D = 3 this contribution is finite. (author). 19 refs

  12. Modeling Silicate Weathering for Elevated CO2 and Temperature

    Science.gov (United States)

    Bolton, E. W.

    2016-12-01

    A reactive transport model (RTM) is used to assess CO2 drawdown by silicate weathering over a wide range of temperature, pCO2, and infiltration rates for basalts and granites. Although RTMs have been used extensively to model weathering of basalts and granites for present-day conditions, we extend such modeling to the higher CO2 levels that could have existed during the Archean and Proterozoic. We also consider a wide range of surface temperatures and infiltration rates, and several model basalt and granite compositions. We normally impose CO2 in equilibrium with the various atmospheric ranges modeled, with CO2 delivered to the weathering zone by aqueous transport. We also consider models with fixed CO2 (aq) throughout the weathering zone, as could occur in soils with partial water saturation or with plant respiration, which can strongly influence pH and mineral dissolution rates. For the modeling, we use Kinflow: a model developed at Yale that includes mineral dissolution and precipitation under kinetic control, aqueous speciation, surface erosion, dynamic porosity, permeability, mineral surface areas via sub-grid-scale grain models, and exchange of volatiles at the surface. Most of the modeling is done in 1D, but some comparisons to 2D domains with heterogeneous permeability are made. We find that when CO2 is fixed only at the surface, the pH tends toward higher values for basalts than granites, in large part due to the presence of more divalent than monovalent cations in the primary minerals, tending to decrease rates of mineral dissolution. Weathering rates increase (as expected) with increasing CO2 and temperature. This modeling is done with the support of the Virtual Planetary Laboratory.
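
The two controls highlighted above, faster dissolution at higher temperature (Arrhenius) and at higher pCO2 (lower pH via carbonic acid), can be illustrated with a back-of-envelope calculation. The Henry's-law and dissociation constants are standard 25 C approximations; the generic rate law and its parameters (k25, Ea, pH exponent) are assumptions, not Kinflow's actual rate database.

```python
import math

R = 8.314     # J/(mol K), gas constant
KH = 3.3e-2   # mol/(L atm), Henry's constant for CO2 at 25 C
K1 = 4.45e-7  # first dissociation constant of carbonic acid

def ph_from_pco2(pco2_atm):
    """pH of pure water equilibrated with CO2: [H+] ~ sqrt(K1 * KH * pCO2)."""
    h = math.sqrt(K1 * KH * pco2_atm)
    return -math.log10(h)

def dissolution_rate(temp_k, pco2_atm, k25=1e-12, ea=60e3, n=0.5):
    """Relative rate = k25 * exp(-(Ea/R)(1/T - 1/298.15)) * [H+]**n."""
    arrhenius = math.exp(-(ea / R) * (1.0 / temp_k - 1.0 / 298.15))
    h = 10.0 ** (-ph_from_pco2(pco2_atm))
    return k25 * arrhenius * h ** n

modern = dissolution_rate(288.15, 4e-4)   # ~15 C, near-modern pCO2
archean = dissolution_rate(303.15, 1e-1)  # warmer, high-CO2 scenario
```

The modern case gives the familiar pH of about 5.6 for rainwater, and the warm, high-CO2 scenario yields a substantially faster relative dissolution rate, consistent with the abstract's qualitative finding.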

  13. Modeling temperature variations in a pilot plant thermophilic anaerobic digester.

    Science.gov (United States)

    Valle-Guadarrama, Salvador; Espinosa-Solares, Teodoro; López-Cruz, Irineo L; Domaschko, Max

    2011-05-01

    A model that predicts temperature changes in a pilot plant thermophilic anaerobic digester was developed based on fundamental thermodynamic laws. The methodology utilized two simulation strategies. In the first, model equations were solved through a searching routine based on a minimal square optimization criterion, from which the overall heat transfer coefficient values, for both biodigester and heat exchanger, were determined. In the second, the simulation was performed with variable values of these overall coefficients. The prediction with both strategies allowed reproducing experimental data within 5% of the temperature span permitted in the equipment by the system control, which validated the model. The temperature variation was affected by the heterogeneity of the feeding and extraction processes, by the heterogeneity of the digestate recirculation through the heating system and by the lack of a perfect mixing inside the biodigester tank. The use of variable overall heat transfer coefficients improved the temperature change prediction and reduced the effect of a non-ideal performance of the pilot plant modeled.
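
A minimal version of the thermodynamic balance such a model rests on is a lumped heat balance, M*c*dT/dt = Q_heater - U*A*(T - T_amb), integrated explicitly. The capacity, conductance and heater power below are illustrative assumptions, not the pilot-plant values.

```python
def simulate_digester(hours, dt=0.01, t0=55.0, t_amb=20.0,
                      mc=4.2e6,         # J/K, tank thermal capacity (water-like)
                      ua=150.0,         # W/K, overall heat-loss conductance U*A
                      q_heater=6000.0): # W, heat delivered by the exchanger
    """Explicit Euler integration of M*c*dT/dt = Q - U*A*(T - T_amb).
    dt is in hours; returns the temperature (deg C) after every step."""
    temps, t = [], t0
    for _ in range(int(round(hours / dt))):
        dT_dt = (q_heater - ua * (t - t_amb)) / mc   # K per second
        t += dT_dt * dt * 3600.0
        temps.append(t)
    return temps

temps = simulate_digester(48)   # two days of operation
```

With these constants the tank relaxes toward the equilibrium T_amb + Q/(U*A) = 60 deg C; making U*A time-varying, as the second simulation strategy in the abstract does, would let the same balance track a non-ideally mixed plant.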

  14. Modelling and analysis of radial thermal stresses and temperature ...

    African Journals Online (AJOL)


    it acts as an insulating medium and prevents heat flow, hence the need to provide an insulation coating on valves is ... geometry metal components (piston, liner and cylinder head) and found a satisfactory ... model. Step 8: find the radial thermal stress at all the nodal points with the use of temperature ... Cast iron St. 70.

  15. Modelling the effect of temperature on seed germination in some ...

    African Journals Online (AJOL)


    2010-03-01

    Mar 1, 2010 ... utilizes temperature for predicting seed germination, and the model can be applied to ... seeds were sprinkled on round filter papers (Whatman No. 1) in a 9 cm Petri dish and ... A review of research on seedbed preparation for ...

  16. Last interglacial temperature evolution – a model inter-comparison

    Directory of Open Access Journals (Sweden)

    P. Bakker

    2013-03-01

    Full Text Available There is a growing number of proxy-based reconstructions detailing the climatic changes that occurred during the last interglacial period (LIG). This period is of special interest, because large parts of the globe were characterized by a warmer-than-present-day climate, making it an interesting test bed for climate models in light of projected global warming. However, mainly because synchronizing the different palaeoclimatic records is difficult, there is no consensus on a global picture of LIG temperature changes. Here we present the first model inter-comparison of transient simulations covering the LIG period. By comparing the different simulations, we aim to identify the common signal in the LIG temperature evolution, to investigate the main driving forces behind it, and to list the climate feedbacks which cause the most apparent inter-model differences. The model inter-comparison shows a robust Northern Hemisphere July temperature evolution characterized by a maximum between 130–125 ka BP with temperatures 0.3 to 5.3 K above present day. A Southern Hemisphere July temperature maximum, −1.3 to 2.5 K at around 128 ka BP, is only found when changes in the greenhouse gas concentrations are included. The robustness of simulated January temperatures is large in the Southern Hemisphere and the mid-latitudes of the Northern Hemisphere. For these regions maximum January temperature anomalies of respectively −1 to 1.2 K and −0.8 to 2.1 K are simulated for the period after 121 ka BP. In both hemispheres these temperature maxima are in line with the maximum in local summer insolation. In a number of specific regions, a common temperature evolution is not found amongst the models. We show that this is related to feedbacks within the climate system which largely determine the simulated LIG temperature evolution in these regions. Firstly, in the Arctic region, changes in the summer sea-ice cover control the evolution of LIG winter

  17. Area under the curve predictions of dalbavancin, a new lipoglycopeptide agent, using the end of intravenous infusion concentration data point by regression analyses such as linear, log-linear and power models.

    Science.gov (United States)

    Bhamidipati, Ravi Kanth; Syed, Muzeeb; Mullangi, Ramesh; Srinivas, Nuggehally

    2018-02-01

    1. Dalbavancin, a lipoglycopeptide, is approved for treating gram-positive bacterial infections. The area under the plasma concentration versus time curve (AUCinf) of dalbavancin is a key parameter, and the AUCinf/MIC ratio is a critical pharmacodynamic marker. 2. Using the end of intravenous infusion concentration (i.e. Cmax), the Cmax versus AUCinf relationship for dalbavancin was established by regression analyses (i.e. linear, log-log, log-linear and power models) using 21 pairs of subject data. 3. Predictions of AUCinf were performed using published Cmax data by application of the regression equations. The quotient of observed/predicted values rendered the fold difference. The mean absolute error (MAE), root mean square error (RMSE) and correlation coefficient (r) were used in the assessment. 4. MAE and RMSE values for the various models were comparable. Cmax versus AUCinf exhibited excellent correlation (r > 0.9488). The internal data evaluation showed narrow confinement (0.84-1.14-fold difference); the models predicted AUCinf with a RMSE of 3.02-27.46%, with the fold difference largely contained within 0.64-1.48. 5. Regardless of the regression model, a single time point strategy of using Cmax (i.e. end of a 30-min infusion) is amenable as a prospective tool for predicting the AUCinf of dalbavancin in patients.
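
The single-point strategy described above amounts to fitting Cmax against AUCinf under a few functional forms by ordinary least squares and judging predictions by fold difference and RMSE. The sketch below uses synthetic, roughly proportional Cmax/AUC pairs as placeholders, not the published subject data.

```python
import math

def ols(xs, ys):
    """Slope and intercept minimising the sum of squared residuals."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

# synthetic Cmax (mg/L) and AUCinf (mg*h/L) pairs (placeholders)
cmax = [180.0, 220.0, 250.0, 280.0, 310.0, 350.0, 400.0]
auc = [9000.0, 11200.0, 12400.0, 14300.0, 15400.0, 17800.0, 20500.0]

# linear: AUC = a + b*Cmax
b_lin, a_lin = ols(cmax, auc)
# log-linear: ln(AUC) = a + b*Cmax
b_ll, a_ll = ols(cmax, [math.log(y) for y in auc])
# power: ln(AUC) = ln(a) + b*ln(Cmax), i.e. AUC = a * Cmax**b
b_pow, ln_a_pow = ols([math.log(x) for x in cmax], [math.log(y) for y in auc])

def predict_linear(c):
    return a_lin + b_lin * c

fold = [y / predict_linear(x) for x, y in zip(cmax, auc)]   # observed/predicted
rmse = math.sqrt(sum((y - predict_linear(x)) ** 2
                     for x, y in zip(cmax, auc)) / len(auc))
```

For near-proportional data all three forms behave similarly (the power exponent comes out close to 1), mirroring the abstract's finding that the models gave comparable MAE/RMSE.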

  18. SNG-logs at Skjern

    DEFF Research Database (Denmark)

    Korsbech, Uffe C C; Petersen, Jesper; Aage, Helle Karina

    1998-01-01

    Spectral Natural Gamma-ray logs have been run in two water supply borings at Skjern. The log data have been examined by a new technique - Noise Adjusted Singular Value Decomposition - in order to get a detailed and reliable picture of the distribution of uranium and thorium gamma-rays from heavy...

  19. Sensitivities and uncertainties of modeled ground temperatures in mountain environments

    Directory of Open Access Journals (Sweden)

    S. Gubler

    2013-08-01

    Full Text Available Model evaluation is often performed at few locations due to the lack of spatially distributed data. Since the quantification of model sensitivities and uncertainties can be performed independently from ground truth measurements, these analyses are suitable for testing the influence of environmental variability on model evaluation. In this study, the sensitivities and uncertainties of a physically based mountain permafrost model are quantified within an artificial topography. The setting consists of different elevations and exposures combined with six ground types characterized by porosity and hydraulic properties. The analyses are performed for a combination of all factors, which allows for quantification of the variability of model sensitivities and uncertainties within a whole modeling domain. We found that model sensitivities and uncertainties vary strongly depending on different input factors such as topography or different soil types. The analysis shows that model evaluation performed at single locations may not be representative for the whole modeling domain. For example, the sensitivity of modeled mean annual ground temperature to ground albedo ranges between 0.5 and 4 °C depending on elevation, aspect and the ground type. South-exposed inclined locations are more sensitive to changes in ground albedo than north-exposed slopes since they receive more solar radiation. The sensitivity to ground albedo increases with decreasing elevation due to the shorter duration of the snow cover. The sensitivity to the hydraulic properties changes considerably for different ground types: rock or clay, for instance, are not sensitive to uncertainties in the hydraulic properties, while for gravel or peat, accurate estimates of the hydraulic properties significantly improve modeled ground temperatures. The discretization of ground, snow and time has an impact on modeled mean annual ground temperature (MAGT) that cannot be neglected (more than 1 °C for several

  20. A model for quantification of temperature profiles via germination times

    DEFF Research Database (Denmark)

    Pipper, Christian Bressen; Adolf, Verena Isabelle; Jacobsen, Sven-Erik

    2013-01-01

    Current methodology to quantify temperature characteristics in germination of seeds is predominantly based on analysis of the time to reach a given germination fraction, that is, the quantiles in the distribution of the germination time of a seed. In practice, interpolation between observed germination fractions at given monitoring times is used to obtain the time to reach a given germination fraction. As a consequence, the obtained value will be highly dependent on the actual monitoring scheme used in the experiment. In this paper a link between currently used quantile models for the germination time and a specific type of accelerated failure time models is provided. As a consequence, the observed number of germinated seeds at given monitoring times may be analysed directly by a grouped time-to-event model from which characteristics of the temperature profile may be identified and estimated...
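
The monitoring-scheme dependence the abstract criticises is easy to reproduce numerically: interpolating the time to 50 % germination from coarser observation times shifts the answer. The cumulative germination curve below (a log-logistic shape with an assumed true t50 of 4 days) is illustrative, not the paper's fitted model.

```python
def true_fraction(t, t50=4.0, slope=3.0):
    """Assumed cumulative germination curve (log-logistic CDF)."""
    return 1.0 / (1.0 + (t50 / t) ** slope) if t > 0 else 0.0

def interpolated_t50(monitor_times):
    """Linearly interpolate the time at which the observed germination
    fraction crosses 0.5, using only fractions at the monitoring times."""
    prev_t, prev_f = 0.0, 0.0
    for t in monitor_times:
        f = true_fraction(t)
        if f >= 0.5:
            return prev_t + (0.5 - prev_f) * (t - prev_t) / (f - prev_f)
        prev_t, prev_f = t, f
    return None   # 50 % never reached within the monitoring window

daily = interpolated_t50([1, 2, 3, 4, 5, 6, 7])     # daily monitoring
coarse = interpolated_t50([2, 4.5, 7])              # sparser monitoring
```

Daily monitoring happens to recover the true t50 of 4.0 here, while the sparser scheme gives a biased value; a grouped time-to-event model, as proposed in the paper, avoids this dependence by modelling the counts at the monitoring times directly.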

  1. On the Temperature Dependence of the UNIQUAC/UNIFAC Models

    DEFF Research Database (Denmark)

    Skjold-Jørgensen, Steen; Rasmussen, Peter; Fredenslund, Aage

    1980-01-01

    Local composition models for the description of the properties of liquid mixtures do not in general give an accurate representation of excess Gibbs energy and excess enthalpy simultaneously. The introduction of temperature dependent interaction parameters leads to considerable improvements ... of the simultaneous correlation. The temperature dependent parameters have, however, little physical meaning and very odd results are frequently obtained when the interaction parameters obtained from excess enthalpy information alone are used for the prediction of vapor-liquid equilibria. The UNIQUAC/UNIFAC models ... parameters based on excess enthalpy data, and the prediction of excess enthalpy information from only one isothermal set of vapor-liquid equilibrium data is qualitatively acceptable. A parameter table for the modified UNIFAC model is given for the five main groups: CH2, C = C, ACH, ACCH2 and CH2O.

  2. Hall Thruster Modeling with a Given Temperature Profile

    International Nuclear Information System (INIS)

    Dorf, L.; Semenov, V.; Raitses, Y.; Fisch, N.J.

    2002-01-01

    A quasi one-dimensional steady-state model of the Hall thruster is presented. For given mass flow rate, magnetic field profile, and discharge voltage the unique solution can be constructed, assuming that the thruster operates in one of the two regimes: with or without the anode sheath. It is shown that for a given temperature profile, the applied discharge voltage uniquely determines the operating regime; for discharge voltages greater than a certain value, the sheath disappears. That result is obtained over a wide range of incoming neutral velocities, channel lengths and widths, and cathode plane locations. A good correlation between the quasi one-dimensional model and experimental results can be achieved by selecting an appropriate temperature profile. We also show how the presented model can be used to obtain a two-dimensional potential distribution

  3. Theoretical modeling of critical temperature increase in metamaterial superconductors

    Science.gov (United States)

    Smolyaninov, Igor; Smolyaninova, Vera

    Recent experiments have demonstrated that the metamaterial approach is capable of a drastic increase of the critical temperature Tc of epsilon-near-zero (ENZ) metamaterial superconductors. For example, tripling of the critical temperature has been observed in Al-Al2O3 ENZ core-shell metamaterials. Here, we perform theoretical modelling of the Tc increase in metamaterial superconductors based on the Maxwell-Garnett approximation of their dielectric response function. Good agreement is demonstrated between theoretical modelling and experimental results in both aluminum and tin-based metamaterials. Taking advantage of the demonstrated success of this model, the critical temperature of hypothetical niobium, MgB2 and H2S-based metamaterial superconductors is evaluated. The MgB2-based metamaterial superconductors are projected to reach the liquid nitrogen temperature range. In the case of an H2S-based metamaterial, Tc appears to reach 250 K. This work was supported in part by NSF Grant DMR-1104676 and the School of Emerging Technologies at Towson University.

  4. On effective temperature in network models of collective behavior

    International Nuclear Information System (INIS)

    Porfiri, Maurizio; Ariel, Gil

    2016-01-01

    Collective behavior of self-propelled units is studied analytically within the Vectorial Network Model (VNM), a mean-field approximation of the well-known Vicsek model. We propose a dynamical systems framework to study the stochastic dynamics of the VNM in the presence of general additive noise. We establish that a single parameter, which is a linear function of the circular mean of the noise, controls the macroscopic phase of the system—ordered or disordered. By establishing a fluctuation–dissipation relation, we posit that this parameter can be regarded as an effective temperature of collective behavior. The exact critical temperature is obtained analytically for systems with small connectivity, equivalent to low-density ensembles of self-propelled units. Numerical simulations are conducted to demonstrate the applicability of this new notion of effective temperature to the Vicsek model. The identification of an effective temperature of collective behavior is an important step toward understanding order–disorder phase transitions, informing consistent coarse-graining techniques and explaining the physics underlying the emergence of collective phenomena.

  5. Baryon number dissipation at finite temperature in the standard model

    International Nuclear Information System (INIS)

    Mottola, E.; Raby, S.; Starkman, G.

    1990-01-01

    We analyze the phenomenon of baryon number violation at finite temperature in the standard model, and derive the relaxation rate for the baryon density in the high temperature electroweak plasma. The relaxation rate γ is given in terms of real time correlation functions of the operator E·B, and is directly proportional to the sphaleron transition rate Γ: γ ∝ n_f Γ/T^3. Hence it is not instanton suppressed, as claimed by Cohen, Dugan and Manohar (CDM). We show explicitly how this result is consistent with the methods of CDM, once it is recognized that a new anomalous commutator is required in their approach. 19 refs., 2 figs

  6. Modeling high temperature materials behavior for structural analysis

    CERN Document Server

    Naumenko, Konstantin

    2016-01-01

    This monograph presents approaches to characterize inelastic behavior of materials and structures at high temperature. Starting from experimental observations, it discusses basic features of inelastic phenomena including creep, plasticity, relaxation, low cycle and thermal fatigue. The authors formulate constitutive equations to describe the inelastic response for given states of stress and microstructure. They introduce evolution equations to capture hardening, recovery, softening, ageing and damage processes. Principles of continuum mechanics and thermodynamics are presented to provide a framework for modeling materials behavior with the aim of structural analysis of high-temperature engineering components.

  7. Hawaiian forest bird trends: using log-linear models to assess long-term trends is supported by model diagnostics and assumptions (reply to Freed and Cann 2013)

    Science.gov (United States)

    Camp, Richard J.; Pratt, Thane K.; Gorresen, P. Marcos; Woodworth, Bethany L.; Jeffrey, John J.

    2014-01-01

    Freed and Cann (2013) criticized our use of linear models to assess trends in the status of Hawaiian forest birds through time (Camp et al. 2009a, 2009b, 2010) by questioning our sampling scheme, whether we met model assumptions, and whether we ignored short-term changes in the population time series. In the present paper, we address these concerns and reiterate that our results do not support the position of Freed and Cann (2013) that the forest birds in the Hakalau Forest National Wildlife Refuge (NWR) are declining, or that the federally listed endangered birds are showing signs of imminent collapse. On the contrary, our data indicate that the 21-year long-term trends for native birds in Hakalau Forest NWR are stable to increasing, especially in areas that have received active management.

  8. Elevated temperature alters carbon cycling in a model microbial community

    Science.gov (United States)

    Mosier, A.; Li, Z.; Thomas, B. C.; Hettich, R. L.; Pan, C.; Banfield, J. F.

    2013-12-01

    Earth's climate is regulated by biogeochemical carbon exchanges between the land, oceans and atmosphere that are chiefly driven by microorganisms. Microbial communities are therefore indispensible to the study of carbon cycling and its impacts on the global climate system. In spite of the critical role of microbial communities in carbon cycling processes, microbial activity is currently minimally represented or altogether absent from most Earth System Models. Method development and hypothesis-driven experimentation on tractable model ecosystems of reduced complexity, as presented here, are essential for building molecularly resolved, benchmarked carbon-climate models. Here, we use chemoautotropic acid mine drainage biofilms as a model community to determine how elevated temperature, a key parameter of global climate change, regulates the flow of carbon through microbial-based ecosystems. This study represents the first community proteomics analysis using tandem mass tags (TMT), which enable accurate, precise, and reproducible quantification of proteins. We compare protein expression levels of biofilms growing over a narrow temperature range expected to occur with predicted climate changes. We show that elevated temperature leads to up-regulation of proteins involved in amino acid metabolism and protein modification, and down-regulation of proteins involved in growth and reproduction. Closely related bacterial genotypes differ in their response to temperature: Elevated temperature represses carbon fixation by two Leptospirillum genotypes, whereas carbon fixation is significantly up-regulated at higher temperature by a third closely related genotypic group. Leptospirillum group III bacteria are more susceptible to viral stress at elevated temperature, which may lead to greater carbon turnover in the microbial food web through the release of viral lysate. Overall, this proteogenomics approach revealed the effects of climate change on carbon cycling pathways and other

  9. A High Temperature Liquid Plasma Model of the Sun

    Directory of Open Access Journals (Sweden)

    Robitaille P.-M.

    2007-01-01

    Full Text Available In this work, a liquid model of the Sun is presented wherein the entire solar mass is viewed as a high density/high energy plasma. This model challenges our current understanding of the densities associated with the internal layers of the Sun, advocating a relatively constant density, almost independent of radial position. The incompressible nature of liquids is advanced to prevent solar collapse from gravitational forces. The liquid plasma model of the Sun is a non-equilibrium approach, where nuclear reactions occur throughout the solar mass. The primary means of addressing internal heat transfer are convection and conduction. As a result of the convective processes on the solar surface, the liquid model brings into question the established temperature of the solar photosphere by highlighting a violation of Kirchhoff’s law of thermal emission. Along these lines, the model also emphasizes that radiative emission is a surface phenomenon. Evidence that the Sun is a high density/high energy plasma is based on our knowledge of Planckian thermal emission and condensed matter, including the existence of pressure ionization and liquid metallic hydrogen at high temperatures and pressures. Prior to introducing the liquid plasma model, the historic and scientific justifications for the gaseous model of the Sun are reviewed and the gaseous equations of state are also discussed.

  10. Mathematical model of the metal mould surface temperature optimization

    Energy Technology Data Exchange (ETDEWEB)

    Mlynek, Jaroslav, E-mail: jaroslav.mlynek@tul.cz; Knobloch, Roman, E-mail: roman.knobloch@tul.cz [Department of Mathematics, FP Technical University of Liberec, Studentska 2, 461 17 Liberec, The Czech Republic (Czech Republic); Srb, Radek, E-mail: radek.srb@tul.cz [Institute of Mechatronics and Computer Engineering Technical University of Liberec, Studentska 2, 461 17 Liberec, The Czech Republic (Czech Republic)

    2015-11-30

    The article is focused on the problem of generating a uniform temperature field on the inner surface of shell metal moulds. Such moulds are used e.g. in the automotive industry for artificial leather production. To produce artificial leather with uniform surface structure and colour shade, the temperature on the inner surface of the mould has to be as homogeneous as possible. The heating of the mould is realized by infrared heaters located above the outer mould surface. The conceived mathematical model allows us to optimize the locations of the infrared heaters over the mould, so that approximately uniform heat radiation intensity is generated. A version of the differential evolution algorithm programmed in the Matlab development environment was created by the authors for the optimization process. For temperature calculations the software system ANSYS was used. A practical example of optimization of heater locations and calculation of the temperature of the mould is included at the end of the article.
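
The optimisation step described above can be sketched with a compact differential evolution (rand/1/bin) loop: choose heater positions so the radiation intensity along a simplified 1-D "mould" is as uniform as possible. The inverse-square intensity model and the geometry are simplifying assumptions, not the paper's radiation model.

```python
import random

rng = random.Random(7)
SURFACE = [i / 20.0 for i in range(21)]   # sample points on the mould, 0..1
HEIGHT = 0.3                              # assumed heater height above surface

def intensity(heaters, x):
    """Summed inverse-square contributions at surface point x."""
    return sum(1.0 / ((x - h) ** 2 + HEIGHT ** 2) for h in heaters)

def nonuniformity(heaters):
    """Objective: variance of intensity over the surface (lower = flatter)."""
    vals = [intensity(heaters, x) for x in SURFACE]
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

def differential_evolution(n_heaters=4, pop_size=20, gens=150, f=0.7, cr=0.9):
    """rand/1/bin DE with greedy selection; positions clipped to [0, 1]."""
    pop = [[rng.random() for _ in range(n_heaters)] for _ in range(pop_size)]
    cost = [nonuniformity(ind) for ind in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            trial = [
                min(1.0, max(0.0, pop[a][k] + f * (pop[b][k] - pop[c][k])))
                if rng.random() < cr else pop[i][k]
                for k in range(n_heaters)
            ]
            tc = nonuniformity(trial)
            if tc <= cost[i]:
                pop[i], cost[i] = trial, tc
    best = min(range(pop_size), key=cost.__getitem__)
    return pop[best], cost[best]

best_heaters, best_cost = differential_evolution()
```

The resulting layout spreads the heaters across the surface; clustering them all at the centre gives a far higher non-uniformity, which is what the selection pressure removes.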

  11. Mathematical model of the metal mould surface temperature optimization

    International Nuclear Information System (INIS)

    Mlynek, Jaroslav; Knobloch, Roman; Srb, Radek

    2015-01-01

    The article is focused on the problem of generating a uniform temperature field on the inner surface of shell metal moulds. Such moulds are used e.g. in the automotive industry for artificial leather production. To produce artificial leather with uniform surface structure and colour shade, the temperature on the inner surface of the mould has to be as homogeneous as possible. The heating of the mould is realized by infrared heaters located above the outer mould surface. The conceived mathematical model allows us to optimize the locations of the infrared heaters over the mould, so that approximately uniform heat radiation intensity is generated. A version of the differential evolution algorithm programmed in the Matlab development environment was created by the authors for the optimization process. For temperature calculations the software system ANSYS was used. A practical example of optimization of heater locations and calculation of the temperature of the mould is included at the end of the article.

  12. Model predictive control of room temperature with disturbance compensation

    Science.gov (United States)

    Kurilla, Jozef; Hubinský, Peter

    2017-08-01

    This paper deals with temperature control of a multivariable office-building system. The system is simplified into several single-input single-output systems by decoupling their mutual linkages, each of which is separately controlled by a regulator based on generalized model predictive control. The main part of this paper focuses on the accuracy of the office temperature with respect to the occupancy profile and the effect of disturbance. Shifting the desired temperature and changing the weighting coefficients are used to achieve the desired accuracy of regulation. The final regulation structure joins the advantages of distributed computing power with the possibility of using network communication between individual controllers to consider the constraints. The advantage of using decoupled MPC controllers over conventional PID regulators is demonstrated in a simulation study.
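
The receding-horizon idea behind such a controller can be shown on a toy single-zone room: a first-order thermal model, a finite set of admissible heater levels (a simple constraint), and a brute-force search over the control horizon. All parameter values are illustrative assumptions, not the building model of the paper.

```python
import itertools

A = 0.1            # 1/step, heat-loss coefficient (assumed)
B = 2.0            # K/step per unit heating (assumed)
T_OUT = 5.0        # outdoor temperature, deg C
LEVELS = (0.0, 0.5, 1.0)   # admissible heater levels (the constraint set)
HORIZON = 4
WEIGHT_U = 0.2     # weighting coefficient penalising control effort

def step(temp, u):
    """First-order room model: one discrete time step."""
    return temp + (-A * (temp - T_OUT) + B * u)

def cost(temp, seq, ref):
    """Quadratic tracking cost plus control penalty over the horizon."""
    c = 0.0
    for u in seq:
        temp = step(temp, u)
        c += (temp - ref) ** 2 + WEIGHT_U * u ** 2
    return c

def mpc_control(temp, ref):
    """Apply only the first move of the lowest-cost control sequence."""
    best = min(itertools.product(LEVELS, repeat=HORIZON),
               key=lambda seq: cost(temp, seq, ref))
    return best[0]

# closed loop: track a set-point shift from 18 to 21 deg C
temp, history = 18.0, []
for k in range(30):
    ref = 21.0 if k >= 5 else 18.0
    u = mpc_control(temp, ref)
    temp = step(temp, u)
    history.append(temp)
```

Shifting the desired temperature mid-run, as in the abstract, makes the controller drive the room to the new set-point and then modulate between the discrete heater levels around it.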

  13. Constitutive model of discontinuous plastic flow at cryogenic temperatures

    CERN Document Server

    Skoczen, B; Bielski, J; Marcinek, D

    2010-01-01

    FCC metals and alloys are frequently used in cryogenic applications, nearly down to the temperature of absolute zero, because of their excellent physical and mechanical properties, including ductility. Some of these materials, often characterized by low stacking fault energy (LSFE), undergo at low temperatures three distinct phenomena: dynamic strain ageing (DSA), plastic strain induced transformation from the parent phase (gamma) to the secondary phase (alpha), and evolution of micro-damage. The constitutive model presented in the paper focuses on the discontinuous plastic flow (serrated yielding) and takes into account the relevant thermodynamic background. The discontinuous plastic flow reflecting the DSA effect is described by the mechanism of local catastrophic failure of Lomer-Cottrell (LC) locks under the stress fields related to the accumulating edge dislocations (below the transition temperature T1 from the screw-dislocation to the edge-dislocation mode). The failure of LC locks leads to mass...

  14. Modelling of monovacancy diffusion in W over wide temperature range

    International Nuclear Information System (INIS)

    Bukonte, L.; Ahlgren, T.; Heinola, K.

    2014-01-01

    The diffusion of monovacancies in tungsten is studied computationally over a wide temperature range, from 1300 K up to the melting point of the material. Our modelling is based on the Molecular Dynamics technique and Density Functional Theory. The monovacancy migration barriers are calculated using the nudged elastic band method for nearest- and next-nearest-neighbour monovacancy jumps. The pre-exponential factor for monovacancy diffusion is found to be two to three orders of magnitude higher than commonly used in computational studies, resulting in an attempt frequency of the order of 10^15 Hz. Multiple nearest-neighbour jumps of the monovacancy are found to play an important role in the contribution to the total diffusion coefficient, especially at temperatures above 2/3 of T_m, resulting in an upward curvature of the Arrhenius diagram. The probabilities of the different nearest-neighbour jumps for the monovacancy in W are calculated at different temperatures
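    The Arrhenius form underlying this analysis can be sketched in a few lines. The barriers and prefactors below are illustrative placeholders, not the paper's fitted values; summing a nearest-neighbour term and a higher-barrier long-jump term reproduces the kind of upward curvature of the Arrhenius plot described above:

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def vacancy_diffusivity(T, d0=2.0e-5, e_m=1.7):
    """Arrhenius diffusion coefficient D = D0 * exp(-E_m / (k_B * T)).
    d0 (m^2/s) and e_m (eV) are illustrative placeholders, not the
    paper's fitted values for W."""
    return d0 * math.exp(-e_m / (K_B * T))

def total_diffusivity(T):
    """Sum of a nearest-neighbour term and a higher-barrier long-jump term.
    The long jump contributes only at high T, bending the Arrhenius plot
    upward (barriers again illustrative)."""
    d_nn = vacancy_diffusivity(T, d0=2.0e-5, e_m=1.7)
    d_long = vacancy_diffusivity(T, d0=8.0e-4, e_m=3.0)
    return d_nn + d_long

for T in (1300, 2500, 3600):  # up to near the melting point of W (3695 K)
    print(T, total_diffusivity(T))
```

    With these placeholder numbers the long-jump term is negligible at 1300 K but comparable to the nearest-neighbour term near the melting point, which is the mechanism behind the non-Arrhenius behaviour reported.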

  15. A multifluid model extended for strong temperature nonequilibrium

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Chong [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-08

    We present a multifluid model in which the material temperature is strongly affected by the degree of segregation of each material. In order to track temperatures of segregated form and mixed form of the same material, they are defined as different materials with their own energy. This extension makes it necessary to extend multifluid models to the case in which each form is defined as a separate material. Statistical variations associated with the morphology of the mixture have to be simplified. Simplifications introduced include combining all molecularly mixed species into a single composite material, which is treated as another segregated material. Relative motion within the composite material, diffusion, is represented by material velocity of each component in the composite material. Compression work, momentum and energy exchange, virtual mass forces, and dissipation of the unresolved kinetic energy have been generalized to the heterogeneous mixture in temperature nonequilibrium. The present model can be further simplified by combining all mixed forms of materials into a composite material. Molecular diffusion in this case is modeled by the Stefan-Maxwell equations.

  16. Modeling Air Temperature/Water Temperature Relations Along a Small Mountain Stream Under Increasing Urban Influence

    Science.gov (United States)

    Fedders, E. R.; Anderson, W. P., Jr.; Hengst, A. M.; Gu, C.

    2017-12-01

    Boone Creek is a headwater stream of low to moderate gradient located in Boone, North Carolina, USA. Total impervious surface coverage in the 5.2 km² catchment drained by the 1.9 km study reach increases from 13.4% in the upstream half of the reach to 24.3% in the downstream half. Other markers of urbanization, including culverting, lack of riparian shade vegetation, and bank armoring, also increase downstream. Previous studies have shown the stream to be prone to temperature surges on short timescales (minutes to hours) caused by summer runoff from the urban hardscaping. This study investigates the effects of urbanization on the stream's thermal regime at daily to yearly timescales. To do this, we developed an analytical model of daily average stream temperatures based on daily average air temperatures. We utilized a two-part model comprising annual and biannual components and a daily component consisting of a 3rd-order Markov process in order to fit the thermal dynamics of our small, gaining stream. Optimizing this model at each of our study sites in each studied year (78 total site-years of data) yielded annual thermal exchange coefficients (K) for each site. These K values quantify the strength of the relationship between stream and air temperature, or inverse thermal stability. In a uniform, pristine catchment environment, K values are expected to decrease downstream as the stream gains discharge volume and, therefore, thermal inertia. Interannual average K values for our study reach, however, show an overall increase from 0.112 furthest upstream to 0.149 furthest downstream, despite a near doubling of stream discharge between these monitoring points. K values increase only slightly in the upstream, less urban half of the reach: a line of best fit through these points on a plot of reach distance versus K value has a slope of 2E-6. But the K values of the downstream, more urbanized sites increase at a rate of 2E-5 per meter of reach distance, an order of magnitude greater.
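    The role of the exchange coefficient K can be illustrated with a minimal first-order relaxation model. This is a simplification: the study's actual model adds annual/biannual components and a 3rd-order Markov daily term, and only the two interannual K values are taken from the abstract; everything else below is synthetic:

```python
import math

def simulate_stream_temp(air_temps, k, t0=10.0):
    """First-order thermal-exchange model: each day the stream temperature
    relaxes toward air temperature at rate K (the exchange coefficient).
    Illustrative only; not the study's full annual + Markov model."""
    tw = t0
    out = []
    for ta in air_temps:
        tw += k * (ta - tw)
        out.append(tw)
    return out

# Synthetic annual air-temperature cycle (degC)
air = [12.0 + 10.0 * math.sin(2 * math.pi * d / 365.0) for d in range(365)]

upstream = simulate_stream_temp(air, k=0.112)    # more thermally stable
downstream = simulate_stream_temp(air, k=0.149)  # urbanized: tracks air more

# The larger K yields a wider annual range of simulated stream temperature
print(max(downstream) - min(downstream), max(upstream) - min(upstream))
```

    A higher K thus means the stream follows the air-temperature signal more closely, which is why an increase of K downstream, despite growing discharge, is read as an urbanization signature.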

  17. SDSS Log Viewer: visual exploratory analysis of large-volume SQL log data

    Science.gov (United States)

    Zhang, Jian; Chen, Chaomei; Vogeley, Michael S.; Pan, Danny; Thakar, Ani; Raddick, Jordan

    2012-01-01

    User-generated Structured Query Language (SQL) queries are a rich source of information for database analysts, information scientists, and the end users of databases. In this study, a group of astronomers and computer and information scientists worked together to analyze a large volume of SQL log data generated by users of the Sloan Digital Sky Survey (SDSS) data archive in order to better understand users' data-seeking behavior. While statistical analysis of such logs is useful at aggregated levels, efficiently exploring specific patterns of queries is often a challenging task due to the typically large volume of the data, multivariate features, and data requirements specified in SQL queries. To enable and facilitate effective and efficient exploration of the SDSS log data, we designed an interactive visualization tool, called the SDSS Log Viewer, which integrates time series visualization, text visualization, and dynamic query techniques. We describe two analysis scenarios of visual exploration of SDSS log data, including understanding unusually high daily query traffic and modeling the types of data-seeking behaviors of massive query generators. The two scenarios demonstrate that the SDSS Log Viewer provides a novel and potentially valuable approach to support these targeted tasks.

  18. Engineering aspects of radiometric logging

    International Nuclear Information System (INIS)

    Huppert, P.

    1982-01-01

    Engineering problems encountered in the development of nuclear borehole logging techniques are discussed. Spectrometric techniques require electronic stability of the equipment. In addition the electronics must be capable of handling high count rates of randomly distributed pulses of fast rise time from the detector and the systems must be designed so that precise calibration is possible under field operating conditions. Components of a logging system are discussed in detail. They include the logging probe (electronics, detector, high voltage supply, preamplifier), electronic instrumentation for data collection and processing and auxiliary equipment

  19. Log-balanced combinatorial sequences

    Directory of Open Access Journals (Sweden)

    Tomislav Došlic

    2005-01-01

    Full Text Available We consider log-convex sequences that satisfy an additional constraint imposed on their rate of growth. We call such sequences log-balanced. It is shown that all such sequences satisfy a pair of double inequalities. Sufficient conditions for log-balancedness are given for the case when the sequence satisfies a two- (or more-) term linear recurrence. It is shown that many combinatorially interesting sequences belong to this class and, as a consequence, that the above-mentioned double inequalities are valid for all of them.

  20. On the fate of the Standard Model at finite temperature

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Luigi Delle; Marzo, Carlo [Università del Salento, Dipartimento di Matematica e Fisica “Ennio De Giorgi' ,Via Arnesano, 73100 Lecce (Italy); INFN - Sezione di Lecce,via Arnesano, 73100 Lecce (Italy); Urbano, Alfredo [SISSA - International School for Advanced Studies,via Bonomea 256, 34136 Trieste (Italy)

    2016-05-10

    In this paper we revisit and update the computation of thermal corrections to the stability of the electroweak vacuum in the Standard Model. At zero temperature, we make use of the full two-loop effective potential, improved by three-loop beta functions with two-loop matching conditions. At finite temperature, we include one-loop thermal corrections together with resummation of daisy diagrams. We solve the bounce equation numerically, both at zero and finite temperature, thus providing an accurate description of the thermal tunneling. Assuming a maximum temperature in the early Universe of the order of 10^18 GeV, we find that the instability bound excludes values of the top mass M_t ≳ 173.6 GeV, with M_h ≃ 125 GeV and including uncertainties on the strong coupling. We discuss the validity and temperature dependence of this bound in the early Universe, with a special focus on the reheating phase after inflation.

  1. Modeling temperature dependent singlet exciton dynamics in multilayered organic nanofibers

    Science.gov (United States)

    de Sousa, Leonardo Evaristo; de Oliveira Neto, Pedro Henrique; Kjelstrup-Hansen, Jakob; da Silva Filho, Demétrio Antônio

    2018-05-01

    Organic nanofibers have shown potential for application in optoelectronic devices because of the tunability of their optical properties. These properties are influenced by the electronic structure of the molecules that compose the nanofibers and also by the behavior of the excitons generated in the material. Exciton diffusion by means of Förster resonance energy transfer is responsible, for instance, for the temperature-dependent change in the color of the light emitted by systems composed of different types of nanofibers. To study this mechanism in detail, we model temperature dependent singlet exciton dynamics in multilayered organic nanofibers. By simulating absorption and emission spectra, the possible Förster transitions are identified. Then, a kinetic Monte Carlo model is employed in combination with a genetic algorithm to theoretically reproduce time-resolved photoluminescence measurements at several temperatures. This procedure yields several kinds of information about exciton diffusion in such a system, including temperature effects on the Förster transfer efficiency and the activation energy of the Förster mechanism. The method is general and may be employed for other systems where exciton diffusion plays a role.
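    The kinetic Monte Carlo treatment of Förster transfer can be sketched as follows. The Förster radius, donor lifetime, and site distances are hypothetical, and the temperature dependence of the spectra (and hence of R0) is omitted; only the rate form and the Gillespie-style event selection are shown:

```python
import math
import random

def forster_rate(r, r0=3.0, tau=1.0):
    """Foerster transfer rate k = (1/tau) * (R0 / r)^6.
    r0 (Foerster radius, nm) and tau (donor lifetime, ns) are
    illustrative values, not fitted parameters."""
    return (1.0 / tau) * (r0 / r) ** 6

def kmc_step(rates, rng=random):
    """One kinetic Monte Carlo (Gillespie) step: choose a transition with
    probability proportional to its rate, then draw the waiting time from
    an exponential distribution with the total rate."""
    total = sum(rates)
    x = rng.random() * total
    acc = 0.0
    for i, k in enumerate(rates):
        acc += k
        if x < acc:
            break
    dt = -math.log(rng.random()) / total
    return i, dt

# A donor with three possible acceptor sites at different distances (nm)
rates = [forster_rate(r) for r in (2.0, 3.0, 5.0)]
site, dt = kmc_step(rates)
```

    Repeating `kmc_step` over a lattice of sites, with rates recomputed from the local geometry, gives the exciton trajectories whose statistics are fitted to time-resolved photoluminescence in studies like the one above.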

  2. Modelling Ischemic Stroke and Temperature Intervention Using Vascular Porous Method

    Science.gov (United States)

    Blowers, Stephen; Valluri, Prashant; Marshall, Ian; Andrews, Peter; Harris, Bridget; Thrippleton, Michael

    2017-11-01

    In the event of cerebral infarction, a region of tissue is supplied with insufficient blood flow to support normal metabolism. This can lead to an ischemic reaction that causes cell death. A reduction in temperature lowers the metabolic demand, which delays the onset of necrosis. This allows extra time for the patient to receive medical attention and could help prevent permanent brain damage. Here, we present a vascular-porous (VaPor) blood flow model that can simulate such an event. Cerebral blood flow is simulated using a combination of 1-dimensional vessels embedded in 3-dimensional porous media. This allows for simple manipulation of the structure and determination of the effect of an obstructed vessel. Results show a regional temperature increase of 1-1.5°C, comparable with results from the literature (in contrast to previous, simpler models). Additionally, the application of scalp cooling in such an event dramatically reduces the temperature in the affected region to near-hypothermic temperatures, pointing to a potential rapid form of first intervention.
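    The qualitative behaviour (a warmer under-perfused region, pulled back down by surface cooling) can be reproduced with a minimal 1-D Pennes-style bioheat sketch. This is not the VaPor model, and every parameter value below is an illustrative assumption:

```python
# Minimal 1-D Pennes-style bioheat sketch (not the VaPor model itself):
#   dT/dt = alpha * d2T/dx2 + w * (T_blood - T) + q_met
# Reduced perfusion w in an "infarcted" region lets metabolic heating raise
# the local temperature; a cold surface boundary (scalp cooling) pulls it
# down. All parameter values are illustrative.

N = 50                 # nodes, 1 mm spacing -> 5 cm of tissue
dx = 1.0e-3            # m
alpha = 1.4e-7         # thermal diffusivity, m^2/s
q_met = 5.0e-4         # metabolic heating, K/s
t_blood = 37.0         # degC

def simulate(perfusion, surface_temp, steps=5000, dt=1.0):
    """Explicit finite-difference integration with a fixed scalp boundary
    temperature and a deep boundary held at blood temperature."""
    T = [37.0] * N
    for _ in range(steps):
        new = T[:]
        for i in range(1, N - 1):
            lap = (T[i - 1] - 2.0 * T[i] + T[i + 1]) / dx ** 2
            new[i] = T[i] + dt * (alpha * lap
                                  + perfusion[i] * (t_blood - T[i])
                                  + q_met)
        new[0] = surface_temp    # scalp boundary condition
        new[-1] = t_blood        # deep-tissue boundary
        T = new
    return T

normal = [0.01] * N              # perfusion rate, 1/s
infarct = [0.01] * N
for i in range(5, 15):           # occluded region 5-15 mm deep
    infarct[i] = 0.001           # perfusion cut tenfold

t_normal = simulate(normal, 37.0)
t_infarct = simulate(infarct, 37.0)   # occlusion warms the region
t_cooled = simulate(infarct, 20.0)    # scalp cooling pulls it back down
```

    Even this crude sketch shows the two effects reported above: the occluded region runs warmer than normally perfused tissue, and a cooled surface boundary lowers the temperature throughout the affected depth.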

  3. Temperature-influenced energetics model for migrating waterfowl

    Science.gov (United States)

    Aagaard, Kevin; Thogmartin, Wayne E.; Lonsdorg, Eric V.

    2018-01-01

    Climate and weather affect avian migration by influencing when and where birds fly, the energy costs and risks of flight, and the ability to sense cues necessary for proper navigation. We review the literature on the physiology of avian migration and the influence of climate, specifically temperature, on avian migration dynamics. We use waterfowl as a model guild because of the ready availability of empirical physiological data and their enormous economic value, but our discussion and expectations are broadly generalizable to other migratory birds. We detail potential consequences of an increasingly warm climate on avian migration, including the possibility of the cessation of migration by some populations and species. Our intent is to lay the groundwork for incorporating temperature effects on the energetic gains and losses of migratory birds, together with the expected consequences of increasing temperatures, into a predictive modeling framework. To this end, we provide a simulation of migration progression focused exclusively on the influence of temperature on the physiological determinants of migration. This simulation produced results comparable to empirically derived and observed values for different migratory factors (e.g., body fat content, flight range, departure date). By merging knowledge from avian physiology and migratory theory, we have identified a clear need for research and have developed hypotheses for a path forward.

  4. Improving Shade Modelling in a Regional River Temperature Model Using Fine-Scale LIDAR Data

    Science.gov (United States)

    Hannah, D. M.; Loicq, P.; Moatar, F.; Beaufort, A.; Melin, E.; Jullian, Y.

    2015-12-01

    Air temperature is often used as a proxy for stream temperature when modelling the distribution areas of aquatic species, because water temperature is not available at a regional scale. To simulate water temperature at a regional scale (10^5 km²), a physically based model using the equilibrium temperature concept and including upstream-downstream propagation of the thermal signal was developed and applied to the entire Loire basin (Beaufort et al., submitted). This model, called T-NET (Temperature-NETwork), is based on a hydrographical network topology. Computations are made hourly on 52,000 reaches averaging 1.7 km in length in the Loire drainage basin. The model gives a median Root Mean Square Error of 1.8°C at an hourly time step on the basis of 128 water temperature stations (2008-2012). In that version of the model, tree shading is modelled by a constant factor proportional to the vegetation cover within 10 m on either side of the river reaches. According to a sensitivity analysis, improving the shade representation would enhance T-NET accuracy, especially for maximum daily temperatures, which are currently not modelled very well. This study evaluates the most efficient way (accuracy versus computing time) to improve the shade model using 1-m-resolution LIDAR data available for a tributary of the Loire River (317 km long, with a drainage area of 8280 km²). Two methods are tested and compared: the first is a spatially explicit computation of the cast shadow for every LIDAR pixel; the second is based on averaged vegetation cover characteristics for buffers and reaches of variable size. Validation of the water temperature model is made against 4 temperature sensors well spread along the stream, as well as two airborne thermal infrared images acquired in summer 2014 and winter 2015 over an 80 km reach. The poster will present the optimal longitudinal and transverse scales for characterizing the vegetation from LIDAR data.

  5. Cased-hole log analysis and reservoir performance monitoring

    CERN Document Server

    Bateman, Richard M

    2015-01-01

    This book addresses vital issues, such as the evaluation of shale gas reservoirs and their production. Topics include the cased-hole logging environment; reservoir fluid properties; flow regimes; temperature, noise, cement bond, and pulsed neutron logging; and casing inspection. Production logging charts and tables are included in the appendices. The work serves as a comprehensive reference for production engineers with upstream E&P companies, well logging service company employees, university students, and petroleum industry training professionals. This book also:
    · Provides methods of conveying production logging tools along horizontal well segments as well as measurements of formation electrical resistivity through casing
    · Covers new information on fluid flow characteristics in inclined pipe and provides new and improved nuclear tool measurements in cased wells
    · Includes updates on cased-hole wireline formation testing

  6. Measurement of Laser Weld Temperatures for 3D Model Input

    Energy Technology Data Exchange (ETDEWEB)

    Dagel, Daryl [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grossetete, Grant [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Maccallum, Danny O. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-10-01

    Laser welding is a key joining process used extensively in the manufacture and assembly of critical components for several weapons systems. Sandia National Laboratories advances the understanding of the laser welding process through coupled experimentation and modeling. This report summarizes the experimental portion of the research program, which focused on measuring temperatures and thermal history of laser welds on steel plates. To increase confidence in measurement accuracy, researchers utilized multiple complementary techniques to acquire temperatures during laser welding. This data serves as input to and validation of 3D laser welding models aimed at predicting microstructure and the formation of defects and their impact on weld-joint reliability, a crucial step in rapid prototyping of weapons components.

  7. Systems Modeling for Crew Core Body Temperature Prediction Postlanding

    Science.gov (United States)

    Cross, Cynthia; Ochoa, Dustin

    2010-01-01

    The Orion Crew Exploration Vehicle, NASA's latest crewed spacecraft project, presents many challenges to its designers, including ensuring crew survivability during nominal and off-nominal landing conditions. With a nominal water landing planned off the coast of San Clemente, California, off-nominal water landings could range from the far North Atlantic Ocean to the middle of the equatorial Pacific Ocean. For all of these conditions, the vehicle must provide sufficient life support resources to ensure that the crew members' core body temperatures are maintained at a safe level prior to rescue. This paper examines the natural environments, the environments created inside the cabin, and the constraints associated with post-landing operations that affect the temperature of the crew members. Models of the capsule and the crew members are examined, and analysis results are compared to the requirement for safe human exposure. Further, recommendations for updated modeling techniques and operational limits are included.

  8. Foundations of modelling of nonequilibrium low-temperature plasmas

    Science.gov (United States)

    Alves, L. L.; Bogaerts, A.; Guerra, V.; Turner, M. M.

    2018-02-01

    This work explains the need for plasma models, introduces arguments for choosing the type of model that better fits the purpose of each study, and presents the basics of the most common nonequilibrium low-temperature plasma models and the information available from each one, along with an extensive list of references for complementary in-depth reading. The paper presents the following models, organised according to the level of multi-dimensional description of the plasma: kinetic models, based on either a statistical particle-in-cell/Monte-Carlo approach or the solution to the Boltzmann equation (in the latter case, special focus is given to the description of the electron kinetics); multi-fluid models, based on the solution to the hydrodynamic equations; global (spatially-average) models, based on the solution to the particle and energy rate-balance equations for the main plasma species, usually including a very complete reaction chemistry; mesoscopic models for plasma-surface interaction, adopting either a deterministic approach or a stochastic dynamical Monte-Carlo approach. For each plasma model, the paper puts forward the physics context, introduces the fundamental equations, presents advantages and limitations, also from a numerical perspective, and illustrates its application with some examples. Whenever pertinent, the interconnection between models is also discussed, in view of multi-scale hybrid approaches.
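    As a concrete instance of the global (spatially-average) model class described above, a single particle rate-balance equation already fixes the electron temperature: in steady state the ionization source must equal the loss rate. The rate-coefficient parameters below are rough, argon-like placeholders, not values from the paper:

```python
import math

def ionization_rate(te_ev, k0=5.0e-14, e_iz=15.8):
    """Arrhenius-form ionization rate coefficient k(Te) = k0 * exp(-E_iz/Te),
    with Te in eV; k0 (m^3/s) and E_iz (eV) are illustrative, argon-like
    placeholder values."""
    return k0 * math.exp(-e_iz / te_ev)

def evolve(te_ev, n_g=1.0e20, tau=1.0e-5, n0=1.0e14, dt=1.0e-8, steps=20000):
    """Global-model particle balance: dn_e/dt = k(Te) n_g n_e - n_e / tau,
    integrated with explicit Euler steps (n_g: neutral density, m^-3;
    tau: effective particle loss time, s)."""
    n_e = n0
    for _ in range(steps):
        n_e += dt * (ionization_rate(te_ev) * n_g * n_e - n_e / tau)
    return n_e

# Steady state requires k(Te) * n_g = 1/tau, i.e. Te = E_iz / ln(k0 * n_g * tau)
te_ss = 15.8 / math.log(5.0e-14 * 1.0e20 * 1.0e-5)   # ~4 eV for these numbers
print(te_ss)
```

    Below `te_ss` the electron density decays, above it the density grows; full global models close the system with an energy balance and a much richer reaction chemistry.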

  9. A Methodology for Modeling Confined, Temperature Sensitive Cushioning Systems

    Science.gov (United States)

    1980-06-01

    thickness of cushion T and temperature θ as independent variables, and, as the dependent variable, G, the peak acceleration. The initial model, Equation (IV-11), proved deficient ...

  10. VT Route Log Points 2017

    Data.gov (United States)

    Vermont Center for Geographic Information — This data layer is used with VTrans' Integrated Route Log System (IRA). It is also used to calibrate the linear referencing systems, including the End-to-End and...

  11. New materials for fireplace logs

    Science.gov (United States)

    Kieselback, D. J.; Smock, A. W.

    1971-01-01

    Fibrous insulation and refractory concrete are used for logs as well as fireproof walls, incinerator bricks, planters, and roof shingles. Insulation is lighter and more shock resistant than fireclay. Lightweight slag bonded with refractory concrete serves as aggregate.

  12. An Empirical Temperature Variance Source Model in Heated Jets

    Science.gov (United States)

    Khavaran, Abbas; Bridges, James

    2012-01-01

    An acoustic analogy approach is implemented that models the sources of jet noise in heated jets. The equivalent sources of turbulent mixing noise are recognized as the differences between the fluctuating and Favre-averaged Reynolds stresses and enthalpy fluxes. While in a conventional acoustic analogy only Reynolds stress components are scrutinized for their noise generation properties, it is now accepted that a comprehensive source model should include the additional entropy source term. Following Goldstein's generalized acoustic analogy, the set of Euler equations is divided into two sets of equations that govern a non-radiating base flow plus its residual components. When the base flow is considered as a locally parallel mean flow, the residual equations may be rearranged to form an inhomogeneous third-order wave equation. A general solution is subsequently written using a Green's function method, while all non-linear terms are treated as the equivalent sources of aerodynamic sound and are modeled accordingly. In a previous study, a specialized Reynolds-averaged Navier-Stokes (RANS) solver was implemented to compute the variance of thermal fluctuations that determines the enthalpy flux source strength. The main objective here is to present an empirical model capable of providing a reasonable estimate of the stagnation temperature variance in a jet. Such a model is parameterized as a function of the mean stagnation temperature gradient in the jet, and is evaluated using commonly available RANS solvers. The ensuing thermal source distribution is compared with measurements as well as computational results from a dedicated RANS solver that employs an enthalpy variance and dissipation rate model. Turbulent mixing noise predictions are presented for a wide range of jet temperature ratios from 1.0 to 3.20.

  13. Modeling of helium effects in metals: High temperature embrittlement

    International Nuclear Information System (INIS)

    Trinkaus, H.

    1985-01-01

    The effects of helium on swelling, creep rupture and fatigue properties of fusion reactor materials subjected to (n,α)-reactions and/or direct α-injection, are controlled by bubble formation. The understanding of such effects requires therefore the modeling of (1) diffusional reactions of He atoms with other defects; (2) nucleation and growth of He bubbles; (3) transformation of such bubbles into cavities under continuous He generation and irradiation or creep stress. The present paper is focussed on the modeling of the (coupled) high temperature bubble nucleation and growth processes within and on grain boundaries. Two limiting cases are considered: di-atomic nucleation described by the simplest possible sets of rate equations, and multi-atomic nucleation described by classical nucleation theory. Scaling laws are derived which characterize the dependence of the bubble densities upon time (He-dose), He generation rate and temperature. Comparison with experimental data of AISI 316 SS α-implanted at temperatures around 1000 K indicates bubble nucleation of the multi-atomic type. The nucleation and growth models are applied to creep tests performed during α-implantation suggesting that in these cases gas driven bubble growth is the life time controlling mechanism. The narrow (creep stress/He generation rate) range of this mechanism in a mechanism map constructed from these tests indicates that in many reactor situations the time to rupture is probably controlled by stress driven cavity growth rather than by gas driven bubble growth. (orig.)

  14. SMOS brightness temperature assimilation into the Community Land Model

    Directory of Open Access Journals (Sweden)

    D. Rains

    2017-11-01

    Full Text Available SMOS (Soil Moisture and Ocean Salinity) mission brightness temperatures at a single incident angle are assimilated into the Community Land Model (CLM) across Australia to improve soil moisture simulations. To this end, the data assimilation system DasPy is coupled to the local ensemble transform Kalman filter (LETKF) as well as to the Community Microwave Emission Model (CMEM). Brightness temperature climatologies are precomputed to enable the assimilation of brightness temperature anomalies, making use of 6 years of SMOS data (2010-2015). Mean correlation R with in situ measurements increases moderately from 0.61 to 0.68 (11 %) for upper soil layers if the root zone is included in the updates. A reduced improvement of 5 % is achieved if the assimilation is restricted to the upper soil layers. Root-zone simulations improve by 7 % when updating both the top layers and root zone, and by 4 % when only updating the top layers. Mean increments and increment standard deviations are compared for the experiments. The long-term assimilation impact is analysed by looking at a set of quantiles computed for soil moisture at each grid cell. Within hydrological monitoring systems, extreme dry or wet conditions are often defined via their relative occurrence, adding great importance to assimilation-induced quantile changes. Although still limited now, longer L-band radiometer time series will become available, making model output improved by assimilating such data more usable for extreme-event statistics.
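    The core of such an assimilation step can be sketched with a single-observation ensemble Kalman update (the perturbed-observations form, simpler than the LETKF used with DasPy). The linear observation operator standing in for CMEM, and all numerical values, are assumptions for illustration only:

```python
import random
import statistics

def enkf_update(ensemble, obs, obs_err, h):
    """Single-observation ensemble Kalman update (perturbed-observations
    form). `h` maps the state (soil moisture) to the observed quantity
    (a brightness-temperature anomaly); the Kalman gain is estimated from
    the ensemble's sample covariances."""
    hx = [h(x) for x in ensemble]
    x_mean, hx_mean = statistics.mean(ensemble), statistics.mean(hx)
    cov_xy = sum((x - x_mean) * (y - hx_mean)
                 for x, y in zip(ensemble, hx)) / (len(ensemble) - 1)
    var_y = statistics.variance(hx)
    gain = cov_xy / (var_y + obs_err ** 2)
    rng = random.Random(0)  # fixed seed for reproducible obs perturbations
    return [x + gain * (obs + rng.gauss(0.0, obs_err) - y)
            for x, y in zip(ensemble, hx)]

# Assumed linear proxy for the radiative-transfer model:
# drier soil -> warmer brightness temperature
h = lambda sm: -200.0 * sm + 300.0

rng = random.Random(1)
prior = [0.25 + rng.gauss(0.0, 0.05) for _ in range(64)]  # m^3/m^3
posterior = enkf_update(prior, obs=260.0, obs_err=2.0, h=h)
```

    With these numbers a 260 K observation implies soil moisture near 0.2, so the update pulls the ensemble mean from 0.25 toward 0.2 and tightens its spread; the LETKF achieves the same effect with a localized, deterministic transform.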

  15. Small velocity and finite temperature variations in kinetic relaxation models

    KAUST Repository

    Markowich, Peter; Jüngel, Ansgar; Aoki, Kazuo

    2010-01-01

    A small Knudsen number analysis of a kinetic equation in the diffusive scaling is performed. The collision kernel is of BGK type with a general local Gibbs state. Assuming that the flow velocity is of the order of the Knudsen number, a Hilbert expansion yields a macroscopic model with finite temperature variations, whose complexity lies between the hydrodynamic and the energy-transport equations. Its mathematical structure is explored and macroscopic models for specific examples of the global Gibbs state are presented. © American Institute of Mathematical Sciences.

  16. High-temperature series expansions for random Potts models

    Directory of Open Access Journals (Sweden)

    M.Hellmund

    2005-01-01

    Full Text Available We discuss recently generated high-temperature series expansions for the free energy and the susceptibility of random-bond q-state Potts models on hypercubic lattices. Using the star-graph expansion technique, quenched disorder averages can be calculated exactly for arbitrary uncorrelated coupling distributions while keeping the disorder strength p as well as the dimension d as symbolic parameters. We present analyses of the new series for the susceptibility of the Ising (q=2 and 4-state Potts model in three dimensions up to the order 19 and 18, respectively, and compare our findings with results from field-theoretical renormalization group studies and Monte Carlo simulations.

  17. Reheating temperature and gauge mediation models of supersymmetry breaking

    International Nuclear Information System (INIS)

    Olechowski, Marek; Pokorski, Stefan; Turzynski, Krzysztof; Wells, James D.

    2009-01-01

    For supersymmetric theories with gravitino dark matter, the maximal reheating temperature consistent with big bang nucleosynthesis bounds arises when the physical gaugino masses are degenerate. We consider the cases of a stau or sneutrino next-to-lightest superpartner, which have relatively less constraint from big bang nucleosynthesis. The resulting parameter space is consistent with leptogenesis requirements, and can be reached in generalized gauge mediation models. Such models illustrate a class of theories that overcome the well-known tension between big bang nucleosynthesis and leptogenesis.

  18. Effects of electrostatic discharge on three cryogenic temperature sensor models

    Energy Technology Data Exchange (ETDEWEB)

    Courts, S. Scott; Mott, Thomas B. [Lake Shore Cryotronics, 575 McCorkle Blvd., Westerville, OH 43082 (United States)

    2014-01-29

    Cryogenic temperature sensors are not usually thought of as electrostatic discharge (ESD) sensitive devices. However, the most common cryogenic thermometers in use today are thermally sensitive diodes or resistors - both electronic devices in their base form. As such, they are sensitive to ESD at some level above which either catastrophic or latent damage can occur. Instituting an ESD program for safe handling and installation of the sensor is costly and it is desirable to balance the risk of ESD damage against this cost. However, this risk cannot be evaluated without specific knowledge of the ESD vulnerability of the devices in question. This work examines three types of cryogenic temperature sensors for ESD sensitivity - silicon diodes, Cernox™ resistors, and wire-wound platinum resistors, all manufactured by Lake Shore Cryotronics, Inc. Testing was performed per TIA/EIA FOTP129 (Human Body Model). Damage was found to occur in the silicon diode sensors at discharge levels of 1,500 V. For Cernox™ temperature sensors, damage was observed at 3,500 V. The platinum temperature sensors were not damaged by ESD exposure levels of 9,900 V. At the lower damage limit, both the silicon diode and the Cernox™ temperature sensors showed relatively small calibration shifts of 1 to 3 K at room temperature. The diode sensors were stable with time and thermal cycling, but the long-term stability of the Cernox™ sensors was degraded. Catastrophic failure occurred at higher levels of ESD exposure.

  19. Effects of electrostatic discharge on three cryogenic temperature sensor models

    International Nuclear Information System (INIS)

    Courts, S. Scott; Mott, Thomas B.

    2014-01-01

    Cryogenic temperature sensors are not usually thought of as electrostatic discharge (ESD) sensitive devices. However, the most common cryogenic thermometers in use today are thermally sensitive diodes or resistors - both electronic devices in their base form. As such, they are sensitive to ESD at some level above which either catastrophic or latent damage can occur. Instituting an ESD program for safe handling and installation of the sensor is costly and it is desirable to balance the risk of ESD damage against this cost. However, this risk cannot be evaluated without specific knowledge of the ESD vulnerability of the devices in question. This work examines three types of cryogenic temperature sensors for ESD sensitivity - silicon diodes, Cernox™ resistors, and wire-wound platinum resistors, all manufactured by Lake Shore Cryotronics, Inc. Testing was performed per TIA/EIA FOTP129 (Human Body Model). Damage was found to occur in the silicon diode sensors at discharge levels of 1,500 V. For Cernox™ temperature sensors, damage was observed at 3,500 V. The platinum temperature sensors were not damaged by ESD exposure levels of 9,900 V. At the lower damage limit, both the silicon diode and the Cernox™ temperature sensors showed relatively small calibration shifts of 1 to 3 K at room temperature. The diode sensors were stable with time and thermal cycling, but the long-term stability of the Cernox™ sensors was degraded. Catastrophic failure occurred at higher levels of ESD exposure.

  20. MODELING OF TEMPERATURE FIELDS IN SOLID HEAT ACCUMULATORS

    Directory of Open Access Journals (Sweden)

    S. S. Belimenko

    2016-10-01

    Purpose. One of the current priorities of energy conservation is saving on heating costs in commercial and residential buildings by storing thermal energy at night and returning it during the day. The economic effect is achieved through the difference between daytime and night-time electricity tariffs. Solid heat accumulators are among the most common devices for accumulating and releasing heat. The main purposes of the work are: (1) development of software for calculating the temperature field of a flat solid heat accumulator that stores thermal energy in the volume of a storage material without phase transition; and (2) determination of the temperature distribution in its volume under convective heat transfer. Methodology. To achieve the study objectives, heat transfer theory and the Laplace integral transform were used. On this basis, the problems of determining the temperature fields in the channels of heat accumulators with different cross-sectional shapes were solved. Findings. The authors developed a calculation method and obtained solutions for determining the temperature fields in the channels of a solid heat accumulator under convective heat transfer. Temperature fields along the length and across the thickness of the channels were investigated. Experimental studies on physical models and industrial equipment were conducted. Originality. For the first time, a technique was proposed for calculating the temperature field in channels of different cross-sections of a solid heat accumulator in the charging and discharging modes. The calculation results are confirmed by experimental research. Practical value. The proposed technique is used in the design of solid heat accumulators of various power ratings, and their full-scale production has been organized.
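As a rough numerical counterpart to the Laplace-transform solutions described in the abstract, the charging of a flat storage slab by convection can be sketched with a one-dimensional explicit finite-difference scheme. All material properties and boundary conditions below are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Charging of a flat solid storage slab: convective heating on one face
# (Robin boundary), adiabatic on the other. Properties are illustrative
# (roughly a dense ceramic), not taken from the paper.
L, k, rho, cp = 0.1, 4.0, 3000.0, 1000.0      # m, W/(m K), kg/m^3, J/(kg K)
h, t_air, t0 = 50.0, 600.0, 20.0              # W/(m^2 K); air and initial temp, deg C
alpha = k / (rho * cp)                        # thermal diffusivity

nx = 51
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha                      # explicit stability: dt <= dx^2 / (2 alpha)
T = np.full(nx, t0)

for _ in range(int(3600 / dt)):               # simulate one hour of charging
    Tn = T.copy()
    # interior nodes: classic FTCS update of the heat equation
    Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    # convective face: balance conduction into node 0 against h*(T0 - T_air)
    Tn[0] = (k / dx * T[1] + h * t_air) / (k / dx + h)
    Tn[-1] = Tn[-2]                           # adiabatic back face
    T = Tn
```

After an hour the heated face sits a few hundred degrees above the back face, the kind of length-and-thickness temperature field the paper's analytical solutions describe.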

  1. Precipitates/Salts Model Calculations for Various Drift Temperature Environments

    International Nuclear Information System (INIS)

    Marnier, P.

    2001-01-01

    The objective and scope of this calculation is to assist Performance Assessment Operations and the Engineered Barrier System (EBS) Department in modeling the geochemical effects of evaporation within a repository drift. This work is developed and documented using procedure AP-3.12Q, Calculations, in support of ''Technical Work Plan For Engineered Barrier System Department Modeling and Testing FY 02 Work Activities'' (BSC 2001a). The primary objective of this calculation is to predict the effects of evaporation on the abstracted water compositions established in ''EBS Incoming Water and Gas Composition Abstraction Calculations for Different Drift Temperature Environments'' (BSC 2001c). A secondary objective is to predict evaporation effects on observed Yucca Mountain waters for subsequent cement interaction calculations (BSC 2001d). The Precipitates/Salts model is documented in an Analysis/Model Report (AMR), ''In-Drift Precipitates/Salts Analysis'' (BSC 2001b)

  2. Palm distributions for log Gaussian Cox processes

    DEFF Research Database (Denmark)

    Coeurjolly, Jean-Francois; Møller, Jesper; Waagepetersen, Rasmus Plenge

    2017-01-01

    This paper establishes a remarkable result regarding Palm distributions for a log Gaussian Cox process: the reduced Palm distribution of a log Gaussian Cox process is itself a log Gaussian Cox process that differs from the original process only in its intensity function. This new result is used to study functional summaries for log Gaussian Cox processes.
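Stated compactly (the notation is assumed here, since the abstract does not fix it): if $X$ is an LGCP driven by a Gaussian random field with mean function $m$ and covariance function $c$, the result says

```latex
X \sim \mathrm{LGCP}(m, c)
\quad\Longrightarrow\quad
X_u^{!} \sim \mathrm{LGCP}\bigl(m + c(\cdot, u),\, c\bigr),
```

i.e. the reduced Palm distribution at a point $u$ keeps the covariance and shifts the log-intensity mean by $c(\cdot, u)$, so only the intensity function changes.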

  3. Modelling guidelines for core exit temperature simulations with system codes

    Energy Technology Data Exchange (ETDEWEB)

    Freixa, J., E-mail: jordi.freixa-terradas@upc.edu [Department of Physics and Nuclear Engineering, Technical University of Catalonia (UPC) (Spain); Paul Scherrer Institut (PSI), 5232 Villigen (Switzerland); Martínez-Quiroga, V., E-mail: victor.martinez@nortuen.com [Department of Physics and Nuclear Engineering, Technical University of Catalonia (UPC) (Spain); Zerkak, O., E-mail: omar.zerkak@psi.ch [Paul Scherrer Institut (PSI), 5232 Villigen (Switzerland); Reventós, F., E-mail: francesc.reventos@upc.edu [Department of Physics and Nuclear Engineering, Technical University of Catalonia (UPC) (Spain)

    2015-05-15

    Highlights: • Core exit temperature is used in PWRs as an indication of core heat up. • Modelling guidelines for the CET response with system codes. • Modelling of heat transfer processes in the core and UP regions. - Abstract: Core exit temperature (CET) measurements play an important role in the sequence of actions under accident conditions in pressurized water reactors (PWR). Given the difficulties in placing measurements in the core region, CET readings are used as a criterion for the initiation of accident management (AM) procedures because they can indicate a core heat up scenario. However, the CET response has some limitations in detecting inadequate core cooling and core uncovery, simply because the measurement is not taken inside the core. It is therefore of prime importance in the field of nuclear safety for PWR plants to assess the capabilities of system codes for simulating the relation between the CET and the peak cladding temperature (PCT). The work presented in this paper addresses this open question by making use of experimental work at integral test facilities (ITF) where experiments on the evolution of the CET and the PCT during transient conditions have been carried out. In particular, simulations of two experiments performed at the ROSA/LSTF and PKL facilities are presented. The two experiments are part of a counterpart exercise between the OECD/NEA ROSA-2 and OECD/NEA PKL-2 projects. The simulations are used to derive guidelines on how to correctly reproduce the CET response during a core heat up scenario. Three aspects have been identified as being of main importance: (1) a 3-dimensional representation of the core and upper plenum (UP) regions, in order to model the heterogeneity of the power zones and axial areas, (2) a detailed representation of the active and passive heat structures, and (3) the use of simulated thermocouples instead of steam temperatures to represent the CET readings.

  4. Explorations in statistics: the log transformation.

    Science.gov (United States)

    Curran-Everett, Douglas

    2018-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This thirteenth installment of Explorations in Statistics explores the log transformation, an established technique that rescales the actual observations from an experiment so that the assumptions of some statistical analysis are better met. A general assumption in statistics is that the variability of some response Y is homogeneous across groups or across some predictor variable X. If the variability-the standard deviation-varies in rough proportion to the mean value of Y, a log transformation can equalize the standard deviations. Moreover, if the actual observations from an experiment conform to a skewed distribution, then a log transformation can make the theoretical distribution of the sample mean more consistent with a normal distribution. This is important: the results of a one-sample t test are meaningful only if the theoretical distribution of the sample mean is roughly normal. If we log-transform our observations, then we want to confirm the transformation was useful. We can do this if we use the Box-Cox method, if we bootstrap the sample mean and the statistic t itself, and if we assess the residual plots from the statistical model of the actual and transformed sample observations.
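The variance-stabilizing role of the log transformation described above can be demonstrated in a few lines. The synthetic log-normal data below are an illustration, not the installment's own example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic groups whose standard deviation grows with the mean
# (log-normal responses): the textbook case where a log transform helps.
low = rng.lognormal(mean=1.0, sigma=0.5, size=10_000)
high = rng.lognormal(mean=3.0, sigma=0.5, size=10_000)

# Raw scale: the larger-mean group is far more variable...
ratio_raw = float(high.std() / low.std())

# ...but after the log transform both groups share sigma = 0.5,
# so their standard deviations equalize.
ratio_log = float(np.log(high).std() / np.log(low).std())
```

On the raw scale the SD ratio is several-fold; after the transform it is close to 1, which is exactly the homogeneity-of-variance assumption the article discusses.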

  5. Low reheating temperatures in monomial and binomial inflationary models

    International Nuclear Information System (INIS)

    Rehagen, Thomas; Gelmini, Graciela B.

    2015-01-01

    We investigate the allowed range of reheating temperature values in light of the Planck 2015 results and the recent joint analysis of Cosmic Microwave Background (CMB) data from the BICEP2/Keck Array and Planck experiments, using monomial and binomial inflationary potentials. While the well-studied ϕ² inflationary potential is no longer favored by current CMB data, as are ϕ^p potentials with p > 2, a ϕ^1 potential with canonical reheating (w_re = 0) provides a good fit to the CMB measurements. In this last case, we find that the Planck 2015 68% confidence limit upper bound on the spectral index, n_s, implies an upper bound on the reheating temperature of T_re ≲ 6×10^10 GeV, and excludes instantaneous reheating. The low reheating temperatures allowed by this model open the possibility that dark matter could be produced during the reheating period instead of when the Universe is radiation dominated, which could lead to very different predictions for the relic density and momentum distribution of WIMPs, sterile neutrinos, and axions. We also study binomial inflationary potentials and show the effects of a small departure from a ϕ^1 potential. We find that as a subdominant ϕ² term in the potential increases, instantaneous reheating first becomes allowed, and then the lowest possible reheating temperature of T_re = 4 MeV is excluded by the Planck 2015 68% confidence limit.

  6. Geothermal well log interpretation state of the art. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Sanyal, S.K.; Wells, L.E.; Bickham, R.E.

    1980-01-01

    An in-depth study of the state of the art in Geothermal Well Log Interpretation has been made encompassing case histories, technical papers, computerized literature searches, and actual processing of geothermal wells from New Mexico, Idaho, and California. A classification scheme of geothermal reservoir types was defined which distinguishes fluid phase and temperature, lithology, geologic province, pore geometry, salinity, and fluid chemistry. Major deficiencies of Geothermal Well Log Interpretation are defined and discussed with recommendations of possible solutions or research for solutions. The Geothermal Well Log Interpretation study and report has concentrated primarily on Western US reservoirs. Geopressured geothermal reservoirs are not considered.

  7. Log quality enhancement: A systematic assessment of logging company wellsite performance and log quality

    International Nuclear Information System (INIS)

    Farnan, R.A.; Mc Hattie, C.M.

    1984-01-01

    To improve the monitoring of logging company performance, computer programs were developed to assess en masse the information from log quality checklists completed at the wellsite by the service company engineer and the Phillips representative. A study of all logging jobs performed by different service companies for Phillips in Oklahoma (panhandle excepted) during 1982 enabled several pertinent and beneficial interpretations to be made. Company A provided the best tool and crew service. Company B incurred an excessive amount of lost time related to tool failure, in particular with the neutron-density tool combination. Company C, although used only three times, incurred no lost time. With a reasonable data base, valid conclusions were made pertaining, for example, to repeated tool malfunctions. The actual logs were then assessed for quality.

  8. Tantalum strength model incorporating temperature, strain rate and pressure

    Science.gov (United States)

    Lim, Hojun; Battaile, Corbett; Brown, Justin; Lane, Matt

    Tantalum is a body-centered-cubic (BCC) refractory metal that is widely used in many applications in high temperature, strain rate and pressure environments. In this work, we propose a physically-based strength model for tantalum that incorporates effects of temperature, strain rate and pressure. A constitutive model for single crystal tantalum is developed based on dislocation kink-pair theory, and calibrated to measurements on single crystal specimens. The model is then used to predict deformations of single- and polycrystalline tantalum. In addition, the proposed strength model is implemented into Sandia's ALEGRA solid dynamics code to predict plastic deformations of tantalum in engineering-scale applications at extreme conditions, e.g. Taylor impact tests and Z machine's high pressure ramp compression tests, and the results are compared with available experimental data. Sandia National Laboratories is a multi program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
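The dislocation kink-pair picture underlying such constitutive models is commonly expressed through a thermal-activation law of the Kocks-Argon-Ashby form. The sketch below uses that generic form; the function name and every parameter value are illustrative assumptions, not the paper's calibration for tantalum:

```python
import numpy as np

# Generic kink-pair thermal-activation flow-stress law (Kocks-Argon-Ashby form):
# tau = tau0 * (1 - ((kB*T/dH0) * ln(rate0/rate))**(1/q))**(1/p).
# All parameter values are illustrative, not the paper's calibration.
def flow_stress(T, rate, tau0=400e6, dH0=1.602e-19, rate0=1e7, p=0.5, q=1.5):
    kB = 1.380649e-23                        # Boltzmann constant, J/K
    x = (kB * T / dH0) * np.log(rate0 / rate)
    x = np.clip(x, 0.0, 1.0)                 # fully athermal above the knee temperature
    return tau0 * (1.0 - x ** (1.0 / q)) ** (1.0 / p)
```

The qualitative behavior matches the regime the abstract targets: the thermal part of the flow stress falls with temperature and rises with strain rate, vanishing once thermal activation alone can carry dislocations over the kink-pair barrier.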

  9. Model for low temperature oxidation during long term interim storage

    Energy Technology Data Exchange (ETDEWEB)

    Desgranges, Clara; Bertrand, Nathalie; Gauvain, Danielle; Terlain, Anne [Service de la Corrosion et du Comportement des Materiaux dans leur Environnement, CEA/Saclay - 91191 Gif-sur-Yvette Cedex (France); Poquillon, Dominique; Monceau, Daniel [CIRIMAT UMR 5085, ENSIACET-INPT, 31077 Toulouse Cedex 4 (France)

    2004-07-01

    For high-level nuclear waste containers in long-term interim storage, dry oxidation will be the first and the main degradation mode during about one century. The metal lost by dry oxidation over such a long period must be evaluated with good reliability. To achieve this goal, modelling of the oxide scale growth is necessary, and this is the aim of the dry oxidation studies performed in the framework of the COCON program. An advanced model, based on the description of elementary mechanisms involved in scale growth at low temperatures such as partial interfacial control of the oxidation kinetics and/or grain boundary diffusion, is being developed in order to increase the reliability of the long-term extrapolations deduced from basic models fitted to short-time experiments. Since only few experimental data on dry oxidation are available in the temperature range of interest, experiments have also been performed to evaluate the relevant input parameters for the models, such as the grain size of the oxide scale, considering iron as a simplified material. (authors)

  10. Model for low temperature oxidation during long term interim storage

    International Nuclear Information System (INIS)

    Desgranges, Clara; Bertrand, Nathalie; Gauvain, Danielle; Terlain, Anne; Poquillon, Dominique; Monceau, Daniel

    2004-01-01

    For high-level nuclear waste containers in long-term interim storage, dry oxidation will be the first and the main degradation mode during about one century. The metal lost by dry oxidation over such a long period must be evaluated with good reliability. To achieve this goal, modelling of the oxide scale growth is necessary, and this is the aim of the dry oxidation studies performed in the framework of the COCON program. An advanced model, based on the description of elementary mechanisms involved in scale growth at low temperatures such as partial interfacial control of the oxidation kinetics and/or grain boundary diffusion, is being developed in order to increase the reliability of the long-term extrapolations deduced from basic models fitted to short-time experiments. Since only few experimental data on dry oxidation are available in the temperature range of interest, experiments have also been performed to evaluate the relevant input parameters for the models, such as the grain size of the oxide scale, considering iron as a simplified material. (authors)

  11. Theoretical temperature model with experimental validation for CLIC Accelerating Structures

    CERN Document Server

    AUTHOR|(CDS)2126138; Vamvakas, Alex; Alme, Johan

    Micron level stability of the Compact Linear Collider (CLIC) components is one of the main requirements to meet the luminosity goal for the future $48\,km$ long underground linear accelerator. The radio frequency (RF) power used for beam acceleration causes heat generation within the aligned structures, resulting in mechanical movements and structural deformations. A dedicated control of the air- and water-cooling system in the tunnel is therefore crucial to improve alignment accuracy. This thesis investigates the thermo-mechanical behavior of the CLIC Accelerating Structure (AS). In CLIC, the AS must be aligned to a precision of $10\,\mu m$. The thesis shows that a relatively simple theoretical model can be used within reasonable accuracy to predict the temperature response of an AS as a function of the applied RF power. During failure scenarios or maintenance interventions, the RF power is turned off, resulting in no heat dissipation and a decrease in the overall temperature of the components. The theoretica...

  12. Inversion of a lateral log using neural networks

    International Nuclear Information System (INIS)

    Garcia, G.; Whitman, W.W.

    1992-01-01

    In this paper a technique using neural networks is demonstrated for the inversion of a lateral log. The lateral log is simulated by a finite difference method which in turn is used as an input to a backpropagation neural network. An initial guess earth model is generated from the neural network, which is then input to a Marquardt inversion. The neural network reacts to gross and subtle data features in actual logs and produces a response inferred from the knowledge stored in the network during a training process. The neural network inversion of lateral logs is tested on synthetic and field data. Tests using field data resulted in a final earth model whose simulated lateral is in good agreement with the actual log data

  13. Temperature Effect on Micelle Formation: Molecular Thermodynamic Model Revisited.

    Science.gov (United States)

    Khoshnood, Atefeh; Lukanov, Boris; Firoozabadi, Abbas

    2016-03-08

    Temperature affects the aggregation of macromolecules such as surfactants, polymers, and proteins in aqueous solutions. The effect on the critical micelle concentration (CMC) is often nonmonotonic. In this work, the effect of temperature on the micellization of ionic and nonionic surfactants in aqueous solutions is studied using a molecular thermodynamic model. Previous studies based on this technique have predicted monotonic behavior for ionic surfactants. Our investigation shows that the choice of tail transfer energy to describe the hydrophobic effect between the surfactant tails and the polar solvent molecules plays a key role in the predicted CMC. We modify the tail transfer energy by taking into account the effect of the surfactant head on the neighboring methylene group. The modification improves the description of the CMC and the predicted micellar size for aqueous solutions of sodium n-alkyl sulfate, dodecyl trimethylammonium bromide (DTAB), and n-alkyl polyoxyethylene. The new tail transfer energy describes the nonmonotonic behavior of CMC versus temperature. In the DTAB-water system, we redefine the head size by including the methylene group, next to the nitrogen, in the head. The change in the head size along with our modified tail transfer energy improves the CMC and aggregation size prediction significantly. Tail transfer is a dominant energy contribution in micellar and microemulsion systems. It also promotes the adsorption of surfactants at fluid-fluid interfaces and affects the formation of adsorbed layer at fluid-solid interfaces. Our proposed modifications have direct applications in the thermodynamic modeling of the effect of temperature on molecular aggregation, both in the bulk and at the interfaces.

  14. Modelling the behaviour of 210Po in high temperature processes

    International Nuclear Information System (INIS)

    Mora, J.C.; Robles, B.; Corbacho, J.A.; Gasco, Catalina; Gazquez, M.J.

    2011-01-01

    In several Naturally Occurring Radioactive Material (NORM) industries, relatively high temperatures are used as part of the industrial processes. In coal combustion, as in other high temperature processes, an increase in the activity concentration of every natural radioisotope is produced both in residues and in by-products. An additional increase can be observed in the activity concentration of radionuclides of elements with low boiling points. This work centres on the increase of polonium, more precisely its radioisotope Po-210, which is present in the natural decay chains and has a half-life long enough to be considered for radiation protection purposes. This additional increase appears mainly in the residual particles suspended in the flue gases: the fly-ashes. In addition, scales with a high concentration of this radioisotope were observed; these scales are produced on surfaces with a temperature lower than the boiling point of the chemical element. Both the accumulation in particles and the production of scales are attributed to condensation effects. When effective doses for the public and for workers are evaluated taking these increases in activity concentration into account, the use of theoretical models is necessary. In this work a theoretical description of these effects is presented. Moreover, the predictions of the model were verified by comparison with measurements carried out in coal-fired power plants. The description presented here is applicable in general to the behaviour of Po-210 in other NORM industries where high temperature processes involving raw materials are used, such as ceramics, cement production, tile production or steel processing.

  15. Influence of spatial temperature estimation method in ecohydrologic modeling in the western Oregon Cascades

    Science.gov (United States)

    E. Garcia; C.L. Tague; J. Choate

    2013-01-01

    Most spatially explicit hydrologic models require estimates of air temperature patterns. For these models, empirical relationships between elevation and air temperature are frequently used to upscale point measurements or downscale regional and global climate model estimates of air temperature. Mountainous environments are particularly sensitive to air temperature...
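The elevation-to-temperature regression such models rely on is a one-line fit. A minimal example, with station elevations and temperatures invented for illustration:

```python
import numpy as np

# Invented station data: elevation (m) and mean air temperature (deg C).
elev = np.array([200.0, 450.0, 800.0, 1200.0, 1600.0, 2100.0])
temp = np.array([14.2, 12.8, 10.5, 8.1, 5.6, 2.4])

# Fit the empirical lapse rate (slope of temperature against elevation),
# then downscale/upscale to the cells of an elevation grid.
slope, intercept = np.polyfit(elev, temp, 1)
grid_elev = np.array([300.0, 950.0, 1750.0])
grid_temp = slope * grid_elev + intercept     # estimated air temperatures
```

The fitted slope is a lapse rate of roughly -6 °C per km; the paper's point is that mountainous basins are sensitive to exactly this choice of spatial estimation method.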

  16. Log Gaussian Cox processes on the sphere

    DEFF Research Database (Denmark)

    Pacheco, Francisco Andrés Cuevas; Møller, Jesper

    We define and study the existence of log Gaussian Cox processes (LGCPs) for the description of inhomogeneous and aggregated/clustered point patterns on the d-dimensional sphere, with d = 2 of primary interest. Useful theoretical properties of LGCPs are studied and applied for the description of sky...... positions of galaxies, in comparison with previous analysis using a Thomas process. We focus on simple estimation procedures and model checking based on functional summary statistics and the global envelope test....

  17. High temperature viscoplastic ratchetting: Material response or modeling artifact

    International Nuclear Information System (INIS)

    Freed, A.D.

    1991-01-01

    Ratchetting, the net accumulation of strain over a loading cycle, is a deformation mechanism that leads to distortions in shape, often resulting in a loss of function that culminates in structural failure. Viscoplastic ratchetting is prevalent at high homologous temperatures where viscous characteristics are prominent in material response. This deformation mechanism is accentuated by the presence of a mean stress, a consequence of interaction between thermal gradients and structural constraints. Favorable conditions for viscoplastic ratchetting exist in the Stirling engines being developed by the National Aeronautics and Space Administration (NASA) and the Department of Energy (DOE) for space and terrestrial power applications. To assess the potential for ratchetting and its effect on the durability of high temperature structures requires a viscoplastic analysis of the design. But ratchetting is a very difficult phenomenon to model accurately. One must therefore ask whether the results from such an analysis are indicative of actual material behavior, or if they are artifacts of the theory being used in the analysis. There are several subtle aspects of a viscoplastic model that must be dealt with in order to model ratchetting behavior accurately, and therefore obtain meaningful predictions from it. In this paper, some of these subtleties and the necessary ratchet experiments needed to obtain an accurate viscoplastic representation of a material are discussed.

  18. Design and Modelling of Small Scale Low Temperature Power Cycles

    DEFF Research Database (Denmark)

    Wronski, Jorrit

    The work presented in this report contributes to the state of the art within design and modelling of small scale low temperature power cycles. The study is divided into three main parts: (i) fluid property evaluation, (ii) expansion device investigations and (iii) heat exchanger performance [...]-oriented Modelica code and was included in the ThermoCycle framework for small scale ORC systems. Special attention was paid to the valve system, and a control method for variable expansion ratios was introduced based on a cogeneration scenario. Admission control based on evaporator and condenser conditions...

  19. Bona Fide Thermodynamic Temperature in Nonequilibrium Kinetic Ising Models

    OpenAIRE

    Sastre, Francisco; Dornic, Ivan; Chaté, Hugues

    2003-01-01

    We show that a nominal temperature can be consistently and uniquely defined everywhere in the phase diagram of large classes of nonequilibrium kinetic Ising spin models. In addition, we confirm the recent proposal that, at critical points, the large-time "fluctuation-dissipation ratio" $X_\infty$ is a universal amplitude ratio, and find in particular $X_\infty \approx 0.33(2)$ and $X_\infty = 1/2$ for the magnetization in, respectively, the two-dimensional Ising and voter universality classes.
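In the notation commonly used for such models (assumed here, since the abstract does not spell it out), the fluctuation-dissipation ratio compares the linear response function $R$ with the correlation function $C$ at two times $t > t_w$:

```latex
X(t, t_w) \;=\; \frac{T\, R(t, t_w)}{\partial C(t, t_w)/\partial t_w},
\qquad
X_\infty \;=\; \lim_{t_w \to \infty} \lim_{t \to \infty} X(t, t_w),
```

with $X \equiv 1$ recovering the equilibrium fluctuation-dissipation theorem; the abstract's claim is that the limit $X_\infty$ is a universal amplitude ratio at criticality.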

  20. Improving the performance of temperature index snowmelt model of SWAT by using MODIS land surface temperature data.

    Science.gov (United States)

    Yang, Yan; Onishi, Takeo; Hiramatsu, Ken

    2014-01-01

    Simulation results of the widely used temperature index snowmelt model are greatly influenced by the input air temperature data. Spatially sparse air temperature observations remain the main source of uncertainty and error in that model, which limits its applications. To address this problem, we created new air temperature data using linear regression relationships formulated from MODIS land surface temperature data. The Soil and Water Assessment Tool (SWAT) model, which includes an improved temperature index snowmelt module, was chosen to test the newly created data. The performance of the new data was assessed by evaluating daily snowmelt simulations in three test basins of the Amur River, using the coefficient of determination (R²) and the Nash-Sutcliffe efficiency (NSE) for evaluation. The results indicate that MODIS land surface temperature data can be used as a new source for air temperature data creation. This will improve snow simulation using the temperature index model in areas with sparse air temperature observations.
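The data-creation step described above amounts to a linear regression of station air temperature on MODIS LST, scored with R² and NSE. A schematic version with synthetic numbers (the coefficients are invented, not the paper's regression results):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical paired samples: MODIS LST and station air temperature
# (deg C) at overpass times. The "true" coefficients are invented.
lst = rng.uniform(-10.0, 30.0, 200)
air = 0.75 * lst + 1.5 + rng.normal(0.0, 1.2, 200)

# Fit air = a * LST + b: the relation used to create air-temperature
# inputs where station observations are sparse.
a, b = np.polyfit(lst, air, 1)
pred = a * lst + b

# Score with the paper's metrics, R^2 and Nash-Sutcliffe efficiency (NSE);
# for an OLS fit evaluated on its own training data the two coincide.
ss_res = float(np.sum((air - pred) ** 2))
ss_tot = float(np.sum((air - air.mean()) ** 2))
nse = 1.0 - ss_res / ss_tot
```

In practice the regression would be built per station or per land-cover class and then applied to the gridded LST field to synthesize dense air-temperature forcing for the snowmelt module.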

  1. ENHANCED MODELING OF REMOTELY SENSED ANNUAL LAND SURFACE TEMPERATURE CYCLE

    Directory of Open Access Journals (Sweden)

    Z. Zou

    2017-09-01

    Satellite thermal remote sensing provides access to large-scale land surface temperature (LST) data, but non-clear-sky conditions also generate missing and abnormal values. Given this limitation, the Annual Temperature Cycle (ATC) model has been employed to reconstruct continuous daily LST data over a year. The original model, ATCO, used harmonic functions, but the dramatic changes in the real LST caused by changing weather remained uncaptured by the smooth sine curve. Using Aqua/MODIS LST products, NDVI and meteorological data, we proposed an enhanced model, ATCE, based on ATCO to describe these fluctuations, and compared their performances for the Yangtze River Delta region of China. The results demonstrated that the overall root mean square errors (RMSEs) of ATCE were lower than those of ATCO, with a larger improvement in the daytime than at night: the errors decreased by 0.64 K and 0.36 K, respectively. The improvement in accuracy varied with land cover type: forest, grassland and built-up areas improved more than water. Spatial heterogeneity was also observed in the performance of the ATC models: the RMSEs of built-up areas, forest and grassland were around 3.0 K in the daytime, while water attained 2.27 K; at night, the accuracies of all types significantly improved, converging to a similar RMSE level of about 2 K. Comparing the differences between the LSTs simulated by the two models across seasons, the differences were smaller in spring and autumn and larger in summer and winter.
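The ATCO baseline referred to above is a single annual harmonic; written in sin/cos components it reduces to ordinary linear least squares. A sketch with synthetic daily LST (the parameter names MAST and YAST follow common ATC usage and are assumptions here, as are all numeric values):

```python
import numpy as np

rng = np.random.default_rng(7)

# One year of synthetic daily LST following a single annual harmonic plus
# weather noise: LST(d) = MAST + YAST * sin(2*pi*d/365 + theta).
days = np.arange(365)
lst = 15.0 + 12.0 * np.sin(2 * np.pi * days / 365 + 1.2) + rng.normal(0.0, 2.0, 365)

# With sin/cos components the ATCO-style fit is linear least squares.
A = np.column_stack([np.ones(365),
                     np.sin(2 * np.pi * days / 365),
                     np.cos(2 * np.pi * days / 365)])
coef, *_ = np.linalg.lstsq(A, lst, rcond=None)
mast = float(coef[0])                        # mean annual surface temperature
yast = float(np.hypot(coef[1], coef[2]))     # yearly amplitude
theta = float(np.arctan2(coef[2], coef[1]))  # phase shift
rmse = float(np.sqrt(np.mean((A @ coef - lst) ** 2)))
```

The residual RMSE is exactly the weather-driven fluctuation the smooth curve cannot follow, which is what the enhanced ATCE model adds covariates (NDVI, meteorological data) to capture.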

  2. Hardwood log grades and lumber grade yields for factory lumber logs

    Science.gov (United States)

    Leland F. Hanks; Glenn L. Gammon; Robert L. Brisbin; Everette D. Rast

    1980-01-01

    The USDA Forest Service Standard Grades for Hardwood Factory Lumber Logs are described, and lumber grade yields for 16 species and 2 species groups are presented by log grade and log diameter. The grades enable foresters, log buyers, and log sellers to select and grade those logs suitable for conversion into standard factory grade lumber. By using the appropriate lumber...

  3. The log S - log N distribution of gamma ray bursts

    International Nuclear Information System (INIS)

    Yamagami, Takamasa; Nishimura, Jun; Fujii, Masami

    1982-01-01

    The relation between the size S and the frequency N of gamma ray bursts has been studied. This relation may be determined from the celestial distribution of gamma ray burst sources; the present analysis shows that the log S - log N relation for any direction is determined by that distribution. The observed bursts were analyzed. The celestial distribution of gamma ray burst sources was observed by USSR satellites, and the results suggested that the distribution is isotropic. However, the calculated log S - log N relation based on an isotropic distribution was in disagreement with the observed one. The analysis found that the observed sample missed the low-energy part of the bursts because of detector thresholds, and the discrimination levels of detection were not clear. When a proper threshold level is set for each type of burst and the burst sizes are determined accordingly, the above-mentioned discrepancy disappears regardless of the luminosity and the spatial distribution of the bursts. (Kato, T.)
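The isotropic, homogeneous baseline against which the observations were compared is the classic N(>S) ∝ S^(-3/2) law, which follows from sources filling Euclidean space with brightness falling off as 1/r². A quick Monte Carlo check (geometry and sample sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)

# Sources distributed homogeneously in a unit sphere around the observer;
# observed burst size S falls off as 1/r^2, so isotropy plus homogeneity
# predicts the cumulative count N(>S) proportional to S^(-3/2).
n = 200_000
r = rng.uniform(0.0, 1.0, n) ** (1.0 / 3.0)   # radii uniform in volume
s = 1.0 / r**2

# Slope of log N(>S) versus log S, evaluated well above the size S = 1
# that corresponds to the outer edge of the sampled sphere.
s_grid = np.logspace(0.5, 2.5, 20)
counts = np.array([(s > x).sum() for x in s_grid])
slope = float(np.polyfit(np.log10(s_grid), np.log10(counts), 1)[0])
```

The fitted slope comes out close to -3/2; a detector threshold that removes faint (low-S) bursts flattens the observed curve at the small-size end, which is the abstract's explanation for the apparent discrepancy.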

  4. Mud Logging; Control geologico en perforaciones petroliferas (Mud Logging)

    Energy Technology Data Exchange (ETDEWEB)

    Pumarega Lafuente, J.C.

    1994-12-31

    Mud logging is an important activity in the oil field and a key job in drilling operations. Our duties are the acquisition, collection and interpretation of the geological and engineering data at the wellsite; we also inform the client immediately of any significant changes in the well. (Author)

  5. Borehole logging for uranium exploration

    International Nuclear Information System (INIS)

    1982-01-01

    The present text has been prepared taking into account the requirements of both developing countries, which might be at an incipient stage of uranium exploration, and industrialized countries, where more advanced exploration and resource evaluation techniques are commonly in use. While it was felt necessary to include some discussion of exploration concepts and fundamental physical principles underlying various logging methods, it was not the intention of the consultants to provide a thorough, detailed explanation of the various techniques, or even to give a comprehensive listing thereof. However, a list of references has been included, and it is strongly recommended that the serious student of mineral logging consult this list for further guidance

  6. Pulsed neutron porosity logging system

    International Nuclear Information System (INIS)

    Smith, H.D. Jr.; Smith, M.P.; Schultz, W.E.

    1978-01-01

    An improved pulsed neutron porosity logging system is provided in the present invention. A logging tool provided with a 14 MeV pulsed neutron source, an epithermal neutron detector, and a fast neutron detector is moved through a borehole. Repetitive bursts of neutrons irradiate the earth formations and, during the bursts, the fast neutron population is sampled. During the interval between bursts the epithermal neutron population is sampled along with background gamma radiation due to lingering thermal neutrons. The fast and epithermal neutron population measurements are combined to provide a measurement of formation porosity

  7. Understanding the tropical warm temperature bias simulated by climate models

    Science.gov (United States)

    Brient, Florent; Schneider, Tapio

    2017-04-01

    State-of-the-art coupled general circulation models have difficulties representing the observed spatial pattern of surface temperature. A majority of them suffer from a warm bias in the tropical subsiding regions located over the eastern parts of the oceans. These regions are usually covered by low-level clouds, ranging from stratus along the coasts to more vertically developed shallow cumulus farther offshore, and models usually fail to represent this transition accurately. Here we investigate the physical drivers of this warm bias in CMIP5 models from a near-surface energy budget perspective. We show that overestimated solar insolation due to a lack of stratocumulus mostly explains the warm bias. The bias also arises partly from inter-model differences in surface fluxes that can be traced to differences in near-surface relative humidity and the air-sea temperature gradient. We investigate the role of the atmosphere in driving surface biases by comparing historical and atmospheric (AMIP) experiments. We show that some differences in boundary-layer characteristics, mostly those related to cloud fraction and relative humidity, are already present in AMIP experiments and may be the drivers of the coupled biases. This gives insight into how models can be improved for better simulations of the tropical climate.

  8. Effects of reduced-impact logging on fish assemblages in central Amazonia.

    Science.gov (United States)

    Dias, Murilo S; Magnusson, William E; Zuanon, Jansen

    2010-02-01

    In Amazonia reduced-impact logging, which is meant to reduce environmental disturbance by controlling stem-fall directions and minimizing construction of access roads, has been applied to large areas containing thousands of streams. We investigated the effects of reduced-impact logging on environmental variables and the composition of fish in forest streams in a commercial logging concession in central Amazonia, Amazonas State, Brazil. To evaluate short-term effects, we sampled 11 streams before and after logging in one harvest area. We evaluated medium-term effects by comparing streams in 11 harvest areas logged 1-8 years before the study with control streams in adjacent areas. Each sampling unit was a 50-m stream section. The tetras Pyrrhulina brevis and Hemigrammus cf. pretoensis had higher abundances in plots logged 3 or more years before the study than in more recently logged plots. Fish composition did not differ between two months before and immediately after reduced-impact logging. Temperature and pH varied before and after logging, but those differences were compatible with normal seasonal variation. In the medium term, temperature and cover of logs were lower in logged plots. Differences in ordination scores based on relative fish abundance between streams in control and logged areas changed with time since logging, mainly because some common species increased in abundance after logging. There was no evidence of species loss from the logging concession, but differences in log cover and in ordination scores derived from relative abundance of fish species persisted even after 8 years. For Amazonian streams, reduced-impact logging appears to be a viable alternative to clear-cut practices, which severely affect aquatic communities. Nevertheless, detailed studies are necessary to evaluate subtle long-term effects.

  9. Modeling Study of High Pressure and High Temperature Reservoir Fluids

    DEFF Research Database (Denmark)

    Varzandeh, Farhad

    properties like saturation pressures, densities at reservoir temperature and Stock Tank Oil (STO) densities, while keeping the n-alkane limit of the correlations unchanged. Apart from applying this general approach to PC-SAFT, we have also shown that the approach can be applied to classical cubic models...... approach to characterizing reservoir fluids for any EoS. The approach consists in developing correlations of model parameters first with a database for well-defined components and then adjusting the correlations with a large PVT database. The adjustment is made to minimize the deviation in key PVT...... method to SRK and PR improved the saturation pressure calculation in comparison to the original characterization method for SRK and PR. Using volume translation together with the new characterization approach for SRK and PR gives results for density and STO density comparable to those of the original...

  10. FEM Modeling of the Relationship between the High-Temperature Hardness and High-Temperature, Quasi-Static Compression Experiment.

    Science.gov (United States)

    Zhang, Tao; Jiang, Feng; Yan, Lan; Xu, Xipeng

    2017-12-26

    The high-temperature hardness test has a wide range of applications, but lacks test standards. The purpose of this study is to develop a finite element method (FEM) model of the relationship between the high-temperature hardness test and the high-temperature, quasi-static compression experiment, which is a mature test technology with test standards. A high-temperature, quasi-static compression test and a high-temperature hardness test were carried out, and the relationship between the two sets of results was built through the development of a high-temperature indentation finite element (FE) simulation. The simulated and experimental high-temperature hardness results were compared, verifying the accuracy of the FE simulation. The simulated results show that the high-temperature hardness is essentially independent of load when the pile-up of material during indentation is ignored. The simulated and experimental results agree on the decrease in hardness with thermal softening. The strain and stress of indentation were analyzed from the simulated contours: strain increases with test temperature, while stress decreases with test temperature.

  11. FEM Modeling of the Relationship between the High-Temperature Hardness and High-Temperature, Quasi-Static Compression Experiment

    Directory of Open Access Journals (Sweden)

    Tao Zhang

    2017-12-01

    Full Text Available The high-temperature hardness test has a wide range of applications, but lacks test standards. The purpose of this study is to develop a finite element method (FEM) model of the relationship between the high-temperature hardness test and the high-temperature, quasi-static compression experiment, which is a mature test technology with test standards. A high-temperature, quasi-static compression test and a high-temperature hardness test were carried out, and the relationship between the two sets of results was built through the development of a high-temperature indentation finite element (FE) simulation. The simulated and experimental high-temperature hardness results were compared, verifying the accuracy of the FE simulation. The simulated results show that the high-temperature hardness is essentially independent of load when the pile-up of material during indentation is ignored. The simulated and experimental results agree on the decrease in hardness with thermal softening. The strain and stress of indentation were analyzed from the simulated contours: strain increases with test temperature, while stress decreases with test temperature.

  12. Finite-temperature models of Bose-Einstein condensation

    Energy Technology Data Exchange (ETDEWEB)

    Proukakis, Nick P; Jackson, Brian [School of Mathematics and Statistics, Newcastle University, Newcastle-upon-Tyne NE1 7RU (United Kingdom)], E-mail: Nikolaos.Proukakis@ncl.ac.uk

    2008-10-28

    The theoretical description of trapped weakly interacting Bose-Einstein condensates is characterized by a large number of seemingly very different approaches which have been developed over the course of time by researchers with very distinct backgrounds. Newcomers to this field, experimentalists and young researchers all face a considerable challenge in navigating through the 'maze' of abundant theoretical models, and simple correspondences between existing approaches are not always very transparent. This tutorial provides a generic introduction to such theories, in an attempt to single out common features and deficiencies of certain 'classes of approaches' identified by their physical content, rather than their particular mathematical implementation. This tutorial is structured in a manner accessible to a non-specialist with a good working knowledge of quantum mechanics. Although some familiarity with concepts of quantum field theory would be an advantage, key notions, such as the occupation number representation of second quantization, are nonetheless briefly reviewed. Following a general introduction, the complexity of models is gradually built up, starting from the basic zero-temperature formalism of the Gross-Pitaevskii equation. This structure enables readers to probe different levels of theoretical developments (mean field, number conserving and stochastic) according to their particular needs. In addition to its 'training element', we hope that this tutorial will prove useful to active researchers in this field, both in terms of the correspondences made between different theoretical models, and as a source of reference for existing and developing finite-temperature theoretical models. (phd tutorial)
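
The zero-temperature starting point of the hierarchy the tutorial describes, the Gross-Pitaevskii equation, can be illustrated with a minimal imaginary-time split-step ground-state solver. This is a generic textbook sketch (dimensionless harmonic trap), not code from the tutorial; with the interaction strength g set to 0 the exact oscillator ground-state energy is 0.5.

```python
import numpy as np

n, L, g, dt = 256, 20.0, 0.0, 1e-3
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
k = 2.0 * np.pi * np.fft.fftfreq(n, d=L / n)     # spectral wavenumbers
V = 0.5 * x**2                                   # harmonic trap

psi = np.exp(-x**2)                              # any positive trial state
for _ in range(5000):
    # Strang splitting: half potential step, full kinetic step, half potential step
    psi = psi * np.exp(-0.5 * dt * (V + g * np.abs(psi) ** 2))
    psi = np.fft.ifft(np.fft.fft(psi) * np.exp(-dt * 0.5 * k**2))
    psi = psi * np.exp(-0.5 * dt * (V + g * np.abs(psi) ** 2))
    psi = psi / np.sqrt(np.sum(np.abs(psi) ** 2) * (L / n))   # renormalize

dx = L / n
dpsi = np.fft.ifft(1j * k * np.fft.fft(psi))     # spectral derivative
E = float(np.sum(0.5 * np.abs(dpsi) ** 2 + V * np.abs(psi) ** 2
                 + 0.5 * g * np.abs(psi) ** 4).real * dx)
```

Raising g from zero moves the energy above 0.5 and broadens the cloud, which is where the finite-temperature extensions surveyed in the tutorial take over.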

  13. Modelling of the temperature field that accompanies friction stir welding

    Directory of Open Access Journals (Sweden)

    Nosal Przemysław

    2017-01-01

    Full Text Available The thermal modelling of the Friction Stir Welding process allows for better recognition and understanding of the phenomena occurring during the joining of different materials. It is of particular importance for the optimization of process technology parameters and the mechanical properties of the joint. This work demonstrates the numerical modelling of the temperature distribution accompanying friction stir welding. The axisymmetric problem described by a Fourier-type equation with an internal heat source is considered. In order to solve the diffusive initial value problem, a fully implicit scheme of the finite difference method is applied. The example under consideration deals with the friction stir welding of a plate (0.7 cm thick) made of Al 6082-T6 using a tool made of tungsten alloy, with TiC powder as the material introduced into the weld. The obtained results confirm, both quantitatively and qualitatively, the experimental observation that the highest temperature occurs in the zone where the pin joins the shoulder.
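
The fully implicit (backward Euler) finite-difference scheme the authors apply can be shown in its simplest form on a 1-D heat equation with an internal source; the paper's problem is axisymmetric and more elaborate, and every number below is made up for illustration.

```python
import numpy as np

# Backward Euler for T_t = a T_xx + s on a rod with fixed-temperature ends.
nx, L, a, dt, steps = 51, 1.0, 1e-4, 1.0, 200
dx = L / (nx - 1)
s = np.zeros(nx)
s[nx // 2] = 50.0                    # internal heat source at mid-span (hypothetical)
T = np.full(nx, 20.0)                # initial temperature, deg C

lam = a * dt / dx**2
A = np.zeros((nx, nx))               # tridiagonal implicit operator
for i in range(1, nx - 1):
    A[i, i - 1] = A[i, i + 1] = -lam
    A[i, i] = 1.0 + 2.0 * lam
A[0, 0] = A[-1, -1] = 1.0            # Dirichlet boundary rows

for _ in range(steps):
    rhs = T + dt * s
    rhs[0] = rhs[-1] = 20.0          # clamp boundary temperatures
    T = np.linalg.solve(A, rhs)      # one implicit time step
```

The implicit scheme stays stable for any time step, which is why it suits the long welding-cycle simulations; the temperature peaks at the heated node, mirroring the peak under the pin-shoulder zone in the paper.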

  14. High resolution gamma spectroscopy well logging system

    International Nuclear Information System (INIS)

    Giles, J.R.; Dooley, K.J.

    1997-01-01

    A Gamma Spectroscopy Logging System (GSLS) has been developed to study sub-surface radionuclide contamination. The absolute counting efficiencies of the GSLS detectors were determined using cylindrical reference sources. More complex borehole geometries were modeled using commercially available shielding software and correction factors were developed based on relative gamma-ray fluence rates. Examination of varying porosity and moisture content showed that as porosity increases, and as the formation saturation ratio decreases, relative gamma-ray fluence rates increase linearly for all energies. Correction factors for iron and water cylindrical shields were found to agree well with correction factors determined during previous studies allowing for the development of correction factors for type-304 stainless steel and low-carbon steel casings. Regression analyses of correction factor data produced equations for determining correction factors applicable to spectral gamma-ray well logs acquired under non-standard borehole conditions

  15. Methodology to characterize an unsampled oil interval, integrating PVT (Pressure/Volumen/Temperature) analysis and production log; Metodologia para caracterizacao de oleo de intervalo nao-amostrado, integrando analise PVT e perfil de producao

    Energy Technology Data Exchange (ETDEWEB)

    Marcon, Diogo Reato; Souza, Ana Paula Martins de; Vieira, Alexandre J.M. [PETROBRAS, Rio de Janeiro, RJ (Brazil)

    2008-07-01

    This work presents a new methodology for characterizing an unsampled oil interval, using basically production log data and the PVT analyses available in the well. The methodology was applied to a real case in which live oil samples were collected during a well test at three different depths, revealing evidence of compositional grading due to gravity. Each individual sample was a mixture of the fluid produced from the reservoir bottom up to the sampling point, since the whole interval was perforated and isolation had to be made with a packer. The first sample corresponded to the mixture of the lower oil and all the upper oils; the other two samples contained only the heavier oil, and that oil together with part of the fluid from the upper interval. In order to identify the fluid properties of the upper interval, needed for production development studies, the following procedure was devised: equation-of-state tuning, reproducing the sampled fluid properties; conversion of volumetric flowrates from the production log into mass and molar flowrates; calculation of the flowrate ratio between the upper and lower intervals; estimation of the upper-interval fluid composition; and simulation of the upper-interval fluid properties using the previously tuned equation of state, thus generating what was considered a representative, synthetic PVT analysis. (author)
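
The composition-estimation step rests on a molar mixing balance: the commingled sample is a blend of the lower and upper interval fluids, so the unsampled upper composition follows from a simple inverse mixing rule once the production log fixes the flow split. All compositions, components and the flow fraction below are hypothetical, not values from the paper.

```python
import numpy as np

z_mix = np.array([0.60, 0.25, 0.15])   # commingled sample composition (C1, C3, C8)
z_low = np.array([0.45, 0.27, 0.28])   # lower-interval sample composition
f_low = 0.40                           # molar fraction of total flow coming from the
                                       # lower interval, derived from the production log

# z_mix = f_low * z_low + (1 - f_low) * z_up, solved for the unsampled upper fluid
z_up = (z_mix - f_low * z_low) / (1.0 - f_low)
z_up = z_up / z_up.sum()               # renormalize against rounding noise

print(z_up)                            # estimated composition to feed the tuned EoS
```

A negative entry in `z_up` would flag an inconsistent flow split, which is one practical check on the log-derived ratio before running the EoS simulation.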

  16. Comparison between observational and theoretical (log T_eff, M_bol) diagrams

    International Nuclear Information System (INIS)

    Cayrel de Strobel, G.; Perrin, M.N.

    1978-01-01

    Perrin et al. (1977) have constructed an empirical HR diagram for 138 nearby F, G and K stars, for which they had: i) an effective temperature and a metal content derived from a detailed analysis; ii) a reliable bolometric magnitude obtained from an absolute magnitude M_V, based on a large parallax and a rather small bolometric correction. In the present work it is asserted that these results, based on the theoretical grid of evolutionary models of Hejlesen (1975), remain valid if the observational (log T_eff, M_bol) diagram is studied with Demarque's (1977) evolutionary models. (Auth.)

  17. Logging Work Injuries in Appalachia

    Science.gov (United States)

    Charles H. Wolf; Gilbert P. Dempsey

    1978-01-01

    Logging accidents are costly. They may bring pain to injured workers, hardship to their families, and higher insurance premiums and lower productivity to their employers. Our analysis of 1,172 injuries in central Appalachia reveals that nearly half of all time lost-and almost all fatalities-resulted from accidents during felling and unloading. The largest proportion of...

  18. Log files for testing usability

    NARCIS (Netherlands)

    Klein Teeselink, G.; Siepe, A.H.M.; Pijper, de J.R.

    1999-01-01

    The aim of this study is to gain insight in the usefulness of log file analysis as a method to evaluate the usability of individual interface components and their influence on the usability of the overall user interface. We selected a music player as application, with four different interfaces and

  19. Precipitates/Salts Model Calculations for Various Drift Temperature Environments

    Energy Technology Data Exchange (ETDEWEB)

    P. Marnier

    2001-12-20

    The objective and scope of this calculation is to assist Performance Assessment Operations and the Engineered Barrier System (EBS) Department in modeling the geochemical effects of evaporation within a repository drift. This work is developed and documented using procedure AP-3.12Q, Calculations, in support of "Technical Work Plan For Engineered Barrier System Department Modeling and Testing FY 02 Work Activities" (BSC 2001a). The primary objective of this calculation is to predict the effects of evaporation on the abstracted water compositions established in "EBS Incoming Water and Gas Composition Abstraction Calculations for Different Drift Temperature Environments" (BSC 2001c). A secondary objective is to predict evaporation effects on observed Yucca Mountain waters for subsequent cement interaction calculations (BSC 2001d). The Precipitates/Salts model is documented in an Analysis/Model Report (AMR), "In-Drift Precipitates/Salts Analysis" (BSC 2001b).

  20. Data-Model Comparison of Pliocene Sea Surface Temperature

    Science.gov (United States)

    Dowsett, H. J.; Foley, K.; Robinson, M. M.; Bloemers, J. T.

    2013-12-01

    The mid-Piacenzian (late Pliocene) climate represents the most geologically recent interval of long-term average warmth and shares similarities with the climate projected for the end of the 21st century. As such, its fossil and sedimentary record represents a natural experiment from which we can gain insight into potential climate change impacts, enabling more informed policy decisions for mitigation and adaptation. We present the first systematic comparison of Pliocene sea surface temperatures (SST) between an ensemble of eight climate model simulations produced as part of PlioMIP (Pliocene Model Intercomparison Project) and the PRISM (Pliocene Research, Interpretation and Synoptic Mapping) Project mean annual SST field. Our results highlight key regional (mid- to high latitude North Atlantic and tropics) and dynamic (upwelling) situations where there is discord between reconstructed SST and the PlioMIP simulations. These differences can lead to improved strategies for both experimental design and temporal refinement of the palaeoenvironmental reconstruction.
    Figure caption: Scatter plot of multi-model-mean anomalies (squares) and PRISM3 data anomalies (large blue circles) by latitude. Vertical bars on data anomalies represent the variability of the warm climate phase within the time-slab at each locality. Small colored circles represent individual model anomalies and show the spread of model estimates about the multi-model-mean. While not directly comparable in terms of the development of the means or the meaning of variability, this plot provides a first-order comparison of the anomalies. Encircled areas are: a, PRISM low-latitude sites outside of upwelling areas; b, North Atlantic coastal sequences and Mediterranean sites; c, large-anomaly PRISM sites from the northern hemisphere. Numbers identify Ocean Drilling Program sites.

  1. Hardwood log supply: a broader perspective

    Science.gov (United States)

    Iris Montague; Adri Andersch; Jan Wiedenbeck; Urs. Buehlmann

    2015-01-01

    At regional and state meetings we talk with others in our business about the problems we face: log exports, log quality, log markets, logger shortages, cash flow problems, the weather. These are familiar talking points and real and persistent problems. But what is the relative importance of these problems for log procurement in different regions of...

  2. Unsupervised signature extraction from forensic logs

    NARCIS (Netherlands)

    Thaler, S.M.; Menkovski, V.; Petkovic, M.; Altun, Y.; Das, K.; Mielikäinen, T.; Malerba, D.; Stefanowski, J.; Read, J.; Žitnik, M.; Ceci, M.

    2017-01-01

    Signature extraction is a key part of forensic log analysis. It involves recognizing patterns in log lines such that log lines that originated from the same line of code are grouped together. A log signature consists of immutable parts and mutable parts. The immutable parts define the signature, and

  3. A method of estimating log weights.

    Science.gov (United States)

    Charles N. Mann; Hilton H. Lysons

    1972-01-01

    This paper presents a practical method of estimating the weights of logs before they are yarded. Knowledge of log weights is required to achieve optimum loading of modern yarding equipment. Truckloads of logs are weighed and measured to obtain a local density index (pounds per cubic foot) for a species of logs. The density index is then used to estimate the weights of...
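
The density-index arithmetic the method describes is simple enough to sketch. Smalian's end-area formula is a standard choice for log volume; it is used here as an assumption, and every number below is hypothetical.

```python
import math

def smalian_volume_ft3(d_small_in, d_large_in, length_ft):
    """Smalian's formula: average of the two end cross-sections times length."""
    area = lambda d_in: math.pi * (d_in / 12.0) ** 2 / 4.0   # ft^2 from inches
    return (area(d_small_in) + area(d_large_in)) / 2.0 * length_ft

# Local density index from weighed, measured truckloads of one species
load_weight_lb = 42_000.0
load_volume_ft3 = 840.0
density_index = load_weight_lb / load_volume_ft3     # lb per cubic foot

# Estimated weight of a single log before yarding
log_vol = smalian_volume_ft3(12.0, 16.0, 32.0)       # 12 in / 16 in ends, 32 ft long
est_weight = density_index * log_vol
```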

  4. Nondestructive evaluation for sorting red maple logs

    Science.gov (United States)

    Xiping Wang; Robert J. Ross; David W. Green; Karl Englund; Michael Wolcott

    2000-01-01

    Existing log grading procedures in the United States make only visual assessments of log quality. These procedures do not incorporate estimates of the modulus of elasticity (MOE) of logs. It is questionable whether the visual grading procedures currently used for logs adequately assess the potential quality of structural products manufactured from them, especially...

  5. Nuclear well logging in hydrology

    International Nuclear Information System (INIS)

    1971-01-01

    The optimum development of regional and local groundwater resources requires a quantitative evaluation of its aquifers and aquicludes, and of the physical and chemical properties relevant to the recharge to and withdrawal of water from them. If an understanding of the groundwater regime is to be obtained, geological observations at outcrop must be augmented by subsurface measurements of the strata and the waters they contain. Measurements of many hydrological and geological parameters can be made in situ by nuclear geophysical well-logging methods. Very simply, well logging consists of lowering a measuring probe into a well and making a continuous record of the variations of a particular parameter with depth. In most circumstances, repetition of the measurements under differing hydrodynamic conditions results in a better definition of the flow regime in the aquifer. Nuclear well-logging techniques have for some years been capable of solving a number of the sub-surface measurement problems faced by hydrogeologists. However, the present usage of these methods varies from country to country and the literature concerning applications is scattered in the professional journals of several disciplines. The objective of this report is to include in a single reference volume descriptions of the physical principles of nuclear logging methods, their applications to hydrogeological problems and their limitations on a level suitable for the practising hydrologists with a limited knowledge of nuclear physics. The Working Group responsible for compiling the report recommended that it should cover a broad spectrum of hydrogeological investigations and problems. For example, it saw no valid reason to distinguish for the purposes of the report between well-logging applications for water-supply purposes and for water-flooding studies in the petroleum industry. 
Neutron measurements made for soil-moisture determinations in the unsaturated zone have been specifically omitted, however, as

  6. On conditional independence and log-convexity

    Czech Academy of Sciences Publication Activity Database

    Matúš, František

    2012-01-01

    Roč. 48, č. 4 (2012), s. 1137-1147 ISSN 0246-0203 R&D Projects: GA AV ČR IAA100750603; GA ČR GA201/08/0539 Institutional support: RVO:67985556 Keywords : Conditional independence * Markov properties * factorizable distributions * graphical Markov models * log-convexity * Gibbs-Markov equivalence * Markov fields * Gaussian distributions * positive definite matrices * covariance selection model Subject RIV: BA - General Mathematics Impact factor: 0.933, year: 2012 http://library.utia.cas.cz/separaty/2013/MTR/matus-0386229.pdf

  7. Deterministic Modeling of the High Temperature Test Reactor

    International Nuclear Information System (INIS)

    Ortensi, J.; Cogliati, J.J.; Pope, M.A.; Ferrer, R.M.; Ougouag, A.M.

    2010-01-01

    Idaho National Laboratory (INL) is tasked with the development of reactor physics analysis capability of the Next Generation Nuclear Power (NGNP) project. In order to examine INL's current prismatic reactor deterministic analysis tools, the project is conducting a benchmark exercise based on modeling the High Temperature Test Reactor (HTTR). This exercise entails the development of a model for the initial criticality, a 19 column thin annular core, and the fully loaded core critical condition with 30 columns. Special emphasis is devoted to the annular core modeling, which shares more characteristics with the NGNP base design. The DRAGON code is used in this study because it offers significant ease and versatility in modeling prismatic designs. Despite some geometric limitations, the code performs quite well compared to other lattice physics codes. DRAGON can generate transport solutions via collision probability (CP), method of characteristics (MOC), and discrete ordinates (Sn). A fine group cross section library based on the SHEM 281 energy structure is used in the DRAGON calculations. HEXPEDITE is the hexagonal z full core solver used in this study and is based on the Green's Function solution of the transverse integrated equations. In addition, two Monte Carlo (MC) based codes, MCNP5 and PSG2/SERPENT, provide benchmarking capability for the DRAGON and the nodal diffusion solver codes. The results from this study show a consistent bias of 2-3% for the core multiplication factor. This systematic error has also been observed in other HTTR benchmark efforts and is well documented in the literature. The ENDF/B VII graphite and U235 cross sections appear to be the main source of the error. The isothermal temperature coefficients calculated with the fully loaded core configuration agree well with other benchmark participants but are 40% higher than the experimental values. 
This discrepancy with the measurement stems from the fact that during the experiments the control

  8. A model for temperature dependent resistivity of metallic superlattices

    Directory of Open Access Journals (Sweden)

    J. I. Uba

    2015-11-01

    Full Text Available The temperature dependent resistivity of metallic superlattices is, to first order approximation, assumed to have the same form as for a bulk metal, ρ(T) = ρ_0 + aT, which permits describing these structures as a linear atomic chain. The assumption is substantiated with the derivation of the above expression from the standard magnetoresistance equation, in which the second term, a Bragg scattering factor, is a correction to the usual model involving magnon and phonon scattering. Fitting the model to Fe/Cr data from the literature shows that Bragg scattering is dominant at T < 50 K and that the magnon and phonon coefficients are independent of experimental conditions, with typical values of 4.7 × 10^-4 μΩ cm K^-2 and -8 ± 0.7 × 10^-7 μΩ cm K^-3. From the linear atomic chain model, the dielectric constant ε(q, ω) = 8.33 × 10^-2 at the Debye frequency for all materials, and the acoustic speed and Thomas-Fermi screening length are pressure dependent, with typical values of 1.53 × 10^4 m/s and 1.80 × 10^-9 m at 0.5 GPa pressure for an Fe/Cr structure.
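
The magnon (T²) and phonon (T³) terms whose coefficients the fit reports enter the resistivity linearly, so such a model can be calibrated by ordinary least squares. A sketch on synthetic data generated with the quoted coefficients; the baseline ρ₀, temperature range and noise level are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
T = np.linspace(60.0, 300.0, 40)                    # K, above the Bragg-dominated range
# rho = rho0 + b*T^2 + c*T^3, in micro-ohm cm, with the paper's coefficient magnitudes
rho_true = 5.0 + 4.7e-4 * T**2 + (-8.0e-7) * T**3
rho_obs = rho_true + rng.normal(0.0, 0.05, T.size)  # measurement noise (assumed)

# Linear least squares in the basis {1, T^2, T^3}
X = np.column_stack([np.ones_like(T), T**2, T**3])
coef, *_ = np.linalg.lstsq(X, rho_obs, rcond=None)
rho0, b, c = coef                                   # recovered coefficients
```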

  9. Elevated Temperature Testing and Modeling of Advanced Toughened Ceramic Materials

    Science.gov (United States)

    Keith, Theo G.

    2005-01-01

    The purpose of this report is to provide a final report for the period of 12/1/03 through 11/30/04 for NASA Cooperative Agreement NCC3-776, entitled "Elevated Temperature Testing and Modeling of Advanced Toughened Ceramic Materials." During this final period, major efforts were focused on both the determination of mechanical properties of advanced ceramic materials and the development of mechanical test methodologies under several different programs of NASA-Glenn. The important research activities during this period were: 1) mechanical properties evaluation of two gas-turbine grade silicon nitrides; 2) mechanical testing for fuel-cell seal materials; 3) mechanical properties evaluation of thermal barrier coatings and CFCCs; and 4) foreign object damage (FOD) testing.

  10. Nuclear cross section library for oil well logging analysis

    International Nuclear Information System (INIS)

    Kodeli, I.; Kitsos, S.; Aldama, D.L.; Zefran, B.

    2003-01-01

    As part of the IRTMBA (Improved Radiation Transport Modelling for Borehole Applications) Project of the EU Community's 5th Programme, a special-purpose multigroup cross section library was prepared for use in deterministic (as well as Monte Carlo) oil well logging particle transport calculations. This library is expected to improve the prediction of the neutron and gamma spectra at the detector positions of the logging tool, and its use for the interpretation of neutron logging measurements was studied. Preparation and testing of this library are described. (author)

  11. New temperature model of the Netherlands from new data and novel modelling methodology

    Science.gov (United States)

    Bonté, Damien; Struijk, Maartje; Békési, Eszter; Cloetingh, Sierd; van Wees, Jan-Diederik

    2017-04-01

    Deep geothermal energy has grown in interest in Western Europe in recent decades, for direct use but also, as knowledge of the subsurface improves, for electricity generation. In the Netherlands, where the sector took off with the first system in 2005, geothermal energy is seen as a key player for a sustainable future. Knowledge of the subsurface temperature, together with the available flow from the reservoir, is an important factor in determining the success of a geothermal energy project. To support the development of deep geothermal energy systems in the Netherlands, we made a first assessment of the subsurface temperature based on thermal data but also on geological elements (Bonté et al, 2012). An outcome of this work was ThermoGIS, which uses the temperature model. This work is a revision of the model used in ThermoGIS. The improvements on the first model are multiple: we have improved not only the dataset used for the calibration and the structural model, but also the methodology, through improved software (called b3t). The temperature dataset has been updated by integrating temperatures from newly accessible wells. The sedimentary description of the basin has been improved by using an updated and refined structural model and an improved lithological definition. A major improvement comes from the methodology used to perform the modelling: with b3t the calibration is made not only using the lithospheric parameters but also using the thermal conductivity of the sediments. The result is a much more accurate definition of the model parameters and better handling of the calibration process. The result obtained is a precise and improved temperature model of the Netherlands. The thermal conductivity variation in the sediments, associated with the geometry of the layers, is an important driver of temperature variations, and the influence of the Zechstein salt in the north of the country is important. In addition, the radiogenic heat

  12. Sensitivity of a soil-plant-atmosphere model to changes in air temperature, dew point temperature, and solar radiation

    Energy Technology Data Exchange (ETDEWEB)

    Luxmoore, R.J. (Oak Ridge National Lab.,TN); Stolzy, J.L.; Holdeman, J.T.

    1981-01-01

    Air temperature, dew point temperature and solar radiation were independently varied in an hourly soil-plant-atmosphere model in a sensitivity analysis of these parameters. Results suggested that evapotranspiration in eastern Tennessee is limited more by meteorological conditions that determine the vapor-pressure gradient than by the necessary energy to vaporize water within foliage. Transpiration and soil water drainage were very sensitive to changes in air and dew point temperature and to solar radiation under low atmospheric vapor-pressure deficit conditions associated with reduced air temperature. Leaf water potential and stomatal conductance were reduced under conditions having high evapotranspiration. Representative air and dew point temperature input data for a particular application are necessary for satisfactory results, whereas irradiation may be less well characterized for applications with high atmospheric vapor-pressure deficit. The effects of a general rise in atmospheric temperature on forest water budgets are discussed.

  13. Logística empresarial

    OpenAIRE

    Feres Sahid

    1987-01-01

    ABSTRACT: The logistics concept is reflected accurately, from an etymological and historical point of view, through the E.A.N. journal; it has a certain military character that makes it distinctive in business management, and from this a definitive debate on the concept is formulated.

  14. Chemical logging of geothermal wells

    Science.gov (United States)

    Allen, C.A.; McAtee, R.E.

    The presence of geothermal aquifers can be detected while drilling in geothermal formations by maintaining a chemical log of the ratio of the concentrations of calcium to carbonate and bicarbonate ions in the return drilling fluid. A continuous increase in the ratio of the concentrations of calcium to carbonate and bicarbonate ions is indicative of the existence of a warm or hot geothermal aquifer at some increased depth.

  15. Audit Log for Forensic Photography

    Science.gov (United States)

    Neville, Timothy; Sorell, Matthew

    We propose an architecture for an audit log system for forensic photography, which ensures that the chain of evidence of a photograph taken by a photographer at a crime scene is maintained from the point of image capture to its end application at trial. The requirements for such a system are specified and the results of experiments are presented which demonstrate the feasibility of the proposed approach.

  16. Logística empresarial

    Directory of Open Access Journals (Sweden)

    Feres Sahid

    1987-04-01

    Full Text Available ABSTRACT: The logistics concept is reflected accurately, from an etymological and historical point of view, through the E.A.N. journal; it has a certain military character that makes it distinctive in business management, and from this a definitive debate on the concept is formulated.

  17. Neutron capture in borehole logging

    International Nuclear Information System (INIS)

    Randall, R.R.

    1981-01-01

    The use is described of a pulsed source of fast neutrons and a radiation detector to measure the thermal neutron population decay rate in a well logging instrument. The macroscopic neutron absorption cross-section is calculated by taking the natural logarithm of the ratio of the detected radiation counts occurring within two measurement intervals of fixed duration and starting at a fixed time after a neutron burst. (U.K.)
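
    The decay-rate computation described here can be sketched in a few lines. With a single-exponential thermal-neutron decay N(t) ~ exp(-v·Σ·t), the ratio of counts in two equal-duration gates gives the decay constant directly; the gate times, counts, and thermal-neutron velocity below are illustrative values, not from the patent.

    ```python
    import math

    def capture_cross_section(counts_gate1, counts_gate2,
                              gate_start_1_us, gate_start_2_us,
                              v_cm_per_us=0.22):
        """Estimate the macroscopic thermal-neutron absorption cross-section
        (1/cm) from counts in two equal-duration gates after a neutron burst.

        Assumes single-exponential decay N(t) ~ exp(-v*Sigma*t); v defaults to
        the 2200 m/s thermal velocity expressed in cm/us.
        """
        decay_per_us = (math.log(counts_gate1 / counts_gate2)
                        / (gate_start_2_us - gate_start_1_us))
        return decay_per_us / v_cm_per_us  # Sigma in 1/cm

    # Synthetic example: true Sigma = 0.02 /cm, gates starting at 400 and 600 us
    sigma_true = 0.02
    n1 = 1000.0 * math.exp(-0.22 * sigma_true * 400.0)
    n2 = 1000.0 * math.exp(-0.22 * sigma_true * 600.0)
    print(round(capture_cross_section(n1, n2, 400.0, 600.0), 4))  # → 0.02
    ```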

  18. Modelling property changes in graphite irradiated at changing irradiation temperature

    CSIR Research Space (South Africa)

    Kok, S

    2011-01-01

    Full Text Available A new method is proposed to predict the irradiation induced property changes in nuclear graphite, including the effect of a change in irradiation temperature. The currently used method to account for changes in irradiation temperature, the scaled...

  19. Borehole logging in uranium exploration

    International Nuclear Information System (INIS)

    Kulkarni, N.H.

    1992-01-01

    The ultimate objective of exploration by drilling, as far as the Atomic Minerals Division is concerned, is to locate the ore zone in the subsurface, draw samples and analyze them for their metal content. The presence of the ore zone is also indicated by gamma-ray logging of the borehole. A gamma-ray detector is lowered into the borehole and the precise depth and grade of the ore zone are established. This helps the geologist in correlating the ore horizon with the surface outcrop or the ore zone intercepted in adjoining boreholes and in deciding about further drilling and location of boreholes. Most commonly, total gamma measurements are made, although some units capable of measuring the gamma-ray spectrum are also in use. It is possible to know if the mineralization is due to uranium without waiting for the laboratory results. The present write-up gives a brief account of the principles, equipment and methods of borehole gamma-ray logging, including density and self-potential logging. (author). 8 refs., 5 figs

  20. Fluid temperatures: Modeling the thermal regime of a river network

    Science.gov (United States)

    Rhonda Mazza; Ashley Steel

    2017-01-01

    Water temperature drives the complex food web of a river network. Aquatic organisms hatch, feed, and reproduce in thermal niches within the tributaries and mainstem that comprise the river network. Changes in water temperature can synchronize or asynchronize the timing of their life stages throughout the year. The water temperature fluctuates over time and place,...

  1. Developing an Effective Model for Predicting Spatially and Temporally Continuous Stream Temperatures from Remotely Sensed Land Surface Temperatures

    Directory of Open Access Journals (Sweden)

    Kristina M. McNyset

    2015-12-01

    Full Text Available Although water temperature is important to stream biota, it is difficult to collect in a spatially and temporally continuous fashion. We used remotely-sensed Land Surface Temperature (LST data to estimate mean daily stream temperature for every confluence-to-confluence reach in the John Day River, OR, USA for a ten year period. Models were built at three spatial scales: site-specific, subwatershed, and basin-wide. Model quality was assessed using jackknife and cross-validation. Model metrics for linear regressions of the predicted vs. observed data across all sites and years: site-specific r2 = 0.95, Root Mean Squared Error (RMSE = 1.25 °C; subwatershed r2 = 0.88, RMSE = 2.02 °C; and basin-wide r2 = 0.87, RMSE = 2.12 °C. Similar analyses were conducted using 2012 eight-day composite LST and eight-day mean stream temperature in five watersheds in the interior Columbia River basin. Mean model metrics across all basins: r2 = 0.91, RMSE = 1.29 °C. Sensitivity analyses indicated accurate basin-wide models can be parameterized using data from as few as four temperature logger sites. This approach generates robust estimates of stream temperature through time for broad spatial regions for which there is only spatially and temporally patchy observational data, and may be useful for managers and researchers interested in stream biota.
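
    The model-quality metrics reported above (r² and RMSE of a linear regression of predicted vs. observed temperatures) can be reproduced with a minimal sketch; the synthetic data and the simple linear LST-to-stream-temperature form below are assumptions for illustration, not the study's actual models.

    ```python
    import numpy as np

    def fit_lst_model(lst, stream_temp):
        """Ordinary least squares: stream_temp ≈ a * LST + b."""
        a, b = np.polyfit(lst, stream_temp, 1)
        return a, b

    def model_metrics(observed, predicted):
        """r^2 and RMSE of predicted vs. observed, as reported in the record."""
        ss_res = np.sum((observed - predicted) ** 2)
        ss_tot = np.sum((observed - np.mean(observed)) ** 2)
        r2 = 1.0 - ss_res / ss_tot
        rmse = np.sqrt(np.mean((observed - predicted) ** 2))
        return r2, rmse

    # Synthetic demo: stream temperature loosely tracking remotely sensed LST
    rng = np.random.default_rng(0)
    lst = rng.uniform(5.0, 35.0, 200)                    # LST, °C
    stream = 0.6 * lst + 3.0 + rng.normal(0, 1.0, 200)   # stream temperature, °C
    a, b = fit_lst_model(lst, stream)
    r2, rmse = model_metrics(stream, a * lst + b)
    print(f"r2={r2:.2f} RMSE={rmse:.2f} °C")
    ```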

  2. Numerical simulation of logging-while-drilling density image by Monte-Carlo method

    International Nuclear Information System (INIS)

    Yue Aizhong; He Biao; Zhang Jianmin; Wang Lijuan

    2010-01-01

    A logging-while-drilling system is studied by the Monte Carlo method. A model of the logging-while-drilling system is built, the tool response and azimuth density image are acquired, and methods for processing the azimuth density data are discussed. These outcomes lay a foundation for optimizing the tool, developing new tools and interpreting logs. (authors)

  3. Modelling fruit-temperature dynamics within apple tree crowns using virtual plants.

    Science.gov (United States)

    Saudreau, M; Marquier, A; Adam, B; Sinoquet, H

    2011-10-01

    Fruit temperature results from a complex system involving the climate, the tree architecture, the fruit location within the tree crown and the fruit thermal properties. Despite much theoretical and experimental evidence for large differences (up to 10 °C in sunny conditions) between fruit temperature and air temperature, fruit temperature is never used in horticultural studies. A way of modelling fruit-temperature dynamics from climate data is addressed in this work. The model is based upon three-dimensional virtual representation of apple trees and links three-dimensional virtual trees with a physical-based fruit-temperature dynamical model. The overall model was assessed by comparing model outputs to field measures of fruit-temperature dynamics. The model was able to simulate both the temperature dynamics at fruit scale, i.e. fruit-temperature gradients and departure from air temperature, and at the tree scale, i.e. the within-tree-crown variability in fruit temperature (average root mean square error value over fruits was 1.43 °C). This study shows that linking virtual plants with the modelling of the physical plant environment offers a relevant framework to address the modelling of fruit-temperature dynamics within a tree canopy. The proposed model offers opportunities for modelling effects of the within-crown architecture on fruit thermal responses in horticultural studies.

  4. Model fit versus biological relevance: Evaluating photosynthesis-temperature models for three tropical seagrass species.

    Science.gov (United States)

    Adams, Matthew P; Collier, Catherine J; Uthicke, Sven; Ow, Yan X; Langlois, Lucas; O'Brien, Katherine R

    2017-01-04

    When several models can describe a biological process, the equation that best fits the data is typically considered the best. However, models are most useful when they also possess biologically-meaningful parameters. In particular, model parameters should be stable, physically interpretable, and transferable to other contexts, e.g. for direct indication of system state, or usage in other model types. As an example of implementing these recommended requirements for model parameters, we evaluated twelve published empirical models for temperature-dependent tropical seagrass photosynthesis, based on two criteria: (1) goodness of fit, and (2) how easily biologically-meaningful parameters can be obtained. All models were formulated in terms of parameters characterising the thermal optimum (Topt) for maximum photosynthetic rate (Pmax). These parameters indicate the upper thermal limits of seagrass photosynthetic capacity, and hence can be used to assess the vulnerability of seagrass to temperature change. Our study exemplifies an approach to model selection which optimises the usefulness of empirical models for both modellers and ecologists alike.
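
    As an illustration of extracting Topt and Pmax as model parameters, here is a minimal sketch using a quadratic P-T form, one of the simplest empirical shapes (not necessarily among the twelve the study evaluated); the data are synthetic.

    ```python
    import numpy as np

    def fit_quadratic_pt(T, P):
        """Fit P(T) = a*T^2 + b*T + c and recover Topt = -b/(2a) and
        Pmax = P(Topt). The point of the record is that Topt and Pmax should
        be recoverable as stable, interpretable parameters of whatever fits.
        """
        a, b, c = np.polyfit(T, P, 2)
        t_opt = -b / (2.0 * a)
        p_max = a * t_opt**2 + b * t_opt + c
        return t_opt, p_max

    # Synthetic photosynthesis data with a true optimum at 30 °C, peak rate 8
    T = np.linspace(20.0, 40.0, 21)
    P = 8.0 - 0.02 * (T - 30.0) ** 2
    t_opt, p_max = fit_quadratic_pt(T, P)
    print(round(t_opt, 1), round(p_max, 1))  # → 30.0 8.0
    ```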

  5. Model fit versus biological relevance: Evaluating photosynthesis-temperature models for three tropical seagrass species

    Science.gov (United States)

    Adams, Matthew P.; Collier, Catherine J.; Uthicke, Sven; Ow, Yan X.; Langlois, Lucas; O'Brien, Katherine R.

    2017-01-01

    When several models can describe a biological process, the equation that best fits the data is typically considered the best. However, models are most useful when they also possess biologically-meaningful parameters. In particular, model parameters should be stable, physically interpretable, and transferable to other contexts, e.g. for direct indication of system state, or usage in other model types. As an example of implementing these recommended requirements for model parameters, we evaluated twelve published empirical models for temperature-dependent tropical seagrass photosynthesis, based on two criteria: (1) goodness of fit, and (2) how easily biologically-meaningful parameters can be obtained. All models were formulated in terms of parameters characterising the thermal optimum (Topt) for maximum photosynthetic rate (Pmax). These parameters indicate the upper thermal limits of seagrass photosynthetic capacity, and hence can be used to assess the vulnerability of seagrass to temperature change. Our study exemplifies an approach to model selection which optimises the usefulness of empirical models for both modellers and ecologists alike.

  6. Numerical simulation of responses for cased-hole density logging

    International Nuclear Information System (INIS)

    Wu, Wensheng; Fu, Yaping; Niu, Wei

    2013-01-01

    Stabilizing or stimulating oil production in old oil fields requires density logging in cased holes where open-hole logging data are either missing or of bad quality. However, measured values from cased-hole density logging are more severely influenced by factors such as fluid, casing, cement sheath and the outer diameter of the open-hole well compared with those from open-hole logging. To correctly apply the cased-hole formation density logging data, one must eliminate these influences on the measured values and study the characteristics of how the cased-hole density logging instrument responds to these factors. In this paper, a Monte Carlo numerical simulation technique was used to calculate the responses of the far detector of a cased-hole density logging instrument to in-hole fluid, casing wall thickness, cement sheath density and the formation and thus to obtain influence rules and response coefficients. The obtained response of the detector is a function of in-hole liquid, casing wall thickness, the casing's outer diameter, cement sheath density, open-hole well diameter and formation density. The ratio of the counting rate of the detector in the calibration well to that in the measurement well was used to get a fairly simple detector response equation and the coefficients in the equation are easy to acquire. These provide a new way of calculating cased-hole density through forward modelling methods. (paper)

  7. MCEM algorithm for the log-Gaussian Cox process

    OpenAIRE

    Delmas, Celine; Dubois-Peyrard, Nathalie; Sabbadin, Regis

    2014-01-01

    Log-Gaussian Cox processes are an important class of models for aggregated point patterns. They have been largely used in spatial epidemiology (Diggle et al., 2005), in agronomy (Bourgeois et al., 2012), in forestry (Moller et al.), in ecology (sightings of wild animals) or in environmental sciences (radioactivity counts). A log-Gaussian Cox process is a Poisson process with a stochastic intensity depending on a Gaussian random field. We consider the case where this Gaussian random field is ...
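
    A minimal simulation sketch of a log-Gaussian Cox process, here discretised on a 1-D grid with an exponential covariance (the paper's setting is more general): the Poisson intensity of each cell is the exponential of a Gaussian random field.

    ```python
    import numpy as np

    def simulate_lgcp_counts(n_cells=50, mu=1.0, sigma=0.5, corr_len=5.0, seed=0):
        """Simulate cell counts from a 1-D log-Gaussian Cox process.

        Z is a zero-mean Gaussian field with exponential covariance; the
        Poisson intensity in each cell is exp(mu + Z). Sketch only; real
        applications use 2-D fields and finer discretisations.
        """
        rng = np.random.default_rng(seed)
        x = np.arange(n_cells, dtype=float)
        cov = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
        # Sample the field via a Cholesky factor (jitter for numerical safety)
        z = np.linalg.cholesky(cov + 1e-10 * np.eye(n_cells)) @ rng.standard_normal(n_cells)
        intensity = np.exp(mu + z)
        return rng.poisson(intensity), intensity

    counts, intensity = simulate_lgcp_counts()
    print(counts.shape, bool((counts >= 0).all()))  # → (50,) True
    ```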

  8. Temperature modulated differential scanning calorimetry. Modelling and applications

    International Nuclear Information System (INIS)

    Jiang, Z.

    2000-01-01

    DSC. Some shortcomings of TMDSC have been noticed in both modelling and application work. Firstly, any experiments for the purpose of either understanding or quantitative measurement of TMDSC output quantities should be performed under carefully selected conditions which satisfy the linear response assumption. Secondly, some signals, in particular those associated with kinetic processes, may not be fully sampled by TMDSC due to the limited observing window of a modulation. Thirdly, the TMDSC evaluation procedure introduces mathematical artefacts into the output signals. As a consequence, it is preferable to include as many temperature modulations as possible within any transition being studied in order to obtain good quality experimental signals by eliminating or minimising these artefacts. (author)

  9. Kinetic Modeling of Corn Fermentation with S. cerevisiae Using a Variable Temperature Strategy

    Directory of Open Access Journals (Sweden)

    Augusto C. M. Souza

    2018-04-01

    Full Text Available While fermentation is usually done at a fixed temperature, in this study, the effect of having a controlled variable temperature was analyzed. A nonlinear system was used to model batch ethanol fermentation, using corn as substrate and the yeast Saccharomyces cerevisiae, at five different fixed and controlled variable temperatures. The lower temperatures presented higher ethanol yields but took a longer time to reach equilibrium. Higher temperatures had higher initial growth rates, but the decay of yeast cells was faster compared to the lower temperatures. However, in a controlled variable temperature model, the temperature decreased with time from an initial value of 40 °C. When analyzing a time window of 60 h, the ethanol production increased 20% compared to the batch with the highest temperature; however, the yield was still 12% lower compared to the 20 °C batch. When the 24 h simulation was analyzed, the controlled model had a higher ethanol concentration compared to both fixed temperature batches.
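
    The qualitative trade-off described above (faster early growth but faster cell decay at high temperature) can be sketched with a toy kinetic model; all rate constants below are illustrative assumptions, not the paper's fitted values.

    ```python
    import math

    def simulate_batch(temp_c, hours=60.0, dt=0.1):
        """Toy batch-fermentation sketch: logistic yeast growth whose rate
        rises with temperature while cell death also accelerates, plus
        growth-coupled ethanol production. Returns the ethanol proxy at the
        end of the run. Parameters are illustrative only.
        """
        mu_max = 0.1 * math.exp(0.06 * (temp_c - 20.0))     # growth rate, 1/h
        k_death = 0.002 * math.exp(0.15 * (temp_c - 20.0))  # decay rate, 1/h
        x, ethanol, x_cap = 0.1, 0.0, 10.0                  # biomass, product, cap
        t = 0.0
        while t < hours:
            growth = mu_max * x * (1.0 - x / x_cap)
            x += dt * (growth - k_death * x)
            ethanol += dt * 2.0 * growth                    # yield-coupled
            t += dt
        return ethanol

    # Over a short 24 h window the hot batch is ahead, echoing the record
    print(simulate_batch(40.0, hours=24.0) > simulate_batch(20.0, hours=24.0))  # → True
    ```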

  10. Kinetic Modeling of Corn Fermentation with S. cerevisiae Using a Variable Temperature Strategy.

    Science.gov (United States)

    Souza, Augusto C M; Mousaviraad, Mohammad; Mapoka, Kenneth O M; Rosentrater, Kurt A

    2018-04-24

    While fermentation is usually done at a fixed temperature, in this study, the effect of having a controlled variable temperature was analyzed. A nonlinear system was used to model batch ethanol fermentation, using corn as substrate and the yeast Saccharomyces cerevisiae , at five different fixed and controlled variable temperatures. The lower temperatures presented higher ethanol yields but took a longer time to reach equilibrium. Higher temperatures had higher initial growth rates, but the decay of yeast cells was faster compared to the lower temperatures. However, in a controlled variable temperature model, the temperature decreased with time from an initial value of 40 °C. When analyzing a time window of 60 h, the ethanol production increased 20% compared to the batch with the highest temperature; however, the yield was still 12% lower compared to the 20 °C batch. When the 24 h simulation was analyzed, the controlled model had a higher ethanol concentration compared to both fixed temperature batches.

  11. Application of oil-field well log interpretation techniques to the Cerro Prieto Geothermal Field

    Energy Technology Data Exchange (ETDEWEB)

    Ershaghi, I.; Phillips, L.B.; Dougherty, E.L.; Handy, L.L.

    1979-10-01

    An example is presented of the application of oil-field techniques to the Cerro Prieto Field, Mexico. The lithology in this field (sand-shale lithology) is relatively similar to oil-field systems. The study was undertaken as a part of the first series of case studies supported by the Geothermal Log Interpretation Program (GLIP) of the US Department of Energy. The suites of logs for individual wells were far from complete. This was partly because of adverse borehole conditions but mostly because of unavailability of high-temperature tools. The most complete set of logs was a combination of Dual Induction Laterolog, Compensated Formation Density Gamma Ray, Compensated Neutron Log, and Saraband. Temperature data about the wells were sketchy, and the logs had been run under pre-cooled mud condition. A system of interpretation consisting of a combination of graphic and numerical studies was used to study the logs. From graphical studies, evidence of hydrothermal alteration may be established from the trend analysis of SP (self potential) and ILD (deep induction log). Furthermore, the cross plot techniques using data from density and neutron logs may help in establishing compaction as well as rock density profile with depth. In the numerical method, Rwa values from three different resistivity logs were computed and brought into agreement. From this approach, values of formation temperature and mud filtrate resistivity effective at the time of logging were established.

  12. Dynamic Planar Convex Hull with Optimal Query Time and O(log n · log log n ) Update Time

    DEFF Research Database (Denmark)

    Brodal, Gerth Stølting; Jakob, Riko

    2000-01-01

    The dynamic maintenance of the convex hull of a set of points in the plane is one of the most important problems in computational geometry. We present a data structure supporting point insertions in amortized O(log n · log log log n) time, point deletions in amortized O(log n · log log n) time......, and various queries about the convex hull in optimal O(log n) worst-case time. The data structure requires O(n) space. Applications of the new dynamic convex hull data structure are improved deterministic algorithms for the k-level problem and the red-blue segment intersection problem where all red and all...

  13. Well-log based prediction of thermal conductivity

    DEFF Research Database (Denmark)

    Fuchs, Sven; Förster, Andrea

    Rock thermal conductivity (TC) is paramount for the determination of heat flow and the calculation of temperature profiles. Due to the scarcity of drill cores compared to the availability of petrophysical well logs, methods are desired to indirectly predict TC in sedimentary basins. Most...

  14. Modeling seasonal surface temperature variations in secondary tropical dry forests

    Science.gov (United States)

    Cao, Sen; Sanchez-Azofeifa, Arturo

    2017-10-01

    Secondary tropical dry forests (TDFs) provide important ecosystem services such as carbon sequestration, biodiversity conservation, and nutrient cycle regulation. However, their biogeophysical processes at the canopy-atmosphere interface remain unknown, limiting our understanding of how this endangered ecosystem influences, and responds to, ongoing global warming. To facilitate future development of conservation policies, this study characterized the seasonal land surface temperature (LST) behavior of three successional stages (early, intermediate, and late) of a TDF, at the Santa Rosa National Park (SRNP), Costa Rica. A total of 38 Landsat-8 Thermal Infrared Sensor (TIRS) data and the Surface Reflectance (SR) product were utilized to model LST time series from July 2013 to July 2016 using a radiative transfer equation (RTE) algorithm. We further related the LST time series to seven vegetation indices which reflect different properties of TDFs, and soil moisture data obtained from a Wireless Sensor Network (WSN). Results showed that the LST in the dry season was 15-20 K higher than in the wet season at SRNP. We found that the early successional stages were about 6-8 K warmer than the intermediate successional stages and were 9-10 K warmer than the late successional stages in the middle of the dry season; meanwhile, a minimum LST difference (0-1 K) was observed at the end of the wet season. Leaf phenology and canopy architecture explained most LST variations in both dry and wet seasons. However, our analysis revealed that it is precipitation that ultimately determines the LST variations through both biogeochemical (leaf phenology) and biogeophysical processes (evapotranspiration) of the plants. Results of this study could help physiological modeling studies in secondary TDFs.

  15. Data Mining of Network Logs

    Science.gov (United States)

    Collazo, Carlimar

    2011-01-01

    The statement of purpose is to analyze network monitoring logs to support the computer incident response team. Specifically: gain a clear understanding of the Uniform Resource Locator (URL) and its structure, and provide a way to break down a URL based on protocol, host name, domain name, path, and other attributes. Finally, provide a method to perform data reduction by identifying the different types of advertisements shown on a webpage for incident data analysis. The procedure used for analysis and data reduction will be a computer program which analyzes the URL and identifies advertisement links among the actual content links.
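
    The URL breakdown described above can be sketched with the standard library; the naive last-two-labels guess for the registered domain is an assumption for illustration (it mishandles ccTLDs such as .co.uk).

    ```python
    from urllib.parse import urlparse, parse_qs

    def breakdown(url):
        """Split a URL into the attributes named in the record:
        protocol, host, registered domain (naive guess), path, and query."""
        p = urlparse(url)
        labels = p.hostname.split(".") if p.hostname else []
        domain = ".".join(labels[-2:]) if len(labels) >= 2 else p.hostname
        return {
            "protocol": p.scheme,
            "host": p.hostname,
            "domain": domain,          # naive: last two labels only
            "path": p.path,
            "query": parse_qs(p.query),
        }

    info = breakdown("https://ads.example.com/banner/img.gif?id=42&src=feed")
    print(info["protocol"], info["domain"], info["path"])  # → https example.com /banner/img.gif
    ```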

  16. Neutron borehole logging correction technique

    International Nuclear Information System (INIS)

    Goldman, L.H.

    1978-01-01

    In accordance with an illustrative embodiment of the present invention, a method and apparatus is disclosed for logging earth formations traversed by a borehole in which an earth formation is irradiated with neutrons and gamma radiation produced thereby in the formation and in the borehole is detected. A sleeve or shield for capturing neutrons from the borehole and producing gamma radiation characteristic of that capture is provided to give an indication of the contribution of borehole capture events to the total detected gamma radiation. It is then possible to correct from those borehole effects the total detected gamma radiation and any earth formation parameters determined therefrom

  17. Benchmark neutron porosity log calculations

    International Nuclear Information System (INIS)

    Little, R.C.; Michael, M.; Verghese, K.; Gardner, R.P.

    1989-01-01

    Calculations have been made for a benchmark neutron porosity log problem with the general purpose Monte Carlo code MCNP and the specific purpose Monte Carlo code McDNL. For accuracy and timing comparison purposes the CRAY XMP and MicroVax II computers have been used with these codes. The CRAY has been used for an analog version of the MCNP code while the MicroVax II has been used for the optimized variance reduction versions of both codes. Results indicate that the two codes give the same results within calculated standard deviations. Comparisons are given and discussed for accuracy (precision) and computation times for the two codes

  18. Dose estimative in operators during petroleum wells logging with nuclear wireless probes through computer modelling; Estimativa da dose em operadores durante procedimentos de perfilagem de pocos de petroleo com sondas wireless nucleares atraves de modelagem computacional

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Edmilson Monteiro de; Silva, Ademir Xavier da; Lopes, Ricardo T., E-mail: emonteiro@nuclear.ufrj.b, E-mail: ademir@nuclear.ufrj.b, E-mail: ricardo@lin.ufrj.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear; Lima, Inaya C.B., E-mail: inaya@lin.ufrj.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear; Instituto Politecnico do Rio de Janeiro (IPRJ/UERJ), Nova Friburgo, RJ (Brazil); Correa, Samanda Cristine Arruda, E-mail: scorrea@cnen.gov.b [Comissao Nacional de Energia Nuclear (DIAPI/CGMI/CNEN), Rio de Janeiro, RJ (Brazil); Rocha, Paula L.F., E-mail: ferrucio@acd.ufrj.b [Universidade Federal do Rio de Janeiro (UFRJ)., RJ (Brazil). Dept. de Geologia

    2011-10-26

    This paper evaluates the absorbed dose and the effective dose received by operators during petroleum well logging with nuclear wireless probes that use gamma radiation sources. To obtain the data, a typical logging scenario was simulated with the MCNPX Monte Carlo code. The simulated logging probe was the Density Gamma Probe - TRISOND produced by Robertson Geologging. The absorbed dose values were estimated using the male voxel anthropomorphic simulator MAX. The effective dose values were obtained using ICRP 103

  19. The Impact of the Variability of Precipitation and Temperatures on the Efficiency of a Conceptual Rainfall-Runoff Model

    Directory of Open Access Journals (Sweden)

    Sleziak P.

    2016-12-01

    Full Text Available The main objective of the paper is to understand how the model's efficiency and the selected climatic indicators are related. The hydrological model applied in this study is a conceptual rainfall-runoff model (the TUW model), which was developed at the Vienna University of Technology. This model was calibrated over three different periods between 1981-2010 in three groups of Austrian catchments (snow, runoff, and soil catchments), which represent a wide range of the hydroclimatic conditions of Austria. The model's calibration was performed using a differential evolution algorithm (DEoptim). As an objective function, we used a combination of the Nash-Sutcliffe coefficient (NSE) and the logarithmic Nash-Sutcliffe coefficient (logNSE). The model's efficiency was evaluated by volume error (VE). Subsequently, we evaluated the relationship between the model's efficiency (VE) and changes in the climatic indicators (precipitation ΔP, air temperature ΔT). The implications of the findings are discussed in the conclusion.
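
    The objective function described (a combination of NSE and logNSE) can be sketched as follows; the 50/50 weighting and the toy discharge series are assumptions, not the paper's actual choices.

    ```python
    import numpy as np

    def nse(obs, sim):
        """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model
        is no better than predicting the observed mean."""
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - np.mean(obs)) ** 2)

    def combined_objective(obs, sim, w=0.5):
        """Weighted combination of NSE (emphasises high flows) and logNSE
        (emphasises low flows); the equal weighting is an assumption."""
        return w * nse(obs, sim) + (1.0 - w) * nse(np.log(obs), np.log(sim))

    # Toy discharge series (strictly positive, as logNSE requires)
    obs = np.array([1.0, 2.0, 5.0, 10.0, 4.0, 2.0])
    sim = np.array([1.1, 1.9, 5.2, 9.5, 4.2, 2.1])
    print(round(combined_objective(obs, sim), 3))
    ```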

  20. 29 CFR 1918.88 - Log operations.

    Science.gov (United States)

    2010-07-01

    ...) SAFETY AND HEALTH REGULATIONS FOR LONGSHORING Handling Cargo § 1918.88 Log operations. (a) Working in..., the employer shall ensure that employees remain clear of areas where logs being dumped could strike...

  1. Artificial intelligence approach to interwell log correlation

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Jong-Se [Korea Maritime University, Pusan(Korea); Kang, Joo Myung [Seoul National University, Seoul(Korea); Kim, Jung Whan [Korea National Oil Corp., Anyang(Korea)

    2000-04-30

    This paper describes a new approach to automated interwell log correlation using artificial intelligence and principal component analysis. The approach to correlate wire line logging data is on the basis of a large set of subjective rules that are intended to represent human logical processes. The data processed are mainly the qualitative information such as the characteristics of the shapes extracted along log traces. The apparent geologic zones are identified by pattern recognition for the specific characteristics of log trace collected as a set of objects by object oriented programming. The correlation of zones between wells is made by rule-based inference program. The reliable correlation can be established from the first principal component logs derived from both the important information around well bore and the largest common part of variances of all available well log data. Correlation with field log data shows that this approach can make interwell log correlation more reliable and accurate. (author). 6 refs., 7 figs.

  2. Tucker Wireline Open Hole Wireline Logging; FINAL

    International Nuclear Information System (INIS)

    Milliken, M.

    2002-01-01

    The Tucker Wireline unit ran a suite of open hole logs right behind the RMOTC logging contractor for comparison purposes. The tools included Dual Laterolog, Phased Induction, BHC Sonic, and Density-Porosity

  3. Modeling FBG sensors sensitivity from cryogenic temperatures to room temperature as a function of metal coating thickness

    Science.gov (United States)

    Vendittozzi, Cristian; Felli, Ferdinando; Lupi, Carla

    2018-05-01

    Fiber optics with photo-imprinted Bragg gratings have been studied in order to be used as temperature sensors in cryogenic applications. The main disadvantage presented by Fiber Bragg Grating (FBG) sensors is the significant drop in sensitivity as temperature decreases, mainly due to the critical lowering of the thermo-optic coefficient of the fiber and the very low thermal expansion coefficient (CTE) of fused silica at cryogenic temperatures. Thus, especially for the latter, it is important to enhance sensitivity to temperature by depositing a metal coating with a higher CTE. In this work the thermal sensitivity of metal-coated FBG sensors has been evaluated by considering their elongation under temperature variations in the cryogenic range, as compared to bare fiber sensors. To this purpose, a theoretical model simulating the elongation of metal-coated sensors has been developed. The model has been used to evaluate the behaviour of different metals which can be used as coatings (Ni, Cu, Al, Zn, Pb and In). The optimal coating thickness has been calculated at different fixed temperatures (from 5 K to 100 K) for each metal. It has been found that the effectiveness of the metal coating depends on thickness and operating temperature, in accordance with our previous experimental work and with theory.
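
    A hedged sketch of the stiffness-weighted composite-CTE idea behind such a model: the coating's larger thermal expansion strains the grating, raising the wavelength shift per kelvin. All material coefficients and the simple rule-of-mixtures form below are generic textbook assumptions, not values from the cited work.

    ```python
    import math

    def coated_fbg_sensitivity(thickness_um, alpha_coat, e_coat_gpa,
                               r_fiber_um=62.5, alpha_silica=0.55e-6,
                               e_silica_gpa=72.0, xi=8.6e-6, p_e=0.22,
                               wavelength_nm=1550.0):
        """Illustrative thermal sensitivity (nm/K) of a metal-coated FBG.

        Composite CTE = stiffness-weighted rule of mixtures over the fiber and
        coating cross-sections; the Bragg shift combines the thermo-optic term
        (xi) with the strain-optic-corrected thermal expansion (p_e).
        """
        a_fiber = math.pi * r_fiber_um**2
        r_out = r_fiber_um + thickness_um
        a_coat = math.pi * (r_out**2 - r_fiber_um**2)
        alpha_eff = ((e_silica_gpa * a_fiber * alpha_silica
                      + e_coat_gpa * a_coat * alpha_coat)
                     / (e_silica_gpa * a_fiber + e_coat_gpa * a_coat))
        return wavelength_nm * (xi + (1.0 - p_e) * alpha_eff)

    # A 100 um Zn-like coating (alpha ~ 30e-6 /K, E ~ 100 GPa) vs a bare fiber
    bare = coated_fbg_sensitivity(0.0, 30e-6, 100.0)
    coated = coated_fbg_sensitivity(100.0, 30e-6, 100.0)
    print(coated > bare)  # → True
    ```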

  4. A critical view on temperature modelling for application in weather derivatives markets

    International Nuclear Information System (INIS)

    Šaltytė Benth, Jūratė; Benth, Fred Espen

    2012-01-01

    In this paper we present a stochastic model for daily average temperature. The model contains seasonality, a low-order autoregressive component and a variance describing the heteroskedastic residuals. The model is estimated on daily average temperature records from Stockholm (Sweden). By comparing the proposed model with the popular model of Campbell and Diebold (2005), we point out some important issues to be addressed when modelling the temperature for application in weather derivatives market. - Highlights: ► We present a stochastic model for daily average temperature, containing seasonality, a low-order autoregressive component and a variance describing the heteroskedastic residuals. ► We compare the proposed model with the popular model of Campbell and Diebold (2005). ► Some important issues to be addressed when modelling the temperature for application in weather derivatives market are pointed out.
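A model of the family described (a seasonal mean plus low-order autoregressive residuals) can be sketched as follows; the functional form, seed and parameters are illustrative, not the ones estimated on the Stockholm record:

```python
import numpy as np

def fit_seasonal_ar1(temps):
    """Fit mean = a0 + a1*t + a2*sin + a3*cos, then AR(1) on the residuals."""
    t = np.arange(len(temps), dtype=float)
    X = np.column_stack([np.ones_like(t), t,
                         np.sin(2 * np.pi * t / 365.25),
                         np.cos(2 * np.pi * t / 365.25)])
    coef, *_ = np.linalg.lstsq(X, temps, rcond=None)
    resid = temps - X @ coef
    # lag-1 autocorrelation of the residuals estimates the AR(1) coefficient
    rho = resid[1:] @ resid[:-1] / (resid[:-1] @ resid[:-1])
    return coef, rho

# synthetic 10-year daily series: seasonal cycle + AR(1) noise
rng = np.random.default_rng(1)
t = np.arange(3650)
x = np.zeros(t.size)
for i in range(1, t.size):
    x[i] = 0.7 * x[i - 1] + rng.normal(0.0, 2.0)
temps = 8.0 + 10.0 * np.sin(2 * np.pi * t / 365.25) + x
coef, rho = fit_seasonal_ar1(temps)
```

On this synthetic series the fit recovers the seasonal amplitude (about 10) and the AR(1) coefficient (about 0.7); a heteroskedastic (e.g. seasonally varying) residual variance, as in the paper, would be fitted in a further step.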

  5. Generating porosity spectrum of carbonate reservoirs using ultrasonic imaging log

    Science.gov (United States)

    Zhang, Jie; Nie, Xin; Xiao, Suyun; Zhang, Chong; Zhang, Chaomo; Zhang, Zhansong

    2018-03-01

    Imaging logging tools provide an image of the borehole wall. Micro-resistivity imaging logging has been used to obtain borehole porosity spectra; however, resistivity imaging cannot cover the whole borehole wall. In this paper, we propose a method to calculate the porosity spectrum from ultrasonic imaging logging data. Based on the amplitude attenuation equation, we analyze the factors affecting wave propagation in the drilling fluid and the formation, and based on the bulk-volume rock model, the Wyllie equation and the Raymer equation, we establish several conversion models between the reflection coefficient β and porosity ϕ. We then use ultrasonic imaging logging and conventional wireline logging data to calculate the near-borehole formation porosity distribution spectrum. The porosity spectrum obtained from the ultrasonic imaging data is compared with the one from micro-resistivity imaging data; they turn out to be similar, with discrepancies caused by differences in borehole coverage and input data. We separate the porosity types by threshold-value segmentation and generate porosity-depth distribution curves by counting with equal depth spacing on the porosity image. Field application gives good results and demonstrates the effectiveness of our method.
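Of the conversion models mentioned, the Wyllie time-average equation is the classic one: slowness is the porosity-weighted average of fluid and matrix slowness, 1/V = ϕ/V_f + (1 − ϕ)/V_ma. A sketch inverting it for porosity (the default velocities are generic mud-filtrate and limestone values, not the paper's calibration):

```python
def wyllie_porosity(v, v_fluid=1500.0, v_matrix=6400.0):
    """Invert the Wyllie time-average equation for porosity.

    1/v = phi/v_fluid + (1 - phi)/v_matrix, solved for phi.
    Velocities in m/s; v_matrix ~ 6400 m/s is a typical carbonate value.
    """
    return (1.0 / v - 1.0 / v_matrix) / (1.0 / v_fluid - 1.0 / v_matrix)
```

By construction the formula returns 0 at matrix velocity and 1 at fluid velocity; an interval velocity of 4000 m/s maps to a porosity of roughly 0.18 with these defaults.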

  6. Modelling of aluminium sheet forming at elevated temperatures

    NARCIS (Netherlands)

    van den Boogaard, Antonius H.; Huetink, Han

    2004-01-01

    The formability of Al–Mg sheet can be improved considerably, by increasing the temperature. By heating the sheet in areas with large shear strains, but cooling it on places where the risk of necking is high, the limiting drawing ratio can be increased to values above 2.5. At elevated temperatures,

  7. Prediction of water temperature metrics using spatial modelling in ...

    African Journals Online (AJOL)

    Water temperature regime dynamics should be viewed regionally, where regional divisions have an inherent underpinning by an understanding of natural thermal variability. The aim of this research was to link key water temperature metrics to readily-mapped environmental surrogates, and to produce spatial images of ...

  8. A model of evaluating the pseudogap temperature for high ...

    Indian Academy of Sciences (India)

    The observation of a pseudogap in normal-state properties of high-temperature superconducting (HTS) oxide materials has raised many questions about its origin and its relation to superconductivity. Emery and Kivelson [1] first used the term pseudogap temperature for underdoped high-Tc materials. The temperature at ...

  9. Modelling and analysis of radial thermal stresses and temperature ...

    African Journals Online (AJOL)

    A theoretical investigation has been undertaken to study operating temperatures, heat fluxes and radial thermal stresses in the valves of a modern diesel engine with and without air-cavity. Temperatures, heat fluxes and radial thermal stresses were measured theoretically for both cases under all four thermal loading ...

  10. A linear regression model for predicting PNW estuarine temperatures in a changing climate

    Science.gov (United States)

    Pacific Northwest coastal regions, estuaries, and associated ecosystems are vulnerable to the potential effects of climate change, especially to changes in nearshore water temperature. While predictive climate models simulate future air temperatures, no such projections exist for...

  11. Selective logging in the Brazilian Amazon.

    Science.gov (United States)

    G. P. Asner; D. E. Knapp; E. N. Broadbent; P. J. C. Oliveira; M Keller; J. N. Silva

    2005-01-01

    Amazon deforestation has been measured by remote sensing for three decades. In comparison, selective logging has been mostly invisible to satellites. We developed a large-scale, high-resolution, automated remote-sensing analysis of selective logging in the top five timber-producing states of the Brazilian Amazon. Logged areas ranged from 12,075 to 19,823 square...

  12. Linking log quality with product performance

    Science.gov (United States)

    D. W. Green; Robert Ross

    1997-01-01

    In the United States, log grading procedures use visual assessment of defects, in relation to the log scaling diameter, to estimate the yield of lumber that may be expected from the log. This procedure was satisfactory when structural grades were based only on defect size and location. In recent years, however, structural products have increasingly been graded using a...

  13. Selective logging and its relation to deforestation

    Science.gov (United States)

    Gregory P. Asner; Michael Keller; Marco Lentini; Frank Merry; Souza Jr. Carlos

    2009-01-01

    Selective logging is a major contributor to the social, economic, and ecological dynamics of Brazilian Amazonia. Logging activities have expanded from low-volume floodplain harvests in past centuries to high-volume operations today that take about 25 million m3 of wood from the forest each year. The most common high-impact conventional and often illegal logging...

  14. Pacific Rim log trade: determinants and trends.

    Science.gov (United States)

    Donald F. Flora; Andrea L. Anderson; Wendy J. McGinnls

    1991-01-01

    Pacific Rim trade in softwood logs amounts to about $3 billion annually, of which the U.S. share is about $2 billion. Log exporting is a significant part of the forest economy in the Pacific Northwest. The 10 major Pacific Rim log-trading client and competitor countries differ widely in their roles in trade and in their policies affecting the industry.

  15. Well logging radioactive detector assembly

    International Nuclear Information System (INIS)

    Osburn, T.D.

    1992-01-01

    This patent describes an improved Dewar flask for the detector of a well logging instrument of the type having a radioactive logging sub with a sealed chamber and a radioactive source for emitting radioactive energy into the well formation, a radioactive energy detector for detecting gamma rays resulting from that emission, and means for pressing the sub against the wall of the well. It comprises: an inner housing formed of titanium and containing the detector; an outer housing formed of titanium, having a cylindrical side wall surrounding the inner housing and separated from it by a clearance which is evacuated, the outer housing being located within the sealed chamber in the sub of the instrument; a window section formed in the side wall of the outer housing adjacent the detector and on the side of the side wall closest to the wall of the well when the sub is pressed against it; and wherein the inner housing has a cylindrical side wall that is of lesser wall thickness than the side wall of the outer housing other than in the window section.

  16. A hierarchical model of daily stream temperature using air-water temperature synchronization, autocorrelation, and time lags

    Directory of Open Access Journals (Sweden)

    Benjamin H. Letcher

    2016-02-01

    Water temperature is a primary driver of stream ecosystems and commonly forms the basis of stream classifications. Robust models of stream temperature are critical as the climate changes, but estimating daily stream temperature poses several important challenges. We developed a statistical model that accounts for many challenges that can make stream temperature estimation difficult. Our model identifies the yearly period when air and water temperature are synchronized, accommodates hysteresis, incorporates time lags, deals with missing data and autocorrelation and can include external drivers. In a small stream network, the model performed well (RMSE = 0.59 °C), identified a clear warming trend (0.63 °C decade⁻¹) and a widening of the synchronized period (29 d decade⁻¹). We also carefully evaluated how missing data influenced predictions. Missing data within a year had a small effect on performance (∼0.05% average drop in RMSE with 10% fewer days with data). Missing all data for a year decreased performance (∼0.6 °C jump in RMSE), but this decrease was moderated when data were available from other streams in the network.
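A toy version of the air-water time-lag idea (far simpler than the paper's hierarchical model) picks the air-temperature lag with the highest correlation to water temperature and fits a line at that lag; the synthetic data below are illustrative:

```python
import numpy as np

def best_lag_fit(air, water, max_lag=10):
    """Return (lag, slope, intercept) of the best lagged linear fit."""
    def pairs(lag):
        return (air[:-lag], water[lag:]) if lag else (air, water)
    lag = max(range(max_lag + 1),
              key=lambda L: np.corrcoef(*pairs(L))[0, 1])
    slope, intercept = np.polyfit(*pairs(lag), 1)
    return lag, slope, intercept

# synthetic check: water follows air with a 3-day lag and damped amplitude
rng = np.random.default_rng(2)
n = 1000
air = 10 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 3, n)
water = np.empty(n)
water[3:] = 2.0 + 0.8 * air[:-3] + rng.normal(0, 0.2, n - 3)
water[:3] = water[3]
lag, slope, intercept = best_lag_fit(air, water)
```

The fit recovers the 3-day lag and the damping slope of 0.8; the paper's model additionally handles hysteresis, the synchronized period and missing data.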

  17. Glass Transition Temperature- and Specific Volume- Composition Models for Tellurite Glasses

    Energy Technology Data Exchange (ETDEWEB)

    Riley, Brian J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Vienna, John D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-09-01

    This report provides models for predicting composition-properties for tellurite glasses, namely specific gravity and glass transition temperature. Included are the partial specific coefficients for each model, the component validity ranges, and model fit parameters.
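Composition-property models of this kind are typically linear in the component fractions, p = Σᵢ xᵢ·aᵢ, with aᵢ the partial specific coefficients. A generic sketch; the component names and coefficient values below are invented for illustration and are not the report's fitted coefficients:

```python
def predict_property(fractions, coeffs):
    """Linear partial-specific-coefficient model: p = sum_i x_i * a_i."""
    assert abs(sum(fractions.values()) - 1.0) < 1e-9, "fractions must sum to 1"
    return sum(x * coeffs[c] for c, x in fractions.items())

# hypothetical Tg coefficients (degrees C per unit mass fraction)
tg_coeffs = {"TeO2": 330.0, "ZnO": 290.0, "Na2O": 150.0}
tg = predict_property({"TeO2": 0.75, "ZnO": 0.15, "Na2O": 0.10}, tg_coeffs)
```

Each coefficient would be fitted by least squares over glasses within the component validity ranges the report tabulates.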

  18. The infinite range Heisenberg model and high temperature superconductivity

    Science.gov (United States)

    Tahir-Kheli, Jamil

    1992-01-01

    The thesis deals with the theory of high temperature superconductivity from the standpoint of three-band Hubbard models. Chapter 1 of the thesis proposes a strongly coupled variational wavefunction that has the three-spin system of an oxygen hole and its two neighboring copper spins in a doublet and the background Cu spins in an eigenstate of the infinite range antiferromagnet. This wavefunction is expected to be a good "zeroth order" wavefunction in the superconducting regime of dopings. The three-spin polaron is stabilized by the hopping terms rather than the copper-oxygen antiferromagnetic coupling Jpd. Considering the effect of the copper-copper antiferromagnetic coupling Jdd, we show that the three-spin polaron cannot be pure Emery (Dg), but must have a non-negligible amount of doublet-u (Du) character for hopping stabilization. Finally, an estimate is made for the magnitude of the attractive coupling of oxygen holes. Chapter 2 presents an exact solution to a strongly coupled Hamiltonian for the motion of oxygen holes in a 1-D Cu-O lattice. The Hamiltonian separates into two pieces: one for the spin degrees of freedom of the copper and oxygen holes, and the other for the charge degrees of freedom of the oxygen holes. The spinon part becomes the Heisenberg antiferromagnet in 1-D that is soluble by the Bethe Ansatz. The holon piece is also soluble by a Bethe Ansatz with simple algebraic relations for the phase shifts. Finally, we show that the nearest neighbor Cu-Cu spin correlation increases linearly with doping and becomes positive at x [...] 0.70.

  19. Infinite-range Heisenberg model and high-temperature superconductivity

    Science.gov (United States)

    Tahir-Kheli, Jamil; Goddard, William A., III

    1993-11-01

    A strongly coupled variational wave function, the doublet spin-projected Néel state (DSPN), is proposed for oxygen holes in three-band models of high-temperature superconductors. This wave function has the three-spin system of the oxygen hole plus the two neighboring copper atoms coupled in a spin-1/2 doublet. The copper spins in the neighborhood of a hole are in an eigenstate of the infinite-range Heisenberg antiferromagnet (SPN state). The doublet three-spin magnetic polaron or hopping polaron (HP) is stabilized by the hopping terms tσ and tτ, rather than by the copper-oxygen antiferromagnetic coupling Jpd. Although the HP has a large projection onto the Emery (Dg) polaron, a non-negligible amount of doublet-u (Du) character is required for optimal hopping stabilization. This is due to Jdd, the copper-copper antiferromagnetic coupling. For the copper spins near an oxygen hole, the copper-copper antiferromagnetic coupling can be considered to be almost infinite ranged, since the copper-spin-correlation length in the superconducting phase (0.06-0.25 holes per in-plane copper) is approximately equal to the mean separation of the holes (between 2 and 4 lattice spacings). The general DSPN wave function is constructed for the motion of a single quasiparticle in an antiferromagnetic background. The SPN state allows simple calculations of various couplings of the oxygen hole with the copper spins. The energy minimum is found at symmetry (π/2,π/2) and the bandwidth scales with Jdd. These results are in agreement with exact computations on a lattice. The coupling of the quasiparticles leads to an attraction of holes and its magnitude is estimated.

  20. Spectral Noise Logging for well integrity analysis in the mineral water well in Asselian aquifer

    Directory of Open Access Journals (Sweden)

    R.R. Kantyukov

    2017-06-01

    This paper describes a mineral water well whose salinity, according to lab tests, has been decreasing. A well integrity package including Spectral Noise Logging (SNL), High-Precision Temperature (HPT) logging and electromagnetic defectoscopy (EmPulse) was run in the well, which allowed casing leaks and the fresh water source to be located. All logging data are thoroughly analyzed and recommendations for workover are given. The SNL-HPT-EmPulse survey made it possible to avoid abandoning the well.

  1. Adoption of projected mortality table for the Slovenian market using the Poisson log-bilinear model to test the minimum standard for valuing life annuities

    Directory of Open Access Journals (Sweden)

    Darko Medved

    2015-01-01

    With the introduction of Solvency II, a consistent market approach to the valuation of insurance assets and liabilities is required. For the best estimate of life annuity provisions, one should estimate the longevity risk of the insured population in Slovenia. In this paper, the current minimum standard in Slovenia for calculating pension annuities is tested using the Lee-Carter model. In particular, the mortality of the Slovenian population is projected using the best fit from the stochastic mortality projection method. The projected mortality statistics are then corrected for the selection effect and compared with the current minimum standard.
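The Lee-Carter structure behind such projections is log m(x,t) = a_x + b_x·k_t, fitted by a rank-1 SVD of the centered log-mortality matrix; the period index k_t is then projected forward as a random walk with drift. A minimal sketch on exact synthetic data (not Slovenian mortality):

```python
import numpy as np

def lee_carter(m):
    """Fit log m(x,t) = a_x + b_x * k_t by rank-1 SVD (Lee-Carter)."""
    logm = np.log(m)
    a = logm.mean(axis=1)                      # age pattern
    U, s, Vt = np.linalg.svd(logm - a[:, None], full_matrices=False)
    b, k = U[:, 0], s[0] * Vt[0]
    k *= b.sum()                               # standard normalization:
    b /= b.sum()                               # sum(b) = 1, sum(k) = 0
    drift = (k[-1] - k[0]) / (len(k) - 1)      # random-walk-with-drift slope
    return a, b, k, drift

# exact rank-1 synthetic mortality surface: recovery should be exact
ages, years = 10, 30
a0 = np.linspace(-8.0, -2.0, ages)
b0 = np.linspace(0.05, 0.15, ages); b0 /= b0.sum()
k0 = np.linspace(20.0, -20.0, years)           # sums to zero
m = np.exp(a0[:, None] + np.outer(b0, k0))
a, b, k, drift = lee_carter(m)
```

Because the synthetic surface is exactly rank 1 after centering, the fit recovers a0, b0 and k0 to machine precision; real data would leave a residual, and the drift would feed the mortality projection used to price the annuities.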

  2. Error Checking for Chinese Query by Mining Web Log

    Directory of Open Access Journals (Sweden)

    Jianyong Duan

    2015-01-01

    For a search engine, mistyped queries are a common phenomenon. This paper uses a web log as the training set for query error checking. Queries are analyzed and checked with an n-gram language model trained on the web log. Features including the query words and their number are introduced into the model. At the same time, a data smoothing algorithm is used to solve the data sparseness problem, improving the overall accuracy of the n-gram model. The experimental results show that the approach is effective.
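The core of such a checker can be sketched as a character-bigram model with add-one smoothing, trained on logged queries and used to flag low-likelihood inputs; this toy stands in for (and is much simpler than) the paper's word-level model and its smoothing scheme:

```python
import math
from collections import Counter

class BigramChecker:
    def __init__(self, queries):
        self.uni, self.bi = Counter(), Counter()
        for q in queries:
            toks = ["<s>"] + list(q)
            for a, b in zip(toks, toks[1:]):
                self.uni[a] += 1
                self.bi[a, b] += 1
        self.V = len({c for q in queries for c in q}) + 1  # +1 for <s>

    def score(self, q):
        """Length-normalized log-probability with add-one smoothing."""
        toks = ["<s>"] + list(q)
        lp = sum(math.log((self.bi[a, b] + 1) / (self.uni[a] + self.V))
                 for a, b in zip(toks, toks[1:]))
        return lp / len(q)

checker = BigramChecker(["hello world"] * 50 + ["log analysis"] * 50)
ok, typo = checker.score("hello"), checker.score("hxllo")
```

Add-one smoothing keeps unseen bigrams like ("h", "x") at a small nonzero probability instead of zero, so the typo scores lower than the well-formed query and can be flagged against a threshold.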

  3. A Comparative Analysis of the Value of Information in a Continuous Time Market Model with Partial Information: The Cases of Log-Utility and CRRA

    Directory of Open Access Journals (Sweden)

    Zhaojun Yang

    2011-01-01

    We study what value an agent in a generalized Black-Scholes model with partial information attributes to the complementary information. To do this, we study the utility maximization problem from terminal wealth in the two cases of partial and full information. We assume that the drift term of the risky asset is a dynamic process of general linear type and that the two levels of observation correspond to whether this drift term is observable or not. Applying methods from stochastic filtering theory, we derive an analytically tractable formula for the value of information in the case of logarithmic utility. For the case of constant relative risk aversion (CRRA) we derive a semianalytical formula, which uses as input the numerical solution of a system of ODEs. For both cases we present a comparative analysis.

  4. Comparison of high-temperature flare models with observations and implications for the low-temperature flare

    International Nuclear Information System (INIS)

    Machado, M.E.; Emslie, A.G.

    1979-01-01

    We analyze EUV data from the Harvard College Observatory and Naval Research Laboratory instruments on board the Skylab Apollo Telescope Mount, together with SOLRAD 9 X-ray data, in order to empirically deduce the variation of emission measure with temperature in the atmosphere of a number of solar flares. From these data we construct a "mean" differential emission measure profile Q(T) for a flare, which we find to be characterized by a low-lying plateau at temperatures of a few hundred thousand K, representative of a thin transition zone at these temperatures. We then compare this empirical profile with that predicted by a number of theoretical models, each of which represents a solution of the energy equation for the flare under various simplifying assumptions. In this way we not only deduce estimates of various flare parameters, such as gas pressure, but also gain insight into the validity of the various modeling assumptions employed. We find that realistic flare models must include both conductive and radiative terms in the energy equation, and that hydrodynamic terms may be important at low temperatures. Considering only models which neglect this hydrodynamic term, we compute conductive fluxes at various levels in the high-temperature plasma and compare them to the observed radiated power throughout the atmosphere, with particular reference to the 1973 September 5 event, which is rich in observations throughout most of the electromagnetic spectrum. This comparison yields results which reinforce our belief in the dominance of the conduction and radiation terms in the flare energy balance. The implications of this result for flare models in general are discussed; in particular, it is shown that the inclusion of the conductive term into models which have hitherto neglected it can perhaps resolve some of the observational difficulties with such models.

  5. Semi-automatic logarithmic converter of logs

    International Nuclear Information System (INIS)

    Gol'dman, Z.A.; Bondar's, V.V.

    1974-01-01

    An original semi-automatic converter was developed for converting BK resistance logging charts and the time interval, ΔT, of acoustic logs from a linear to a logarithmic scale with a specific ratio, for subsequent combination with neutron-gamma logging charts in the operative interpretation of logging materials by a normalization method. The converter increases productivity by producing curves different from those obtained by manual, pointwise processing. The equipment operates reliably and is simple to use. (author)

  6. Electronic Modeling and Design for Extreme Temperatures, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to develop electronics for operation at temperatures that range from -230oC to +130oC. This new technology will minimize the requirements for external...

  7. Quasispin model of itinerant magnetism: High-temperature theory

    International Nuclear Information System (INIS)

    Liu, S.H.

    1977-01-01

    The high-temperature properties of itinerant magnetic systems are examined by using the coherent-potential approximation. We assume a local moment on each atom so that at elevated temperatures there is a number of reversed spins. The coherent potential is solved, and from that the moment on each atom is determined self-consistently. It is found that when the condition for ferromagnetic ordering is satisfied, the local moments persist even above the critical temperature. Conversely, if local moments do not exist at high temperatures, the system can at most condense into a spin-density-wave state. Furthermore, spin-flip scatterings of the conduction electrons from the local moments give rise to additional correlation not treated in the coherent-potential approximation. This correlation energy is an important part of the coupling energy of the local moments. The relations between our work and the theories of Friedel, Hubbard, and others are discussed

  8. GeoTemp™ 1.0: A MATLAB-based program for the processing, interpretation and modelling of geological formation temperature measurements

    Science.gov (United States)

    Ricard, Ludovic P.; Chanu, Jean-Baptiste

    2013-08-01

    The evaluation of potential and resources during geothermal exploration requires accurate and consistent temperature characterization and modelling of the sub-surface. Existing interpretation and modelling approaches for 1D temperature measurements focus mainly on vertical heat conduction, with only a few approaches that deal with advective heat transport. Thermal regimes are strongly correlated with rock and fluid properties. Currently, no consensus exists for the identification of the thermal regime and the analysis of such datasets. We developed a new framework allowing the identification of thermal regimes by rock formation and the analysis and modelling of wireline logging and discrete temperature measurements, taking into account geological, geophysical and petrophysical data. This framework has been implemented in the GeoTemp™ software package, which allows complete thermal characterization and modelling at the formation scale and provides a set of standard tools for the processing of wireline and discrete temperature data. GeoTemp™ operates via a user-friendly graphical interface written in Matlab that allows semi-automatic calculation, display and export of the results. Output results can be exported as Microsoft Excel spreadsheets or vector graphics of publication quality. GeoTemp™ is illustrated here with an example geothermal application from Western Australia and can be used for academic, teaching and professional purposes.
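The per-formation part of such a workflow reduces to fitting T(z) = T₀ + G·z within each interval: a near-linear fit suggests a conductive regime, while systematic curvature hints at advective transport. A sketch on a synthetic two-formation profile (this is an illustration of the idea, not GeoTemp™ code):

```python
import numpy as np

def formation_gradients(depth, temp, tops):
    """Least-squares geothermal gradient (K/km) for each formation interval."""
    grads = []
    for top, base in zip(tops, tops[1:]):
        mask = (depth >= top) & (depth < base)
        G, _T0 = np.polyfit(depth[mask], temp[mask], 1)
        grads.append(G * 1000.0)
    return grads

# two formations: 25 K/km above 1000 m, 40 K/km below (continuous at 1000 m)
z = np.arange(0.0, 2000.0, 10.0)
T = np.where(z < 1000.0, 15.0 + 0.025 * z, 40.0 + 0.040 * (z - 1000.0))
grads = formation_gradients(z, T, [0.0, 1000.0, 2000.0])
```

The gradient contrast between intervals is what ties the thermal profile back to formation-scale rock properties such as thermal conductivity.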

  9. Identifying the optimal supply temperature in district heating networks - A modelling approach

    DEFF Research Database (Denmark)

    Mohammadi, Soma; Bojesen, Carsten

    2014-01-01

    The aim of this study is to develop a model for thermo-hydraulic calculation of low-temperature DH systems. The modelling is performed with emphasis on transient heat transfer in pipe networks. A pseudo-dynamic approach is adopted to model the District Heating Network [DHN] behaviour, which estimates the temperature dynamically while the flow and pressure are calculated on the basis of steady-state conditions. The implicit finite element method is applied to simulate the transient temperature behaviour in the network. Pipe network heat losses, pressure drops in the network and the return temperature to the plant are calculated in the developed model. The model will eventually serve as a basis for finding the optimal supply temperature in an existing DHN in later work. The modelling results are used as decision support for the existing DHN, proposing possible modifications to operate at the optimal supply temperature.
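The steady-state backbone of such a thermo-hydraulic model is the exponential decay of supply temperature toward the surrounding ground temperature along each pipe, with the pseudo-dynamic part adding the transport delay L/v. A sketch of the per-segment energy balance (the parameter values are illustrative, not from the paper):

```python
import math

def pipe_outlet_temp(t_in, t_ground, u_per_m, length_m, mdot, cp=4186.0):
    """Steady-state outlet temperature of a buried pipe segment.

    Energy balance mdot*cp*dT/dx = -u_per_m*(T - t_ground) gives an
    exponential approach to ground temperature over the pipe length.
    u_per_m is the heat-loss coefficient per metre of pipe, W/(m*K).
    """
    return t_ground + (t_in - t_ground) * math.exp(
        -u_per_m * length_m / (mdot * cp))

# 1 km segment, 10 kg/s of water, heat-loss coefficient 1 W/(m*K)
t_out = pipe_outlet_temp(80.0, 8.0, 1.0, 1000.0, 10.0)
```

Lowering the supply temperature shrinks the (t_in − t_ground) driving term and hence the network heat loss, which is the trade-off the optimal-supply-temperature search exploits.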

  10. A Universal Logging System for LHCb Online

    International Nuclear Information System (INIS)

    Nikolaidis, Fotis; Brarda, Loic; Garnier, Jean-Christophe; Neufeld, Niko

    2011-01-01

    A log is a recording of a system's activity, intended to help a system administrator trace back an attack, find the causes of a malfunction, and troubleshoot in general. The fact that logs may be the only information an administrator has about an incident makes the logging system a crucial part of an IT infrastructure. In large-scale infrastructures, such as LHCb Online, where quite a few GB of logs are produced daily, it is impossible for a human to review all of them. Moreover, a great percentage of them is just noise. Clearly, a more automated and sophisticated approach is needed. In this paper, we present a low-cost centralized logging system which allows us to do in-depth analysis of every log.

  11. A comparison between elemental logs and core data

    International Nuclear Information System (INIS)

    Kerr, S.A.; Grau, J.A.; Schweitzer, J.S.

    1992-01-01

    Neutron-induced gamma-ray spectroscopy, of prompt capture and delayed activation, together with natural gamma-ray measurements, provides a borehole elemental analysis to characterize rock matrix composition. This study involved extensive core and log data in two wells. One well was drilled with a barite-weighted oil-based mud through a shallow marine sand. The other was drilled with fresh water-based mud through a channel sand, mudstone sequence overlying limestone. The results illustrate the importance of a suitable core sampling strategy and the problems associated with matching core to log data. Possible inaccuracies from the modelling of Ca-, Fe- and S-bearing minerals have been determined. A method for correcting the total measured aluminium concentration for that due to the borehole mud has been successfully tested against aluminium concentrations measured in the cleaned core samples. Estimates of the overall accuracy and precision of the elemental logging concentrations are obtained by comparing the log results with those obtained from the laboratory core analysis. A comprehensive core elemental analysis can also provide useful insight into the way other logs, such as the photoelectric factor or formation thermal neutron macroscopic absorption cross section, are influenced by minor and trace elements. Differences between calculated values from elemental logs and measured macroscopic parameters provide additional data for a more detailed understanding of the rock properties. (Author)

  12. Crop model improvement reduces the uncertainty of the response to temperature of multi-model ensembles

    DEFF Research Database (Denmark)

    Maiorano, Andrea; Martre, Pierre; Asseng, Senthold

    2017-01-01

    of models needed in an MME. Herein, 15 wheat growth models of a larger MME were improved through re-parameterization and/or incorporating or modifying heat stress effects on phenology, leaf growth and senescence, biomass growth, and grain number and size using detailed field experimental data from the USDA...... ensemble percentile range) of grain yields simulated by the MME on average by 39% in the calibration data set and by 26% in the independent evaluation data set for crops grown in mean seasonal temperatures >24 °C. MME mean squared error in simulating grain yield decreased by 37%. A reduction in MME...

  13. Wood moisture monitoring during log house thermal insulation mounting

    Directory of Open Access Journals (Sweden)

    Pavla Kotásková

    2011-01-01

    The current designs of thermal insulation for buildings concentrate on achieving the required heat transmission coefficient. However, another factor that cannot be neglected is the assessment of possible water vapour condensation inside the construction. The aim of the study was to find out whether the designed modification of the cladding structure of an existing log house will or will not lead to a risk of water vapour condensation in the walls after additional thermal insulation is mounted. Condensation could increase the moisture of the walls and consequently of the constructional timber, which would reduce the strength of the timber construction and lead to wood degradation by biotic factors: wood-destroying insects, mildew or wood-destroying fungi. The main task was to compare the theoretically established moisture values of the constructional timber with the values measured inside the construction, using a specific example of a thermally insulated log house. Three versions of thermal insulation were explored to find the log house reconstruction optimal for living purposes. Two versions deal with the cladding structure insulated from the interior; the third deals with external insulation. In a calculation model the results can be affected to a great degree by the input values (boundary conditions). This especially concerns the vapour barrier diffusion resistance factor, which is entered according to the producer's specifications; however, its real value can be lower, as it depends on the perfection and correctness of the technological procedure. That is why the study also includes thermal-technical calculations of all designed insulation versions in the most unfavourable situation, which includes degradation of the vapour barrier down to 10% efficiency, i.e. reduction of the diffusion resistance factor to 10% of the original value.

  14. Simulation Control Graphical User Interface Logging Report

    Science.gov (United States)

    Hewling, Karl B., Jr.

    2012-01-01

    One of the many tasks of my project was to revise the code of the Simulation Control Graphical User Interface (SIM GUI) to enable logging functionality to a file. I was also tasked with developing a script that directed the startup and initialization flow of the various LCS software components, which ensures that a software component will not spin up until all of its dependencies have been configured properly. I also assisted hardware modelers in verifying the configuration of models after they had been upgraded to a new software version, developing code that analyzes the MDL files to determine whether any errors were generated by the upgrade process. Another of the projects assigned to me was supporting the End-to-End Hardware/Software Daily Tag-up meeting.

  15. Two-dimensional model of laser alloying of binary alloy powder with interval of melting temperature

    Science.gov (United States)

    Knyzeva, A. G.; Sharkeev, Yu. P.

    2017-10-01

    The paper contains a two-dimensional model of laser-beam melting of binary alloy powders. The model takes into consideration melting of the alloy over a temperature interval between the solidus and liquidus temperatures. The external source corresponds to a laser beam with an energy density distributed according to a Gaussian law. The source moves along the treated surface following a given trajectory. The model allows investigating the temperature distribution and the thickness of the powder layer as functions of the technological parameters.
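A Gaussian-distributed moving source of the kind described has the standard surface-flux form q(x, y, t) = (2P/(πr₀²))·exp(−2((x − vt)² + y²)/r₀²) for beam power P, spot radius r₀ and scan speed v along x. A sketch (the power, spot size and speed are illustrative, not the paper's values):

```python
import math

def gauss_flux(x, y, t, power=200.0, r0=1e-3, speed=0.01):
    """Surface heat flux (W/m^2) of a Gaussian beam scanning along x.

    The prefactor 2P/(pi*r0^2) makes the flux integrate to the beam power
    over the plane; the spot centre sits at x = speed * t.
    """
    peak = 2.0 * power / (math.pi * r0 ** 2)
    dx = x - speed * t
    return peak * math.exp(-2.0 * (dx ** 2 + y ** 2) / r0 ** 2)
```

This flux would enter the 2D heat equation as the boundary source term, with the solidus-liquidus interval handled, for example, through an effective heat capacity between the two temperatures.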

  16. A neutron well logging system

    International Nuclear Information System (INIS)

    1980-01-01

    A pulsed neutron well logging system using a sealed-off neutron generator tube is provided with a programmable digital neutron output control system. The control system monitors the target beam current and compares a function of this current with a pre-programmed control function to develop a control signal for the neutron generator. The control signal is used in a series regulator to control the average replenisher current of the neutron generator tube. The programmable digital control system of the invention also provides digital control signals as a function of time to provide ion source voltages. This arrangement may be utilized to control neutron pulse durations and repetition rates or to produce other modulated wave forms for intensity modulating the output of the neutron generator as a function of time. (Auth.)

  17. Indonesian commercial bus drum brake system temperature model

    International Nuclear Information System (INIS)

    Wibowo, D. B.; Haryanto, I.; Laksono, N. P.

    2016-01-01

    The brake system is the most significant aspect of automobile safety: it must be able to slow the vehicle quickly and reliably under varying conditions. Commercial buses in Indonesia, which often stop suddenly and travel at high initial velocities, heat their brakes significantly. The thermal analysis shows that, for a bus with a laden mass of 15 tons and an initial velocity of 80 km/h, the temperature increases with time, reaching 270.1 °C when stopping on a flat road and 311.2 °C on a road with declination angle ø = 20°. These temperatures exceed the evaporation temperatures of DOT 3 and DOT 4 brake fluids. Moreover, braking temperatures of this magnitude can lower the friction coefficient by more than 30%. Repeated brake applications and high-g decelerations also cause the brake lining to wear out quickly, so that it must be replaced every month, and give rise to large thermal stresses which can lead to thermal cracking or thermal fatigue cracking. The resulting brake fade phenomenon could be the cause of many bus accidents in Indonesia through failure of the braking function. The chance of an accident is even greater when a worn brake is not replaced immediately, which can create hot spots as the rivets contact the brake drum, or when the brake fluid is not changed for more than two years, which can lower its evaporation temperature because of its hygroscopic nature.
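A back-of-envelope check of the temperature rise in a single stop can be made with a single-lump energy balance: the braking energy (kinetic, plus potential energy lost on a downhill) is assumed to be absorbed by the drum mass. The drum mass, specific heat and absorbed fraction below are illustrative guesses, not values from the paper's finite-element analysis.

```python
import math

def brake_temp_rise(vehicle_mass, speed, drum_mass, c_drum=500.0,
                    slope_deg=0.0, stop_distance=0.0, g=9.81,
                    absorbed_fraction=1.0):
    """Single-lump estimate of the drum temperature rise (K) for one stop.
    vehicle_mass in kg, speed in m/s, drum_mass = total mass of all drums
    in kg, c_drum = drum specific heat in J/(kg K)."""
    kinetic = 0.5 * vehicle_mass * speed ** 2
    potential = vehicle_mass * g * stop_distance * math.sin(math.radians(slope_deg))
    heat = absorbed_fraction * (kinetic + potential)
    return heat / (drum_mass * c_drum)
```

The downhill term reproduces the qualitative result above: the same stop on a 20° declination yields a larger temperature rise than on a flat road.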

  18. Indonesian commercial bus drum brake system temperature model

    Science.gov (United States)

    Wibowo, D. B.; Haryanto, I.; Laksono, N. P.

    2016-03-01

    The brake system is the most significant aspect of automobile safety: it must be able to slow the vehicle quickly and reliably under varying conditions. Commercial buses in Indonesia, which often stop suddenly and travel at high initial velocities, heat their brakes significantly. The thermal analysis shows that, for a bus with a laden mass of 15 tons and an initial velocity of 80 km/h, the temperature increases with time, reaching 270.1 °C when stopping on a flat road and 311.2 °C on a road with declination angle ø = 20°. These temperatures exceed the evaporation temperatures of DOT 3 and DOT 4 brake fluids. Moreover, braking temperatures of this magnitude can lower the friction coefficient by more than 30%. Repeated brake applications and high-g decelerations also cause the brake lining to wear out quickly, so that it must be replaced every month, and give rise to large thermal stresses which can lead to thermal cracking or thermal fatigue cracking. The resulting brake fade phenomenon could be the cause of many bus accidents in Indonesia through failure of the braking function. The chance of an accident is even greater when a worn brake is not replaced immediately, which can create hot spots as the rivets contact the brake drum, or when the brake fluid is not changed for more than two years, which can lower its evaporation temperature because of its hygroscopic nature.

  19. Indonesian commercial bus drum brake system temperature model

    Energy Technology Data Exchange (ETDEWEB)

    Wibowo, D. B., E-mail: rmt.bowo@gmail.com; Haryanto, I., E-mail: ismoyo2001@yahoo.de; Laksono, N. P., E-mail: priyolaksono89@gmail.com [Mechanical Engineering Dept., Faculty of Engineering, Diponegoro University (Indonesia)

    2016-03-29

    The brake system is the most significant aspect of automobile safety: it must be able to slow the vehicle quickly and reliably under varying conditions. Commercial buses in Indonesia, which often stop suddenly and travel at high initial velocities, heat their brakes significantly. The thermal analysis shows that, for a bus with a laden mass of 15 tons and an initial velocity of 80 km/h, the temperature increases with time, reaching 270.1 °C when stopping on a flat road and 311.2 °C on a road with declination angle ø = 20°. These temperatures exceed the evaporation temperatures of DOT 3 and DOT 4 brake fluids. Moreover, braking temperatures of this magnitude can lower the friction coefficient by more than 30%. Repeated brake applications and high-g decelerations also cause the brake lining to wear out quickly, so that it must be replaced every month, and give rise to large thermal stresses which can lead to thermal cracking or thermal fatigue cracking. The resulting brake fade phenomenon could be the cause of many bus accidents in Indonesia through failure of the braking function. The chance of an accident is even greater when a worn brake is not replaced immediately, which can create hot spots as the rivets contact the brake drum, or when the brake fluid is not changed for more than two years, which can lower its evaporation temperature because of its hygroscopic nature.

  20. Theoretical model and optimization of a novel temperature sensor based on quartz tuning fork resonators

    International Nuclear Information System (INIS)

    Xu Jun; You Bo; Li Xin; Cui Juan

    2007-01-01

    To accurately measure temperatures, a novel temperature sensor based on a quartz tuning fork resonator has been designed. The operating principle of the quartz tuning fork temperature sensor is that the resonant frequency of the quartz resonator changes with temperature. The tuning fork resonator was designed with a new doubly rotated cut and operates in a flexural vibration mode as a temperature sensor. The characteristics of the temperature sensor were evaluated, and the results met the development targets for the sensor. A theoretical model for temperature sensing has been developed. The sensor structure, including the tuning fork geometry, the tine electrode pattern and the size of the sensor's elements, was analysed by the finite element method (FEM) and optimized. The performance curve of output versus measured temperature is given. The results from theoretical analysis and experiments indicate that the sensor's sensitivity can reach 60 ppm °C⁻¹ over a measured temperature range of 0 to 100 °C.
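As a sketch of the inversion implied by the quoted sensitivity, the snippet below converts a measured resonant frequency back to temperature. The linear frequency-temperature relation and the 32 768 Hz nominal frequency are our simplifying assumptions; only the 60 ppm/°C figure comes from the abstract.

```python
def temperature_from_frequency(f, f0, sensitivity=60e-6, t0=0.0):
    """Invert an assumed linear relation f = f0 * (1 + sensitivity*(T - t0)),
    where `sensitivity` is the fractional frequency shift per degree C
    and `t0` the reference temperature of the nominal frequency f0."""
    return t0 + (f / f0 - 1.0) / sensitivity
```

A real doubly rotated cut would have a calibrated, mildly non-linear frequency-temperature curve; a polynomial fit would replace the linear inversion here.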

  1. Room temperature ionic liquids: A simple model. Effect of chain length and size of intermolecular potential on critical temperature.

    Science.gov (United States)

    Chapela, Gustavo A; Guzmán, Orlando; Díaz-Herrera, Enrique; del Río, Fernando

    2015-04-21

    A model of a room temperature ionic liquid can be represented as an ion attached to an aliphatic chain mixed with a counter ion. The simple model used in this work is based on a short rigid tangent square-well chain with an ion, represented by a hard sphere interacting with a Yukawa potential, at the head of the chain, mixed with a counter ion represented likewise by a hard sphere interacting with a Yukawa potential of the opposite sign. The length of the chain and the depth of the intermolecular forces are varied in order to understand which of these factors is responsible for the lowering of the critical temperature. It is the large difference between the ionic and the dispersion potentials that explains this lowering. Calculation of liquid-vapor equilibrium orthobaric curves is used to estimate the critical points of the model. Vapor pressures are used to estimate the triple point of the different models, in order to calculate the span of temperatures over which they remain liquid. Surface tensions and interfacial thicknesses are also reported.
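The ionic part of the interaction described above can be written down directly. The sketch below evaluates a hard-sphere Yukawa pair potential; the contact energy, screening parameter and reduced units are chosen for illustration rather than taken from the paper.

```python
import math

def hs_yukawa(r, sigma=1.0, epsilon=1.0, kappa=1.8, sign=+1):
    """Hard-sphere Yukawa pair potential in reduced units: infinite
    overlap repulsion inside the hard core of diameter `sigma`, a
    screened-Coulomb (Yukawa) tail outside.  `sign` = +1 for like-charge
    pairs (repulsive tail), -1 for unlike pairs (attractive tail)."""
    if r < sigma:
        return math.inf
    return sign * epsilon * sigma * math.exp(-kappa * (r - sigma)) / r
```

In a simulation of the full model this pair term would act between the ionic head spheres, while the chain beads interact through the square-well (dispersion) potential.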

  2. A Comparative Study of Cox Regression vs. Log-Logistic ...

    African Journals Online (AJOL)

    Colorectal cancer is a common and lethal disease, with different incidence rates in different parts of the world, and is considered the third leading cause of cancer-related deaths. In the present study, using the non-parametric Cox model and the parametric log-logistic model, factors influencing survival of patients with colorectal ...
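For reference, the parametric model named in the title has a closed-form survival function, S(t) = 1 / (1 + (t/α)^β). The sketch below evaluates it; the study's fitted parameter values are not reproduced here.

```python
def log_logistic_survival(t, alpha, beta):
    """Survival function of the log-logistic model: the probability of
    surviving beyond time t, with `alpha` the median survival time and
    `beta` the shape parameter."""
    return 1.0 / (1.0 + (t / alpha) ** beta)
```

One attraction of the log-logistic model over Cox regression is exactly this closed form: survival probabilities and median times come directly from the fitted parameters.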

  3. Scenario and modelling uncertainty in global mean temperature change derived from emission driven Global Climate Models

    Science.gov (United States)

    Booth, B. B. B.; Bernie, D.; McNeall, D.; Hawkins, E.; Caesar, J.; Boulton, C.; Friedlingstein, P.; Sexton, D.

    2012-09-01

    We compare future changes in global mean temperature in response to different future scenarios which, for the first time, arise from an emission-driven rather than concentration-driven perturbed parameter ensemble of a Global Climate Model (GCM). These new GCM simulations sample uncertainties in atmospheric feedbacks, land carbon cycle, ocean physics and aerosol sulphur cycle processes. We find broader ranges of projected temperature responses when considering emission-driven rather than concentration-driven simulations (with 10-90 percentile ranges of 1.7 K for the aggressive mitigation scenario up to 3.9 K for the high-end business-as-usual scenario). A small minority of simulations, resulting from combinations of strong atmospheric feedbacks and carbon cycle responses, show temperature increases in excess of 9 K under RCP8.5 and, even under aggressive mitigation (RCP2.6), temperatures in excess of 4 K. While the simulations point to much larger temperature ranges for emission-driven experiments, they do not change existing expectations (based on previous concentration-driven experiments) of the timescales over which different sources of uncertainty are important. The new simulations sample a range of future atmospheric concentrations for each emission scenario. Both in the case of SRES A1B and the Representative Concentration Pathways (RCPs), the concentration pathways used to drive GCM ensembles lie towards the lower end of our simulated distribution. This design decision (a legacy of previous assessments) is likely to lead concentration-driven experiments to under-sample strong feedback responses. Our ensemble of emission-driven simulations spans the global temperature response of other multi-model frameworks except at the low end, where combinations of low climate sensitivity and low carbon cycle feedbacks lead to responses outside our ensemble range. The ensemble simulates a number of high-end responses which lie above the CMIP5 carbon

  4. Modelling of Temperature Profiles and Transport Scaling in Auxiliary Heated Tokamaks

    DEFF Research Database (Denmark)

    Callen, J.D.; Christiansen, J.P.; Cordey, J.G.

    1987-01-01

    The temperature profiles produced by various heating profiles are calculated from local heat transport models. The models take the heat flux to be the sum of heat diffusion and a non-diffusive heat flow, consistent with local measurements of heat transport. Two models are developed analytically in detail: (i) a heat pinch or excess temperature gradient model with constant coefficients; and (ii) a non-linear heat diffusion coefficient (χ) model. Both models predict weak (≲20%) temperature profile responses to physically relevant changes in the heat deposition profile – primarily because … time, the heating effectiveness η, and the energy offset W(0). Considering both the temperature profile responses and the global transport scaling, the constant heat pinch or excess temperature gradient model is found to best characterize the present JET data. Finally, new methods are proposed …

  5. Logistic networks modeling with inventory carrying costs calculated using inventory turnover

    Directory of Open Access Journals (Sweden)

    Ricardo Hamad

    2011-01-01

    Inventory carrying costs have become increasingly important in the analysis of trade-offs and in decision-making about logistic networks. This paper proposes a methodology for treating inventory carrying costs based on the expected inventory coverage (days on hand) at each facility, incorporated into the mathematical model for locating plants and/or distribution centers in multi-echelon logistic networks; it represents a new version of the model presented in Hamad and Gualda (2008). The main contributions of this methodology compared with other methods found in the literature are the simplicity of its application, the inclusion of constraints related to warehousing capacity, and the consideration of total inventory cost, rather than only the cost of the products being modeled.
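The carrying-cost treatment described above can be sketched as a one-line conversion from days-on-hand coverage to an annual cost at one echelon; the 15% annual carrying rate below is an illustrative assumption, not a figure from the paper.

```python
def carrying_cost(annual_flow, unit_value, days_on_hand, carrying_rate=0.15):
    """Annual inventory carrying cost at one facility: the average
    inventory value implied by the days-on-hand coverage, times an
    annual carrying rate (capital, storage, risk)."""
    avg_inventory_value = annual_flow * unit_value * days_on_hand / 365.0
    return avg_inventory_value * carrying_rate
```

In a network-design model, this cost would appear in the objective function for every candidate plant or distribution center, so locations with longer expected coverage are penalised accordingly.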

  6. Modeling thermal spike driven reactions at low temperature and application to zirconium carbide radiation damage

    Science.gov (United States)

    Ulmer, Christopher J.; Motta, Arthur T.

    2017-11-01

    The development of TEM-visible damage in materials under irradiation at cryogenic temperatures cannot be explained using classical rate theory modeling with thermally activated reactions since at low temperatures thermal reaction rates are too low. Although point defect mobility approaches zero at low temperature, the thermal spikes induced by displacement cascades enable some atom mobility as it cools. In this work a model is developed to calculate "athermal" reaction rates from the atomic mobility within the irradiation-induced thermal spikes, including both displacement cascades and electronic stopping. The athermal reaction rates are added to a simple rate theory cluster dynamics model to allow for the simulation of microstructure evolution during irradiation at cryogenic temperatures. The rate theory model is applied to in-situ irradiation of ZrC and compares well at cryogenic temperatures. The results show that the addition of the thermal spike model makes it possible to rationalize microstructure evolution in the low temperature regime.

  7. Element size and other restrictions in finite-element modeling of reinforced concrete at elevated temperatures

    DEFF Research Database (Denmark)

    Carstensen, Josephine Voigt; Jomaas, Grunde; Pankaj, Pankaj

    2013-01-01

    One of the accepted approaches for postpeak finite-element modeling of RC comprises combining plain concrete, reinforcement, and interaction behaviors. In these, the postpeak strain-softening behavior of plain concrete is incorporated by the use of fracture energy concepts. This study attempts to extend this approach to RC at elevated temperatures. Prior to the extension, the approach is investigated for associated modeling issues and a set of limits of application is formulated. The available models of the behavior of plain concrete at elevated temperatures were used to derive the inherent fracture energy variation with temperature. It is found that the currently used tensile elevated-temperature model assumes that the fracture energy decays with temperature. The existing models in compression also show significant decay of fracture energy at higher temperatures (>400 °C) and a considerable …

  8. Effects of temperature on development, survival and reproduction of insects: Experimental design, data analysis and modeling

    Science.gov (United States)

    Jacques Regniere; James Powell; Barbara Bentz; Vincent Nealis

    2012-01-01

    The developmental response of insects to temperature is important in understanding the ecology of insect life histories. Temperature-dependent phenology models permit examination of the impacts of temperature on the geographical distributions, population dynamics and management of insects. The measurement of insect developmental, survival and reproductive responses to...
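A common building block of such phenology models is a temperature-dependent development-rate curve. The sketch below uses a simple piecewise-linear form with illustrative base, optimum and upper thresholds; the paper's own rate functions are more elaborate.

```python
def development_rate(temp_c, t_base=10.0, t_opt=30.0, t_max=38.0):
    """Normalised daily development rate (0..1): zero below a base
    threshold and above an upper lethal limit, rising linearly to the
    optimum and falling linearly beyond it.  Thresholds are placeholders,
    not species-specific values."""
    if temp_c <= t_base or temp_c >= t_max:
        return 0.0
    if temp_c <= t_opt:
        return (temp_c - t_base) / (t_opt - t_base)
    return (t_max - temp_c) / (t_max - t_opt)
```

Summing this rate over daily temperatures until it reaches 1.0 gives a stage-completion date, which is how such curves feed geographical-distribution and population-dynamics predictions.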

  9. Dynamical Symmetry Breaking of Maximally Generalized Yang-Mills Model and Its Restoration at Finite Temperatures

    International Nuclear Information System (INIS)

    Wang Dianfu

    2008-01-01

    In terms of the Nambu-Jona-Lasinio mechanism, dynamical breaking of gauge symmetry for the maximally generalized Yang-Mills model is investigated. The gauge symmetry behavior at finite temperature is also investigated and it is shown that the gauge symmetry broken dynamically at zero temperature can be restored at finite temperatures

  10. Performance of a Predictive Model for Calculating Ascent Time to a Target Temperature

    Directory of Open Access Journals (Sweden)

    Jin Woo Moon

    2016-12-01

    The aim of this study was to develop an artificial neural network (ANN) prediction model for controlling building heating systems. This model was used to calculate the ascent time of the indoor temperature from the setback period (when a building was not occupied) to a target setpoint temperature (when a building was occupied). The calculated ascent time was applied to determine the proper moment to start raising the temperature from the setback temperature so as to reach the target temperature at the appropriate time. Three major steps were conducted: (1) model development; (2) model optimization; and (3) performance evaluation. Two software programs—Matrix Laboratory (MATLAB) and Transient Systems Simulation (TRNSYS)—were used for model development, performance tests, and numerical simulation. Correlation analysis between the input variables and the output variable of the ANN model revealed that two input variables (the current indoor air temperature and the temperature difference from the target setpoint temperature) presented relatively strong relationships with the ascent time to the target setpoint temperature. These two variables were used as input neurons. Analyzing the difference between the simulated and predicted values from the ANN model provided the optimal number of hidden neurons (9), hidden layers (3), momentum (0.9), and learning rate (0.9). In conclusion, the optimized model proved its prediction accuracy with acceptable errors.
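The trained MATLAB network itself is not reproduced here, but the forward pass of a one-hidden-layer network with the study's two input neurons (current indoor temperature and the difference to the setpoint) can be sketched as follows; the weights are placeholders that training would determine.

```python
import math

def ann_ascent_time(indoor_temp, delta_to_setpoint, weights_h, bias_h,
                    weights_o, bias_o):
    """Forward pass of a minimal MLP: two inputs, one tanh hidden layer,
    one linear output (the predicted ascent time).  `weights_h` is a list
    of per-hidden-neuron weight pairs, `weights_o` the output weights."""
    x = (indoor_temp, delta_to_setpoint)
    hidden = [math.tanh(sum(w * xi for w, xi in zip(ws, x)) + b)
              for ws, b in zip(weights_h, bias_h)]
    return sum(w * h for w, h in zip(weights_o, hidden)) + bias_o
```

With a trained network, the heating controller would start the pre-heat at `occupancy_time - predicted_ascent_time`, which is the control decision the study describes.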

  11. A stream temperature model for the Peace-Athabasca River basin

    Science.gov (United States)

    Morales-Marin, L. A.; Rokaya, P.; Wheater, H. S.; Lindenschmidt, K. E.

    2017-12-01

    Water temperature plays a fundamental role in aquatic ecosystem functioning. Because it regulates flow energy and metabolic rates in organism productivity over a broad spectrum of space and time scales, water temperature constitutes an important indicator of aquatic ecosystem health. In cold-region basins, stream water temperature modelling is also fundamental for predicting ice freeze-up and break-up events in order to improve flood management. Multiple modelling approaches, such as linear and multivariable regression methods, neural networks and thermal energy budget models, have been developed and implemented to simulate stream water temperature. Most of these models have been applied to specific stream reaches and trained using observed data, but very little has been done to simulate water temperature in large catchment river networks. We present the coupling of RBM, a semi-Lagrangian water temperature model for advection-dominated river systems, and MESH, a semi-distributed hydrological model, to simulate stream water temperature in river catchments. The coupled models are implemented in the Peace-Athabasca River basin in order to analyze the variation in stream temperature regimes under changing hydrological and meteorological conditions. The uncertainty of the stream temperature simulations is also assessed in order to determine the degree of reliability of the estimates.

  12. A simple model for predicting soil temperature in snow-covered and seasonally frozen soil: model description and testing

    Directory of Open Access Journals (Sweden)

    K. Rankinen

    2004-01-01

    Microbial processes in soil are moisture, nutrient and temperature dependent and, consequently, accurate calculation of soil temperature is important for modelling nitrogen processes. Microbial activity in soil occurs even at sub-zero temperatures so that, in northern latitudes, a method to calculate soil temperature under snow cover and in frozen soils is required. This paper describes a new and simple model to calculate daily values for soil temperature at various depths in both frozen and unfrozen soils. The model requires four parameters: average soil thermal conductivity, specific heat capacity of soil, specific heat capacity due to freezing and thawing, and an empirical snow parameter. Precipitation, air temperature and snow depth (measured or calculated) are needed as input variables. The proposed model was applied to five sites in different parts of Finland representing different climates and soil types. Observed soil temperatures at depths of 20 and 50 cm (September 1981–August 1990) were used for model calibration. The calibrated model was then tested using observed soil temperatures from September 1990 to August 2001. R²-values for the calibration period varied between 0.87 and 0.96 at a depth of 20 cm and between 0.78 and 0.97 at 50 cm. R²-values for the testing period were between 0.87 and 0.94 at a depth of 20 cm, and between 0.80 and 0.98 at 50 cm. Thus, despite the simplifications made, the model was able to simulate soil temperature at these study sites. This simple model simulates soil temperature well in the uppermost soil layers, where most of the nitrogen processes occur. The small number of parameters required means that the model is suitable for addition to catchment-scale models. Keywords: soil temperature, snow model
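A minimal daily update in the spirit of this four-parameter model might look as follows: conduction relaxes the soil temperature toward the air temperature, extra apparent heat capacity slows change near freezing, and an empirical exponential factor damps the result under snow. All parameter values below are plausible placeholders, not the calibrated values for the Finnish sites.

```python
import math

def soil_temp_step(t_soil_prev, t_air, depth, snow_depth=0.0,
                   k_t=0.8, c_a=1.0e6, c_ice=4.0e6, f_s=-2.7, dt=86400.0):
    """One daily soil-temperature update at a given depth (m).
    k_t: thermal conductivity (W/m/K), c_a: volumetric heat capacity
    (J/m^3/K), c_ice: extra apparent capacity from freezing/thawing,
    f_s: empirical snow parameter (1/m), dt: time step (s)."""
    c_app = c_a + (c_ice if t_soil_prev < 0.0 else 0.0)  # freezing adds heat capacity
    t_new = t_soil_prev + dt * k_t / (c_app * (2.0 * depth) ** 2) \
        * (t_air - t_soil_prev)
    # snow cover damps the air-temperature influence toward 0 degC
    return t_new * math.exp(f_s * snow_depth) if snow_depth > 0.0 else t_new
```

Run day by day with air temperature and snow depth series, this reproduces the model's qualitative behaviour: deep or snow-covered soil responds weakly to cold air.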

  13. Elemental logging in the KTB Pilot Hole. Pt. 1

    International Nuclear Information System (INIS)

    Grau, J.A.; Schweitzer, J.S.; Draxler, J.K.; Gatto, H.; Lauterjung, J.

    1993-01-01

    Neutron-induced γ-ray spectrometry, of prompt capture and delayed activation, together with natural γ-ray measurements, provide a borehole elemental analysis to characterize rock matrix composition. Elemental concentrations from the prompt capture measurements are derived through the use of a closure model that was developed from data on rocks in a sedimentary environment. This set of spectrometers was used to log the 4000 m of the German Continental Deep Drilling Project (KTB) Pilot Hole. The model was tested, with a minor change, for suitability to the crystalline rock environment. Good overall agreement was found between the logging measurements and laboratory analyses performed on cuttings and cores. (Author)

  14. Nuclear-Thermal Analysis of Fully Ceramic Microencapsulated Fuel via Two-Temperature Homogenized Model

    International Nuclear Information System (INIS)

    Lee, Yoon Hee; Cho, Nam Zin

    2013-01-01

    The FCM fuel is based on a proven safety philosophy that has been utilized operationally in very high temperature reactors (VHTRs). However, the FCM fuel consists of TRISO particles randomly dispersed in a SiC matrix. This high heterogeneity in composition makes explicit thermal calculation of such a fuel difficult, so an appropriate homogenization model becomes essential. In this paper, we apply the two-temperature homogenized model, recently proposed to provide more realistic temperature profiles in VHTR fuel elements, to the thermal analysis of FCM fuel. The homogenized parameters were obtained by a particle-transport Monte Carlo calculation applied to the pellet region, which consists of many coated particles uniformly dispersed in a SiC matrix. Since this model gives realistic temperature profiles in the pellet (providing the fuel-kernel temperature and the SiC-matrix temperature distinctly), it can be used for more accurate neutronics evaluation, such as of the Doppler temperature feedback. Transient thermal calculations may also be performed more realistically with temperature-dependent homogenized parameters in various scenarios.

  15. Axial temperatures and fuel management models for a HTR system

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, U

    1971-11-12

    In the HTR system there is a large temperature difference between different parts of the reactor core. The softer neutron spectrum in the upper, colder core regions tends to shift the power production in fresh fuel upwards. As uranium-235 depletes and plutonium, with its higher cross sections in the lower, hot regions, is built up, an axial power flattening takes place. These effects have been studied in detail for a single column in an equilibrium environment. The aim of this paper is to relate these findings to a whole reactor core and to investigate the influence of axial temperatures on the overall performance and, in particular, on the fuel management scheme chosen for the reference design. A further objective has been to calculate the reactivity requirements for different part-load conditions and for various daily and weekly load diagrams. As the xenon cross section changes significantly with temperature, these investigations are performed for an equilibrium core with due representation of axial temperature zones.

  16. Climate change, global warming and coral reefs: modelling the effects of temperature.

    Science.gov (United States)

    Crabbe, M James C

    2008-10-01

    Climate change and global warming have severe consequences for the survival of scleractinian (reef-building) corals and their associated ecosystems. This review summarizes recent literature on the influence of temperature on coral growth, coral bleaching, and modelling the effects of high temperature on corals. Satellite-based sea surface temperature (SST) and coral bleaching information available on the internet are important tools for monitoring and modelling coral responses to temperature. Within the narrow temperature range for coral growth, corals can respond to the rate of temperature change as well as to temperature per se. We need to continue to develop models of how non-steady-state processes such as global warming and climate change will affect coral reefs.

  17. Development of a temperature-dependent cyclic plasticity constitutive model for SUS304 steel

    International Nuclear Information System (INIS)

    Takahashi, Yukio

    1990-01-01

    Development of an accurate inelastic constitutive model is required to improve the accuracy of inelastic analysis for structural components used in the inelastic region. Based on two fundamental assumptions derived from a physical interpretation of the temperature dependency of the plastic deformation behavior of type 304 stainless steel, a temperature-dependent cyclic plasticity constitutive model is constructed here. Particular emphasis is placed on modeling the enhanced hardening caused by the dynamic strain aging effect observed in some temperature regimes. The constants and functions involved in the model are determined from the deformation characteristics observed in low-cycle fatigue tests conducted at room temperature through 600 °C. Several comparisons of model predictions with experimental data show the effectiveness of the present model in non-isothermal as well as isothermal conditions between room temperature and 600 °C. (author)

  18. Mean atmospheric temperature model estimation for GNSS meteorology using AIRS and AMSU data

    Directory of Open Access Journals (Sweden)

    Rata Suwantong

    2017-03-01

    In this paper, the problem of modeling the relationship between the mean atmospheric and air surface temperatures is addressed. In particular, the major goal is to estimate the model parameters at a regional scale in Thailand. To formulate the relationship between the mean atmospheric and air surface temperatures, a triply modulated cosine function was adopted to model the surface temperature as a periodic function. The surface temperature was then converted to mean atmospheric temperature using a linear function. The parameters of the model were estimated using an extended Kalman filter. Traditionally, radiosonde data are used; in this paper, satellite data from the Atmospheric Infrared Sounder (AIRS) and Advanced Microwave Sounding Unit (AMSU) sensors were used instead, because they are openly available and give global coverage with high temporal resolution. The performance of the proposed model was tested against that of a global model via an accuracy assessment of the computed GNSS-derived PWV.
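As a sketch of the two model components (a periodic surface-temperature function and a linear surface-to-mean map), the snippet below freezes the modulation terms of the cosine at illustrative constants and uses the familiar global coefficients Tm = 70.2 + 0.72·Ts from Bevis et al. for the linear map; the paper's point is precisely that regionally estimated coefficients would differ from these.

```python
import math

def surface_temp(day, t_mean=298.0, amplitude=3.0, phase=0.3):
    """Periodic (cosine) model of the daily surface temperature in
    Kelvin.  In the 'triply modulated' form, t_mean, amplitude and
    phase themselves vary slowly; here they are frozen for illustration."""
    return t_mean + amplitude * math.cos(2.0 * math.pi * day / 365.25 - phase)

def mean_atm_temp(ts_kelvin, a=70.2, b=0.72):
    """Linear surface-to-mean atmospheric temperature map Tm = a + b*Ts.
    Defaults are the well-known global (Bevis-style) values; regional
    estimation, as in the paper, would replace them."""
    return a + b * ts_kelvin
```

The mean atmospheric temperature Tm produced this way is what converts a GNSS zenith wet delay into precipitable water vapour (PWV), which is why its accuracy matters for GNSS meteorology.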

  19. Temperature modulation with an esophageal heat transfer device- a pediatric swine model study

    OpenAIRE

    Kulstad, Erik B; Naiman, Melissa; Shanley, Patrick; Garrett, Frank; Haryu, Todd; Waller, Donald; Azarafrooz, Farshid; Courtney, Daniel Mark

    2015-01-01

    Background: An increasing number of conditions appear to benefit from control and modulation of temperature, but available techniques to control temperature often have limitations, particularly in smaller patients with high surface-to-mass ratios. We aimed to evaluate a new method of temperature modulation with an esophageal heat transfer device in a pediatric swine model, hypothesizing that clinically significant modulation in temperature (both increases and decreases of more than 1 °C) would ...

  20. Modelling for Temperature Non-Isothermal Continuous Stirred Tank Reactor Using Fuzzy Logic

    OpenAIRE

    Nasser Mohamed Ramli; Mohamad Syafiq Mohamad

    2017-01-01

    Many types of controllers have been applied to the continuous stirred tank reactor (CSTR) unit to control temperature. In this research paper, a Proportional-Integral-Derivative (PID) controller is compared with a fuzzy logic controller for temperature control of a CSTR. The control system for the temperature of a non-isothermal CSTR should produce a stable response curve to its setpoint temperature. A mathematical model of a CSTR under the most general operating conditions was developed through a set of...
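The PID baseline the study compares against can be sketched as a standard discrete controller; the gains below are illustrative, not the paper's tuned values.

```python
class PID:
    """Discrete PID controller in textbook form: proportional term on the
    current error, integral via a running sum, derivative via a backward
    difference.  dt is the sample period in seconds."""

    def __init__(self, kp=2.0, ki=0.1, kd=0.5, dt=1.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def step(self, setpoint, measured):
        """Return the control output (e.g. coolant valve command) for one
        sample of the reactor temperature."""
        error = setpoint - measured
        self.integral += error * self.dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv
```

A fuzzy logic controller replaces this fixed linear law with rule-based mappings from (error, error rate) to the control action, which is the comparison the study performs.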