WorldWideScience

Sample records for selective refinement approach

  1. Refining processes of selected copper alloys

    Directory of Open Access Journals (Sweden)

    S. Rzadkosz

    2009-04-01

    Full Text Available The analysis of the refining effectiveness of liquid copper and selected copper alloys treated with various micro-additions and special refining substances was performed. The influence of purifying, modifying and deoxidation operations performed in the metal bath on the properties of certain selected copper-matrix alloys was examined. Refining substances, protective-purifying slags, and deoxidizing and modifying substances containing micro-additions of elements such as zirconium, boron, phosphorus, sodium and lithium, or their compounds, introduced in order to change the microstructures and properties of the alloys, were applied in the examinations. Special attention was directed to the macro- and microstructures of the alloys, their tensile strength and elongation, and their sensitivity to hot cracking. Refining effects were estimated by comparing the effectiveness of microstructure changes with property changes of copper and its selected alloys from the group of tin bronzes.

  2. Repetitive Identification of Structural Systems Using a Nonlinear Model Parameter Refinement Approach

    Directory of Open Access Journals (Sweden)

    Jeng-Wen Lin

    2009-01-01

    Full Text Available This paper proposes a statistical confidence interval based nonlinear model parameter refinement approach for the health monitoring of structural systems subjected to seismic excitations. The developed model refinement approach uses the 95% confidence intervals of the estimated structural parameters to determine their statistical significance in a least-squares regression setting. When a parameter's confidence interval covers zero, it is statistically justifiable to truncate that parameter. The remaining parameters repeatedly undergo this parameter-sifting process for model refinement until the statistical significance of the parameters cannot be further improved. This newly developed model refinement approach is implemented for the series models of multivariable polynomial expansions: the linear, Taylor series, and power series models, leading to more accurate identification as well as a more controllable design for system vibration control. Because the statistical-regression-based model refinement approach is intrinsically used to process a “batch” of data and obtain an ensemble-average estimate of quantities such as the structural stiffness, the Kalman filter and one of its extended versions are introduced into the refined power series model for structural health monitoring.
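The confidence-interval-based parameter sifting described in this record can be sketched in a few lines: fit by least squares, drop every parameter whose 95% interval covers zero, and repeat until all surviving parameters are significant. The implementation below is a hypothetical illustration (the function and variable names are ours, not the paper's):

```python
import numpy as np

def sift_parameters(X, y, z=1.96, max_iter=50):
    """Iteratively drop least-squares parameters whose 95% confidence
    interval covers zero (a simplified sketch of the sifting idea)."""
    keep = np.arange(X.shape[1])        # indices of surviving parameters
    theta = np.zeros(0)
    for _ in range(max_iter):
        Xk = X[:, keep]
        theta, *_ = np.linalg.lstsq(Xk, y, rcond=None)
        dof = len(y) - len(keep)
        sigma2 = np.sum((y - Xk @ theta) ** 2) / dof   # residual variance
        cov = sigma2 * np.linalg.inv(Xk.T @ Xk)        # parameter covariance
        half_width = z * np.sqrt(np.diag(cov))
        significant = np.abs(theta) > half_width       # CI excludes zero
        if significant.all():
            break
        keep = keep[significant]
    return keep, theta
```

Each pass refits the reduced model, so the confidence intervals are recomputed after every truncation, mirroring the repetitive sifting the abstract describes.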

  3. A refined approach: Saudi Arabia moves beyond crude

    International Nuclear Information System (INIS)

    Krane, Jim

    2015-01-01

    Saudi Arabia's role in global energy markets is changing. The kingdom is reshaping itself as a supplier of refined petroleum products while moving beyond its long-held role as a simple exporter of crude oil. This change is commensurate with the typical development trajectory of a state progressing to a more advanced stage of global economic integration. Gains from increased refining include reducing fuel imports and capturing margins now bequeathed to competitors. Refining also allows the kingdom to export its heavy crude oil to a wider array of customers, beyond select importers configured to handle heavy crudes. However, the move also presents strategic complications. The world's 'swing supplier' of oil may grow less willing or able to adjust supply to suit market demands. In the process, Saudi Arabia may have to update the old “oil for security” relationship that links it with Washington, augmenting it with a more diverse set of economic and investment ties with individual companies and countries, including China. -- Highlights: •Saudi Arabia is diverting crude oil into an expanding refining sector. •In doing so, the kingdom is moving beyond its role as global “swing supplier” of crude oil. •The kingdom will benefit from increased refining, including enhanced demand for heavy crude. •Strategic complications may force it to seek security partners beyond Washington

  4. Refining mass formulas for astrophysical applications: A Bayesian neural network approach

    Science.gov (United States)

    Utama, R.; Piekarewicz, J.

    2017-10-01

    Background: Exotic nuclei, particularly those near the drip lines, are at the core of one of the fundamental questions driving nuclear structure and astrophysics today: What are the limits of nuclear binding? Exotic nuclei play a critical role both in informing theoretical models and in our understanding of the origin of the heavy elements. Purpose: Our aim is to refine existing mass models through the training of an artificial neural network that will mitigate the large model discrepancies far away from stability. Methods: The basic paradigm of our two-pronged approach is an existing mass model that captures as much as possible of the underlying physics, followed by the implementation of a Bayesian neural network (BNN) refinement to account for the missing physics. Bayesian inference is employed to determine the parameters of the neural network so that model predictions may be accompanied by theoretical uncertainties. Results: Despite the undeniable quality of the mass models adopted in this work, we observe a significant improvement (of about 40%) after the BNN refinement is implemented. Indeed, in the specific case of the Duflo-Zuker mass formula, we find that the rms deviation relative to experiment is reduced from σrms=0.503 MeV to σrms=0.286 MeV. These newly refined mass tables are used to map the neutron drip lines (or rather "drip bands") and to study a few critical r-process nuclei. Conclusions: The BNN approach is highly successful in refining the predictions of existing mass models. In particular, the large discrepancy displayed by the original "bare" models in regions where experimental data are unavailable is considerably quenched after the BNN refinement. This lends credence to our approach and has motivated us to publish refined mass tables that we trust will be helpful for future astrophysical applications.
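The two-pronged paradigm (a physics-motivated base model plus a statistical correction trained on its residuals) can be illustrated with a toy stand-in. The sketch below substitutes a synthetic "experiment" and an ordinary polynomial fit for the paper's mass models and Bayesian neural network; every function, name and number here is illustrative, not from the paper:

```python
import numpy as np
from numpy.polynomial import Polynomial

def base_model(x):
    """Crude "bare" model standing in for a physics-based mass formula."""
    return 0.5 * x

def experiment(x):
    """Synthetic data containing physics the base model misses."""
    return 0.5 * x + 0.2 * np.sin(x)

# Step 1: compute the residuals of the bare model on the training set.
x_train = np.linspace(0.0, 10.0, 50)
residuals = experiment(x_train) - base_model(x_train)

# Step 2: learn the residual (the "missing physics"); a well-conditioned
# polynomial fit here plays the role of the paper's BNN.
correction = Polynomial.fit(x_train, residuals, deg=9)

def refined_model(x):
    """Refined prediction: bare model plus learned correction."""
    return base_model(x) + correction(x)
```

The refinement only ever learns the discrepancy, so wherever the base model is already accurate the correction stays small, which is the behavior the abstract reports for the Duflo-Zuker formula.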

  5. Spatially adaptive hp refinement approach for PN neutron transport equation using spectral element method

    International Nuclear Information System (INIS)

    Nahavandi, N.; Minuchehr, A.; Zolfaghari, A.; Abbasi, M.

    2015-01-01

    Highlights: • A powerful hp-SEM refinement approach for the PN neutron transport equation is presented. • The method provides great geometrical flexibility at lower computational cost. • Arbitrary high-order and non-uniform meshes can be used. • Both a posteriori and a priori local error estimation approaches are employed. • Highly accurate results are compared against other common adaptive and uniform grids. - Abstract: In this work we present an adaptive hp-SEM approach obtained from the incorporation of the Spectral Element Method (SEM) and adaptive hp refinement. The SEM nodal discretization and hp-adaptive grid refinement for the even-parity Boltzmann neutron transport equation create a powerful grid-refinement approach with highly accurate solutions. In this regard, a computer code has been developed to solve the multi-group neutron transport equation in one-dimensional geometry using even-parity transport theory. The spatial dependence of the flux is represented via the SEM with Lobatto orthogonal polynomials. Two common error estimation approaches, a posteriori and a priori, have been implemented. The incorporation of SEM nodal discretization and adaptive hp grid refinement leads to highly accurate solutions. The efficiency of coarser meshes and the significant reduction of program runtime, in comparison with other common refinement methods and uniform meshing approaches, are tested along several well-known transport benchmarks

  6. Risk as economic category: systematics scientific approach and refinement contents

    OpenAIRE

    V.G. Vygovskyy

    2015-01-01

    The paper studies the categorical-conceptual apparatus of risk and its refinement based on a critical analysis of existing systematic scientific approaches. It determines that refining the economic nature of risk raises a number of controversial issues: the objective or subjective nature of risk; the matching of concepts such as «risk», «danger», «loss» and «probability of loss»; the definition of negative or positive consequences of risk; and the identification of risk with its conse...

  7. Refining a Tool for the Selection of Experts in Educational Research

    Directory of Open Access Journals (Sweden)

    Miguel Cruz Ramírez

    2012-11-01

    Full Text Available In this paper we report a research study geared toward refining an empirical instrument for the selection of experts for educational research, according to its reliability and internal consistency. To this end we used a three-round Delphi technique and subjected the results to a factor analysis. Latent variables were determined that explain the nature of the sources of argumentation necessary for ensuring an adequate level of competence on the part of the experts.

  8. An adaptive mesh refinement approach for average current nodal expansion method in 2-D rectangular geometry

    International Nuclear Information System (INIS)

    Poursalehi, N.; Zolfaghari, A.; Minuchehr, A.

    2013-01-01

    Highlights: ► A new adaptive h-refinement approach has been developed for a class of nodal methods. ► The resulting system of nodal equations is more amenable to efficient numerical solution. ► The approach reduces computational effort relative to uniform fine-mesh modeling. ► The spatially adaptive approach greatly enhances the accuracy of the solution. - Abstract: The aim of this work is to develop a spatially adaptive coarse-mesh strategy that progressively refines the nodes in appropriate regions of the domain to solve the neutron balance equation by the zeroth-order nodal expansion method. A flux-gradient-based a posteriori error estimation scheme has been utilized for checking the approximate solutions for various nodes. The relative surface net leakage of nodes has been considered as the assessment criterion. In this approach, the core module is called by the adaptive mesh generator to determine the gradients of node-surface fluxes and explore the possibility of node refinement in appropriate regions and directions of the problem. The benefit of the approach is reduced computational effort relative to uniform fine-mesh modeling. For this purpose, a computer program, ANRNE-2D (Adaptive Node Refinement Nodal Expansion), has been developed to solve the neutron diffusion equation using the average current nodal expansion method for 2D rectangular geometries. Implementing the adaptive algorithm confirms its superiority in enhancing the accuracy of the solution without using fine nodes throughout the domain or increasing the number of unknowns. Some well-known benchmarks have been investigated and improvements are reported
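The gradient-driven refinement loop common to several of the adaptive-refinement records in this listing can be illustrated in one dimension: evaluate a local error indicator on each cell, bisect the cells that exceed a tolerance, and repeat. The sketch below uses the jump of a known function across each cell as a stand-in error indicator; it is a toy analogue of the idea, not the ANRNE-2D code:

```python
import numpy as np

def adaptive_refine(f, a, b, n0=8, tol=0.05, max_iter=12):
    """Gradient-driven h-refinement sketch: bisect any cell whose
    solution jump |f(x_{i+1}) - f(x_i)| exceeds `tol`. A 1-D toy
    analogue of a surface-leakage/gradient criterion."""
    x = np.linspace(a, b, n0 + 1)          # initial uniform grid
    for _ in range(max_iter):
        jumps = np.abs(np.diff(f(x)))      # per-cell error indicator
        flagged = np.where(jumps > tol)[0]
        if flagged.size == 0:              # all cells within tolerance
            break
        mids = 0.5 * (x[flagged] + x[flagged + 1])   # bisect flagged cells
        x = np.sort(np.concatenate([x, mids]))
    return x
```

Run on a steep front such as tanh(20(x - 0.5)), the loop packs fine cells around the front while leaving the smooth regions on the original coarse grid, which is precisely the cost saving these abstracts claim over uniform fine meshing.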

  9. Use of nutrient self selection as a diet refining tool in Tenebrio molitor (Coleoptera: Tenebrionidae)

    Science.gov (United States)

    A new method to refine existing dietary supplements for improving production of the yellow mealworm, Tenebrio molitor L. (Coleoptera: Tenebrionidae), was tested. Self selected ratios of 6 dietary ingredients by T. molitor larvae were used to produce a dietary supplement. This supplement was compared...

  10. Refinement by interface instantiation

    DEFF Research Database (Denmark)

    Hallerstede, Stefan; Hoang, Thai Son

    2012-01-01

    be easily refined. Our first contribution hence is a proposal for a new construct called interface that encapsulates the external variables, along with a mechanism for interface instantiation. Using the new construct and mechanism, external variables can be refined consistently. Our second contribution...... is an approach for verifying the correctness of Event-B extensions using the supporting Rodin tool. We illustrate our approach by proving the correctness of interface instantiation....

  11. An Ensemble-Based Training Data Refinement for Automatic Crop Discrimination Using WorldView-2 Imagery

    DEFF Research Database (Denmark)

    Chellasamy, Menaka; Ferre, Ty Paul; Greve, Mogens Humlekrog

    2015-01-01

    This paper presents a new approach for refining and selecting training data for satellite imagery-based crop discrimination. The goal of this approach is to automate the pixel-based “multievidence crop classification approach,” proposed by the authors in their previous research. The present study...

  12. A spatially adaptive grid-refinement approach for the finite element solution of the even-parity Boltzmann transport equation

    International Nuclear Information System (INIS)

    Mirza, Anwar M.; Iqbal, Shaukat; Rahman, Faizur

    2007-01-01

    A spatially adaptive grid-refinement approach has been investigated to solve the even-parity Boltzmann transport equation. A residual based a posteriori error estimation scheme has been utilized for checking the approximate solutions for various finite element grids. The local particle balance has been considered as an error assessment criterion. To implement the adaptive approach, a computer program ADAFENT (adaptive finite elements for neutron transport) has been developed to solve the second order even-parity Boltzmann transport equation using the K⁺ variational principle for slab geometry. The program has a core K⁺ module which employs Lagrange polynomials as spatial basis functions for the finite element formulation and Legendre polynomials for the directional dependence of the solution. The core module is called in by the adaptive grid generator to determine local gradients and residuals to explore the possibility of grid refinements in appropriate regions of the problem. The a posteriori error estimation scheme has been implemented in the outer grid refining iteration module. Numerical experiments indicate that local errors are large in regions where the flux gradients are large. A comparison of the spatially adaptive grid-refinement approach with that of uniform meshing approach for various benchmark cases confirms its superiority in greatly enhancing the accuracy of the solution without increasing the number of unknown coefficients. A reduction in the local errors of the order of 10² has been achieved using the new approach in some cases

  13. A spatially adaptive grid-refinement approach for the finite element solution of the even-parity Boltzmann transport equation

    Energy Technology Data Exchange (ETDEWEB)

    Mirza, Anwar M. [Department of Computer Science, National University of Computer and Emerging Sciences, NUCES-FAST, A.K. Brohi Road, H-11, Islamabad (Pakistan)], E-mail: anwar.m.mirza@gmail.com; Iqbal, Shaukat [Faculty of Computer Science and Engineering, Ghulam Ishaq Khan (GIK) Institute of Engineering Science and Technology, Topi-23460, Swabi (Pakistan)], E-mail: shaukat@giki.edu.pk; Rahman, Faizur [Department of Physics, Allama Iqbal Open University, H-8 Islamabad (Pakistan)

    2007-07-15

    A spatially adaptive grid-refinement approach has been investigated to solve the even-parity Boltzmann transport equation. A residual based a posteriori error estimation scheme has been utilized for checking the approximate solutions for various finite element grids. The local particle balance has been considered as an error assessment criterion. To implement the adaptive approach, a computer program ADAFENT (adaptive finite elements for neutron transport) has been developed to solve the second order even-parity Boltzmann transport equation using the K⁺ variational principle for slab geometry. The program has a core K⁺ module which employs Lagrange polynomials as spatial basis functions for the finite element formulation and Legendre polynomials for the directional dependence of the solution. The core module is called in by the adaptive grid generator to determine local gradients and residuals to explore the possibility of grid refinements in appropriate regions of the problem. The a posteriori error estimation scheme has been implemented in the outer grid refining iteration module. Numerical experiments indicate that local errors are large in regions where the flux gradients are large. A comparison of the spatially adaptive grid-refinement approach with that of uniform meshing approach for various benchmark cases confirms its superiority in greatly enhancing the accuracy of the solution without increasing the number of unknown coefficients. A reduction in the local errors of the order of 10² has been achieved using the new approach in some cases.

  14. Using toxicokinetic-toxicodynamic modeling as an acute risk assessment refinement approach in vertebrate ecological risk assessment.

    Science.gov (United States)

    Ducrot, Virginie; Ashauer, Roman; Bednarska, Agnieszka J; Hinarejos, Silvia; Thorbek, Pernille; Weyman, Gabriel

    2016-01-01

    Recent guidance identified toxicokinetic-toxicodynamic (TK-TD) modeling as a relevant approach for risk assessment refinement. Yet, its added value compared to other refinement options is not detailed, and how to conduct the modeling appropriately is not explained. This case study addresses these issues through 2 examples of individual-level risk assessment for 2 hypothetical plant protection products: 1) evaluating the risk for small granivorous birds and small omnivorous mammals of a single application, as a seed treatment in winter cereals, and 2) evaluating the risk for fish after a pulsed treatment in the edge-of-field zone. Using acute test data, we conducted the first tier risk assessment as defined in the European Food Safety Authority (EFSA) guidance. When first tier risk assessment highlighted a concern, refinement options were discussed. Cases where the use of models should be preferred over other existing refinement approaches were highlighted. We then practically conducted the risk assessment refinement by using 2 different models as examples. In example 1, a TK model accounting for toxicokinetics and relevant feeding patterns in the skylark and in the wood mouse was used to predict internal doses of the hypothetical active ingredient in individuals, based on relevant feeding patterns in an in-crop situation, and identify the residue levels leading to mortality. In example 2, a TK-TD model accounting for toxicokinetics, toxicodynamics, and relevant exposure patterns in the fathead minnow was used to predict the time-course of fish survival for relevant FOCUS SW exposure scenarios and identify which scenarios might lead to mortality. Models were calibrated using available standard data and implemented to simulate the time-course of internal dose of active ingredient or survival for different exposure scenarios. Simulation results were discussed and used to derive the risk assessment refinement endpoints used for decision. Finally, we compared the

  15. Risk as economic category: systematics scientific approach and refinement contents

    Directory of Open Access Journals (Sweden)

    V.G. Vygovskyy

    2015-03-01

    Full Text Available The paper studies the categorical-conceptual apparatus of risk and its refinement based on a critical analysis of existing systematic scientific approaches. It determines that refining the economic nature of risk raises a number of controversial issues: the objective or subjective nature of risk; the matching of concepts such as «risk», «danger», «loss» and «probability of loss»; the definition of negative or positive consequences of risk; and the identification of risk with its consequences or with its source of origin, all of which make the research topic relevant. As a result of the research, the interpretation of risk as an economic category is refined: a characteristic of the company associated with the probability of unforeseen situations that may lead to negative or positive impacts, the assessment of which requires developing alternatives for management decisions. The clarified definition focuses on the possibility (probability) of favorable or unfavorable events that require corrective action by the enterprise's management. The author emphasizes the mandatory features of the category of «risk», in particular: risk is always associated with the uncertainty of the future; an occurring event has implications for the enterprise (both negative and positive); the consequences necessitate the development of a number of alternative solutions to eliminate possible negative outcomes of risky events; risk is a mandatory attribute of modern management (its importance is enhanced under market conditions); and risk is subject to assessment and management by the company. These refined features contribute to clarifying the nature of economic risk and the categorical-conceptual apparatus of risk management.

  16. A Novel Approach to Veterinary Spatial Epidemiology: Dasymetric Refinement of the Swiss Dog Tumor Registry Data

    Science.gov (United States)

    Boo, G.; Fabrikant, S. I.; Leyk, S.

    2015-08-01

    In spatial epidemiology, disease incidence and demographic data are commonly summarized within larger regions such as administrative units because of privacy concerns. As a consequence, analyses using these aggregated data are subject to the Modifiable Areal Unit Problem (MAUP) as the geographical manifestation of ecological fallacy. In this study, we create small area disease estimates through dasymetric refinement, and investigate the effects on predictive epidemiological models. We perform a binary dasymetric refinement of municipality-aggregated dog tumor incidence counts in Switzerland for the year 2008 using residential land as a limiting ancillary variable. This refinement is expected to improve the quality of spatial data originally aggregated within arbitrary administrative units by deconstructing them into discontinuous subregions that better reflect the underlying population distribution. To shed light on effects of this refinement, we compare a predictive statistical model that uses unrefined administrative units with one that uses dasymetrically refined spatial units. Model diagnostics and spatial distributions of model residuals are assessed to evaluate the model performances in different regions. In particular, we explore changes in the spatial autocorrelation of the model residuals due to spatial refinement of the enumeration units in a selected mountainous region, where the rugged topography induces great shifts of the analytical units i.e., residential land. Such spatial data quality refinement results in a more realistic estimation of the population distribution within administrative units, and thus, in a more accurate modeling of dog tumor incidence patterns. Our results emphasize the benefits of implementing a dasymetric modeling framework in veterinary spatial epidemiology.
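Binary dasymetric refinement itself is simple to state: each administrative unit's aggregate count is redistributed over only those grid cells classified as residential land. A minimal sketch with a synthetic raster, assuming uniform redistribution within each unit (the study's actual workflow is richer than this):

```python
import numpy as np

def dasymetric_refine(counts, unit_id, residential):
    """Binary dasymetric redistribution sketch: spread each unit's
    aggregate count uniformly over its residential cells only.
    `counts[u]` is the count for unit u, `unit_id` labels each grid
    cell with its unit, `residential` is a boolean mask."""
    out = np.zeros_like(unit_id, dtype=float)
    for u, total in counts.items():
        cells = (unit_id == u) & residential   # residential cells of unit u
        if cells.any():
            out[cells] = total / cells.sum()   # uniform share per cell
    return out
```

Non-residential cells receive zero, so the refined surface follows the limiting ancillary variable rather than the arbitrary administrative boundaries, which is the point of the refinement.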

  17. A refinement methodology for object-oriented programs

    OpenAIRE

    Tafat , Asma; Boulmé , Sylvain; Marché , Claude

    2010-01-01

    Refinement is a well-known approach for developing correct-by-construction software. It has been very successful for producing high-quality code, e.g., as implemented in the B tool. Yet, such refinement techniques are restricted in the sense that they forbid aliasing (and more generally sharing of data structures), which often happens in usual programming languages. We propose a sound approach for refinement in the presence of aliases. Suitable abstractions of programs are d...

  18. Modulation wave approach to the structural parameterization and Rietveld refinement of low carnegieite

    International Nuclear Information System (INIS)

    Withers, R.L.; Thompson, J.G.

    1993-01-01

    The crystal structure of low carnegieite, NaAlSiO₄ [Mr = 142.05, orthorhombic, Pb2₁a, a = 10.261(1), b = 14.030(2), c = 5.1566(6) Å, Dx = 2.542 g cm⁻³, Z = 4, Cu Kα₁, λ = 1.5406 Å, μ = 77.52 cm⁻¹, F(000) = 559.85], is determined via Rietveld refinement from powder data, Rp = 0.057, Rwp = 0.076, RBragg = 0.050. Given that there are far too many parameters to be determined via unconstrained Rietveld refinement, a group-theoretical or modulation-wave approach is used to parameterize the structural deviation of low carnegieite from its underlying C9 aristotype. Appropriate crystal-chemical constraints are applied in order to provide two distinct plausible starting models for the structure of the aluminosilicate framework. The correct starting model for the aluminosilicate framework, as well as the ordering and positions of the non-framework Na atoms, is then determined via Rietveld refinement. At all stages, chemical plausibility is checked via the bond-length-bond-valence formalism. The JCPDS file number for low carnegieite is 44-1496. (orig.)

  19. Innovation During the Supplier Selection Process

    DEFF Research Database (Denmark)

    Pilkington, Alan; Pedraza, Isabel

    2014-01-01

    Established ideas on supplier selection have not moved much from the original premise of how to choose between bidders. Whilst we have added many different tools and refinements to choose between alternative suppliers, its nature has not evolved. We move the original selection process approach...... observed through an ethnographic embedded researcher study has refined the selection process and has two selection stages one for first supply covering tool/process developed and another later for resupply of mature parts. We report the details of the process, those involved, the criteria employed...... and identify benefits and weaknesses of this enhanced selection process....

  20. Grain refinement of zinc-aluminium alloys

    International Nuclear Information System (INIS)

    Zaid, A.I.O.

    2006-01-01

    It is now well established that the structure of zinc-aluminum die casting alloys can be modified by binary Al-Ti or ternary Al-Ti-B master alloys. In this paper, grain refinement of zinc-aluminum alloys by rare-earth materials is reviewed and discussed. The importance of grain refining of these alloys and the parameters affecting it are presented and discussed. These include parameters related to the Zn-Al alloy cast, parameters related to the grain-refining elements or alloys, and parameters related to the process. The effect of the addition of other alloying elements, e.g. Zr, either alone or in the presence of the main grain refiners Ti or Ti + B, on the grain-refining efficiency is also reviewed and discussed. Furthermore, based on grain refinement and the parameters affecting it, a criterion for selecting the optimum grain refiner is suggested. Finally, recent research work on the effect of grain refiners on the mechanical behaviour, impact strength, wear resistance and fatigue life of these alloys is presented and discussed. (author)

  1. A Novel Admixture-Based Pharmacogenetic Approach to Refine Warfarin Dosing in Caribbean Hispanics

    Science.gov (United States)

    Claudio-Campos, Karla; Rivera-Miranda, Giselle; Bermúdez-Bosch, Luis; Renta, Jessicca Y.; Cadilla, Carmen L.; Cruz, Iadelisse; Feliu, Juan F.; Vergara, Cunegundo; Ruaño, Gualberto

    2016-01-01

    Aim This study is aimed at developing a novel admixture-adjusted pharmacogenomic approach to individually refine warfarin dosing in Caribbean Hispanic patients. Patients & Methods A multiple linear regression analysis of effective warfarin doses versus relevant genotypes, admixture, clinical and demographic factors was performed in 255 patients and further validated externally in another cohort of 55 individuals. Results The admixture-adjusted, genotype-guided warfarin dosing refinement algorithm developed in Caribbean Hispanics showed better predictability (R2 = 0.70, MAE = 0.72 mg/day) than a clinical algorithm that excluded genotypes and admixture (R2 = 0.60, MAE = 0.99 mg/day), and outperformed two prior pharmacogenetic algorithms in predicting effective dose in this population. For patients at the highest risk of adverse events, 45.5% of the dose predictions using the developed pharmacogenetic model resulted in ideal dose as compared with only 29% when using the clinical non-genetic algorithm (pwarfarin dose variance when externally validated in 55 individuals from an independent validation cohort (MAE = 0.89 mg/day, 24% mean bias). Conclusions Results supported our rationale to incorporate individual’s genotypes and unique admixture metrics into pharmacogenetic refinement models in order to increase predictability when expanding them to admixed populations like Caribbean Hispanics. Trial Registration ClinicalTrials.gov NCT01318057 PMID:26745506

  2. Refinement of RAIM via Implementation of Implicit Euler Method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yoonhee; Kim, Han-Chul [Korea Institute of Nuclear and Safety, Daejeon (Korea, Republic of)

    2016-10-15

    The first approach is a mechanistic approach, used in LIRIC, in which more than 200 reactions are modeled in detail. This approach enables detailed analysis but requires a huge computational burden. The other approach is a simplified-model approach, used in IMOD, ASTEC/IODE, etc. Recently, KINS has developed RAIM (Radio-Active Iodine chemistry Model) based on the simplified-model approach. Since the numerical analysis module in RAIM is based on the explicit Euler method, there are major issues with the stability of the module. Therefore, implementation of a stable numerical method becomes essential. In this study, RAIM is refined via implementation of the implicit Euler method, in which Newton's method is used to find the solution at each time step. The refined RAIM is tested by comparison against RAIM based on the explicit Euler method. In this paper, RAIM was refined by implementing the implicit Euler method. At each time step, the reaction kinetics equations are solved by Newton's method, in which the elements of the Jacobian matrix are expressed analytically. Using the results of the OECD-BIP P10T2 test, the refined RAIM was compared to RAIM with the explicit Euler method. The refined RAIM shows better agreement with the experimental data than the explicit Euler method. For the rapid change of pH during the experiment, the refined RAIM gives more realistic changes in the concentrations of chemical species than the explicit Euler method. In addition, the refined RAIM shows computing time comparable to that of the explicit Euler method. This is attributed to the ~10 times larger time step size usable with the implicit Euler method, even though the computational burden at each time step in the refined RAIM is much higher than in the explicit Euler method. Compared to the experimental data, the refined RAIM still shows discrepancies, which are attributed

  3. Refinement of RAIM via Implementation of Implicit Euler Method

    International Nuclear Information System (INIS)

    Lee, Yoonhee; Kim, Han-Chul

    2016-01-01

    The first approach is a mechanistic approach, used in LIRIC, in which more than 200 reactions are modeled in detail. This approach enables detailed analysis but requires a huge computational burden. The other approach is a simplified-model approach, used in IMOD, ASTEC/IODE, etc. Recently, KINS has developed RAIM (Radio-Active Iodine chemistry Model) based on the simplified-model approach. Since the numerical analysis module in RAIM is based on the explicit Euler method, there are major issues with the stability of the module. Therefore, implementation of a stable numerical method becomes essential. In this study, RAIM is refined via implementation of the implicit Euler method, in which Newton's method is used to find the solution at each time step. The refined RAIM is tested by comparison against RAIM based on the explicit Euler method. In this paper, RAIM was refined by implementing the implicit Euler method. At each time step, the reaction kinetics equations are solved by Newton's method, in which the elements of the Jacobian matrix are expressed analytically. Using the results of the OECD-BIP P10T2 test, the refined RAIM was compared to RAIM with the explicit Euler method. The refined RAIM shows better agreement with the experimental data than the explicit Euler method. For the rapid change of pH during the experiment, the refined RAIM gives more realistic changes in the concentrations of chemical species than the explicit Euler method. In addition, the refined RAIM shows computing time comparable to that of the explicit Euler method. This is attributed to the ~10 times larger time step size usable with the implicit Euler method, even though the computational burden at each time step in the refined RAIM is much higher than in the explicit Euler method. Compared to the experimental data, the refined RAIM still shows discrepancies, which are attributed
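The core numerical change described in the RAIM records, replacing the explicit Euler update with an implicit one solved by Newton's method using an analytic Jacobian, can be sketched on a scalar stiff decay problem. The function names and test equation below are ours, not RAIM's kinetics:

```python
def implicit_euler(f, dfdy, y0, t0, t1, h):
    """Implicit (backward) Euler with a Newton solve per step:
    find y_{n+1} such that g(y) = y - y_n - h*f(y) = 0, using the
    analytic Jacobian dg/dy = 1 - h*dfdy(y). Scalar toy version."""
    t, y = t0, y0
    while t < t1 - 1e-12:
        y_new = y                          # Newton initial guess
        for _ in range(20):
            g = y_new - y - h * f(y_new)   # residual of the implicit step
            dg = 1.0 - h * dfdy(y_new)     # analytic Jacobian of g
            step = g / dg
            y_new -= step
            if abs(step) < 1e-12:
                break
        y, t = y_new, t + h
    return y

# Stiff linear decay dy/dt = -50 y, y(0) = 1: implicit Euler stays
# stable even with h = 0.1, where the explicit update y *= (1 - 50 h)
# oscillates and blows up.
k, h = 50.0, 0.1
y_end = implicit_euler(lambda y: -k * y, lambda y: -k, 1.0, 0.0, 1.0, h)
```

The stability at a ~10× larger step size, at the price of a Newton iteration per step, is exactly the trade-off the abstract reports for the refined RAIM.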

  4. Refining Automatically Extracted Knowledge Bases Using Crowdsourcing.

    Science.gov (United States)

    Li, Chunhua; Zhao, Pengpeng; Sheng, Victor S; Xian, Xuefeng; Wu, Jian; Cui, Zhiming

    2017-01-01

    Machine-constructed knowledge bases often contain noisy and inaccurate facts. There exists significant work in developing automated algorithms for knowledge base refinement. Automated approaches improve the quality of knowledge bases but are far from perfect. In this paper, we leverage crowdsourcing to improve the quality of automatically extracted knowledge bases. As human labelling is costly, an important research challenge is how we can use limited human resources to maximize the quality improvement for a knowledge base. To address this problem, we first introduce a concept of semantic constraints that can be used to detect potential errors and do inference among candidate facts. Then, based on semantic constraints, we propose rank-based and graph-based algorithms for crowdsourced knowledge refining, which judiciously select the most beneficial candidate facts to conduct crowdsourcing and prune unnecessary questions. Our experiments show that our method improves the quality of knowledge bases significantly and outperforms state-of-the-art automatic methods under a reasonable crowdsourcing cost.
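The budgeted question-selection idea can be illustrated with a small sketch. This is not the paper's exact algorithm: the confidences, constraints, and scoring weights below are invented for illustration. Each candidate fact carries an extractor confidence, and mutual-exclusion constraints mean one crowd answer also settles the conflicting peers, pruning their questions.

```python
# Illustrative rank-based selection of crowdsourcing questions under a budget.
def uncertainty(p):
    # 1.0 when the extractor is maximally unsure (p = 0.5), 0.0 when certain
    return 1.0 - abs(2.0 * p - 1.0)

def select_questions(facts, budget):
    """Greedily pick up to `budget` facts to crowdsource, preferring uncertain
    facts whose answers also settle many conflicting candidates."""
    settled, chosen = set(), []

    def benefit(name):
        own = uncertainty(facts[name]["conf"])
        peers = sum(uncertainty(facts[c]["conf"])
                    for c in facts[name]["conflicts"] if c not in settled)
        return 2.0 * own + peers    # weight own uncertainty above pruned peers

    candidates = list(facts)
    while len(chosen) < budget and candidates:
        best = max(candidates, key=benefit)
        chosen.append(best)
        settled.add(best)
        settled |= facts[best]["conflicts"]  # the answer resolves the peers too
        candidates = [c for c in candidates if c not in settled]
    return chosen

facts = {
    "capital(France,Paris)": {"conf": 0.90, "conflicts": {"capital(France,Lyon)"}},
    "capital(France,Lyon)":  {"conf": 0.45, "conflicts": {"capital(France,Paris)"}},
    "type(Paris,City)":      {"conf": 0.55, "conflicts": set()},
    "type(Paris,Person)":    {"conf": 0.52, "conflicts": set()},
}
print(select_questions(facts, budget=2))
```

The greedy loop asks about the uncertain "Lyon" fact (its answer also settles the conflicting "Paris" fact for free), then spends the remaining budget on the most uncertain unconstrained fact.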
  6. Action Refinement

    NARCIS (Netherlands)

    Gorrieri, R.; Rensink, Arend; Bergstra, J.A.; Ponse, A.; Smolka, S.A.

    2001-01-01

    In this chapter, we give a comprehensive overview of the research results in the field of action refinement during the past 12 years. The different approaches that have been followed are outlined in detail and contrasted to each other in a uniform framework. We use two running examples to discuss

  7. Hybrid direct and iterative solvers for h refined grids with singularities

    KAUST Repository

    Paszyński, Maciej R.

    2015-04-27

    This paper describes a hybrid direct and iterative solver for two- and three-dimensional h-adaptive grids with point singularities. The point singularities are eliminated using a sequential direct solver with linear computational cost O(N) on the CPU [1]. The remaining Schur complements are passed to an incomplete-LU-preconditioned conjugate gradient (ILUPCG) iterative solver. The approach is compared to the standard algorithm, which performs static condensation over the entire mesh and executes the ILUPCG algorithm on top of it. The hybrid solver is applied to two- and three-dimensional grids automatically h-refined towards point or edge singularities. The automatic refinement is based on relative error estimates between the coarse- and fine-mesh solutions [2], and the optimal refinements are selected using projection-based interpolation. The computational mesh is partitioned into sub-meshes with the local point and edge singularities separated, using a greedy algorithm.
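The two-phase structure (direct elimination of one block of unknowns, then an iterative solve on the Schur complement) can be sketched in a few lines of linear algebra. This toy uses a small 1D Laplacian and plain, unpreconditioned conjugate gradient in place of the paper's ILU-preconditioned CG on h-adapted finite element meshes:

```python
import numpy as np

n = 8
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # SPD model matrix
b = np.ones(n)
k = 4                                  # unknowns eliminated by the direct phase
Aii, Aib = A[:k, :k], A[:k, k:]
Abi, Abb = A[k:, :k], A[k:, k:]

# Direct phase: eliminate the "interior" block (stand-in for the O(N) solver).
Aii_inv_Aib = np.linalg.solve(Aii, Aib)
Aii_inv_bi = np.linalg.solve(Aii, b[:k])
S = Abb - Abi @ Aii_inv_Aib            # Schur complement (SPD)
g = b[k:] - Abi @ Aii_inv_bi           # reduced right-hand side

def cg(M, rhs, tol=1e-12, max_iter=200):
    """Plain conjugate gradient for an SPD matrix."""
    x = np.zeros_like(rhs)
    r = rhs - M @ x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Mp = M @ p
        alpha = rs / (p @ Mp)
        x += alpha * p
        r -= alpha * Mp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

xb = cg(S, g)                          # iterative phase on the Schur complement
xi = Aii_inv_bi - Aii_inv_Aib @ xb     # back-substitution for eliminated unknowns
x = np.concatenate([xi, xb])
print(np.allclose(A @ x, b))
```

The back-substitution recovers the eliminated unknowns exactly, so the combined solution satisfies the full system.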

  8. A Novel Admixture-Based Pharmacogenetic Approach to Refine Warfarin Dosing in Caribbean Hispanics.

    Directory of Open Access Journals (Sweden)

    Jorge Duconge

    Full Text Available This study is aimed at developing a novel admixture-adjusted pharmacogenomic approach to individually refine warfarin dosing in Caribbean Hispanic patients. A multiple linear regression analysis of effective warfarin doses versus relevant genotypes, admixture, clinical and demographic factors was performed in 255 patients and further validated externally in another cohort of 55 individuals. The admixture-adjusted, genotype-guided warfarin dosing refinement algorithm developed in Caribbean Hispanics showed better predictability (R2 = 0.70, MAE = 0.72 mg/day) than a clinical algorithm that excluded genotypes and admixture (R2 = 0.60, MAE = 0.99 mg/day), and outperformed two prior pharmacogenetic algorithms in predicting effective dose in this population. For patients at the highest risk of adverse events, 45.5% of the dose predictions using the developed pharmacogenetic model resulted in an ideal dose, compared with only 29% when using the clinical non-genetic algorithm (p<0.001). The admixture-driven pharmacogenetic algorithm predicted 58% of warfarin dose variance when externally validated in 55 individuals from an independent validation cohort (MAE = 0.89 mg/day, 24% mean bias). Results supported our rationale to incorporate individuals' genotypes and unique admixture metrics into pharmacogenetic refinement models in order to increase predictability when expanding them to admixed populations like Caribbean Hispanics. ClinicalTrials.gov NCT01318057.
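The kind of model described, a multiple linear regression fit and the two summary metrics quoted (R2 and MAE), can be sketched on synthetic data. All predictor names, coefficients and noise levels below are invented for illustration; they are not the study's variables or estimates.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 255
age = rng.uniform(30, 80, n)                 # years
geno = rng.integers(0, 3, n).astype(float)   # hypothetical variant-allele count
admix = rng.uniform(0.0, 0.5, n)             # hypothetical admixture proportion
noise = rng.normal(0.0, 0.3, n)
dose = 6.0 - 0.03 * age - 0.8 * geno - 1.5 * admix + noise  # synthetic "dose"

# Ordinary least squares fit, then R^2 and mean absolute error (MAE).
X = np.column_stack([np.ones(n), age, geno, admix])
beta, *_ = np.linalg.lstsq(X, dose, rcond=None)
pred = X @ beta
mae = float(np.mean(np.abs(pred - dose)))
r2 = float(1.0 - np.sum((dose - pred) ** 2)
           / np.sum((dose - dose.mean()) ** 2))
print(round(r2, 3), round(mae, 3))
```

Comparing such a fit with and without the genotype and admixture columns reproduces, in miniature, the abstract's comparison between the pharmacogenetic and clinical-only algorithms.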

  9. Grain refinement of aluminum and its alloys

    International Nuclear Information System (INIS)

    Zaid, A.I.O.

    2001-01-01

    Grain refinement of aluminum and its alloys by the binary Al-Ti and ternary Al-Ti-B master alloys is reviewed and discussed. The importance of grain refining to the casting industry and the parameters affecting it are presented and discussed. These include parameters related to the cast, parameters related to the grain-refining alloy, and parameters related to the process. The different mechanisms suggested in the literature for the process of grain refining are presented and discussed, from which it is found that, although the mechanism of refining by the binary Al-Ti alloy is well established, the mechanism of grain refining by the ternary Al-Ti-B alloy is still a controversial matter, and some research work is still needed in this area. The effect of the addition of other alloying elements in the presence of the grain refiner on the grain-refining efficiency is also reviewed and discussed. It is found that some elements, e.g. V, Mo and C, improve the grain-refining efficiency, whereas other elements, e.g. Cr, Zr and Ta, poison the grain refinement. Based on the parameters affecting grain refinement and its mechanism, a criterion for selecting the optimum grain refiner is put forward and discussed. (author)

  10. Grid refinement for aeroacoustics in the lattice Boltzmann method: A directional splitting approach

    Science.gov (United States)

    Gendre, Félix; Ricot, Denis; Fritz, Guillaume; Sagaut, Pierre

    2017-08-01

    This study focuses on grid refinement techniques for the direct simulation of aeroacoustics when using weakly compressible lattice Boltzmann models, such as the D3Q19 athermal velocity set. When it comes to direct noise computation, very small errors in the density or pressure field may have serious consequences, since even strong acoustic density fluctuations have a clearly lower amplitude than the hydrodynamic ones. This work deals with the very weak spurious fluctuations that emerge when a vortical structure crosses a refinement interface, which may contaminate the resulting aeroacoustic field. We show through an extensive literature review that, within the framework described above, this issue has never been addressed before. To tackle this problem, we develop an alternative algorithm and compare its behavior to a classical one that fits our in-house vertex-centered data structure. Our main idea relies on a directional splitting of the continuous discrete-velocity Boltzmann equation, followed by an integration over specific characteristics. This method can be seen as a specific coupling between finite differences and the lattice Boltzmann method, applied locally on the interface between the two grids. The method is assessed on two cases: an acoustic pulse and a convected vortex. We show how very small errors in the density field arise and propagate throughout the domain when a vortical flow crosses the refinement interface. We also show that an increased free-stream Mach number (still within the weakly compressible regime) strongly deteriorates the situation, although the magnitude of the errors may remain negligible for purely aerodynamic studies. A drastically reduced level of spurious near-field noise is obtained with our approach, especially for under-resolved simulations, a situation that is crucial for industrial applications. The vortex case thus proves useful for aeroacoustic validation of any grid refinement algorithm.

  11. A novel non-uniform control vector parameterization approach with time grid refinement for flight level tracking optimal control problems.

    Science.gov (United States)

    Liu, Ping; Li, Guodong; Liu, Xinggao; Xiao, Long; Wang, Yalin; Yang, Chunhua; Gui, Weihua

    2018-02-01

    A high-quality control method is essential for the implementation of an aircraft autopilot system. An optimal control problem model that accounts for the safe aerodynamic envelope is therefore established to improve the control quality of aircraft flight level tracking. A novel non-uniform control vector parameterization (CVP) method with time grid refinement is then proposed for solving the optimal control problem. By introducing Hilbert-Huang transform (HHT) analysis, an efficient time grid refinement approach is presented and an adaptive time grid is obtained automatically. With this refinement, the proposed method needs fewer optimization parameters to achieve better control quality than the uniform-refinement CVP method, while the computational cost is lower. Two well-known flight level altitude tracking problems and one minimum-time cost problem are tested as illustrations, with the uniform-refinement control vector parameterization method adopted as the comparative baseline. Numerical results show that the proposed method achieves better performance in terms of optimization accuracy and computational cost; meanwhile, the control quality is efficiently improved. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
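The core CVP idea, a control that is piecewise constant on a non-uniform time grid, can be sketched as follows. The HHT-driven grid refinement of the paper is replaced here by a hand-chosen grid that is denser where the control varies fast, and the plant is a toy first-order lag rather than an aircraft model.

```python
import numpy as np

grid = np.array([0.0, 1.0, 1.2, 1.4, 1.6, 2.0, 4.0])  # non-uniform time grid
params = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.0])     # one control value/interval

def u(t):
    """Evaluate the piecewise-constant parameterized control at time t."""
    k = np.searchsorted(grid, t, side="right") - 1
    return float(params[min(max(k, 0), len(params) - 1)])

# Simulate the toy plant dx/dt = -x + u(t) under this control (explicit Euler).
dt, T = 0.001, 4.0
x = 0.0
for t in np.arange(0.0, T, dt):
    x += dt * (-x + u(t))
print(round(x, 3))
```

An optimizer would treat `params` (and, in the non-uniform scheme, the interior grid points) as the decision vector; concentrating grid points near t = 1-2, where the control ramps, buys resolution without adding parameters elsewhere.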

  12. Post-Processing Approach for Refining Raw Land Cover Change Detection of Very High-Resolution Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Zhiyong Lv

    2018-03-01

    Full Text Available In recent decades, land cover change detection (LCCD) using very high-spatial-resolution (VHR) remote sensing images has been a major research topic. However, VHR remote sensing images usually contain a large amount of spectral noise, reducing the reliability of the detected results. To solve this problem, this study proposes an object-based expectation maximization (OBEM) post-processing approach for enhancing raw LCCD results. OBEM defines a refinement of the labeling in a detected map to enhance its raw detection accuracies. Current mainstream change detection (preprocessing) techniques concentrate on proposing a change magnitude measurement or considering image spatial features to obtain a change detection map. The proposed OBEM approach is a new solution that enhances change detection accuracy by refining the raw result. Post-processing approaches can achieve accuracies competitive with preprocessing methods, but in a direct and succinct manner. The proposed OBEM post-processing method synthetically considers multi-scale segmentation and expectation maximization algorithms to refine the raw change detection result. The influence of the segmentation scale on the LCCD accuracy of the proposed OBEM is then investigated. Four pairs of remote sensing images, two of which (aerial images with 0.5 m/pixel resolution) depict two landslide sites on Lantau Island, Hong Kong, China, are used in the experiments to evaluate the effectiveness of the proposed approach. In addition, the proposed approach is applied to, and validated by, two case studies: LCCD in Tianjin City, China (SPOT-5 satellite image with 2.5 m/pixel resolution) and a Mexico forest fire case (Landsat TM images with 30 m/pixel resolution). Quantitative evaluations show that the proposed OBEM post-processing approach can achieve better performance and higher accuracies than several commonly used preprocessing methods. To the best of the authors' knowledge, this type
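The expectation-maximization ingredient can be sketched with a two-component Gaussian mixture over per-object change magnitudes, relabeling each object as change or no-change; the multi-scale segmentation step of the actual OBEM method is omitted, and the data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
mags = np.concatenate([rng.normal(0.1, 0.05, 300),   # unchanged objects
                       rng.normal(0.8, 0.10, 100)])  # changed objects

# Initial guesses for the two-component Gaussian mixture.
mu = np.array([mags.min(), mags.max()])
sigma = np.array([0.2, 0.2])
pi = np.array([0.5, 0.5])

def gauss(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

for _ in range(50):
    # E-step: posterior responsibility of each component for each object.
    w = pi[None, :] * gauss(mags[:, None], mu[None, :], sigma[None, :])
    w /= w.sum(axis=1, keepdims=True)
    # M-step: re-estimate mixture weights, means and standard deviations.
    Nk = w.sum(axis=0)
    mu = (w * mags[:, None]).sum(axis=0) / Nk
    sigma = np.sqrt((w * (mags[:, None] - mu[None, :]) ** 2).sum(axis=0) / Nk)
    pi = Nk / len(mags)

labels = w.argmax(axis=1)       # component 1 (larger mean) = "change"
print(mu.round(2), int(labels.sum()))
```

The fitted means recover the two populations, and the posterior labels give the refined change/no-change map without any hand-set threshold.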

  13. Decadal climate prediction with a refined anomaly initialisation approach

    Science.gov (United States)

    Volpi, Danila; Guemas, Virginie; Doblas-Reyes, Francisco J.; Hawkins, Ed; Nichols, Nancy K.

    2017-03-01

    In decadal prediction, the objective is to exploit both the sources of predictability from the external radiative forcings and from the internal variability to provide the best possible climate information for the next decade. Predicting the climate system internal variability relies on initialising the climate model from observational estimates. We present a refined method of anomaly initialisation (AI) applied to the ocean and sea ice components of the global climate forecast model EC-Earth, with the following key innovations: (1) the use of a weight applied to the observed anomalies, in order to avoid the risk of introducing anomalies recorded in the observed climate, whose amplitude does not fit in the range of the internal variability generated by the model; (2) the AI of the ocean density, instead of calculating it from the anomaly initialised state of temperature and salinity. An experiment initialised with this refined AI method has been compared with a full field and standard AI experiment. Results show that the use of such refinements enhances the surface temperature skill over part of the North and South Atlantic, part of the South Pacific and the Mediterranean Sea for the first forecast year. However, part of such improvement is lost in the following forecast years. For the tropical Pacific surface temperature, the full field initialised experiment performs the best. The prediction of the Arctic sea-ice volume is improved by the refined AI method for the first three forecast years and the skill of the Atlantic multidecadal oscillation is significantly increased compared to a non-initialised forecast, along the whole forecast time.
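Innovation (1), the weighted observed anomaly, can be sketched in a few lines. The field names, values and the specific clipping rule below are illustrative assumptions, not EC-Earth's implementation.

```python
import numpy as np

rng = np.random.default_rng(42)
model_clim = 15.0 + rng.normal(0.0, 0.2, (4, 4))  # model climatology (toy SST)
sigma_model = 0.5 * np.ones((4, 4))               # model internal variability (std)
sigma_obs = 1.0 * np.ones((4, 4))                 # observed variability (std)
obs_anomaly = rng.normal(0.0, 1.0, (4, 4))        # observed anomaly to assimilate

# Weight < 1 wherever observed variability exceeds the model's, so the
# initialised anomaly stays within the range the model can generate itself.
w = np.minimum(1.0, sigma_model / sigma_obs)
init_state = model_clim + w * obs_anomaly

print(float(np.abs(init_state - model_clim).max()))
```

With the weight capped at one, the initialised state never departs from the model climatology by more than the observed anomaly itself, avoiding shocks from anomalies the model's internal variability could not sustain.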

  14. A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process

    Science.gov (United States)

    Wang, Yi; Tamai, Tetsuo

    2009-01-01

    Since the complexity of software systems continues to grow, most engineers face two serious problems: the state space explosion problem and the problem of how to debug systems. In this paper, we propose a game-theoretic approach to full branching time model checking on three-valued semantics. The three-valued models and logics provide successful abstraction that overcomes the state space explosion problem. The game style model checking that generates counter-examples can guide refinement or identify validated formulas, which solves the system debugging problem. Furthermore, output of our game style method will give significant information to engineers in detecting where errors have occurred and what the causes of the errors are.

  15. Towards automated crystallographic structure refinement with phenix.refine

    Energy Technology Data Exchange (ETDEWEB)

    Afonine, Pavel V., E-mail: pafonine@lbl.gov; Grosse-Kunstleve, Ralf W.; Echols, Nathaniel; Headd, Jeffrey J.; Moriarty, Nigel W. [Lawrence Berkeley National Laboratory, One Cyclotron Road, MS64R0121, Berkeley, CA 94720 (United States); Mustyakimov, Marat; Terwilliger, Thomas C. [Los Alamos National Laboratory, M888, Los Alamos, NM 87545 (United States); Urzhumtsev, Alexandre [CNRS–INSERM–UdS, 1 Rue Laurent Fries, BP 10142, 67404 Illkirch (France); Université Henri Poincaré, Nancy 1, BP 239, 54506 Vandoeuvre-lès-Nancy (France); Zwart, Peter H. [Lawrence Berkeley National Laboratory, One Cyclotron Road, MS64R0121, Berkeley, CA 94720 (United States); Adams, Paul D. [Lawrence Berkeley National Laboratory, One Cyclotron Road, MS64R0121, Berkeley, CA 94720 (United States); University of California Berkeley, Berkeley, CA 94720 (United States)

    2012-04-01

    phenix.refine is a program within the PHENIX package that supports crystallographic structure refinement against experimental data with a wide range of upper resolution limits using a large repertoire of model parameterizations. It has several automation features and is also highly flexible. Several hundred parameters enable extensive customizations for complex use cases. Multiple user-defined refinement strategies can be applied to specific parts of the model in a single refinement run. An intuitive graphical user interface is available to guide novice users and to assist advanced users in managing refinement projects. X-ray or neutron diffraction data can be used separately or jointly in refinement. phenix.refine is tightly integrated into the PHENIX suite, where it serves as a critical component in automated model building, final structure refinement, structure validation and deposition to the wwPDB. This paper presents an overview of the major phenix.refine features, with extensive literature references for readers interested in more detailed discussions of the methods.

  16. A Semi-Supervised Approach for Refining Transcriptional Signatures of Drug Response and Repositioning Predictions.

    Directory of Open Access Journals (Sweden)

    Francesco Iorio

    Full Text Available We present a novel strategy to identify drug-repositioning opportunities. The starting point of our method is the generation of a signature summarising the consensual transcriptional response of multiple human cell lines to a compound of interest (namely, the seed compound). This signature can be derived from data in existing databases, such as the connectivity map, and it is used in the first instance to query a network interlinking all the connectivity-map compounds, based on the similarity of their transcriptional responses. This provides a drug neighbourhood, composed of compounds predicted to share some effects with the seed one. The original signature is then refined by systematically reducing its overlap with the transcriptional responses induced by drugs in this neighbourhood that are known to share a secondary effect with the seed compound. Finally, the drug network is queried again with the resulting refined signatures and the whole process is carried on for a number of iterations. Drugs in the final refined neighbourhood are then predicted to exert the principal mode of action of the seed compound. We illustrate our approach using paclitaxel (a microtubule-stabilising agent) as the seed compound. Our method predicts that glipizide and splitomicin perturb microtubule function in human cells: a result that could not be obtained through standard signature-matching methods. In agreement, we find that glipizide and splitomicin reduce interphase microtubule growth rates and transiently increase the percentage of mitotic cells, consistent with our prediction. Finally, we validated the refined signatures of paclitaxel response by mining a large drug screening dataset, showing that human cancer cell lines whose basal transcriptional profile is anti-correlated with them are significantly more sensitive to paclitaxel and docetaxel.

  17. Adaptive mesh refinement for shocks and material interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Dai, William Wenlong [Los Alamos National Laboratory

    2010-01-01

    There are three kinds of adaptive mesh refinement (AMR) in structured meshes. Block-based AMR sometimes over-refines meshes. Cell-based AMR treats cells individually and thus loses the advantage of the nature of structured meshes. Patch-based AMR is intended to combine the advantages of block- and cell-based AMR, i.e., the nature of structured meshes and sharp regions of refinement. But patch-based AMR has its own difficulties. For example, patch-based AMR typically cannot preserve symmetries of physics problems. In this paper, we present an approach to patch-based AMR for hydrodynamics simulations. The approach consists of clustering, symmetry preserving, mesh continuity, flux correction, communications, management of patches, and load balance. The special features of this patch-based AMR include symmetry preserving, efficiency of refinement across shock fronts and material interfaces, a special implementation of flux correction, and patch management in parallel computing environments. To demonstrate the capability of the AMR framework, we show both two- and three-dimensional hydrodynamics simulations with many levels of refinement.
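The flag-and-cluster step common to patch-based AMR can be sketched as follows: cells whose solution gradient exceeds a threshold are flagged, and flagged cells are clustered into a rectangular patch. Here a single bounding box is formed; production clustering algorithms (e.g. Berger-Rigoutsos) split boxes so refinement tracks shock fronts and interfaces efficiently.

```python
import numpy as np

n = 32
x = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")
u = np.tanh((X - 0.5) / 0.05)        # sharp front at x = 0.5 (shock stand-in)

gx, gy = np.gradient(u, x, x)        # finite-difference gradient on the grid
flags = np.hypot(gx, gy) > 5.0       # refinement criterion: steep gradient

i, j = np.nonzero(flags)
patch = (int(i.min()), int(i.max()), int(j.min()), int(j.max()))
print(patch)                         # bounding box of the cells to refine
```

Only the thin band of cells straddling the front is flagged, so the resulting patch hugs the "shock" instead of refining the whole block.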

  18. Application of multi-criteria material selection techniques to constituent refinement in biobased composites

    International Nuclear Information System (INIS)

    Miller, Sabbie A.; Lepech, Michael D.; Billington, Sarah L.

    2013-01-01

    Highlights: • Biobased composites have the potential to replace certain engineered materials. • Woven reinforcement can provide better material properties in biobased composites. • Short fiber filler can provide lower environmental impact in biobased composites. • Per function, different fibers are desired to lower composite environmental impact. - Abstract: Biobased composites offer a potentially low environmental impact material option for the construction industries. Designing these materials to meet both performance requirements for an application and minimize environmental impacts requires the ability to refine composite constituents based on environmental impact and mechanical properties. In this research, biobased composites with varying natural fiber reinforcement in a poly(β-hydroxybutyrate)-co-(β-hydroxyvalerate) matrix were characterized based on material properties through experiments and environmental impact through life cycle assessments. Using experimental results, these biobased composites were found to have flexural properties and thermal conductivity competitive with certain short-chopped glass fiber reinforced plastics. Multi-criteria material selection techniques were applied to weigh desired material properties against greenhouse gas emissions, fossil fuel demand, and Eco-Indicator '99 score. The effects of using different reinforcing fibers in biobased composites were analyzed using the developed selection scheme as a tool for choosing constituents. The use of multi-criteria material selection provided the ability to select fiber reinforcement for biobased composites and showed when it would be more appropriate to use a novel biobased composite or a currently available engineered material.

  19. Comparison of Two Grid Refinement Approaches for High Resolution Regional Climate Modeling: MPAS vs WRF

    Science.gov (United States)

    Leung, L.; Hagos, S. M.; Rauscher, S.; Ringler, T.

    2012-12-01

    This study compares two grid refinement approaches for high-resolution regional climate modeling: a global variable-resolution model and nesting. The global variable-resolution model, the Model for Prediction Across Scales (MPAS), and the limited-area model, the Weather Research and Forecasting (WRF) model, are compared in an idealized aqua-planet context, with a focus on the spatial and temporal characteristics of tropical precipitation simulated by the models using the same physics package from the Community Atmosphere Model (CAM4). For MPAS, simulations have been performed with a quasi-uniform-resolution global domain at coarse (1 degree) and high (0.25 degree) resolution, and with a variable-resolution domain in which a high-resolution region at 0.25 degree is configured inside a coarse-resolution global domain at 1 degree. Similarly, WRF has been configured to run on coarse (1 degree) and high (0.25 degree) resolution tropical channel domains, as well as on a nested domain with a high-resolution region at 0.25 degree nested two-way inside the coarse-resolution (1 degree) tropical channel. The variable-resolution and nested simulations are compared against the high-resolution simulations, which serve as the virtual reality. Both MPAS and WRF simulate 20-day Kelvin waves propagating through the high-resolution domains fairly unaffected by the change in resolution. In addition, both models respond to increased resolution with enhanced precipitation. Grid refinement induces zonal asymmetry in precipitation (heating), accompanied by anomalous zonal Walker-like circulations and standing Rossby wave signals. However, there are important differences between the anomalous patterns in MPAS and WRF due to differences in the grid refinement approaches and the sensitivity of model physics to grid resolution. This study highlights the need for "scale-aware" parameterizations in variable-resolution and nested regional models.

  20. Adaptive mesh refinement for storm surge

    KAUST Repository

    Mandli, Kyle T.; Dawson, Clint N.

    2014-01-01

    An approach to utilizing adaptive mesh refinement algorithms for storm surge modeling is proposed. Currently numerical models exist that can resolve the details of coastal regions but are often too costly to be run in an ensemble forecasting framework without significant computing resources. The application of adaptive mesh refinement algorithms substantially lowers the computational cost of a storm surge model run while retaining much of the desired coastal resolution. The approach presented is implemented in the GeoClaw framework and compared to ADCIRC for Hurricane Ike along with observed tide gauge data and the computational cost of each model run. © 2014 Elsevier Ltd.

  2. Potency of high-intensity ultrasonic treatment for grain refinement of magnesium alloys

    International Nuclear Information System (INIS)

    Ramirez, A.; Qian Ma; Davis, B.; Wilks, T.; StJohn, D.H.

    2008-01-01

    High-intensity ultrasonic treatment (UT) for grain refinement of magnesium alloys has been investigated using a novel theoretical approach in order to better understand its grain-refining potential and the mechanism of nucleation. The process demonstrated significantly superior grain-refining potency to carbon inoculation for Al-containing magnesium alloys but inferior potency to zirconium for Al-free alloys. Details revealed by applying the theoretical approach to ultrasonic grain refinement provide new clues to understanding the mechanism of grain nucleation by UT

  3. Refinement-Animation for Event-B - Towards a Method of Validation

    DEFF Research Database (Denmark)

    Hallerstede, Stefan; Leuschel, Michael; Plagge, Daniel

    2010-01-01

    We provide a detailed description of refinement in Event-B, both as a contribution in itself and as a foundation for the approach to simultaneous animation of multiple levels of refinement that we propose. We present an algorithm for simultaneous multi-level animation of refinement, and show how ...

  4. Refinement of Triple-Negative Breast Cancer Molecular Subtypes: Implications for Neoadjuvant Chemotherapy Selection.

    Directory of Open Access Journals (Sweden)

    Brian D Lehmann

    Full Text Available Triple-negative breast cancer (TNBC) is a heterogeneous disease that can be classified into distinct molecular subtypes by gene expression profiling. Considered a difficult-to-treat cancer, a fraction of TNBC patients benefit significantly from neoadjuvant chemotherapy and have far better overall survival. Outside of BRCA1/2 mutation status, biomarkers do not exist to identify patients most likely to respond to current chemotherapy; and, to date, no FDA-approved targeted therapies are available for TNBC patients. Previously, we developed an approach to identify six molecular subtypes of TNBC (TNBCtype), with each subtype displaying unique ontologies and differential response to standard-of-care chemotherapy. Given the complexity of the varying histological landscape of tumor specimens, we used histopathological quantification and laser-capture microdissection to determine that transcripts in the previously described immunomodulatory (IM) and mesenchymal stem-like (MSL) subtypes were contributed by infiltrating lymphocytes and tumor-associated stromal cells, respectively. Therefore, we refined the TNBC molecular subtypes from six (TNBCtype) into four (TNBCtype-4) tumor-specific subtypes (BL1, BL2, M and LAR) and demonstrate differences in diagnosis age, grade, local and distant disease progression and histopathology. Using five publicly available neoadjuvant chemotherapy breast cancer gene expression datasets, we retrospectively evaluated chemotherapy response of over 300 TNBC patients from pretreatment biopsies subtyped using either the intrinsic (PAM50) or TNBCtype approaches. Combined analysis of TNBC patients demonstrated that TNBC subtypes significantly differ in response to similar neoadjuvant chemotherapy, with 41% of BL1 patients achieving a pathological complete response compared to 18% for BL2 and 29% for LAR (95% confidence intervals (CIs): [33, 51], [9, 28] and [17, 41], respectively). Collectively, we provide pre-clinical data that could inform

  5. Involving users in the refinement of the competency-based achievement system: an innovative approach to competency-based assessment.

    Science.gov (United States)

    Ross, Shelley; Poth, Cheryl-Anne; Donoff, Michel G; Papile, Chiara; Humphries, Paul; Stasiuk, Samantha; Georgis, Rebecca

    2012-01-01

    Competency-based assessment innovations are being implemented to address concerns about the effectiveness of traditional approaches to medical training and the assessment of competence. Integrating intended users' perspectives during the piloting and refinement of an innovation is necessary to ensure the innovation meets users' needs. Failure to do so leaves users no opportunity to influence the innovation, and developers no opportunity to assess why an innovation works or does not work in different contexts. A qualitative participatory action research approach was used. Sixteen first-year residents participated in three focus groups and two interviews during piloting. Verbatim transcripts were analyzed individually and then across all transcripts using a constant comparison approach. The analysis revealed three key characteristics of preceptor feedback that shaped the residents' acceptance of the innovation as a worthwhile investment of time and effort: that it be frequent, timely, and specific. The findings were used to further refine the innovation. This study highlights the necessary conditions for assessing the success of implementation of educational innovations. Reciprocal communication between users and developers is vital. This reflects the approaches recommended in the Ottawa Consensus Statement on research in assessment, published in Medical Teacher in March 2011.

  6. An approach of requirements tracing in formal refinement

    DEFF Research Database (Denmark)

    Jastram, Michael; Hallerstede, Stefan; Leuschel, Michael

    2010-01-01

    Formal modeling of computing systems yields models that are intended to be correct with respect to the requirements that have been formalized. The complexity of typical computing systems can be addressed by formal refinement introducing all the necessary details piecemeal. We report on preliminar...... changes, making use of corresponding techniques already built into the Event-B method....

  7. An approach to selecting routes over which to transport excess salt from the Deaf Smith County Site

    International Nuclear Information System (INIS)

    1987-09-01

    This report presents an approach to be utilized in the identification of rail and/or highway routes for the disposal of waste salt and other salt contaminated material from repository construction. Relevant issues regarding salt transport also are identified. The report identifies a sequence of activities that precede actual route selection, i.e., final selection of a salt disposal method and its location, refined estimates of salt shipment volume and schedule, followed by selection of rail or truck or a combination thereof, as the preferred transport mode. After these factors are known, the route selection process can proceed. Chapter 2.0 of this report identifies directives and requirements that potentially could affect salt transport from the Deaf Smith site. A summary of salt disposal alternatives and reference cases is contained in Chapter 3.0. Chapter 4.0 identifies and discusses current methods of salt handling and transport in the United States, and also provides some perspective as to the volume of excess salt to be transported from the Deaf Smith site relative to current industry practices. Chapter 5.0 identifies an approach to the salt transportation issue, and suggests one system for evaluating alternative highway routes for truck shipments

  8. Using a systematic approach to select flagship species for bird conservation.

    Science.gov (United States)

    Veríssimo, Diogo; Pongiluppi, Tatiana; Santos, Maria Cintia M; Develey, Pedro F; Fraser, Iain; Smith, Robert J; MacMilan, Douglas C

    2014-02-01

    Conservation marketing campaigns that focus on flagship species play a vital role in biological diversity conservation because they raise funds and change people's behavior. However, most flagship species are selected without considering the target audience of the campaign, which can hamper the campaign's effectiveness. To address this problem, we used a systematic and stakeholder-driven approach to select flagship species for a conservation campaign in the Serra do Urubu in northeastern Brazil. We based our techniques on environmental economic and marketing methods. We used choice experiments to examine the species attributes that drive preference and latent-class models to segment respondents into groups by preferences and socioeconomic characteristics. We used respondent preferences and information on bird species inhabiting the Serra do Urubu to calculate a flagship species suitability score. We also asked respondents to indicate their favorite species from a set list to enable comparison between methods. The species' traits that drove audience preference were geographic distribution, population size, visibility, attractiveness, and survival in captivity. However, the importance of these factors differed among groups and groups differed in their views on whether species with small populations and the ability to survive in captivity should be prioritized. The popularity rankings of species differed between approaches, a result that was probably related to the different ways in which the 2 methods measured preference. Our new approach is a transparent and evidence-based method that can be used to refine the way stakeholders are engaged in the design of conservation marketing campaigns. © 2013 Society for Conservation Biology.
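
    The suitability score described above can be illustrated with a small sketch. The attribute names follow the paper (geographic distribution, population size, visibility, attractiveness, survival in captivity), but the part-worth weights and species data below are invented for illustration and are not taken from the study.

```python
# Hypothetical flagship-suitability score: each candidate species is scored
# as a preference-weighted sum of the attributes the survey identified.
# Weights (e.g. averaged over latent-class segments) and attribute levels
# are made up for this sketch.

weights = {
    "distribution": 0.30,
    "population":   0.25,
    "visibility":   0.20,
    "attractive":   0.15,
    "captivity":    0.10,
}

# Attribute levels per candidate species, normalised to [0, 1] (illustrative).
species = {
    "species_A": {"distribution": 0.2, "population": 0.1, "visibility": 0.8,
                  "attractive": 0.9, "captivity": 0.3},
    "species_B": {"distribution": 0.7, "population": 0.6, "visibility": 0.4,
                  "attractive": 0.5, "captivity": 0.8},
}

def suitability(attrs, weights):
    """Preference-weighted suitability score for one species."""
    return sum(weights[a] * attrs[a] for a in weights)

scores = {name: suitability(attrs, weights) for name, attrs in species.items()}
best = max(scores, key=scores.get)
```

    In a real application the weights would come from the fitted choice model, possibly with separate scores per audience segment.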

  9. Formal refinement of extended state machines

    Directory of Open Access Journals (Sweden)

    Thomas Fayolle

    2016-06-01

    Full Text Available In a traditional formal development process, e.g. using the B method, the informal user requirements are (manually) translated into a global abstract formal specification. This translation is especially difficult to achieve. The Event-B method was developed to construct such a specification incrementally and formally, using stepwise refinement. Each increment takes into account new properties and system aspects. In this paper, we propose to couple a graphical notation called Algebraic State-Transition Diagrams (ASTD) with an Event-B specification in order to provide a better understanding of the software behaviour. The dynamic behaviour is captured by the ASTD, which is based on automata and process algebra operators, while the data model is described by means of an Event-B specification. We propose a methodology to incrementally refine such specification couplings, taking into account new refinement relations and consistency conditions between the control specification and the data specification. We compare the specifications obtained using each approach for readability and proof complexity. The advantages and drawbacks of the traditional approach and of our methodology are discussed. The whole process is illustrated by a railway CBTC-like case study. Our approach is supported by tools for translating ASTDs into B and Event-B into B.

  10. Refining of raw materials, lignite present economic problems

    Energy Technology Data Exchange (ETDEWEB)

    Schirmer, G.

    1985-06-01

    East Germany seeks an economic intensification program that involves refining raw materials to a higher level. Lignite briquetting prior to liquefaction and gasification illustrates both the theoretical and practical aspects of that goal and also introduces questions of secure supplies. The author describes the special labor processes, use of technology, recycling of waste materials, and other new problems that the approach entails as the refined raw materials become new materials or energy sources. Economics based on the value of the refined product and the cost of the materials determine the degree of refinement. The concept also involves the relationship of producer and user as profits increase.

  11. Technological studies on uranium refining at nuclear materials authority, Egypt

    International Nuclear Information System (INIS)

    Mohammed, H.S.

    1997-01-01

    In 1992 the Nuclear Materials Authority (NMA) decided to establish a yellow cake refining unit in order to study the refining of El-Atshan yellow cake, recently produced by the ion-exchange pilot plant of the production sector. The research studies followed the conventional refining route to produce nuclear-grade UO3. This involved investigations of some common solvents for refining the cake, viz. trialkyl phosphates, trialkyl phosphine oxides and dialkyl phosphoric acids, as well as high-molecular-weight long-chain tertiary amines. Moreover, a non-conventional refining process has also been presented, which depends on the selectivity of the uranyl ion to be dissolved by carbonate and precipitated by hydrogen peroxide. Most of the proposed processes were found feasible for refining El-Atshan yellow cake; however, the non-conventional refining process appears to be the most promising, owing to its superior performance and economy.

  12. Towards automated crystallographic structure refinement with phenix.refine

    OpenAIRE

    Afonine, Pavel V.; Grosse-Kunstleve, Ralf W.; Echols, Nathaniel; Headd, Jeffrey J.; Moriarty, Nigel W.; Mustyakimov, Marat; Terwilliger, Thomas C.; Urzhumtsev, Alexandre; Zwart, Peter H.; Adams, Paul D.

    2012-01-01

    phenix.refine is a program within the PHENIX package that supports crystallographic structure refinement against experimental data with a wide range of upper resolution limits using a large repertoire of model parameterizations. It has several automation features and is also highly flexible. Several hundred parameters enable extensive customizations for complex use cases. Multiple user-defined refinement strategies can be applied to specific parts of the model in a single refinement run. An i...

  13. Refinement from a control problem to program

    DEFF Research Database (Denmark)

    Schenke, Michael; Ravn, Anders P.

    1996-01-01

    The distinguishing feature of the presented refinement approach is that it links formalisms from a top level requirements notation down to programs together in a mathematically coherent development trajectory. The approach uses Duration Calculus, a real-time interval logic, to specify requirements...

  14. Basic effects of pulp refining on fiber properties--a review.

    Science.gov (United States)

    Gharehkhani, Samira; Sadeghinezhad, Emad; Kazi, Salim Newaz; Yarmand, Hooman; Badarudin, Ahmad; Safaei, Mohammad Reza; Zubir, Mohd Nashrul Mohd

    2015-01-22

    The requirement for the high-quality pulps widely used in the paper industry has increased the demand for the pulp refining (beating) process. Pulp refining is a promising approach to improving pulp quality by changing the fiber characteristics. The diversity of research on the effect of refining on fiber properties, which arises from differences in pulp sources, pulp consistency and refining equipment, has motivated us to provide a review of the studies of the last decade. In this article, the influence of pulp refining on structural properties, i.e., fibrillation, fine formation, fiber length, fiber curl, crystallinity and the distribution of surface chemical compositions, is reviewed. The effect of pulp refining on the electrokinetic properties of fibers, e.g., the surface and total charges of pulps, is discussed. In addition, an overview of different refining theories and refiners, as well as some tests for assessing pulp refining, is presented. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Bayesian ensemble refinement by replica simulations and reweighting

    Science.gov (United States)

    Hummer, Gerhard; Köfinger, Jürgen

    2015-12-01

    We describe different Bayesian ensemble refinement methods, examine their interrelation, and discuss their practical application. With ensemble refinement, the properties of dynamic and partially disordered (bio)molecular structures can be characterized by integrating a wide range of experimental data, including measurements of ensemble-averaged observables. We start from a Bayesian formulation in which the posterior is a functional that ranks different configuration space distributions. By maximizing this posterior, we derive an optimal Bayesian ensemble distribution. For discrete configurations, this optimal distribution is identical to that obtained by the maximum entropy "ensemble refinement of SAXS" (EROS) formulation. Bayesian replica ensemble refinement enhances the sampling of relevant configurations by imposing restraints on averages of observables in coupled replica molecular dynamics simulations. We show that the strength of the restraints should scale linearly with the number of replicas to ensure convergence to the optimal Bayesian result in the limit of infinitely many replicas. In the "Bayesian inference of ensembles" method, we combine the replica and EROS approaches to accelerate the convergence. An adaptive algorithm can be used to sample directly from the optimal ensemble, without replicas. We discuss the incorporation of single-molecule measurements and dynamic observables such as relaxation parameters. The theoretical analysis of different Bayesian ensemble refinement approaches provides a basis for practical applications and a starting point for further investigations.
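
    The reweighting idea behind these ensemble refinement methods can be sketched numerically. The snippet below is a minimal, self-contained illustration of maximum-entropy-style Bayesian reweighting on synthetic data; the names theta (confidence in the prior) and sigma (data uncertainty) are generic assumptions, not the paper's notation.

```python
import numpy as np

# Sketch of maximum-entropy / Bayesian ensemble reweighting in the spirit of
# the EROS formulation: find configuration weights w that fit ensemble-
# averaged observables while staying close, in relative entropy, to the
# reference weights w0. All data here are synthetic.

rng = np.random.default_rng(0)
n_conf, n_obs = 50, 3
y = rng.normal(size=(n_conf, n_obs))       # per-configuration observables
w0 = np.full(n_conf, 1.0 / n_conf)         # reference (prior) weights
y_exp = y.mean(axis=0) + 0.5               # synthetic "experimental" averages
sigma, theta = 0.2, 1.0                    # data uncertainty, prior confidence

def objective(w):
    """0.5*chi^2 (data fit) plus theta*KL(w||w0) (stay near the prior)."""
    chi2 = np.sum(((w @ y - y_exp) / sigma) ** 2)
    return 0.5 * chi2 + theta * np.sum(w * np.log(w / w0))

# Gradient descent on softmax logits g, so that w stays a valid distribution.
g = np.zeros(n_conf)
for _ in range(2000):
    w = np.exp(g - g.max()); w /= w.sum()
    grad_w = y @ ((w @ y - y_exp) / sigma**2) + theta * (np.log(w / w0) + 1.0)
    grad_g = w * (grad_w - w @ grad_w)     # chain rule through the softmax
    g -= 0.1 * grad_g

w = np.exp(g - g.max()); w /= w.sum()      # refined ensemble weights
```

    Increasing theta pulls the refined weights back toward the prior; in the limit of large theta the reference ensemble is recovered, mirroring the regularisation role the paper assigns to the restraint strength.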

  16. Refinement in Z and Object-Z foundations and advanced applications

    CERN Document Server

    Derrick, John

    2013-01-01

    Refinement is one of the cornerstones of the formal approach to software engineering, and its use in various domains has led to research on new applications and generalisation. This book brings together this important research in one volume, with the addition of examples drawn from different application areas. It covers four main themes: data refinement and its application to Z; generalisations of refinement that change the interface and atomicity of operations; refinement in Object-Z; and modelling state and behaviour by combining Object-Z with CSP.

  17. Refining discordant gene trees.

    Science.gov (United States)

    Górecki, Pawel; Eulenstein, Oliver

    2014-01-01

    Evolutionary studies are complicated by discordance between gene trees and the species tree in which they evolved. Dealing with discordant trees often relies on comparison costs between gene and species trees, including the well-established Robinson-Foulds, gene duplication, and deep coalescence costs. While these costs have provided credible results for binary rooted gene trees, corresponding cost definitions for non-binary unrooted gene trees, which are frequently occurring in practice, are challenged by biological realism. We propose a natural extension of the well-established costs for comparing unrooted and non-binary gene trees with rooted binary species trees using a binary refinement model. For the duplication cost we describe an efficient algorithm that is based on a linear time reduction and also computes an optimal rooted binary refinement of the given gene tree. Finally, we show that similar reductions lead to solutions for computing the deep coalescence and the Robinson-Foulds costs. Our binary refinement of Robinson-Foulds, gene duplication, and deep coalescence costs for unrooted and non-binary gene trees together with the linear time reductions provided here for computing these costs significantly extends the range of trees that can be incorporated into approaches dealing with discordance.
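
    For intuition, the classical Robinson-Foulds cost that the paper extends can be computed for simple rooted trees by comparing clade sets. The toy sketch below covers only rooted trees given as nested tuples; it is not the unrooted, non-binary generalization developed in the paper.

```python
# Toy Robinson-Foulds distance for rooted trees written as nested tuples,
# e.g. ((("a","b"),"c"),"d"). The distance is the size of the symmetric
# difference of the two trees' internal clade sets.

def clades(tree):
    """Return the set of non-trivial clades (frozensets of leaf labels)."""
    out = set()

    def walk(node):
        if isinstance(node, str):              # a leaf
            return frozenset([node])
        leaves = frozenset().union(*(walk(child) for child in node))
        out.add(leaves)
        return leaves

    root = walk(tree)
    out.discard(root)                          # the full leaf set is trivial
    return out

def robinson_foulds(t1, t2):
    c1, c2 = clades(t1), clades(t2)
    return len(c1 ^ c2)

t1 = ((("a", "b"), "c"), "d")
t2 = ((("a", "c"), "b"), "d")
```

    Here t1 and t2 disagree on one clade each ({a,b} versus {a,c}), so the distance is 2; refining a non-binary gene tree amounts to choosing the binary resolution that minimizes such a cost.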

  18. Effects of grain refinement on the rheological behaviors of semisolid hypoeutectic Al-Si alloys

    International Nuclear Information System (INIS)

    Yan, M.; Luo, W.

    2007-01-01

    The paper experimentally investigated the effects of grain refinement on the rheological response of Al and hypoeutectic Al-Si alloys. The selected refiners included K2TiF6, K2TiF6 plus graphite, and Al-5Ti-B. The apparent viscosity of the semisolid Al alloys was measured during solidification, and samples at different solid fractions were quenched to observe the microstructure. It was found that grain refinement drastically lowered the apparent viscosity of Al-Si alloys. Among the selected refiners, Al-5Ti-B was the most effective, and K2TiF6 plus graphite was more effective than K2TiF6 alone. The silicon content of the Al alloys also affected the apparent viscosity: with increasing silicon content the apparent viscosity decreased, a consequence of silicon promoting the refining effects of both titanium and boron.

  19. The European refining and distribution industry at the 2010 vista

    International Nuclear Information System (INIS)

    Lacour, J.J.; Tessmer, G.; Ward, I.

    1998-01-01

    Oil company chairmen belonging to the AFTP, DGMK and IP associations met together to debate about the future of the European refining industry. The following topics were discussed: is it the end of the refining crisis? Which uncertainties will have to be met? What is the situation of petroleum products supply and demand? What are the consumers' expectations? How to face the environmental constraints? Which future for the refining activities in Europe? Seven round-tables took place with the following themes: the factors of uncertainty in the future of refining activities, the petroleum products supply and demand (automotive fuels, fuel oils, lubricants), the refining activities and the supply of consumers (service stations and supermarkets), the situation of the European petroleum policy, the European refining industry and the public regulations (development of more efficient environmental approaches), the impact of environmental constraints and the technical solutions, and the future of the refining industry. (J.S.)

  20. Iterative Refinement Methods for Time-Domain Equalizer Design

    Directory of Open Access Journals (Sweden)

    Evans Brian L

    2006-01-01

    Full Text Available Commonly used time domain equalizer (TEQ) design methods have been recently unified as an optimization problem involving an objective function in the form of a Rayleigh quotient. The direct generalized eigenvalue solution relies on matrix decompositions. To reduce implementation complexity, we propose an iterative refinement approach in which the TEQ length starts at two taps and increases by one tap at each iteration. Each iteration involves matrix-vector multiplications and vector additions with matrices and two-element vectors. At each iteration, the objective function either improves or the approach terminates. The iterative refinement approach provides a range of communication performance versus implementation complexity tradeoffs for any TEQ method that fits the Rayleigh quotient framework. We apply the proposed approach to three such TEQ design methods: maximum shortening signal-to-noise ratio, minimum intersymbol interference, and minimum delay spread.
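
    A minimal numerical sketch of the idea: maximize a Rayleigh quotient J(w) = (w'Aw)/(w'Bw) with cheap matrix-vector iterations, growing the equalizer by one tap at a time. The matrices here are synthetic positive-definite stand-ins, not actual channel-shortening matrices, and the accepted-step gradient ascent is a stand-in for the paper's refinement recursion.

```python
import numpy as np

# Iterative refinement of a Rayleigh-quotient objective: start with a
# two-tap w, improve it using only matrix-vector work (no eigendecomposition),
# then grow by one near-zero tap and repeat. A and B are synthetic symmetric
# positive-definite matrices standing in for the TEQ design matrices.

rng = np.random.default_rng(1)
n = 8
M1 = rng.normal(size=(n, n)); M2 = rng.normal(size=(n, n))
A = M1 @ M1.T + n * np.eye(n)
B = M2 @ M2.T + n * np.eye(n)

def rayleigh(w, A, B):
    return (w @ A @ w) / (w @ B @ w)

def refine(w, A, B, steps=100, lr=0.1):
    """Gradient ascent on J(w); a step is kept only if it improves J."""
    for _ in range(steps):
        Aw, Bw = A @ w, B @ w
        j = (w @ Aw) / (w @ Bw)
        grad = 2.0 * (Aw - j * Bw) / (w @ Bw)
        w_try = w + lr * grad
        if rayleigh(w_try, A, B) > j:
            w = w_try / np.linalg.norm(w_try)
        else:
            lr *= 0.5                          # back off when overshooting
    return w

w = np.ones(2)
history = []                                   # J after each tap count
for taps in range(2, n + 1):
    w = refine(w, A[:taps, :taps], B[:taps, :taps])
    history.append(rayleigh(w, A[:taps, :taps], B[:taps, :taps]))
    if taps < n:
        w = np.append(w, 1e-3)                 # grow by one near-zero tap
```

    Because a longer equalizer can always reproduce a shorter one (padding with zeros), the objective is non-decreasing as taps are added, which is the tradeoff curve the paper exploits.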

  1. Operator Product Formulas in the Algebraic Approach of the Refined Topological Vertex

    International Nuclear Information System (INIS)

    Cai Li-Qiang; Wang Li-Fang; Wu Ke; Yang Jie

    2013-01-01

    The refined topological vertex of Iqbal-Kozçaz-Vafa has been investigated from the viewpoint of the quantum algebra of type W1+∞ by Awata, Feigin, and Shiraishi. They introduced the trivalent intertwining operator Φ, which is normal ordered along with some prefactors. We establish formulas for the infinite operator products of the vertex operators and their generalized versions that restore this prefactor, and we obtain an explicit formula for the vertex realization of the topological vertex as well as of the refined topological vertex.

  2. Refining revolution

    Energy Technology Data Exchange (ETDEWEB)

    Fesharaki, F.; Isaak, D.

    1984-01-01

    A review of changes in the oil refining industry since 1973 examines the drop in capacity use and its effect on profits of the Organization of Economic Cooperation and Development (OECD) countries compared to world refining. OPEC countries used their new oil revenues to expand Gulf refineries, which put additional pressure on OECD refiners. OPEC involvement in global marketing, however, could help to secure supplies. Scrapping some older OECD refineries could improve the percentage of capacity in use if new construction is kept to a minimum. Other issues facing refiners are the changes in oil demand patterns and government responses to the market. 2 tables.

  3. Comparison of geometrical isomerization of unsaturated fatty acids in selected commercially refined oils

    Directory of Open Access Journals (Sweden)

    Tasan, M.

    2011-09-01

    Full Text Available Four different commercially refined vegetable oils were analyzed by capillary gas-liquid chromatography for their trans fatty acid contents. The results obtained showed that the total trans FA contents in refined sunflower, corn, soybean, and hazelnut oils were 0.68 ± 0.41, 0.51 ± 0.24, 1.27 ± 0.57, and 0.26 ± 0.07% of total FA, respectively. The total trans FA comprised isomers of the C18:1, C18:2 and C18:3 FA. Meanwhile, five brands of the refined sunflower oil and two brands of hazelnut oil contained no measurable amounts of total trans C18:3 acids. The total trans C18:2 acid was the predominant trans FA found in the refined sunflower and corn oils, while trans polyunsaturated FAs for the refined soybean oils were found at high levels. However, total trans C18:1 acid was the major trans FA for refined hazelnut oils. The commercially refined vegetable oils with a relatively high total polyunsaturated FA contained considerable amounts of trans polyunsaturated isomers. This study indicates that it is necessary to optimize industrial deodorization, especially the time and temperature, for each different FA composition of oil used.


  4. Application of discriminative models for interactive query refinement in video retrieval

    Science.gov (United States)

    Srivastava, Amit; Khanwalkar, Saurabh; Kumar, Anoop

    2013-12-01

    The ability to quickly search large volumes of video for specific actions or events can provide a dramatic new capability to intelligence agencies. Example-based queries from video are a form of content-based information retrieval (CBIR) in which the objective is to retrieve clips from a video corpus, or stream, using a representative query sample to "find more like this". Often, the accuracy of video retrieval is largely limited by the gap between the available video descriptors and the underlying query concept, and such exemplar queries return many irrelevant results alongside the relevant ones. In this paper, we present an Interactive Query Refinement (IQR) system which acts as a powerful tool to leverage human feedback and allow intelligence analysts to iteratively refine search queries for improved precision in the retrieved results. In our approach to IQR, we leverage discriminative models that operate on high-dimensional features derived from low-level video descriptors in an iterative framework. Our IQR model solicits relevance feedback on examples selected from the region of uncertainty and updates the discriminating boundary to produce a relevance-ranked results list. We achieved a 358% relative improvement in Mean Average Precision (MAP) over the initial retrieval list at a rank cutoff of 100 over 4 iterations. We compare our discriminative IQR model to a naïve IQR and show that our model-based approach yields a 49% relative improvement over the model-free naïve system.
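
    The feedback loop described here (train a discriminative model, solicit labels from the region of uncertainty, retrain, rerank) can be sketched with synthetic data. Everything below, including the linear model and the oracle standing in for the analyst, is an illustrative assumption rather than the paper's actual system.

```python
import numpy as np

# Interactive query refinement by uncertainty sampling: retrain a linear
# relevance model on "analyst" feedback gathered where the current model is
# least certain (scores near the decision boundary). Features, relevance
# labels and the oracle are all synthetic.

rng = np.random.default_rng(42)
n, d = 500, 10
X = rng.normal(size=(n, d))                 # clip feature vectors
w_true = rng.normal(size=d)
y = (X @ w_true > 0).astype(float)          # hidden relevance judgements

def train_logreg(Xl, yl, epochs=200, lr=0.1):
    """Plain gradient-ascent logistic regression on the labeled pool."""
    w = np.zeros(Xl.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(Xl @ w)))
        w += lr * Xl.T @ (yl - p) / len(yl)
    return w

def precision_at_k(w, k=50):
    top = np.argsort(-(X @ w))[:k]          # rerank the whole corpus
    return y[top].mean()

labeled = list(rng.choice(n, size=10, replace=False))   # initial exemplars
history = []
for _ in range(4):                                      # feedback iterations
    w = train_logreg(X[labeled], y[labeled])
    history.append(precision_at_k(w))
    scores = np.abs(X @ w)                              # distance to boundary
    scores[labeled] = np.inf                            # skip already-labeled
    uncertain = np.argsort(scores)[:20]                 # region of uncertainty
    labeled.extend(uncertain)                           # oracle feedback
```

    Sampling from the region of uncertainty concentrates the analyst's limited labeling effort where it moves the decision boundary most.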

  5. A benefit/risk approach towards selecting appropriate pharmaceutical dosage forms - an application for paediatric dosage form selection.

    Science.gov (United States)

    Sam, Tom; Ernest, Terry B; Walsh, Jennifer; Williams, Julie L

    2012-10-05

    The design and selection of new pharmaceutical dosage forms involves the careful consideration and balancing of a quality target product profile against technical challenges and development feasibility. Paediatric dosage forms present particular complexity due to the diverse patient population, patient compliance challenges and safety considerations of this vulnerable population. This paper presents a structured framework for assessing the comparative benefits and risks of different pharmaceutical design options against pre-determined criteria relating to (1) efficacy, (2) safety and (3) patient access. This benefit/risk framework has then been applied to three hypothetical, but realistic, scenarios for paediatric dosage forms in order to explore its utility in guiding dosage form design and formulation selection. The approach allows a rigorous, systematic and qualitative assessment of the merits and disadvantages of each dosage form option and helps identify mitigating strategies to modify risk. The application of a weighting and scoring system to the criteria depending on the specific case could further refine the analysis and aid decision-making. In this paper, one case study is scored for illustrative purposes. However, it is acknowledged that in real development scenarios, the generation of actual data considering the very specific situation for the patient/product/developer would come into play to drive decisions on the most appropriate dosage form strategy. Copyright © 2012 Elsevier B.V. All rights reserved.

  6. Financial optimisation and risk management in refining activities

    International Nuclear Information System (INIS)

    Fiorenzani, S.

    2006-01-01

    The real options approach has become a benchmark in real assets evaluation and optimal management problems, especially in liberalised and competitive markets such as the oil and hydrocarbon markets. This paper describes how the same approach can be a useful tool for both risk management decisions and the financial optimisation problem. Refineries are black boxes, which can be used for the transformation of crude oil into more refined hydrocarbon products. These black boxes are characterised by operational flexibilities and constraints, which should be optimally managed in order to maximise the refiner's economic goals. Stochastic dynamic programming represents the right mathematical instrument employed to solve the decision-making problem in such an economic environment. (author)
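
    As a toy illustration of the stochastic-dynamic-programming viewpoint, consider a refinery that each period observes a "low" or "high" margin state and chooses to run or idle. All numbers below are invented for illustration, not market data.

```python
import numpy as np

# Backward-induction value iteration for a refinery's operational flexibility:
# the crack spread follows a two-state Markov chain and the refiner chooses
# each month between running (margin minus fixed cost) and idling (zero).

margins = np.array([2.0, 12.0])      # $/bbl margin in (low, high) states
run_cost = 5.0                       # fixed cost of running, $/bbl
P = np.array([[0.7, 0.3],            # rows: current state; cols: next state
              [0.4, 0.6]])
T = 12                               # planning horizon in months

V = np.zeros(2)                      # terminal value
policy = []                          # run/idle decision per period and state
for t in reversed(range(T)):
    cont = P @ V                     # expected continuation value
    run = margins - run_cost + cont  # payoff of running this period
    idle = cont                      # payoff of idling
    policy.append(run > idle)
    V = np.maximum(run, idle)
policy.reverse()
```

    The option to idle puts a floor under each period's payoff, so the asset's value exceeds the expected margin alone; this is the real-options premium the paper refers to.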

  7. Refining mineral oils

    Energy Technology Data Exchange (ETDEWEB)

    1946-07-05

    A process is described for refining raw oils such as mineral oils, shale oils, tar, and their fractions and derivatives, by extraction with a selected solvent, or a mixture of solvents containing water, that dissolves hydrocarbons poor in hydrogen more readily than hydrocarbons rich in hydrogen. The process is characterized by the addition of an auxiliary solvent for the water, miscible with or soluble in the water and the solvent (or the dissolving mixture), which thereby increases the solubility of the water in the solvent or the dissolving mixture.

  8. Grid refinement model in lattice Boltzmann method for stream function-vorticity formulations

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Myung Seob [Dept. of Mechanical Engineering, Dongyang Mirae University, Seoul (Korea, Republic of)

    2015-03-15

    In this study, we present a grid refinement model in the lattice Boltzmann method (LBM) for two-dimensional incompressible fluid flow; that is, the model combines the desirable features of the lattice Boltzmann method and stream function-vorticity formulations. In order to obtain an accurate result, a very fine grid (or lattice) is required near the solid boundary. Therefore, the grid refinement model is used in the lattice Boltzmann method for the stream function-vorticity formulation. This approach is more efficient in that it can obtain a solution as accurate as that of the single-block approach even though fewer lattice nodes are used for the computation. In order to validate the grid refinement approach for the stream function-vorticity formulation, numerical simulations of lid-driven cavity flows were performed, and good results were obtained.

  9. Possibilities and limitations of parametric Rietveld refinement on high pressure data. The case study of LaFeO3

    International Nuclear Information System (INIS)

    Etter, Martin; Mueller, Melanie; Dinnebier, Robert E.; Hanfland, Michael

    2014-01-01

    Parametric Rietveld refinement is a powerful technique for applying physical or empirical equations directly to the refinement of in situ powder diffraction data. In order to investigate the possibilities and limitations of parametric Rietveld refinement for high-pressure data, four competing crystallographic approaches were used to carry out a full structural investigation of the orthoferrite LaFeO3 (Pbnm at ambient conditions) under high pressure up to 47 GPa: Approach A, traditional Rietveld refinement using atomic coordinates; Approach B, Rietveld refinement using the rigid-body method; Approach C, using symmetry modes; and Approach D, using the newly developed rotational symmetry-mode description for a rigid body. For all approaches, sequential as well as parametric refinements were carried out, confirming a second-order phase transition of LaFeO3 to a higher-symmetry phase (space group Ibmm) at around 21.1 GPa and an isostructural first-order phase transition at around 38 GPa. Limitations due to non-hydrostatic conditions as well as the possibilities of directly modeling phase transitions with parametric Rietveld refinement are discussed in detail. (orig.)
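
    The spirit of parametric refinement, fitting physically meaningful parameters across a whole pressure series at once rather than point by point, can be illustrated outside a Rietveld program. The sketch below fits a second-order Birch-Murnaghan equation of state to synthetic pressure-volume data; all values are illustrative and unrelated to the LaFeO3 measurements.

```python
import numpy as np

# Parametric fit of an equation of state across a whole pressure series:
# instead of refining each pressure point independently, the two physical
# parameters V0 (zero-pressure volume) and K0 (bulk modulus) are fit to all
# points at once. Data are synthetic.

def birch_murnaghan(V, V0, K0):
    """Pressure as a function of volume, second-order Birch-Murnaghan."""
    r = (V0 / V) ** (1.0 / 3.0)
    return 1.5 * K0 * (r**7 - r**5)

V0_true, K0_true = 240.0, 180.0          # illustrative values (A^3, GPa)
rng = np.random.default_rng(3)
V = np.linspace(200.0, 238.0, 20)        # measured unit-cell volumes
P_obs = birch_murnaghan(V, V0_true, K0_true) + rng.normal(0.0, 0.1, V.size)

# Crude least-squares engine: brute-force grid search over (V0, K0),
# standing in for the refinement engine of a Rietveld program.
V0_grid = np.linspace(235.0, 245.0, 101)
K0_grid = np.linspace(160.0, 200.0, 101)
best = (np.inf, None, None)
for V0 in V0_grid:
    for K0 in K0_grid:
        sse = np.sum((birch_murnaghan(V, V0, K0) - P_obs) ** 2)
        if sse < best[0]:
            best = (sse, V0, K0)
_, V0_fit, K0_fit = best
```

    Fitting the series with two shared parameters rather than twenty independent volumes is exactly the trade the parametric approach makes: fewer, physically interpretable degrees of freedom at the cost of assuming the model equation holds across the whole range.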

  10. Refining cost-effectiveness analyses using the net benefit approach and econometric methods: an example from a trial of anti-depressant treatment.

    Science.gov (United States)

    Sabes-Figuera, Ramon; McCrone, Paul; Kendricks, Antony

    2013-04-01

    Economic evaluation analyses can be enhanced by employing regression methods, which allow the identification of important sub-groups, adjustment for imperfect randomisation in clinical trials, and analysis of non-randomised data. We explore the benefits of combining regression techniques and the standard Bayesian approach to refine cost-effectiveness analyses using data from randomised clinical trials. Data from a randomised trial of anti-depressant treatment were analysed, and a regression model was used to explore the factors that have an impact on the net benefit (NB) statistic, with the aim of using these findings to adjust the cost-effectiveness acceptability curves. Exploratory sub-sample analyses were carried out to explore possible differences in cost-effectiveness. The analysis found that having suffered a previous similar depression is strongly correlated with a lower NB, independent of the outcome measure or follow-up point. In patients with previous similar depression, adding a selective serotonin reuptake inhibitor (SSRI) to supportive care for mild-to-moderate depression is probably cost-effective at the threshold used by the English National Institute for Health and Clinical Excellence to make recommendations. This analysis highlights the need to incorporate econometric methods into cost-effectiveness analyses using the NB approach.
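
    The net-benefit regression idea can be sketched as follows: compute NB_i = lambda * effect_i - cost_i per patient, regress it on treatment and a covariate (here, previous similar depression), and turn the treatment coefficient into a cost-effectiveness acceptability curve. All data and effect sizes below are simulated assumptions, not the trial's results.

```python
import numpy as np
from math import erf, sqrt

# Net-benefit regression sketch: per-patient net benefit is regressed on
# treatment and a prognostic covariate; the treatment coefficient and its
# standard error trace out a cost-effectiveness acceptability curve (CEAC).
# Everything here is simulated.

rng = np.random.default_rng(7)
n = 400
treat = rng.integers(0, 2, n)              # 1 = SSRI added to supportive care
prev = rng.integers(0, 2, n)               # previous similar depression
effect = 0.10 + 0.05 * treat - 0.04 * prev + rng.normal(0, 0.10, n)  # QALYs
cost = 300 + 150 * treat + 100 * prev + rng.normal(0, 50, n)         # GBP

def inb_estimate(lam):
    """OLS coefficient (and s.e.) of treatment on net benefit at lambda."""
    nb = lam * effect - cost
    X = np.column_stack([np.ones(n), treat, prev])
    beta, *_ = np.linalg.lstsq(X, nb, rcond=None)
    resid = nb - X @ beta
    cov = resid.var(ddof=3) * np.linalg.inv(X.T @ X)
    return beta[1], np.sqrt(cov[1, 1])

# CEAC: probability the incremental net benefit is positive at each lambda.
lambdas = np.linspace(0.0, 30000.0, 7)
ceac = []
for lam in lambdas:
    b, se = inb_estimate(lam)
    ceac.append(0.5 * (1.0 + erf(b / (se * sqrt(2.0)))))
```

    Adding an interaction term between treatment and the covariate would give sub-group-specific CEACs, which is how the previous-depression finding in the abstract could be expressed.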

  11. Filled pause refinement based on the pronunciation probability for lecture speech.

    Directory of Open Access Journals (Sweden)

    Yan-Hua Long

    Full Text Available Nowadays, although automatic speech recognition has become quite proficient at recognizing or transcribing well-prepared fluent speech, the transcription of speech that contains many disfluencies remains problematic, as in spontaneous conversational and lecture speech. Filled pauses (FPs) are the most frequently occurring disfluencies in this type of speech. Recent studies have shown that FPs are widely believed to increase the error rates of state-of-the-art speech transcription, primarily because most FPs are not well annotated or provided in training data transcriptions, and because of the similarities in acoustic characteristics between FPs and some common non-content words. To enhance the speech transcription system, we propose a new automatic refinement approach to detect FPs in British English lecture speech transcription. This approach combines the pronunciation probabilities for each word in the dictionary and acoustic language model scores for FP refinement through a modified speech recognition forced-alignment framework. We evaluate the proposed approach on the Reith Lectures speech transcription task, in which only imperfect training transcriptions are available. Successful results are achieved for both the development and evaluation datasets. Acoustic models trained on different styles of speech genres have been investigated with respect to FP refinement. To further validate the effectiveness of the proposed approach, speech transcription performance has also been examined using systems built on training data transcriptions with and without FP refinement.

  12. Catalysts in petroleum refining and petrochemical industries 1995

    Energy Technology Data Exchange (ETDEWEB)

    Absi-Halabi, M.; Beshara, J.; Qabazard, H.; Stanislaus, A. [eds.] [Petroleum, Petrochemicals and Materials Division, Kuwait Institute of Scientific Research, Kuwait (Kuwait)

    1996-07-01

    Catalysis plays an increasingly critical role in modern petroleum refining and the basic petrochemical industries. The market demands for, and specifications of, petroleum and petrochemical products are continuously changing, and they have impacted the industry significantly over the past twenty years. Numerous new refining processes have been developed and significant improvements have been made to existing technologies. Catalysts have been instrumental in enabling the industry to meet the continuous challenges posed by the market. As we enter the 21st century, new challenges for catalysis science and technology are anticipated in almost every field; in particular, better utilization of petroleum resources and demands for cleaner transportation fuels are major items on the agenda. It is against this background that the 2nd International Conference on Catalysts in Petroleum Refining and Petrochemical Industries was organized. The papers from the conference were carefully selected from around 100 submissions. They were a mix of reviews providing an overview of selected areas, original fundamental research results, and industrial experiences. The papers in the proceedings were grouped in the following sections for quick reference: Plenary Papers; Hydroprocessing of Petroleum Residues and Distillates; Fluid Catalytic Cracking; Oxidation Catalysis; Aromatization and Polymerization Catalysis; and Catalyst Characterization and Performance. The plenary papers were mostly reviews covering important topics related to the objectives of the conference. The remaining sections cover various topics of major impact on modern petroleum refining and petrochemical industries. A large number of papers dealt with the hydroprocessing of petroleum distillates and residues, which reflects the concern over meeting future sulfur-level specifications for diesel and fuel oils.

  13. Possibilities and limitations of parametric Rietveld refinement on high pressure data. The case study of LaFeO{sub 3}

    Energy Technology Data Exchange (ETDEWEB)

    Etter, Martin; Mueller, Melanie; Dinnebier, Robert E. [Max-Planck-Institut fuer Festkoerperforschung, Stuttgart (Germany); Hanfland, Michael [European Synchrotron Radiation Facility (ESRF), Grenoble (France)

    2014-04-01

    Parametric Rietveld refinement is a powerful technique for applying physical or empirical equations directly to the refinement of in situ powder diffraction data. In order to investigate the possibilities and limitations of parametric Rietveld refinement for high pressure data, four competing crystallographic approaches were used to carry out a full structural investigation of the orthoferrite LaFeO{sub 3} (Pbnm at ambient conditions) under high pressure up to 47 GPa: Approach A, traditional Rietveld refinement using atomic coordinates; Approach B, Rietveld refinement using the rigid body method; Approach C, using symmetry modes; and Approach D, using the newly developed rotational symmetry-mode description for a rigid body. For all approaches, sequential as well as parametric refinements were carried out, confirming a second-order phase transition of LaFeO{sub 3} to a higher-symmetry phase (space group Ibmm) at around 21.1 GPa and an isostructural first-order phase transition at around 38 GPa. Limitations due to non-hydrostatic conditions as well as the possibilities of directly modeling phase transitions with parametric Rietveld refinement are discussed in detail. (orig.)
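    The essence of the parametric approach is that one physical equation is fitted across all pressure points at once, rather than refining each point independently. A minimal stand-in for the full pattern refinement, using a third-order Birch-Murnaghan equation of state on synthetic volume-pressure data (the volumes and moduli below are illustrative, not the LaFeO{sub 3} results):

    ```python
    def birch_murnaghan(v, v0, k0, k0p=4.0):
        """Third-order Birch-Murnaghan pressure at volume v (K0 in GPa)."""
        eta = (v0 / v) ** (1.0 / 3.0)
        return 1.5 * k0 * (eta**7 - eta**5) * (1.0 + 0.75 * (k0p - 4.0) * (eta**2 - 1.0))

    # Synthetic data: volumes "observed" at pressures generated from the model itself
    v0_true, k0_true = 60.0, 180.0
    volumes = [58.0, 55.0, 52.0, 50.0]
    pressures = [birch_murnaghan(v, v0_true, k0_true) for v in volumes]

    # Parametric refinement in miniature: fit one global K0 to all points at once
    # (grid search here; a real parametric Rietveld fit refines the equation's
    # parameters against the full diffraction patterns simultaneously).
    def fit_k0(volumes, pressures, v0):
        best_k0, best_err = None, float("inf")
        for k0 in [150.0 + 0.5 * i for i in range(121)]:   # 150..210 GPa
            err = sum((birch_murnaghan(v, v0, k0) - p) ** 2
                      for v, p in zip(volumes, pressures))
            if err < best_err:
                best_k0, best_err = k0, err
        return best_k0

    k0_fit = fit_k0(volumes, pressures, v0_true)
    ```

    Because every pressure point constrains the same two or three parameters, the parametric fit is far better determined than a point-by-point refinement.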

  14. Parallel Adaptive Mesh Refinement for High-Order Finite-Volume Schemes in Computational Fluid Dynamics

    Science.gov (United States)

    Schwing, Alan Michael

    For computational fluid dynamics, the governing equations are solved on a discretized domain of nodes, faces, and cells. The quality of the grid or mesh can be a driving source of error in the results. While refinement studies can help guide the creation of a mesh, grid quality is largely determined by user expertise and understanding of the flow physics. Adaptive mesh refinement is a technique for enriching the mesh during a simulation based on metrics for error, impact on important parameters, or the location of important flow features. This can offload from the user some of the difficult and ambiguous decisions necessary when discretizing the domain. This work explores the implementation of adaptive mesh refinement in an implicit, unstructured, finite-volume solver. Consideration is given to applying modern computational techniques in the presence of hanging nodes and refined cells. The approach is developed to be independent of the flow solver in order to provide a path for augmenting existing codes. It is designed to be applicable to unsteady simulations, and refinement and coarsening of the grid do not impact the conservatism of the underlying numerics. The effects on high-order numerical fluxes of fourth and sixth order are explored. Provided the criteria for refinement are appropriately selected, solutions obtained using adapted meshes have no additional error when compared to results obtained on traditional, unadapted meshes. In order to leverage the large-scale computational resources common today, the methods are parallelized using MPI. Parallel performance is considered for several test problems in order to assess the scalability of both adapted and unadapted grids. Dynamic repartitioning of the mesh during refinement is crucial for load balancing an evolving grid. Development of the methods outlined here depends on a dual-memory approach that is described in detail. Validation of the solver developed here against a number of motivating problems shows favorable
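    At its core, adaptive mesh refinement is a flag-and-split loop: mark cells where an error metric is large, then subdivide them. A hedged 1-D sketch of that loop (not the thesis solver, which works on 3-D unstructured finite-volume grids), using a solution gradient as the refinement metric:

    ```python
    def refine_flags(u, dx, threshold):
        """Flag cells whose central-difference gradient exceeds a threshold."""
        flags = [False] * len(u)
        for i in range(1, len(u) - 1):
            if abs(u[i + 1] - u[i - 1]) / (2.0 * dx) > threshold:
                flags[i] = True
        return flags

    def refine(cells, flags):
        """Split each flagged cell in two (2:1 refinement of a 1-D mesh)."""
        out = []
        for (x0, x1), flagged in zip(cells, flags):
            if flagged:
                xm = 0.5 * (x0 + x1)
                out += [(x0, xm), (xm, x1)]
            else:
                out.append((x0, x1))
        return out

    # A step profile triggers refinement only around the jump:
    cells = [(i * 0.1, (i + 1) * 0.1) for i in range(10)]
    u = [0.0] * 5 + [1.0] * 5                  # cell-centred values
    flags = refine_flags(u, dx=0.1, threshold=1.0)
    mesh = refine(cells, flags)                # 10 cells -> 12 cells
    ```

    In 2-D and 3-D the split produces hanging nodes at coarse/fine interfaces, which is where the special flux treatment discussed in the work comes in.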

  15. 《文选》在古代朝鲜半岛的传播及其价值%Studies on the value and circulation of Selections of Refined Literature in the ancient Korean peninsula

    Institute of Scientific and Technical Information of China (English)

    季南

    2012-01-01

    Selections of Refined Literature, praised as the "crown of anthologies" and the "treasury of literary works", spread to the ancient Korean peninsula along with the cultural exchange between China and Korea. Owing to changes in the political system and in literary concepts, the status of Selections of Refined Literature in the successive dynasties of the ancient Korean peninsula was unbalanced and unstable. Nevertheless, Selections of Refined Literature holds important historical, literary and academic value for the ancient Korean peninsula.

  16. Grain refinement through severe plastic deformation (SPD) processing

    International Nuclear Information System (INIS)

    Izairi, N.; Vevecka - Priftaj, A.

    2012-01-01

    There is considerable current interest in processing metallic samples through procedures involving the imposition of severe plastic deformation (SPD). These procedures lead to very significant grain refinement, to the submicrometer or even the nanometer level, resulting in advanced physical properties. Among the various SPD processes, Equal Channel Angular Pressing (ECAP), High Pressure Torsion and Accumulative Roll Bonding have been widely used for many metals and alloys. In the present work, we give an overview of the most used SPD methods for grain refinement and for the production of bulk nanostructured materials with enhanced mechanical and functional properties. In order to examine the potential for using ECAP to refine the grain size and improve the mechanical properties, two commercial alloys, 5754 Al and AA 3004, were selected for study. Processing by ECAP gives a reduction in the grain size and an increase in the microhardness. (Author)

  17. An analytical approach to elucidate the mechanism of grain refinement in calcium added Mg-Al alloys

    International Nuclear Information System (INIS)

    Nagasivamuni, B.; Ravi, K.R.

    2015-01-01

    Highlights: • Minor additions of Ca (<0.2%) refine the grain structure in Mg-(3, 6 and 9)Al alloys. • An analytical model elucidates that nucleation potency is enhanced after Ca addition. • Ternary Mg-Al-xCa growth restriction values (Q_t) are computed using Scheil equations. • Grain size predictions elucidate that nucleation events dominate grain refinement. • Growth restriction due to higher Ca additions has no significant effect on grain refinement. - Abstract: The present study investigates the grain refinement of Mg-3Al, Mg-6Al and Mg-9Al alloys by calcium addition. The maximum reduction in grain size has been observed at 0.2% Ca addition in Mg-Al alloys, beyond which any further addition (up to 0.4%) yields only marginal improvement in grain refinement. The mechanism associated with the grain refinement of Mg-Al alloys by Ca addition is discussed in terms of the growth restriction factor (Q) and constitutional undercooling (ΔT_CS) using an analytical model. The influence of the growth restriction factor (Q) on the final grain size of Ca-added Mg-Al alloys is calculated with the help of the analytical model by assuming that the number of nucleant particles is not altered by Ca addition. For accurate grain size calculations, the value of Q has been estimated with a reliable thermodynamic database using Scheil solidification simulation. The comparison of predicted and experimental grain size results indicates that constitutional-undercooling activation of nucleation events plays the dominant role in grain refinement of Mg-Al alloys by calcium addition, whereas the increase in the growth restriction value has a negligible effect.
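    The growth restriction factor used in such analyses has a standard closed form: Q = Σ m_i · c_0,i · (k_i − 1), summed over the solute elements, with m the liquidus slope, c_0 the solute content (wt%) and k the equilibrium partition coefficient. A small sketch; the Mg-Al coefficients are common literature values, while the Ca coefficients are rough placeholders rather than the paper's Scheil-derived values:

    ```python
    def growth_restriction(solutes):
        """Q = sum of m * c0 * (k - 1) over all solute elements.

        Each entry is (m, c0, k): liquidus slope in K/wt%, alloy content in wt%,
        and partition coefficient. Note m*(k-1) > 0 for a partitioning solute.
        """
        return sum(m * c0 * (k - 1.0) for m, c0, k in solutes)

    # Hypothetical Mg-3Al-0.2Ca melt: Al with (m, k) ~ (-6.87, 0.37),
    # Ca with placeholder values (-12.8, 0.06)
    q = growth_restriction([(-6.87, 3.0, 0.37), (-12.8, 0.2, 0.06)])
    ```

    Summing per-solute contributions like this is the simple dilute-limit estimate; the paper instead computes Q_t for the ternary system from Scheil solidification with a thermodynamic database.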

  18. Multilevel local refinement and multigrid methods for 3-D turbulent flow

    Energy Technology Data Exchange (ETDEWEB)

    Liao, C.; Liu, C. [UCD, Denver, CO (United States); Sung, C.H.; Huang, T.T. [David Taylor Model Basin, Bethesda, MD (United States)

    1996-12-31

    A numerical approach based on multigrid, multilevel local refinement, and preconditioning methods for solving the incompressible Reynolds-averaged Navier-Stokes equations is presented. 3-D turbulent flow around an underwater vehicle is computed. 3 multigrid levels and 2 local refinement grid levels are used. The global grid is 24 x 8 x 12. The first patch is 40 x 16 x 20 and the second patch is 72 x 32 x 36. 4th-order artificial dissipation is used for numerical stability. The conservative artificial compressibility method is used for further improvement of convergence. To improve the accuracy at the coarse/fine grid interface of the local refinement, a flux interpolation method for the refined grid boundary is used. The numerical results are in good agreement with experimental data. The local refinement can improve the prediction accuracy significantly. The flux interpolation method for local refinement keeps conservation on a composite grid, thereby further improving the prediction accuracy.
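    The multigrid ingredient — smooth the error on the fine grid, then correct its low-frequency part from a coarser grid — can be sketched on a 1-D Poisson model problem. This toy two-grid cycle is, of course, only a stand-in for the paper's 3-D turbulent-flow solver:

    ```python
    import numpy as np

    def jacobi(u, f, h, sweeps, w=2.0 / 3.0):
        """Damped Jacobi sweeps for -u'' = f with homogeneous Dirichlet BCs."""
        for _ in range(sweeps):
            u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
        return u

    def two_grid_cycle(u, f, h):
        """One two-grid cycle: pre-smooth, coarse-grid correction, post-smooth."""
        u = jacobi(u, f, h, sweeps=3)
        r = np.zeros_like(u)
        r[1:-1] = f[1:-1] + (u[:-2] - 2 * u[1:-1] + u[2:]) / h**2   # residual
        rc = r[::2].copy()                       # restrict by injection
        ec = jacobi(np.zeros_like(rc), rc, 2 * h, sweeps=200)   # near-exact coarse solve
        e = np.zeros_like(u)
        e[::2] = ec                              # prolong: copy coarse values ...
        e[1:-1:2] = 0.5 * (e[:-2:2] + e[2::2])   # ... and interpolate in between
        u += e                                   # coarse-grid correction
        return jacobi(u, f, h, sweeps=3)

    # Model problem: -u'' = pi^2 sin(pi x) on [0, 1], exact solution sin(pi x)
    n = 16
    x = np.linspace(0.0, 1.0, n + 1)
    h = 1.0 / n
    f = np.pi**2 * np.sin(np.pi * x)
    u = np.zeros(n + 1)
    for _ in range(25):
        u = two_grid_cycle(u, f, h)
    err = np.max(np.abs(u - np.sin(np.pi * x)))
    ```

    Recursing on the coarse solve instead of iterating it to convergence turns this two-grid cycle into a multigrid V-cycle; local refinement patches, as in the paper, add finer grids only where the flow demands them.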

  19. Protein structure modeling and refinement by global optimization in CASP12.

    Science.gov (United States)

    Hong, Seung Hwan; Joung, InSuk; Flores-Canales, Jose C; Manavalan, Balachandran; Cheng, Qianyi; Heo, Seungryong; Kim, Jong Yun; Lee, Sun Young; Nam, Mikyung; Joo, Keehyoung; Lee, In-Ho; Lee, Sung Jong; Lee, Jooyoung

    2018-03-01

    For protein structure modeling in the CASP12 experiment, we have developed a new protocol based on our previous CASP11 approach. The global optimization method of conformational space annealing (CSA) was applied to 3 stages of modeling: multiple sequence-structure alignment, three-dimensional (3D) chain building, and side-chain re-modeling. For better template selection and model selection, we updated our model quality assessment (QA) method with the newly developed SVMQA (support vector machine for quality assessment). For 3D chain building, we updated our energy function by including restraints generated from predicted residue-residue contacts. New energy terms for the predicted secondary structure and predicted solvent accessible surface area were also introduced. For difficult targets, we proposed a new method, LEEab, where the template term played a less significant role than it did in LEE, complemented by increased contributions from other terms such as the predicted contact term. For TBM (template-based modeling) targets, LEE performed better than LEEab, but for FM targets, LEEab was better. For model refinement, we modified our CASP11 molecular dynamics (MD) based protocol by using explicit solvents and tuning down restraint weights. Refinement results from MD simulations that used a new augmented statistical energy term in the force field were quite promising. Finally, when using inaccurate information (such as the predicted contacts), it was important to use the Lorentzian function for which the maximal penalty arising from wrong information is always bounded. © 2017 Wiley Periodicals, Inc.
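    The closing observation about inaccurate restraints is worth making concrete: a harmonic penalty grows without bound, so one wrong predicted contact can dominate the energy, whereas a Lorentzian-type penalty saturates. A minimal sketch; the functional form and parameters are generic illustrations, not the authors' exact force-field term:

    ```python
    def harmonic(delta, weight=1.0):
        """Unbounded penalty: a single wrong restraint can dominate the energy."""
        return weight * delta**2

    def lorentzian(delta, weight=1.0, width=1.0):
        """Bounded penalty: approaches `weight` as the violation delta grows,
        so the maximal damage from wrong information is always capped."""
        return weight * delta**2 / (delta**2 + width**2)
    ```

    Both penalties agree for small violations, but for a restraint violated by a large margin the Lorentzian contribution stays below `weight`, which is exactly the bounded-penalty property the abstract argues for.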

  20. Assume-Guarantee Abstraction Refinement Meets Hybrid Systems

    Science.gov (United States)

    Bogomolov, Sergiy; Frehse, Goran; Greitschus, Marius; Grosu, Radu; Pasareanu, Corina S.; Podelski, Andreas; Strump, Thomas

    2014-01-01

    Compositional verification techniques in the assume-guarantee style have been successfully applied to transition systems to efficiently reduce the search space by leveraging the compositional nature of the systems under consideration. We adapt these techniques to the domain of hybrid systems with affine dynamics. To build assumptions we introduce an abstraction based on location merging. We integrate the assume-guarantee style analysis with automatic abstraction refinement. We have implemented our approach in the symbolic hybrid model checker SpaceEx. The evaluation shows its practical potential. To the best of our knowledge, this is the first work combining assume-guarantee reasoning with automatic abstraction refinement in the context of hybrid automata.

  1. A unified conformational selection and induced fit approach to protein-peptide docking.

    Directory of Open Access Journals (Sweden)

    Mikael Trellet

    Full Text Available Protein-peptide interactions are vital for the cell. They mediate, inhibit or serve as structural components in nearly 40% of all macromolecular interactions, and are often associated with diseases, making them interesting leads for protein drug design. In recent years, large-scale technologies have enabled exhaustive studies on the peptide recognition preferences for a number of peptide-binding domain families. Yet, the paucity of data regarding their molecular binding mechanisms together with their inherent flexibility makes the structural prediction of protein-peptide interactions very challenging. This leaves flexible docking as one of the few amenable computational techniques to model these complexes. We present here an ensemble, flexible protein-peptide docking protocol that combines conformational selection and induced fit mechanisms. Starting from an ensemble of three peptide conformations (extended, α-helix, polyproline-II), flexible docking with HADDOCK generates 79.4% high-quality models for bound/unbound and 69.4% for unbound/unbound docking when tested against the largest protein-peptide complex benchmark dataset available to date. Conformational selection at the rigid-body docking stage successfully recovers the most relevant conformation for a given protein-peptide complex, and the subsequent flexible refinement further improves the interface by up to 4.5 Å interface RMSD. Cluster-based scoring of the models results in a selection of near-native solutions in the top three for ∼75% of the successfully predicted cases. This unified conformational selection and induced fit approach to protein-peptide docking should open the route to the modeling of challenging systems such as disorder-order transitions taking place upon binding, significantly expanding the applicability limit of biomolecular interaction modeling by docking.

  2. Region-of-interest volumetric visual hull refinement

    KAUST Repository

    Knoblauch, Daniel

    2010-01-01

    This paper introduces a region-of-interest visual hull refinement technique based on flexible voxel grids for volumetric visual hull reconstructions. Region-of-interest refinement is a multipass process, beginning with a focused visual hull reconstruction, resulting in a first 3D approximation of the target, followed by a region-of-interest estimation, tasked with identifying features of interest, which in turn are used to locally refine the voxel grid and extract a higher-resolution surface representation for those regions. This approach is illustrated for the reconstruction of avatars for use in tele-immersion environments, where the head and hand regions are of higher interest. To allow reproducibility and direct comparison, a publicly available data set for human visual hull reconstruction is used. This paper shows that region-of-interest reconstruction of the target is faster than, and visually comparable to, higher-resolution focused visual hull reconstructions. The approach reduces the amount of data generated by the reconstruction, allowing faster post-processing, such as rendering or networking of the surface voxels. Reconstruction speeds support smooth interactions between the avatar and the virtual environment, while the improved resolution of the facial region and hands creates a higher degree of immersion and potentially impacts the perception of body language, facial expressions and eye-to-eye contact. Copyright © 2010 by the Association for Computing Machinery, Inc.
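    The two-stage idea — carve a coarse visual hull, then refine only the region of interest — can be sketched in 2-D with axis-aligned "cameras". This is a deliberately toy setup: real systems project voxels into calibrated 2-D silhouette images, and the grid, silhouettes and ROI below are invented for illustration.

    ```python
    def visual_hull(cells, projections):
        """Keep a cell only if its centre falls inside every silhouette.
        projections: (project, inside) pairs, one per camera."""
        return [c for c in cells
                if all(inside(project(c)) for project, inside in projections)]

    def subdivide(cell, factor=2):
        """Split one cell (x0, y0, x1, y1) into factor x factor sub-cells."""
        x0, y0, x1, y1 = cell
        dx, dy = (x1 - x0) / factor, (y1 - y0) / factor
        return [(x0 + i * dx, y0 + j * dy, x0 + (i + 1) * dx, y0 + (j + 1) * dy)
                for i in range(factor) for j in range(factor)]

    def roi_refine(cells, in_roi):
        """Refine only the cells inside the region of interest."""
        out = []
        for c in cells:
            out += subdivide(c) if in_roi(c) else [c]
        return out

    # 10x10 grid over [0,1]^2; two orthographic "cameras" along the axes
    cells = [(i * 0.1, j * 0.1, (i + 1) * 0.1, (j + 1) * 0.1)
             for i in range(10) for j in range(10)]
    projections = [(lambda c: (c[0] + c[2]) / 2, lambda x: 0.2 < x < 0.8),
                   (lambda c: (c[1] + c[3]) / 2, lambda y: 0.2 < y < 0.8)]
    hull = visual_hull(cells, projections)
    # Pretend the "head region" is the upper part of the hull:
    refined = roi_refine(hull, in_roi=lambda c: (c[1] + c[3]) / 2 > 0.6)
    ```

    Only the cells in the region of interest are subdivided, so the refined representation stays compact everywhere else, which is the data-reduction point the paper makes.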

  3. Local Refinement of the Super Element Model of Oil Reservoir

    Directory of Open Access Journals (Sweden)

    A.B. Mazo

    2017-12-01

    Full Text Available In this paper, we propose a two-stage method for petroleum reservoir simulation. The method uses two models with different degrees of detail to describe hydrodynamic processes of different space-time scales. At the first stage, the global dynamics of the energy state of the deposit and its reserves is modeled (the characteristic scale of such changes is km/year). The two-phase flow equations in the model of global dynamics operate with smooth averaged pressure and saturation fields, and they are solved numerically on a large computational grid of super-elements with a characteristic cell size of 200-500 m. The tensor coefficients of the super-element model are calculated using special procedures of upscaling of absolute and relative phase permeabilities. At the second stage, a local refinement of the super-element model is constructed for calculating small-scale processes (with a scale of m/day), which take place, for example, during various geological and technical measures aimed at increasing the oil recovery of a reservoir. We then solve the two-phase flow problem in the selected area affected by the measure on a detailed three-dimensional grid, which resolves the geological structure of the reservoir, and with a time step sufficient for describing fast-flowing processes. The initial and boundary conditions of the local problem are formulated on the basis of the super-element solution. This approach allows us to reduce the computational costs of solving the problems of designing and monitoring the oil reservoir. To demonstrate the proposed approach, we give an example of the two-stage modeling of the development of a layered reservoir with a local refinement of the model during the isolation of a water-saturated high-permeability interlayer. We show a good compliance between the locally refined solution of the super-element model in the area of measure exposure and the results of numerical modeling of the whole history of reservoir

  4. “New” Antigenic Targets and Methodological Approaches for Refining Laboratory Diagnosis of Antiphospholipid Syndrome

    Science.gov (United States)

    Misasi, Roberta; Capozzi, Antonella; Longo, Agostina; Recalchi, Serena; Lococo, Emanuela; Alessandri, Cristiano; Conti, Fabrizio; Valesini, Guido

    2015-01-01

    Antiphospholipid antibodies (aPLs) are a heterogeneous group of antibodies directed against phospholipids or protein/phospholipid complexes. Currently, aPLs are assessed using either “solid-phase” assays that identify anticardiolipin antibodies and anti-β2-glycoprotein I antibodies or a “liquid-phase” assay that identifies lupus anticoagulant. However, in the last few years, “new” antigenic targets and methodological approaches have been employed for refining laboratory diagnosis of antiphospholipid syndrome (APS). In this review the potential diagnostic value of antibodies to domains of β2-GPI, prothrombin/phosphatidylserine, vimentin/cardiolipin, protein S, protein C, annexin A2, annexin A5, and phospholipid antigens is discussed. Moreover, new technical approaches, including chemiluminescence, multiline dot assay, and thin layer chromatography (TLC) immunostaining, which utilize different supports for detection of aPL, have been developed. A special focus has been dedicated to “seronegative” APS, that is, those patients with a clinical profile suggestive of APS (thromboses, recurrent miscarriages, or foetal loss), who are persistently negative for the routinely used aPL. Recent findings suggest that, in sera from patients with SN-APS, antibodies may be detected using “new” antigenic targets (mainly vimentin/cardiolipin) or methodological approaches different from traditional techniques (TLC immunostaining). Thus, APS represents a mosaic, in which antibodies against different antigenic targets may be detected thanks to the continuously evolving new technologies. PMID:25874238

  5. A new approach to grain refinement of an Mg-Li-Al cast alloy

    International Nuclear Information System (INIS)

    Jiang, B.; Qiu, D.; Zhang, M.-X.; Ding, P.D.; Gao, L.

    2010-01-01

    Crystallographic calculation based on the edge-to-edge matching model predicted that both TiB2 and Al3Ti intermetallic compounds have strong potential to be effective grain refiners for β phase in the Mg-14Li-1Al alloy due to the small atomic matching misfit across the interface between the compounds and β phase. Experimental results showed that addition of 1.25 wt% Al-5Ti-1B master alloy reduced grain size of β phase in the alloy from 1750 to 500 μm. The possible grain refining mechanisms were also discussed.
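    The edge-to-edge matching criterion boils down to a misfit percentage along matched atom rows: f = |d_matrix − d_particle| / d_particle × 100%, with small values (below roughly 10%) indicating a potentially potent nucleant. The spacings below are illustrative placeholders, not the paper's crystallographic data:

    ```python
    def misfit(d_matrix, d_particle):
        """Interatomic spacing misfit (%) between matched close-packed atom rows."""
        return abs(d_matrix - d_particle) / d_particle * 100.0

    # Hypothetical spacings (nm) for a matrix row and a nucleant-particle row:
    f = misfit(0.321, 0.303)   # ~5.9%, under the ~10% rule-of-thumb threshold
    ```

    In the full model the same comparison is made for interplanar spacings of the matching plane pairs, and only particles that pass both checks are predicted to be effective refiners.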

  6. Effect of grain refinement on the microstructure and tensile properties of thin 319 Al castings

    International Nuclear Information System (INIS)

    Shabani, M.J.; Emamy, M.; Nemati, N.

    2011-01-01

    The structural examinations and tensile properties of thin-section Al castings (319 Al alloy) have been investigated by applying a pattern with different cross sections (2-12 mm). Al-5Ti-1B and Al-5Zr grain refiners were added to the molten Al alloy to produce different levels of Ti (0.01%, 0.05%, 0.1% and 0.15%) and Zr (0.05%, 0.1%, 0.2%, 0.3%, 0.4% and 0.5%) in the castings. From macrostructural studies, it was found that Al-5Zr is less effective in grain refining of 319 alloy in comparison with Al-5Ti-1B master alloy. The optimum levels of grain refiners were selected for determination of tensile properties. T6 heat treatment was applied for selected specimens before tensile testing. Further structural results also showed that thinner sections are less affected by grain refiners. This observation was found to be in a good agreement with tensile test results, where tensile properties of the base and grain refined alloys did not show considerable differences in thinner sections (<6 mm).

  7. Local multigrid mesh refinement in view of nuclear fuel 3D modelling in pressurised water reactors

    International Nuclear Information System (INIS)

    Barbie, L.

    2013-01-01

    The aim of this study is to improve the performance, in terms of memory space and computational time, of the current modelling of Pellet-Cladding mechanical Interaction (PCI), a complex phenomenon which may occur during high power rises in pressurised water reactors. Among mesh refinement methods - methods dedicated to treating local singularities efficiently - a local multigrid approach was selected because it enables the use of a black-box solver while handling few degrees of freedom at each level. The Local Defect Correction (LDC) method, well suited to a finite element discretization, was first analysed and checked in linear elasticity, on configurations resulting from PCI, since its use in solid mechanics is not widespread. Various strategies concerning the implementation of the multilevel algorithm were also compared. Coupling the LDC method with the Zienkiewicz-Zhu a posteriori error estimator, in order to automatically detect the zones to be refined, was then tested. The performance obtained on two-dimensional and three-dimensional cases is very satisfactory, since the proposed algorithm is more efficient than h-adaptive refinement methods. Lastly, the LDC algorithm was extended to nonlinear mechanics. Space/time refinement as well as transmission of the initial conditions during the re-meshing step were examined. The first results obtained are encouraging and show the interest of using the LDC method for PCI modelling. (author) [fr

  8. Refining of fossil resin flotation concentrate from western coal. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, G.F.; Miller, J.D.

    1995-02-16

    During the past several years, significant research efforts have been made to develop process technology for the selective flotation of fossil resin from western coals. As a result of these efforts, several new flotation technologies have been developed. Operation of a proof-of-concept continuous flotation circuit showed the selective flotation process to be sufficiently profitable to justify the development of a fossil resin industry. However, little attention has been given to the refining of the fossil resin flotation concentrate, although solvent refining is a critical step for the fossil resin to become a marketable product. In view of this situation, DOE funded this two-year project to evaluate the following aspects of fossil resin refining technology: 1) Characterization of the fossil resin flotation concentrate and its refined products; 2) Kinetics of fossil resin extraction; 3) Effects of operating variables on solvent extraction; 4) Extraction solvents; 5) Proof-of-concept continuous refining tests; and 6) Technical and economic analysis. The results from this research effort have led to the following conclusions: Hexane- or heptane-refined fossil resin has a light-yellow color, a melting point of 140 - 142{degrees}C, a density of 1.034 gram/cm{sup 3}, and good solubility in nonpolar solvents. Among the four solvents evaluated (hexane, heptane, toluene and ethyl acetate), hexane is the most appropriate solvent based on overall technical and economic considerations. Batch extraction tests and kinetic studies suggest that the main interaction between the resin and the solvent is expected to be the forces associated with solvation phenomena. Temperature has the most significant effect on extraction rate. With hexane as the solvent, a recovery of 90% can be achieved at 50{degrees}C and 10% solids concentration with moderate agitation for 1 hour.

  9. Flanking region sequence information to refine microRNA target ...

    Indian Academy of Sciences (India)

    Prakash

    (SVM)-based target prediction refinement approach has been introduced through .... are kernel-based statistical learning machines, where a discriminant ...... Cox T and Cuff J 2002 The Ensembl genome database project;. Nucleic Acids Res.

  10. Petroleum refining industry in China

    International Nuclear Information System (INIS)

    Walls, W.D.

    2010-01-01

    The oil refining industry in China has faced rapid growth in oil imports of increasingly sour grades of crude with which to satisfy growing domestic demand for a slate of lighter and cleaner finished products sold at subsidized prices. At the same time, the world petroleum refining industry has been moving from one that serves primarily local and regional markets to one that serves global markets for finished products, as world refining capacity utilization has increased. Globally, refined product markets are likely to experience continued globalization until refining investments significantly expand capacity in key demand regions. We survey the oil refining industry in China in the context of the world market for heterogeneous crude oils and growing world trade in refined petroleum products. (author)

  11. A systematic review of COTS evaluation and selection approaches

    Directory of Open Access Journals (Sweden)

    Rakesh Garg

    2017-11-01

    Full Text Available In the past decades, a number of researchers have made significant contributions to developing different approaches for solving the very challenging problem of commercial off-the-shelf (COTS) selection. The development of software with high quality and minimum development time has always been a difficult job for software developers. Therefore, in today's scenario, software developers are moving towards the implementation of component-based software engineering, which relies on the integration of small pieces of code, namely COTS components. In this study, we present a comprehensive descriptive explanation of the various COTS evaluation and selection approaches developed by various researchers in the past, to aid understanding of the concept of COTS selection. The advantages and disadvantages of each COTS selection approach are also provided, which will give readers a better perspective on the various existing COTS evaluation and selection approaches.
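    Many of the surveyed approaches ultimately reduce to scoring candidates against weighted criteria (weighted-sum models, AHP and similar multi-criteria methods). A minimal weighted-sum sketch, with invented criteria, scores and weights:

    ```python
    def weighted_score(scores, weights):
        """Weighted-sum evaluation of one COTS candidate (weights sum to 1)."""
        return sum(s * w for s, w in zip(scores, weights))

    # Hypothetical candidates scored 0..1 on functionality, cost fit, vendor support
    candidates = {"A": [0.8, 0.6, 0.9], "B": [0.7, 0.9, 0.6]}
    weights = [0.5, 0.3, 0.2]
    best = max(candidates, key=lambda c: weighted_score(candidates[c], weights))
    ```

    Most of the differences between published approaches lie in how the criteria and weights are elicited and validated, not in this final aggregation step.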

  12. On the refinement calculus

    CERN Document Server

    Vickers, Trevor

    1992-01-01

    On the Refinement Calculus gives one view of the development of the refinement calculus and its attempt to bring together - among other things - Z specifications and Dijkstra's programming language. It is an excellent source of reference material for all those seeking the background and mathematical underpinnings of the refinement calculus.

  13. Effect of Duration on Ti Grain Refinement of A356 and Melt Quality

    Science.gov (United States)

    Gürsoy, Özen; Erzi, Eray; Yüksel, Çağlar; Dispinar, Derya

    Grain refinement of aluminium alloys increases fluidity and feedability; thus, higher mechanical properties and decreased porosity are achieved. Typically, various ratios of Ti-B are used as grain refiners. It is well known that, due to sedimentation, the effectiveness of the grain refinement decreases over time, which is called the fading effect. In this work, this effect has been investigated by means of melt quality. Two different melting temperatures were selected (725 and 750 °C) and samples were cast into die and sand moulds. After the addition of grain refiners, samples were collected at 10-minute intervals. Metallographic examinations were carried out in which microstructural change and porosity distribution were investigated. The results were correlated with the bifilm index (i.e. melt quality).

  14. North American refining

    International Nuclear Information System (INIS)

    Osten, James; Haltmaier, Susan

    2000-01-01

    This article examines the current status of the North American refining industry, and considers the North American economy and the growth in demand in the petroleum industry, petroleum product demand and quality, crude oil upgrading to meet product standards, and changes in crude oil feedstocks such as the use of heavier crudes and bitumens. Refining expansion, the declining profits in refining, and changes due to environmental standards are discussed. The Gross Domestic Product and oil demand for the USA, Canada, Mexico, and Venezuela for the years 1995-2020 are tabulated

  15. Refining SCJ Mission Specifications into Parallel Handler Designs

    Directory of Open Access Journals (Sweden)

    Frank Zeyda

    2013-05-01

    Safety-Critical Java (SCJ) is a recent technology that restricts the execution and memory model of Java in such a way that applications can be statically analysed and certified for their real-time properties and safe use of memory. Our interest is in the development of comprehensive and sound techniques for the formal specification, refinement, design, and implementation of SCJ programs, using a correct-by-construction approach. As part of this work, we present here an account of laws and patterns that are of general use for the refinement of SCJ mission specifications into designs of parallel handlers used in the SCJ programming paradigm. Our notation is a combination of languages from the Circus family, supporting state-rich reactive models with the addition of class objects and real-time properties. Our work is a first step to elicit laws of programming for SCJ and fits into a refinement strategy that we have developed previously to derive SCJ programs.

  16. Survey for service selection approaches in dynamic environments

    CSIR Research Space (South Africa)

    Manqele, Lindelweyizizwe S

    2017-09-01

    The usage of the service selection approaches across different dynamic service provisioning environments has increased the challenges associated with an effective method that can be used to select a relevant service. The use of service selection...

  17. Current research progress in grain refinement of cast magnesium alloys: A review article

    International Nuclear Information System (INIS)

    Ali, Yahia; Qiu, Dong; Jiang, Bin; Pan, Fusheng; Zhang, Ming-Xing

    2015-01-01

    Grain refinement of cast magnesium alloys, particularly in magnesium–aluminium (Mg–Al) based alloys, has been an active research topic in the past two decades, because it has been considered as one of the most effective approaches to simultaneously increase the strength, ductility and formability. The development of new grain refiners was normally based on the theories/models that were established through comprehensive and considerable studies of grain refinement in cast Al alloys. Generally, grain refinement in cast Al can be achieved through either inoculation treatment, which is a process of adding, or in situ forming, foreign particles to promote heterogeneous nucleation rate, or restricting grain growth by controlling the constitutional supercooling or both. But, the concrete and tangible grain refinement mechanism in cast metals is still not fully understood and there are a number of controversies. Therefore, most of the new developed grain refiners for Mg–Al based alloys are not as efficient as the commercially available ones, such as zirconium in non-Al containing Mg alloys. To facilitate the research in grain refinement of cast magnesium alloys, this review starts with highlighting the theoretical aspects of grain refinement in cast metals, followed by reviewing the latest research progress in grain refinement of magnesium alloys in terms of the solute effect and potent nucleants

  18. Current research progress in grain refinement of cast magnesium alloys: A review article

    Energy Technology Data Exchange (ETDEWEB)

    Ali, Yahia; Qiu, Dong [School of Mechanical and Mining Engineering, University of Queensland, St Lucia, QLD 4072 (Australia); Jiang, Bin; Pan, Fusheng [College of Materials Science and Engineering, Chongqing University, Chongqing 400030 (China); Zhang, Ming-Xing, E-mail: Mingxing.Zhang@uq.edu.au [School of Mechanical and Mining Engineering, University of Queensland, St Lucia, QLD 4072 (Australia)

    2015-01-15

    Grain refinement of cast magnesium alloys, particularly in magnesium–aluminium (Mg–Al) based alloys, has been an active research topic in the past two decades, because it has been considered as one of the most effective approaches to simultaneously increase the strength, ductility and formability. The development of new grain refiners was normally based on the theories/models that were established through comprehensive and considerable studies of grain refinement in cast Al alloys. Generally, grain refinement in cast Al can be achieved through either inoculation treatment, which is a process of adding, or in situ forming, foreign particles to promote heterogeneous nucleation rate, or restricting grain growth by controlling the constitutional supercooling or both. But, the concrete and tangible grain refinement mechanism in cast metals is still not fully understood and there are a number of controversies. Therefore, most of the new developed grain refiners for Mg–Al based alloys are not as efficient as the commercially available ones, such as zirconium in non-Al containing Mg alloys. To facilitate the research in grain refinement of cast magnesium alloys, this review starts with highlighting the theoretical aspects of grain refinement in cast metals, followed by reviewing the latest research progress in grain refinement of magnesium alloys in terms of the solute effect and potent nucleants.

  19. Refinement of Parallel and Reactive Programs

    OpenAIRE

    Back, R. J. R.

    1992-01-01

    We show how to apply the refinement calculus to stepwise refinement of parallel and reactive programs. We use action systems as our basic program model. Action systems are sequential programs which can be implemented in a parallel fashion. Hence refinement calculus methods, originally developed for sequential programs, carry over to the derivation of parallel programs. Refinement of reactive programs is handled by data refinement techniques originally developed for the sequential refinement c...

  20. Creating value in refining

    International Nuclear Information System (INIS)

    Cobb, C.B.

    2001-01-01

    This article focuses on recent developments in the US refining industry and presents a model for improving the performance of refineries based on the analysis of the refining industry by Cap Gemini Ernst and Young. The identification of refineries at risk of failing, the construction of pipelines for refinery products from Gulf State refineries, mergers and acquisitions, and poor financial performance are discussed. Current challenges concerning stagnant demand for refinery products, environmental regulations, and shareholder value are highlighted. The structure of the industry, the creation of value in refining, and the search for business models are examined. The top 25 US companies and US refining business groups are listed

  1. The effects of predictor method factors on selection outcomes: A modular approach to personnel selection procedures.

    Science.gov (United States)

    Lievens, Filip; Sackett, Paul R

    2017-01-01

    Past reviews and meta-analyses typically conceptualized and examined selection procedures as holistic entities. We draw on the product design literature to propose a modular approach as a complementary perspective to conceptualizing selection procedures. A modular approach means that a product is broken down into its key underlying components. Therefore, we start by presenting a modular framework that identifies the important measurement components of selection procedures. Next, we adopt this modular lens for reviewing the available evidence regarding each of these components in terms of affecting validity, subgroup differences, and applicant perceptions, as well as for identifying new research directions. As a complement to the historical focus on holistic selection procedures, we posit that the theoretical contributions of a modular approach include improved insight into the isolated workings of the different components underlying selection procedures and greater theoretical connectivity among different selection procedures and their literatures. We also outline how organizations can put a modular approach into operation to increase the variety in selection procedures and to enhance the flexibility in designing them. Overall, we believe that a modular perspective on selection procedures will provide the impetus for programmatic and theory-driven research on the different measurement components of selection procedures. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  2. Automata Learning through Counterexample Guided Abstraction Refinement

    DEFF Research Database (Denmark)

    Aarts, Fides; Heidarian, Faranak; Kuppens, Harco

    2012-01-01

    Abstraction is the key when learning behavioral models of realistic systems. Hence, in most practical applications where automata learning is used to construct models of software components, researchers manually define abstractions which, depending on the history, map a large set of concrete events to a small set of abstract events that can be handled by automata learning tools. In this article, we show how such abstractions can be constructed fully automatically for a restricted class of extended finite state machines in which one can test for equality of data parameters, but no operations on data are allowed. Our approach uses counterexample-guided abstraction refinement: whenever the current abstraction is too coarse and induces nondeterministic behavior, the abstraction is refined automatically. Using Tomte, a prototype tool implementing our algorithm, we have succeeded to learn – fully...
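
    The abstraction-refinement loop described in this record can be sketched in miniature. The toy system, trace format, and refinement predicate below are illustrative assumptions, not Tomte's actual interface: whenever two concrete events that map to the same abstract event produce different outputs, the abstraction is too coarse and a distinguishing predicate is added.

```python
# Illustrative sketch of counterexample-guided abstraction refinement (CEGAR)
# for automata learning. The toy system and refinement predicate are
# assumptions for illustration only -- this is not Tomte's interface.

def run_system(trace):
    """Toy system under test: outputs 'OK' when an input equals the first
    input of the trace, else 'NOK'. (Equality tests on data parameters only,
    no operations on data, as in the restricted class considered above.)"""
    first = trace[0]
    return ['OK' if x == first else 'NOK' for x in trace]

def abstract(trace, predicates):
    """Map each concrete event to an abstract event via the current predicates."""
    return [tuple(p(trace, i) for p in predicates) for i in range(len(trace))]

def learn_with_cegar(traces):
    """Refine the abstraction until it no longer induces nondeterminism:
    the same abstract event must always yield the same output."""
    predicates = []                       # coarsest abstraction: no predicates
    while True:
        table, counterexample = {}, None
        for t in traces:
            for abs_evt, out in zip(abstract(t, predicates), run_system(t)):
                if table.setdefault(abs_evt, out) != out:
                    counterexample = t    # nondeterminism: abstraction too coarse
                    break
            if counterexample is not None:
                break
        if counterexample is None:
            return predicates, table      # abstraction is deterministic: done
        # Refine: add a predicate separating the conflicting concrete events.
        # (A real tool derives this predicate from the counterexample.)
        predicates.append(lambda t, i: t[i] == t[0])

preds, table = learn_with_cegar([[1, 1, 2], [5, 3, 5]])
```

    One refinement step suffices here: the learned abstraction distinguishes events equal to the first input of a trace from all others.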

  3. Categorization and selection of regulatory approaches for nuclear power plants

    International Nuclear Information System (INIS)

    Sugaya, Junko; Harayama, Yuko

    2009-01-01

    Several new regulatory approaches have been introduced to Japanese nuclear safety regulations, in which a prescriptive and deterministic approach had traditionally predominated. However, the options of regulatory approaches that can possibly be applied to nuclear safety regulations as well as the methodology for selecting the options are not systematically defined. In this study, various regulatory approaches for nuclear power plants are categorized as prescriptive or nonprescriptive, outcome-based or process-based, and deterministic or risk-informed. 18 options of regulatory approaches are conceptually developed and the conditions for selecting the appropriate regulatory approaches are identified. Current issues on nuclear regulations regarding responsibilities, transparency, consensus standards and regulatory inspections are examined from the viewpoints of regulatory approaches to verify usefulness of the categorization and selection concept of regulatory approaches. Finally, some of the challenges at the transitional phase of regulatory approaches are discussed. (author)

  4. Macromolecular refinement by model morphing using non-atomic parameterizations.

    Science.gov (United States)

    Cowtan, Kevin; Agirre, Jon

    2018-02-01

    Refinement is a critical step in the determination of a model which explains the crystallographic observations and thus best accounts for the missing phase components. The scattering density is usually described in terms of atomic parameters; however, in macromolecular crystallography the resolution of the data is generally insufficient to determine the values of these parameters for individual atoms. Stereochemical and geometric restraints are used to provide additional information, but produce interrelationships between parameters which slow convergence, resulting in longer refinement times. An alternative approach is proposed in which parameters are not attached to atoms, but to regions of the electron-density map. These parameters can move the density or change the local temperature factor to better explain the structure factors. Varying the size of the region which determines the parameters at a particular position in the map allows the method to be applied at different resolutions without the use of restraints. Potential applications include initial refinement of molecular-replacement models with domain motions, and potentially the use of electron density from other sources such as electron cryo-microscopy (cryo-EM) as the refinement model.

  5. Southeast Asian oil markets and refining

    Energy Technology Data Exchange (ETDEWEB)

    Yamaguchi, N.D. [FACTS, Inc., Honolulu, Hawaii (United States)

    1999-09-01

    An overview of the Southeast Asian oil markets and refining is presented concentrating on Brunei, Malaysia, the Philippines, Singapore and Thailand refiners. Key statistics of the refiners in this region are tabulated. The demand and the quality of Indonesian, Malaysian, Philippine, Singapore and Thai petroleum products are analysed. Crude distillation unit capacity trends in the Southeastern Asian refining industry are discussed along with cracking to distillation ratios, refining in these countries, and the impact of changes in demand and refining on the product trade.

  6. Southeast Asian oil markets and refining

    International Nuclear Information System (INIS)

    Yamaguchi, N.D.

    1999-01-01

    An overview of the Southeast Asian oil markets and refining is presented concentrating on Brunei, Malaysia, the Philippines, Singapore and Thailand refiners. Key statistics of the refiners in this region are tabulated. The demand and the quality of Indonesian, Malaysian, Philippine, Singapore and Thai petroleum products are analysed. Crude distillation unit capacity trends in the Southeastern Asian refining industry are discussed along with cracking to distillation ratios, refining in these countries, and the impact of changes in demand and refining on the product trade

  7. Structure refinement and membrane positioning of selectively labeled OmpX in phospholipid nanodiscs

    Energy Technology Data Exchange (ETDEWEB)

    Hagn, Franz, E-mail: franz.hagn@tum.de; Wagner, Gerhard, E-mail: gerhard-wagner@hms.harvard.edu [Harvard Medical School, Department of Biological Chemistry and Molecular Pharmacology (United States)

    2015-04-15

    NMR structural studies on membrane proteins are often complicated by their large size, taking into account the contribution of the membrane mimetic. Therefore, classical resonance assignment approaches often fail. The large size of phospholipid nanodiscs, a detergent-free phospholipid bilayer mimetic, prevented their use in high-resolution solution-state NMR spectroscopy so far. We recently introduced smaller nanodiscs that are suitable for NMR structure determination. However, side-chain assignments of a membrane protein in nanodiscs still remain elusive. Here, we utilized a NOE-based approach to assign OmpX samples with (stereo-) specifically labeled Ile, Leu, Val and Ala methyls and uniformly ¹⁵N-labeled Phe and Tyr, and calculated a refined high-resolution structure. In addition, we were able to obtain residual dipolar couplings (RDCs) of OmpX in nanodiscs using Pf1 phage medium for the induction of weak alignment. Back-calculated NOESY spectra of the obtained NMR structures were compared to experimental NOESYs in order to validate the quality of these structures. We further used NOE information between protonated lipid head groups and side-chain methyls to determine the position of OmpX in the phospholipid bilayer. These data were verified by paramagnetic relaxation enhancement (PRE) experiments obtained with Gd³⁺-modified lipids. Taken together, this study emphasizes the need for the (stereo-) specific labeling of membrane proteins in a highly deuterated background for high-resolution structure determination, particularly in large membrane mimicking systems like phospholipid nanodiscs. Structure validation by NOESY back-calculation will be helpful for the structure determination and validation of membrane proteins where NOE assignment is often difficult. The use of protein to lipid NOEs will be beneficial for the positioning of a membrane protein in the lipid bilayer without the need for preparing multiple protein samples.

  8. Modeling pH-zone refining countercurrent chromatography: a dynamic approach.

    Science.gov (United States)

    Kotland, Alexis; Chollet, Sébastien; Autret, Jean-Marie; Diard, Catherine; Marchal, Luc; Renault, Jean-Hugues

    2015-04-24

    A model based on mass transfer resistances and acid-base equilibria at the liquid-liquid interface was developed for the pH-zone refining mode when it is used in countercurrent chromatography (CCC). The binary separation of catharanthine and vindoline, two alkaloids used as starting material for the semi-synthesis of chemotherapy drugs, was chosen for the model validation. Toluene/CH3CN/water (4/1/5, v/v/v) was selected as the biphasic solvent system. First, hydrodynamics and mass transfer were studied by using chemical tracers. Trypan blue, present only in the aqueous phase, allowed the determination of the parameters τextra and Pe for hydrodynamic characterization, whereas acetone, which partitioned between the two phases, allowed the determination of the transfer parameter k0a. It was shown that mass transfer was improved by increasing both flow rate and rotational speed, which is consistent with the observed mobile phase dispersion. Then, the different transfer parameters of the model (i.e. the local transfer coefficients for the different species involved in the process) were determined by fitting experimental concentration profiles. The model accurately predicted the variation of both equilibrium and dynamic factors (i.e. local mass transfer coefficients and the acid-base equilibrium constant) with the CCC operating conditions (cell number, flow rate, rotational speed and thus stationary phase retention). The initial hypotheses (the acid-base reactions occur instantaneously at the interface and the process is mainly governed by mass transfer) are thus validated. Finally, the model was used as a tool for predicting catharanthine and vindoline separation over the whole experimental domain, corresponding to flow rates between 20 and 60 mL/min and rotational speeds from 900 to 2100 rotations per minute. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Effect of strontium on the grain refining efficiency of Mg-3Al alloy refined by carbon inoculation

    International Nuclear Information System (INIS)

    Du Jun; Yang Jian; Kuwabara, Mamoru; Li Wenfang; Peng Jihua

    2009-01-01

    The effect of Sr on the grain refining efficiency of the Mg-3Al alloy refined by carbon inoculation has been investigated in the present study. A significant grain refinement was obtained for the Mg-3Al alloy treated with either 0.2% C or 0.2% Sr. Al-C-O particles were found in the sample refined by 0.2% C; the element O should come from the reaction between the Al4C3 nuclei of the Mg grains and water during sample preparation. The grain size of the sample refined by carbon inoculation was further decreased after the combined addition of Sr, and decreased with increasing Sr content. A much higher refining efficiency was obtained when the Sr addition was increased to 0.5%. Sr is an effective element for improving the grain refining efficiency of Mg-Al alloys refined by carbon inoculation. The number of Al4C3 particles in the sample refined by the combination of carbon and Sr was greater than that in the sample refined by carbon alone. No Al-C-O-Sr-rich particles were found in the sample refined by the combination of carbon and a small (<0.5%) Sr addition

  10. The Generation of AlmFe in Dilute Aluminium Alloys with Different Grain Refining Additions

    Science.gov (United States)

    Meredith, M. W.; Greer, A. L.; Evans, P. V.; Hamerton, R. G.

    Al13Fe4, Al6Fe and AlmFe are common intermetallics in commercial AA1XXX series Al alloys. Grain-refining additions (based on either Al-Ti-B or Al-Ti-C) are usually added to such alloys during solidification processing to aid the grain structure development. They also influence the favoured intermetallic and, hence, can affect the materials' properties. This work simulates commercial casting practices in an attempt to determine the mechanisms by which one intermetallic phase is favoured over another by the introduction of grain-refining additions. Directional solidification experiments on Al-0.3wt.%Fe-0.15wt.%Si with and without grain refiner are conducted using Bridgman apparatus. The type, amount and effectiveness of the grain-refining additions are altered and the resulting intermetallic phase selection followed. The materials are characterised using optical microscopy, scanning electron microscopy and X-ray diffraction. AlmFe is seen to form when Al-Ti-B grain-refiner is introduced but only when the refinement is successful; reducing the effectiveness of the refiner led to Al6Fe forming under all conditions. Al-Ti-C refiners are seen to promote AlmFe at lower solidification velocities than when Al-Ti-B was used even though the grain structure was not as refined. These trends can be explained within existing eutectic theory, by considering growth undercooling.

  11. Linearly Refined Session Types

    Directory of Open Access Journals (Sweden)

    Pedro Baltazar

    2012-11-01

    Session types capture precise protocol structure in concurrent programming, but do not specify properties of the exchanged values beyond their basic type. Refinement types are a form of dependent types that can address this limitation, combining types with logical formulae that may refer to program values and can constrain types using arbitrary predicates. We present a pi calculus with assume and assert operations, typed using a session discipline that incorporates refinement formulae written in a fragment of Multiplicative Linear Logic. Our original combination of session and refinement types, together with the well established benefits of linearity, allows very fine-grained specifications of communication protocols in which refinement formulae are treated as logical resources rather than persistent truths.

  12. Statistical approach for selection of biologically informative genes.

    Science.gov (United States)

    Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N

    2018-05-20

    Selection of informative genes from high-dimensional gene expression data has emerged as an important research area in genomics. Many of the gene selection techniques proposed so far are based on either a relevancy or a redundancy measure, and their performance has been adjudged through post-selection classification accuracy computed by a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biological sufficiency criteria, i.e. Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique against 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g. subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple-criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes which are more biologically relevant. The proposed technique is also quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that under the multiple-criteria decision-making setup, the proposed technique is best for informative gene selection over the available alternatives. Based on the proposed approach, an R package, BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR. This study will provide a practical guide to selecting statistical techniques for identifying informative genes
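
    The maximum-relevance, minimum-redundancy idea behind Boot-MRMR can be illustrated with a greedy sketch. This is not the authors' BootMRMR R package: absolute Pearson correlation stands in for both the relevance and redundancy measures, and the bootstrap procedure is omitted.

```python
# Hedged sketch of greedy max-relevance / min-redundancy (mRMR) gene selection.
import statistics

def pearson(x, y):
    """Pearson correlation of two equal-length numeric sequences."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den if den else 0.0

def mrmr_select(genes, labels, k):
    """Greedily pick k genes maximizing relevance to the labels minus the
    mean redundancy with the genes already selected."""
    selected, remaining = [], list(genes)
    while remaining and len(selected) < k:
        def score(g):
            relevance = abs(pearson(genes[g], labels))
            redundancy = (sum(abs(pearson(genes[g], genes[s])) for s in selected)
                          / len(selected)) if selected else 0.0
            return relevance - redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

# gA tracks the labels perfectly; gB is a redundant copy; gC is weakly related.
genes = {'gA': [1, 2, 3, 4], 'gB': [1, 2, 3, 4], 'gC': [4, 1, 3, 2]}
labels = [1, 2, 3, 4]
selected = mrmr_select(genes, labels, 2)
```

    Once a gene is selected, a perfectly redundant copy of it gains nothing from its high relevance: its redundancy term cancels it out, which is the core of the mRMR composite measure.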

  13. A refined regional modeling approach for the Corn Belt - Experiences and recommendations for large-scale integrated modeling

    Science.gov (United States)

    Panagopoulos, Yiannis; Gassman, Philip W.; Jha, Manoj K.; Kling, Catherine L.; Campbell, Todd; Srinivasan, Raghavan; White, Michael; Arnold, Jeffrey G.

    2015-05-01

    Nonpoint source pollution from agriculture is the main source of nitrogen and phosphorus in the stream systems of the Corn Belt region in the Midwestern US. This region is comprised of two large river basins, the intensely row-cropped Upper Mississippi River Basin (UMRB) and Ohio-Tennessee River Basin (OTRB), which are considered the key contributing areas for the Northern Gulf of Mexico hypoxic zone according to the US Environmental Protection Agency. Thus, in this area it is of utmost importance to ensure that intensive agriculture for food, feed and biofuel production can coexist with a healthy water environment. To address these objectives within a river basin management context, an integrated modeling system has been constructed with the hydrologic Soil and Water Assessment Tool (SWAT) model, capable of estimating river basin responses to alternative cropping and/or management strategies. To improve modeling performance compared to previous studies and provide a spatially detailed basis for scenario development, this SWAT Corn Belt application incorporates a greatly refined subwatershed structure based on 12-digit hydrologic units or 'subwatersheds' as defined by the US Geological Survey. The model setup, calibration and validation are time-demanding and challenging tasks for these large systems, given the scale-intensive data requirements and the need to ensure the reliability of flow and pollutant load predictions at multiple locations. Thus, the objectives of this study are both to comprehensively describe this large-scale modeling approach, providing estimates of pollution and crop production in the region, and to present strengths and weaknesses of integrated modeling at such a large scale along with how it can be improved on the basis of the current modeling structure and results. The predictions were based on a semi-automatic hydrologic calibration approach for large-scale and spatially detailed modeling studies, with the use of the Sequential

  14. An Emergency Georeferencing Framework for GF-4 Imagery Based on GCP Prediction and Dynamic RPC Refinement

    Directory of Open Access Journals (Sweden)

    Pengfei Li

    2017-10-01

    GaoFen-4 (GF-4) imagery has great potential for emergency response due to its gazing mode. However, only poor geometric accuracy can be obtained using the rational polynomial coefficient (RPC) parameters provided, making ground control points (GCPs) necessary for emergency response. Selecting GCPs is traditionally time-consuming, labor-intensive, and not fully reliable. This is mainly because (1) manual GCP selection is time-consuming and cumbersome due to the many human interventions required, especially for the first few GCPs; (2) GF-4 typically gives planar-array imagery acquired at rather large tilt angles, and the resulting distortion introduces problems in image matching; and (3) reference data will not always be available, especially under emergency circumstances. This paper provides a novel emergency georeferencing framework for GF-4 Level 1 imagery. The key feature is GCP prediction based on dynamic RPC refinement, which is able to predict even the first GCP, and the prediction is dynamically refined as the selection goes on. This is done by two techniques: (1) GCP prediction using RPC parameters and (2) dynamic RPC refinement using as few as one GCP. Besides, online map services are adopted to automatically provide reference data. Experimental results show that (1) GCP predictions improve with dynamic RPC refinement; (2) GCP selection becomes more efficient with GCP prediction; and (3) the integration of online map services constitutes a good example for emergency response.
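
    The refinement idea in this record can be sketched as image-space bias compensation: the discrepancy between the RPC-projected and measured image positions of the available GCPs is used to correct subsequent predictions. The `vendor_rpc` stand-in for the real RPC projection and the pure-translation correction model (rather than a full affine model) are simplifying assumptions.

```python
# Sketch of 'dynamic RPC refinement with as few as one GCP' as constant
# image-space bias compensation. All names below are hypothetical.

def refine_rpc(project, gcps):
    """Return a corrected ground-to-image projection.

    project -- maps a ground point (lat, lon, height) to image (row, col)
               using the provided RPC parameters.
    gcps    -- list of (ground_point, (row, col)) measured control points.
    The row/col bias is the mean discrepancy at the GCPs; with a single GCP
    this reduces to a simple translation.
    """
    dr = sum(row - project(g)[0] for g, (row, _) in gcps) / len(gcps)
    dc = sum(col - project(g)[1] for g, (_, col) in gcps) / len(gcps)
    def refined(ground):
        row, col = project(ground)
        return row + dr, col + dc
    return refined

def vendor_rpc(g):                        # hypothetical vendor RPC projection
    lat, lon, _height = g
    return 1000.0 * lat, 1000.0 * lon

def true_location(g):                     # 'truth' with a systematic bias
    row, col = vendor_rpc(g)
    return row + 3.5, col - 2.0

# One measured GCP is enough to estimate the bias and predict the next GCP.
gcp_ground = (0.5, 0.25, 10.0)
refined = refine_rpc(vendor_rpc, [(gcp_ground, true_location(gcp_ground))])
row, col = refined((0.75, 0.4, 5.0))      # predicted position of an unseen point
```

    As more GCPs are collected, the same routine re-estimates the bias from all of them, mirroring the dynamic refinement of predictions as selection goes on.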

  15. Structural heredity of TiC and its influences on refinement behaviors of AlTiC master alloy

    Institute of Scientific and Technical Information of China (English)

    王振卿; 刘相法; 柳延辉; 张均燕; 于丽娜; 边秀房

    2003-01-01

    The heredity of microstructure in AlTiC master alloys, used as grain refiners, was analyzed. It is found that the morphologies and distributions of TiC particles show a visible heredity, which originates from the raw materials or the processing of the Al melt and is ultimately transferred through the melt stage to the solid-state structure. This heredity influences refinement: formation of a chain-like TiC morphology results in rapid fading of the refinement effect, and distribution of TiC along grain boundaries greatly reduces refinement efficiency. Controlling structural heredity through proper selection of raw materials and processing parameters is of great importance in obtaining ideal microstructures and improving the refinement behavior of AlTiC master alloys.

  16. Analysis and development of spatial hp-refinement methods for solving the neutron transport equation

    International Nuclear Information System (INIS)

    Fournier, D.

    2011-01-01

    The different neutronic parameters have to be calculated with higher accuracy in order to design 4th-generation reactor cores. As memory storage and computation time are limited, adaptive methods are a solution for solving the neutron transport equation. The neutronic flux, the solution of this equation, depends on energy, angle and space, and these variables are discretized successively: the energy with a multigroup approach, taking the different quantities to be constant on each group, and the angle by a collocation method called the SN approximation. Once the energy and angle variables are discretized, a system of spatially-dependent hyperbolic equations has to be solved. Discontinuous finite elements are used to make the development of hp-refinement methods possible. Thus, the accuracy of the solution can be improved by spatial refinement (h-refinement), consisting of subdividing a cell into sub-cells, or by order refinement (p-refinement), increasing the order of the polynomial basis. In this thesis, the properties of these methods are analyzed, showing the importance of the regularity of the solution in choosing the type of refinement. Two error estimators are used to drive the refinement process: whereas the first one requires strong regularity hypotheses (an analytical solution), the second one assumes only the minimal hypotheses required for the solution to exist. The two estimators are compared on benchmarks where the analytic solution is known by the method of manufactured solutions, so that the behaviour of the solution with regard to regularity can be studied. This leads to an hp-refinement method using the two estimators. Then, a comparison is made with other existing methods on simplified but also realistic benchmarks coming from nuclear cores. These adaptive methods considerably reduce the computational cost and memory footprint. To further improve these two points, an approach with energy-dependent meshes is proposed. Actually, as the
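
    The h-refinement half of such a strategy can be illustrated with a one-dimensional sketch: an a-posteriori error estimator drives cell subdivision, so the mesh concentrates where the solution is least regular. The estimator used here (midpoint deviation from the linear interpolant) is a simple stand-in assumption, not one of the thesis's transport estimators.

```python
# One-dimensional sketch of estimator-driven h-refinement: bisect the cell
# with the largest estimated error until every cell meets the tolerance.
import math

def estimate(f, cell):
    """A-posteriori error proxy: deviation of f from its linear interpolant
    at the cell midpoint (an illustrative stand-in estimator)."""
    lo, hi = cell
    mid = 0.5 * (lo + hi)
    return abs(f(mid) - 0.5 * (f(lo) + f(hi)))

def adapt_h(f, a, b, tol, max_cells=500):
    cells = [(a, b)]
    while len(cells) < max_cells:
        worst = max(cells, key=lambda c: estimate(f, c))
        if estimate(f, worst) <= tol:
            break                         # every cell meets the tolerance
        lo, hi = worst
        cells.remove(worst)               # h-refinement: bisect the worst cell
        mid = 0.5 * (lo + hi)
        cells += [(lo, mid), (mid, hi)]
    return sorted(cells)

# sqrt has unbounded derivatives at 0, so cells cluster near the origin,
# illustrating why the regularity of the solution should drive refinement.
cells = adapt_h(math.sqrt, 0.0, 1.0, 1e-3)
```

    After adaptation, the cells next to the singularity at 0 are orders of magnitude smaller than those near 1, which is exactly the behaviour a regularity-aware hp strategy exploits (raising p instead of splitting where the solution is smooth).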

  17. Hybrid feature selection for supporting lightweight intrusion detection systems

    Science.gov (United States)

    Song, Jianglong; Zhao, Wentao; Liu, Qiang; Wang, Xin

    2017-08-01

    Redundant and irrelevant features not only cause high resource consumption but also degrade the performance of Intrusion Detection Systems (IDS), especially when coping with big data, and they slow down training and testing in network traffic classification. Therefore, a hybrid feature selection approach combining filter and wrapper selection is designed in this paper to build a lightweight intrusion detection system. Two main phases are involved in this method. The first phase conducts a preliminary search for an optimal subset of features, in which chi-square feature selection is utilized. The set of features selected in the previous phase is further refined in the second phase in a wrapper manner, in which Random Forest (RF) is used to guide the selection process and retain an optimized set of features. After that, we build an RF-based detection model and make a fair comparison with other approaches. The experimental results on NSL-KDD datasets show that our approach results in higher detection accuracy as well as faster training and testing processes.
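
    The two-phase filter-then-wrapper design can be sketched as follows. The chi-square filter mirrors the first phase described above; for the wrapper phase a leave-one-out 1-nearest-neighbour classifier is substituted for Random Forest to keep the sketch dependency-free, so the classifier choice is an assumption, not the paper's method.

```python
# Sketch of hybrid feature selection: a chi-square filter phase followed by
# a greedy forward wrapper phase guided by classifier accuracy.

def chi2_score(feature, labels):
    """Chi-square statistic between a discrete feature column and the labels."""
    n, stat = len(labels), 0.0
    for f in set(feature):
        for c in set(labels):
            observed = sum(1 for x, y in zip(feature, labels) if x == f and y == c)
            expected = feature.count(f) * labels.count(c) / n
            if expected:
                stat += (observed - expected) ** 2 / expected
    return stat

def loo_accuracy(rows, labels, feats):
    """Leave-one-out accuracy of a 1-NN classifier on the chosen subset."""
    correct = 0
    for i in range(len(rows)):
        dists = [(sum((rows[i][f] - rows[j][f]) ** 2 for f in feats), j)
                 for j in range(len(rows)) if j != i]
        _, nearest = min(dists)
        correct += labels[nearest] == labels[i]
    return correct / len(rows)

def hybrid_select(rows, labels, n_filter):
    # Phase 1 (filter): rank features by chi-square score, keep the top n_filter.
    cols = [[r[f] for r in rows] for f in range(len(rows[0]))]
    ranked = sorted(range(len(cols)), key=lambda f: chi2_score(cols[f], labels),
                    reverse=True)
    candidates = ranked[:n_filter]
    # Phase 2 (wrapper): greedy forward selection guided by classifier accuracy.
    chosen, best_acc, improved = [], 0.0, True
    while improved:
        improved = False
        for f in candidates:
            if f in chosen:
                continue
            acc = loo_accuracy(rows, labels, chosen + [f])
            if acc > best_acc:
                best_acc, best_f, improved = acc, f, True
        if improved:
            chosen.append(best_f)
    return chosen, best_acc

# Feature 0 predicts the class; feature 1 is noise; feature 2 is constant.
rows = [[0, 1, 0], [0, 0, 0], [1, 1, 0], [1, 0, 0], [0, 1, 0], [1, 0, 0]]
labels = [0, 0, 1, 1, 0, 1]
chosen, acc = hybrid_select(rows, labels, n_filter=2)
```

    The cheap filter discards the constant feature before the expensive wrapper runs, which is what makes the combined scheme lightweight.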

  18. Iterative feature refinement for accurate undersampled MR image reconstruction

    Science.gov (United States)

    Wang, Shanshan; Liu, Jianbo; Liu, Qiegen; Ying, Leslie; Liu, Xin; Zheng, Hairong; Liang, Dong

    2016-05-01

    Accelerating MR scan is of great significance for clinical, research and advanced applications, and one main effort to achieve this is the utilization of compressed sensing (CS) theory. Nevertheless, the existing CSMRI approaches still have limitations such as fine structure loss or high computational complexity. This paper proposes a novel iterative feature refinement (IFR) module for accurate MR image reconstruction from undersampled K-space data. Integrating IFR with CSMRI which is equipped with fixed transforms, we develop an IFR-CS method to restore meaningful structures and details that are originally discarded without introducing too much additional complexity. Specifically, the proposed IFR-CS is realized with three iterative steps, namely sparsity-promoting denoising, feature refinement and Tikhonov regularization. Experimental results on both simulated and in vivo MR datasets have shown that the proposed module has a strong capability to capture image details, and that IFR-CS is comparable and even superior to other state-of-the-art reconstruction approaches.
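
    The three iterative steps named in this abstract can be loosely illustrated on a toy 1-D undersampled-Fourier problem. Everything below is an invented stand-in: the soft-thresholding denoiser, the damped residual "refinement" step, and the k-space data-consistency projection only echo the structure of IFR-CS, not the authors' actual operators or parameters.

    ```python
    # Hedged toy loop: (1) sparsity-promoting denoising, (2) a crude feature-
    # refinement step that adds back a damped fraction of what thresholding
    # removed, (3) a data-consistency step enforcing measured k-space samples.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 128
    x_true = np.zeros(n)
    x_true[[10, 40, 90]] = [1.0, -0.7, 0.5]       # sparse "image"
    mask = rng.random(n) < 0.5                    # random undersampling mask
    y = np.fft.fft(x_true)[mask]                  # measured k-space data

    k0 = np.zeros(n, dtype=complex)
    k0[mask] = y
    x = np.real(np.fft.ifft(k0))                  # zero-filled reconstruction
    err_zero = np.linalg.norm(x - x_true)

    for _ in range(200):
        x_dn = np.sign(x) * np.maximum(np.abs(x) - 0.01, 0.0)  # 1) denoise
        x_ref = x_dn + 0.2 * (x - x_dn)                        # 2) refine features
        k = np.fft.fft(x_ref)
        k[mask] = y                                            # 3) data consistency
        x = np.real(np.fft.ifft(k))

    print(np.linalg.norm(x - x_true) < err_zero)  # error shrank vs. zero-filling
    ```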

  19. Iterative feature refinement for accurate undersampled MR image reconstruction

    International Nuclear Information System (INIS)

    Wang, Shanshan; Liu, Jianbo; Liu, Xin; Zheng, Hairong; Liang, Dong; Liu, Qiegen; Ying, Leslie

    2016-01-01

    Accelerating MR scan is of great significance for clinical, research and advanced applications, and one main effort to achieve this is the utilization of compressed sensing (CS) theory. Nevertheless, the existing CSMRI approaches still have limitations such as fine structure loss or high computational complexity. This paper proposes a novel iterative feature refinement (IFR) module for accurate MR image reconstruction from undersampled K-space data. Integrating IFR with CSMRI which is equipped with fixed transforms, we develop an IFR-CS method to restore meaningful structures and details that are originally discarded without introducing too much additional complexity. Specifically, the proposed IFR-CS is realized with three iterative steps, namely sparsity-promoting denoising, feature refinement and Tikhonov regularization. Experimental results on both simulated and in vivo MR datasets have shown that the proposed module has a strong capability to capture image details, and that IFR-CS is comparable and even superior to other state-of-the-art reconstruction approaches. (paper)

  20. Refining shale-oil distillates

    Energy Technology Data Exchange (ETDEWEB)

    Altpeter, J

    1952-03-17

    A process is described for refining distillates from shale oil, brown coal, tar, and other tar products by extraction with selective solvents, such as lower alcohols, halogen-hydrins, dichlorodiethyl ether, liquid sulfur dioxide, and so forth, as well as treating with alkali solution, characterized in that the distillate is first treated with completely or almost completely recovered phenol or cresotate solution, the oil is separated from the phenolate with solvent, for example concentrated or adjusted to a determined water content of lower alcohol, furfural, halogen-hydrin, dichlorodiethyl ether, liquid sulfur dioxide, or the like, extracted, and the raffinate separated from the extract layer, if necessary after distillation or washing out of solvent, and freeing with alkali solution from residual phenol or creosol.

  1. Commercial refining in the Mediterranean

    International Nuclear Information System (INIS)

    Packer, P.

    1999-01-01

    About 9% of the world's oil refining capacity is on the Mediterranean: some of the world's biggest and most advanced refineries are on Sicily and Sardinia. The Mediterranean refineries are important suppliers to southern Europe and N. Africa. The article discusses commercial refining in the Mediterranean under the headings of (i) historic development, (ii) product demand, (iii) refinery configurations, (iv) refined product trade, (v) financial performance and (vi) future outlook. Although some difficulties are foreseen, refining in the Mediterranean is likely to continue to be important well into the 21st century. (UK)

  2. Refining Nodes and Edges of State Machines

    DEFF Research Database (Denmark)

    Hallerstede, Stefan; Snook, Colin

    2011-01-01

    State machines are hierarchical automata that are widely used to structure complex behavioural specifications. We develop two notions of refinement of state machines, node refinement and edge refinement. We compare the two notions by means of examples and argue that, by adopting simple conventions...... refinement theory and UML-B state machine refinement influences the style of node refinement. Hence we propose a method with direct proof of state machine refinement avoiding the detour via Event-B that is needed by UML-B....

  3. Lung Segmentation Refinement based on Optimal Surface Finding Utilizing a Hybrid Desktop/Virtual Reality User Interface

    Science.gov (United States)

    Sun, Shanhui; Sonka, Milan; Beichel, Reinhard R.

    2013-01-01

    Recently, the optimal surface finding (OSF) and layered optimal graph image segmentation of multiple objects and surfaces (LOGISMOS) approaches have been reported with applications to medical image segmentation tasks. While providing high levels of performance, these approaches may locally fail in the presence of pathology or other local challenges. Due to the image data variability, finding a suitable cost function that would be applicable to all image locations may not be feasible. This paper presents a new interactive refinement approach for correcting local segmentation errors in the automated OSF-based segmentation. A hybrid desktop/virtual reality user interface was developed for efficient interaction with the segmentations utilizing state-of-the-art stereoscopic visualization technology and advanced interaction techniques. The user interface allows a natural and interactive manipulation of 3-D surfaces. The approach was evaluated on 30 test cases from 18 CT lung datasets, which showed local segmentation errors after employing an automated OSF-based lung segmentation. The performed experiments exhibited significant increase in performance in terms of mean absolute surface distance errors (2.54 ± 0.75 mm prior to refinement vs. 1.11 ± 0.43 mm post-refinement, p ≪ 0.001). Speed of the interactions is one of the most important aspects leading to the acceptance or rejection of the approach by users expecting real-time interaction experience. The average algorithm computing time per refinement iteration was 150 ms, and the average total user interaction time required for reaching complete operator satisfaction per case was about 2 min. This time was mostly spent on human-controlled manipulation of the object to identify whether additional refinement was necessary and to approve the final segmentation result. The reported principle is generally applicable to segmentation problems beyond lung segmentation in CT scans as long as the underlying segmentation

  4. Lung segmentation refinement based on optimal surface finding utilizing a hybrid desktop/virtual reality user interface.

    Science.gov (United States)

    Sun, Shanhui; Sonka, Milan; Beichel, Reinhard R

    2013-01-01

    Recently, the optimal surface finding (OSF) and layered optimal graph image segmentation of multiple objects and surfaces (LOGISMOS) approaches have been reported with applications to medical image segmentation tasks. While providing high levels of performance, these approaches may locally fail in the presence of pathology or other local challenges. Due to the image data variability, finding a suitable cost function that would be applicable to all image locations may not be feasible. This paper presents a new interactive refinement approach for correcting local segmentation errors in the automated OSF-based segmentation. A hybrid desktop/virtual reality user interface was developed for efficient interaction with the segmentations utilizing state-of-the-art stereoscopic visualization technology and advanced interaction techniques. The user interface allows a natural and interactive manipulation of 3-D surfaces. The approach was evaluated on 30 test cases from 18 CT lung datasets, which showed local segmentation errors after employing an automated OSF-based lung segmentation. The performed experiments exhibited significant increase in performance in terms of mean absolute surface distance errors (2.54±0.75 mm prior to refinement vs. 1.11±0.43 mm post-refinement, p≪0.001). Speed of the interactions is one of the most important aspects leading to the acceptance or rejection of the approach by users expecting real-time interaction experience. The average algorithm computing time per refinement iteration was 150 ms, and the average total user interaction time required for reaching complete operator satisfaction was about 2 min per case. This time was mostly spent on human-controlled manipulation of the object to identify whether additional refinement was necessary and to approve the final segmentation result. The reported principle is generally applicable to segmentation problems beyond lung segmentation in CT scans as long as the underlying segmentation utilizes the

  5. Anomalies in the refinement of isoleucine

    International Nuclear Information System (INIS)

    Berntsen, Karen R. M.; Vriend, Gert

    2014-01-01

    The side-chain torsion angles of isoleucines in X-ray protein structures are a function of resolution, secondary structure and refinement software. Detailing the standard torsion angles used in refinement software can improve protein structure refinement. A study of isoleucines in protein structures solved using X-ray crystallography revealed a series of systematic trends for the two side-chain torsion angles χ 1 and χ 2 dependent on the resolution, secondary structure and refinement software used. The average torsion angles for the nine rotamers were similar in high-resolution structures solved using either the REFMAC, CNS or PHENIX software. However, at low resolution these programs often refine towards somewhat different χ 1 and χ 2 values. Small systematic differences can be observed between refinement software that uses molecular dynamics-type energy terms (for example CNS) and software that does not use these terms (for example REFMAC). Detailing the standard torsion angles used in refinement software can improve the refinement of protein structures. The target values in the molecular dynamics-type energy functions can also be improved

  6. Anomalies in the refinement of isoleucine

    Energy Technology Data Exchange (ETDEWEB)

    Berntsen, Karen R. M.; Vriend, Gert, E-mail: gerrit.vriend@radboudumc.nl [Radboud University Medical Center, Geert Grooteplein 26-28, 6525 GA Nijmegen (Netherlands)

    2014-04-01

    The side-chain torsion angles of isoleucines in X-ray protein structures are a function of resolution, secondary structure and refinement software. Detailing the standard torsion angles used in refinement software can improve protein structure refinement. A study of isoleucines in protein structures solved using X-ray crystallography revealed a series of systematic trends for the two side-chain torsion angles χ{sub 1} and χ{sub 2} dependent on the resolution, secondary structure and refinement software used. The average torsion angles for the nine rotamers were similar in high-resolution structures solved using either the REFMAC, CNS or PHENIX software. However, at low resolution these programs often refine towards somewhat different χ{sub 1} and χ{sub 2} values. Small systematic differences can be observed between refinement software that uses molecular dynamics-type energy terms (for example CNS) and software that does not use these terms (for example REFMAC). Detailing the standard torsion angles used in refinement software can improve the refinement of protein structures. The target values in the molecular dynamics-type energy functions can also be improved.

  7. Procurement planning in oil refining industries considering blending operations

    DEFF Research Database (Denmark)

    Oddsdottir, Thordis Anna; Grunow, Martin; Akkerman, Renzo

    2013-01-01

    This paper addresses procurement planning in oil refining, which has until now only had limited attention in the literature. We introduce a mixed integer nonlinear programming (MINLP) model and develop a novel two-stage solution approach, which aims at computational efficiency while addressing...... parameters than in previous literature. The developed approach is tested using historical data from Statoil A/S as well as through a comprehensive numerical analysis. The approach generates a feasible procurement plan within acceptable computation time, is able to quickly adjust an existing plan to take...

  8. Bio-Refining of Carbohydrate-Rich Food Waste for Biofuels

    Directory of Open Access Journals (Sweden)

    Hoang-Tuong Nguyen Hao

    2015-06-01

    Full Text Available The global dependence on finite fossil fuel-derived energy is of serious concern given the predicted population increase. Over the past decades, bio-refining of woody biomass has received much attention, but data on food waste refining are sorely lacking, despite annual and global deposition of 1.3 billion tons in landfills. In addition to negative environmental impacts, this represents a squandering of valuable energy, water and nutrient resources. The potential of carbohydrate-rich food waste (CRFW) for biofuel (by Rhodotorula glutinis fermentation) and biogas production (by calculating theoretical methane yield) was therefore investigated using a novel integrated bio-refinery approach. In this approach, hydrolyzed CRFW from three different conditions was used for Rhodotorula glutinis cultivation to produce biolipids, whilst residual solids after hydrolysis were characterized for methane recovery potential via anaerobic digestion. Initially, CRFW was hydrolysed using thermal (Th), chemical (Ch) and combined thermal-chemical (TCh) hydrolysis, with the CRFW leachate serving as a control (Pcon). Excessive foaming led to the loss of TCh cultures, while day-7 biomass yields were similar (3.4–3.6 g dry weight (DW) L−1) for the remaining treatments. The total fatty acid methyl ester (FAME) content of R. glutinis cultivated on CRFW hydrolysates was relatively low (~6.5%), but quality parameters (i.e., cetane number, density, viscosity and higher heating values) of the biomass-extracted biodiesel complied with ASTM standards. Despite the low theoretical RS-derived methane potential, further research under optimised and scaled conditions will reveal the potential of this approach for the bio-refining of CRFW for energy recovery and value-added co-product production.

  9. Petroleum refining fitness assessment to the sectoral approaches to address climate change; Analise da aptidao do setor refino de petroleo as abordagens setoriais para lidar com as mudancas climaticas globais

    Energy Technology Data Exchange (ETDEWEB)

    Merschmann, Paulo Roberto de Campos

    2010-03-15

    The climate agreement that will take effect from 2013 onwards needs to address some of the concerns that were not considered in the Kyoto Protocol. Such concerns include the absence of emission targets for big-emitting developing countries and the impacts of unequal carbon policies on the competitiveness of Annex 1 energy-intensive sectors. Sectoral approaches for energy-intensive sectors can be a solution to both concerns, mainly if they address climate change issues involving all the countries in which these sectors have a significant presence. A sector is a good candidate for sectoral approaches if it has certain characteristics: a high impact on the competitiveness of Annex 1 enterprises derived from the lack of commitments of enterprises located in non-Annex 1 countries, a high level of opportunities to mitigate GHG emissions through the application of sectoral approaches, and ease of implementing sectoral approaches in the sector. This work therefore assesses the fitness of the petroleum refining sector for sectoral approaches to address climate change, and compares the characteristics of the petroleum refining sector with those of sectors well suited to sectoral approaches. (author)

  10. Refining and petrochemicals

    Energy Technology Data Exchange (ETDEWEB)

    Constancio, Silva

    2006-07-01

    In 2004, refining margins showed a clear improvement that persisted throughout the first three quarters of 2005. This enabled oil companies to post significantly higher earnings for their refining activity in 2004 compared to 2003, with the results of the first half of 2005 confirming this trend. As for petrochemicals, despite a steady rise in the naphtha price, higher cash margins enabled a turnaround in 2004 as well as a clear improvement in oil company financial performance that should continue in 2005, judging by the net income figures reported for the first half-year. Despite this favorable business environment, capital expenditure in refining and petrochemicals remained at a low level, especially investment in new capacity, but a number of projects are being planned for the next five years. (author)

  11. Refining and petrochemicals

    International Nuclear Information System (INIS)

    Constancio, Silva

    2006-01-01

    In 2004, refining margins showed a clear improvement that persisted throughout the first three quarters of 2005. This enabled oil companies to post significantly higher earnings for their refining activity in 2004 compared to 2003, with the results of the first half of 2005 confirming this trend. As for petrochemicals, despite a steady rise in the naphtha price, higher cash margins enabled a turnaround in 2004 as well as a clear improvement in oil company financial performance that should continue in 2005, judging by the net income figures reported for the first half-year. Despite this favorable business environment, capital expenditure in refining and petrochemicals remained at a low level, especially investment in new capacity, but a number of projects are being planned for the next five years. (author)

  12. Multi-criteria Group Decision Making based on Linguistic Refined Neutrosophic Strategy

    OpenAIRE

    Kalyan Mondal; Surapati Pramanik; Bibhas C. Giri

    2018-01-01

    Multi-criteria group decision making (MCGDM) strategy, in which a group of experts acts collectively to make the best selection among all possible alternatives with respect to some criteria, is the focus of this study. To develop the paper, we define the linguistic neutrosophic refined set.

  13. Selective amygdalohippocampectomy via trans-superior temporal gyrus keyhole approach.

    Science.gov (United States)

    Mathon, Bertrand; Clemenceau, Stéphane

    2016-04-01

    Hippocampal sclerosis is the most common cause of drug-resistant epilepsy amenable to surgical treatment and seizure control. The rationale of selective amygdalohippocampectomy is to spare cerebral tissue not included in the seizure generator. We describe the selective amygdalohippocampectomy through the trans-superior temporal gyrus keyhole approach. Selective amygdalohippocampectomy for temporal lobe epilepsy is performed when the data (semiology, neuroimaging, electroencephalography) point to the mesial temporal structures. The trans-superior temporal gyrus keyhole approach is a minimally invasive and safe technique that allows disconnection of the temporal stem and resection of the temporomesial structures.

  14. User-centric Query Refinement and Processing Using Granularity Based Strategies

    NARCIS (Netherlands)

    Zeng, Y.; Zhong, N.; Wang, Y.; Qin, Y.; Huang, Z.; Zhou, H; Yao, Y; van Harmelen, F.A.H.

    2011-01-01

    In the context of large-scale scientific literature, this paper provides a user-centric approach for refining and processing incomplete or vague queries based on cognitive- and granularity-based strategies. From the viewpoints of user interest retention and granular information processing, we

  15. Niobium-base grain refiner for aluminium

    International Nuclear Information System (INIS)

    Silva Pontes, P. da; Robert, M.H.; Cupini, N.L.

    1980-01-01

    A new chemical grain refiner for aluminium has been developed, using inoculation of a niobium-base compound. When a bath of molten aluminium is inoculated with this refiner, an intermetallic aluminium-niobium compound is formed which acts as a powerful nucleant, producing extremely fine structures comparable to those obtained by means of the traditional grain refiner based on titanium and boron. It was found that the refinement of the structure depends upon the weight percentage of the new refiner inoculated as well as the holding time of the bath after inoculation and before pouring, but mainly on the inoculation temperature. (Author) [pt

  16. Analysis of a HP-refinement method for solving the neutron transport equation using two error estimators

    International Nuclear Information System (INIS)

    Fournier, D.; Le Tellier, R.; Suteau, C.; Herbin, R.

    2011-01-01

    The solution of the time-independent neutron transport equation in a deterministic way invariably consists in the successive discretization of the three variables: energy, angle and space. In the SNATCH solver used in this study, the energy and the angle are respectively discretized with a multigroup approach and the discrete ordinate method. A set of spatially coupled transport equations is obtained and solved using the Discontinuous Galerkin Finite Element Method (DGFEM). Within this method, the spatial domain is decomposed into elements and the solution is approximated by a hierarchical polynomial basis in each one. This approach is time and memory consuming when the mesh becomes fine or the basis order high. To improve the computational time and the memory footprint, adaptive algorithms are proposed. These algorithms are based on an error estimation in each cell. If the error is large in a given region, the mesh has to be refined (h-refinement) or the polynomial basis order increased (p-refinement). This paper is concerned with the choice between the two types of refinement. Two ways to estimate the error are compared on different benchmarks. Analyzing the differences, a hp-refinement method is proposed and tested. (author)

  17. Indian refining industry

    International Nuclear Information System (INIS)

    Singh, I.J.

    2002-01-01

    The author discusses the history of the Indian refining industry and ongoing developments under the headings: the present state; refinery configuration; Indian capabilities for refinery projects; and reforms in the refining industry. Tables list India's petroleum refineries, giving location and capacity; new refinery projects, together with location and capacity; and expansion projects of Indian petroleum refineries. The Indian refinery industry has undergone substantial expansion as well as technological change over the past years. There has been progressive technology upgrading, energy efficiency, better environmental control and improved capacity utilisation. Major reform processes have been set in motion by the government of India: converting the refining industry from a centrally controlled, public-sector-dominated industry to a delicensed regime in a competitive market economy with the introduction of a liberal exploration policy; dismantling the administered price mechanism; and a 25-year hydrocarbon vision. (UK)

  18. Spanish Refining

    International Nuclear Information System (INIS)

    Lores, F.R.

    2001-01-01

    An overview of petroleum refining in Spain is presented (by Repsol YPF) and some views on future trends are discussed. Spain depends heavily on imports. Sub-headings in the article cover sources of crude imports, investments, and logistics and marketing; detailed data for each are shown diagrammatically. Tables show: (1) economic indicators (e.g. total GDP, vehicle numbers and inflation) for 1998-200; (2) crude oil imports for 1995-2000; (3) oil products balance for 1995-2000; (4) commodities demand, by product; (5) refining in Spain in terms of capacity per region; (6) outlets in Spain and other European countries in 2002; and (7) sales distribution channels by product.

  19. Refining Automatically Extracted Knowledge Bases Using Crowdsourcing

    OpenAIRE

    Li, Chunhua; Zhao, Pengpeng; Sheng, Victor S.; Xian, Xuefeng; Wu, Jian; Cui, Zhiming

    2017-01-01

    Machine-constructed knowledge bases often contain noisy and inaccurate facts. There exists significant work in developing automated algorithms for knowledge base refinement. Automated approaches improve the quality of knowledge bases but are far from perfect. In this paper, we leverage crowdsourcing to improve the quality of automatically extracted knowledge bases. As human labelling is costly, an important research challenge is how we can use limited human resources to maximize the quality i...

  20. Refining intra-protein contact prediction by graph analysis

    Directory of Open Access Journals (Sweden)

    Eyal Eran

    2007-05-01

    Full Text Available Abstract Background Accurate prediction of intra-protein residue contacts from sequence information will allow the prediction of protein structures. Basic predictions of such specific contacts can be further refined by jointly analyzing predicted contacts, and by adding information on the relative positions of contacts in the protein primary sequence. Results We introduce a method for graph analysis refinement of intra-protein contacts, termed GARP. Our previously presented intra-contact prediction method by means of pair-to-pair substitution matrix (P2PConPred was used to test the GARP method. In our approach, the top contact predictions obtained by a basic prediction method were used as edges to create a weighted graph. The edges were scored by a mutual clustering coefficient that identifies highly connected graph regions, and by the density of edges between the sequence regions of the edge nodes. A test set of 57 proteins with known structures was used to determine contacts. GARP improves the accuracy of the P2PConPred basic prediction method in whole proteins from 12% to 18%. Conclusion Using a simple approach we increased the contact prediction accuracy of a basic method by 1.5 times. Our graph approach is simple to implement, can be used with various basic prediction methods, and can provide input for further downstream analyses.
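
    The graph re-scoring idea in this record can be shown with a toy shared-neighbour score over predicted contact edges. The scoring function below is an illustrative stand-in for GARP's exact mutual clustering coefficient and sequence-density term, and the edge list is invented.

    ```python
    # Hedged sketch: treat top predicted contacts as graph edges, then re-rank
    # each edge by how many neighbours its two endpoints share. Isolated edges
    # (likely false positives) sink to the bottom of the ranking.
    from collections import defaultdict

    predicted = [(1, 2), (2, 3), (1, 3), (3, 4), (1, 4), (10, 40)]  # toy pairs

    adj = defaultdict(set)
    for i, j in predicted:
        adj[i].add(j)
        adj[j].add(i)

    def mutual_score(i, j):
        # shared neighbours, normalised by the smaller neighbourhood
        shared = adj[i] & adj[j]
        return len(shared) / max(1, min(len(adj[i]), len(adj[j])))

    reranked = sorted(predicted, key=lambda e: mutual_score(*e), reverse=True)
    print(reranked[-1])  # → (10, 40): the isolated pair drops to the bottom
    ```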

  1. Comparing Refinements for Failure and Bisimulation Semantics

    NARCIS (Netherlands)

    Eshuis, H.; Fokkinga, M.M.

    2002-01-01

    Refinement in bisimulation semantics is defined differently from refinement in failure semantics: in bisimulation semantics refinement is based on simulations between labelled transition systems, whereas in failure semantics refinement is based on inclusions between failure systems. There exist

  2. SFESA: a web server for pairwise alignment refinement by secondary structure shifts.

    Science.gov (United States)

    Tong, Jing; Pei, Jimin; Grishin, Nick V

    2015-09-03

    Protein sequence alignment is essential for a variety of tasks such as homology modeling and active site prediction. Alignment errors remain the main cause of low-quality structure models. A bioinformatics tool to refine alignments is needed to make protein alignments more accurate. We developed the SFESA web server to refine pairwise protein sequence alignments. Compared to the previous version of SFESA, which required a set of 3D coordinates for a protein, the new server will search a sequence database for the closest homolog with an available 3D structure to be used as a template. For each alignment block defined by secondary structure elements in the template, SFESA evaluates alignment variants generated by local shifts and selects the best-scoring alignment variant. A scoring function that combines the sequence score of profile-profile comparison and the structure score of template-derived contact energy is used for evaluation of alignments. PROMALS pairwise alignments refined by SFESA are more accurate than those produced by current advanced alignment methods such as HHpred and CNFpred. In addition, SFESA also improves alignments generated by other software. SFESA is a web-based tool for alignment refinement, designed for researchers to compute, refine, and evaluate pairwise alignments with a combined sequence and structure scoring of alignment blocks. To our knowledge, the SFESA web server is the only tool that refines alignments by evaluating local shifts of secondary structure elements. The SFESA web server is available at http://prodata.swmed.edu/sfesa.
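
    The block-shift evaluation that SFESA performs can be illustrated with a toy identity score standing in for its combined profile-profile and contact-energy score. The sequences, window anchor, and shift range below are invented for illustration.

    ```python
    # Hedged sketch: for one template-defined alignment block, try small left/right
    # shifts of the block within the query region and keep the best-scoring shift.

    def block_score(a, b):
        # toy column score: count identical characters (not SFESA's scoring)
        return sum(x == y for x, y in zip(a, b))

    def best_shift(region, template_block, shifts=(-2, -1, 0, 1, 2), anchor=2):
        # slide the template block across the query region around an anchor offset
        return max(shifts, key=lambda s: block_score(
            region[anchor + s: anchor + s + len(template_block)], template_block))

    region = "AAGHELLYAA"   # query segment around one secondary structure element
    tmpl   = "HELLY"        # template-defined block
    print(best_shift(region, tmpl))  # → 1: the block aligns best one column right
    ```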

  3. Crystal structure refinement with SHELXL

    Energy Technology Data Exchange (ETDEWEB)

    Sheldrick, George M., E-mail: gsheldr@shelx.uni-ac.gwdg.de [Department of Structural Chemistry, Georg-August Universität Göttingen, Tammannstraße 4, Göttingen 37077 (Germany)

    2015-01-01

    New features added to the refinement program SHELXL since 2008 are described and explained. The improvements in the crystal structure refinement program SHELXL have been closely coupled with the development and increasing importance of the CIF (Crystallographic Information Framework) format for validating and archiving crystal structures. An important simplification is that now only one file in CIF format (for convenience, referred to simply as ‘a CIF’) containing embedded reflection data and SHELXL instructions is needed for a complete structure archive; the program SHREDCIF can be used to extract the .hkl and .ins files required for further refinement with SHELXL. Recent developments in SHELXL facilitate refinement against neutron diffraction data, the treatment of H atoms, the determination of absolute structure, the input of partial structure factors and the refinement of twinned and disordered structures. SHELXL is available free to academics for the Windows, Linux and Mac OS X operating systems, and is particularly suitable for multiple-core processors.

  4. METHOD FOR SELECTION OF PROJECT MANAGEMENT APPROACH BASED ON FUZZY CONCEPTS

    Directory of Open Access Journals (Sweden)

    Igor V. KONONENKO

    2017-03-01

    Full Text Available An analysis of the literature devoted to the selection of a project management approach, and to the development of effective methods for solving this problem, is given. A mathematical model and method for selecting a project management approach using fuzzy concepts of the applicability of existing approaches are proposed. The selection is made among approaches such as the PMBOK Guide, the ISO 21500 standard, the PRINCE2 methodology, the SWEBOK Guide, and the agile methodologies Scrum, XP and Kanban. The project parameters that have the greatest impact on the result of the selection, and the measure of their impact, are determined. The project parameters relate to information about the project, the team, communication, and critical project risks; they include the number of people involved in the project, the customer's experience with this project team, the project team's experience in the field, the project team's understanding of the requirements, adaptability, initiative, and others. The suggested method is illustrated by applying it to the selection of a project management approach for a software development project.
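
    The kind of fuzzy applicability scoring this record describes can be sketched as a weighted aggregation of membership values. All approaches, parameters, memberships, and weights below are invented for illustration; the paper's actual model and parameter set are richer.

    ```python
    # Hedged toy: each approach has a fuzzy membership value per project
    # parameter; the approach with the highest weighted membership is selected.

    approaches = {
        "PMBOK":  {"team_size": 0.8, "requirements_clarity": 0.9, "adaptability": 0.4},
        "Scrum":  {"team_size": 0.6, "requirements_clarity": 0.3, "adaptability": 0.9},
        "Kanban": {"team_size": 0.5, "requirements_clarity": 0.4, "adaptability": 0.8},
    }
    weights = {"team_size": 0.2, "requirements_clarity": 0.5, "adaptability": 0.3}

    def score(memberships):
        # weighted sum of fuzzy membership values
        return sum(weights[k] * v for k, v in memberships.items())

    best = max(approaches, key=lambda a: score(approaches[a]))
    print(best)  # → PMBOK (0.8*0.2 + 0.9*0.5 + 0.4*0.3 = 0.73, the top score)
    ```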

  5. An Approximate Approach to Automatic Kernel Selection.

    Science.gov (United States)

    Ding, Lizhong; Liao, Shizhong

    2016-02-02

    Kernel selection is a fundamental problem of kernel-based learning algorithms. In this paper, we propose an approximate approach to automatic kernel selection for regression from the perspective of kernel matrix approximation. We first introduce multilevel circulant matrices into automatic kernel selection, and develop two approximate kernel selection algorithms by exploiting the computational virtues of multilevel circulant matrices. The complexity of the proposed algorithms is quasi-linear in the number of data points. Then, we prove an approximation error bound to measure the effect of the approximation in kernel matrices by multilevel circulant matrices on the hypothesis and further show that the approximate hypothesis produced with multilevel circulant matrices converges to the accurate hypothesis produced with kernel matrices. Experimental evaluations on benchmark datasets demonstrate the effectiveness of approximate kernel selection.

  6. CADDIS Volume 4. Data Analysis: Selecting an Analysis Approach

    Science.gov (United States)

    An approach for selecting statistical analyses to inform causal analysis. Describes methods for determining whether test site conditions differ from reference expectations. Describes an approach for estimating stressor-response relationships.

  7. Latin American oil markets and refining

    International Nuclear Information System (INIS)

    Yamaguchi, N.D.; Obadia, C.

    1999-01-01

    This paper provides an overview of the oil markets and refining in Argentina, Brazil, Chile, Colombia, Ecuador, Mexico, Peru and Venezuela, and examines the production of crude oil in these countries. Details are given of Latin American refiners highlighting trends in crude distillation unit capacity, cracking to distillation ratios, and refining in the different countries. Latin American oil trade is discussed, and charts are presented illustrating crude production, oil consumption, crude refining capacity, cracking to distillation ratios, and oil imports and exports

  8. A parallel adaptive mesh refinement algorithm for predicting turbulent non-premixed combusting flows

    International Nuclear Information System (INIS)

    Gao, X.; Groth, C.P.T.

    2005-01-01

    A parallel adaptive mesh refinement (AMR) algorithm is proposed for predicting turbulent non-premixed combusting flows characteristic of gas turbine engine combustors. The Favre-averaged Navier-Stokes equations governing mixture and species transport for a reactive mixture of thermally perfect gases in two dimensions, the two transport equations of the k-ω turbulence model, and the time-averaged species transport equations, are all solved using a fully coupled finite-volume formulation. A flexible block-based hierarchical data structure is used to maintain the connectivity of the solution blocks in the multi-block mesh and facilitate automatic solution-directed mesh adaptation according to physics-based refinement criteria. This AMR approach allows for anisotropic mesh refinement and the block-based data structure readily permits efficient and scalable implementations of the algorithm on multi-processor architectures. Numerical results for turbulent non-premixed diffusion flames, including cold- and hot-flow predictions for a bluff body burner, are described and compared to available experimental data. The numerical results demonstrate the validity and potential of the parallel AMR approach for predicting complex non-premixed turbulent combusting flows. (author)

  9. Supplier selection problem: A fuzzy multicriteria approach | Allouche ...

    African Journals Online (AJOL)

    The purpose of this paper is to suggest a fuzzy multi-criteria approach to solve the supplier selection problem, an approach based on the fuzzy analytic hierarchy process and imprecise goal programming. To deal with decision-maker (DM) preferences, the concept of satisfaction function is introduced. The proposed ...

  10. APPLICATION OF DEEP LEARNING IN GLOBELAND30-2010 PRODUCT REFINEMENT

    Directory of Open Access Journals (Sweden)

    T. Liu

    2018-04-01

    Full Text Available GlobeLand30, one of the best Global Land Cover (GLC) products at 30-m resolution, has been widely used in many research fields. Due to the significant spectral confusion among different land cover types and the limited textural information of Landsat data, the overall accuracy of GlobeLand30 is about 80 %. Although such accuracy is much higher than that of most other global land cover products, it cannot satisfy various applications. There is still a great need for an effective method to improve the quality of GlobeLand30. The explosion of high-resolution satellite imagery and the remarkable performance of deep learning on image classification provide a new opportunity to refine GlobeLand30. However, the performance of deep learning depends on the quality and quantity of training samples as well as on the model training strategy. Therefore, this paper (1) proposes an automatic training sample generation method via Google Earth to build a large training sample set; and (2) explores the best training strategy for land cover classification using GoogLeNet (Inception V3), one of the most widely used deep learning networks. The results show that fine-tuning from the first layer of Inception V3 using the rough large sample set is the best strategy. The retrained network was then applied to one selected area of Xi'an city as a case study of GlobeLand30 refinement. The experimental results indicate that the proposed approach with deep learning and Google Earth imagery is a promising solution for further improving the accuracy of GlobeLand30.

  11. Application of Deep Learning in GLOBELAND30-2010 Product Refinement

    Science.gov (United States)

    Liu, T.; Chen, X.

    2018-04-01

    GlobeLand30, one of the best Global Land Cover (GLC) products at 30-m resolution, has been widely used in many research fields. Due to the significant spectral confusion among different land cover types and the limited textural information of Landsat data, the overall accuracy of GlobeLand30 is about 80 %. Although such accuracy is much higher than that of most other global land cover products, it cannot satisfy various applications. There is still a great need for an effective method to improve the quality of GlobeLand30. The explosion of high-resolution satellite imagery and the remarkable performance of deep learning on image classification provide a new opportunity to refine GlobeLand30. However, the performance of deep learning depends on the quality and quantity of training samples as well as on the model training strategy. Therefore, this paper (1) proposes an automatic training sample generation method via Google Earth to build a large training sample set; and (2) explores the best training strategy for land cover classification using GoogLeNet (Inception V3), one of the most widely used deep learning networks. The results show that fine-tuning from the first layer of Inception V3 using the rough large sample set is the best strategy. The retrained network was then applied to one selected area of Xi'an city as a case study of GlobeLand30 refinement. The experimental results indicate that the proposed approach with deep learning and Google Earth imagery is a promising solution for further improving the accuracy of GlobeLand30.

  12. A new physical mapping approach refines the sex-determining gene positions on the Silene latifolia Y-chromosome

    Science.gov (United States)

    Kazama, Yusuke; Ishii, Kotaro; Aonuma, Wataru; Ikeda, Tokihiro; Kawamoto, Hiroki; Koizumi, Ayako; Filatov, Dmitry A.; Chibalina, Margarita; Bergero, Roberta; Charlesworth, Deborah; Abe, Tomoko; Kawano, Shigeyuki

    2016-01-01

    Sex chromosomes are particularly interesting regions of the genome for both molecular genetics and evolutionary studies; yet, for most species, we lack basic information, such as the gene order along the chromosome. Because they lack recombination, Y-linked genes cannot be mapped genetically, leaving physical mapping as the only option for establishing the extent of synteny and homology with the X chromosome. Here, we developed a novel and general method for deletion mapping of non-recombining regions by solving “the travelling salesman problem”, and evaluate its accuracy using simulated datasets. Unlike the existing radiation hybrid approach, this method allows us to combine deletion mutants from different experiments and sources. We applied our method to a set of newly generated deletion mutants in the dioecious plant Silene latifolia and refined the locations of the sex-determining loci on its Y chromosome map.
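The combinatorial core of deletion mapping can be shown on a toy example. The paper formulates it as a travelling salesman problem to handle large, noisy datasets; the hypothetical brute-force sketch below simply searches for a marker order in which each mutant's deleted markers form a single contiguous block. All marker names and deletion sets are invented.

```python
# Toy deletion-mapping sketch (brute force, not the paper's TSP solver).
from itertools import permutations

def breaks(order, deleted):
    """Number of contiguous runs of deleted markers under this order."""
    runs = 0
    prev = False
    for m in order:
        cur = m in deleted
        if cur and not prev:
            runs += 1
        prev = cur
    return runs

def best_order(markers, mutants):
    """Order minimizing the total run count: a perfect order gives one
    contiguous deleted block per mutant."""
    return min(permutations(markers),
               key=lambda order: sum(breaks(order, d) for d in mutants))

markers = ["A", "B", "C", "D"]
# Each set lists the markers missing in one deletion mutant.
mutants = [{"A", "B"}, {"B", "C"}, {"C", "D"}]
order = best_order(markers, mutants)
print(order)  # ('A', 'B', 'C', 'D')
```

Brute force is exponential in the number of markers, which is why a travelling-salesman formulation with efficient heuristics becomes necessary for real Y-chromosome datasets.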

  13. Relational Demonic Fuzzy Refinement

    Directory of Open Access Journals (Sweden)

    Fairouz Tchier

    2014-01-01

    Full Text Available We use relational algebra to define a fuzzy refinement order called demonic fuzzy refinement, together with the associated fuzzy operators: fuzzy demonic join (⊔fuz), fuzzy demonic meet (⊓fuz), and fuzzy demonic composition (□fuz). Our definitions and properties are illustrated by some examples using the Mathematica software (fuzzy logic).
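For orientation only: the paper's demonic operators are defined relation-algebraically with domain restrictions, but they are built on the ordinary fuzzy relation operations, which can be sketched directly. Below, fuzzy relations are matrices of membership grades in [0, 1], join is pointwise max, meet is pointwise min, and composition is max-min composition; the demonic variants further constrain these via relation domains.

```python
# Standard (angelic) fuzzy relation operators, as a baseline sketch;
# not the paper's demonic definitions.

def fuzzy_join(Q, R):
    return [[max(q, r) for q, r in zip(qr, rr)] for qr, rr in zip(Q, R)]

def fuzzy_meet(Q, R):
    return [[min(q, r) for q, r in zip(qr, rr)] for qr, rr in zip(Q, R)]

def fuzzy_compose(Q, R):
    """(Q ; R)(i, k) = max_j min(Q(i, j), R(j, k))."""
    cols = len(R[0])
    return [[max(min(Q[i][j], R[j][k]) for j in range(len(R)))
             for k in range(cols)] for i in range(len(Q))]

Q = [[0.2, 0.9], [0.5, 0.4]]
R = [[0.7, 0.1], [0.3, 0.8]]
print(fuzzy_join(Q, R))     # [[0.7, 0.9], [0.5, 0.8]]
print(fuzzy_compose(Q, R))  # [[0.3, 0.8], [0.5, 0.4]]
```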

  14. The economic impact of taxes on refined petroleum products in the Philippines

    International Nuclear Information System (INIS)

    Boyd, R.; Uri, N.D.

    1993-01-01

    This paper uses an aggregate modelling approach to assess the impact of taxes on refined petroleum products on the Philippine economy. The effects of removing the 48% tax on premium and regular gasoline and the 24% tax on other refined petroleum products on prices and quantities are examined. For example, the consequences of a complete elimination of refined petroleum product taxes would be an increase in output by all producing sectors of about 3.7% or about 2.65 hundred billion Philippine pesos, a rise in the consumption of goods and services by about 13.6% or 4.2 hundred billion Philippine pesos, and a fall in tax revenue for the government of 62.4% or 2.8 hundred billion Philippine pesos. When subjected to sensitivity analyses, the results are reasonably robust. (author)

  15. Panorama 2012 - Refining 2030

    International Nuclear Information System (INIS)

    Marion, Pierre; Saint-Antonin, Valerie

    2011-11-01

    The major uncertainty characterizing the global energy landscape impacts particularly on transport, which remains the virtually-exclusive bastion of the oil industry. The industry must therefore respond to increasing demand for mobility against a background marked by the emergence of alternatives to oil-based fuels and the need to reduce emissions of pollutants and greenhouse gases (GHG). It is in this context that the 'Refining 2030' study conducted by IFP Energies Nouvelles (IFPEN) forecasts what the global supply and demand balance for oil products could be, and highlights the type and geographical location of the refinery investment required. Our study shows that the bulk of the refining investment will be concentrated in the emerging countries (mainly those in Asia), whilst the areas historically strong in refining (Europe and North America) face reductions in capacity. In this context, the drastic reduction in the sulphur specification of bunker oil emerges as a structural issue for European refining, in the same way as increasingly restrictive regulation of refinery CO2 emissions (quotas/taxation) and the persistent imbalance between gasoline and diesel fuels. (authors)

  16. Refining margins and prospects

    International Nuclear Information System (INIS)

    Baudouin, C.; Favennec, J.P.

    1997-01-01

    Refining margins throughout the world remained low in 1996. In Europe, in spite of an improvement, particularly during the last few weeks, they are still not high enough to finance new investment. Although demand for petroleum products is increasing, experts remain sceptical about any rapid recovery, owing to prevailing overcapacity and continuing capacity growth. After a historical review of margins and an analysis of margins by region, we analyse refining overcapacity in Europe and the imbalances between production and demand. We then discuss the current barriers to rationalization, agreements between oil companies, and the consequences for the future of refining capacities and margins. (author)

  17. A quality-refinement process for medical imaging applications.

    Science.gov (United States)

    Neuhaus, J; Maleike, D; Nolden, M; Kenngott, H-G; Meinzer, H-P; Wolf, I

    2009-01-01

    To introduce and evaluate a process for the refinement of software quality that is suitable for research groups. In order to avoid constraining researchers too much, the quality improvement process has to be designed carefully. The scope of this paper is to present and evaluate a process to advance quality aspects of existing research prototypes in order to make them ready for initial clinical studies. The proposed process is tailored to research environments and is therefore more lightweight than traditional quality management processes: it focuses on quality criteria that are important at the given stage of the software life cycle, and it emphasizes tools that automate aspects of the process. To evaluate the additional effort that comes along with the process, it was applied, by way of example, to eight prototypical software modules for medical image processing. The introduced process has been applied to improve the quality of all prototypes so that they could be successfully used in clinical studies. The quality refinement required an average of 13 person-days of additional effort per project. Overall, 107 bugs were found and resolved by applying the process. Careful selection of quality criteria and the use of automated process tools lead to a lightweight quality refinement process, suitable for scientific research groups, that can be applied to ensure a successful transfer of technical software prototypes into clinical research workflows.

  18. Numerical modelling of surface waves generated by low frequency electromagnetic field for silicon refinement process

    Science.gov (United States)

    Geža, V.; Venčels, J.; Zāģeris, Ģ.; Pavlovs, S.

    2018-05-01

    One of the most promising methods of producing SoG-Si is refinement via the metallurgical route. The most critical part of this route is refinement from boron and phosphorus; therefore, the approach under development addresses this problem. An approach of creating surface waves on the silicon melt's surface is proposed in order to enlarge its area and accelerate the removal of boron via chemical reactions and the evaporation of phosphorus. A two-dimensional numerical model is created which couples electromagnetic and fluid-dynamic simulations with free-surface dynamics. First results show behaviour similar to experimental results from the literature.

  19. Refinement Types for TypeScript

    OpenAIRE

    Vekris, Panagiotis; Cosman, Benjamin; Jhala, Ranjit

    2016-01-01

    We present Refined TypeScript (RSC), a lightweight refinement type system for TypeScript, that enables static verification of higher-order, imperative programs. We develop a formal core of RSC that delineates the interaction between refinement types and mutability. Next, we extend the core to account for the imperative and dynamic features of TypeScript. Finally, we evaluate RSC on a set of real world benchmarks, including parts of the Octane benchmarks, D3, Transducers, and the TypeScript co...

  20. Stock selection using a hybrid MCDM approach

    Directory of Open Access Journals (Sweden)

    Tea Poklepović

    2014-12-01

    Full Text Available The problem of selecting the right stocks to invest in is of immense interest to investors on both emerging and developed capital markets. Moreover, an investor should take into account all available data regarding stocks on the particular market, including fundamental and stock market indicators. The decision-making process involves several candidate stocks and more than one criterion, so the task of selecting the stocks to invest in can be viewed as a multiple-criteria decision-making (MCDM) problem. Using several MCDM methods, however, often leads to divergent rankings. The goal of this paper is to resolve these divergent results using a hybrid MCDM approach based on Spearman's rank correlation coefficient. Five MCDM methods are selected: COPRAS, linear assignment, PROMETHEE, SAW and TOPSIS. The weights for all criteria are obtained using the AHP method. Data for this study include stock returns and traded volumes from March 2012 to March 2014 for 19 stocks on the Croatian capital market, together with the most important fundamental and stock market indicators for the selected stocks. Rankings by the five selected MCDM methods yield divergent results; after applying the proposed approach, final hybrid rankings are obtained. The results show that the worst stocks to invest in are the same whether or not industry is taken into consideration. However, when industry is taken into account, the best stocks to invest in differ slightly, because some industries are more profitable than others.
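One plausible way to aggregate divergent rankings with Spearman's coefficient is sketched below: weight each method by its average rank correlation with the other methods, then combine the ranks. This is an illustrative reconstruction, not necessarily the paper's exact procedure, and the three methods and four-stock rankings shown are invented.

```python
# Hypothetical hybrid-ranking sketch using Spearman's rank correlation.

def spearman(r1, r2):
    """Spearman's rho for two tie-free rankings: 1 - 6*sum(d^2)/(n(n^2-1))."""
    n = len(r1)
    d2 = sum((a - b) ** 2 for a, b in zip(r1, r2))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

def hybrid_ranking(rankings):
    """rankings: dict method -> list of ranks (1 = best) over the same stocks.
    Each method is weighted by its mean agreement with the others; each
    stock's final rank comes from the weighted mean of its ranks."""
    methods = list(rankings)
    weights = {}
    for m in methods:
        others = [spearman(rankings[m], rankings[o]) for o in methods if o != m]
        weights[m] = sum(others) / len(others)
    total = sum(weights.values())
    n = len(next(iter(rankings.values())))
    scores = [sum(weights[m] * rankings[m][i] for m in methods) / total
              for i in range(n)]
    # Convert aggregate scores back to ranks (lowest score = rank 1).
    order = sorted(range(n), key=lambda i: scores[i])
    final = [0] * n
    for rank, i in enumerate(order, start=1):
        final[i] = rank
    return final

rankings = {
    "COPRAS": [1, 2, 3, 4],
    "SAW":    [2, 1, 3, 4],
    "TOPSIS": [1, 3, 2, 4],
}
print(hybrid_ranking(rankings))  # [1, 2, 3, 4]
```

Methods that agree with the consensus get more weight, so an outlier ranking is softened rather than discarded.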

  1. Clustering based gene expression feature selection method: A computational approach to enrich the classifier efficiency of differentially expressed genes

    KAUST Repository

    Abusamra, Heba

    2016-07-20

    The high-dimension, low-sample-size nature of gene expression data makes the classification task more challenging; therefore, feature (gene) selection becomes an apparent need. Selecting meaningful and relevant genes for a classifier not only decreases the computational time and cost, but also improves the classification performance. Among the different approaches to feature selection, however, most suffer from several problems, such as lack of robustness and validation issues. Here, we present a new feature selection technique that takes advantage of clustering both samples and genes. Materials and methods: We used a leukemia gene expression dataset [1]. The effectiveness of the selected features was evaluated by four different classification methods: support vector machines, k-nearest neighbor, random forest, and linear discriminant analysis. The method evaluates the importance and relevance of each gene cluster by summing the expression levels of the genes belonging to that cluster. A gene cluster is considered important if it satisfies conditions depending on thresholds and percentages; otherwise it is eliminated. Results: Initial analysis identified 7120 differentially expressed genes of leukemia (Fig. 15a); after applying our feature selection methodology we ended up with 1117 specific genes discriminating the two classes of leukemia (Fig. 15b). Applying the same method with a more stringent condition, a higher positive and a lower negative threshold, further reduced the number to 58 genes, which were tested to evaluate the effectiveness of the method (Fig. 15c). The results of the four classification methods are summarized in Table 11. Conclusions: The feature selection method gave good results with minimum classification error. Our heat-map result shows a distinct pattern of the refined genes discriminating between the two classes of leukemia.
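The cluster-scoring step described above can be sketched in a few lines. This is a minimal illustration with made-up cluster data and thresholds, not the paper's implementation: a cluster is kept when its summed expression is strongly positive or strongly negative, and eliminated otherwise.

```python
# Hypothetical sketch of threshold-based gene-cluster selection.

def select_clusters(clusters, pos_threshold, neg_threshold):
    """clusters: dict cluster_id -> per-gene summed expression levels.
    Keep clusters whose total expression exceeds pos_threshold or falls
    below neg_threshold; return the surviving cluster ids."""
    selected = []
    for cid, levels in clusters.items():
        total = sum(levels)
        if total >= pos_threshold or total <= neg_threshold:
            selected.append(cid)
    return selected

clusters = {
    "c1": [2.5, 3.1, 2.8],    # strongly up-regulated
    "c2": [-2.9, -3.4],       # strongly down-regulated
    "c3": [0.2, -0.1, 0.3],   # near zero -> eliminated
}
print(select_clusters(clusters, pos_threshold=5.0, neg_threshold=-5.0))
# ['c1', 'c2']
```

Tightening the thresholds (a higher positive and a lower negative bound) shrinks the surviving gene set, mirroring the reduction from 1117 to 58 genes reported above.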

  2. A Refinement Calculus for Circus - Mini-thesis

    OpenAIRE

    Oliveira, Marcel V. M.

    2004-01-01

    Most software developments do not use any of the existing theories and formalisms, which leads to a loss of precision and correctness in the resulting software. Two different approaches to formal techniques have arisen in the past decades: one focuses on data aspects, and the other on the behavioural aspects of the system. Some combined languages have already been proposed to bring these two schools together. However, as far as we know, none of them has a related refinement calculus....

  3. Columnar to equiaxed transition in a refined Al-Cu alloy under diffusive and convective transport conditions

    Energy Technology Data Exchange (ETDEWEB)

    Dupouy, M.D.; Camel, D.; Mazille, J.E. [CEA Centre d' Etudes et de Recherches sur les Materiaux, 38 - Grenoble (France); Hugon, I. [Lab. de Metallographie, DCC/DTE/SIM, CEA Valrho (France)

    2000-07-01

    The columnar-equiaxed transition under diffusive transport conditions was studied in microgravity (EUROMIR95 and spacelab-LMS96) by solidifying four Al-4wt%Cu alloys refined at different levels, with a constant cooling rate (1 K/min), both under nearly isothermal conditions and under a decreasing temperature gradient. Isothermal samples showed a homogeneous equiaxed structure with no fading of the refiner efficiency. Gradient samples revealed a continuous transition consisting of an orientation of the microsegregation parallel to the solidification direction, without any grain selection effect. For comparison, ground samples evidence the influence of the motion of both refiner particles and growing equiaxed grains. (orig.)

  4. Probing the moduli dependence of refined topological amplitudes

    Directory of Open Access Journals (Sweden)

    I. Antoniadis

    2015-12-01

    Full Text Available With the aim of providing a worldsheet description of the refined topological string, we continue the study of a particular class of higher derivative couplings Fg,n in the type II string effective action compactified on a Calabi–Yau threefold. We analyse first order differential equations in the anti-holomorphic moduli of the theory, which relate the Fg,n to other component couplings. From the point of view of the topological theory, these equations describe the contribution of non-physical states to twisted correlation functions and encode an obstruction for interpreting the Fg,n as the free energy of the refined topological string theory. We investigate possibilities of lifting this obstruction by formulating conditions on the moduli dependence under which the differential equations simplify and take the form of generalised holomorphic anomaly equations. We further test this approach against explicit calculations in the dual heterotic theory.

  5. Neutron Powder Diffraction and Constrained Refinement

    DEFF Research Database (Denmark)

    Pawley, G. S.; Mackenzie, Gordon A.; Dietrich, O. W.

    1977-01-01

    The first use of a new program, EDINP, is reported. This program allows the constrained refinement of molecules in a crystal structure with neutron diffraction powder data. The structures of p-C6F4Br2 and p-C6F4I2 are determined by packing considerations and then refined with EDINP. Refinement is...

  6. Adaptive temporal refinement in injection molding

    Science.gov (United States)

    Karyofylli, Violeta; Schmitz, Mauritius; Hopmann, Christian; Behr, Marek

    2018-05-01

    Mold filling is an injection molding stage of great significance, because many defects of the plastic components (e.g. weld lines, burrs or insufficient filling) can occur during this process step. Therefore, it plays an important role in determining the quality of the produced parts. Our goal is the temporal refinement in the vicinity of the evolving melt front, in the context of 4D simplex-type space-time grids [1, 2]. This novel discretization method has an inherent flexibility to employ completely unstructured meshes with varying levels of resolution both in spatial dimensions and in the time dimension, thus allowing the use of local time-stepping during the simulations. This can lead to a higher simulation precision, while preserving calculation efficiency. A 3D benchmark case, which concerns the filling of a plate-shaped geometry, is used for verifying our numerical approach [3]. The simulation results obtained with the fully unstructured space-time discretization are compared to those obtained with the standard space-time method and to Moldflow simulation results. This example also serves for providing reliable timing measurements and the efficiency aspects of the filling simulation of complex 3D molds while applying adaptive temporal refinement.

  7. Refined geometric transition and qq-characters

    Science.gov (United States)

    Kimura, Taro; Mori, Hironori; Sugimoto, Yuji

    2018-01-01

    We show the refinement of the prescription for the geometric transition in the refined topological string theory and, as its application, discuss a possibility to describe qq-characters from the string theory point of view. Though the suggested way to operate the refined geometric transition has passed through several checks, it is additionally found in this paper that the presence of the preferred direction brings a nontrivial effect. We provide the modified formula involving this point. We then apply our prescription of the refined geometric transition to proposing the stringy description of doubly quantized Seiberg-Witten curves called qq-characters in certain cases.

  8. Calculating corrections in F-theory from refined BPS invariants and backreacted geometries

    Energy Technology Data Exchange (ETDEWEB)

    Poretschkin, Maximilian

    2015-07-01

    This thesis presents various corrections to F-theory compactifications which rely on the computation of refined Bogomol'nyi-Prasad-Sommerfield (BPS) invariants and the analysis of backreacted geometries. Detailed information about rigid supersymmetric theories in five dimensions is contained in an index counting refined BPS invariants. These BPS states fall into representations of SU(2){sub L} x SU(2){sub R}, the little group in five dimensions, which has an induced action on the cohomology of the moduli space of stable pairs. In the first part of this thesis, we present the computation of refined BPS state multiplicities associated to M-theory compactifications on local Calabi-Yau manifolds whose base is given by a del Pezzo or half K3 surface. For geometries with a toric realization we use an algorithm which is based on the Weierstrass normal form of the mirror geometry. In addition we use the refined holomorphic anomaly equation and the gap condition at the conifold locus in the moduli space in order to perform the direct integration and to fix the holomorphic ambiguity. In a second approach, we use the refined Goettsche formula and the refined modular anomaly equation that govern the (refined) genus expansion of the free energy of the half K3 surface. By this procedure, we compute the refined BPS invariants of the half K3 from which the results of the remaining del Pezzo surfaces are obtained by flop transitions and blow-downs. These calculations also make use of the high symmetry of the del Pezzo surfaces whose homology lattice contains the root lattice of exceptional Lie algebras. In cases where both approaches are applicable, we successfully check the compatibility of these two methods. In the second part of this thesis, we apply the results obtained from the calculation of the refined invariants of the del Pezzo respectively the half K3 surfaces to count non-perturbative objects in F-theory. The first application is given by BPS states of the E

  9. Petroleum movements and investments in the refining industry: The impact of worldwide environmental regulations

    International Nuclear Information System (INIS)

    Guariguata U., G.

    1995-01-01

    Since the enactment of the US Clean Air Act Amendments of 1990, the worldwide refining industry has become increasingly attuned to the future well-being of the environment. Refiners must now develop strategies which address careful selection of crude slates, significant increases and changes in product movements, and upgrading of facilities to meet growing demand--in short, strategies which allow them to make substantial increases in capital investments. The objective of this paper is to determine the regional capital investments refiners must make in order to comply with environmental legislation. The methodology in making this determination was founded on a comprehensive analysis of worldwide petroleum supply/demand and distribution patterns for the coming five years, and included evaluation of a set of linear programming (LP) models based on forecasts for regional product demands and projections of regional specifications. The models considered two scenarios, in which either (1) refinery expansion occurs chiefly in the market consuming regions, or (2) crude producers take control of incremental crude volumes and further expand their planned refining projects and the marketing of refined products. The results of these models, coupled with an understanding of geopolitical situations and economic analyses, provided estimates for capital expenditures for the coming decade. Specifically, the following issues were addressed, and are discussed in this paper: refined product trade outlook; crude supply; crude quality; shipping; and capital investments

  10. Validating neural-network refinements of nuclear mass models

    Science.gov (United States)

    Utama, R.; Piekarewicz, J.

    2018-01-01

    Background: Nuclear astrophysics centers on the role of nuclear physics in the cosmos. In particular, nuclear masses at the limits of stability are critical in the development of stellar structure and the origin of the elements. Purpose: We aim to test and validate the predictions of recently refined nuclear mass models against the newly published AME2016 compilation. Methods: The basic paradigm underlying the recently refined nuclear mass models is based on existing state-of-the-art models that are subsequently refined through the training of an artificial neural network. Bayesian inference is used to determine the parameters of the neural network so that statistical uncertainties are provided for all model predictions. Results: We observe a significant improvement in the Bayesian neural network (BNN) predictions relative to the corresponding "bare" models when compared to the nearly 50 new masses reported in the AME2016 compilation. Further, AME2016 estimates for the handful of impactful isotopes in the determination of r-process abundances are found to be in fairly good agreement with our theoretical predictions. Indeed, the BNN-improved Duflo-Zuker model predicts a root-mean-square deviation relative to experiment of σrms ≃ 400 keV. Conclusions: Given the excellent performance of the BNN refinement in confronting the recently published AME2016 compilation, we are confident of its critical role in our quest for mass models of the highest quality. Moreover, as uncertainty quantification is at the core of the BNN approach, the improved mass models are in a unique position to identify those nuclei that will have the strongest impact in resolving some of the outstanding questions in nuclear astrophysics.
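The quoted figure of merit, the root-mean-square deviation between model masses and experiment, is straightforward to compute. The sketch below uses invented numbers, not AME2016 values, chosen so the deviation lands at the 400 keV scale mentioned above.

```python
# sigma_rms = sqrt( (1/N) * sum_i (m_pred_i - m_exp_i)^2 )
import math

def rms_deviation(predicted, measured):
    residuals = [p - m for p, m in zip(predicted, measured)]
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))

pred = [8100.0, 7900.0, 8450.0]   # keV, hypothetical model masses
meas = [8500.0, 7500.0, 8050.0]   # keV, hypothetical experimental masses
print(round(rms_deviation(pred, meas)))  # 400
```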

  11. A “genetics first” approach to selection

    Science.gov (United States)

    A different approach for using genomic information in genetic improvement is proposed. Past research in population genetics and animal breeding combined with information on sequence variants suggest the possibility that selection might be able to capture a portion of inbreeding and heterosis effect...

  12. Adaptive mesh refinement in titanium

    Energy Technology Data Exchange (ETDEWEB)

    Colella, Phillip; Wen, Tong

    2005-01-21

    In this paper, we evaluate Titanium's usability as a high-level parallel programming language through a case study in which we implement a subset of Chombo's functionality in Titanium. Chombo is a software package applying the Adaptive Mesh Refinement methodology to numerical Partial Differential Equations at the production level. In Chombo, a library approach to parallel programming is used (C++ and Fortran, with MPI), whereas Titanium is a Java dialect designed for high-performance scientific computing. The performance of our implementation is studied and compared with that of Chombo in solving Poisson's equation based on two grid configurations from a real application. Line counts for the code on both sides are also provided.

  13. Computational methods and modeling. 3. Adaptive Mesh Refinement for the Nodal Integral Method and Application to the Convection-Diffusion Equation

    International Nuclear Information System (INIS)

    Torej, Allen J.; Rizwan-Uddin

    2001-01-01

    The nodal integral method (NIM) has been developed for several problems, including the Navier-Stokes equations, the convection-diffusion equation, and the multigroup neutron diffusion equations. The coarse-mesh efficiency of the NIM is not fully realized in problems characterized by a wide range of spatial scales. However, the combination of adaptive mesh refinement (AMR) capability with the NIM can recover the coarse-mesh efficiency by allowing high degrees of resolution in specific localized areas where it is needed and by using a lower resolution everywhere else. Furthermore, certain features of the NIM can be fruitfully exploited in the application of the AMR process. In this paper, we outline a general approach to couple nodal schemes with AMR and then apply it to the convection-diffusion (energy) equation. The development of the NIM with AMR capability (NIM-AMR) is based on the well-known Berger-Oliger method for structured AMR. In general, the main components of all AMR schemes are: 1. the solver; 2. the level-grid hierarchy; 3. the selection algorithm; 4. the communication procedures; 5. the governing algorithm. The first component, the solver, consists of the numerical scheme for the governing partial differential equations and the algorithm used to solve the resulting system of discrete algebraic equations. In the case of the NIM-AMR, the solver is the iterative approach to the solution of the set of discrete equations obtained by applying the NIM. Furthermore, in the NIM-AMR, the level-grid hierarchy (the second component) is based on the Hierarchical Adaptive Mesh Refinement (HAMR) system [6], and hence the details of the hierarchy are omitted here. In the selection algorithm, regions of the domain that require mesh refinement are identified. The criterion to select regions for mesh refinement can be based on the magnitude of the gradient or on the Richardson truncation error estimate.
Although an excellent choice for the selection criterion, the Richardson
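As a rough illustration of the Richardson-based selection criterion mentioned above, the sketch below compares a derivative estimate computed at grid spacings h and 2h and flags coarse cells whose estimated truncation error exceeds a tolerance. All names, the test field, and the tolerance are illustrative, not taken from the paper.

```python
import numpy as np

def richardson_flags(f_h, f_2h, order=2, tol=0.05):
    """Flag coarse cells for refinement via a Richardson truncation-error
    estimate: compare a discrete field computed at spacings h and 2h."""
    shared = f_h[::2]                       # fine-grid values at the coarse points
    err = np.abs(f_2h - shared) / (2**order - 1)
    return err > tol                        # True = refine this coarse cell

# toy usage: a steep front at x = 0.5 that the coarse grid under-resolves
x2 = np.linspace(0.0, 1.0, 11)              # coarse grid, spacing 2h
x1 = np.linspace(0.0, 1.0, 21)              # fine grid, spacing h
u2 = np.tanh((x2 - 0.5) / 0.05)
u1 = np.tanh((x1 - 0.5) / 0.05)
# derivative estimates at the two resolutions differ most near the front
flags = richardson_flags(np.gradient(u1, x1), np.gradient(u2, x2))
print(np.where(flags)[0])                   # only cells near the front are flagged
```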

  14. On Modal Refinement and Consistency

    DEFF Research Database (Denmark)

    Nyman, Ulrik; Larsen, Kim Guldstrand; Wasowski, Andrzej

    2007-01-01

Almost 20 years after the original conception, we revisit several fundamental questions about modal transition systems. First, we demonstrate the incompleteness of the standard modal refinement using a counterexample due to Hüttel. Deciding any refinement, complete with respect to the standard...

  15. Refining - Panorama 2008

    International Nuclear Information System (INIS)

    2008-01-01

    Investment rallied in 2007, and many distillation and conversion projects likely to reach the industrial stage were announced. With economic growth sustained in 2006 and still pronounced in 2007, oil demand remained strong - especially in emerging countries - and refining margins stayed high. Despite these favorable business conditions, tensions persisted in the refining sector, which has fallen far behind in terms of investing in refinery capacity. It will take renewed efforts over a long period to catch up. Looking at recent events that have affected the economy in many countries (e.g. the sub-prime crisis), prudence remains advisable

  16. Selective amygdalohippocampectomy via trans-superior temporal gyrus keyhole approach

    OpenAIRE

Mathon, Bertrand; Clemenceau, Stéphane

    2016-01-01

International audience; Background: Hippocampal sclerosis is the most common cause of drug-resistant epilepsy amenable to surgical treatment and seizure control. The rationale of selective amygdalohippocampectomy is to spare cerebral tissue not included in the seizure generator. Method: We describe the selective amygdalohippocampectomy through the trans-superior temporal gyrus keyhole approach. Conclusion: Selective amygdalohippocampectomy for temporal lobe epilepsy is performed when the data (semi...

  17. Refining Lane-Based Traffic Signal Settings to Satisfy Spatial Lane Length Requirements

    Directory of Open Access Journals (Sweden)

    Yanping Liu

    2017-01-01

Full Text Available In conventional lane-based signal optimization models, lane markings guiding road users in making turns are optimized together with traffic signal settings in a unified framework to maximize the overall intersection capacity or minimize the total delay. The spatial queue requirements of road lanes should be considered to avoid overdesign of green durations; the point-queue assumption adopted in the conventional lane-based framework can cause overflow in practice. Based on the optimization results from the original lane-based designs, a refinement is proposed to enhance the lane-based settings so that the spatial holding limits of the approaching traffic lanes are not exceeded. A solution heuristic is developed to modify the green start times, green durations, and cycle length by considering the vehicle queuing patterns and physical holding capacities along the approaching traffic lanes. To show the effectiveness of this traffic signal refinement, a case study of one of the busiest and most complicated intersections in Hong Kong is given for demonstration. A site survey was conducted to collect existing traffic demand patterns and existing traffic signal settings in peak periods. Results show that the proposed refinement method is effective in ensuring that all vehicle queue lengths satisfy spatial lane capacity limits, including those of short lanes, in daily operation.
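The refinement heuristic described above can be caricatured in a few lines: estimate the queue built up during red, compare it with the physical holding capacity of the lane, and advance the green start until the queue fits. This is a deliberately simplified sketch with assumed parameter names and values, not the authors' full lane-based formulation.

```python
def queue_exceeds_storage(arrival_rate, red_time, lane_length, veh_spacing=6.0):
    """Check whether the queue formed during red exceeds lane storage.

    arrival_rate : veh/s arriving at the stop line
    red_time     : effective red duration, s
    lane_length  : usable storage length of the (possibly short) lane, m
    veh_spacing  : average space one queued vehicle occupies, m (assumed)
    """
    queue_veh = arrival_rate * red_time          # vehicles queued at end of red
    storage_veh = lane_length // veh_spacing     # physical holding capacity
    return queue_veh > storage_veh

def refine_green_start(green_start, arrival_rate, red_time, lane_length,
                       step=2.0, min_start=0.0):
    """Advance the green start in small steps until the queue fits the lane."""
    while green_start > min_start and queue_exceeds_storage(
            arrival_rate, red_time, lane_length):
        green_start -= step        # start green earlier ...
        red_time -= step           # ... which shortens the red the queue sees
    return green_start, red_time

# a 60 m short lane with 0.25 veh/s arrivals overflows under 50 s of red:
start, red = refine_green_start(green_start=50.0, arrival_rate=0.25,
                                red_time=50.0, lane_length=60.0)
print(start, red)
```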

  18. The Development of Petroleum Refining in the World Market Dimensions of Sustainable Development

    Directory of Open Access Journals (Sweden)

    Alexey S. Shapran

    2015-11-01

Full Text Available The development of petroleum refining in the world market is investigated in the dimensions of sustainable development through the author's interpretation of the OECD "pressure – state – response" model, where CO2 emissions are proposed as the pressure parameter; output and foreign-trade indicators of the refining sector as the state parameters; and the eco-intensity of GDP as the indicator of community response. On the basis of economic and mathematical modeling, the parameters were adapted and evaluated for use in the model, and a quantitative assessment was performed of the relationship between the key requirements of sustainable development and the development of the world petroleum refining market. This approach made it possible to quantify the level and impact of individual factors on the development of the world petroleum market in countries with different technological structures.

  19. Colorectal cancer chemoprevention: the potential of a selective approach.

    Science.gov (United States)

    Ben-Amotz, Oded; Arber, Nadir; Kraus, Sarah

    2010-10-01

Colorectal cancer (CRC) is a leading cause of cancer death, and therefore demands special attention. Novel recent approaches for the chemoprevention of CRC focus on selective targeting of key pathways. We review the study by Zhang and colleagues, evaluating a selective approach targeting APC-deficient premalignant cells using retinoid-based therapy and TNF-related apoptosis-inducing ligand (TRAIL). This study demonstrates that induction of TRAIL-mediated death signaling contributes to the chemopreventive value of all-trans-retinyl acetate (RAc) by sensitizing premalignant adenoma cells to apoptosis without affecting normal cells. We discuss these important findings and raise a few points that deserve consideration and may further contribute to the development of RAc-based combination therapies with improved efficacy. The authors clearly demonstrate a synergistic interaction between TRAIL, RAc and APC, which leads to the specific cell death of premalignant target cells. The study adds to the growing body of literature related to CRC chemoprevention, and provides solid data supporting a potentially selective approach for preventing CRC using RAc and TRAIL.

  20. Multi-Layer Approach for the Detection of Selective Forwarding Attacks.

    Science.gov (United States)

    Alajmi, Naser; Elleithy, Khaled

    2015-11-19

    Security breaches are a major threat in wireless sensor networks (WSNs). WSNs are increasingly used due to their broad range of important applications in both military and civilian domains. WSNs are prone to several types of security attacks. Sensor nodes have limited capacities and are often deployed in dangerous locations; therefore, they are vulnerable to different types of attacks, including wormhole, sinkhole, and selective forwarding attacks. Security attacks are classified as data traffic and routing attacks. These security attacks could affect the most significant applications of WSNs, namely, military surveillance, traffic monitoring, and healthcare. Therefore, there are different approaches to detecting security attacks on the network layer in WSNs. Reliability, energy efficiency, and scalability are strong constraints on sensor nodes that affect the security of WSNs. Because sensor nodes have limited capabilities in most of these areas, selective forwarding attacks cannot be easily detected in networks. In this paper, we propose an approach to selective forwarding detection (SFD). The approach has three layers: MAC pool IDs, rule-based processing, and anomaly detection. It maintains the safety of data transmission between a source node and base station while detecting selective forwarding attacks. Furthermore, the approach is reliable, energy efficient, and scalable.
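A minimal sketch of the three-layer idea might look as follows; the thresholds, helper names, and the specific statistical test are invented for illustration, and the paper's actual rules are more elaborate.

```python
from statistics import mean, pstdev

def sfd_check(node_id, valid_ids, received, forwarded, neighbor_ratios,
              rule_threshold=0.8, z_limit=2.0):
    """Three-layer selective-forwarding check (sketch of the SFD idea).

    Layer 1 - MAC pool IDs: the sender must belong to the known ID pool.
    Layer 2 - rule-based:   forwarding ratio must meet a fixed threshold.
    Layer 3 - anomaly:      ratio must not be a statistical outlier among
                            the ratios reported by neighboring nodes.
    """
    if node_id not in valid_ids:                      # layer 1
        return "reject: unknown ID"
    ratio = forwarded / received if received else 1.0
    if ratio < rule_threshold:                        # layer 2
        return "suspect: low forwarding ratio"
    mu, sigma = mean(neighbor_ratios), pstdev(neighbor_ratios)
    if sigma and abs(ratio - mu) / sigma > z_limit:   # layer 3
        return "suspect: anomalous vs neighbors"
    return "ok"

neighbors = [0.97, 0.95, 0.99, 0.96, 0.98]
print(sfd_check("n7", {"n7", "n8"}, received=100, forwarded=55,
                neighbor_ratios=neighbors))
```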

  1. Multi-Layer Approach for the Detection of Selective Forwarding Attacks

    Directory of Open Access Journals (Sweden)

    Naser Alajmi

    2015-11-01

Full Text Available Security breaches are a major threat in wireless sensor networks (WSNs). WSNs are increasingly used due to their broad range of important applications in both military and civilian domains. WSNs are prone to several types of security attacks. Sensor nodes have limited capacities and are often deployed in dangerous locations; therefore, they are vulnerable to different types of attacks, including wormhole, sinkhole, and selective forwarding attacks. Security attacks are classified as data traffic and routing attacks. These security attacks could affect the most significant applications of WSNs, namely, military surveillance, traffic monitoring, and healthcare. Therefore, there are different approaches to detecting security attacks on the network layer in WSNs. Reliability, energy efficiency, and scalability are strong constraints on sensor nodes that affect the security of WSNs. Because sensor nodes have limited capabilities in most of these areas, selective forwarding attacks cannot be easily detected in networks. In this paper, we propose an approach to selective forwarding detection (SFD). The approach has three layers: MAC pool IDs, rule-based processing, and anomaly detection. It maintains the safety of data transmission between a source node and base station while detecting selective forwarding attacks. Furthermore, the approach is reliable, energy efficient, and scalable.

  2. The relationship between viscosity and refinement efficiency of pure aluminum by Al-Ti-B refiner

    Energy Technology Data Exchange (ETDEWEB)

    Yu Lina [Key Laboratory of Liquid Structure and Heredity of Materials, Ministry of Education, Shandong University, 73 Jingshi Road, Jinan 250061 (China); Liu Xiangfa [Key Laboratory of Liquid Structure and Heredity of Materials, Ministry of Education, Shandong University, 73 Jingshi Road, Jinan 250061 (China)]. E-mail: xfliu@sdu.edu.cn

    2006-11-30

The relationship between viscosity and refinement efficiency of pure aluminum with the addition of Al-Ti-B master alloy was studied in this paper. The experimental results show that the finer the grain size of the solidified sample, the higher the viscosity of the melt after the addition of different Al-Ti-B master alloys. This indicates that viscosity can be used to approximately estimate the refinement efficiency of Al-Ti-B refiners in production. The underlying reason is also discussed, using transmission electron microscopy (TEM) analysis and differential scanning calorimetry (DSC) experiments.

  3. Relational Demonic Fuzzy Refinement

    OpenAIRE

    Tchier, Fairouz

    2014-01-01

We use relational algebra to define a refinement fuzzy order called demonic fuzzy refinement and also the associated fuzzy operators, which are fuzzy demonic join $(\bigsqcup_{\mathrm{fuz}})$, fuzzy demonic meet $(\sqcap_{\mathrm{fuz}})$, and fuzzy demonic composition $(\Box_{\mathrm{fuz}})$. Our definitions and properties are illustrated by some examples using ma...

  4. A practical approach towards energy conversion through bio-refining of mixed kraft pulps

    Energy Technology Data Exchange (ETDEWEB)

    Dharm, D.; Upadhyaya, J.S.; Tyagi, C.H.; Ahamad, S. (Dept. of Paper Technology, Indian Inst. of Technology Roorkee, Saharanpur (India))

    2007-07-01

The pulp and paper industry is an energy-intensive process industry in which energy contributes about 16-20% of the manufacturing cost. Due to shortages in energy availability and increases in energy cost, energy conservation has become a necessity in the paper industry. A laboratory study on bleached and unbleached kraft pulps containing 15% bamboo, 15% eucalyptus, 20% poplar waste and 50% veneer waste was conducted using two distinct commercial enzymes, cellulase and xylanase, and their effect on slowness, drainage time, beating time, mechanical strength properties and power consumption was studied. Enzymatic pretreatment of the chemical pulp in the laboratory improves degSR by 5 and 4 points for bleached and unbleached pulps, respectively, at the same beating time. Breaking length improves by up to 4.0% at a constant beating level. The application of cellulase during refining saves 18.5% of energy at a constant refining level, i.e., 28 degSR. The enzymatic treatment yields a power saving of 1390 kWh per 100 metric tonnes of paper, corresponding to a cost saving of EUR 160.70 per 100 metric tonnes; deducting the cost of the enzyme, the net saving is EUR 157.90 per 100 metric tonnes of paper. (orig.)
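The reported savings reconcile arithmetically; note that the electricity price used below (about 0.116 EUR/kWh) is back-derived from the abstract's own figures rather than stated in it:

```python
def net_saving_per_100t(energy_saved_kwh, price_eur_per_kwh, enzyme_cost_eur):
    """Net cost saving per 100 metric tonnes of paper.

    Gross saving = energy saved x electricity price; the enzyme cost
    (here EUR 2.80, implied by the abstract's gross/net figures) is deducted.
    """
    gross = energy_saved_kwh * price_eur_per_kwh
    return gross - enzyme_cost_eur

# 1390 kWh saved and EUR 160.70 gross saving imply ~0.1156 EUR/kWh;
# subtracting the implied enzyme cost reproduces the reported EUR 157.90 net.
net = net_saving_per_100t(1390, 160.70 / 1390, 2.80)
print(round(net, 2))  # → 157.9
```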

  5. Nucleation mechanisms of refined alpha microstructure in beta titanium alloys

    Science.gov (United States)

    Zheng, Yufeng

. Therefore, the nucleation mechanisms proposed can successfully explain the features of the refined and super-refined precipitate microstructures in Ti-5553, as validated by thermodynamic calculations and phase-field modeling simulation. In addition to the study of microstructure evolution in beta titanium alloys under various heat treatment conditions, another effort made in the current study is to apply various phase-transformation analysis tools to titanium alloys in order to capture the initial stage of precipitation and investigate the kinetics of precipitation. In particular, an Electro-Thermo-Mechanical Tester (ETMT) is used to measure in situ the changes in physical properties of Ti-5553 during heat treatment and thereby analyze the kinetics of the phase transformation. This part of the work complements the study of microstructure evolution in beta titanium alloys. In summary, the refined and super-refined precipitate microstructures in Ti-5553 are studied using both modern characterization techniques and computational simulation. Nucleation mechanisms are proposed to explain all the features of the two specific microstructures, and critical heat treatment conditions are identified. This study is therefore not only beneficial to understanding the details of the phase transformation scientifically but also complementary to the selection of heat treatment conditions in industry.

  6. Structure refinement of commensurately modulated bismuth tungstate, Bi2WO6

    International Nuclear Information System (INIS)

    Rae, A.D.; Thompson, J.G.; Withers, R.L.

    1991-01-01

The displacive ferroelectric Bi2WO6 [Mr = 697.81, a = 5.4559 (4), b = 5.4360 (4), c = 16.4298 (17) Å, Z = 4, Dx = 9.512 g cm-3, Mo Kα, λ = 0.7107 Å, μ = 958.6 cm-1, F(000) = 1151.73] is described at room temperature as a commensurate modulation of an idealized Fmmm parent structure derived from an I4/mmm structure. Transmission electron microscopy clearly showed that there are coherent intergrowths of two distinct modulated variants in Bi2WO6 crystals. Displacive modes of inherent F2mm and Bmab symmetry are substantial and coherent over a large volume; they reduce the space-group symmetry to B2ab. A further substantial displacive mode corresponds to rotation of corner-connected WO6 octahedra about axes parallel to c and has either of two inherent symmetries, Abam or Bbam, the difference being associated with the way this mode reduces the space-group symmetry to P21ab; the existence of the Bbam mode reduces the intensity of h + l = 2n + 1 data and acts like a stacking fault. Group theoretical analysis of the problem details how the X-ray data can be classified so as to monitor the refinement. Anomalous dispersion selects the overall sign of the F2mm mode and determines the polarity. The overall signs chosen for the Bmab and Abam symmetry components of atom displacements select between equivalent origins. The overall signs of induced modes of inherent Amam, Bbab and Ccma symmetry had to be determined by comparative refinement, since the assumption that calculated phases are best estimates can retain the initial overall sign choice for these modes during least-squares refinement. Correlations between the dominant modes and the induced modes allowed a meaningful choice of signs to resolve the pseudo-homometry. Only the sign of the Bbab mode was capable of self-correction during refinement. (orig./BHO)

  7. Innovative Approach for IBS Vendor Selection Problem

    Directory of Open Access Journals (Sweden)

    Omar Mohd Faizal

    2016-01-01

Full Text Available Supply chain management in Industrialised Building System (IBS) construction has significantly determined the success of company and project performance. Due to the wide variety of criteria and vendors available, the vendor selection process for a specific project's needs is becoming more difficult. The need for decision aids for vendor selection in other areas is widely discussed in previous research; however, studies on vendor selection for IBS projects are largely neglected. A Decision Support System (DSS) is proposed for this purpose. Yet most DSS models are impractical, since they are complicated and difficult for a layman such as a project manager to use. Research indicates that the rapid development of ICT has high potential for simple and effective DSS. Thus, this paper highlights the importance of, and a research approach for, vendor selection in IBS project management. The study is based on the Design Science Research Methodology in combination with case studies. It is anticipated that this study will yield an effective value-for-money decision-making platform to manage the vendor selection process.

  8. A case study: Residue reduction at Deer Park Refining Limited Partnership

    International Nuclear Information System (INIS)

    Geehan, D.M.

    1996-01-01

With input from Shell Synthetic Fuels Inc. (SSFI), Deer Park Refining Limited Partnership (DPRLP) analyzed options for managing the bottom of the barrel to extinction, with an objective of a high return on investment. DPRLP is a joint venture of PEMEX and Shell Oil Company. This Gulf Coast refiner processes 227M BBL/D of heavy, high-sulfur crude. This paper discusses the process options considered, their advantages and disadvantages, and the option selected, as well as the options still open. Recent modernization projects at DPRLP are now on stream with a high yield of clean products. There remains one by-product, petroleum coke, which presents an opportunity as a low-cost feed for one or more process options yielding attractive products. The Shell Coke (or Coal) Gasification Process is one of the options now being considered

  9. Task Refinement for Autonomous Robots using Complementary Corrective Human Feedback

    Directory of Open Access Journals (Sweden)

    Cetin Mericli

    2011-06-01

Full Text Available A robot can perform a given task through a policy that maps its sensed state to appropriate actions. We assume that a hand-coded controller can achieve such a mapping only for the basic cases of the task. Refining the controller becomes harder and more tedious and error-prone as the complexity of the task increases. In this paper, we present a new learning from demonstration approach to improve the robot's performance through the use of corrective human feedback as a complement to an existing hand-coded algorithm. The human teacher observes the robot as it performs the task using the hand-coded algorithm and takes over control to correct the behavior when the robot selects a wrong action to be executed. Corrections are captured as new state-action pairs, and the default controller output is replaced by the demonstrated corrections during autonomous execution whenever the current state of the robot is judged to be similar to a previously corrected state in the correction database. The proposed approach is applied to a complex ball dribbling task performed against stationary defender robots in a robot soccer scenario, where physical Aldebaran Nao humanoid robots are used. The results of our experiments show an improvement in the robot's performance when the default hand-coded controller is augmented with corrective human demonstration.
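The correction-reuse policy described above can be sketched as a nearest-neighbor lookup over the correction database, falling back to the hand-coded controller when no corrected state is close enough. The state vectors, action names, and similarity radius below are all hypothetical:

```python
import math

def nearest_correction(state, corrections, radius=0.5):
    """Return the demonstrated action for the closest corrected state, if any.

    corrections : list of (state_vector, corrected_action) pairs captured
                  while the teacher overrode the hand-coded controller.
    radius      : similarity threshold in state space (assumed Euclidean).
    """
    best, best_d = None, radius
    for s, action in corrections:
        d = math.dist(state, s)
        if d <= best_d:
            best, best_d = action, d
    return best

def select_action(state, hand_coded_policy, corrections):
    """Prefer a nearby demonstrated correction; otherwise fall back to the
    default hand-coded controller."""
    corrected = nearest_correction(state, corrections)
    return corrected if corrected is not None else hand_coded_policy(state)

# toy dribbling-style example (hypothetical states and actions):
corrections = [((1.0, 0.2), "turn_left"), ((3.0, -0.5), "slow_down")]
policy = lambda s: "dribble_forward"
print(select_action((1.1, 0.3), policy, corrections))   # near a corrected state
print(select_action((9.0, 9.0), policy, corrections))   # far from any correction
```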

  10. Refining Students' Explanations of an Unfamiliar Physical Phenomenon-Microscopic Friction

    Science.gov (United States)

    Corpuz, Edgar De Guzman; Rebello, N. Sanjay

    2017-08-01

The first phase of this multiphase study involves modeling college students' thinking about friction at the microscopic level. Diagnostic interviews were conducted with 11 students with different levels of physics background. A phenomenographic approach to data analysis was used to generate categories of responses, which subsequently were used to generate a model of explanation. Most of the students interviewed consistently used mechanical interactions in explaining microscopic friction. According to these students, friction is due to the interlocking or rubbing of atoms. Our data suggest that students' explanations of microscopic friction are predominantly influenced by their macroscopic experiences. In the second phase of the research, a teaching experiment was conducted with 18 college students to investigate how students' explanations of microscopic friction can be refined by a series of model-building activities. Data were analyzed using Redish's two-level transfer framework. Our results show that through sequences of hands-on and minds-on activities, including cognitive dissonance and resolution, it is possible to facilitate the refinement of students' explanations of microscopic friction. The activities seemed to be productive in helping students activate associations that refine their ideas about microscopic friction.

  11. Panorama 2007: Refining and Petrochemicals

    International Nuclear Information System (INIS)

    Silva, C.

    2007-01-01

The year 2005 saw a new improvement in refining margins that continued during the first three quarters of 2006. The restoration of margins in the last three years has allowed the refining sector to regain its profitability. In this context, the oil companies reported earnings for fiscal year 2005 that were up significantly compared to 2004, and the figures for the first half-year 2006 confirm this trend. Despite this favorable business environment, investments only saw a minimal increase in 2005 and the improvement expected for 2006 should remain fairly limited. Looking to 2010-2015, it would appear that the planned investment projects with the highest probability of reaching completion will be barely adequate to cover the increase in demand. The refining sector should continue to find itself under pressure. As for petrochemicals, despite a steady up-trend in the naphtha price, the restoration of margins consolidated a comeback that started in 2005. All in all, capital expenditure remained fairly low in both the refining and petrochemicals sectors, but many projects are planned for the next ten years. (author)

  12. Refining margins: recent trends

    International Nuclear Information System (INIS)

    Baudoin, C.; Favennec, J.P.

    1999-01-01

    Despite a business environment that was globally mediocre due primarily to the Asian crisis and to a mild winter in the northern hemisphere, the signs of improvement noted in the refining activity in 1996 were borne out in 1997. But the situation is not yet satisfactory in this sector: the low return on invested capital and the financing of environmental protection expenditure are giving cause for concern. In 1998, the drop in crude oil prices and the concomitant fall in petroleum product prices was ultimately rather favorable to margins. Two elements tended to put a damper on this relative optimism. First of all, margins continue to be extremely volatile and, secondly, the worsening of the economic and financial crisis observed during the summer made for a sharp decline in margins in all geographic regions, especially Asia. Since the beginning of 1999, refining margins are weak and utilization rates of refining capacities have decreased. (authors)

  13. Panorama 2009 - refining

    International Nuclear Information System (INIS)

    2008-01-01

    For oil companies to invest in new refining and conversion capacity, favorable conditions over time are required. In other words, refining margins must remain high and demand sustained over a long period. That was the situation prevailing before the onset of the financial crisis in the second half of 2008. The economic conjuncture has taken a substantial turn for the worse since then and the forecasts for 2009 do not look bright. Oil demand is expected to decrease in the OECD countries and to grow much more slowly in the emerging countries. It is anticipated that refining margins will fall in 2009 - in 2008, they slipped significantly in the United States - as a result of increasingly sluggish demand, especially for light products. The next few months will probably be unfavorable to investment. In addition to a gloomy business outlook, there may also be a problem of access to sources of financing. As for investment projects, a mainstream trend has emerged in the last few years: a shift away from the regions that have historically been most active (the OECD countries) towards certain emerging countries, mostly in Asia or the Middle East. The new conjuncture will probably not change this trend

  14. The effects of carbon prices and anti-leakage policies on selected industrial sectors in Spain – Cement, steel and oil refining

    International Nuclear Information System (INIS)

    Santamaría, Alberto; Linares, Pedro; Pintos, Pablo

    2014-01-01

    This paper assesses the impacts on the cement, steel and oil refining sectors in Spain of the carbon prices derived from the European Emissions Trading Scheme (EU ETS), and the potential effect on these sectors of the European Union anti-leakage policy measures. The assessment is carried out by means of three engineering models developed for this purpose. Our results show a high exposure to leakage of cement in coastal regions; a smaller risk in the steel sector, and non-negligible risk of leakage for the oil refining sector when carbon allowance prices reach high levels. We also find that the risk of leakage could be better handled with other anti-leakage policies than those currently in place in the EU. - Highlights: • We simulate the impact of carbon prices on the risk of leakage in the cement, steel and oil refining sectors. • We also assess the effectiveness of different anti-leakage policies in Europe. • Cement production in coastal areas is highly exposed. • The risk of leakage for steel and oil refining is smaller. • Anti-leakage policies should be modified to be efficient

  15. Application of integrated QFD and fuzzy AHP approach in selection of suppliers

    Directory of Open Access Journals (Sweden)

    Bojana Jovanović

    2014-10-01

Full Text Available Supplier selection is a widely considered issue in the field of management, especially in quality management. In this paper, we use integrated QFD and fuzzy AHP approaches to select suppliers of electronic components. The QFD method is used as a tool for translating stakeholder needs into evaluation criteria for suppliers. The fuzzy AHP approach is used as a tool for prioritizing stakeholders, stakeholders’ requirements, evaluation criteria and, finally, the suppliers themselves. The paper showcases a case study of the implementation of the integrated QFD and fuzzy AHP approaches in the selection of an electronic components supplier in a Serbian company that produces electronic devices. The algorithm of implementation of the proposed approach is also presented. To the best of our knowledge, this is the first implementation of the proposed approach in a Serbian company.
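As a hedged illustration of how fuzzy AHP turns triangular-fuzzy pairwise judgments into crisp criterion weights, the sketch below uses the fuzzy geometric mean with centroid defuzzification, one common variant that may differ in detail from the procedure used in the paper; the example judgments are invented:

```python
import math

def fuzzy_ahp_weights(matrix):
    """Criteria weights from a triangular-fuzzy pairwise comparison matrix.

    matrix[i][j] is a triangular fuzzy number (l, m, u) expressing how much
    criterion i is preferred over criterion j.  Uses the fuzzy geometric
    mean, centroid defuzzification, and normalization.
    """
    n = len(matrix)
    geo = []
    for row in matrix:
        l = math.prod(t[0] for t in row) ** (1 / n)
        m = math.prod(t[1] for t in row) ** (1 / n)
        u = math.prod(t[2] for t in row) ** (1 / n)
        geo.append((l + m + u) / 3)            # centroid defuzzification
    total = sum(geo)
    return [g / total for g in geo]

# two criteria: quality moderately preferred over price (assumed judgments)
eq   = (1.0, 1.0, 1.0)
pref = (2.0, 3.0, 4.0)
inv  = (1 / 4, 1 / 3, 1 / 2)
w = fuzzy_ahp_weights([[eq, pref],
                       [inv, eq]])
print(w)   # quality gets the larger weight
```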

  16. Romanian refining industry assesses restructuring

    International Nuclear Information System (INIS)

    Tanasescu, D.G.

    1991-01-01

    The Romanian crude oil refining industry, as all the other economic sectors, faces the problems accompanying the transition from a centrally planned economy to a market economy. At present, all refineries have registered as joint-stock companies and all are coordinated and assisted by Rafirom S.A., from both a legal and a production point of view. Rafirom S.A. is a joint-stock company that holds shares in refineries and other stock companies with activities related to oil refining. Such activities include technological research, development, design, transportation, storage, and domestic and foreign marketing. This article outlines the market forces that are expected to: drive rationalization and restructuring of refining operations and define the targets toward which the reconfigured refineries should strive

  17. Refinement of protein termini in template-based modeling using conformational space annealing.

    Science.gov (United States)

    Park, Hahnbeom; Ko, Junsu; Joo, Keehyoung; Lee, Julian; Seok, Chaok; Lee, Jooyoung

    2011-09-01

    The rapid increase in the number of experimentally determined protein structures in recent years enables us to obtain more reliable protein tertiary structure models than ever by template-based modeling. However, refinement of template-based models beyond the limit available from the best templates is still needed for understanding protein function in atomic detail. In this work, we develop a new method for protein terminus modeling that can be applied to refinement of models with unreliable terminus structures. The energy function for terminus modeling consists of both physics-based and knowledge-based potential terms with carefully optimized relative weights. Effective sampling of both the framework and terminus is performed using the conformational space annealing technique. This method has been tested on a set of termini derived from a nonredundant structure database and two sets of termini from the CASP8 targets. The performance of the terminus modeling method is significantly improved over our previous method that does not employ terminus refinement. It is also comparable or superior to the best server methods tested in CASP8. The success of the current approach suggests that similar strategy may be applied to other types of refinement problems such as loop modeling or secondary structure rearrangement. Copyright © 2011 Wiley-Liss, Inc.
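Conformational space annealing itself maintains a bank of conformations whose diversity is annealed; as a much simpler stand-in, the toy simulated-annealing sketch below minimizes a weighted sum of a "physics-based" and a "knowledge-based" term over a few terminus-angle variables. Both energy terms and the weight are invented for illustration, not taken from the paper:

```python
import math
import random

def anneal(energy, x0, steps=5000, t0=1.0, seed=0):
    """Toy simulated annealing (a simplified stand-in for conformational
    space annealing) over a vector of terminus dihedral-like angles."""
    rng = random.Random(seed)
    x, e = list(x0), energy(x0)
    best_x, best_e = list(x), e
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-3              # linear cooling schedule
        cand = [a + rng.gauss(0, 0.3) for a in x]    # perturb the conformation
        ce = energy(cand)
        if ce < e or rng.random() < math.exp((e - ce) / t):
            x, e = cand, ce
            if ce < best_e:
                best_x, best_e = list(cand), ce
    return best_x, best_e

# energy = physics-based term + weighted knowledge-based term (weight assumed)
physics   = lambda x: sum((a - 1.0) ** 2 for a in x)   # toy "force field"
knowledge = lambda x: sum(math.cos(a) for a in x)      # toy statistical term
w = 0.5
energy = lambda x: physics(x) + w * knowledge(x)

x, e = anneal(energy, [0.0, 0.0, 0.0])
print(round(e, 3))   # far below the starting energy of 4.5
```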

  18. AHRQ series paper 3: identifying, selecting, and refining topics for comparative effectiveness systematic reviews: AHRQ and the effective health-care program.

    Science.gov (United States)

    Whitlock, Evelyn P; Lopez, Sarah A; Chang, Stephanie; Helfand, Mark; Eder, Michelle; Floyd, Nicole

    2010-05-01

    This article discusses the identification, selection, and refinement of topics for comparative effectiveness systematic reviews within the Agency for Healthcare Research and Quality's Effective Health Care (EHC) program. The EHC program seeks to align its research topic selection with the overall goals of the program, impartially and consistently apply predefined criteria to potential topics, involve stakeholders to identify high-priority topics, be transparent and accountable, and continually evaluate and improve processes. A topic prioritization group representing stakeholder and scientific perspectives evaluates topic nominations that fit within the EHC program (are "appropriate") to determine how "important" topics are as considered against seven criteria. The group then judges whether a new comparative effectiveness systematic review would be a duplication of existing research syntheses, and if not duplicative, if there is adequate type and volume of research to conduct a new systematic review. Finally, the group considers the "potential value and impact" of a comparative effectiveness systematic review. As the EHC program develops, ongoing challenges include ensuring the program addresses truly unmet needs for synthesized research because national and international efforts in this arena are uncoordinated, as well as engaging a range of stakeholders in program decisions while also achieving efficiency and timeliness.

  19. Six Sigma Project Selection Using Fuzzy TOPSIS Decision Making Approach

    Directory of Open Access Journals (Sweden)

    Rajeev Rathi

    2015-05-01

Six Sigma is a logical business strategy that attempts to identify and eliminate defects or failures in order to improve the quality of products and processes. The decision on project selection in Six Sigma is always critical; it plays a key role in the successful implementation of Six Sigma. Selection of the right Six Sigma project is especially important for an automotive company because it greatly influences manufacturing costs. This paper discusses an approach for Six Sigma project selection at an automotive company using the fuzzy-logic-based TOPSIS method. Fuzzy TOPSIS is a well-recognized tool for handling the fuzziness of the data involved in choosing among preferences. In this context, evaluation criteria have been designed for selection of the best alternative. The weights of the evaluation criteria are calculated using the MDL (modified digital logic) method, and the final ranking is obtained from the priority index computed by the fuzzy TOPSIS method. In the selected case study, this approach helped identify the right project for implementing Six Sigma and achieving improvement in productivity.
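The ranking step described above can be sketched as follows. This is a minimal fuzzy-TOPSIS illustration with triangular fuzzy ratings and benefit-type criteria; the project ratings and criteria weights are invented for the example (the paper derives its weights with the MDL method, not shown here).

```python
# Minimal fuzzy-TOPSIS sketch: ratings are triangular fuzzy numbers (l, m, u),
# all criteria are benefit-type, and weights/ratings are illustrative only.
import math

def normalize(matrix):
    # linear scale normalization: divide each criterion by its max upper bound
    norm = [list(row) for row in matrix]
    for j in range(len(matrix[0])):
        u_max = max(row[j][2] for row in matrix)
        for row in norm:
            l, m, u = row[j]
            row[j] = (l / u_max, m / u_max, u / u_max)
    return norm

def dist(a, b):
    # vertex distance between two triangular fuzzy numbers
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / 3.0)

def fuzzy_topsis(matrix, weights):
    norm = normalize(matrix)
    weighted = [[(l * w, m * w, u * w) for (l, m, u), w in zip(row, weights)]
                for row in norm]
    n = len(weights)
    fpis = [(1.0, 1.0, 1.0)] * n    # fuzzy positive ideal solution
    fnis = [(0.0, 0.0, 0.0)] * n    # fuzzy negative ideal solution
    cc = []
    for row in weighted:
        d_plus = sum(dist(v, p) for v, p in zip(row, fpis))
        d_minus = sum(dist(v, q) for v, q in zip(row, fnis))
        cc.append(d_minus / (d_plus + d_minus))   # closeness coefficient
    return cc

# three hypothetical Six Sigma candidate projects rated on two criteria
ratings = [
    [(3, 5, 7), (5, 7, 9)],
    [(5, 7, 9), (3, 5, 7)],
    [(1, 3, 5), (1, 3, 5)],
]
weights = [0.6, 0.4]
scores = fuzzy_topsis(ratings, weights)
best = max(range(len(scores)), key=scores.__getitem__)
```

The alternative with the highest closeness coefficient is ranked first; project 3, dominated on both criteria, ranks last.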

  20. Optimization of Refining Craft for Vegetable Insulating Oil

    Science.gov (United States)

    Zhou, Zhu-Jun; Hu, Ting; Cheng, Lin; Tian, Kai; Wang, Xuan; Yang, Jun; Kong, Hai-Yang; Fang, Fu-Xin; Qian, Hang; Fu, Guang-Pan

    2016-05-01

Vegetable insulating oil, because of its environmental friendliness, is considered an ideal substitute for mineral oil for the insulation and cooling of transformers. The main steps of the traditional refining process are alkali refining, bleaching and distillation. This refining process gives satisfactory results for small batches of insulating oil, but cannot be applied to a large-capacity reaction kettle. In this paper, rapeseed oil is used as the crude oil, and the refining process is optimized for a large-capacity reaction kettle. The optimized refining process adds an acid degumming step. The alkali compound adds a sodium silicate component in the alkali refining step, and the ratio of each component is optimized. Activated clay and activated carbon are added in a 10:1 proportion in the de-colorization step, which effectively reduces the acid value and dielectric loss of the oil. Using vacuum pumping instead of the distillation step further reduces the acid value. Comparing some of the performance parameters of the refined oil with mineral insulating oil, the dielectric loss of the vegetable insulating oil is still high, and further optimization measures will be needed in the future.

  1. Grain refining efficiency of Al-Ti-C alloys

    International Nuclear Information System (INIS)

    Birol, Yuecel

    2006-01-01

    The problems associated with boride agglomeration and the poisoning effect of Zr in Zr-bearing alloys have created a big demand for boron-free grain refiners. The potential benefits of TiC as a direct nucleant for aluminium grains have thus generated a great deal of interest in TiC-bearing alloys in recent years. In Al-Ti-C grain refiners commercially available today, Al 3 Ti particles are introduced into the melt along with the TiC particles. Since the latter are claimed to nucleate α-Al directly, it is of great technological interest to see if reducing the Ti:C ratio further, i.e., increasing the C content of the grain refiner, will produce an increase in the grain refining efficiency of these alloys. A series of grain refiner samples with the Ti concentration fixed at 3% and a range of C contents between 0 and 0.75 were obtained by appropriately mixing an experimental Al-3Ti-0.75C alloy with Al-10Ti alloy and commercial purity aluminium. The grain refining efficiency of these grain refiners was assessed to investigate the role of the insoluble TiC and the soluble Al 3 Ti particles. The optimum chemistry for the Al-Ti-C grain refiners was also identified

  2. Grain refining efficiency of Al-Ti-C alloys

    Energy Technology Data Exchange (ETDEWEB)

    Birol, Yuecel [Materials Institute, Marmara Research Center, TUBITAK, 41470 Gebze, Kocaeli (Turkey)]. E-mail: yucel.birol@mam.gov.tr

    2006-09-28

    The problems associated with boride agglomeration and the poisoning effect of Zr in Zr-bearing alloys have created a big demand for boron-free grain refiners. The potential benefits of TiC as a direct nucleant for aluminium grains have thus generated a great deal of interest in TiC-bearing alloys in recent years. In Al-Ti-C grain refiners commercially available today, Al{sub 3}Ti particles are introduced into the melt along with the TiC particles. Since the latter are claimed to nucleate {alpha}-Al directly, it is of great technological interest to see if reducing the Ti:C ratio further, i.e., increasing the C content of the grain refiner, will produce an increase in the grain refining efficiency of these alloys. A series of grain refiner samples with the Ti concentration fixed at 3% and a range of C contents between 0 and 0.75 were obtained by appropriately mixing an experimental Al-3Ti-0.75C alloy with Al-10Ti alloy and commercial purity aluminium. The grain refining efficiency of these grain refiners was assessed to investigate the role of the insoluble TiC and the soluble Al{sub 3}Ti particles. The optimum chemistry for the Al-Ti-C grain refiners was also identified.

  3. ERP system implementation costs and selection factors of an implementation approach

    DEFF Research Database (Denmark)

    Johansson, Björn; Sudzina, Frantisek; Newman, Mike

    2011-01-01

Different approaches to the implementation of enterprise resource planning (ERP) systems exist. In this article, we investigate the relationship between factors influencing the selection of an implementation approach and companies' ability to stay within budget when implementing ERPs. The question is: do factors which influence the implementation approach in an ERP project also cause an increase of the project cost in a European context? Our survey was conducted in Denmark, Slovakia and Slovenia and focused on this issue. Our main findings are that: 1) the number of implemented modules influences the selection of an implementation approach; 2) companies with information strategies are more likely to stay within budget regarding ERP systems implementation. However, we also found that: 3) implementation approach does not significantly influence ability to stay within budget; 4) a clear relationship between factors influencing ...

  4. Refinement of the Kansas City Plant site conceptual model with respect to dense non-aqueous phase liquids (DNAPL)

    International Nuclear Information System (INIS)

    Korte, N.E.; Hall, S.C.; Baker, J.L.

    1995-01-01

    This document presents a refinement of the site conceptual model with respect to dense non-aqueous phase liquid (DNAPL) at the US Department of Energy Kansas City Plant (KCP). This refinement was prompted by a review of the literature and the results of a limited study that was conducted to evaluate whether pools of DNAPL were present in contaminated locations at the KCP. The field study relied on the micropurge method of sample collection. This method has been demonstrated as a successful approach for obtaining discrete samples within a limited aquifer zone. Samples were collected at five locations across 5-ft well screens located at the base of the alluvial aquifer at the KCP. The hypothesis was that if pools of DNAPL were present, the dissolved concentration would increase with depth. Four wells with highly contaminated groundwater were selected for the test. Three of the wells were located in areas where DNAPL was suspected, and one where no DNAPL was believed to be present. The results demonstrated no discernible pattern with depth for the four wells tested. A review of the data in light of the available technical literature suggests that the fine-grained nature of the aquifer materials precludes the formation of pools. Instead, DNAPL is trapped as discontinuous ganglia that are probably widespread throughout the aquifer. The discontinuous nature of the DNAPL distribution prevents the collection of groundwater samples with concentrations approaching saturation. Furthermore, the results indicate that attempts to remediate the aquifer with conventional approaches will not result in restoration to pristine conditions because the tortuous groundwater flow paths will inhibit the efficiency of fluid-flow-based treatments

  5. practical common weight maximin approach for technology selection

    African Journals Online (AJOL)

    2014-06-30

The paper proposes a multi-objective decision tool for industrial robot selection, which does not require subjective assessments ... Over the past several decades, manufacturers have been faced with intense competition ... three groups: economic analysis techniques, analytical methods and strategic approaches.

  6. Transbrachial artery approach for selective cerebral angiography

    International Nuclear Information System (INIS)

    Touho, Hajime; Karasawa, Jun; Shishido, Hisashi; Morisako, Toshitaka; Numazawa, Shinichi; Yamada, Keisuke; Nagai, Shigeki; Shibamoto, Kenji

    1990-01-01

Transaxillary or transbrachial approaches to the cerebral vessels have been reported, but selective angiography of all four vessels has not been possible through one route. In this report, a new technique for selective cerebral angiography via a transbrachial approach is described. One hundred and twenty-three patients with cerebral infarction, vertebrobasilar insufficiency, intracerebral hemorrhage, epilepsy, or cerebral tumor were examined. They comprised 85 outpatients and 38 inpatients, aged 15 to 82 years. The patients were examined via the transbrachial approach (97 cases via the right brachial artery, 29 via the left). Materials included a DSA system (Digital Fluorikon 5000, General Electric Co.), a 4 French tight J-curved Simmons 80-cm catheter, a 19-gauge extra-thin-wall Seldinger needle, and a J/Straight floppy 125-cm guide-wire. Generally, the volume of contrast agent (300 mgI/ml iopamidol) used for a common carotid artery angiogram was 6 ml, and that for a vertebral artery angiogram was 4 ml. If catheterization of the vertebral artery or right common carotid artery was unsuccessful, about 8 ml of the contrast agent was injected into the subclavian or brachiocephalic artery. Definitive diagnosis and a decision on proper treatment of the patients can be obtained easily, and the results were clinically satisfactory. Moreover, no complications were encountered in this study. This new technique for a transbrachial approach to the cerebral vessels using the DSA system is introduced here; neurosurgeons can use it easily and will find that it provides all the information they need about the patient. (author)

  7. Repetitive Model Refinement for Questionnaire Design Improvement in the Evaluation of Working Characteristics in Construction Enterprises

    Directory of Open Access Journals (Sweden)

    Jeng-Wen Lin

    2015-11-01

This paper presents an iterative confidence interval based parametric refinement approach for questionnaire design improvement in the evaluation of working characteristics in construction enterprises. This refinement approach utilizes the 95% confidence interval of the estimated parameters of the model to determine their statistical significance in a least-squares regression setting. If the confidence interval of a particular parameter covers the zero value, it is statistically valid to remove that parameter from the model and its corresponding question from the designed questionnaire. The remaining parameters repetitively undergo this sifting process until their statistical significance cannot be improved further. This repetitive model refinement approach is implemented in efficient questionnaire design by using both linear series and Taylor series models to remove non-contributing questions while keeping significant questions that are contributive to the issues studied, i.e., employees' work performance being explained by their work values and cadres' organizational commitment being explained by their organizational management. Reducing the number of questions alleviates the respondent burden and reduces costs. The results show that the statistical significance of the sifted contributing questions is decreased with a total mean relative change of 49%, while the Taylor series model increases the R-squared value by 17% compared with the linear series model.
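The confidence-interval-based sifting loop described above can be sketched in an ordinary least-squares setting. The synthetic data, the normal approximation (z = 1.96 in place of the exact t quantile), and the drop-one-parameter-at-a-time strategy below are illustrative assumptions, not details taken from the paper.

```python
# Iterative 95%-confidence-interval parameter sifting for least squares:
# repeatedly drop the least significant parameter whose interval covers zero,
# then refit, until every remaining parameter is significant.
import numpy as np

def sift(X, y, z=1.96):
    cols = list(range(X.shape[1]))        # indices of columns still in the model
    while cols:
        Xs = X[:, cols]
        beta, _res, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        n, p = Xs.shape
        resid = y - Xs @ beta
        sigma2 = resid @ resid / (n - p)             # residual variance
        cov = sigma2 * np.linalg.inv(Xs.T @ Xs)      # covariance of estimates
        half = z * np.sqrt(np.diag(cov))             # CI half-widths
        # parameters whose interval covers zero are candidates for removal
        drop = [i for i in range(p) if abs(beta[i]) < half[i]]
        if not drop:
            return cols, beta
        # remove the least significant candidate, then refit
        worst = min(drop, key=lambda i: abs(beta[i]) / half[i])
        cols.pop(worst)
    return cols, np.array([])

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 4))
# true model uses only columns 0 and 2; columns 1 and 3 are noise
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.5, size=n)
kept, beta = sift(X, y)
```

With these synthetic data, the two truly contributing columns survive the sifting while the noise columns are statistically removable.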

  8. Data refinement for true concurrency

    Directory of Open Access Journals (Sweden)

    Brijesh Dongol

    2013-05-01

The majority of modern systems exhibit sophisticated concurrent behaviour, where several system components modify and observe the system state with fine-grained atomicity. Many systems (e.g., multi-core processors, real-time controllers) also exhibit truly concurrent behaviour, where multiple events can occur simultaneously. This paper presents data refinement defined in terms of an interval-based framework, which includes high-level operators that capture non-deterministic expression evaluation. By modifying the type of an interval, our theory may be specialised to cover data refinement of both discrete and continuous systems. We present an interval-based encoding of forward simulation, then prove that our forward simulation rule is sound with respect to our data refinement definition. A number of rules for decomposing forward simulation proofs over both sequential and parallel composition are developed.

  9. Grain Refinement of Permanent Mold Cast Copper Base Alloys

    Energy Technology Data Exchange (ETDEWEB)

    M.Sadayappan; J.P.Thomson; M.Elboujdaini; G.Ping Gu; M. Sahoo

    2005-04-01

Grain refinement is a well established process for many cast and wrought alloys. The mechanical properties of various alloys can be enhanced by reducing the grain size. Refinement is also known to improve casting characteristics such as fluidity and hot tearing. Grain refinement of copper-base alloys is not widely used, especially in the sand casting process. However, in permanent mold casting of copper alloys it is now common to use grain refinement to counteract the problem of severe hot tearing, which also improves the pressure tightness of plumbing components. The mechanism of grain refinement in copper-base alloys is not well understood. The issues to be studied include the effect of minor alloy additions on the microstructure, their interaction with the grain refiner, the effect of cooling rate, and the loss of grain refinement (fading). In this investigation, efforts were made to explore and understand grain refinement of copper alloys, especially under permanent mold casting conditions.

  10. An integrated approach towards future ballistic neck protection materials selection.

    Science.gov (United States)

    Breeze, John; Helliker, Mark; Carr, Debra J

    2013-05-01

    Ballistic protection for the neck has historically taken the form of collars attached to the ballistic vest (removable or fixed), but other approaches, including the development of prototypes incorporating ballistic material into the collar of an under body armour shirt, are now being investigated. Current neck collars incorporate the same ballistic protective fabrics as the soft armour of the remaining vest, reflecting how ballistic protective performance alone has historically been perceived as the most important property for neck protection. However, the neck has fundamental differences from the thorax in terms of anatomical vulnerability, flexibility and equipment integration, necessitating a separate solution from the thorax in terms of optimal materials selection. An integrated approach towards the selection of the most appropriate combination of materials to be used for each of the two potential designs of future neck protection has been developed. This approach requires evaluation of the properties of each potential material in addition to ballistic performance alone, including flexibility, mass, wear resistance and thermal burden. The aim of this article is to provide readers with an overview of this integrated approach towards ballistic materials selection and an update of its current progress in the development of future ballistic neck protection.

  11. Comparing Syntactic and Semantics Action Refinement

    NARCIS (Netherlands)

    Goltz, Ursula; Gorrieri, Roberto; Rensink, Arend

    The semantic definition of action refinement on labelled configuration structures is compared with the notion of syntactic substitution, which can be used as another notion of action refinement in a process algebraic setting. The comparison is done by studying a process algebra equipped with

  12. Refinement of NMR structures using implicit solvent and advanced sampling techniques.

    Science.gov (United States)

    Chen, Jianhan; Im, Wonpil; Brooks, Charles L

    2004-12-15

    NMR biomolecular structure calculations exploit simulated annealing methods for conformational sampling and require a relatively high level of redundancy in the experimental restraints to determine quality three-dimensional structures. Recent advances in generalized Born (GB) implicit solvent models should make it possible to combine information from both experimental measurements and accurate empirical force fields to improve the quality of NMR-derived structures. In this paper, we study the influence of implicit solvent on the refinement of protein NMR structures and identify an optimal protocol of utilizing these improved force fields. To do so, we carry out structure refinement experiments for model proteins with published NMR structures using full NMR restraints and subsets of them. We also investigate the application of advanced sampling techniques to NMR structure refinement. Similar to the observations of Xia et al. (J.Biomol. NMR 2002, 22, 317-331), we find that the impact of implicit solvent is rather small when there is a sufficient number of experimental restraints (such as in the final stage of NMR structure determination), whether implicit solvent is used throughout the calculation or only in the final refinement step. The application of advanced sampling techniques also seems to have minimal impact in this case. However, when the experimental data are limited, we demonstrate that refinement with implicit solvent can substantially improve the quality of the structures. In particular, when combined with an advanced sampling technique, the replica exchange (REX) method, near-native structures can be rapidly moved toward the native basin. The REX method provides both enhanced sampling and automatic selection of the most native-like (lowest energy) structures. An optimal protocol based on our studies first generates an ensemble of initial structures that maximally satisfy the available experimental data with conventional NMR software using a simplified
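The replica exchange (REX) idea mentioned above can be illustrated on a toy one-dimensional double-well energy. The potential, temperature ladder and step sizes below are illustrative stand-ins for a real biomolecular force field; only the swap acceptance criterion is the standard REX rule.

```python
# Toy replica exchange (REX): Metropolis sampling at several inverse
# temperatures, with neighbour swaps accepted with probability
# min(1, exp((beta_i - beta_j) * (E_i - E_j))).
import math, random

random.seed(1)

def energy(x):
    return (x * x - 1.0) ** 2      # double well with minima at x = +/-1

betas = [4.0, 2.0, 1.0, 0.5]       # inverse temperatures, coldest first
xs = [1.0] * len(betas)            # one configuration per replica

def metropolis_step(x, beta, step=0.5):
    xn = x + random.uniform(-step, step)
    dE = energy(xn) - energy(x)
    if dE <= 0 or random.random() < math.exp(-beta * dE):
        return xn
    return x

cold_samples = []
for sweep in range(2000):
    xs = [metropolis_step(x, b) for x, b in zip(xs, betas)]
    # attempt a swap between a random pair of neighbouring replicas
    i = random.randrange(len(betas) - 1)
    delta = (betas[i] - betas[i + 1]) * (energy(xs[i]) - energy(xs[i + 1]))
    if delta >= 0 or random.random() < math.exp(delta):
        xs[i], xs[i + 1] = xs[i + 1], xs[i]
    if sweep >= 500:
        cold_samples.append(xs[0])  # record the coldest replica after burn-in
```

Hot replicas cross the barrier easily, and swaps hand those crossings down to the cold replica, which is how REX accelerates escape toward low-energy (more native-like) basins.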

  13. Materials selection in micromechanical design: an application of the Ashby approach

    OpenAIRE

    Srikar, V.T.; Spearing, S.M.

    2003-01-01

The set of materials available to microsystems designers is rapidly expanding. Techniques now exist to introduce and integrate a large number of metals, alloys, ceramics, glasses, polymers, and elastomers into microsystems, motivating the need for a rational approach for materials selection in microsystems design. As a step toward such an approach, we focus on the initial stages of materials selection for micromechanical structures with minimum feature sizes greater than 1 μm. The vari...

  14. Refining a region-of-interest within an available CT image

    International Nuclear Information System (INIS)

    Enjilela, Esmaeil; Hussein, Esam M.A.

    2013-01-01

    This paper describes a numerical method for refining the image of a region-of-interest (RoI) within an existing tomographic slice, provided that projection data are stored along with the image. Using the attributes of the image, projection values (ray-sums) are adjusted to compensate for the material outside the RoI. Advantage is taken of the high degree of overdetermination of common computed tomography systems to reconstruct an RoI image over smaller pixels. The smaller size of a region-of-interest enables the use of iterative methods for RoI image reconstruction, which are less prone to error propagation. Simulation results are shown for an anthropomorphic head phantom, demonstrating that the introduced approach enhances both the spatial resolution and material contrast of RoI images; without the need to acquire any additional measurements or to alter existing imaging setups and systems. - Highlights: ► A method for refining the image of a region-of-interest within an existing tomographic image. ► Refined spatial-resolution within the region-of-interest, due to high redundancy of CT data. ► Enhancement in image contrast by the use of iterative image reconstruction, made possible by the smaller problem size. ► No need for additional measurements, no alteration of existing imaging setups and systems
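The ray-sum adjustment described above can be sketched in matrix form: forward-project the already-available image outside the RoI, subtract that contribution from the stored projections, and solve for the RoI unknowns iteratively. The random system matrix, RoI mask and Landweber iteration below are illustrative stand-ins for a real CT geometry and solver.

```python
# Sketch of RoI refinement: compensate ray sums for material outside the
# region of interest, then reconstruct only the RoI with an iterative method.
import numpy as np

rng = np.random.default_rng(3)
n_pix, n_rays = 64, 200                 # overdetermined, as in common CT systems
A = rng.random((n_rays, n_pix))         # stand-in for the CT system matrix
x_true = rng.random(n_pix)              # attributes of the available image
b = A @ x_true                          # projection data stored with the image

roi = np.zeros(n_pix, dtype=bool)
roi[:16] = True                         # region of interest: first 16 pixels

# adjust ray sums to compensate for the material outside the RoI
b_roi = b - A[:, ~roi] @ x_true[~roi]

# Landweber (gradient) iterations restricted to the RoI unknowns;
# the smaller problem size makes iterative reconstruction cheap
A_in = A[:, roi]
x_in = np.zeros(roi.sum())
step = 1.0 / np.linalg.norm(A_in, 2) ** 2
for _ in range(5000):
    x_in += step * A_in.T @ (b_roi - A_in @ x_in)
```

Because the corrected system involves only the RoI columns, the iterative solve converges to the RoI values without any additional measurements.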

  15. Steel refining possibilities in LF

    Science.gov (United States)

    Dumitru, M. G.; Ioana, A.; Constantin, N.; Ciobanu, F.; Pollifroni, M.

    2018-01-01

This article presents the main possibilities for steel refining in the Ladle Furnace (LF). The following are presented: steelmaking stages, steel refining through argon bottom stirring, online control of the bottom stirring, the bottom stirring diagram during LF treatment of a heat, the influence of the porous plug on argon stirring, the bottom stirring porous plug, analysis of porous plug placement on the ladle bottom surface, bottom stirring simulation with ANSYS, and bottom stirring simulation with Autodesk CFD.

  16. On Syntactic and Semantic Action Refinement

    NARCIS (Netherlands)

    Hagiya, M.; Goltz, U.; Mitchell, J.C.; Gorrieri, R.; Rensink, Arend

    1994-01-01

    The semantic definition of action refinement on labelled event structures is compared with the notion of syntactic substitution, which can be used as another notion of action refinement in a process algebraic setting. This is done by studying a process algebra equipped with the ACP sequential

  17. New 2D adaptive mesh refinement algorithm based on conservative finite-differences with staggered grid

    Science.gov (United States)

    Gerya, T.; Duretz, T.; May, D. A.

    2012-04-01

We present a new 2D adaptive mesh refinement (AMR) algorithm based on stress-conservative finite differences formulated for a non-uniform rectangular staggered grid. The refinement approach is based on repetitive cell splitting organized via a quad-tree construction (every parent cell is split into 4 daughter cells of equal size). Irrespective of the level of resolution, every cell has 5 staggered nodes (2 horizontal velocities, 2 vertical velocities and 1 pressure) for which the respective governing equations, boundary conditions and interpolation equations are formulated. The connectivity of the grid is achieved via cross-indexing of grid cells and basic nodal points located in their corners: four corner nodes are indexed for every cell, and up to 4 surrounding cells are indexed for every node. The accuracy of the approach depends critically on the formulation of the stencil used at the "hanging" velocity nodes located at the boundaries between different levels of resolution. The most accurate results are obtained for the scheme based on the volume flux balance across the resolution boundary combined with stress-based interpolation of the velocity orthogonal to the boundary. We tested this new approach with a number of 2D variable-viscosity analytical solutions. Our tests demonstrate that the adaptive staggered grid formulation has convergence properties similar to those obtained with a standard, non-adaptive staggered grid formulation; this convergence is also achieved when the resolution boundary crosses sharp viscosity contrast interfaces, and the measured convergence rates are insensitive to such scenarios. We compared various grid refinement strategies based on the distribution of different field variables such as viscosity, density and velocity. According to these tests, the refinement allows for a significant (0.5-1 order of magnitude) increase in the computational accuracy at the same
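The quad-tree cell-splitting step can be sketched as follows. The viscosity field and the corner-based refinement criterion are invented for illustration, and the sketch deliberately omits the staggered nodes, cross-indexing and hanging-node stencils that the full solver requires.

```python
# Quad-tree AMR sketch: every parent cell splits into 4 equal daughters,
# and refinement is driven here by a viscosity contrast across the cell.

def viscosity(x, y):
    # sharp contrast: a high-viscosity block occupies the lower-left quadrant
    return 1e3 if (x < 0.5 and y < 0.5) else 1.0

class Cell:
    def __init__(self, x, y, size, level=0):
        self.x, self.y, self.size, self.level = x, y, size, level
        self.children = []

    def split(self):
        h = self.size / 2
        self.children = [Cell(self.x + i * h, self.y + j * h, h, self.level + 1)
                         for i in (0, 1) for j in (0, 1)]

    def needs_refinement(self):
        # refine if the viscosity differs between the cell's corners
        s = self.size
        vals = {viscosity(self.x + i * s, self.y + j * s)
                for i in (0, 1) for j in (0, 1)}
        return len(vals) > 1

def refine(cell, max_level):
    if cell.level < max_level and cell.needs_refinement():
        cell.split()
        for child in cell.children:
            refine(child, max_level)

def leaves(cell):
    if not cell.children:
        return [cell]
    return [leaf for child in cell.children for leaf in leaves(child)]

root = Cell(0.0, 0.0, 1.0)      # unit square domain
refine(root, max_level=5)
cells = leaves(root)
```

The resulting leaf cells concentrate along the viscosity-contrast interface while smooth regions keep coarse cells, and the leaf areas still tile the whole domain.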

  18. Refinement for Transition Systems with Responses

    Directory of Open Access Journals (Sweden)

    Marco Carbone

    2012-07-01

Motivated by the response pattern for property specifications and applications within flexible workflow management systems, we report upon an initial study of modal and mixed transition systems in which the must transitions are interpreted as must eventually, and in which implementations can contain may behaviors that are resolved at run-time. We propose Transition Systems with Responses (TSRs) as a suitable model for this study. We prove that TSRs correspond to a restricted class of mixed transition systems, which we refer to as the action-deterministic mixed transition systems. We show that TSRs allow for a natural definition of deadlocked and accepting states. We then transfer the standard definition of refinement for mixed transition systems to TSRs and prove that refinement does not preserve deadlock freedom. This leads to the proposal of safe refinements, which are those that preserve deadlock freedom. We exemplify the use of TSRs and (safe) refinements on a small medication workflow.

  19. Design of Grain Refiners for Aluminium Alloys

    Science.gov (United States)

    Tronche, A.; Greer, A. L.

The efficiency of a grain refiner can be quantified as the number of grains per nucleant particle in the solidified product. Even for effective refiners in aluminium, such as Al-5Ti-1B, it is known from experiments that efficiencies are very low, at best 10⁻³ to 10⁻². It is of interest to explore the reasons for such low values, and to assess the prospects for increased efficiency through design of refiners. Recently it has been shown [1] that a simple recalescence-based model can make quantitative predictions of grain size as a function of refiner addition level, cooling rate and solute content. In the model, the initiation of grains is limited by the free growth from nucleant particles, the size distribution of which is very important. The present work uses this model as the basis for discussing the effect of particle size distribution on grain refiner performance. Larger particles (of TiB2 in the case of present interest) promote greater efficiency, as do narrower size distributions. It is shown that even if the size distribution could be exactly specified, compromises would have to be made to balance efficiency (defined as above) with other desirable characteristics of a refiner.
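The free-growth picture behind the model can be sketched numerically: a particle of diameter d only initiates a grain once the undercooling exceeds ΔT_fg = 4σ/(ΔS_v·d), so only the largest particles are active before recalescence. The material constants, the maximum undercooling and the lognormal size distribution below are typical literature values assumed for illustration, not data from this paper.

```python
# Free-growth sketch: fraction of nucleant particles active at a given
# undercooling. sigma and dSv are typical values for aluminium (assumed);
# the TiB2 particle size distribution is illustrative.
import math, random

sigma = 0.158      # J/m^2, solid-liquid interfacial energy (assumed)
dSv = 1.1e6        # J/(K m^3), entropy of fusion per unit volume (assumed)

def dT_free_growth(d):
    # undercooling needed for free growth from a particle of diameter d
    return 4.0 * sigma / (dSv * d)

random.seed(0)
# lognormal particle diameters, geometric mean ~0.7 um (illustrative)
diameters = [math.exp(random.gauss(math.log(0.7e-6), 0.4))
             for _ in range(100_000)]

dT_max = 0.3       # K, undercooling reached before recalescence (assumed)
active = sum(1 for d in diameters if dT_free_growth(d) <= dT_max)
efficiency = active / len(diameters)
```

With these assumed numbers only a fraction of a percent of the particles ever become active, reproducing the order of magnitude of the low efficiencies quoted above; shifting the distribution toward larger diameters, or narrowing it, raises the active fraction.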

  20. Precipitation behavior and grain refinement of burnishing Al-Zn-Mg alloy

    Directory of Open Access Journals (Sweden)

    Ce Pang

    2018-02-01

Burnishing is a strengthening approach that improves the strength of the surface layer while retaining the ductility of the interior of metallic materials. In this work, burnishing treatment was employed to improve the surface microstructure of naturally aged Al-Zn-Mg alloys after solid solution. Transmission electron microscopy, high-resolution transmission electron microscopy, X-ray diffraction and nano-indentation were used to characterize the effects of burnishing on the microstructure of the surface layer and on Guinier-Preston (GP) zones. GP zones were uniformly distributed and dispersed in the matrix before burnishing, and the amount of GP zones decreased dramatically after burnishing. Additionally, the grains in the surface layer were refined into nano-crystals with an average grain size of 78 nm. Burnishing not only led to the formation of a large number of dislocation substructures in the sub-surface and near-surface matrix, but also promoted the precipitation of the metastable η' phase at grain boundaries. The synergistic effects of grain refinement, dislocation multiplication and η' precipitation strengthen the burnished layer of the Al-Zn-Mg alloy. Keywords: Al-Zn-Mg alloy, Burnishing, Nano-crystal, Precipitation, Grain refinement

  1. Source selection for analogical reasoning an empirical approach

    Energy Technology Data Exchange (ETDEWEB)

    Stubblefield, W.A. [Sandia National Labs., Albuquerque, NM (United States); Luger, G.F. [Univ. of New Mexico, Albuquerque, NM (United States)

    1996-12-31

    The effectiveness of an analogical reasoner depends upon its ability to select a relevant analogical source. In many problem domains, however, too little is known about target problems to support effective source selection. This paper describes the design and evaluation of SCAVENGER, an analogical reasoner that applies two techniques to this problem: (1) An assumption-based approach to matching that allows properties of candidate sources to match unknown target properties in the absence of evidence to the contrary. (2) The use of empirical learning to improve memory organization based on problem solving experience.

  2. Multi criteria decision making approaches for green supplier evaluation and selection

    DEFF Research Database (Denmark)

    Govindan, Kannan; Rajendran, S.; Sarkis, J.

    2015-01-01

A large and growing body of literature on supplier evaluation and selection exists. Literature on green supplier evaluation that considers environmental factors is relatively limited. Recently, in supply chain management decision making, approaches for evaluating green supplier performance have used both qualitative and quantitative environmental data. Given this evolving research field, the goal and purpose of this paper is to analyze research in international scientific journals and international conference proceedings that focus on green supplier selection. We propose the following ... enabling us to identify improvements for the green supplier selection process and possible future directions.

  3. Large-eddy simulation of wind turbine wake interactions on locally refined Cartesian grids

    Science.gov (United States)

    Angelidis, Dionysios; Sotiropoulos, Fotis

    2014-11-01

Performing high-fidelity numerical simulations of turbulent flow in wind farms remains challenging, mainly because of the large computational resources required to accurately simulate the turbine wakes and turbine/turbine interactions. The discretization of the governing equations on structured grids for mesoscale calculations may not be the most efficient approach for resolving the large disparity of spatial scales. A 3D Cartesian grid refinement method is presented that enables the efficient coupling of the Actuator Line Model (ALM) with locally refined unstructured Cartesian grids adapted to accurately resolve tip vortices and multi-turbine interactions. Second-order schemes are employed for the discretization of the incompressible Navier-Stokes equations in a hybrid staggered/non-staggered formulation, coupled with a fractional step method that ensures the satisfaction of local mass conservation to machine zero. The current approach enables multi-resolution LES of turbulent flow in multi-turbine wind farms. The numerical simulations are in good agreement with experimental measurements and are able to resolve the rich dynamics of turbine wakes on grids containing only a small fraction of the grid nodes that would be required in simulations without local mesh refinement. This material is based upon work supported by the Department of Energy under Award Number DE-EE0005482 and the National Science Foundation under Award number NSF PFI:BIC 1318201.

  4. Bauxite Mining and Alumina Refining

    Science.gov (United States)

    Frisch, Neale; Olney, David

    2014-01-01

    Objective: To describe bauxite mining and alumina refining processes and to outline the relevant physical, chemical, biological, ergonomic, and psychosocial health risks. Methods: Review article. Results: The most important risks relate to noise, ergonomics, trauma, and caustic soda splashes to the skin/eyes. Other risks of note relate to fatigue, heat, and solar ultraviolet and, for some operations, tropical diseases, venomous/dangerous animals, and remote locations. Exposures to bauxite dust, alumina dust, and caustic mist in contemporary best-practice bauxite mining and alumina refining operations have not been demonstrated to be associated with clinically significant decrements in lung function. Exposures to bauxite dust and alumina dust at such operations are also not associated with the incidence of cancer. Conclusions: A range of occupational health risks in bauxite mining and alumina refining requires the maintenance of effective control measures. PMID:24806720

  5. Estimates of dietary exposure to bisphenol A (BPA) from light metal packaging using food consumption and packaging usage data: a refined deterministic approach and a fully probabilistic (FACET) approach.

    Science.gov (United States)

    Oldring, P K T; Castle, L; O'Mahony, C; Dixon, J

    2014-01-01

    The FACET tool is a probabilistic model to estimate exposure to chemicals in foodstuffs, originating from flavours, additives and food contact materials. This paper demonstrates the use of the FACET tool to estimate exposure to BPA (bisphenol A) from light metal packaging. For exposure to migrants from food packaging, FACET uses industry-supplied data on the occurrence of substances in the packaging, their concentrations and construction of the packaging, which were combined with data from a market research organisation and food consumption data supplied by national database managers. To illustrate the principles, UK packaging data were used together with consumption data from the UK National Diet and Nutrition Survey (NDNS) dietary survey for 19-64 year olds for a refined deterministic verification. The UK data were chosen mainly because the consumption surveys are detailed, data for UK packaging at a detailed level were available and, arguably, the UK population is composed of high consumers of packaged foodstuffs. Exposures were run for each food category that could give rise to BPA from light metal packaging. Consumer loyalty to a particular type of packaging, commonly referred to as packaging loyalty, was set. The BPA extraction levels used for the 15 types of coating chemistries that could release BPA were in the range of 0.00005-0.012 mg dm(-2). The estimates of exposure to BPA using FACET for the total diet were 0.0098 (mean) and 0.0466 (97.5th percentile) mg/person/day, corresponding to 0.00013 (mean) and 0.00059 (97.5th percentile) mg kg(-1) body weight day(-1) for consumers of foods packed in light metal packaging. This is well below the current EFSA (and other recognised bodies) TDI of 0.05 mg kg(-1) body weight day(-1). These probabilistic estimates were compared with estimates using a refined deterministic approach drawing on the same input data. The results from FACET for the mean, 95th and 97.5th percentile exposures to BPA lay between the
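    The probabilistic exposure calculation described above (occurrence and concentration data combined with consumption data and packaging loyalty) can be illustrated with a toy Monte Carlo sketch. This is not the FACET tool itself: the function, the consumption range and the contact-area factor are assumptions for the illustration; only the extraction-level range (0.00005-0.012 mg dm(-2)) is taken from the abstract.

```python
import random
import statistics

def simulate_exposure(n_people, extraction_levels, consumption_g_per_day,
                      contact_dm2_per_g, loyalty=0.8, seed=42):
    """Toy Monte Carlo estimate of daily BPA intake (mg/person/day).

    Each simulated consumer is tied to one coating chemistry; with
    probability `loyalty` they stay with it (packaging loyalty),
    otherwise a coating is drawn at random for the day.
    """
    rng = random.Random(seed)
    exposures = []
    for _ in range(n_people):
        usual = rng.choice(extraction_levels)          # mg per dm2 of coating
        grams = rng.uniform(*consumption_g_per_day)    # packed food eaten per day
        level = usual if rng.random() < loyalty else rng.choice(extraction_levels)
        exposures.append(level * grams * contact_dm2_per_g)
    return exposures

# Extraction levels spanning the reported 0.00005-0.012 mg/dm2 range;
# consumption range and contact area per gram of food are invented.
exp_mg = simulate_exposure(10_000, [0.00005, 0.0005, 0.002, 0.012],
                           consumption_g_per_day=(50, 400),
                           contact_dm2_per_g=0.01)
mean = statistics.mean(exp_mg)
p975 = sorted(exp_mg)[int(0.975 * len(exp_mg))]
print(f"mean {mean:.5f} mg/day; 97.5th percentile {p975:.5f} mg/day")
```

    Reading the mean and a high percentile off the simulated distribution mirrors how the abstract reports both a mean and a 97.5th-percentile exposure for comparison against the TDI.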

  6. South Korea - oil refining overview

    International Nuclear Information System (INIS)

    Hayes, D.

    1999-01-01

    Following the economic problems of the 1990s, the petroleum refining industry of South Korea underwent much involuntary restructuring in 1999 with respect to takeovers and mergers, and these are discussed. The demand for petroleum has now largely recovered. The reasons for fluctuating prices in the 1990s, how the new structure should be cushioned against changes in the future, and the potential for South Korea to export refined petroleum are all discussed

  7. Adaptive Mesh Refinement in CTH

    International Nuclear Information System (INIS)

    Crawford, David

    1999-01-01

    This paper reports progress on implementing a new capability of adaptive mesh refinement into the Eulerian multimaterial shock-physics code CTH. The adaptivity is block-based, with refinement and unrefinement occurring in an isotropic 2:1 manner. The code is designed to run on serial, multiprocessor and massively parallel platforms. An approximate factor of three in memory and performance improvements over comparable-resolution non-adaptive calculations has been demonstrated for a number of problems

  8. SU-E-J-128: Two-Stage Atlas Selection in Multi-Atlas-Based Image Segmentation

    International Nuclear Information System (INIS)

    Zhao, T; Ruan, D

    2015-01-01

    Purpose: In the new era of big data, multi-atlas-based image segmentation is challenged by heterogeneous atlas quality and the high computation burden from extensive atlas collections, demanding efficient identification of the most relevant atlases. This study aims to develop a two-stage atlas selection scheme to achieve computational economy with a performance guarantee. Methods: We develop a low-cost fusion set selection scheme by introducing a preliminary selection to trim the full atlas collection into an augmented subset, alleviating the need for extensive full-fledged registrations. More specifically, fusion set selection is performed in two successive steps: preliminary selection and refinement. An augmented subset is first roughly selected from the whole atlas collection with a simple registration scheme and the corresponding preliminary relevance metric; the augmented subset is further refined into the desired fusion set size, using full-fledged registration and the associated relevance metric. The main novelty of this work is the introduction of an inference model to relate the preliminary and refined relevance metrics, based on which the augmented subset size is rigorously derived to ensure the desired atlases survive the preliminary selection with high probability. Results: The performance and complexity of the proposed two-stage atlas selection method were assessed using a collection of 30 prostate MR images. It achieved comparable segmentation accuracy as the conventional one-stage method with full-fledged registration, but significantly reduced computation time to 1/3 (from 30.82 to 11.04 min per segmentation). Compared with an alternative one-stage cost-saving approach, the proposed scheme yielded superior performance with mean and median DSC of (0.83, 0.85) compared to (0.74, 0.78). Conclusion: This work has developed a model-guided two-stage atlas selection scheme to achieve significant cost reduction while guaranteeing high segmentation accuracy.
The benefit
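    The two-stage selection above can be sketched in a few lines. This is a hypothetical toy, not the authors' implementation: `cheap_sim` and `full_sim` stand in for the simple and full-fledged registration metrics, the atlases are scalars rather than MR images, and the augmented-subset size is a fixed multiple of the fusion set size rather than the rigorously derived bound from the inference model.

```python
import random

def two_stage_select(atlases, target, cheap_sim, full_sim,
                     fusion_size=5, augment_factor=3):
    """Two-stage atlas selection sketch.

    Stage 1: rank the whole collection with a cheap preliminary metric
    and keep an augmented subset (fusion_size * augment_factor atlases).
    Stage 2: re-rank only that subset with the expensive full metric
    and return the final fusion set.
    """
    augmented_size = min(len(atlases), fusion_size * augment_factor)
    # Stage 1: cheap, noisy ranking over the full collection
    prelim = sorted(atlases, key=lambda a: cheap_sim(a, target), reverse=True)
    augmented = prelim[:augmented_size]
    # Stage 2: expensive ranking over the augmented subset only
    refined = sorted(augmented, key=lambda a: full_sim(a, target), reverse=True)
    return refined[:fusion_size]

rng = random.Random(0)
atlases = [round(rng.uniform(0.0, 1.0), 3) for _ in range(30)]
target = 0.5
cheap_sim = lambda a, t: -abs(a - t) + rng.gauss(0, 0.05)  # fast but noisy
full_sim = lambda a, t: -abs(a - t)                        # slow but exact
fusion = two_stage_select(atlases, target, cheap_sim, full_sim)
print(fusion)
```

    The expensive metric is evaluated on only 15 of the 30 atlases here, which is the source of the cost saving reported in the abstract.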

  9. Refinements to calandria tube - liquid injection nozzle (CT-LIN) contact assessments

    International Nuclear Information System (INIS)

    Sedran, P.J.

    2012-01-01

    In recent years, the issue of CT-LIN contact, which first gained attention in 1989, has been addressed through CT-LIN gap measurements, followed by analytical predictions of time-to-contact. CT-LIN time-to-contact predictions have been performed independently by CPUS Limited for Point Lepreau and Gentilly-2 and by AECL Sheridan Park (now Candu Energy Inc.) for Bruce Power and Gentilly-2. Both companies used the CDEPTH code in combination with CT-LIN gap measurements. Subsequent to the assessments for Point Lepreau and Gentilly-2, a recommended approach for future assessments was presented at the 2008 CANDU maintenance conference. Since that time, a number of refinements to the overall strategy for predicting CT-LIN time-to-contact have been developed and are outlined in this paper. The refinements include: 1. the use of ultrasonic LIN elevation measurements to confirm LIN creep sag behaviour; 2. the development of a non-linear empirical CT Creep Sag Model; 3. the development of a rationale for discrepancies observed in repeated optical CT-LIN gap measurements and a discussion of alternative CT-LIN gap measurements. With these refinements, more accurate CT-LIN time-to-contact predictions can be obtained. For stations that plan to refurbish by 210,000 EFPH, the improvement in time-to-contact predictions resulting from the aforementioned refinements will not be of any real benefit. However, for stations that are planning life extensions in order to operate beyond 210,000 EFPH, CT-LIN contact will be an issue. For these stations, improvements in CT-LIN contact time predictions would be beneficial. This paper presents a summary of the proposed refinements and demonstrates how they would impact CT-LIN time-to-contact predictions. (author)

  10. Price implications for Russia's oil refining

    International Nuclear Information System (INIS)

    Khartukov, Eugene M.

    1998-01-01

    Over the past several years, Russia's oil industry has undergone a radical transformation from a wholly state-run and generously subsidized oil distribution system toward a substantially privatized, cash-strapped, and quasi-market ''petropreneurship''. This fully applies to the industry's downstream sector. Still, unlike more dynamic E and C operations, the country's refining has turned out to be better fenced off from competitive market forces and less capable of responding to market imperatives. Consequently, jammed between depressed product prices and persistent feedstock costs, Russian refiners were badly hit by the world oil glut, which has made a radical modernization of the obsolete refining sector clearly a must. (author)

  11. Optimizing refiner operation with statistical modelling

    Energy Technology Data Exchange (ETDEWEB)

    Broderick, G [Noranda Research Centre, Pointe Claire, PQ (Canada)

    1997-02-01

    The impact of refining conditions on the energy efficiency of the process and on the handsheet quality of a chemi-mechanical pulp was studied as part of a series of pilot scale refining trials. Statistical models of refiner performance were constructed from these results and non-linear optimization of process conditions was conducted. Optimization results indicated that increasing the ratio of specific energy applied in the first stage led to a reduction of some 15 per cent in the total energy requirement. The strategy can also be used to obtain significant increases in pulp quality for a given energy input. 20 refs., 6 tabs.
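    The workflow described (fit a statistical model of refiner performance, then optimize process conditions) can be illustrated with invented pilot-trial numbers. The data points, the quadratic model form and the grid search below are all assumptions for the sketch; only the idea of trading off the first-stage specific-energy ratio against total energy comes from the abstract.

```python
import numpy as np

# Hypothetical pilot-trial data: first-stage specific-energy ratio vs the
# total specific energy (MJ/kg) needed to reach a fixed pulp-quality target.
ratio = np.array([0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8])
total_energy = np.array([9.8, 9.1, 8.5, 8.2, 8.0, 8.3, 8.9])

# Statistical model: quadratic least-squares fit of energy vs split ratio.
coeffs = np.polyfit(ratio, total_energy, 2)
model = np.poly1d(coeffs)

# Non-linear optimization by dense grid search over the feasible range.
grid = np.linspace(0.2, 0.8, 601)
best = grid[np.argmin(model(grid))]
print(f"optimal first-stage ratio ~ {best:.2f}, "
      f"predicted energy {model(best):.2f} MJ/kg")
```

    A response-surface fit plus a one-dimensional search is about the simplest instance of "statistical model + non-linear optimization"; the actual study would optimize over several process variables at once.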

  12. A new approach to the LILW repository site selection

    International Nuclear Information System (INIS)

    Mele, I.; Zeleznik, N.

    1998-01-01

    After the failure of the site selection performed between 1990 and 1993, the Agency for Radwaste Management was urged to start a new site selection process for low and intermediate level waste (LILW). Since this is the most sensitive and delicate phase of the whole disposal project, extensive analyses of foreign and domestic experiences in siting were performed. Three different models were studied and discussed at a workshop on preparation of the siting procedure for the LILW repository. The participants invited to the workshop supported the combined approach to site selection, which is presented in this paper. (author)

  13. A Fuzzy Decision Making Approach for Supplier Selection in Healthcare Industry

    OpenAIRE

    Zeynep Sener; Mehtap Dursun

    2014-01-01

    Supplier evaluation and selection is one of the most important components of an effective supply chain management system. Due to the expanding competition in healthcare, selecting the right medical device suppliers offers great potential for increasing quality while decreasing costs. This paper proposes a fuzzy decision making approach for medical supplier selection. A real-world medical device supplier selection problem is presented to illustrate the application of the proposed decision meth...

  14. Pacific Basin Heavy Oil Refining Capacity

    Directory of Open Access Journals (Sweden)

    David Hackett

    2013-02-01

    Full Text Available The United States today is Canada’s largest customer for oil and refined oil products. However, this relationship may be strained due to physical, economic and political influences. Pipeline capacity is approaching its limits; Canadian oil is selling at substantive discounts to world market prices; and U.S. demand for crude oil and finished products (such as gasoline, has begun to flatten significantly relative to historical rates. Lower demand, combined with increased shale oil production, means U.S. demand for Canadian oil is expected to continue to decline. Under these circumstances, gaining access to new markets such as those in the Asia-Pacific region is becoming more and more important for the Canadian economy. However, expanding pipeline capacity to the Pacific via the proposed Northern Gateway pipeline and the planned Trans Mountain pipeline expansion is only feasible when there is sufficient demand and processing capacity to support Canadian crude blends. Canadian heavy oil requires more refining and produces less valuable end products than other lighter and sweeter blends. Canadian producers must compete with lighter, sweeter oils from the Middle East, and elsewhere, for a place in the Pacific Basin refineries built to handle heavy crude blends. Canadian oil sands producers are currently expanding production capacity. Once complete, the Northern Gateway pipeline and the Trans Mountain expansion are expected to deliver an additional 500,000 to 1.1 million barrels a day to tankers on the Pacific coast. Through this survey of the capacity of Pacific Basin refineries, including existing and proposed facilities, we have concluded that there is sufficient technical capacity in the Pacific Basin to refine the additional Canadian volume; however, there may be some modifications required to certain refineries to allow them to process Western Canadian crude. Any additional capacity for Canadian oil would require refinery modifications or

  15. Grain refinement mechanism in A3003 alloy

    International Nuclear Information System (INIS)

    Cho, Hoon; Shin, Je-Sik; Lee, Byoung-Soo; Jo, Hyung-Ho

    2009-01-01

    In the present study, in order to identify the grain refinement mechanism, 0.1 wt.% of an Al-10 wt.% Ti master alloy was added to A3003 alloy melt contained in a graphite crucible and in an alumina crucible, and the melt holding time at 750 deg. C was systematically varied from 1 min up to 120 min. It is interesting to note that the grain refinement and fading phenomena depend remarkably on the crucible material. The fading effect in the specimens using the alumina crucible can be explained as the result of TiAl₃ phase dissolution into the molten aluminium matrix. In the specimens using the graphite crucible, grain refinement occurred gradually with increasing holding time. It was suggested that the continuous grain refinement is due to a transition of the refinement mechanism from the TiAl₃ phase to the TiC phase. The TiC formed from titanium and carbon solute in the aluminium melt, which came from the Al-10Ti alloy and the graphite crucible.

  16. Developing, Approving and Maintaining Qualifications: Selected International Approaches. Research Report

    Science.gov (United States)

    Misko, Josie

    2015-01-01

    There are lessons for Australia in the key approaches to the development, approval, maintenance and quality assurance of qualifications adopted in countries overseas. This research takes into account a range of approaches used in selected European Union (EU) member states (Germany, Finland and Sweden), the United Kingdom (England, Northern Ireland…

  17. Comparison of a rational vs. high throughput approach for rapid salt screening and selection.

    Science.gov (United States)

    Collman, Benjamin M; Miller, Jonathan M; Seadeek, Christopher; Stambek, Julie A; Blackburn, Anthony C

    2013-01-01

    In recent years, high throughput (HT) screening has become the most widely used approach for early phase salt screening and selection in a drug discovery/development setting. The purpose of this study was to compare a rational approach for salt screening and selection to those results previously generated using a HT approach. The rational approach involved a much smaller number of initial trials (one salt synthesis attempt per counterion) that were selected based on a few strategic solubility determinations of the free form combined with a theoretical analysis of the ideal solvent solubility conditions for salt formation. Salt screening results for sertraline, tamoxifen, and trazodone using the rational approach were compared to those previously generated by HT screening. The rational approach produced similar results to HT screening, including identification of the commercially chosen salt forms, but with a fraction of the crystallization attempts. Moreover, the rational approach provided enough solid from the very initial crystallization of a salt for more thorough and reliable solid-state characterization and thus rapid decision-making. The crystallization techniques used in the rational approach mimic larger-scale process crystallization, allowing smoother technical transfer of the selected salt to the process chemist.

  18. Oil refining expansion criteria for Brazil

    International Nuclear Information System (INIS)

    Tavares, M.E.E.; Szklo, A.S.; Machado, G.V.; Schaeffer, R.; Mariano, J.B.; Sala, J.F.

    2006-01-01

    This paper assesses different strategies for the expansion of Brazil's oil refining segment, using criteria that range from energy security (reducing imports and vulnerability for key products) through to maximizing the profitability of this sector (boosting the output of higher value oil products) and adding value to Brazil's oil production (reducing exports of heavy acid oil). The development prospects are analyzed for conventional fuel production technology routes, sketching out three possible refining schemes for Brazilian oil and a GTL plant for producing gasoil from natural gas. Market scenario simulations indicate that investments will be required in Brazil's oil refining segment over and above those allocated to planned modifications in its current facilities, reducing the nation's vulnerability in terms of gasoil and petrochemical naphtha imports. Although not economically attractive, oil refining is a key activity that is crucial to oil company strategies. The decision to invest in this segment depends on local infrastructure conditions, environmental constraints and fuel specifications, in addition to oil company strategies, steady growth in demand and the definition of a government policy that eases institutional risks. (author)

  20. Solving the scalability issue in quantum-based refinement: Q|R#1.

    Science.gov (United States)

    Zheng, Min; Moriarty, Nigel W; Xu, Yanting; Reimers, Jeffrey R; Afonine, Pavel V; Waller, Mark P

    2017-12-01

    Accurately refining biomacromolecules using a quantum-chemical method is challenging because the cost of a quantum-chemical calculation scales approximately as n^m, where n is the number of atoms and m (≥3) is based on the quantum method of choice. This fundamental problem means that quantum-chemical calculations become intractable when the size of the system requires more computational resources than are available. In the development of the software package called Q|R, this issue is referred to as Q|R#1. A divide-and-conquer approach has been developed that fragments the atomic model into small manageable pieces in order to solve Q|R#1. Firstly, the atomic model of a crystal structure is analyzed to detect noncovalent interactions between residues, and the results of the analysis are represented as an interaction graph. Secondly, a graph-clustering algorithm is used to partition the interaction graph into a set of clusters in such a way as to minimize disruption to the noncovalent interaction network. Thirdly, the environment surrounding each individual cluster is analyzed and any residue that is interacting with a particular cluster is assigned to the buffer region of that particular cluster. A fragment is defined as a cluster plus its buffer region. The gradients for all atoms from each of the fragments are computed, and only the gradients from each cluster are combined to create the total gradients. A quantum-based refinement is carried out using the total gradients as chemical restraints. In order to validate this interaction graph-based fragmentation approach in Q|R, the entire atomic model of an amyloid cross-β spine crystal structure (PDB entry 2oNA) was refined.
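    The fragmentation scheme (cluster the interaction graph, then wrap each cluster in a buffer of interacting residues) can be sketched as follows. This is a simplified illustration, not the Q|R code: greedy cluster growth with a size cap stands in for the graph-clustering algorithm, and the residue names and interaction list are invented.

```python
def fragment(residues, interactions, max_cluster=4):
    """Sketch of interaction-graph fragmentation: partition residues into
    disjoint clusters, then assign every residue that interacts with a
    cluster to that cluster's buffer. A fragment is cluster + buffer.
    """
    # adjacency from the noncovalent interaction list
    adj = {r: set() for r in residues}
    for a, b in interactions:
        adj[a].add(b)
        adj[b].add(a)

    unassigned = set(residues)
    fragments = []
    for seed in residues:                      # deterministic seed order
        if seed not in unassigned:
            continue
        cluster = {seed}
        unassigned.discard(seed)
        frontier = [seed]
        # grow the cluster along interaction edges up to the size cap,
        # keeping interacting residues together (fewer cut edges)
        while frontier and len(cluster) < max_cluster:
            current = frontier.pop(0)
            for nbr in sorted(adj[current]):
                if nbr in unassigned and len(cluster) < max_cluster:
                    cluster.add(nbr)
                    unassigned.discard(nbr)
                    frontier.append(nbr)
        # buffer: every residue outside the cluster that interacts with it
        buf = {n for r in cluster for n in adj[r]} - cluster
        fragments.append((cluster, buf))
    return fragments

# Hypothetical 8-residue chain with one long-range contact (R2-R7).
residues = [f"R{i}" for i in range(1, 9)]
interactions = [(f"R{i}", f"R{i+1}") for i in range(1, 8)] + [("R2", "R7")]
fragments = fragment(residues, interactions, max_cluster=4)
for cluster, buf in fragments:
    print(sorted(cluster), "buffer:", sorted(buf))
```

    As in the paper's scheme, the clusters partition the model (each atom's gradient is taken from exactly one cluster), while buffers overlap freely since they only provide a chemical environment.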

  1. Accurate macromolecular crystallographic refinement: incorporation of the linear scaling, semiempirical quantum-mechanics program DivCon into the PHENIX refinement package

    Energy Technology Data Exchange (ETDEWEB)

    Borbulevych, Oleg Y.; Plumley, Joshua A.; Martin, Roger I. [QuantumBio Inc., 2790 West College Avenue, State College, PA 16801 (United States); Merz, Kenneth M. Jr [University of Florida, Gainesville, Florida (United States); Westerhoff, Lance M., E-mail: lance@quantumbioinc.com [QuantumBio Inc., 2790 West College Avenue, State College, PA 16801 (United States)

    2014-05-01

    Semiempirical quantum-chemical X-ray macromolecular refinement using the program DivCon integrated with PHENIX is described. Macromolecular crystallographic refinement relies on sometimes dubious stereochemical restraints and rudimentary energy functionals to ensure the correct geometry of the model of the macromolecule and any covalently bound ligand(s). The ligand stereochemical restraint file (CIF) requires a priori understanding of the ligand geometry within the active site, and creation of the CIF is often an error-prone process owing to the great variety of potential ligand chemistry and structure. Stereochemical restraints have been replaced with more robust functionals through the integration of the linear-scaling, semiempirical quantum-mechanics (SE-QM) program DivCon with the PHENIX X-ray refinement engine. The PHENIX/DivCon package has been thoroughly validated on a population of 50 protein–ligand Protein Data Bank (PDB) structures with a range of resolutions and chemistry. The PDB structures used for the validation were originally refined utilizing various refinement packages and were published within the past five years. PHENIX/DivCon does not utilize CIF(s), link restraints and other parameters for refinement and hence it does not make as many a priori assumptions about the model. Across the entire population, the method results in reasonable ligand geometries and low ligand strains, even when the original refinement exhibited difficulties, indicating that PHENIX/DivCon is applicable to both single-structure and high-throughput crystallography.

  2. Comparing geological and statistical approaches for element selection in sediment tracing research

    Science.gov (United States)

    Laceby, J. Patrick; McMahon, Joe; Evrard, Olivier; Olley, Jon

    2015-04-01

    Elevated suspended sediment loads reduce reservoir capacity and significantly increase the cost of operating water treatment infrastructure, making the management of sediment supply to reservoirs increasingly important. Sediment fingerprinting techniques can be used to determine the relative contributions of different sources of sediment accumulating in reservoirs. The objective of this research is to compare geological and statistical approaches to element selection for sediment fingerprinting modelling. Time-integrated samplers (n=45) were used to obtain source samples from four major subcatchments flowing into the Baroon Pocket Dam in South East Queensland, Australia. The geochemistry of potential sources was compared to the geochemistry of sediment cores (n=12) sampled in the reservoir. The geochemical approach selected elements for modelling that provided expected, observed and statistical discrimination between sediment sources. Two statistical approaches selected elements for modelling with the Kruskal-Wallis H-test and Discriminant Function Analysis (DFA). In particular, two different significance levels (0.05 & 0.35) for the DFA were included to investigate the importance of element selection on modelling results. A distribution model determined the relative contributions of different sources to sediment sampled in the Baroon Pocket Dam. Elemental discrimination was expected between one subcatchment (Obi Obi Creek) and the remaining subcatchments (Lexys, Falls and Bridge Creek). Six major elements were expected to provide discrimination. Of these six, only Fe2O3 and SiO2 provided expected, observed and statistical discrimination. Modelling results with this geological approach indicated 36% (+/- 9%) of sediment sampled in the reservoir cores was from mafic-derived sources and 64% (+/- 9%) was from felsic-derived sources.
The geological and the first statistical approach (DFA0.05) differed by only 1% (σ 5%) for 5 out of 6 model groupings with only
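    The Kruskal-Wallis screening step used by the statistical approaches can be sketched with `scipy.stats.kruskal`. The group and element names follow the abstract, but the concentration values below are invented for illustration; the 0.05 significance level matches the first statistical approach.

```python
from scipy.stats import kruskal

def select_elements(sources, alpha=0.05):
    """First-pass tracer screening: keep elements whose concentrations
    differ significantly between source groups (Kruskal-Wallis H-test).

    sources: {group_name: {element: [concentrations, ...]}}
    """
    elements = next(iter(sources.values())).keys()
    selected = []
    for el in elements:
        groups = [grp[el] for grp in sources.values()]
        _, p = kruskal(*groups)
        if p < alpha:
            selected.append(el)
    return selected

# Invented concentrations (wt.%): Fe2O3 separates the groups, TiO2 does not.
sources = {
    "Obi Obi Creek": {"Fe2O3": [5.1, 5.3, 5.2, 5.4, 5.0],
                      "TiO2":  [0.50, 0.70, 0.60, 0.80, 0.55]},
    "Other creeks":  {"Fe2O3": [8.1, 8.3, 8.2, 8.0, 8.4],
                      "TiO2":  [0.60, 0.65, 0.75, 0.50, 0.70]},
}
selected = select_elements(sources)
print(selected)
```

    In a full workflow this non-parametric screen would be followed by DFA to pick the final composite fingerprint, as the abstract describes.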

  3. Lanthanum hexaboride as advanced structural refiner/getter in TiAl-based refractory intermetallics

    Energy Technology Data Exchange (ETDEWEB)

    Kartavykh, A.V., E-mail: karta@korolev-net.ru [Technological Institute for Superhard and Novel Carbon Materials (TISNCM), 7a Centralnaya str., 142190 Troitsk, Moscow (Russian Federation); National University of Science and Technology “MISIS”, Leninsky pr. 4, 119049 Moscow (Russian Federation); Asnis, E.A.; Piskun, N.V.; Statkevich, I.I. [The E.O. Paton Electric Welding Institute, 11 Bozhenko str., 03680 Kyiv (Ukraine); Gorshenkov, M.V.; Tcherdyntsev, V.V. [National University of Science and Technology “MISIS”, Leninsky pr. 4, 119049 Moscow (Russian Federation)

    2014-03-05

    Highlights: • First application of LaB₆ additive in TiAl-based intermetallics casting. • Pilot synthesis/casting and study of a selected TiAl(Nb,Cr,Zr)B,La alloys set. • Dual effect observed: phase structure refinement and oxygen impurity removal. • Co-precipitation of TiB and La₂O₃ in melt: 2LaB₆ + 12Ti + 3O → 12TiB↓ + La₂O₃↓. • Features of structure refinement and oxygen gettering mechanisms reported. -- Abstract: The work is aimed at the study of the formation and refinement of the microstructure appearing in solidifying refractory TiAl-based intermetallics inoculated with a precise boron addition. The novelty of the research consists in the test application of a lanthanum hexaboride (LaB₆) ligature within a semi-continuous electron beam casting process for selected alloys. Two ingots with nominal compositions Ti–44Al–5Nb–2Cr–1.5Zr–0.4B–0.07La and Ti–44Al–5Nb–1Cr–1.5Zr–1B–0.17La (at.%) have been synthesized and cast along with the reference alloy Ti–44Al–5Nb–3Cr–1.5Zr. Their comparative examination suggests (i) an essential microstructural phase refinement effect coupled with (ii) a threefold/fourfold decrease in the background content of undesirable residual oxygen impurity in both alloys containing LaB₆. This advanced dual activity (i–ii) of LaB₆ is explained by its complete dissolution and dissociation, followed by re-precipitation of effective Ti-based monoboride nucleants of orthorhombic B27 structure, accompanied by strong internal gettering of dissolved oxygen from the melt and from the boride-inoculated solid α₂-Ti₃Al phase by the liberated elemental lanthanum. The phase composition and structure of the cast alloys, the state and characterization of the newly precipitated TiB boride, and the features of La₂O₃ micro/nano-dimensional precipitation and the oxygen gettering mechanism are reported and discussed.

  4. Refined large N duality for knots

    DEFF Research Database (Denmark)

    Kameyama, Masaya; Nawata, Satoshi

    We formulate large N duality of U(N) refined Chern-Simons theory with a torus knot/link in S³. By studying refined BPS states in M-theory, we provide the explicit form of low-energy effective actions of Type IIA string theory with D4-branes on the Ω-background. This form enables us to relate...

  5. Region-of-interest volumetric visual hull refinement

    KAUST Repository

    Knoblauch, Daniel; Kuester, Falko

    2010-01-01

    This paper introduces a region-of-interest visual hull refinement technique, based on flexible voxel grids for volumetric visual hull reconstructions. Region-of-interest refinement is based on a multipass process, beginning with a focussed visual

  6. Physicochemical and antioxidant properties of non-refined sugarcane alternatives to white sugar

    OpenAIRE

    Seguí Gil, Lucía; CALABUIG JIMENEZ, LAURA; Betoret Valls, Noelia; Fito Maupoey, Pedro

    2015-01-01

    [EN] Antioxidant properties of commercial sugarcane-derived products were analysed to study their suitability for being used as functional ingredients. Cane honey, several jaggeries and several brown sugars were selected from the market and analysed in terms of physicochemical characteristics and antioxidant properties, and compared with white refined sugar (twelve products in total). Moisture, water activity, total soluble solids, pH, colour and sugar profile are reported. As for antioxidant...

  7. A simulated annealing approach to supplier selection aware inventory planning

    OpenAIRE

    Turk, Seda; Miller, Simon; Özcan, Ender; John, Robert

    2015-01-01

    Selection of an appropriate supplier is a crucial and challenging task in the effective management of a supply chain. Also, appropriate inventory management is critical to the success of a supply chain operation. In recent years, there has been a growing interest in the area of selection of an appropriate vendor and creating good inventory planning using supplier selection information. In this paper, we consider both of these tasks in a two-stage approach employing Interval Type-2 Fuzzy Sets ...

  8. Ranking and selection of commercial off-the-shelf using fuzzy distance based approach

    Directory of Open Access Journals (Sweden)

    Rakesh Garg

    2015-06-01

    Full Text Available There has been tremendous growth in the use of the component-based software engineering (CBSE) approach for the development of software systems. The selection of the best-suited COTS components, which fulfil the necessary requirements for the development of software, has become a major challenge for software developers. The complexity of the optimal selection problem increases with an increase in alternative potential COTS components and the corresponding selection criteria. In this research paper, the problem of ranking and selection of Data Base Management Systems (DBMS) components is modeled as a multi-criteria decision making problem. A ‘Fuzzy Distance Based Approach (FDBA)’ method is proposed for the optimal ranking and selection of DBMS COTS components of an e-payment system based on 14 selection criteria grouped under three major categories, i.e. ‘Vendor Capabilities’, ‘Business Issues’ and ‘Cost’. The results of this method are compared with those of the Analytical Hierarchy Process (AHP), a typical multi-criteria decision making approach. The proposed methodology is explained with an illustrative example.
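    A distance-based fuzzy ranking in the spirit of FDBA can be sketched as follows. This is a simplified illustration, not the paper's exact method: alternatives are scored by their weighted vertex-method distance from an ideal triangular fuzzy rating, and the DBMS names, criteria weights and ratings below are invented.

```python
import math

def fuzzy_distance(a, b):
    """Vertex-method distance between triangular fuzzy numbers (l, m, u)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / 3)

def rank_alternatives(ratings, weights):
    """Distance-based ranking sketch: score each alternative by its
    weighted fuzzy distance to the ideal rating (1, 1, 1) on every
    criterion; a smaller total distance ranks higher.

    ratings: {alternative: [(l, m, u) per criterion]}, normalized to [0, 1]
    """
    ideal = (1.0, 1.0, 1.0)
    scores = {
        alt: sum(w * fuzzy_distance(r, ideal) for w, r in zip(weights, fuzzy))
        for alt, fuzzy in ratings.items()
    }
    return sorted(scores, key=scores.get)  # best (smallest distance) first

# Hypothetical DBMS alternatives rated on three weighted criteria
ratings = {
    "DBMS-A": [(0.6, 0.8, 1.0), (0.4, 0.6, 0.8), (0.6, 0.8, 1.0)],
    "DBMS-B": [(0.2, 0.4, 0.6), (0.6, 0.8, 1.0), (0.4, 0.6, 0.8)],
    "DBMS-C": [(0.0, 0.2, 0.4), (0.2, 0.4, 0.6), (0.2, 0.4, 0.6)],
}
order = rank_alternatives(ratings, weights=[0.5, 0.3, 0.2])
print(order)
```

    Triangular fuzzy numbers let linguistic judgements such as "good" or "fair" carry their uncertainty through the ranking, which is the motivation for fuzzy rather than crisp multi-criteria methods.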

  9. Protein homology model refinement by large-scale energy optimization.

    Science.gov (United States)

    Park, Hahnbeom; Ovchinnikov, Sergey; Kim, David E; DiMaio, Frank; Baker, David

    2018-03-20

    Proteins fold to their lowest free-energy structures, and hence the most straightforward way to increase the accuracy of a partially incorrect protein structure model is to search for the lowest-energy nearby structure. This direct approach has met with little success for two reasons: first, energy function inaccuracies can lead to false energy minima, resulting in model degradation rather than improvement; and second, even with an accurate energy function, the search problem is formidable because the energy only drops considerably in the immediate vicinity of the global minimum, and there are a very large number of degrees of freedom. Here we describe a large-scale energy optimization-based refinement method that incorporates advances in both search and energy function accuracy that can substantially improve the accuracy of low-resolution homology models. The method refined low-resolution homology models into correct folds for 50 of 84 diverse protein families and generated improved models in recent blind structure prediction experiments. Analyses of the basis for these improvements reveal contributions from both the improvements in conformational sampling techniques and the energy function.

  10. US refining reviewed

    International Nuclear Information System (INIS)

    Yamaguchi, N.D.

    1998-01-01

    The paper reviews the history, present position and future prospects of the petroleum industry in the USA. The main focus is on supply and demand, the high quality of the products, refinery capacity and product trade balances. Diagrams show historical trends in output, product demand, demand for transport fuels and oil, refinery capacity, refinery closures, and imports and exports. Some particularly salient points brought out were (i) production of US crude shows a marked downward trend but imports of crude will continue to increase, (ii) product demand will continue to grow even though the levels are already high, (iii) the demand is dominated by those products that typically yield the highest income for the refiner (i.e. high-quality transport fuels for environmental compliance), (iv) refinery capacity has decreased since 1980 and (v) refining will continue to have financial problems but will still be profitable. (UK)

  11. NMR structural refinement of an extrahelical adenosine tridecamer d(CGCAGAATTCGCG)2 via a hybrid relaxation matrix procedure

    International Nuclear Information System (INIS)

    Nikonowicz, E.P.; Meadows, R.P.; Gorenstein, D.G.

    1990-01-01

    Until very recently, interproton distances from NOESY experiments have been derived solely from the two-spin approximation method. Unfortunately, even at short mixing times, there is a significant error in many of these distances. A complete relaxation matrix approach employing a matrix eigenvalue/eigenvector solution to the Bloch equations avoids the approximation of the two-spin method. The authors calculated the structure of an extrahelical adenosine tridecamer oligodeoxyribonucleotide duplex, d(CGCAGAATTCGCG)2, by an iterative refinement approach using a hybrid relaxation matrix method combined with restrained molecular dynamics calculations. Distances from the 2D NOESY spectra have been calculated from the relaxation rate matrix, which has been evaluated from a hybrid NOESY volume matrix comprising elements from the experiment and those calculated from an initial structure. The hybrid matrix derived distances have then been used in a restrained molecular dynamics procedure to obtain a new structure that better approximates the NOESY spectra. The resulting partially refined structure is then used to calculate an improved theoretical NOESY volume matrix, which is once again merged with the experimental matrix until refinement is complete. Although the crystal structure of the tridecamer clearly shows the extrahelical adenosine looped out away from the duplex, the NOESY distance restrained hybrid matrix/molecular dynamics structural refinement establishes that the extrahelical adenosine stacks into the duplex.

  12. A potential approach for low flow selection in water resource supply and management

    Science.gov (United States)

    Ouyang, Ying

    2012-08-01

    Low flow selections are essential to water resource management, water supply planning, and watershed ecosystem restoration. In this study, a new approach, namely the frequent-low (FL) approach (or frequent-low index), was developed based on the minimum frequent-low flow or level used in the minimum flows and/or levels program in northeast Florida, USA. This FL approach was then compared to the conventional 7Q10 approach for low flow selections prior to its applications, using USGS flow data from a freshwater environment (Big Sunflower River, Mississippi) as well as from an estuarine environment (St. Johns River, Florida). Unlike the FL approach, which is associated with biological and ecological impacts, the 7Q10 approach can lead to the selection of extremely low flows (e.g., near-zero flows), which may hinder its use for establishing criteria to protect streams from significant harm to biological and ecological communities. Additionally, the 7Q10 approach cannot by definition be used when the period of data records is less than 10 years, while this may not be the case for the FL approach. Results from both approaches showed that the low flows of the Big Sunflower River and the St. Johns River decreased as time elapsed, demonstrating that these two rivers have become drier over the last several decades, with a potential for saltwater intrusion into the St. Johns River. Results from the FL approach further revealed that the recurrence probability of low flow increased while the recurrence interval of low flow decreased as time elapsed in both rivers, indicating that low flows occurred more frequently in these rivers over time. This report suggests that the FL approach, developed in this study, is a useful alternative to the 7Q10 approach for low flow selections.
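    The 7Q10 statistic discussed above is conventionally the annual minimum 7-day average flow with a 10-year recurrence interval. A minimal sketch of one common way to estimate it, assuming complete daily records per year and a Weibull plotting position with linear interpolation (the paper's FL index itself is not reproduced here):

```python
def seven_day_minima(daily_flows_by_year):
    """Annual minimum of the 7-day moving-average flow, one value per year."""
    minima = []
    for flows in daily_flows_by_year:
        means = [sum(flows[i:i + 7]) / 7.0 for i in range(len(flows) - 6)]
        minima.append(min(means))
    return minima

def q7_10(daily_flows_by_year):
    """7Q10: the 7-day annual low flow with a 10-year recurrence interval,
    estimated with the Weibull plotting position m/(n+1)."""
    m = sorted(seven_day_minima(daily_flows_by_year))
    n = len(m)
    if n < 10:
        raise ValueError("7Q10 needs at least 10 years of record")
    pos = 0.1 * (n + 1) - 1        # 0-based rank at non-exceedance prob. 0.1
    lo = int(pos)
    frac = pos - lo
    return m[lo] + frac * (m[lo + 1] - m[lo])
```

    Practical implementations usually fit a distribution (e.g. log-Pearson III) to the annual minima instead of interpolating the empirical quantile, but the plotting-position form shows the mechanics.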

  13. Neutrosophic Refined Similarity Measure Based on Cosine Function

    Directory of Open Access Journals (Sweden)

    Said Broumi

    2014-12-01

    Full Text Available In this paper, the cosine similarity measure of neutrosophic refined (multi-) sets is proposed and its properties are studied. This cosine similarity measure of neutrosophic refined sets extends the improved cosine similarity measure of single-valued neutrosophic sets. Finally, using this cosine similarity measure of neutrosophic refined sets, an application to medical diagnosis is presented.
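    A minimal sketch of a cosine similarity over neutrosophic refined sets, assuming each element carries several (truth, indeterminacy, falsity) triples, one per refinement level, and the basic dot-product form of the cosine measure (the paper's exact refined formula may weight terms differently):

```python
import math

def cosine_svn(a, b):
    """Cosine similarity between two single-valued neutrosophic values (T, I, F)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def cosine_refined(A, B):
    """Average the cosine similarity over all elements and all refinement
    levels of two neutrosophic refined (multi-) sets A and B."""
    total, count = 0.0, 0
    for elem_a, elem_b in zip(A, B):          # elements of the universe
        for ta, tb in zip(elem_a, elem_b):    # refinement levels per element
            total += cosine_svn(ta, tb)
            count += 1
    return total / count
```

    In the medical-diagnosis setting, A would hold a patient's refined symptom evaluations and B a disease prototype; the disease with the highest similarity is the diagnosis candidate.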

  14. Rethinking of the heuristic-analytic dual process theory: a comment on Wada and Nittono (2004) and the reasoning process in the Wason selection task.

    Science.gov (United States)

    Cardaci, Maurizio; Misuraca, Raffaella

    2005-08-01

    This paper raises some methodological problems in the dual process explanation provided by Wada and Nittono for their 2004 results using the Wason selection task. We maintain that the Nittono rethinking approach is weak and that it should be refined to grasp better the evidence of analytic processes.

  15. Future of French refining

    International Nuclear Information System (INIS)

    Calvet, B.

    1993-01-01

    Over recent years, the refining industry has had to grapple with a growing burden of environmental and safety regulations concerning not only its plants and other facilities, but also its end products. At the same time, it has had to bear the effects of the reduction of the special status that used to apply to petroleum, and the consequences of economic freedom, to which we should add, as specifically concerns the French market, the impact of energy policy and the pro-nuclear option. The result is a drop in heavy fuel oil from 36 million tonnes per year in 1973 to 6.3 million in 1992, and in home-heating fuel from 37 to 18 million tonnes per year. This fast-moving market is highly competitive. The French market in particular is wide open to imports, but the refining companies are still heavy exporters of products with high added value, like lubricants, jet fuel, and lead-free gasolines. The competition has led the refining companies to commit themselves to quality, and to publicize their efforts in this direction. This is why the long-term perspectives for petroleum fuels are still wide open, supported by the expectation that the goal of economic efficiency is likely to soften the effects of the energy policy penalizing petroleum products, now that these products have become competitive again. In the European context, with the challenge of environmental protection and the decline in heavy fuel outlets, French refining has to keep on improving the quality of its products and plants, which means major investments. The industry absolutely must return to a more normal level of profitability, in order to sustain this financial effort and generate the prosperity of its high-performance plants and equipment. 1 fig., 5 tabs

  16. The evolution of oil refining in Europe

    Energy Technology Data Exchange (ETDEWEB)

    Reid, A. [CONCAWE, Brussels (Belgium)

    2013-04-01

    Back in 1963 when CONCAWE was founded, the world looked very different from what it is today, and so did the global and European refining industry. Oil product markets were expanding fast and new refineries were being built at a steady rate. The oil crisis of the 1970s brought an abrupt end to this, heralding a long era of consolidation and stepwise adaptation. At the same time the nature of the global oil business shifted from fully integrated companies producing, transporting and refining their own oil to a much more diversified situation where oil production ('upstream') and refining/distribution ('downstream') gradually became two essentially separate businesses. From being purely a 'cost centre' in an integrated chain, refining has become a separate activity in its own right, operating as a 'profit centre' between two global markets - crude oil and products - which, although not entirely independent, have their own dynamics and influences. In addition demand gradually shifted towards lighter products while the quality requirements on all products were considerably tightened. This article explores the new challenges that these changes have imposed on EU refiners, and describes CONCAWE's contributions to understanding their impact on refinery production and investments.

  17. Noise bias in the refinement of structures derived from single particles

    International Nuclear Information System (INIS)

    Stewart, Alex; Grigorieff, Nikolaus

    2004-01-01

    One of the main goals in the determination of three-dimensional macromolecular structures from electron microscope images of individual molecules and complexes (single particles) is a sufficiently high spatial resolution, about 4 Å, at which interpretation with an atomic model becomes possible. To reach high resolution, an iterative refinement procedure using an expectation maximization algorithm is often used that leads to a more accurate alignment of the positional and orientational parameters for each particle. We show here the results of refinement algorithms that use a phase residual, a linear correlation coefficient, or a weighted correlation coefficient to align individual particles. The algorithms were applied to computer-generated data sets that contained projections from model structures, as well as noise. The algorithms show different degrees of over-fitting, especially at high resolution where the signal is weak. We demonstrate that the degree of over-fitting is reduced with a weighting scheme that depends on the signal-to-noise ratio in the data. The weighting also improves the accuracy of resolution measurement by the commonly used Fourier shell correlation. The performance of the refinement algorithms is compared to that of a maximum likelihood approach. The weighted correlation coefficient was implemented in the computer program FREALIGN.
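    The weighted correlation coefficient described above can be sketched generically: each term is weighted by a function of its estimated signal-to-noise ratio, so the noisy high-resolution terms that drive over-fitting contribute less to the alignment score. This is an illustrative reduction to real-valued lists, not FREALIGN's implementation, and the w = snr/(1+snr) weighting is an assumed example:

```python
import math

def weighted_correlation(a, b, weights):
    """Weighted correlation between two signals (e.g. particle and reference,
    reduced here to real-valued lists)."""
    num = sum(w * x * y for w, x, y in zip(weights, a, b))
    den = math.sqrt(sum(w * x * x for w, x in zip(weights, a)) *
                    sum(w * y * y for w, y in zip(weights, b)))
    return num / den if den else 0.0

# One plausible SNR-derived weighting: w = snr / (1 + snr), so terms with
# snr -> 0 contribute nothing and terms with snr -> infinity get full weight.
snr = [10.0, 2.0, 0.5, 0.1]
weights = [s / (1.0 + s) for s in snr]
```

    In single-particle refinement the sums would run over Fourier components grouped by resolution shell, with one SNR estimate per shell.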

  18. Refining's-clean new jingle

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    This paper reports that at a time when profit margins are slim and gasoline demand is down, the U.S. petroleum-refining industry is facing one of its greatest challenges; How to meet new federal and state laws for reformulated gasoline, oxygenated fuels, low-sulfur diesel and other measures to improve the environment. The American Petroleum Institute (API) estimates that industry will spend between $15 and $23 billion by the end of the decade to meet the U.S. Clean Air Act Amendments (CAAA) of 1990, and other legislation. ENSR Consulting and Engineering's capital-spending figure runs to between $70 and 100 billion this decade, including $24 billion to produce reformulated fuels and $10-12 billion to reduce refinery emissions. M.W. Kellogg Co. estimates that refiners may have to spend up to $30 billion this decade to meet the demand for reformulated gasoline. The estimates are wide-ranging because refiners are still studying their options and delaying final decisions as long as they can, to try to ensure they are the best and least-costly decisions. Oxygenated fuels will be required next winter, but federal regulations for reformulated gasoline won't go into effect until 1995, while California's tougher reformulated-fuels law will kick in the following year

  19. A New Spectral Shape-Based Record Selection Approach Using Np and Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Edén Bojórquez

    2013-01-01

    Full Text Available With the aim of improving code-based real record selection criteria, an approach inspired by a proxy parameter of spectral shape, named Np, is analyzed. The procedure is based on several objectives aimed at minimizing the record-to-record variability of the ground motions selected for seismic structural assessment. In order to select the best set of ground motion records to be used as input for nonlinear dynamic analysis, an optimization approach is applied using genetic algorithms focused on finding the set of records most compatible with a target spectrum and target Np values. The results of the new Np-based approach suggest that the real accelerograms obtained with this procedure reduce the scatter of the response spectra compared with the traditional approach; furthermore, the mean spectrum of the set of records is very similar to the target seismic design spectrum in the period range of interest, and at the same time, similar Np values are obtained for the selected records and the target spectrum.
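    A toy version of the selection loop can illustrate the GA idea: a population of candidate record subsets is evolved so that the mean spectrum of the selected records approaches the target spectrum. The spectra, fitness function, and genetic operators below are simplified assumptions, not the paper's Np-based objective:

```python
import random

def ga_select(spectra, target, k, pop=30, gens=60, seed=1):
    """Tiny genetic algorithm: pick k record spectra whose average best
    matches a target design spectrum (smaller misfit = fitter)."""
    rng = random.Random(seed)
    n = len(spectra)

    def misfit(idx):
        mean = [sum(spectra[i][t] for i in idx) / k for t in range(len(target))]
        return sum((m - s) ** 2 for m, s in zip(mean, target))

    population = [tuple(sorted(rng.sample(range(n), k))) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=misfit)
        parents = population[: pop // 2]          # elitist truncation selection
        children = []
        while len(parents) + len(children) < pop:
            a, b = rng.sample(parents, 2)
            pool = sorted(set(a) | set(b))
            child = set(rng.sample(pool, k))      # crossover: draw from union
            if rng.random() < 0.3:                # mutation: drop one index ...
                child.discard(rng.choice(sorted(child)))
            while len(child) < k:                 # ... and refill at random
                child.add(rng.randrange(n))
            children.append(tuple(sorted(child)))
        population = parents + children
    return min(population, key=misfit)

# Toy data: records 0 and 1 average exactly to the target spectrum.
spectra = [[1.0, 1.0], [3.0, 3.0], [10.0, 0.0], [0.0, 10.0]]
best = ga_select(spectra, target=[2.0, 2.0], k=2)
```

    The paper's approach additionally penalises deviation of each record's Np value from the target Np, which this sketch omits.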

  20. Refinement of the concept of uncertainty.

    Science.gov (United States)

    Penrod, J

    2001-04-01

    To analyse the conceptual maturity of uncertainty; to develop an expanded theoretical definition of uncertainty; to advance the concept using methods of concept refinement; and to analyse congruency with the conceptualization of uncertainty presented in the theory of hope, enduring, and suffering. Uncertainty is of concern in nursing as people experience complex life events surrounding health. In an earlier nursing study that linked the concepts of hope, enduring, and suffering into a single theoretical scheme, a state best described as 'uncertainty' arose. This study was undertaken to explore how this conceptualization fit with the scientific literature on uncertainty and to refine the concept. Initially, a concept analysis using advanced methods described by Morse, Hupcey, Mitcham and colleagues was completed. The concept was determined to be partially mature. A theoretical definition was derived and techniques of concept refinement using the literature as data were applied. The refined concept was found to be congruent with the concept of uncertainty that had emerged in the model of hope, enduring and suffering. Further investigation is needed to explore the extent of probabilistic reasoning and the effects of confidence and control on feelings of uncertainty and certainty.

  1. Refinement Checking on Parametric Modal Transition Systems

    DEFF Research Database (Denmark)

    Benes, Nikola; Kretínsky, Jan; Larsen, Kim Guldstrand

    2015-01-01

    Modal transition systems (MTS) is a well-studied specification formalism of reactive systems supporting a step-wise refinement methodology. Despite its many advantages, the formalism as well as its currently known extensions are incapable of expressing some practically needed aspects in the refin...

  2. Japan's refiner/marketers headed for major shakeout

    International Nuclear Information System (INIS)

    Anon.

    1996-01-01

    Japan's downstream oil industry is in a state of crisis and headed for a major shakeout. The major catalyst for this was a dramatic deregulation step during April 1996 that allowed refined petroleum product imports by non-refiners. The move, together with a sharp drop in refining margins, falling retail gasoline prices, and a service station sector on the brink of collapse, are all leading to massive changes in the way the country's refiners and marketers do business. This paper reviews the collapse of corporate profits during this period of deregulation; the development of a new price system geared toward bringing the prices of gasoline, fuel oil, and kerosene into line with each other to offset the fall in gasoline prices; and industry restructuring including mergers, acquisitions, and marketing consolidation. The paper then makes predictions on the outcome of these changes on the Japanese oil industry

  3. Ab-initio crystal structure analysis and refinement approaches of oligo p-benzamides based on electron diffraction data

    DEFF Research Database (Denmark)

    Gorelik, Tatiana E; van de Streek, Jacco; Kilbinger, Andreas F M

    2012-01-01

    Ab-initio crystal structure analysis of organic materials from electron diffraction data is presented. The data were collected using the automated electron diffraction tomography (ADT) technique. The structure solution and refinement route is first validated on the basis of the known crystal stru...

  4. Pilot scale refining of crude soybean oil | Mensah | Journal of ...

    African Journals Online (AJOL)

    Pilot scale refining of crude soybean oil. ... Abstract. A laboratory process for refining soybean oil has been scaled up to a 145 tonne per annum pilot plant to refine crude soybean oil. ... The quality of the refined oil was found to be within national and codex standard specifications for edible oil from vegetable sources.

  5. Feature Selection using Multi-objective Genetic Algorithm: A Hybrid Approach

    OpenAIRE

    Ahuja, Jyoti; GJUST - Guru Jambheshwar University of Science and Technology; Ratnoo, Saroj Dahiya; GJUST - Guru Jambheshwar University of Science and Technology

    2015-01-01

    Feature selection is an important pre-processing task for building accurate and comprehensible classification models. Several researchers have applied filter, wrapper or hybrid approaches using genetic algorithms which are good candidates for optimization problems that involve large search spaces like in the case of feature selection. Moreover, feature selection is an inherently multi-objective problem with many competing objectives involving size, predictive power and redundancy of the featu...

  6. Method of preparing an Al-Ti-B grain refiner for aluminium-comprising products, and a method of casting aluminium products

    OpenAIRE

    Brinkman, H.J.; Duszczyk, J.; Katgerman, L.

    1999-01-01

    The invention relates to a method of preparing an Al-Ti-B grain refiner for cast aluminium-comprising products. According to the invention the preparation is realized by mixing powders selected from the group comprising aluminium, titanium, boron, and alloys and intermetallic compounds thereof, compressing, heating in an inert environment until an exothermic reaction is initiated and cooling. It has been shown that when the grain refiner thus prepared is applied, the quality of cast products ...

  7. A Diagnostic Approach to Increase Reusable Dinnerware Selection in a Cafeteria

    Science.gov (United States)

    Manuel, Jennifer C.; Sunseri, Mary Anne; Olson, Ryan; Scolari, Miranda

    2007-01-01

    The current project tested a diagnostic approach to selecting interventions to increase patron selection of reusable dinnerware in a cafeteria. An assessment survey, completed by a sample of 43 patrons, suggested that the primary causes of wasteful behavior were (a) environmental arrangement of dinnerware options and (b) competing motivational…

  8. Cavitation-aided grain refinement in aluminium alloys

    NARCIS (Netherlands)

    Atamanenko, T.V.

    2010-01-01

    This thesis deals with grain refinement under the influence of ultrasonic-driven cavitation in aluminium casting processes. Three major goals of this research were: (1) to identify the mechanism of the cavitation-aided grain refinement at different stages of solidification; (2) to reveal the

  9. Accounting for linkage disequilibrium in genome scans for selection without individual genotypes: The local score approach.

    Science.gov (United States)

    Fariello, María Inés; Boitard, Simon; Mercier, Sabine; Robelin, David; Faraut, Thomas; Arnould, Cécile; Recoquillay, Julien; Bouchez, Olivier; Salin, Gérald; Dehais, Patrice; Gourichon, David; Leroux, Sophie; Pitel, Frédérique; Leterrier, Christine; SanCristobal, Magali

    2017-07-01

    Detecting genomic footprints of selection is an important step in the understanding of evolution. Accounting for linkage disequilibrium in genome scans increases detection power, but haplotype-based methods require individual genotypes and are not applicable on pool-sequenced samples. We propose to take advantage of the local score approach to account for linkage disequilibrium in genome scans for selection, cumulating (possibly small) signals from single markers over a genomic segment, to clearly pinpoint a selection signal. Using computer simulations, we demonstrate that this approach detects selection with higher power than several state-of-the-art single-marker, windowing or haplotype-based approaches. We illustrate this on two benchmark data sets including individual genotypes, for which we obtain similar results with the local score and one haplotype-based approach. Finally, we apply the local score approach to Pool-Seq data obtained from a divergent selection experiment on behaviour in quail and obtain precise and biologically coherent selection signals: while competing methods fail to highlight any clear selection signature, our method detects several regions involving genes known to act on social responsiveness or autistic traits. Although we focus here on the detection of positive selection from multiple population data, the local score approach is general and can be applied to other genome scans for selection or other genomewide analyses such as GWAS. © 2017 John Wiley & Sons Ltd.
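    The cumulative scheme described above corresponds to a Lindley-type local score: per-marker scores, shifted down by a tuning constant ξ, are accumulated with resets at zero, and the highest excursion pinpoints the candidate segment. A minimal sketch (the published method adds significance thresholds not shown here):

```python
def local_score(scores, xi):
    """Lindley (local score) process: add (score - xi) marker by marker,
    reset at zero, and return (segment start, peak position, peak height)."""
    h, start = 0.0, 0
    best = (0, 0, 0.0)
    for i, s in enumerate(scores):
        h += s - xi
        if h <= 0:
            h, start = 0.0, i + 1   # excursion ends; next one starts at i + 1
        elif h > best[2]:
            best = (start, i, h)    # record the highest excursion so far
    return best

# Per-marker scores (e.g. -log10 p-values along a chromosome); xi controls
# how local the detected segment is.
scores = [0.2, 0.1, 1.5, 2.0, 1.8, 0.1, 0.3, 2.5, 0.2]
start, peak, height = local_score(scores, xi=1.0)
```

    In the toy run, markers 2-4 carry moderate individual signals that only stand out once cumulated, which is exactly the situation the local score is designed for.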

  10. Accurate macromolecular crystallographic refinement: incorporation of the linear scaling, semiempirical quantum-mechanics program DivCon into the PHENIX refinement package.

    Science.gov (United States)

    Borbulevych, Oleg Y; Plumley, Joshua A; Martin, Roger I; Merz, Kenneth M; Westerhoff, Lance M

    2014-05-01

    Macromolecular crystallographic refinement relies on sometimes dubious stereochemical restraints and rudimentary energy functionals to ensure the correct geometry of the model of the macromolecule and any covalently bound ligand(s). The ligand stereochemical restraint file (CIF) requires a priori understanding of the ligand geometry within the active site, and creation of the CIF is often an error-prone process owing to the great variety of potential ligand chemistry and structure. Stereochemical restraints have been replaced with more robust functionals through the integration of the linear-scaling, semiempirical quantum-mechanics (SE-QM) program DivCon with the PHENIX X-ray refinement engine. The PHENIX/DivCon package has been thoroughly validated on a population of 50 protein-ligand Protein Data Bank (PDB) structures with a range of resolutions and chemistry. The PDB structures used for the validation were originally refined utilizing various refinement packages and were published within the past five years. PHENIX/DivCon does not utilize CIF(s), link restraints and other parameters for refinement and hence it does not make as many a priori assumptions about the model. Across the entire population, the method results in reasonable ligand geometries and low ligand strains, even when the original refinement exhibited difficulties, indicating that PHENIX/DivCon is applicable to both single-structure and high-throughput crystallography.

  11. Selecting appropriate wastewater treatment technologies using a choosing-by-advantages approach.

    Science.gov (United States)

    Arroyo, Paz; Molinos-Senante, María

    2018-06-01

    Selecting the most sustainable wastewater treatment (WWT) technology among possible alternatives is a very complex task because the choice must integrate economic, environmental, and social criteria. Traditionally, several multi-criteria decision-making approaches have been applied, with the most often used being the analytical hierarchical process (AHP). However, AHP allows users to offset poor environmental and/or social performance with low cost. To overcome this limitation, our study examines a choosing-by-advantages (CBA) approach to rank seven WWT technologies for secondary WWT. CBA results were compared with results obtained by using the AHP approach. The rankings of WWT alternatives differed, depending on whether the CBA or AHP approach was used, which highlights the importance of the method used to support decision-making processes, particularly ones that rely on subjective interpretations by experts. This paper uses a holistic perspective to demonstrate the benefits of using the CBA approach to support a decision-making process when a group of experts must come to a consensus in selecting the most suitable WWT technology among several available. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Liver segmentation in contrast enhanced CT data using graph cuts and interactive 3D segmentation refinement methods

    Energy Technology Data Exchange (ETDEWEB)

    Beichel, Reinhard; Bornik, Alexander; Bauer, Christian; Sorantin, Erich [Departments of Electrical and Computer Engineering and Internal Medicine, Iowa Institute for Biomedical Imaging, University of Iowa, Iowa City, Iowa 52242 (United States); Institute for Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, A-8010 Graz (Austria); Department of Electrical and Computer Engineering, Iowa Institute for Biomedical Imaging, University of Iowa, Iowa City, Iowa 52242 (United States); Department of Radiology, Medical University Graz, Auenbruggerplatz 34, A-8010 Graz (Austria)

    2012-03-15

    Purpose: Liver segmentation is an important prerequisite for the assessment of liver cancer treatment options like tumor resection, image-guided radiation therapy (IGRT), radiofrequency ablation, etc. The purpose of this work was to evaluate a new approach for liver segmentation. Methods: A graph cuts segmentation method was combined with a three-dimensional virtual reality based segmentation refinement approach. The developed interactive segmentation system allowed the user to manipulate volume chunks and/or surfaces instead of 2D contours in cross-sectional images (i.e., slice-by-slice). The method was evaluated on twenty routinely acquired portal-phase contrast enhanced multislice computed tomography (CT) data sets. An independent reference was generated by utilizing a currently clinically utilized slice-by-slice segmentation method. After 1 h of introduction to the developed segmentation system, three experts were asked to segment all twenty data sets with the proposed method. Results: Compared to the independent standard, the relative volumetric segmentation overlap error averaged over all three experts and all twenty data sets was 3.74%. Liver segmentation required on average 16 min of user interaction per case. The calculated relative volumetric overlap errors were not found to be significantly different [analysis of variance (ANOVA) test, p = 0.82] between experts who utilized the proposed 3D system. In contrast, the time required by each expert for segmentation was found to be significantly different (ANOVA test, p = 0.0009). Major differences between generated segmentations and independent references were observed in areas where vessels enter or leave the liver and no accepted criteria for defining liver boundaries exist. In comparison, slice-by-slice based generation of the independent standard utilizing a live wire tool took 70.1 min on average.
A standard 2D segmentation refinement approach applied to all twenty data sets required on average 38.2 min of

  13. Liver segmentation in contrast enhanced CT data using graph cuts and interactive 3D segmentation refinement methods

    International Nuclear Information System (INIS)

    Beichel, Reinhard; Bornik, Alexander; Bauer, Christian; Sorantin, Erich

    2012-01-01

    Purpose: Liver segmentation is an important prerequisite for the assessment of liver cancer treatment options like tumor resection, image-guided radiation therapy (IGRT), radiofrequency ablation, etc. The purpose of this work was to evaluate a new approach for liver segmentation. Methods: A graph cuts segmentation method was combined with a three-dimensional virtual reality based segmentation refinement approach. The developed interactive segmentation system allowed the user to manipulate volume chunks and/or surfaces instead of 2D contours in cross-sectional images (i.e., slice-by-slice). The method was evaluated on twenty routinely acquired portal-phase contrast enhanced multislice computed tomography (CT) data sets. An independent reference was generated by utilizing a currently clinically utilized slice-by-slice segmentation method. After 1 h of introduction to the developed segmentation system, three experts were asked to segment all twenty data sets with the proposed method. Results: Compared to the independent standard, the relative volumetric segmentation overlap error averaged over all three experts and all twenty data sets was 3.74%. Liver segmentation required on average 16 min of user interaction per case. The calculated relative volumetric overlap errors were not found to be significantly different [analysis of variance (ANOVA) test, p = 0.82] between experts who utilized the proposed 3D system. In contrast, the time required by each expert for segmentation was found to be significantly different (ANOVA test, p = 0.0009). Major differences between generated segmentations and independent references were observed in areas where vessels enter or leave the liver and no accepted criteria for defining liver boundaries exist. In comparison, slice-by-slice based generation of the independent standard utilizing a live wire tool took 70.1 min on average.
A standard 2D segmentation refinement approach applied to all twenty data sets required on average 38.2 min of

  14. Liver segmentation in contrast enhanced CT data using graph cuts and interactive 3D segmentation refinement methods.

    Science.gov (United States)

    Beichel, Reinhard; Bornik, Alexander; Bauer, Christian; Sorantin, Erich

    2012-03-01

    Liver segmentation is an important prerequisite for the assessment of liver cancer treatment options like tumor resection, image-guided radiation therapy (IGRT), radiofrequency ablation, etc. The purpose of this work was to evaluate a new approach for liver segmentation. A graph cuts segmentation method was combined with a three-dimensional virtual reality based segmentation refinement approach. The developed interactive segmentation system allowed the user to manipulate volume chunks and/or surfaces instead of 2D contours in cross-sectional images (i.e., slice-by-slice). The method was evaluated on twenty routinely acquired portal-phase contrast enhanced multislice computed tomography (CT) data sets. An independent reference was generated by utilizing a currently clinically utilized slice-by-slice segmentation method. After 1 h of introduction to the developed segmentation system, three experts were asked to segment all twenty data sets with the proposed method. Compared to the independent standard, the relative volumetric segmentation overlap error averaged over all three experts and all twenty data sets was 3.74%. Liver segmentation required on average 16 min of user interaction per case. The calculated relative volumetric overlap errors were not found to be significantly different [analysis of variance (ANOVA) test, p = 0.82] between experts who utilized the proposed 3D system. In contrast, the time required by each expert for segmentation was found to be significantly different (ANOVA test, p = 0.0009). Major differences between generated segmentations and independent references were observed in areas where vessels enter or leave the liver and no accepted criteria for defining liver boundaries exist. In comparison, slice-by-slice based generation of the independent standard utilizing a live wire tool took 70.1 min on average. A standard 2D segmentation refinement approach applied to all twenty data sets required on average 38.2 min of user interaction

  15. Optimization of bitumen-based upgrading and refining schemes

    Energy Technology Data Exchange (ETDEWEB)

    Munteanu, M.; Chen, J. [National Centre for Upgrading Technology, Devon, AB (Canada); Natural Resources Canada, Devon, AB (Canada). CanmetENERGY

    2009-07-01

    This poster highlighted the results of a study in which the entire refining scheme for Canadian bitumen feedstocks was modelled and simulated under different process configurations, operating conditions and product structures. The aim of the study was to optimize the economic benefits, product quality and energy use under a range of operational scenarios. Optimal refining schemes were proposed along with process conditions for existing refinery configurations and objectives. The goal was to provide guidelines and information for upgrading and refining process design and retrofitting. Critical steps in the upgrading process were identified. It was concluded that the information obtained from this study would lead to significant improvement in process performance and operations, and would reduce the capital cost for building new upgraders and refineries. The simulation results provided valuable information for increasing the marketability of bitumen and reducing greenhouse gas emissions and other environmental impacts associated with bitumen upgrading and refining. tabs., figs.

  16. 1991 worldwide refining and gas processing directory

    International Nuclear Information System (INIS)

    Anon.

    1990-01-01

    This book is an authority for immediate information on the industry. You can use it to find new business, analyze market trends, and stay in touch with existing contacts while making new ones. The possibilities for business applications are numerous. Arranged by country, all listings in the directory include address, phone, fax and telex numbers, a description of the company's activities, names of key personnel and their titles, corporate headquarters, branch offices and plant sites. This newly revised edition lists more than 2000 companies and nearly 3000 branch offices and plant locations. This easy-to-use reference also includes several of the most vital and informative surveys of the industry, including the U.S. Refining Survey; the Worldwide Construction Survey in Refining, Sulfur, Gas Processing and Related Fuels; the Worldwide Refining and Gas Processing Survey; the Worldwide Catalyst Report; and the U.S. and Canadian Lube and Wax Capacities Report from the National Petroleum Refiners Association.

  17. A Macdonald refined topological vertex

    Science.gov (United States)

    Foda, Omar; Wu, Jian-Feng

    2017-07-01

    We consider the refined topological vertex of Iqbal et al (2009 J. High Energy Phys. JHEP10(2009)069), as a function of two parameters (x, y), and deform it by introducing the Macdonald parameters (q, t), as in the work of Vuletić on plane partitions (Vuletić M 2009 Trans. Am. Math. Soc. 361 2789-804), to obtain ‘a Macdonald refined topological vertex’. In the limit q → t, we recover the refined topological vertex of Iqbal et al, and in the limit x → y, we obtain a qt-deformation of the original topological vertex of Aganagic et al (2005 Commun. Math. Phys. 25 425-78). Copies of the vertex can be glued to obtain qt-deformed 5D instanton partition functions that have well-defined 4D limits and, for generic values of (q, t), contain infinite towers of poles for every pole present in the limit q → t.

  18. Transbasal versus endoscopic endonasal versus combined approaches for olfactory groove meningiomas: importance of approach selection.

    Science.gov (United States)

    Liu, James K; Silva, Nicole A; Sevak, Ilesha A; Eloy, Jean Anderson

    2018-04-01

    OBJECTIVE There has been much debate regarding the optimal surgical approach for resecting olfactory groove meningiomas (OGMs). In this paper, the authors analyzed the factors involved in approach selection and reviewed the surgical outcomes in a series of OGMs. METHODS A retrospective review of 28 consecutive OGMs from a prospective database was conducted. Each tumor was treated via one of 3 approaches: transbasal approach (n = 15), pure endoscopic endonasal approach (EEA; n = 5), and combined (endoscope-assisted) transbasal-EEA (n = 8). RESULTS The mean tumor volume was greatest in the transbasal (92.02 cm³) and combined (101.15 cm³) groups. Both groups had significant lateral dural extension over the orbits (transbasal 73.3%). Near-total resection (> 95%) was achieved in 20% of transbasal and 37.5% of combined cases, all due to tumor adherence to the critical neurovascular structures. The rate of CSF leakage was 0% in the transbasal and combined groups, and there was 1 leak in the EEA group (20%), resulting in an overall CSF leakage rate of 3.6%. Olfaction was preserved in 66.7% in the transbasal group. There was no significant difference in length of stay or 30-day readmission rate between the 3 groups. The mean modified Rankin Scale score was 0.79 after the transbasal approach, 2.0 after EEA, and 2.4 after the combined approach (p = 0.0604). The mean follow-up was 14.5 months (range 1-76 months). CONCLUSIONS The transbasal approach provided the best clinical outcomes with the lowest rate of complications for large tumors (> 40 mm) and for smaller OGMs invading the sinonasal cavity. Careful patient selection using an individualized, tailored strategy is important to optimize surgical outcomes.

  19. Molecular genetics and livestock selection. Approaches, opportunities and risks

    International Nuclear Information System (INIS)

    Williams, J.L.

    2005-01-01

    Following domestication, livestock were selected both naturally through adaptation to their environments and by man so that they would fulfil a particular use. As selection methods have become more sophisticated, rapid progress has been made in improving those traits that are easily measured. However, selection has also resulted in decreased diversity. In some cases, improved breeds have replaced local breeds, risking the loss of important survival traits. The advent of molecular genetics provides the opportunity to identify the genes that control particular traits by a gene mapping approach. However, as with selection, the early mapping studies focused on traits that are easy to measure. Where molecular genetics can play a valuable role in livestock production is by providing the means to select effectively for traits that are difficult to measure. Identifying the genes underpinning particular traits requires a population in which these traits are segregating. Fortunately, several experimental populations have been created that have allowed a wide range of traits to be studied. Gene mapping work in these populations has shown that the role of particular genes in controlling variation in a given trait can depend on the genetic background. A second finding is that the most favourable alleles for a trait may in fact be present in animals that perform poorly for the trait. In the long term, knowledge of the genes controlling particular traits, and the way they interact with the genetic background, will allow introgression between breeds and the assembly of genotypes that are best suited to particular environments, producing animals with the desired characteristics. If used wisely, this approach will maintain genetic diversity while improving performance over a wide range of desired traits. (author)

  20. Dynamic quantum crystallography: lattice-dynamical models refined against diffraction data. II. Applications to L-alanine, naphthalene and xylitol.

    Science.gov (United States)

    Hoser, Anna A; Madsen, Anders Ø

    2017-03-01

    In the first paper of this series [Hoser & Madsen (2016). Acta Cryst. A72, 206-214], a new approach was introduced which enables the refinement of frequencies of normal modes obtained from ab initio periodic computations against single-crystal diffraction data. In this contribution, the performance of this approach is tested by refinement against data in the temperature range from 23 to 205 K on the molecular crystals of L-alanine, naphthalene and xylitol. The models, which are lattice-dynamical models derived at the Γ point of the Brillouin zone, are able to describe the atomic vibrations of L-alanine and naphthalene to a level where the residual densities are similar to those obtained from the independent atom model. For the more flexible molecule xylitol, larger deviations are found. Hydrogen ADPs (anisotropic displacement parameters) derived from the models are in similar or better agreement with neutron diffraction results than ADPs obtained by other procedures. The heat capacity calculated after normal mode refinement for naphthalene is in reasonable agreement with the heat capacity obtained from calorimetric measurements (to less than 1 cal mol⁻¹ K⁻¹ below 300 K), with deviations at higher temperatures indicating anharmonicity. Standard uncertainties and correlation of the refined parameters have been derived based on a Monte Carlo procedure. The uncertainties are quite small and probably underestimated.

  1. Exploiting structure similarity in refinement: automated NCS and target-structure restraints in BUSTER

    Energy Technology Data Exchange (ETDEWEB)

    Smart, Oliver S., E-mail: osmart@globalphasing.com; Womack, Thomas O.; Flensburg, Claus; Keller, Peter; Paciorek, Włodek; Sharff, Andrew; Vonrhein, Clemens; Bricogne, Gérard [Global Phasing Ltd, Sheraton House, Castle Park, Cambridge CB3 0AX (United Kingdom)

    2012-04-01

    Local structural similarity restraints (LSSR) provide a novel method for exploiting NCS or structural similarity to an external target structure. Two examples are given where BUSTER re-refinement of PDB entries with LSSR produces marked improvements, enabling further structural features to be modelled. Maximum-likelihood X-ray macromolecular structure refinement in BUSTER has been extended with restraints facilitating the exploitation of structural similarity. The similarity can be between two or more chains within the structure being refined, thus favouring NCS, or to a distinct ‘target’ structure that remains fixed during refinement. The local structural similarity restraints (LSSR) approach restrains all distances less than 5.5 Å between pairs of atoms in the chain. For each, the difference from the distance between the corresponding atoms in the related chain is found, and LSSR applies a restraint penalty on each difference. A functional form that reaches a plateau for large differences is used to avoid the restraints distorting parts of the structure that are not similar. Because LSSR are local, there is no need to separate out domains. Some restraint pruning is still necessary, but this has been automated. LSSR have been available to academic users of BUSTER since 2009 with the easy-to-use -autoncs and -target target.pdb options. The use of LSSR is illustrated in the re-refinement of PDB entries http://scripts.iucr.org/cgi-bin/cr.cgi?rm, where -target enables the correct ligand-binding structure to be found, and http://scripts.iucr.org/cgi-bin/cr.cgi?rm, where -autoncs contributes to the location of an additional copy of the cyclic peptide ligand.
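
    The plateau-shaped restraint described above can be sketched with a simple robust penalty. The quadratic-over-quadratic (Geman-McClure-style) form and the width parameter c below are illustrative assumptions, not BUSTER's actual functional form:

```python
def lssr_penalty(d_work, d_ref, c=1.0):
    """Penalty on the difference between an interatomic distance in the
    working chain (d_work) and the corresponding distance in the related
    chain (d_ref), both in Angstroms. The penalty grows roughly
    quadratically for small differences but flattens toward 1 for large
    ones, so genuinely dissimilar regions are not pulled together."""
    delta = d_work - d_ref
    return delta**2 / (delta**2 + c**2)

# A 0.05 A deviation is penalized only weakly ...
print(lssr_penalty(2.05, 2.0))
# ... while a 4 A deviation is already near the plateau value of 1.
print(lssr_penalty(6.0, 2.0))
```

    The bounded penalty is what makes per-pair restraints safe without first separating the structure into rigid domains: large genuine differences saturate instead of dominating the refinement target.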

  2. Quantum mechanics new approaches to selected topics

    CERN Document Server

    Lipkin, Harry Jeannot

    1973-01-01

    Acclaimed as "excellent" (Nature) and "very original and refreshing" (Physics Today), this collection of self-contained studies is geared toward advanced undergraduates and graduate students. Its broad selection of topics includes the Mössbauer effect, many-body quantum mechanics, scattering theory, Feynman diagrams, and relativistic quantum mechanics. Author Harry J. Lipkin, a well-known teacher at Israel's Weizmann Institute, takes an unusual approach by introducing many interesting physical problems and mathematical techniques at a much earlier point than in conventional texts. This meth

  3. Modified process for refining niobium by electron beam

    International Nuclear Information System (INIS)

    Pinatti, D.G.; Takano, C.

    1982-01-01

    The experimental results, thermodynamic equilibrium and kinetic theory of the metal/gas interaction in refractory metals are reviewed. The adsorption and desorption of nitrogen, hydrogen and CO are reversible, whereas those of oxygen are irreversible, with adsorption of an oxygen atom and volatilisation of the metal oxide. Based upon this fact, a new electron beam refining technology is proposed for niobium, consisting of four points: preparation of an electrode by aluminothermic reduction; zone refining in the first melt; kinetic refining in subsequent melts and compact design of the refining plant. Experimental results from a 300 kW pilot plant were in complete agreement with the technology proposed, giving 2.4 times the productivity predicted by the conventional technology. (Author) [pt

  4. Validation and Refinement of a Pain Information Model from EHR Flowsheet Data.

    Science.gov (United States)

    Westra, Bonnie L; Johnson, Steven G; Ali, Samira; Bavuso, Karen M; Cruz, Christopher A; Collins, Sarah; Furukawa, Meg; Hook, Mary L; LaFlamme, Anne; Lytle, Kay; Pruinelli, Lisiane; Rajchel, Tari; Settergren, Theresa Tess; Westman, Kathryn F; Whittenburg, Luann

    2018-01-01

    Secondary use of electronic health record (EHR) data can reduce the costs of research and quality reporting. However, EHR data must be consistent within and across organizations. Flowsheet data provide a rich source of interprofessional data and represent a high volume of documentation; however, content is not standardized. Health care organizations design and implement customized content for different care areas, creating duplicative, noncomparable data. In a prior study, 10 information models (IMs) were derived from an EHR that included 2.4 million patients. There was a need to evaluate the generalizability of the models across organizations. The pain IM was selected for evaluation and refinement because pain is a commonly occurring problem associated with high costs for pain management. The purpose of our study was to validate and further refine a pain IM from EHR flowsheet data that standardizes pain concepts, definitions, and associated value sets for assessments, goals, interventions, and outcomes. A retrospective observational study was conducted using an iterative consensus-based approach to map, analyze, and evaluate data from 10 organizations. The aggregated metadata from the EHRs of 8 large health care organizations and the design build in 2 additional organizations represented flowsheet data from 6.6 million patients, 27 million encounters, and 683 million observations. The final pain IM has 30 concepts, 4 panels (classes), and 396 value set items. Results are built on Logical Observation Identifiers Names and Codes (LOINC) pain assessment terms and extend the need for additional terms to support interoperability. The resulting pain IM is a consensus model based on actual EHR documentation in the participating health systems. The IM captures the most important concepts related to pain.

  5. Aesthetic refinements in reconstructive microsurgery of the lower leg.

    Science.gov (United States)

    Rainer, Christian; Schwabegger, Anton H; Gardetto, Alexander; Schoeller, Thomas; Hussl, Heribert; Ninkovic, Milomir M

    2004-02-01

    Even if a surgical procedure is performed for reconstructive and functional reasons, a plastic surgeon must be responsible for the visible result of the work and for the social reintegration of the patient; therefore, the aesthetic appearance of a microsurgically reconstructed lower leg must be considered. Based on the experience of 124 free-tissue transfers to the lower leg performed in 112 patients between January 1994 and March 2001 (110 [88.7 percent] were transferred successfully), three cases are presented. Considerations concerning flap selection and technical refinements in designing and tailoring microvascular flaps to improve the quality of reconstruction, also according to the aesthetic appearance, are discussed.

  6. Grain refinement of AZ31 magnesium alloy by electromagnetic ...

    Indian Academy of Sciences (India)

    Low-frequency electromagnetic field; AZ31 magnesium alloy; Al4C3; grain refinement. Abstract. The effects of electromagnetic stirring and Al4C3 grain refiner on the grain refinement of semicontinuously cast AZ31 magnesium alloy were discussed in this investigation. The results indicate that electromagnetic stirring has an ...

  7. Zone refining high-purity germanium

    International Nuclear Information System (INIS)

    Hubbard, G.S.; Haller, E.E.; Hansen, W.L.

    1977-10-01

    The effects of various parameters on germanium purification by zone refining have been examined. These parameters include the germanium container and container coatings, ambient gas and other operating conditions. Four methods of refining are presented which reproducibly yield 3.5 kg germanium ingots from which high-purity (|N_A - N_D| ≤ 2 × 10¹⁰ cm⁻³) single crystals can be grown. A qualitative model involving binary and ternary complexes of Si, O, B, and Al is shown to account for the behavior of impurities at these low concentrations.

  8. Model selection and inference a practical information-theoretic approach

    CERN Document Server

    Burnham, Kenneth P

    1998-01-01

    This book is unique in that it covers the philosophy of model-based data analysis and an omnibus strategy for the analysis of empirical data. The book introduces information-theoretic approaches and focuses critical attention on a priori modeling and the selection of a good approximating model that best represents the inference supported by the data. Kullback-Leibler information represents a fundamental quantity in science and is Hirotugu Akaike's basis for model selection. The maximized log-likelihood function can be bias-corrected to provide an estimate of expected, relative Kullback-Leibler information. This leads to Akaike's Information Criterion (AIC) and various extensions; these are relatively simple and easy to use in practice, but little taught in statistics classes and far less understood in the applied sciences than should be the case. The information-theoretic approaches provide a unified and rigorous theory, an extension of likelihood theory, an important application of information theory, and are ...
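
    The selection strategy the book describes can be illustrated with a small least-squares sketch: for Gaussian errors, AIC reduces to n·ln(RSS/n) + 2k with k estimated parameters, and the lowest-AIC candidate is retained. The polynomial data below are synthetic, invented purely for illustration:

```python
import numpy as np

def aic(y, y_hat, k):
    """AIC for a least-squares fit with k estimated parameters,
    using the Gaussian log-likelihood up to a constant: n*ln(RSS/n) + 2k."""
    n = len(y)
    rss = float(np.sum((y - y_hat) ** 2))
    return n * np.log(rss / n) + 2 * k

# Synthetic data from a quadratic model with small Gaussian noise.
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 60)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(0.0, 0.1, x.size)

# Candidate models: polynomials of degree 1 through 5.
scores = {}
for degree in range(1, 6):
    coeffs = np.polyfit(x, y, degree)
    scores[degree] = aic(y, np.polyval(coeffs, x), degree + 1)

best = min(scores, key=scores.get)
print(best, {d: round(s, 1) for d, s in scores.items()})
```

    The 2k term is what penalizes complexity: an extra parameter lowers AIC only if it improves the fit enough, which is why the underfit linear model scores far worse than the quadratic one here.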

  9. A Grey Fuzzy Logic Approach for Cotton Fibre Selection

    Science.gov (United States)

    Chakraborty, Shankar; Das, Partha Protim; Kumar, Vidyapati

    2017-06-01

    It is a well-known fact that the quality of ring-spun yarn predominantly depends on various physical properties of the cotton fibre. Any variation in these fibre properties may affect the strength and unevenness of the final yarn. Thus, to achieve the desired yarn quality and characteristics, it becomes imperative for spinning industry personnel to identify the most suitable cotton fibre from a set of feasible alternatives in the presence of several conflicting properties/attributes. This cotton fibre selection process can be modelled as a Multi-Criteria Decision Making (MCDM) problem. In this paper, a grey fuzzy logic-based approach is proposed for selection of the most apposite cotton fibre from 17 alternatives evaluated on six important fibre properties. It is observed that the preference order of the top-ranked cotton fibres derived using the grey fuzzy logic approach closely matches that attained by past researchers, which proves the application potentiality of this method in solving varying MCDM problems in textile industries.
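
    The paper's grey fuzzy method is not spelled out in the abstract, but the grey relational grade at the core of grey-based MCDM can be sketched as follows. The fibre data, the choice of benefit/cost criteria, and the distinguishing coefficient zeta = 0.5 are illustrative assumptions only:

```python
import numpy as np

def grey_relational_grades(matrix, benefit, zeta=0.5):
    """Classical grey relational analysis: normalize each criterion to
    [0, 1], measure each alternative's deviation from the ideal reference
    sequence (all ones), and average the grey relational coefficients."""
    m = np.asarray(matrix, dtype=float)
    norm = np.empty_like(m)
    for j in range(m.shape[1]):
        col = m[:, j]
        if benefit[j]:   # larger-the-better criterion
            norm[:, j] = (col - col.min()) / (col.max() - col.min())
        else:            # smaller-the-better criterion
            norm[:, j] = (col.max() - col) / (col.max() - col.min())
    delta = np.abs(1.0 - norm)  # deviation from the ideal sequence
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff.mean(axis=1)

# Hypothetical fibres scored on strength and length (benefit criteria)
# and short-fibre content (cost criterion); values are invented.
fibres = [[30.1, 29.0, 7.2],
          [27.5, 27.1, 9.8],
          [31.4, 28.2, 6.5]]
grades = grey_relational_grades(fibres, benefit=[True, True, False])
print(grades.argmax())  # index of the preferred fibre
```

    A higher grade means the alternative tracks the ideal sequence more closely across all criteria, which is the preference order the ranking is built on.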

  10. Chinese refining capacity for Canadian heavy oil

    International Nuclear Information System (INIS)

    Bruce, G.W.

    2006-01-01

    This paper discussed China's refining capacity in relation to exports of Canadian heavy oil. Demand for oil is increasing throughout the world, and China is expected to consume 25 per cent of the projected yearly oil supplies. Alberta currently has an estimated 174 billion barrels of recoverable bitumen, and produces 1.06 million barrels per day. Production is expected to increase to 4.5 million barrels per day by the year 2020. Currently bitumen blends are refined and diluted with naphtha and sweet synthetic crude oil. Bitumen is a challenging feedstock for refineries, and requires thermal production methods or gasification processes. Primary conversion into sour synthetic crude is typically followed by hydrocracking and further refining into finished petroleum products. There are currently 50 refineries in China with a 7.4 million barrel per day capacity. Coastal refineries using imported crude oil have a 4 million barrel per day capacity. New facilities are being constructed and existing plants are being upgraded in order to process heavier and more sour crude oils. However, current refining capabilities in Chinese refineries have a limited ability for resid conversion. It was concluded that while China has a refining infrastructure, only refineries on the coast will use oil sands-derived feedstocks. However, there are currently opportunities to design refineries to match future feedstocks. tabs., figs

  11. On macromolecular refinement at subatomic resolution with interatomic scatterers

    Energy Technology Data Exchange (ETDEWEB)

    Afonine, Pavel V., E-mail: pafonine@lbl.gov; Grosse-Kunstleve, Ralf W.; Adams, Paul D. [Lawrence Berkeley National Laboratory, One Cyclotron Road, BLDG 64R0121, Berkeley, CA 94720 (United States); Lunin, Vladimir Y. [Institute of Mathematical Problems of Biology, Russian Academy of Sciences, Pushchino 142290 (Russian Federation); Urzhumtsev, Alexandre [IGMBC, 1 Rue L. Fries, 67404 Illkirch and IBMC, 15 Rue R. Descartes, 67084 Strasbourg (France); Faculty of Sciences, Nancy University, 54506 Vandoeuvre-lès-Nancy (France); Lawrence Berkeley National Laboratory, One Cyclotron Road, BLDG 64R0121, Berkeley, CA 94720 (United States)

    2007-11-01

    Modelling deformation electron density using interatomic scatters is simpler than multipolar methods, produces comparable results at subatomic resolution and can easily be applied to macromolecules. A study of the accurate electron-density distribution in molecular crystals at subatomic resolution (better than ∼1.0 Å) requires more detailed models than those based on independent spherical atoms. A tool that is conventionally used in small-molecule crystallography is the multipolar model. Even at upper resolution limits of 0.8–1.0 Å, the number of experimental data is insufficient for full multipolar model refinement. As an alternative, a simpler model composed of conventional independent spherical atoms augmented by additional scatterers to model bonding effects has been proposed. Refinement of these mixed models for several benchmark data sets gave results that were comparable in quality with the results of multipolar refinement and superior to those for conventional models. Applications to several data sets of both small molecules and macromolecules are shown. These refinements were performed using the general-purpose macromolecular refinement module phenix.refine of the PHENIX package.

  12. Evaluation of peptide selection approaches for epitope‐based vaccine design

    DEFF Research Database (Denmark)

    Schubert, B.; Lund, Ole; Nielsen, Morten

    2013-01-01

    A major challenge in epitope-based vaccine (EV) design stems from the vast genomic variation of pathogens and the diversity of the host cellular immune system. Several computational approaches have been published to assist the selection of potential T cell epitopes for EV design. So far, no thoro......A major challenge in epitope-based vaccine (EV) design stems from the vast genomic variation of pathogens and the diversity of the host cellular immune system. Several computational approaches have been published to assist the selection of potential T cell epitopes for EV design. So far...... in terms of in silico measurements simulating important vaccine properties like the ability of inducing protection against a multivariant pathogen in a population; the predicted immunogenicity; pathogen, allele, and population coverage; as well as the conservation of selected epitopes. Additionally, we...... evaluate the use of human leukocyte antigen (HLA) supertypes with regards to their applicability for population-spanning vaccine design. The results showed that in terms of induced protection methods that simultaneously aim to optimize pathogen and HLA coverage significantly outperform methods focusing...

  13. Conforming to interface structured adaptive mesh refinement: 3D algorithm and implementation

    Science.gov (United States)

    Nagarajan, Anand; Soghrati, Soheil

    2018-03-01

    A new non-iterative mesh generation algorithm named conforming to interface structured adaptive mesh refinement (CISAMR) is introduced for creating 3D finite element models of problems with complex geometries. CISAMR transforms a structured mesh composed of tetrahedral elements into a conforming mesh with low element aspect ratios. The construction of the mesh begins with the structured adaptive mesh refinement of elements in the vicinity of material interfaces. An r-adaptivity algorithm is then employed to relocate selected nodes of nonconforming elements, followed by face-swapping a small fraction of them to eliminate tetrahedrons with high aspect ratios. The final conforming mesh is constructed by sub-tetrahedralizing remaining nonconforming elements, as well as tetrahedrons with hanging nodes. In addition to studying the convergence and analyzing element-wise errors in meshes generated using CISAMR, several example problems are presented to show the ability of this method for modeling 3D problems with intricate morphologies.

  14. Review of Grain Refinement of Cast Metals Through Inoculation: Theories and Developments

    Science.gov (United States)

    Liu, Zhilin

    2017-10-01

    The inoculation method of grain refinement is widely used in research and industry. Because of its commercial and engineering importance, extensive research on the mechanisms/theories of grain refinement and the development of effective grain refiners for diverse cast metals/alloys has been conducted. In 1999, Easton and St. John reviewed the mechanisms of grain refinement of cast Al alloys. Since then, grain refinement in alloys of Al, Mg, Fe, Ti, Cu, and Zn has advanced considerably. However, there is still no full consensus on the mechanisms/theories of grain refinement. Moreover, some new grain refiners developed on the basis of these theories do not ensure efficient grain refinement. Thus, the factors that contribute to grain refinement are still not fully understood, and outstanding issues need to be clarified in light of recent theories. This review covers multiple metals/alloys and developments in grain refinement from the last twenty years. The characteristics of effective grain refiners are considered from four perspectives: effective particle/matrix wetting configuration, sufficiently powerful segregating elements, preferential crystallographic matching, and geometrical features of effective nucleants. Then, recent mechanisms/theories on the grain refinement of cast metals/alloys are reviewed, including the peritectic-related, hypernucleation, inert nucleant, and constitutional supercooling-driven theories. Further, developments in deterministic and probabilistic modeling and in nucleation crystallography for the grain refinement of cast metals are reviewed. Finally, the latest progress in the grain refinement of cast Zn and its alloys is described, and future work on grain refinement is summarized.

  15. Refining the ischemic penumbra with topography.

    Science.gov (United States)

    Thirugnanachandran, Tharani; Ma, Henry; Singhal, Shaloo; Slater, Lee-Anne; Davis, Stephen M; Donnan, Geoffrey A; Phan, Thanh

    2018-04-01

    It has been 40 years since the ischemic penumbra was first conceptualized through work on animal models. The topography of the penumbra has been portrayed as an infarcted core surrounded by penumbral tissue and an extreme rim of oligemic tissue. This picture was used in many review articles and textbooks before the advent of modern imaging. In this paper, we review our understanding of the topography of the ischemic penumbra from the initial experimental animal models to current developments in neuroimaging, which have helped to further define the temporal and spatial evolution of the penumbra and refine our knowledge. The concept of the penumbra has been successfully applied in clinical trials of endovascular therapies with a time window as long as 24 h from onset. Further, there are reports of "good" outcome even in patients with a large ischemic core. This latter observation requires an understanding of the topography of the penumbra and the function of the infarcted regions. It is proposed that future research in this area depart from a time-dependent approach toward a more individualized, tissue- and location-based approach.

  16. Total antioxidant content of alternatives to refined sugar.

    Science.gov (United States)

    Phillips, Katherine M; Carlsen, Monica H; Blomhoff, Rune

    2009-01-01

    Oxidative damage is implicated in the etiology of cancer, cardiovascular disease, and other degenerative disorders. Recent nutritional research has focused on the antioxidant potential of foods, while current dietary recommendations are to increase the intake of antioxidant-rich foods rather than supplement specific nutrients. Many alternatives to refined sugar are available, including raw cane sugar, plant saps/syrups (eg, maple syrup, agave nectar), molasses, honey, and fruit sugars (eg, date sugar). Unrefined sweeteners were hypothesized to contain higher levels of antioxidants, similar to the contrast between whole and refined grain products. To compare the total antioxidant content of natural sweeteners as alternatives to refined sugar, the ferric-reducing ability of plasma (FRAP) assay was used to estimate total antioxidant capacity. Major brands of 12 types of sweeteners as well as refined white sugar and corn syrup were sampled from retail outlets in the United States. Substantial differences in total antioxidant content of different sweeteners were found. Refined sugar, corn syrup, and agave nectar contained minimal antioxidant activity, whereas raw cane sugar had a higher FRAP (0.1 mmol/100 g). Dark and blackstrap molasses had the highest FRAP (4.6 to 4.9 mmol/100 g), while maple syrup, brown sugar, and honey showed intermediate antioxidant capacity (0.2 to 0.7 mmol FRAP/100 g). Based on an average intake of 130 g/day refined sugars and the antioxidant activity measured in typical diets, substituting alternative sweeteners could increase antioxidant intake an average of 2.6 mmol/day, similar to the amount found in a serving of berries or nuts. Many readily available alternatives to refined sugar offer the potential benefit of antioxidant activity.

  17. An integrated MCDM approach to green supplier selection

    Directory of Open Access Journals (Sweden)

    Morteza Yazdani

    2014-06-01

    Supplier selection management has been considered an important subject for industrial organizations. In order to remain on the market, gain profitability and retain competitive advantage, business units need to establish an integrated and structured supplier selection system. In addition, environmental protection concerns have pushed organizations to adopt a green approach to the supplier selection problem. However, finding proper suppliers involves several variables and is a critically complex process. In this paper, the main attention is focused on finding the right supplier based on a fuzzy multi criteria decision making (MCDM) process. The weights of the criteria are calculated by the analytical hierarchical process (AHP) and the final ranking is achieved by the fuzzy technique for order preference by similarity to an ideal solution (TOPSIS). The advantage of TOPSIS over similar methods is that it selects the solution closest to the ideal one. The paper illustrates the approach with an example from an automobile manufacturing supply chain.
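
    As a sketch of the ranking step, here is a minimal crisp (non-fuzzy) TOPSIS; the supplier matrix, criteria, and weights are invented for illustration, and the paper's AHP weighting and fuzzification are omitted:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Crisp TOPSIS: score alternatives by relative closeness to the
    ideal solution (higher score = better alternative)."""
    m = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    # Vector normalization of each criterion column, then weighting.
    v = w * m / np.linalg.norm(m, axis=0)
    # Ideal and anti-ideal value per criterion (max/min swap for costs).
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti  = np.where(benefit, v.min(axis=0), v.max(axis=0))
    # Euclidean distances to ideal and anti-ideal solutions.
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)

# Hypothetical suppliers scored on quality, delivery (benefit) and cost.
suppliers = [[7, 9, 9],
             [8, 7, 8],
             [9, 6, 7]]
scores = topsis(suppliers, weights=[0.5, 0.3, 0.2],
                benefit=[True, True, False])
print(scores.argmax())  # index of the best-ranked supplier
```

    The closeness score lies in [0, 1], which is precisely the "best solution close to ideal solution" property the abstract cites as TOPSIS's advantage.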

  18. A Bayesian Approach to Model Selection in Hierarchical Mixtures-of-Experts Architectures.

    Science.gov (United States)

    Tanner, Martin A.; Peng, Fengchun; Jacobs, Robert A.

    1997-03-01

    There does not exist a statistical model that shows good performance on all tasks. Consequently, the model selection problem is unavoidable; investigators must decide which model is best at summarizing the data for each task of interest. This article presents an approach to the model selection problem in hierarchical mixtures-of-experts architectures. These architectures combine aspects of generalized linear models with those of finite mixture models in order to perform tasks via a recursive "divide-and-conquer" strategy. Markov chain Monte Carlo methodology is used to estimate the distribution of the architectures' parameters. One part of our approach to model selection attempts to estimate the worth of each component of an architecture so that relatively unused components can be pruned from the architecture's structure. A second part of this approach uses a Bayesian hypothesis testing procedure in order to differentiate inputs that carry useful information from nuisance inputs. Simulation results suggest that the approach presented here adheres to the dictum of Occam's razor; simple architectures that are adequate for summarizing the data are favored over more complex structures. Copyright 1997 Elsevier Science Ltd. All Rights Reserved.

  19. Investment in exploration-production and refining 2015

    International Nuclear Information System (INIS)

    Maisonnier, G.; Hureau, G.; Serbutoviez, S.; Silva, C.

    2016-01-01

    IFPEN analyses in this study the 2015 evolution of global investment in the field of exploration-production and refining: - Changes in oil and gas prices; - Investment in Exploration/Production: the end of an upward cycle; - Drilling and the global drilling market, upstream activities and markets; - 2015, a breath of fresh air for refining

  20. Selecting concepts for a concept-based curriculum: application of a benchmark approach.

    Science.gov (United States)

    Giddens, Jean Foret; Wright, Mary; Gray, Irene

    2012-09-01

    In response to a transformational movement in nursing education, faculty across the country are considering changes to curricula and approaches to teaching. As a result, an emerging trend in many nursing programs is the adoption of a concept-based curriculum. As part of the curriculum development process, the selection of concepts, competencies, and exemplars on which to build courses and base content is needed. This article presents a benchmark approach used to validate and finalize concept selection among educators developing a concept-based curriculum for a statewide nursing consortium. These findings are intended to inform other nurse educators who are currently involved with or are considering this curriculum approach. Copyright 2012, SLACK Incorporated.

  1. Creating a simplified model of oil refining and solution of the model using a linear programming method. Part 1, Model presentation

    Energy Technology Data Exchange (ETDEWEB)

    Duric, M; Novakovic, M; Stojkanovic, L

    1983-01-01

    Based on a detailed analysis of the technological limitations in oil refining at an oil refinery (NPZ), covering the separation of petroleum products, quality specifications and market limitations, a general model of oil refining is created. The number of actual variables is reduced to a minimum, while the number of linear limitations is adapted to the selected technology and raw material quality. The nonlinearity of certain characteristics is overcome using the so-called movement factor.

  2. An evaluation of the effects of the tax on refined petroleum products in the Philippines

    International Nuclear Information System (INIS)

    Uri, N.D.; Boyd, R.

    1993-01-01

    This paper uses an aggregate modelling approach to assess the effect of taxes on refined petroleum products on the Philippine economy. The approach used in the analysis consists of a general equilibrium model comprising 14 producing sectors, 14 consuming sectors, 3 household categories classified by income, and the government. The effects of removing the 48% tax on premium and regular gasoline and the 24% tax on other refined petroleum products on prices and quantities are examined. The results are revealing. For example, the consequences of a complete elimination of refined petroleum product taxes would be an increase in output by all producing sectors of about 3.7% or about 2.65 hundred billion Philippine pesos, a rise in the consumption of goods and services by about 13.6% or 4.2 hundred billion Philippine pesos, a rise in total utility by 14.3% or 4.5 hundred billion Philippine pesos and lower tax revenue for the government of 62.4% or 2.8 hundred billion Philippine pesos. When subjected to a sensitivity analysis, the results are reasonably robust with regard to the assumed values of the substitution elasticities. That is, while the model's equilibrium values do vary in response to different assumptions about the values of these elasticities, the fluctuations are not so large as to suggest that the model is unrealistically sensitive to these parameters. (Author)
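    The paired percentage and absolute figures reported above are internally consistent, and dividing each absolute change by its percentage change recovers the implied baseline level of each aggregate. A quick arithmetic sketch (figures from the abstract; "hundred billion" read as 1e11 pesos):

```python
# Consistency check of the reported aggregates: absolute change divided
# by percentage change yields the implied baseline level.
changes = {                      # (percent change, absolute change in pesos)
    "sectoral output": (3.7, 2.65e11),
    "consumption":     (13.6, 4.2e11),
    "total utility":   (14.3, 4.5e11),
    "tax revenue":     (62.4, 2.8e11),
}

baselines = {name: abs_change / (pct / 100)
             for name, (pct, abs_change) in changes.items()}

for name, level in baselines.items():
    print(f"{name}: ~{level:.2e} pesos")
```

    For instance, a 3.7% output gain worth 2.65e11 pesos implies total baseline sectoral output of roughly 7.2e12 pesos.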

  3. Hirshfeld atom refinement.

    Science.gov (United States)

    Capelli, Silvia C; Bürgi, Hans-Beat; Dittrich, Birger; Grabowsky, Simon; Jayatilaka, Dylan

    2014-09-01

    Hirshfeld atom refinement (HAR) is a method which determines structural parameters from single-crystal X-ray diffraction data by using an aspherical atom partitioning of tailor-made ab initio quantum mechanical molecular electron densities without any further approximation. Here the original HAR method is extended by implementing an iterative procedure of successive cycles of electron density calculations, Hirshfeld atom scattering factor calculations and structural least-squares refinements, repeated until convergence. The importance of this iterative procedure is illustrated via the example of crystalline ammonia. The new HAR method is then applied to X-ray diffraction data of the dipeptide Gly-l-Ala measured at 12, 50, 100, 150, 220 and 295 K, using Hartree-Fock and BLYP density functional theory electron densities and three different basis sets. All positions and anisotropic displacement parameters (ADPs) are freely refined without constraints or restraints - even those for hydrogen atoms. The results are systematically compared with those from neutron diffraction experiments at the temperatures 12, 50, 150 and 295 K. Although non-hydrogen-atom ADPs differ by up to three combined standard uncertainties (csu's), all other structural parameters agree within less than 2 csu's. Using our best calculations (BLYP/cc-pVTZ, recommended for organic molecules), the accuracy of determining bond lengths involving hydrogen atoms from HAR is better than 0.009 Å for temperatures of 150 K or below; for hydrogen-atom ADPs it is better than 0.006 Å(2) as judged from the mean absolute X-ray minus neutron differences. These results are among the best ever obtained. Remarkably, the precision of determining bond lengths and ADPs for the hydrogen atoms from the HAR procedure is comparable with that from the neutron measurements - an outcome which is obtained with a routinely achievable resolution of the X-ray data of 0.65 Å.

  4. Biomolecular structure refinement using the GROMOS simulation software

    International Nuclear Information System (INIS)

    Schmid, Nathan; Allison, Jane R.; Dolenc, Jožica; Eichenberger, Andreas P.; Kunz, Anna-Pitschna E.; Gunsteren, Wilfred F. van

    2011-01-01

    For the understanding of cellular processes the molecular structure of biomolecules has to be accurately determined. Initial models can be significantly improved by structure refinement techniques. Here, we present the refinement methods and analysis techniques implemented in the GROMOS software for biomolecular simulation. The methodology and some implementation details of the computation of NMR NOE data, 3J-couplings and residual dipolar couplings, X-ray scattering intensities from crystals and solutions and neutron scattering intensities used in GROMOS is described and refinement strategies and concepts are discussed using example applications. The GROMOS software allows structure refinement combining different types of experimental data with different types of restraining functions, while using a variety of methods to enhance conformational searching and sampling and the thermodynamically calibrated GROMOS force field for biomolecular simulation.
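    The restraining idea described above, combining a physical force field with penalty terms for disagreement with experimental data, can be sketched minimally. This is not the GROMOS functional form (GROMOS offers several, including time-averaged restraints); it is a generic half-harmonic NOE upper-bound penalty with an assumed force constant:

```python
# Minimal sketch of experimental-data restraining: a half-harmonic
# penalty added to the physical force-field energy whenever a model
# distance exceeds an NOE-derived upper bound. Functional form and
# force constant are illustrative, not GROMOS's actual choices.
def noe_restraint_energy(d_model: float, d_upper: float, k: float = 1000.0) -> float:
    """Penalty (kJ/mol) for violating an NOE upper-bound distance (nm)."""
    violation = max(0.0, d_model - d_upper)
    return 0.5 * k * violation ** 2

def total_energy(e_forcefield: float, restraints) -> float:
    """Hybrid target: physical energy plus all restraint penalties."""
    return e_forcefield + sum(noe_restraint_energy(d, d0) for d, d0 in restraints)

# A satisfied restraint costs nothing; a 0.05 nm violation is penalised.
print(noe_restraint_energy(0.30, 0.35))  # 0.0
print(noe_restraint_energy(0.40, 0.35))  # 1.25
```

    Minimising (or sampling) this hybrid target is what pulls the refined structure toward agreement with the measured data without abandoning the force field.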

  5. Biomolecular structure refinement using the GROMOS simulation software

    Energy Technology Data Exchange (ETDEWEB)

    Schmid, Nathan; Allison, Jane R.; Dolenc, Jozica; Eichenberger, Andreas P.; Kunz, Anna-Pitschna E.; Gunsteren, Wilfred F. van, E-mail: wfvgn@igc.phys.chem.ethz.ch [Swiss Federal Institute of Technology ETH, Laboratory of Physical Chemistry (Switzerland)

    2011-11-15

    For the understanding of cellular processes the molecular structure of biomolecules has to be accurately determined. Initial models can be significantly improved by structure refinement techniques. Here, we present the refinement methods and analysis techniques implemented in the GROMOS software for biomolecular simulation. The methodology and some implementation details of the computation of NMR NOE data, 3J-couplings and residual dipolar couplings, X-ray scattering intensities from crystals and solutions and neutron scattering intensities used in GROMOS is described and refinement strategies and concepts are discussed using example applications. The GROMOS software allows structure refinement combining different types of experimental data with different types of restraining functions, while using a variety of methods to enhance conformational searching and sampling and the thermodynamically calibrated GROMOS force field for biomolecular simulation.

  6. Pakistan stepping up expansion of refining, transportation sectors

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    This paper reports that Pakistan is taking steps to speed expansion of its refining and oil transportation infrastructure. While the country has made significant progress toward energy self-sufficiency by boosting oil and gas production, it still must modernize and expand an aging, inadequate refining sector to meet rapidly growing demand for refined products. Pakistan's government has disclosed plans to build two refineries in the country, one at Rawalpindi near a string of recent oil discoveries, the other somewhere in the southern part of the country, likely Karachi. At the same time, efforts are proceeding to upgrade Pakistan's refineries. In addition, Pakistani state companies continue to press joint ventures in refining and marketing with foreign companies and to expand downstream ties with neighbors that are key oil and gas exporters.

  7. New Process for Grain Refinement of Aluminum. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Joseph A. Megy

    2000-09-22

    A new method of grain refining aluminum involving in-situ formation of boride nuclei in molten aluminum just prior to casting has been developed in the subject DOE program over the last thirty months by a team consisting of JDC, Inc., Alcoa Technical Center, GRAS, Inc., Touchstone Labs, and GKS Engineering Services. The manufacturing process to make boron trichloride for grain refining is much simpler than preparing conventional grain refiners, with attendant environmental, capital, and energy savings. The manufacture of boride grain-refining nuclei using the fy-Gem process avoids the clusters, salt and oxide inclusions that cause quality problems in aluminum today.

  8. Selection bias in species distribution models: An econometric approach on forest trees based on structural modeling

    Science.gov (United States)

    Martin-StPaul, N. K.; Ay, J. S.; Guillemot, J.; Doyen, L.; Leadley, P.

    2014-12-01

    Species distribution models (SDMs) are widely used to study and predict the outcome of global changes on species. In human dominated ecosystems the presence of a given species is the result of both its ecological suitability and human footprint on nature such as land use choices. Land use choices may thus be responsible for a selection bias in the presence/absence data used in SDM calibration. We present a structural modelling approach (i.e. based on structural equation modelling) that accounts for this selection bias. The new structural species distribution model (SSDM) estimates simultaneously land use choices and species responses to bioclimatic variables. A land use equation based on an econometric model of landowner choices was joined to an equation of species response to bioclimatic variables. SSDM allows the residuals of both equations to be dependent, taking into account the possibility of shared omitted variables and measurement errors. We provide a general description of the statistical theory and a set of applications on forest trees over France using databases of climate and forest inventory at different spatial resolutions (from 2 km to 8 km). We also compared the outputs of the SSDM with outputs of a classical SDM (i.e. Biomod ensemble modelling) in terms of bioclimatic response curves and potential distributions under current climate and climate change scenarios. The shapes of the bioclimatic response curves and the modelled species distribution maps differed markedly between SSDM and classical SDMs, with contrasted patterns according to species and spatial resolutions. The magnitude and directions of these differences were dependent on the correlations between the errors from both equations and were highest for higher spatial resolutions. A first conclusion is that the use of classical SDMs can potentially lead to strong mis-estimation of the actual and future probability of presence modelled.
Beyond this selection bias, the SSDM we propose represents

  9. Improvement of neutronic calculations on a Masurca core using adaptive mesh refinement capabilities

    International Nuclear Information System (INIS)

    Fournier, D.; Archier, P.; Le Tellier, R.; Suteau, C.

    2011-01-01

    The simulation of 3D cores with homogenized assemblies in transport theory remains time and memory consuming for production calculations. With a multigroup discretization for the energy variable and a discrete ordinate method for the angle, a system of about 10^4 coupled hyperbolic transport equations has to be solved. For these equations, we intend to optimize the spatial discretization. In the framework of the SNATCH solver used in this study, the spatial problem is dealt with by using a structured hexahedral mesh and applying a Discontinuous Galerkin Finite Element Method (DGFEM). This paper shows the improvements due to the development of Adaptive Mesh Refinement (AMR) methods. As the SNATCH solver uses a hierarchical polynomial basis, p-refinement is possible, but also h-refinement thanks to non-conforming capabilities. Besides, as the flux spatial behavior is highly dependent on the energy, we propose to adapt the spatial discretization differently according to the energy group. To avoid dealing with too many meshes, some energy groups are joined and share the same mesh. The different energy-dependent AMR strategies are compared to each other but also with the classical approach of a conforming and highly refined spatial mesh. This comparison is carried out on different quantities such as the multiplication factor, the flux or the current. The gain in time and memory is shown for 2D and 3D benchmarks coming from the ZONA2B experimental core configuration of the MASURCA mock-up at CEA Cadarache. (author)
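    The idea of letting several energy groups share one adapted mesh can be sketched abstractly. The sketch below clusters groups by a scalar error indicator so that each cluster shares a mesh; the greedy rule, indicators and tolerance are illustrative assumptions, not the paper's strategy:

```python
# Sketch of an energy-dependent AMR strategy: energy groups with
# similar spatial flux behaviour (summarised here by one scalar
# "error indicator" per group) are clustered so that each cluster
# shares a single adapted mesh instead of each group owning one.
def assign_meshes(indicators, tolerance=0.5):
    """Greedy clustering: consecutive groups share a mesh while their
    indicators stay within `tolerance` of the cluster's first group."""
    clusters, current = [], [0]
    for g in range(1, len(indicators)):
        if abs(indicators[g] - indicators[current[0]]) <= tolerance:
            current.append(g)
        else:
            clusters.append(current)
            current = [g]
    clusters.append(current)
    return clusters

# 6 energy groups: the fast groups behave alike, the thermal groups alike,
# so two shared meshes suffice instead of six.
print(assign_meshes([0.9, 1.0, 1.1, 2.4, 2.5, 2.6]))  # [[0, 1, 2], [3, 4, 5]]
```

    The trade-off is exactly the one the abstract names: fewer meshes to store and update, at the price of each shared mesh being refined for the worst-behaved group in its cluster.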

  10. An efficient Adaptive Mesh Refinement (AMR) algorithm for the Discontinuous Galerkin method: Applications for the computation of compressible two-phase flows

    Science.gov (United States)

    Papoutsakis, Andreas; Sazhin, Sergei S.; Begg, Steven; Danaila, Ionut; Luddens, Francky

    2018-06-01

    We present an Adaptive Mesh Refinement (AMR) method suitable for hybrid unstructured meshes that allows for local refinement and de-refinement of the computational grid during the evolution of the flow. The adaptive implementation of the Discontinuous Galerkin (DG) method introduced in this work (ForestDG) is based on a topological representation of the computational mesh by a hierarchical structure consisting of oct-, quad- and binary trees. Adaptive mesh refinement (h-refinement) enables us to increase the spatial resolution of the computational mesh in the vicinity of points of interest such as interfaces, geometrical features, or flow discontinuities. The local increase in the expansion order (p-refinement) at areas of high strain rates or vorticity magnitude results in an increase of the order of accuracy in the region of shear layers and vortices. A graph of unitarian-trees, representing hexahedral, prismatic and tetrahedral elements, is used for the representation of the initial domain. The ancestral elements of the mesh can be split into self-similar elements, allowing each tree to grow branches to an arbitrary level of refinement. The connectivity of the elements, their genealogy and their partitioning are described by linked lists of pointers. An explicit calculation of these relations, presented in this paper, facilitates the on-the-fly splitting, merging and repartitioning of the computational mesh by rearranging the links of each node of the tree with a minimal computational overhead. The modal basis used in the DG implementation facilitates the mapping of the fluxes across the non-conformal faces. The AMR methodology is presented and assessed using a series of inviscid and viscous test cases. Also, the AMR methodology is used for the modelling of the interaction between droplets and the carrier phase in a two-phase flow. This approach is applied to the analysis of a spray injected into a chamber of quiescent air, using the Eulerian
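    The tree mechanics behind this kind of h-refinement and de-refinement can be sketched with a 2D quadtree toy. This is only the splitting/merging skeleton; the real solver works on oct-, quad- and binary trees over hybrid 3D elements with linked pointer lists, which this sketch does not attempt to reproduce:

```python
# Toy quadtree: any leaf can split into four self-similar children
# (h-refinement) and a parent of four leaf children can merge them
# back (de-refinement), so the active mesh is always the leaf set.
class Cell:
    def __init__(self, level=0):
        self.level = level
        self.children = []       # empty list => this cell is a leaf

    def refine(self):
        if not self.children:
            self.children = [Cell(self.level + 1) for _ in range(4)]

    def derefine(self):
        if self.children and all(not c.children for c in self.children):
            self.children = []

    def leaves(self):
        if not self.children:
            return [self]
        return [leaf for c in self.children for leaf in c.leaves()]

root = Cell()
root.refine()                    # 4 leaves
root.children[0].refine()        # locally refine one child -> 7 leaves
print(len(root.leaves()))        # 7
root.children[0].derefine()      # de-refine it again
print(len(root.leaves()))        # 4
```

    Neighbouring leaves on different levels are exactly the non-conformal faces the abstract mentions; a DG modal basis makes mapping fluxes across them tractable.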

  11. Technological innovation and environmental regulation at the petroleum refining industry: the Paulinia refinery case; Inovacao tecnologica e regulacao ambiental na industria de refino de petroleo: o caso da Refinaria de Paulinia

    Energy Technology Data Exchange (ETDEWEB)

    Azevedo, Adalberto Mantovani Martiniano de; Pereira, Newton Mueller [Universidade Estadual de Campinas (UNICAMP), Campinas, SP (Brazil). Inst. de Geociencias. Dept. de Politica Cientifica e Tecnologica]. E-mails: adalba@ige.unicamp.br; newpe@ige.unicamp.br

    2006-07-01

    This article discusses the influence of environmental regulation on the adoption of new production techniques and on the improvement of existing techniques in the petroleum refining industry, namely at the Paulinia refinery (REPLAN). It describes the techniques adopted in order to fit refining processes to the regulation of environmental impacts (related to the protection of resources such as water, air and soil), as well as techniques adopted in order to produce less pollutant diesel and gasoline. The article is supported by bibliographic research and data collected at REPLAN and CENPES, which permit the characterization of the technologies adopted at REPLAN at the end of the 1990s and the regulatory rules that drove them. Regulation is presented under an evolutionary approach: considering that technology develops along with the socio-economic context, environmental regulation is a related element which determines the search for and selection of technologies able to comply with regulation while ensuring economic viability. Regulation is also a determinant factor for the adoption of innovations in the refining industry. Specifically at REPLAN, environmental regulation has required large investments in order to bring processes and products into compliance with the established standards. (author)

  12. Method of optimization of the natural gas refining process

    Energy Technology Data Exchange (ETDEWEB)

    Sadykh-Zade, E.S.; Bagirov, A.A.; Mardakhayev, I.M.; Razamat, M.S.; Tagiyev, V.G.

    1980-01-01

    The SATUM (automatic control system of technical operations) system introduced at the Shatlyk field should assure good quality of gas refining. In order to optimize the natural gas refining processes, an experimental-analytical method is used in compiling the mathematical descriptions. The program, written in Fortran, gives, in addition to the parameters of optimal conditions, information on the yield of condensate and water, the concentration and consumption of DEG, and the composition and characteristics of the gas and condensate. The algorithm for calculating optimum engineering conditions of gas refining is proposed to be used in ''advice'' mode, and also for monitoring progress of the gas refining process.

  13. Optimal algebraic multilevel preconditioning for local refinement along a line

    NARCIS (Netherlands)

    Margenov, S.D.; Maubach, J.M.L.

    1995-01-01

    The application of some recently proposed algebraic multilevel methods for the solution of two-dimensional finite element problems on nonuniform meshes is studied. The locally refined meshes are created by the newest vertex mesh refinement method. After the introduction of this refinement technique

  14. FPGA Congestion-Driven Placement Refinement

    Energy Technology Data Exchange (ETDEWEB)

    Vicente de, J.

    2005-07-01

    The routing congestion usually limits the complete proficiency of the FPGA logic resources. A key question can be formulated regarding the benefits of estimating the congestion at the placement stage. In recent years, the idea of a detailed placement that takes congestion into account has been gaining acceptance. In this paper, we resort to the Thermodynamic Simulated Annealing (TSA) algorithm to perform a congestion-driven placement refinement on top of the common Bounding-Box pre-optimized solution. The adaptive properties of TSA allow the search to preserve the solution quality of the pre-optimized solution while improving other fine-grain objectives. Regarding the cost function, two approaches have been considered. In the first one, Expected Occupation (EO), a detailed probabilistic model to account for channel congestion is evaluated. We show that in spite of the minute detail of EO, the inherent uncertainty of this probabilistic model prevents relieving congestion beyond the sole application of the Bounding-Box cost function. In the second approach we resort to the fast Rectilinear Steiner Regions algorithm to perform not an estimation but a measurement of the global routing congestion. This second strategy allows us to successfully reduce the requested channel width for a set of benchmark circuits with respect to the widespread Versatile Place and Route (VPR) tool. (Author) 31 refs.
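    The refinement loop described above, annealing cell swaps against a combined wirelength-plus-congestion cost, can be sketched generically. This is plain simulated annealing with a fixed cooling schedule and a toy congestion term; TSA's thermodynamic temperature control and the EO/Steiner-region congestion models are not reproduced, and all names, weights and data are illustrative:

```python
import math
import random

random.seed(1)

def cost(placement, nets, congestion_weight=0.3):
    """Half-perimeter wirelength plus a crude per-column congestion term."""
    wirelength = 0.0
    usage = {}
    for net in nets:
        xs = [placement[c][0] for c in net]
        ys = [placement[c][1] for c in net]
        wirelength += (max(xs) - min(xs)) + (max(ys) - min(ys))  # bounding box
        for x in range(min(xs), max(xs) + 1):
            usage[x] = usage.get(x, 0) + 1
    congestion = sum(u * u for u in usage.values())  # penalise hot columns
    return wirelength + congestion_weight * congestion

def refine(placement, nets, temp=2.0, cooling=0.95, steps=500):
    """Anneal random cell-position swaps, keeping the best placement seen."""
    current = dict(placement)
    current_cost = cost(current, nets)
    best, best_cost = dict(current), current_cost
    cells = list(current)
    for _ in range(steps):
        a, b = random.sample(cells, 2)
        current[a], current[b] = current[b], current[a]
        new_cost = cost(current, nets)
        delta = new_cost - current_cost
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current_cost = new_cost             # accept (possibly uphill) move
            if new_cost < best_cost:
                best, best_cost = dict(current), new_cost
        else:
            current[a], current[b] = current[b], current[a]  # revert the swap
        temp *= cooling
    return best

cells = {"c0": (0, 0), "c1": (3, 0), "c2": (1, 1), "c3": (2, 2)}
nets = [("c0", "c1"), ("c1", "c2", "c3")]
before = cost(dict(cells), nets)
after = cost(refine(dict(cells), nets), nets)
print(after <= before)  # best-so-far tracking never returns a worse placement
```

    Accepting occasional uphill swaps at high temperature is what lets the refinement escape the local optimum of the wirelength-only pre-optimized placement.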

  15. RBT—A Tool for Building Refined Buneman Trees

    DEFF Research Database (Denmark)

    Besenbacher, Søren; Mailund; Westh-Nielsen, Lasse

    2005-01-01

    We have developed a tool implementing an efficient algorithm for refined Buneman tree reconstruction. The algorithm, which has the same complexity as the neighbour-joining method and the (plain) Buneman tree construction, enables refined Buneman tree reconstruction on large taxa sets.

  16. Refining a self-assessment of informatics competency scale using Mokken scaling analysis.

    Science.gov (United States)

    Yoon, Sunmoo; Shaffer, Jonathan A; Bakken, Suzanne

    2015-01-01

    Healthcare environments are increasingly implementing health information technology (HIT) and those from various professions must be competent to use HIT in meaningful ways. In addition, HIT has been shown to enable interprofessional approaches to health care. The purpose of this article is to describe the refinement of the Self-Assessment of Nursing Informatics Competencies Scale (SANICS) using analytic techniques based upon item response theory (IRT) and discuss its relevance to interprofessional education and practice. In a sample of 604 nursing students, the 93-item version of SANICS was examined using non-parametric IRT. The iterative modeling procedure included 31 steps comprising: (1) assessing scalability, (2) assessing monotonicity, (3) assessing invariant item ordering, and (4) expert input. SANICS was reduced to an 18-item hierarchical scale with excellent reliability. Fundamental skills for team functioning and shared decision making among team members (e.g. "using monitoring systems appropriately," "describing general systems to support clinical care") had the highest level of difficulty, and "demonstrating basic technology skills" had the lowest difficulty level. Most items reflect informatics competencies relevant to all health professionals. Further, the approaches can be applied to construct a new hierarchical scale or refine an existing scale related to informatics attitudes or competencies for various health professions.
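    Step (1) of the iterative procedure above, assessing scalability, centres on Loevinger's H coefficient, the quantity Mokken scaling checks against conventional thresholds (H >= 0.3 is a common rule of thumb). A minimal sketch for dichotomous items; the response matrix is made up for illustration:

```python
# Loevinger's scalability coefficient H for dichotomous items:
# the sum of observed item-pair covariances divided by the sum of
# their maxima given the item marginals. H = 1 iff the responses
# form a perfect Guttman pattern (no Guttman errors).
def loevinger_h(responses):
    n = len(responses)            # respondents
    k = len(responses[0])         # items
    p = [sum(r[i] for r in responses) / n for i in range(k)]
    num = den = 0.0
    for i in range(k):
        for j in range(i + 1, k):
            p_ij = sum(r[i] * r[j] for r in responses) / n
            num += p_ij - p[i] * p[j]                 # observed covariance
            den += min(p[i], p[j]) - p[i] * p[j]      # maximum covariance
    return num / den

# A perfect Guttman pattern (each respondent passes a prefix of the
# items, ordered from easiest to hardest) yields H = 1.
guttman = [[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]
print(loevinger_h(guttman))  # 1.0
```

    Dropping items (or, as in the article, whole sets of items across 31 modeling steps) until the remaining set scales acceptably is what reduces a 93-item pool to an 18-item hierarchical scale.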

  17. India's refining prospects linked to economic growth

    International Nuclear Information System (INIS)

    Lewis, E.

    1996-01-01

    International investors assess refining ventures in India the same way they do comparable projects elsewhere in the world: according to their expectations about investment returns. By that standard, India's appeal is mixed, although its need for some measure of additional refining capacity seems certain. The success of future refinery investments will depend heavily on the government's commitment to policies allowing the economy to grow faster than the population. Unless accompanied by economic growth, expected increases in the population will not automatically raise demand for petroleum products. Decisions about investments in India's refining sector, therefore, must carefully weigh market fundamentals, the business environment, and likely investment performance. This paper reviews the market for the various products and predicts new economic trends

  18. Fuzzy hybrid MCDM approach for selection of wind turbine service technicians

    Directory of Open Access Journals (Sweden)

    Goutam Kumar Bose

    2016-01-01

    Full Text Available This research paper presents a fuzzy hybrid multi-criteria decision making (MCDM) methodology for selecting employees. The study takes a tactical viewpoint, combining multiple fuzzy MCDM techniques to support the recruitment process for wind turbine service technicians. The methodology is based on the application of Fuzzy ARAS (Additive Ratio Assessment) and Fuzzy MOORA (Multi-Objective Optimization on the basis of Ratio Analysis), which are integrated through a group decision making (GDM) method in the model for ranking wind turbine service technicians. Here a group of experts from different fields of expertise is engaged to finalize the decision. A series of tests is conducted covering physical fitness, a technical written test and a practical test, along with a general interview and medical examination, to facilitate the final selection using the above techniques. In contrast to single decision making approaches, the proposed group decision making model efficiently supports the technician ranking process. The effectiveness of the proposed approach is demonstrated by a case study of service technicians required for the maintenance department of a wind power plant using Fuzzy ARAS and Fuzzy MOORA. This set of potential technicians is evaluated based on five main criteria.
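    The MOORA half of the hybrid can be sketched in its crisp form (the paper uses fuzzy values aggregated across the expert group, which this sketch omits). Candidates, criterion scores and weights below are hypothetical:

```python
import math

# Crisp MOORA ratio analysis: vector-normalise each criterion, then
# score each candidate as the weighted sum of benefit ratios minus
# the weighted sum of cost ratios, and rank by score.
candidates = ["T1", "T2", "T3"]
# columns: fitness test, written test, practical test, interview
matrix = [
    [80, 70, 85, 75],
    [90, 65, 70, 80],
    [75, 85, 80, 70],
]
weights = [0.3, 0.2, 0.3, 0.2]
benefit = [True, True, True, True]   # a cost criterion would get False

norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(4)]

def score(row):
    return sum((1 if benefit[j] else -1) * weights[j] * row[j] / norms[j]
               for j in range(4))

ranking = sorted(candidates, key=lambda c: -score(matrix[candidates.index(c)]))
print(ranking)
```

    In the full method the same candidate set would also be ranked by Fuzzy ARAS, and the two rankings reconciled through the group decision making step.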

  19. Cleaning Data with OpenRefine

    Directory of Open Access Journals (Sweden)

    Seth van Hooland

    2013-08-01

    Full Text Available Duplicate records, empty values and inconsistent formats are phenomena we should be prepared to deal with when using historical data sets. This lesson will teach you how to discover inconsistencies in data contained within a spreadsheet or a database. As we increasingly share, aggregate and reuse data on the web, historians will need to respond to data quality issues which inevitably pop up. Using a program called OpenRefine, you will be able to easily identify systematic errors such as blank cells, duplicates, spelling inconsistencies, etc. OpenRefine not only allows you to quickly diagnose the accuracy of your data, but also to act upon certain errors in an automated manner.
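    The diagnostics described above (blank cells, duplicates, spelling inconsistencies) can also be scripted. The sketch below reproduces, in plain Python, the kind of checks OpenRefine runs interactively; the token fingerprint is modelled on OpenRefine's key-collision clustering idea, and the records are invented for illustration:

```python
# Scripted analogue of OpenRefine-style diagnostics: flag blank cells
# and cluster near-duplicate values by a normalised token fingerprint
# (lowercase, strip punctuation, deduplicate and sort tokens).
records = [
    {"name": "Van Hooland, Seth", "year": "2013"},
    {"name": "van hooland,  Seth", "year": "2013"},
    {"name": "Verborgh, Ruben", "year": ""},
]

def fingerprint(value):
    tokens = "".join(ch if ch.isalnum() or ch.isspace() else " "
                     for ch in value.lower()).split()
    return " ".join(sorted(set(tokens)))

blanks = [i for i, r in enumerate(records) if not r["year"].strip()]

seen, duplicates = {}, []
for i, r in enumerate(records):
    key = fingerprint(r["name"])
    if key in seen:
        duplicates.append((seen[key], i))   # same entity, spelled differently
    else:
        seen[key] = i

print(blanks)      # [2]
print(duplicates)  # [(0, 1)]
```

    The point of a GUI tool like OpenRefine is that these same diagnoses, and the bulk corrections that follow, require no scripting at all.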

  20. Outlook for Canadian refining

    International Nuclear Information System (INIS)

    Boje, G.

    1998-01-01

    The petroleum supply and demand balance was discussed and a comparison between Canadian and U.S. refineries was provided. The impact of changing product specifications on the petroleum industry was also discussed. The major changes include sulphur reductions in gasoline and changes affecting benzene and MMT additives. These changes have been made in an effort to satisfy environmental needs. Geographic margin variations in refineries between east and west were reviewed. An overview of findings from the Solomon Refining Study of Canadian and American refineries, which has been very complimentary of the Canadian refining industry, was provided. From this writer's point of view, refinery utilization has improved, but there is a threat from the increasing efficiency of U.S. competitors. Environmental issues will continue to impact the industry, and while the chances of making economic returns on investment are good for the years ahead, it will be a challenge to maintain profitability.

  1. Oil refining in U.S. foreign-trade zones

    International Nuclear Information System (INIS)

    Powell, S.J.; Potter, T.J.

    1991-01-01

    With the crude-oil import supply being as inexpensive as it is today, relative to domestic supply, many independents have been sourcing their crude-oil needs from abroad and have found it an opportune time to step up their level of refining activity. To further enhance their competitive position with respect to foreign refineries, certain domestic refiners have discovered the operational benefits and savings that result from having a refinery designated as a foreign-trade zone (FTZ) under the Foreign-Trade Zones Act of 1934, as amended. This paper examines the history and use of foreign-trade subzones for refining activities

  2. Utilization of waste materials, non-refined materials, and renewable energy in in situ remediation and their sustainability benefits.

    Science.gov (United States)

    Favara, Paul; Gamlin, Jeff

    2017-12-15

    In the ramp-up to integrating sustainability into remediation, a key industry focus area has been to reduce the environmental footprint of treatment processes. The typical approach to integrating sustainability into remediation projects has been a top-down approach, which involves developing technology options and then applying sustainability thinking to the technology, after it has been conceptualized. A bottom-up approach allows for systems thinking to be included in remedy selection and could potentially result in new or different technologies being considered. When using a bottom-up approach, there is room to consider the utilization of waste materials, non-refined materials, and renewable energy in remediation technology-all of which generally have a smaller footprint than processed materials and traditional forms of energy. By integrating more systems thinking into remediation projects, practitioners can think beyond the traditional technologies typically used and how technologies are deployed. To compare top-down and bottom-up thinking, a traditional technology that is considered very sustainable-enhanced in situ bioremediation-is compared to a successful, but infrequently deployed technology-subgrade biogeochemical reactors. Life Cycle Assessment is used for the evaluation and shows the footprint of the subgrade biogeochemical reactor to be lower in all seven impact categories evaluated, sometimes to a significant degree. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. English-Chinese oil refining dictionary. [English-Chinese

    Energy Technology Data Exchange (ETDEWEB)

    Chou, P; Zing, Z [eds.]

    1979-01-01

    The dictionary covers many disciplines but specializes in terminology related to petroleum refining. It contains terms in areas such as refining, factory equipment and installation, petroleum products and test analysis, and instrument automation. It also contains terms in the areas of petrochemistry, oil storage and transport, computer technology, and environmental protection. The total number of terms collected was approximately 53,000.

  4. A Framework for Six Sigma Project Selection in Higher Educational Institutions, Using a Weighted Scorecard Approach

    Science.gov (United States)

    Holmes, Monica C.; Jenicke, Lawrence O.; Hempel, Jessica L.

    2015-01-01

    Purpose: This paper discusses the importance of the Six Sigma selection process, describes a Six Sigma project in a higher educational institution and presents a weighted scorecard approach for project selection. Design/Methodology/Approach: A case study of the Six Sigma approach being used to improve student support at a university computer help…

  5. Assessment of the impact of the tax on refined petroleum products in the Philippines

    International Nuclear Information System (INIS)

    Uri, N.D.; Boyd, R.

    1993-01-01

    This paper uses an aggregate modelling approach to assess the impact of taxes on refined petroleum products on the Philippines' economy. The analysis uses a general equilibrium model composed of 14 producing sectors, 14 consuming sectors, three household categories classified by income, and a government. The effects on prices and quantities of removing the 48% tax on premium and regular gasoline and the 24% tax on other refined petroleum products are examined. The results are revealing. For example, complete elimination of refined petroleum product taxes would increase output in all producing sectors by about 3.7%, or about 2.65 hundred billion (2.65 × 10^11) Philippine pesos; raise consumption of goods and services by about 13.6%, or 4.2 hundred billion (4.2 × 10^11) Philippine pesos; raise total utility by 14.3%, or 4.5 hundred billion (4.5 × 10^11) Philippine pesos; and lower government tax revenue by 62.4%, or 2.8 hundred billion (2.8 × 10^11) Philippine pesos. When subjected to a sensitivity analysis, the results are reasonably robust with regard to the assumed values of the substitution elasticities. That is, while the model's equilibrium values do vary in response to different assumptions about these elasticities, the fluctuations are not so large as to suggest that the model is unrealistically sensitive to these parameters. (Author)

  6. Effects of die profile on grain refinement in Al–Mg alloy processed by repetitive corrugation and straightening

    Energy Technology Data Exchange (ETDEWEB)

    Thangapandian, N., E-mail: erpandian@gmail.com [Department of Mechanical Engineering, College of Engineering Guindy, Anna University, Chennai 600025 (India); Balasivanandha Prabu, S. [Department of Mechanical Engineering, College of Engineering Guindy, Anna University, Chennai 600025 (India); Padmanabhan, K.A. [Centre for Nanotechnology, University of Hyderabad, Hyderabad 500046 (India)

    2016-01-01

    It is shown that a proper selection of corrugation die profile and die parameters is essential for achieving homogeneous grain refinement in materials subjected to repetitive corrugation and straightening (RCS). An Al–Mg (AA 5083) alloy was subjected to the RCS process using three different corrugation die profiles (V-groove, flat groove, and semi-circular groove), followed by straightening, to determine the maximum allowable number of passes prior to surface cracking/fracture. Mechanical properties, i.e., hardness and tensile strength, of the RCS samples were measured and compared as functions of corrugation die profile and number of passes, along with the corresponding changes in microstructure. Grain refinement was studied using Electron Back-Scattered Diffraction (EBSD) analysis and Transmission Electron Microscopy (TEM).

  7. Element Partition Trees For H-Refined Meshes to Optimize Direct Solver Performance. Part I: Dynamic Programming

    KAUST Repository

    AbouEisha, Hassan M.

    2017-07-13

    We consider a class of two- and three-dimensional h-refined meshes generated by an adaptive finite element method. We introduce an element partition tree, which controls the execution of the multi-frontal solver algorithm over these refined grids. We propose and study algorithms with polynomial computational cost for the optimization of these element partition trees. The trees provide an ordering for the elimination of unknowns. The algorithms automatically optimize the element partition trees using extensions of dynamic programming. The construction of the trees by the dynamic programming approach is expensive; the generated trees cannot be used in practice, but are rather utilized as a learning tool for proposing fast heuristic algorithms. In this first part of our paper we focus on the dynamic programming approach and draw a sketch of the heuristic algorithm. The second part will be devoted to a more detailed analysis of the heuristic algorithm extended for the case of hp-adaptive
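
    The dynamic-programming idea can be caricatured in one dimension: score every contiguous split of an element list with a toy cost (subtree size standing in for frontal-matrix work) and memoize the recurrence. The cost model and function below are illustrative assumptions, not the algorithm of the paper.

    ```python
    from functools import lru_cache

    def best_partition_tree(n):
        """Toy DP: split elements 0..n-1 into a binary partition tree
        that minimizes the sum of subtree sizes, a crude stand-in for
        multi-frontal solver cost."""
        @lru_cache(maxsize=None)
        def cost(i, j):
            if i == j:
                return 1  # a leaf holds one element
            # try every contiguous split point and keep the cheapest
            best = min(cost(i, m) + cost(m + 1, j) for m in range(i, j))
            return best + (j - i + 1)  # pay once at this subtree's root
        return cost(0, n - 1)
    ```

    Under this toy cost the optimum is a balanced bisection (e.g. `best_partition_tree(4)` splits 2+2, not 1+3), loosely mirroring the nested-dissection-like orderings that optimized element partition trees tend to produce.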

  9. Contextual Distance Refining for Image Retrieval

    KAUST Repository

    Islam, Almasri

    2014-01-01

    Recently, a number of methods have been proposed to improve image retrieval accuracy by capturing context information. These methods try to compensate for the fact that a visually less similar image might be more relevant because it depicts the same object. We propose a quick new method for refining any pairwise distance metric: it iteratively discovers the object in the image from the most similar images and then refines the distance metric accordingly. Tests show that our technique improves over the state of the art in accuracy on the MPEG7 dataset.
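
    The abstract does not give the exact update rule, so the sketch below shows one generic neighborhood-based refinement, assumed for illustration: each pairwise distance is replaced by the average distance to the other image's k nearest neighbors, so images that share a context (the same depicted object) drift closer over iterations.

    ```python
    def refine_distances(D, k=2, iterations=2):
        """Contextual refinement sketch: d(i, j) is replaced by the mean
        distance from i to j's k nearest neighbors under the current
        metric. D is a symmetric list-of-lists distance matrix."""
        n = len(D)
        for _ in range(iterations):
            # k nearest neighbors of each item (an item is its own neighbor)
            knn = [sorted(range(n), key=lambda j: D[i][j])[:k] for i in range(n)]
            new = [[0.0] * n for _ in range(n)]
            for i in range(n):
                for j in range(n):
                    new[i][j] = sum(D[i][m] for m in knn[j]) / k
            # re-symmetrize and zero the diagonal to keep a valid dissimilarity
            D = [[(new[i][j] + new[j][i]) / 2 for j in range(n)] for i in range(n)]
            for i in range(n):
                D[i][i] = 0.0
        return D
    ```

    On a toy matrix with two tight clusters, within-cluster distances shrink with each iteration while cross-cluster distances stay large, which is the re-ranking effect the abstract describes.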

  11. Radiator selection for Space Station Solar Dynamic Power Systems

    Science.gov (United States)

    Fleming, Mike; Hoehn, Frank

    A study was conducted to define the best radiator for heat rejection of the Space Station Solar Dynamic Power System. Included in the study were radiators for both the Organic Rankine Cycle and Closed Brayton Cycle heat engines. A number of potential approaches were considered for the Organic Rankine Cycle and a constructable radiator was chosen. Detailed optimizations of this concept were conducted resulting in a baseline for inclusion into the ORC Preliminary Design. A number of approaches were also considered for the CBC radiator. For this application a deployed pumped liquid radiator was selected which was also refined resulting in a baseline for the CBC preliminary design. This paper reports the results and methodology of these studies and describes the preliminary designs of the Space Station Solar Dynamic Power System radiators for both of the candidate heat engine cycles.

  12. Singapore refiners in midst of huge construction campaign

    International Nuclear Information System (INIS)

    Land, R.

    1992-01-01

    This paper reports that Singapore's downstream capacity continues to mushroom. Singapore refiners, upbeat about long-term prospects for petroleum product demand in the Asia-Pacific region, are pressing plans to boost processing capacity. Their plans go beyond capacity expansions: they are proceeding with projects to upgrade refineries to emphasize production of higher-value products and to further integrate refining capabilities with the region's petrochemical industry. Planned expansion and upgrading projects at Singapore refineries call for outlays of more than $1 billion to boost total capacity to about 1.1 million b/d in 1993 and 1.27 million b/d by 1995. That would be the highest level since the mid-1980s, when refiners such as Shell Singapore cut capacity amid an oil glut. Singapore refineries currently are running at an effective full capacity of 1.04 million b/d. Meanwhile, Singapore refiners are aggressively courting customers in the Indochina subcontinent, where long-isolated centrally planned economies are turning gradually to free markets.

  13. Directions in refining and upgrading of heavy oil and bitumen

    International Nuclear Information System (INIS)

    Dawson, B.; Parker, R. J.; Flint, L.

    1997-01-01

    The expansion of heavy oil transportation, marketing and refining facilities over the past two decades has been reviewed to show the strides that several Canadian refiners have taken to build up the facilities required to process synthetic crude oil (SCO). Key points made at a conference convened by the National Centre for Upgrading Technology (NCUT), held in Edmonton during September 1997 to discuss current and future directions in the refining and marketing of heavy oil, bitumen and SCO, were summarized. Among the key points were: (1) the high entry barriers faced by centralized upgraders; (2) the advantages of integrating SCO or heavy oil production with downstream refining; (3) the stiff competition from Venezuela and Mexico that both SCO and heavy oil will face in the U.S. PADD II market; (4) the differences between Canadian refiners, who have profited from hydrocracking and are better able to handle coker-based SCO, and American refiners, who rely chiefly on catalytic cracking and are less able to process the highly aromatic SCO; and (5) the disproportionate cost in the upgrading process represented by the conversion of asphaltenes. Challenges and opportunities for key stakeholders, i.e. producers, refiners, marketers and technology licensors, also received much attention at the Edmonton conference.

  14. 30 CFR 208.4 - Royalty oil sales to eligible refiners.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false Royalty oil sales to eligible refiners. 208.4... MANAGEMENT SALE OF FEDERAL ROYALTY OIL General Provisions § 208.4 Royalty oil sales to eligible refiners. (a... and defense. The Secretary will review these items and will determine whether eligible refiners have...

  15. Development of an efficient grain refiner for Al-7Si alloy

    Energy Technology Data Exchange (ETDEWEB)

    Kori, S.A.; Murty, B.S.; Chakraborty, M. [Indian Inst. of Technol., Kharagpur (India). Dept. of Metall. and Mater. Eng.

    2000-03-15

    The response of Al-7Si alloy towards grain refinement by Al-Ti-B master alloys (with different Ti-B ratios) at different addition levels has been studied in detail. The results indicate that high B-containing master alloys are powerful grain refiners when compared to conventional grain refiners like Al-5Ti-1B master alloys. In the present study, indigenously developed master alloys have been used for the grain refinement of alloys Al-7Si and LM-25. Significant improvements in mechanical properties have been obtained with a combination of grain refiner and Sr as modifier. (orig.)

  16. Local grid refinement for free-surface flow simulations

    NARCIS (Netherlands)

    van der Plas, Peter

    2017-01-01

    The principal goal of the current study is to explore and investigate the potential of local grid refinement for increasing the numerical efficiency of free-surface flow simulations in a practical context. In this thesis we propose a method for local grid refinement in the free-surface flow model

  17. Refinements of the column generation process for the Vehicle Routing Problem with Time Windows

    DEFF Research Database (Denmark)

    Larsen, Jesper

    2004-01-01

    interval denoted the time window. The objective is to determine routes for the vehicles that minimize the accumulated cost (or distance) with respect to the above-mentioned constraints. Currently the best approaches for determining optimal solutions are based on column generation and Branch-and-Bound, also known as Branch-and-Price. This paper presents two ideas for run-time improvements of the Branch-and-Price framework for the Vehicle Routing Problem with Time Windows. Both ideas reveal a significant potential for using run-time refinements when speeding up an exact approach without compromising

  18. Oil refining in South Asia and Australasia

    International Nuclear Information System (INIS)

    Yamaguchi, N.D.

    2000-01-01

    An overview of the oil markets of Southeast Asia and Australasia is presented, focusing on oil refining. Key statistics for both areas are tabulated, and figures provide information on GDP/capita, crude production, comparison of demand barrels, and product demand. Crude oil production and supply, oil product demand, and the refining industries are examined, with details given of the evolution of capacity and cracking-to-distillation ratios.

  19. Investment in Exploration-Production and Refining - 2016

    International Nuclear Information System (INIS)

    Maisonnier, Guy; Hureau, Geoffroy; Serbutoviez, Sylvain; Silva, Constancio

    2017-03-01

    IFPEN analyses in this study the 2016 evolution of global investment in the field of exploration-production and refining: - Trends in oil and gas prices; - Investment in exploration/production: in sharp decline for the second consecutive year - the first time this has happened since 1986; - The global drilling market; - Geophysical: global activity and markets; - Offshore construction: market and business; - A significant reduction in refining projects (atmospheric distillation and conversion)

  20. Using high-order methods on adaptively refined block-structured meshes - discretizations, interpolations, and filters.

    Energy Technology Data Exchange (ETDEWEB)

    Ray, Jaideep; Lefantzi, Sophia; Najm, Habib N.; Kennedy, Christopher A.

    2006-01-01

    Block-structured adaptively refined meshes (SAMR) strive for efficient resolution of partial differential equations (PDEs) solved on large computational domains by clustering mesh points only where required by large gradients. Previous work has indicated that fourth-order convergence can be achieved on such meshes by using a suitable combination of high-order discretizations, interpolations, and filters, and can deliver significant computational savings over conventional second-order methods at engineering error tolerances. In this paper, we explore the interactions between the errors introduced by discretizations, interpolations and filters. We develop general expressions for high-order discretizations, interpolations, and filters, in multiple dimensions, using a Fourier approach, facilitating the high-order SAMR implementation. We derive a formulation for the necessary interpolation order for given discretization and derivative orders. We also illustrate this order relationship empirically using one- and two-dimensional model problems on refined meshes. We study the observed increase in accuracy with increasing interpolation order. We also examine the empirically observed order of convergence as the effective resolution of the mesh is increased by successively adding levels of refinement, with different orders of discretization, interpolation, or filtering.
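
    The empirical order-of-convergence measurement described above can be reproduced in miniature: halve the step of a fourth-order central difference and estimate p = log2(e_h / e_{h/2}). The particular stencil and the test function sin(x) are our own choices for this sketch, not taken from the paper.

    ```python
    import math

    def d1_central4(f, x, h):
        """Fourth-order central difference approximation of f'(x)."""
        return (-f(x + 2*h) + 8*f(x + h) - 8*f(x - h) + f(x - 2*h)) / (12*h)

    # errors in approximating (sin)' = cos at x = 1 on two step sizes
    x, exact = 1.0, math.cos(1.0)
    e_h  = abs(d1_central4(math.sin, x, 0.10) - exact)
    e_h2 = abs(d1_central4(math.sin, x, 0.05) - exact)

    # observed order of convergence: should sit near 4 for this scheme
    p = math.log(e_h / e_h2, 2)
    ```

    The same halving-and-ratio procedure is what the paper applies across refinement levels, where interpolation and filtering errors can pull the observed order away from the nominal one.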

  1. Problems persist for French refining sector

    International Nuclear Information System (INIS)

    Beck, R.J.

    1992-01-01

    This paper reports that France's refiners face a continuing shortfall of middle distillate capacity and a persistent surplus of heavy fuel oil. That's the main conclusion of the official Hydrocarbon Directorate's report on how France's refining sector performed in 1991. Imports up: the directorate noted that although net production of refined products in French refineries rose to 1.534 million b/d in 1991 from 1.48 million b/d in 1990, product imports jumped 9.7% to 602,000 b/d in the period. The glut of heavy fuel oil eased to some extent last year because French nuclear power capacity, heavily dependent on ample water supplies, was crimped by drought. That spawned fuel switching. The most noteworthy increase in imports was for motor diesel, climbing to 176,000 b/d from 148,000 b/d in 1990. Tax credits are spurring French consumption of that fuel. For the first time, consumption of motor diesel in 1991 outstripped that of gasoline, at 374,000 b/d and 356,000 b/d respectively.

  2. Refining glass structure in two dimensions

    Science.gov (United States)

    Sadjadi, Mahdi; Bhattarai, Bishal; Drabold, D. A.; Thorpe, M. F.; Wilson, Mark

    2017-11-01

    Recently determined atomistic-scale structures of near-two-dimensional bilayers of vitreous silica (using scanning probe and electron microscopy) allow us to refine the experimentally determined coordinates to incorporate the known local chemistry more precisely. Further refinement is achieved by using classical potentials of varying complexity: one using harmonic potentials and a second employing an electrostatic description incorporating polarization effects. These are benchmarked against density functional calculations. Our main findings are that (a) there is a symmetry plane between the two disordered layers, a nice example of an emergent phenomenon; (b) the layers are slightly tilted, so that the Si-O-Si angle between the two layers is not 180° as originally thought but rather 175 ± 2°; and (c) while interior areas that are not completely imaged can be reliably reconstructed, surface areas are more problematic. It is shown that the small crystallites that appear are just as expected statistically in a continuous random network. This provides a good example of the value that can be added to disordered structures imaged at the atomic level by implementing computer refinement.

  3. Segmental Refinement: A Multigrid Technique for Data Locality

    KAUST Repository

    Adams, Mark F.; Brown, Jed; Knepley, Matt; Samtaney, Ravi

    2016-01-01

    We investigate a domain decomposed multigrid technique, termed segmental refinement, for solving general nonlinear elliptic boundary value problems. We extend the method first proposed in 1994 by analytically and experimentally investigating its complexity. We confirm that communication of traditional parallel multigrid is eliminated on fine grids, with modest amounts of extra work and storage, while maintaining the asymptotic exactness of full multigrid. We observe an accuracy dependence on the segmental refinement subdomain size, which was not considered in the original analysis. We present a communication complexity analysis that quantifies the communication costs ameliorated by segmental refinement and report performance results with up to 64K cores on a Cray XC30.

  5. SCJ-Circus: a refinement-oriented formal notation for Safety-Critical Java

    Directory of Open Access Journals (Sweden)

    Alvaro Miyazawa

    2016-06-01

    Safety-Critical Java (SCJ) is a version of Java whose goal is to support the development of real-time, embedded, safety-critical software. In particular, SCJ supports certification of such software by introducing abstractions that enforce a simpler architecture, and simpler concurrency and memory models. In this paper, we present SCJ-Circus, a refinement-oriented formal notation that supports the specification and verification of low-level programming models that include the new abstractions introduced by SCJ. SCJ-Circus is part of the Circus family of state-rich process algebras; as such, SCJ-Circus includes the Circus constructs for modelling sequential and concurrent behaviour, real-time and object orientation. We present here the syntax and semantics of SCJ-Circus, which is defined by mapping SCJ-Circus constructs to those of standard Circus. This is based on an existing approach for modelling SCJ programs. We also extend an existing Circus-based refinement strategy that targets SCJ programs to account for the generation of SCJ-Circus models close to implementations in SCJ.

  6. External and Internal Citation Analyses Can Provide Insight into Serial/Monograph Ratios when Refining Collection Development Strategies in Selected STEM Disciplines

    Directory of Open Access Journals (Sweden)

    Stephanie Krueger

    2016-12-01

    A Review of: Kelly, M. (2015). Citation patterns of engineering, statistics, and computer science researchers: An internal and external citation analysis across multiple engineering subfields. College and Research Libraries, 76(7), 859-882. http://doi.org/10.5860/crl.76.7.859 Objective – To determine internal and external citation analysis methods and their potential applicability to the refinement of collection development strategies at both the institutional and cross-institutional levels for selected science, technology, engineering, and mathematics (STEM) subfields. Design – Multidimensional citation analysis; specifically, analysis of citations from (1) key scholarly journals in selected STEM subfields (external analysis) compared to those from (2) local doctoral dissertations in similar subfields (internal analysis). Setting – Medium-sized, STEM-dominant public research university in the United States of America. Subjects – Two citation datasets: (1) 14,149 external citations from 16 journals (i.e., 2 journals per subfield; citations from 2012 volumes) representing bioengineering, civil engineering, computer science (CS), electrical engineering, environmental engineering, operations research, statistics (STAT), and systems engineering; and (2) 8,494 internal citations from 99 doctoral dissertations (18-22 per subfield) published between 2008 and 2012 for CS, electrical and computer engineering (ECE), and applied information technology (AIT), and between 2005 and 2012 for systems engineering and operations research (SEOR) and STAT. Methods – Citations, including titles and publication dates, were harvested from source materials, stored in Excel, and then manually categorized according to format (book, book chapter, journal, conference proceeding, website, and several others). To analyze citations, percentages of occurrence by subfield were calculated for variables including format, age (years since date cited), journal distribution, and the

  7. Solving the Sophistication-Population Paradox of Game Refinement Theory

    OpenAIRE

    Xiong, Shuo; Tiwary, Parth; Iida, Hiroyuki

    2016-01-01

    A mathematical model of game refinement was proposed based on the uncertainty of the game outcome. This model has been shown to be useful in measuring the entertainment element in domains such as board games and sport games. However, game refinement theory has not been able to explain the correlation between the popularity of a game and its game refinement value. This paper introduces another aspect in the study of game entertainment, the concept of “a

  8. The future of refining industry is in Asia

    International Nuclear Information System (INIS)

    Dupin, L.

    2010-01-01

    The decision of Total Group to close down refining at its Flandres site in Dunkerque (France) testifies to the deep restructuring affecting this sector. The migration of refining activity towards the Middle East and Asia began several years ago. World capacity is increasing: in 2009, 4.36 billion tons of petroleum products were refined worldwide, a 1.9% rise with respect to 2008. There are 661 refineries in the world, 6 of which were inaugurated in 2009; 115 are located in Europe and 12 in France. The poor health of European refining benefits the Middle East and Asia, where investments in new production capacity follow one another. From now to 2030, these areas will receive 70% of worldwide investment, compared with only 11% for Europe and North America. The decline of the European refining industry is directly linked to the decay of automotive fuel consumption and to the increasing share of diesel fuel relative to gasoline. Asia, in particular China and India, follows exactly the opposite trend thanks to its population and economic growth. European oil companies are therefore trying to invest in Asia and are looking for new production capacity in China and India. (J.S.)

  9. Optimal Subinterval Selection Approach for Power System Transient Stability Simulation

    Directory of Open Access Journals (Sweden)

    Soobae Kim

    2015-10-01

    Power system transient stability analysis requires an appropriate integration time step to avoid numerical instability and to reduce computational demands. For fast system dynamics, which vary more rapidly than the time step covers, a fraction of the time step, called a subinterval, is used. However, the optimal value of this subinterval is not easily determined, because analysis of the system dynamics might be required. The selection is usually made from engineering experience, and perhaps trial and error. This paper proposes an optimal subinterval selection approach for power system transient stability analysis based on modal analysis of a single machine infinite bus (SMIB) system. Fast system dynamics are identified with the modal analysis, and the SMIB system is used with a focus on fast local modes. An appropriate subinterval time step from the proposed approach can reduce the computational burden while achieving accurate simulation responses. The performance of the proposed method is demonstrated on the GSO 37-bus system.
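
    As a hedged sketch of the modal-analysis idea: for the classical undamped SMIB swing equation, the electromechanical mode has natural frequency sqrt(Ks·ωs/(2H)), and one common rule of thumb (assumed here, not taken from the paper) sizes the integration step as a small fraction of that mode's period. The function name, parameter values, and the 1/20 factor are all illustrative.

    ```python
    import math

    def subinterval_from_mode(H, Ks, f_sys=60.0, steps_per_period=20):
        """Pick an integration subinterval as a fraction of the fastest
        (here: the single electromechanical) mode's period.
        H  : inertia constant [s]
        Ks : synchronizing power coefficient [pu]
        The 1/steps_per_period factor is a rule-of-thumb assumption."""
        omega_s = 2 * math.pi * f_sys               # synchronous speed [rad/s]
        omega_n = math.sqrt(Ks * omega_s / (2 * H)) # undamped natural frequency
        period = 2 * math.pi / omega_n              # period of the fast mode
        return period / steps_per_period

    dt = subinterval_from_mode(H=3.5, Ks=1.2)
    ```

    For H = 3.5 s and Ks = 1.2 pu this yields a step of a few hundredths of a second; a stiffer mode (larger Ks, smaller H) shrinks the subinterval accordingly, which is the qualitative behavior the selection approach formalizes.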

  10. Simulation Experiment on Landing Site Selection Using a Simple Geometric Approach

    Science.gov (United States)

    Zhao, W.; Tong, X.; Xie, H.; Jin, Y.; Liu, S.; Wu, D.; Liu, X.; Guo, L.; Zhou, Q.

    2017-07-01

    Safe landing is an important part of a planetary exploration mission. Even fine-scale terrain hazards (such as rocks, small craters, and steep slopes, which would not be accurately detected from orbital reconnaissance) can pose a serious risk to a planetary lander or rover and the scientific instruments on board. In this paper, a simple geometric approach to planetary landing hazard detection and safe landing site selection is proposed. To achieve a full implementation of this algorithm, two easy-to-compute metrics are presented for extracting terrain slope and roughness information. Unlike conventional methods, which must perform robust plane fitting and elevation interpolation for DEM generation, in this work hazards are identified by processing the LiDAR point cloud directly. For safe landing site selection, a Generalized Voronoi Diagram is constructed. Based on the idea of the maximum empty circle, the safest landing site can be determined. In this algorithm, hazards are treated as general polygons, without special simplification (e.g., regarding hazards as discrete circles or ellipses), so processing hazards this way conforms more closely to a real planetary exploration scenario. To validate the approach, a simulated planetary terrain model was constructed from volcanic ash with rocks in an indoor environment. A commercial laser scanner mounted on a rail scanned the terrain surface at different hanging positions. The results demonstrate that fair hazard detection capability and reasonable site selection were obtained compared with a conventional method, while consuming less computation time and memory. Hence, it is a feasible candidate approach for future precision landing site selection on a planetary surface.
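
    The maximum-empty-circle idea can be approximated without constructing the Voronoi diagram explicitly: scan candidate sites and keep the one whose nearest hazard is farthest away (the exact optimum lies on a Voronoi vertex or boundary). The brute-force grid scan below is our simplification, with hazards reduced to points rather than the paper's general polygons.

    ```python
    def safest_site(hazards, xmax, ymax, step=0.5):
        """Brute-force stand-in for the maximum-empty-circle search:
        return the grid point whose distance to the nearest hazard
        (point approximation) is largest, plus that clearance radius."""
        best, best_r = None, -1.0
        nx, ny = int(xmax / step) + 1, int(ymax / step) + 1
        for i in range(nx):
            for j in range(ny):
                x, y = i * step, j * step
                # clearance = distance to the closest hazard
                r = min(((x - hx) ** 2 + (y - hy) ** 2) ** 0.5
                        for hx, hy in hazards)
                if r > best_r:
                    best, best_r = (x, y), r
        return best, best_r
    ```

    With hazards at the four corners of a 10 × 10 area the scan returns the center, whose clearance is the radius of the largest hazard-free circle on the grid, which is exactly the max-min criterion the Voronoi-based method solves exactly.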

  12. Structural exploration for the refinement of anticancer matrix metalloproteinase-2 inhibitor designing approaches through robust validated multi-QSARs

    Science.gov (United States)

    Adhikari, Nilanjan; Amin, Sk. Abdul; Saha, Achintya; Jha, Tarun

    2018-03-01

    Matrix metalloproteinase-2 (MMP-2) is a promising pharmacological target for designing potential anticancer drugs. MMP-2 plays a critical role in apoptosis by cleaving the DNA repair enzyme poly(ADP-ribose) polymerase (PARP). Moreover, MMP-2 expression triggers vascular endothelial growth factor (VEGF), which has a positive influence on tumor size, invasion, and angiogenesis. There is therefore an urgent need to develop potential MMP-2 inhibitors without toxicity and with better pharmacokinetic properties. In this article, robust validated multi-quantitative structure-activity relationship (QSAR) modeling approaches were applied to a dataset of 222 MMP-2 inhibitors to explore the important structural and pharmacophoric requirements for higher MMP-2 inhibition. Different validated regression- and classification-based QSARs, pharmacophore mapping, and 3D-QSAR techniques were performed. These results were then challenged and subjected to further validation against 24 in-house MMP-2 inhibitors to judge the reliability of the models. All models were individually validated, both internally and externally, and were supported and validated by each other. The results were further justified by molecular docking analysis. The modeling techniques adopted here help not only to explore the necessary structural and pharmacophoric requirements but also to establish overall validation and refinement techniques for designing potential MMP-2 inhibitors.

  13. Quantitative Framework and Management Expectation Tool for the Selection of Bioremediation Approaches at Chlorinated Solvent Sites

    Science.gov (United States)

    2015-03-19

    SERDP & ESTCP Webinar Series (#11), March 19, 2015: Quantitative Framework and Management Expectation Tool for the Selection of Bioremediation Approaches at Chlorinated Solvent Sites. Presenters: Ms. Carmen Lebrón, Independent Consultant (20 minutes + Q&A); Dr. ...

  14. Refining - Panorama 2008; Raffinage - Panorama 2008

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2008-07-01

    Investment rallied in 2007, and many distillation and conversion projects likely to reach the industrial stage were announced. With economic growth sustained in 2006 and still pronounced in 2007, oil demand remained strong - especially in emerging countries - and refining margins stayed high. Despite these favorable business conditions, tensions persisted in the refining sector, which has fallen far behind in terms of investing in refinery capacity. It will take renewed efforts over a long period to catch up. Looking at recent events that have affected the economy in many countries (e.g. the sub-prime crisis), prudence remains advisable.

  15. Refinement of the gross theory of nuclear {beta}-decay, and hindrance of the first-forbidden transition of rank 1

    Energy Technology Data Exchange (ETDEWEB)

    Nakata, Hidehiko [Waseda Univ., Tokyo (Japan). Dept. of Physics; Tachibana, Takahiro; Yamada, Masami

    1997-03-01

    Recently the gross theory of nuclear {beta}-decay was refined for odd-odd nuclei. In this refinement, the effect of the selection rule of {beta}-transitions from the ground states of odd-odd nuclei to those of even-even nuclei was taken into account based on a statistical consideration. The transitions to the first 2{sup +} excited states in even-even nuclei were also taken into account according to the selection rule approximately. In that study, it was found that the transitions between 1{sup -} ground states of the odd-odd nuclei and 0{sup +} ground states of even-even nuclei, belonging to the first-forbidden transitions of rank 1, are strongly hindered. A reduction factor was introduced for the transitions to the ground states of even-even nuclei to take into account this hindrance. It was also found that the strength functions of the Gamow-Teller transitions obtained from the conventional gross theory are underestimated by a factor of about 3. In order to improve this underestimation, the Lorentz-type function was adopted for the one-particle strength function in the model instead of the hyperbolic-secant-type function. In the present study we have newly analyzed the experimental ft-values of odd-A nuclei, and found that the first-forbidden transitions of rank 1 are also considerably hindered between the ground states. Following the above refinement we have calculated the {beta}-ray spectra of some odd-odd short-lived fission products with the use of the refined gross theory. These results are compared not only with the experiments by Rudstam et al. but also with the conventional gross theory. (author)

  16. Refining the statistical model for quantitative immunostaining of surface-functionalized nanoparticles by AFM.

    Science.gov (United States)

    MacCuspie, Robert I; Gorka, Danielle E

    2013-10-01

    Recently, an atomic force microscopy (AFM)-based approach for quantifying the number of biological molecules conjugated to a nanoparticle surface at low number densities was reported. The number of target molecules conjugated to the analyte nanoparticle can be determined with single nanoparticle fidelity using antibody-mediated self-assembly to decorate the analyte nanoparticles with probe nanoparticles (i.e., quantitative immunostaining). This work refines the statistical models used to quantitatively interpret the observations when AFM is used to image the resulting structures. The refinements add terms to the previous statistical models to account for the physical sizes of the analyte nanoparticles, conjugated molecules, antibodies, and probe nanoparticles. Thus, a more physically realistic statistical computation can be implemented for a given sample of known qualitative composition, using the software scripts provided. Example AFM data sets, using horseradish peroxidase conjugated to gold nanoparticles, are presented to illustrate how to implement this method successfully.
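    One way a refined model can "account for physical sizes" is through spherical-cap geometry: each bound probe nanoparticle occludes a cap of the analyte's surface, which bounds how many probes can decorate one analyte. The formula and the 0.9069 hexagonal packing factor below are illustrative assumptions, not the paper's actual statistical model:

```python
from math import asin, cos

def max_probes(r_analyte, r_probe, packing=0.9069):
    """Geometric upper bound on the number of probe nanoparticles that can
    decorate one analyte nanoparticle: each bound probe occludes a spherical
    cap of half-angle asin(r_probe / (r_analyte + r_probe))."""
    theta = asin(r_probe / (r_analyte + r_probe))
    cap_fraction = (1.0 - cos(theta)) / 2.0  # fraction of the sphere occluded
    return int(packing / cap_fraction)

# e.g. a 60 nm analyte decorated by 10 nm probes (radii 30 nm and 5 nm)
n = max_probes(30.0, 5.0)
```

    Smaller probes relative to the analyte raise the bound, which is why the probe/analyte size ratio belongs in any quantitative interpretation of the AFM counts.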

  17. The Charfuel coal refining process

    International Nuclear Information System (INIS)

    Meyer, L.G.

    1991-01-01

    The patented Charfuel coal refining process employs fluidized hydrocracking to produce char and liquid products from virtually all types of volatile-containing coals, including low rank coal and lignite. It is not gasification or liquefaction which require the addition of expensive oxygen or hydrogen or the use of extreme heat or pressure. It is not the German pyrolysis process that merely 'cooks' the coal, producing coke and tar-like liquids. Rather, the Charfuel coal refining process involves thermal hydrocracking which results in the rearrangement of hydrogen within the coal molecule to produce a slate of co-products. In the Charfuel process, pulverized coal is rapidly heated in a reducing atmosphere in the presence of internally generated process hydrogen. This hydrogen rearrangement allows refinement of various ranks of coals to produce a pipeline transportable, slurry-type, environmentally clean boiler fuel and a slate of value-added traditional fuel and chemical feedstock co-products. Using coal and oxygen as the only feedstocks, the Charfuel hydrocracking technology economically removes much of the fuel nitrogen, sulfur, and potential air toxics (such as chlorine, mercury, beryllium, etc.) from the coal, resulting in a high heating value, clean burning fuel which can increase power plant efficiency while reducing operating costs. The paper describes the process, its thermal efficiency, its use in power plants, its pipeline transport, co-products, environmental and energy benefits, and economics

  18. A Potential Approach for Low Flow Selection in Water Resource Supply and Management

    Science.gov (United States)

    Ying Ouyang

    2012-01-01

    Low flow selections are essential to water resource management, water supply planning, and watershed ecosystem restoration. In this study, a new approach, namely the frequent-low (FL) approach (or frequent-low index), was developed based on the minimum frequent-low flow or level used in the minimum flows and/or levels program in northeast Florida, USA. This FL approach was...

  19. An intuitionistic fuzzy optimization approach to the vendor selection problem

    Directory of Open Access Journals (Sweden)

    Prabjot Kaur

    2016-09-01

    Full Text Available Selecting the right vendor is an important business decision made by any organization. The decision involves multiple criteria, and if the objectives vary in preference and scope, the decision becomes multiobjective in nature. In this paper, a vendor selection problem has been formulated as an intuitionistic fuzzy multiobjective optimization in which an appropriate number of vendors is to be selected and orders allocated to them. The multiobjective problem includes three objectives: minimizing the net price, maximizing the quality, and maximizing on-time deliveries, subject to suppliers' constraints. The objective functions and the demand are treated as intuitionistic fuzzy sets. An intuitionistic fuzzy set can handle uncertainty with additional degrees of freedom. The intuitionistic fuzzy optimization (IFO) problem is converted into a crisp linear form and solved using the optimization software Tora. The advantage of IFO is that it gives better results than fuzzy/crisp optimization. The proposed approach is explained by a numerical example.
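    The crisp max-min conversion at the heart of such formulations can be illustrated with a toy calculation. The linear membership/non-membership forms, the hesitation margin, and all vendor numbers below are invented for illustration; the paper's actual formulation is a linear program over order quantities:

```python
def mu(z, best, worst):
    """Linear membership (acceptance): 1 at the best value, 0 at the worst."""
    return max(0.0, min(1.0, (worst - z) / (worst - best)))

def nu(z, best, worst, hesitation=0.1):
    """Linear non-membership (rejection), reduced by a hesitation margin."""
    return max(0.0, min(1.0, (z - best) / (worst - best) - hesitation))

# Objective values (all to be minimized) for three candidate allocations:
# [net price, -quality, -on-time delivery rate] -- invented numbers.
allocations = {"A": [100, -0.90, -0.85],
               "B": [120, -0.97, -0.90],
               "C": [90, -0.80, -0.70]}
bounds = [(90, 120), (-0.97, -0.80), (-0.90, -0.70)]  # (best, worst) per objective

def score(vals):
    """Worst-case (membership - non-membership) across all objectives."""
    return min(mu(z, b, w) - nu(z, b, w) for z, (b, w) in zip(vals, bounds))

best_vendor = max(allocations, key=lambda k: score(allocations[k]))
```

    Vendor A wins here because the max-min criterion rewards balanced performance: B and C are each best on some objectives but worst on another, which drives their worst-case score down.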

  20. Asian oil refining. Demand growth and deregulation - an uncertain future

    International Nuclear Information System (INIS)

    Sameer Nawaz.

    1996-01-01

    The objective of this report is to identify the most important features of the oil refining industry in Asia. Major developments in consumption patterns and changes in the regional importance of countries are discussed, highlighting potential future developments. The first chapter introduces the various refining processes and presents a simple model for the analysis of complex refineries. Chapter 2 examines the development of the Asian refining industry against a background of economic growth and analyses trends in consumption of all products in Asian countries. In Chapter 3, the key issues concerning the refining industry are examined, among them the forces driving consumption, including the importance of economic development and of electricity and transport demand. The importance of product imports and international trade is discussed, and the extent of government involvement and the effects of changing retail and market prices are analysed. Chapter 4 looks at the strategies that oil and gas companies are following in the Asian refining industry. Particular significance is attached to the vertical integration of the oil majors and of Japanese and Middle Eastern oil companies. A brief overview of the importance of the petrochemical industry is presented. The Asian countries involved in the refining industry are profiled in Chapter 5, and the future trend in oil consumption is examined in Chapter 6. There follows a brief discussion of plans to expand crude refining capacity in the various countries and a forecast of the resulting overcapacity. In the final chapter, brief profiles of some of the most important companies in the Asian refining industry are presented, discussing their major activities and future plans. (Author)

  1. Comparison of climate envelope models developed using expert-selected variables versus statistical selection

    Science.gov (United States)

    Brandt, Laura A.; Benscoter, Allison; Harvey, Rebecca G.; Speroterra, Carolina; Bucklin, David N.; Romañach, Stephanie; Watling, James I.; Mazzotti, Frank J.

    2017-01-01

    Climate envelope models are widely used to describe the potential future distribution of species under different climate change scenarios. It is broadly recognized that there are both strengths and limitations to using climate envelope models and that outcomes are sensitive to initial assumptions, inputs, and modeling methods. Selection of predictor variables, a central step in modeling, is one of the areas where different techniques can yield varying results. Selection of climate variables to use as predictors is often done using statistical approaches that develop correlations between occurrences and climate data. These approaches have received criticism in that they rely on the statistical properties of the data rather than directly incorporating biological information about species responses to temperature and precipitation. We evaluated and compared models and prediction maps for 15 threatened or endangered species in Florida based on two variable selection techniques: expert opinion and a statistical method. We compared model performance between the two approaches for contemporary predictions, along with the spatial correlation, spatial overlap, and area predicted for contemporary and future climate predictions. In general, experts identified more variables as being important than the statistical method, and there was low overlap between the variable sets. Model performance was nonetheless high for both approaches (>0.9 for area under the curve (AUC) and >0.7 for true skill statistic (TSS)). Spatial overlap, which compares the spatial configuration of maps constructed using the different variable selection techniques, was only moderate overall (about 60%), with a great deal of variability across species. The difference in spatial overlap was even greater under future climate projections, indicating additional divergence of model outputs from different variable selection techniques. Our work is in agreement with other studies which have found that for broad-scale species distribution modeling, using statistical methods of variable
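    Spatial overlap between two binary prediction maps can be defined in several ways; the sketch below uses one common definition (cells predicted suitable by both models, relative to the smaller predicted area) on toy data, since the record does not give the exact formula used:

```python
def spatial_overlap(map_a, map_b):
    """Cells predicted suitable by BOTH models, divided by the smaller
    predicted area (one common overlap definition among several)."""
    both = sum(a and b for a, b in zip(map_a, map_b))
    return both / min(sum(map_a), sum(map_b))

expert_map      = [1, 1, 1, 0, 0, 1, 0, 1]  # toy binary suitability maps
statistical_map = [1, 1, 0, 0, 1, 1, 0, 0]
overlap = spatial_overlap(expert_map, statistical_map)
```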

  2. Grain refinement of permanent mold cast copper base alloys. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Sadayappan, M.; Thomson, J. P.; Elboujdaini, M.; Gu, G. Ping; Sahoo, M.

    2004-04-29

    Grain refinement behavior of copper alloys cast in permanent molds was investigated. This is one of the least studied subjects in copper alloy casting. Grain refinement is not widely practiced for leaded copper alloys cast in sand molds. Aluminum bronzes and high-strength yellow brasses, cast in sand and permanent molds, are usually fine-grained due to the presence of more than 2% iron. Grain refinement of the most common permanent mold casting alloys, leaded yellow brass and its lead-free replacement EnviroBrass III, is not universally accepted due to the perceived problem of hard spots in finished castings, and for the same reason these alloys contain very low amounts of iron. The yellow brasses and Cu-Si alloys are gaining popularity in North America due to their low lead content and amenability to permanent mold casting. These alloys are prone to hot tearing in permanent mold casting, and grain refinement is one of the solutions for reducing this problem. However, to use this technique it is necessary to understand the mechanism of grain refinement and the other issues involved in the process. The following issues were studied during this three-year project funded by the US Department of Energy and the copper casting industry: (1) the effect of alloying additions on the grain size of Cu-Zn alloys and their interaction with grain refiners; (2) the effect of two grain-refining elements, boron and zirconium, on the grain size of four copper alloys (yellow brass, EnviroBrass II, silicon brass, and silicon bronze) and the duration of their effect (fading); (3) prediction of grain refinement using cooling curve analysis and use of this method as an on-line quality control tool; (4) hard spot formation in yellow brass and EnviroBrass due to grain refinement; (5) corrosion resistance of the grain-refined alloys; and (6) transfer of the technology to permanent mold casting foundries. It was found that alloying elements such as tin and zinc do not change the grain size of Cu-Zn alloys

  3. The present state of refining in France

    International Nuclear Information System (INIS)

    1996-01-01

    The European refining industry suffers from production over-capacity, and closures are inevitable; the situation is even worse in France due to the imbalance between gas oil and gasoline prices and the weak margins for distributors. The French refining industry is nevertheless an important and essential link in the country's strategic fuel and petroleum product supply, and represents 17,000 jobs. Several measures have been introduced by the French Industry department aimed at restructuring, capacity reduction, and fuel price harmonization

  4. A new process of electron beam refining of niobium

    International Nuclear Information System (INIS)

    Pinatti, D.G.

    1981-01-01

    A review of the thermodynamic equilibrium, kinetic theory, and experimental results of the metal-gas interaction in refractory metals is presented. N2, H2, and CO absorption and desorption take place by a reversible process, while O2 uptake takes place by an irreversible process with atom absorption and metal oxide desorption. A new technology for electron beam refining of niobium is proposed based on four points: 1) preparation of the aluminothermically reduced electrode, 2) zone refining in the first melt, 3) kinetic theory of refining in the following melts, and 4) design of a compact furnace. Experimental results in a 300 kW pilot plant have shown complete agreement with the proposed technology, yielding a productivity 2.4 times larger than the value predicted by the conventional technology of electron beam refining of niobium. (Author) [pt

  5. European refining: evolution or revolution?

    International Nuclear Information System (INIS)

    Cuthbert, N.

    1999-01-01

    A recent detailed analysis of the refining business in Europe (by Purvin and Gurtz) was used to highlight some key issues facing the industry. The article was written under five sub-sections: (i) economic environment (assessment of the economic prospects for Europe), (ii) energy efficiency and global warming (lists the four points of the EU car makers' voluntary agreement), (iii) fuel quality and refinery investment (iv) refinery capacity and utilisation and (v) industry structure and development. Diagrams show GDP per capita for East and West, European road fuel demand to 2015 and European net trade and European refinery ownership by crude capacity. It was concluded that the future of refining in Europe is 'exciting and challenging' and there are likely to be more large joint venture refineries. (UK)

  6. An Examination of HR Strategic Recruitment and Selection Approaches in China

    OpenAIRE

    Zhou, Guozhen

    2006-01-01

    Abstract In the past two decades, the manner in which organisations in the People's Republic of China (PRC) managed their human resources has changed dramatically (Braun and Warner, 2002). As the economy grows and moves into higher value-added work, strategic recruitment and selection are vital to an organisation's success. This dissertation seeks to examine the recruitment and selection strategy approaches in China. This research is based on 15 well-known firms, of which 11 are multinati...

  7. A genetic algorithm-based framework for wavelength selection on sample categorization.

    Science.gov (United States)

    Anzanello, Michel J; Yamashita, Gabrielli; Marcelo, Marcelo; Fogliatto, Flávio S; Ortiz, Rafael S; Mariotti, Kristiane; Ferrão, Marco F

    2017-08-01

    In forensic and pharmaceutical scenarios, the application of chemometrics and optimization techniques has unveiled common and peculiar features of seized medicine and drug samples, helping investigative forces to track illegal operations. This paper proposes a novel framework aimed at identifying relevant subsets of attenuated total reflectance Fourier transform infrared (ATR-FTIR) wavelengths for classifying samples into two classes, for example authentic or forged categories in the case of medicines, or salt or base form in cocaine analysis. In the first step of the framework, the ATR-FTIR spectra were partitioned into equidistant intervals and the k-nearest neighbour (KNN) classification technique was applied to each interval to assign samples to the proper classes. In the next step, the selected intervals were refined through a genetic algorithm (GA), which identifies a limited number of wavelengths from the previously selected intervals with the aim of maximizing classification accuracy. When applied to Cialis®, Viagra®, and cocaine ATR-FTIR datasets, the proposed method substantially decreased the number of wavelengths needed for categorization and increased the classification accuracy. From a practical perspective, the proposed method provides investigative forces with valuable information for monitoring the illegal production of drugs and medicines. In addition, focusing on a reduced subset of wavelengths allows the development of portable devices capable of testing the authenticity of samples during police checks, avoiding the need for later laboratory analyses and reducing equipment expenses. Theoretically, the proposed GA-based approach yields more refined solutions than current methods relying on interval approaches, which tend to insert irrelevant wavelengths in the retained intervals. Copyright © 2016 John Wiley & Sons, Ltd.
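    A minimal sketch of the GA wavelength-selection idea, assuming a bit-mask chromosome over wavelengths, leave-one-out 1-NN accuracy as the fitness, and entirely synthetic "spectra" (the real framework first narrows the search to selected ATR-FTIR intervals):

```python
import random

random.seed(0)

# Synthetic "spectra": 12 samples x 8 wavelengths, class in {0, 1}.
# Only wavelengths 2 and 5 carry class information.
def make_sample(cls):
    x = [random.random() for _ in range(8)]
    x[2] += 3.0 * cls
    x[5] -= 3.0 * cls
    return x, cls

data = [make_sample(c) for c in (0, 1) for _ in range(6)]

def loo_accuracy(mask):
    """Leave-one-out 1-NN accuracy using only wavelengths where mask[i] == 1."""
    idx = [i for i, m in enumerate(mask) if m]
    if not idx:
        return 0.0
    hits = 0
    for j, (xj, cj) in enumerate(data):
        nn = min((s for k, s in enumerate(data) if k != j),
                 key=lambda s: sum((s[0][i] - xj[i]) ** 2 for i in idx))
        hits += nn[1] == cj
    return hits / len(data)

def fitness(mask):  # maximize accuracy first, then prefer fewer wavelengths
    return (loo_accuracy(mask), -sum(mask))

pop = [[random.randint(0, 1) for _ in range(8)] for _ in range(20)]
for _ in range(30):  # generations: elitist selection, crossover, mutation
    pop.sort(key=fitness, reverse=True)
    parents, children = pop[:10], []
    for _ in range(10):
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, 8)
        child = a[:cut] + b[cut:]
        if random.random() < 0.3:
            child[random.randrange(8)] ^= 1  # flip one wavelength bit
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)
```

    The secondary fitness term (fewer wavelengths at equal accuracy) mirrors the paper's motivation: a compact wavelength subset is what makes portable field devices practical.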

  8. Uranium refining process using ion exchange membrane

    International Nuclear Information System (INIS)

    Yamaguchi, Akira

    1977-01-01

    In the uranium ore refining method currently used in Europe and America, uranium ore is roughly refined at the mine site into yellow cake, which is then transported to refineries and refined by a dry method. This method has the following drawbacks: it involves many process steps, it requires expensive corrosion-resistant materials because of the high-temperature treatment, and the impurities in the uranium tend to increase. In the EXCER method, on the other hand, treatment is carried out at low temperature and high-purity uranium can be obtained, but the efficiency of the electrolytic reduction process is extremely low, making it economically infeasible. In the wet refining method called the PNC process, uranium tetrafluoride is produced from uranium ore without making yellow cake; the process is therefore greatly rationalized and highly economical. The electrolytic reduction process in this method was developed by Asahi Chemical Industry Co., Ltd., which constructed the pilot plant in the Ningyotoge Mine. The ion exchange membrane, the electrodes, and the problems concerning the process and the engineering of commercial plants were investigated. The electrolytic reduction process, the pilot plant, the development of the elements of the electrolytic cells, the establishment of the analytical process, the measurement of the electrolytic characteristics, the demonstration operation, and the lifetime of the electrolytic diaphragm are reported. (Kako, I.)

  9. Evaluation of unrestrained replica-exchange simulations using dynamic walkers in temperature space for protein structure refinement.

    Directory of Open Access Journals (Sweden)

    Mark A Olson

    Full Text Available A central problem of computational structural biology is the refinement of modeled protein structures taken from either comparative modeling or knowledge-based methods. Simulations are commonly used to achieve higher resolution of the structures at the all-atom level, yet methodologies that consistently yield accurate results remain elusive. In this work, we provide an assessment of an adaptive temperature-based replica exchange simulation method in which the temperature clients dynamically walk in temperature space to enrich their population and exchange rates near steep energetic barriers. This approach is compared with earlier work applying the conventional method of static temperature clients to refine a dataset of conformational decoys. Our results show that, while an adaptive method has many theoretical advantages over a static distribution of client temperatures, only limited improvement was gained from this strategy in excursions into the downhill refinement regime leading to an increase in the fraction of native contacts. To illustrate the sampling differences between the two simulation methods, energy landscapes are presented along with their temperature client profiles.
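    Both the static and adaptive variants rest on the same temperature replica-exchange (parallel tempering) swap rule: neighbouring replicas exchange configurations with Metropolis probability min(1, exp[(1/kT_i − 1/kT_j)(E_i − E_j)]). A minimal sketch (k set to 1, energies in arbitrary units):

```python
import math, random

random.seed(1)

def swap_probability(E_i, T_i, E_j, T_j):
    """Metropolis acceptance probability for exchanging the configurations
    of two replicas at temperatures T_i and T_j (Boltzmann constant = 1)."""
    delta = (1.0 / T_i - 1.0 / T_j) * (E_i - E_j)
    return min(1.0, math.exp(delta))

def attempt_swap(replicas):
    """replicas: list of [energy, temperature]; try one neighbour exchange."""
    i = random.randrange(len(replicas) - 1)
    (E1, T1), (E2, T2) = replicas[i], replicas[i + 1]
    if random.random() < swap_probability(E1, T1, E2, T2):
        replicas[i][0], replicas[i + 1][0] = E2, E1  # configurations swap
    return replicas
```

    The adaptive method in the record changes which temperatures the clients occupy over time, not this acceptance rule itself.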

  10. Method of preparing an Al-Ti-B grain refiner for aluminium-comprising products, and a method of casting aluminium products

    NARCIS (Netherlands)

    Brinkman, H.J.; Duszczyk, J.; Katgerman, L.

    1999-01-01

    The invention relates to a method of preparing an Al-Ti-B grain refiner for cast aluminium-comprising products. According to the invention the preparation is realized by mixing powders selected from the group comprising aluminium, titanium, boron, and alloys and intermetallic compounds thereof,

  11. Russian refining - an industry in transition

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, E [CentreInvest, Moscow (Russian Federation)

    1999-02-01

    In the old Soviet Union (now called the CIS), the refining industry is undergoing much modernisation, although the process is far from complete. Eventually, the CIS is expected to have a market-responsive competitive refining business. The expected transformation is discussed according to a five-stage plan. The stages are (i) the change from horizontally integrated entity to vertically integrated global concerns, (ii) the change from over-manned dinosaurs to modern efficient businesses, (iii) the move towards smaller, more advanced market-orientated processes, (iv) improving the transport and storage infrastructures and (v) improving accountability and profitability. The predictions for 2005 onwards are for sustained profitability. (UK)

  12. Variability-Specific Abstraction Refinement for Family-Based Model Checking

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar; Wasowski, Andrzej

    2017-01-01

    and property, while the number of possible scenarios is very large. In this work, we present an automatic iterative abstraction refinement procedure for family-based model checking. We use Craig interpolation to refine abstract variational models based on the obtained spurious counterexamples (traces...

  13. Zone refining of sintered, microwave-derived YBCO superconductors

    International Nuclear Information System (INIS)

    Warrier, K.G.K.; Varma, H.K.; Mani, T.V.; Damodaran, A.D.; Balachandran, U.

    1993-07-01

    Post-sintering treatments such as zone melting under a thermal gradient have been conducted on sintered YBCO tape-cast films. YBCO precursor powder was derived through decomposition of a mixture of nitrates of the cations in a microwave oven for ∼4 min. The resulting powder was characterized and made into thin sheets by tape casting, then sintered at 945 °C for 5 h. The sintered tapes were subjected to repeated zone refining operations at relatively high speeds of ∼30 mm/h. A microstructure having uniformly oriented grains in the a-b plane throughout the bulk of the sample was obtained after three repeated zone refining operations. Details of the precursor preparation, microwave processing and its advantages, zone refining conditions, and microstructural features are presented in this paper.
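    For context, the purification side of zone refining is classically summarized by Pfann's single-pass relation C(x) = C0[1 − (1 − k)e^(−kx/l)], with k the effective distribution coefficient and l the molten zone length. The sketch below evaluates it for an assumed impurity; note that this particular record uses zone melting for grain orientation rather than purification:

```python
from math import exp

def solute_profile(x, C0, k, zone_len):
    """Pfann's single-pass zone-refining relation
    C(x) = C0 * (1 - (1 - k) * exp(-k * x / l), for distribution coefficient k < 1
    the impurity is swept toward the far end of the bar."""
    return C0 * (1.0 - (1.0 - k) * exp(-k * x / zone_len))

# assumed impurity: k = 0.1, initial concentration 100 ppm, 10 mm molten zone
profile = [solute_profile(x, 100.0, 0.1, 10.0) for x in (0.0, 10.0, 50.0, 100.0)]
```

    At x = 0 the concentration drops to k·C0 (10 ppm here) and rises monotonically along the bar, which is why repeated passes concentrate impurities at one end.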

  14. Relay discovery and selection for large-scale P2P streaming.

    Directory of Open Access Journals (Sweden)

    Chengwei Zhang

    Full Text Available In peer-to-peer networks, application relays have commonly been used to provide various networking services. The service performance often improves significantly if a relay is selected appropriately based on its network location. In this paper, we study the location-aware relay discovery and selection problem for large-scale P2P streaming networks. In these large-scale and dynamic overlays, it incurs significant communication and computation cost to discover a sufficiently large relay candidate set and then to select one relay with good performance. The network location can be measured directly or indirectly, with tradeoffs among timeliness, overhead, and accuracy. Based on a measurement study and the associated error analysis, we demonstrate that indirect measurements, such as King and Internet Coordinate Systems (ICS), can only achieve a coarse estimation of peers' network location, and that methods based on purely indirect measurements cannot lead to a good relay selection. We also demonstrate, using three publicly available RTT data sets, that there is significant error amplification in the commonly used "best-out-of-K" selection methodology. We propose a two-phase approach to achieve efficient relay discovery and accurate relay selection. Indirect measurements are used to narrow down a small number of high-quality relay candidates, and the final relay selection is refined based on direct probing. This two-phase approach enjoys an efficient implementation using a Distributed Hash Table (DHT). When the DHT is constructed, the node keys carry the location information; they are generated scalably using indirect measurements, such as the ICS coordinates. Relay discovery is then achieved efficiently using DHT-based search. We evaluated various aspects of this DHT-based approach, including the DHT indexing procedure, key generation under peer churn, and message costs.
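    The two-phase idea can be sketched compactly: a cheap coordinate-based estimate (standing in for ICS/King) narrows the candidate set, then direct probes pick the final relay. The coordinates and the probe model below are synthetic stand-ins, not measurements:

```python
import random

random.seed(7)

# 200 peers with synthetic 2-D network coordinates (stand-in for ICS/King).
peers = {"peer%d" % i: (random.uniform(0, 100), random.uniform(0, 100))
         for i in range(200)}
me = (50.0, 50.0)

def estimated_rtt(a, b):
    """Phase 1: cheap indirect estimate from coordinates (coarse, error-prone)."""
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def probed_rtt(a, b):
    """Phase 2: direct probe, modelled here as the estimate plus noise."""
    return estimated_rtt(a, b) + random.uniform(0.0, 5.0)

K = 10  # phase 1: narrow 200 peers down to K candidates by indirect estimate
candidates = sorted(peers, key=lambda p: estimated_rtt(me, peers[p]))[:K]
# phase 2: probe only those K directly and select the final relay
relay = min(candidates, key=lambda p: probed_rtt(me, peers[p]))
```

    Probing only K candidates instead of all peers is what keeps the direct-measurement cost bounded while avoiding the error amplification of selecting purely from indirect estimates.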

  15. Accounting for selection bias in species distribution models: An econometric approach on forested trees based on structural modeling

    Science.gov (United States)

    Ay, Jean-Sauveur; Guillemot, Joannès; Martin-StPaul, Nicolas K.; Doyen, Luc; Leadley, Paul

    2015-04-01

    Species distribution models (SDMs) are widely used to study and predict the outcome of global change on species. In human-dominated ecosystems, the presence of a given species is the result of both its ecological suitability and the human footprint on nature, such as land use choices. Land use choices may thus be responsible for a selection bias in the presence/absence data used in SDM calibration. We present a structural modelling approach (i.e. based on structural equation modelling) that accounts for this selection bias. The new structural species distribution model (SSDM) estimates land use choices and species responses to bioclimatic variables simultaneously. A land use equation based on an econometric model of landowner choices was joined to an equation of species response to bioclimatic variables. The SSDM allows the residuals of both equations to be dependent, taking into account the possibility of shared omitted variables and measurement errors. We provide a general description of the statistical theory and a set of applications to forested trees over France using databases of climate and forest inventory at different spatial resolutions (from 2 km to 8 km). We also compared the output of the SSDM with the outputs of a classical SDM in terms of bioclimatic response curves and potential distribution under the current climate. Depending on the species and the spatial resolution of the calibration dataset, the shapes of the bioclimatic response curves and the modelled species distribution maps differed markedly between the SSDM and classical SDMs. The magnitude and direction of these differences depended on the correlations between the errors from both equations and were highest at higher spatial resolutions. A first conclusion is that the use of classical SDMs can potentially lead to strong misestimation of the actual and future probability of presence modelled. Beyond this selection bias, the SSDM we propose represents a crucial step towards accounting for economic constraints on tree

  16. The Refinement as the Moral Precondition for the Man’s Intellectual Creative Activity (the Historical Aspect

    Directory of Open Access Journals (Sweden)

    A. S. Frants

    2012-01-01

    Full Text Available The paper is devoted to the ethical and cultural facilitation of intellectual creative activities. The methodological basis of the research comprises the differentiated analysis of Russian moral culture and the axiological analysis of its educational potential. The authors describe the specifics of the system of aristocratic moral qualities and refinement, observing their main characteristics. The novelty of the approach involves the understanding of the aristocratic moral values as a necessary condition for productive intellectual and creative activity. The authors investigate the historic origin of the aristocratic moral values, and define the functions and specifics of the Russian type of aristocratic culture; the objective and subjective conditions of its formation are highlighted, as well as the integrity of the refinement inherent in people engaged in intellectual and creative activities. The authors believe that revival of the refinement, as one of the aspects of the Russian moral culture, depends on both the development of our own nation and the world society as a whole. Nowadays, when the postindustrial society is giving way to the informational one, the production of information takes the leading part in social life. The information and knowledge, being its unified products, provide new ways for evolving of the phenomenon of refinement. Its pedagogic potential should be implemented in the process of education and upbringing.

  17. Connected Vehicle Pilot Deployment Program Independent Evaluation: Mobility, Environmental, and Public Agency Efficiency Refined Evaluation Plan - New York City

    Science.gov (United States)

    2018-03-01

    The purpose of this report is to provide a refined evaluation plan detailing the approach to be used by the Texas A&M Transportation Institute Connected Vehicle Pilot Deployment Evaluation Team for evaluating the mobility, environmental, and public a...

  18. Oil refining and product marketing developments in southeast Asia

    International Nuclear Information System (INIS)

    Szabo, A.M.

    1992-01-01

    Views on the future are based on supplies from a relatively stable Middle East and continued economic growth in the southeast Asian and Pacific countries. Under these circumstances the oil market for the Association of Southeast Asian Nations (ASEAN) will expand considerably during the decade of the 90's. Pacific country demand of 5.92 MMB/D in 1990 is likely to grow to 7.06 MMB/D in 2000. Regional production could supply about 40% of this. The Asia-Pacific shortage of refining capacity could lead to high regional refined product prices and healthy refining profit margins. (author)

  19. Electromyogram refinement using muscle synergy based regulation of uncertain information.

    Science.gov (United States)

    Min, Kyuengbo; Shin, Duk; Lee, Jongho; Kakei, Shinji

    2018-04-27

    Electromyogram (EMG) signal measurement frequently suffers from uncertainty attributable to technical constraints such as cross talk and maximum voluntary contraction. Due to these problems, individual EMGs exhibit uncertainty in representing their corresponding muscle activations. To regulate this uncertainty, we proposed an EMG refinement method, which refines EMGs by regulating the redundant contributions of the signals to the approximated torques through EMG-driven torque estimation (EDTE) using a musculoskeletal forward dynamics model. To regulate this redundancy, we must consider the synergistic contributions of muscles, including "unmeasured" muscles, to the approximated torques, which are the primary cause of redundancy in EDTE. To suppress this redundancy, we used the concept of muscle synergy, a key concept in analyzing the neurophysiological regulation of the redundant contributions of muscles to exerted torques. Based on this concept, we designed a muscle-synergy-based EDTE as a framework for EMG refinement, which regulates the abovementioned uncertainty of individual EMGs while accounting for unmeasured muscles. In achieving the proposed EMG refinement, the most important consideration is to suppress large changes, such as overestimation, attributable to enhancing the contribution of particular muscles to the estimated torques. Therefore, it is reasonable to refine EMGs by minimizing the change in EMGs. To evaluate this model, we used a Bland-Altman plot, which quantitatively evaluates the proportional bias of the refined signals relative to the original EMGs. Through this evaluation, we showed that the proposed EDTE minimizes the bias while approximating torques. This minimization therefore optimally regulates the uncertainty of EMGs and leads to optimal EMG refinement. Copyright © 2018 Elsevier Ltd. All rights reserved.
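The muscle-synergy concept invoked above is commonly operationalized in the literature via non-negative matrix factorization (NMF). The following is a generic numpy sketch of that technique, not the authors' EDTE implementation: a synthetic muscles-by-time envelope matrix is factored into non-negative synergies and activations with Lee-Seung multiplicative updates.

```python
import numpy as np

def nmf(V, k, iters=500, seed=0):
    """Factor V ≈ W @ H with non-negative W (synergies) and H (activations),
    using Lee-Seung multiplicative updates."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + 1e-3
    H = rng.random((k, n)) + 1e-3
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-12)
    return W, H

# Synthetic "EMG" envelope matrix (8 muscles x 300 samples) built from 2 synergies.
rng = np.random.default_rng(1)
W_true = rng.random((8, 2))
H_true = rng.random((2, 300))
V = W_true @ H_true

W, H = nmf(V, k=2)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(f"relative reconstruction error: {err:.4f}")
```

Because the synthetic data are exactly rank-2 and non-negative, the factorization recovers the signal to small relative error; real EMG envelopes would leave a residual.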

  20. Decontamination of transuranic contaminated metals by melt refining

    International Nuclear Information System (INIS)

    Heshmatpour, B.; Copeland, G.L.; Heestand, R.L.

    1983-01-01

    Melt refining of transuranic contaminated metals is a possible decontamination process with the potential advantages of producing metal for reuse and of simplifying chemical analyses. By routinely achieving the 10 nCi/g (about 0.1 ppm) level by melt refining, scrap metal can be removed from the transuranic waste category. To demonstrate the effectiveness of this melt refining process, mild steel, stainless steel, nickel, and copper were contaminated with 500 ppm (μg/g) PuO2 and melted with various fluxes. The solidified slags and metals were analyzed for their plutonium contents, and corresponding partition ratios for plutonium were calculated. Some metals were double refined in order to study the effect of secondary slag treatment. The initial weight of the slags was also varied to investigate the effect of slag weight on the degree of plutonium removal. In general, all four metals could be decontaminated below 1 ppm (μg/g) Pu (about 100 nCi/g) by a single slag treatment. Doubling the slag weight did not improve decontamination significantly; however, double slag treatment using 5 wt.% slag did decontaminate the metals to below 0.1 ppm (μg/g) Pu (10 nCi/g).

  1. Role of manganese on the grain refining efficiency of AZ91D magnesium alloy refined by Al4C3

    International Nuclear Information System (INIS)

    Liu Shengfa; Zhang Yuan; Han Hui

    2010-01-01

    A novel Mg-50% Al4C3 (hereafter in wt.%) master alloy has been developed by a powder in-situ synthesis process, and the role of manganese in the grain refining efficiency of AZ91D magnesium alloy refined by this master alloy has been investigated. X-ray diffraction (XRD) and energy dispersive X-ray spectroscopy (EDS) results show the existence of Al4C3 particles in this master alloy. After addition of 0.6% Al4C3, or combined addition of 0.6% Al4C3 and 0.27% Mn, the average grain size of AZ91D decreased dramatically from 360 μm to 210 μm, and from 360 μm to 130 μm, respectively. However, no further refinement of grain size was achieved when the amount of Mn added exceeded 0.27% for AZ91D alloy refined by 0.6% Al4C3 in the present investigation. Al-C-O-Mn-Fe-rich intermetallic particles with an Al-C-O-rich coating film, often observed in the central region of magnesium grains of the AZ91D alloy treated by the combination of Al4C3 and Mn, are proposed to be the potent nucleating substrates for primary α-Mg.

  2. Enhancement of signal sensitivity in a heterogeneous neural network refined from synaptic plasticity

    Energy Technology Data Exchange (ETDEWEB)

    Li Xiumin; Small, Michael, E-mail: ensmall@polyu.edu.h, E-mail: 07901216r@eie.polyu.edu.h [Department of Electronic and Information Engineering, Hong Kong Polytechnic University, Hung Hom, Kowloon (Hong Kong)

    2010-08-15

    Long-term synaptic plasticity induced by neural activity is of great importance in the formation of neural connectivity and the development of the nervous system. It is reasonable to consider self-organized neural networks instead of prior imposition of a specific topology. In this paper, we propose a novel network evolved from two stages of the learning process, which are respectively guided by two experimentally observed synaptic plasticity rules, i.e. the spike-timing-dependent plasticity (STDP) mechanism and the burst-timing-dependent plasticity (BTDP) mechanism. Due to the existence of heterogeneity in neurons that exhibit different degrees of excitability, a two-level hierarchical structure is obtained after the synaptic refinement. This self-organized network shows higher sensitivity to afferent current injection compared with alternative archetypal networks with different neural connectivity. Statistical analysis also demonstrates that it has the small-world properties of small shortest path length and high clustering coefficients. Thus the selectively refined connectivity enhances the ability of neuronal communications and improves the efficiency of signal transmission in the network.

  3. Enhancement of signal sensitivity in a heterogeneous neural network refined from synaptic plasticity

    International Nuclear Information System (INIS)

    Li Xiumin; Small, Michael

    2010-01-01

    Long-term synaptic plasticity induced by neural activity is of great importance in the formation of neural connectivity and the development of the nervous system. It is reasonable to consider self-organized neural networks instead of prior imposition of a specific topology. In this paper, we propose a novel network evolved from two stages of the learning process, which are respectively guided by two experimentally observed synaptic plasticity rules, i.e. the spike-timing-dependent plasticity (STDP) mechanism and the burst-timing-dependent plasticity (BTDP) mechanism. Due to the existence of heterogeneity in neurons that exhibit different degrees of excitability, a two-level hierarchical structure is obtained after the synaptic refinement. This self-organized network shows higher sensitivity to afferent current injection compared with alternative archetypal networks with different neural connectivity. Statistical analysis also demonstrates that it has the small-world properties of small shortest path length and high clustering coefficients. Thus the selectively refined connectivity enhances the ability of neuronal communications and improves the efficiency of signal transmission in the network.

  4. A semiparametric graphical modelling approach for large-scale equity selection.

    Science.gov (United States)

    Liu, Han; Mulvey, John; Zhao, Tianqi

    2016-01-01

    We propose a new stock selection strategy that exploits rebalancing returns and improves portfolio performance. To effectively harvest rebalancing gains, we apply ideas from elliptical-copula graphical modelling and stability inference to select stocks that are as independent as possible. The proposed elliptical-copula graphical model has a latent Gaussian representation; its structure can be effectively inferred using the regularized rank-based estimators. The resulting algorithm is computationally efficient and scales to large data-sets. To show the efficacy of the proposed method, we apply it to conduct equity selection based on a 16-year health care stock data-set and a large 34-year stock data-set. Empirical tests show that the proposed method is superior to alternative strategies including a principal component analysis-based approach and the classical Markowitz strategy based on the traditional buy-and-hold assumption.
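The core idea of selecting stocks "as independent as possible" using rank-based dependence measures can be caricatured with a simple greedy rule on Spearman rank correlations. This is an illustrative sketch only, not the paper's regularized elliptical-copula estimator; all function names here are hypothetical.

```python
import numpy as np

def spearman_corr(X):
    """Spearman rank correlation matrix of the columns of X (observations x assets)."""
    ranks = X.argsort(axis=0).argsort(axis=0).astype(float)
    return np.corrcoef(ranks, rowvar=False)

def select_independent(X, k):
    """Greedily pick k assets, each minimising the max |rank correlation|
    with the already-selected set."""
    R = np.abs(spearman_corr(X))
    n = R.shape[0]
    chosen = [0]  # seed with the first asset
    while len(chosen) < k:
        rest = [j for j in range(n) if j not in chosen]
        j_best = min(rest, key=lambda j: R[j, chosen].max())
        chosen.append(j_best)
    return chosen

# Synthetic returns: 4 assets driven by one common factor, 3 independent assets.
rng = np.random.default_rng(2)
common = rng.normal(size=(500, 1))
block = common + 0.1 * rng.normal(size=(500, 4))
indep = rng.normal(size=(500, 3))
X = np.hstack([block, indep])

sel = select_independent(X, k=3)
print(sel)
```

Starting from an asset in the correlated block, the greedy rule avoids the rest of the block and picks the independent assets (columns 4-6), which is the behaviour the rebalancing-gains argument relies on.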

  5. Modeling of the inhomogeneity of grain refinement during combined metal forming process by finite element and cellular automata methods

    Energy Technology Data Exchange (ETDEWEB)

    Majta, Janusz; Madej, Łukasz; Svyetlichnyy, Dmytro S.; Perzyński, Konrad; Kwiecień, Marcin, E-mail: mkwiecie@agh.edu.pl; Muszka, Krzysztof

    2016-08-01

    The potential of the discrete cellular automata technique to predict grain refinement in wires produced using a combined metal forming process is presented and discussed within the paper. The developed combined metal forming process can be treated as one of the Severe Plastic Deformation (SPD) techniques that consists of three different modes of deformation: asymmetric drawing with bending, namely accumulated angular drawing (AAD), wire drawing (WD) and wire flattening (WF). To accurately replicate the complex stress state both at macro and micro scales during subsequent deformations, a two-stage modeling approach was used. First, the Finite Element Method (FEM), implemented in commercial ABAQUS software, was applied to simulate the entire combined forming process at the macro scale level. Then, based on the FEM results, the Cellular Automata (CA) method was applied for simulation of grain refinement at the microstructure level. Data transferred between the FEM and CA methods included a set of files with strain tensor components obtained from selected integration points in the macro scale model. As a result of the CA simulation, detailed information on microstructure evolution under severe plastic deformation conditions was obtained, namely: changes of shape and sizes of the modeled representative volume with imposed microstructure, changes of the number of grains, subgrains and dislocation cells, development of the grain boundary angle distribution as well as changes in the pole figures. To evaluate the CA model's predictive capabilities, results of the computer simulation were compared with scanning electron microscopy and electron back scattered diffraction (SEM/EBSD) studies of samples after the AAD+WD+WF process.

  6. Projections of the impact of expansion of domestic heavy oil production on the U.S. refining industry from 1990 to 2010. Topical report

    Energy Technology Data Exchange (ETDEWEB)

    Olsen, D.K.; Ramzel, E.B.; Strycker, A.R. [National Institute for Petroleum and Energy Research, Bartlesville, OK (United States). ITT Research Institute; Guariguata, G.; Salmen, F.G. [Bonner and Moore Management Science, Houston, TX (United States)

    1994-12-01

    This report is one of a series of publications assessing the feasibility of increasing domestic heavy oil (10° to 20° API gravity) production. This report provides a compendium of the United States refining industry and analyzes the industry by Petroleum Administration for Defense District (PADD) and by ten smaller refining areas. The refining capacity, oil source and oil quality are analyzed, and projections are made for the U.S. refining industry for the years 1990 to 2010. The study used publicly available data as background. A linear program model of the U.S. refining industry was constructed and validated using 1990 U.S. refinery performance. Projections of domestic oil production (decline) and import of crude oil (increases) were balanced to meet anticipated demand to establish a base case for years 1990 through 2010. The impact of additional domestic heavy oil production (300 MB/D to 900 MB/D, originating in select areas of the U.S.) on the U.S. refining complex was evaluated. This heavy oil could reduce the import rate and the balance of payments by displacing some imported, principally Mid-east, medium crude. The construction cost for refining units to accommodate this additional domestic heavy oil production in both the low and high volume scenarios is about 7 billion dollars for bottoms conversion capacity (delayed coking), with about 50% of the cost attributed to compliance with the Clean Air Act Amendment of 1990.

  7. 40 CFR 80.555 - What provisions are available to a large refiner that acquires a small refiner or one or more of...

    Science.gov (United States)

    2010-07-01

    ... FUELS AND FUEL ADDITIVES Motor Vehicle Diesel Fuel; Nonroad, Locomotive, and Marine Diesel Fuel; and ECA... May 31, 2010 for a refinery acquired from a motor vehicle diesel fuel small refiner or beyond the... motor vehicle diesel fuel small refiner or beyond the dates specified in § 80.554(a) or (b), as...

  8. Tri-Clustered Tensor Completion for Social-Aware Image Tag Refinement.

    Science.gov (United States)

    Tang, Jinhui; Shu, Xiangbo; Qi, Guo-Jun; Li, Zechao; Wang, Meng; Yan, Shuicheng; Jain, Ramesh

    2017-08-01

    Social image tag refinement, which aims to improve tag quality by automatically completing the missing tags and rectifying the noise-corrupted ones, is an essential component for social image search. Conventional approaches mainly focus on exploring the visual and tag information, without considering the user information, which often reveals important hints on the (in)correct tags of social images. Towards this end, we propose a novel tri-clustered tensor completion framework to collaboratively explore these three kinds of information to improve the performance of social image tag refinement. Specifically, the inter-relations among users, images and tags are modeled by a tensor, and the intra-relations between users, images and tags are explored by three regularizations respectively. To address the challenges of the super-sparse and large-scale tensor factorization that demands expensive computing and memory cost, we propose a novel tri-clustering method to divide the tensor into a certain number of sub-tensors by simultaneously clustering users, images and tags into a bunch of tri-clusters. And then we investigate two strategies to complete these sub-tensors by considering (in)dependence between the sub-tensors. Experimental results on a real-world social image database demonstrate the superiority of the proposed method compared with the state-of-the-art methods.

  9. Log-Optimal Portfolio Selection Using the Blackwell Approachability Theorem

    OpenAIRE

    V'yugin, Vladimir

    2014-01-01

    We present a method for constructing the log-optimal portfolio using the well-calibrated forecasts of market values. Dawid's notion of calibration and the Blackwell approachability theorem are used for computing well-calibrated forecasts. We select a portfolio using this "artificial" probability distribution of market values. Our portfolio performs asymptotically at least as well as any stationary portfolio that redistributes the investment at each round using a continuous function of side in...

  10. Cell-Averaged discretization for incompressible Navier-Stokes with embedded boundaries and locally refined Cartesian meshes: a high-order finite volume approach

    Science.gov (United States)

    Bhalla, Amneet Pal Singh; Johansen, Hans; Graves, Dan; Martin, Dan; Colella, Phillip; Applied Numerical Algorithms Group Team

    2017-11-01

    We present a consistent cell-averaged discretization for incompressible Navier-Stokes equations on complex domains using embedded boundaries. The embedded boundary is allowed to freely cut the locally-refined background Cartesian grid. An implicit-function representation is used for the embedded boundary, which allows us to convert the required geometric moments in the Taylor series expansion (up to arbitrary order) of polynomials into an algebraic problem in lower dimensions. The computed geometric moments are then used to construct stencils for various operators like the Laplacian, divergence, gradient, etc., by solving a least-squares system locally. We also construct the inter-level data-transfer operators like prolongation and restriction for multigrid solvers using the same least-squares system approach. This allows us to retain high order of accuracy near the coarse-fine interface and near embedded boundaries. Canonical problems like Taylor-Green vortex flow and flow past bluff bodies will be presented to demonstrate the proposed method. U.S. Department of Energy, Office of Science, ASCR (Award Number DE-AC02-05CH11231).
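The local least-squares construction of operator stencils mentioned above can be illustrated in a minimal 2D setting: fit a quadratic polynomial to a 3x3 point neighbourhood and read off the Laplacian weights, which are then exact for any quadratic field. This is a simplified sketch; the paper's cell-averaged, embedded-boundary version works with geometric moments rather than point values.

```python
import numpy as np

h = 0.1
# 3x3 neighbourhood offsets around the centre cell.
pts = np.array([(i * h, j * h) for i in (-1, 0, 1) for j in (-1, 0, 1)])

# Quadratic polynomial basis: 1, x, y, x^2, xy, y^2.
A = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1],
                     pts[:, 0]**2, pts[:, 0] * pts[:, 1], pts[:, 1]**2])

# The least-squares pseudoinverse maps point values -> polynomial coefficients.
coef_from_vals = np.linalg.pinv(A)

# Laplacian at the centre = 2*c_xx + 2*c_yy, so the stencil weights are:
lap_stencil = 2 * coef_from_vals[3] + 2 * coef_from_vals[5]

# Test on a quadratic with exact Laplacian 2*3 + 2*4 = 14.
u = lambda x, y: 1 + 2 * x - y + 3 * x**2 + x * y + 4 * y**2
vals = np.array([u(x, y) for x, y in pts])
lap = lap_stencil @ vals
print(lap)
```

The same recipe, with more neighbours or higher-degree bases, yields high-order stencils for divergence, gradient, and the inter-level transfer operators.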

  11. A Novel Approach to Selecting Contractor in Agent-based Multi-sensor Battlefield Reconnaissance Simulation

    Directory of Open Access Journals (Sweden)

    Xiong Li

    2012-11-01

    Full Text Available This paper presents a novel approach showing how a contractor in agent-based simulation for a complex warfare system, such as a multi-sensor battlefield reconnaissance system, can be selected in the Contract Net Protocol (CNP) with high efficiency. We first analyze the agent and agent-based simulation framework, CNP and collaborators, and present an agent interaction chain used to actualize CNP and establish an agent trust network. We then obtain the contractor's importance weight and dynamic trust by presenting a fuzzy similarity-based algorithm and a trust modifying algorithm, and thus we propose a contractor selection approach based on maximum dynamic integrative trust. We validate the feasibility and capability of this approach by implementing the simulation, analyzing comparative results and checking the model.

  12. The Influence of Grain Refiners on the Efficiency of Ceramic Foam Filters

    Science.gov (United States)

    Towsey, Nicholas; Schneider, Wolfgang; Krug, Hans-Peter; Hardman, Angela; Keegan, Neil J.

    An extensive program of work has been carried out to evaluate the efficiency of ceramic foam filters under carefully controlled conditions. Work reported at previous TMS meetings showed that in the absence of grain refiners, ceramic foam filters have the capacity for high filtration efficiency and consistent, reliable performance. The current phase of the investigation focuses on the impact grain refiner additions have on filter performance. The high filtration efficiencies obtained using 50 or 80 ppi CFFs in the absence of grain refiners diminish when Al-3%Ti-1%B grain refiners are added. This, together with the impact of incoming inclusion loading on filter performance and the level of grain refiner addition, are considered in detail. The new generation Al-3%Ti-0.15%C grain refiner has also been included. At typical addition levels (1 kg/tonne) the effect on filter efficiency is similar to that for TiB2-based grain refiners. The work was again conducted on a production scale using AA1050 alloy. Metal quality was determined using LiMCA and PoDFA. Spent filters were also analysed.

  13. Strategic issues and implications for the refining and marketing sector

    International Nuclear Information System (INIS)

    Jeffe, R.A.

    1995-01-01

    Refiners have faced a challenging business environment for the past decade. During this period, the industry has made approximately $25 billion of capital expenditures primarily to comply with increased governmental mandates, faced highly volatile petroleum product prices and garnered a return on equity of only 5%. While worldwide and US refining capacity has been flat in recent years, demand for refined petroleum products has been on the upswing and domestic supplies have also increased due to improved US capacity utilization rates (76% in 1984 and 93% in 1994) and increased imports (gasoline sales up 11% since 1984). The result has been highly volatile and generally weak refining margins (net Gulf Coast crack spread ranging from ($0.95)/bbl in 1984 to $1.84/bbl in 1990 and averaging $0.81/bbl since 1984). In response to the sustained difficulties in the marketplace, one has recently witnessed some strategic realignment in the industry. Several of the integrated companies, frustrated with the required capital expenditures and meager returns, have decided to shed non-core, non-strategic refining assets. For the most part, these assets have been bought by independents at, by historical measures, very attractive terms. This paper will provide an overview of the economics of the refining business, discuss the recent trends in refinery M&A activity and summarize possible implications of the recent strategic realignment
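For readers unfamiliar with the crack-spread figures quoted above, the conventional 3-2-1 crack spread (three barrels of crude notionally yielding two barrels of gasoline and one of distillate) can be computed as follows. The prices are illustrative only, not taken from the text.

```python
# 3-2-1 crack spread: 3 bbl crude -> 2 bbl gasoline + 1 bbl distillate.
GAL_PER_BBL = 42  # product prices are quoted in $/gal, crude in $/bbl

def crack_spread_321(crude_bbl, gasoline_gal, distillate_gal):
    """Refining margin per barrel of crude, in $/bbl."""
    revenue = 2 * gasoline_gal * GAL_PER_BBL + 1 * distillate_gal * GAL_PER_BBL
    return (revenue - 3 * crude_bbl) / 3

# Illustrative prices: $20/bbl crude, $0.60/gal gasoline, $0.55/gal distillate.
print(round(crack_spread_321(20.0, 0.60, 0.55), 2))  # → 4.5
```

A negative result, as in the 1984 figure quoted above, means the product slate was worth less than the crude that went into it.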

  14. Usefulness of selective cerebral intra-arterial digital subtraction angiography by transbrachial approach

    International Nuclear Information System (INIS)

    Matsunaga, Naofumi; Hayashi, Kuniaki; Uetani, Masataka; Hirao, Koichi; Fukuda, Toshio; Aikawa, Hisayuki; Iwao, Masaaki; Hombo, Zen-ichiro

    1988-01-01

    Selective cerebral intra-arterial digital subtraction angiography (IA-DSA) by the transbrachial approach was performed on 53 patients (including 34 outpatients) with suspected cerebrovascular diseases or brain tumors. An 80-cm-long 4F modified Simmons catheter was used. Success rates of selective catheterization to the common carotid and vertebral arteries were 86.0 % from the right transbrachial approach (35 cases) and 79.6 % from the left approach (18 cases). Successful catheterization to the common carotid and ipsilateral vertebral arteries was obtained in 91.3 % from the right transbrachial approach, and 78.7 % from the left approach. The right common carotid artery could be catheterized in all 55 cases from the right transbrachial approach, but in only 6 of 15 patients (40 %) from the left approach. As for contrast material, 4 or 6 ml of Iopamidol 300 mgI/ml were mechanically injected into the common carotid artery at a flow rate of 2 - 3 ml/sec, and 9 ml of two-fold diluted Iopamidol were injected into the vertebral artery at a flow rate of 6 ml/sec. There was no recoil of the catheter. Visualization of relatively small vessels such as cortical branches was excellent in most cases. However, smaller vessels such as the meningohypophyseal trunk were not well visualized with IA-DSA. Spatial resolution of IA-DSA was generally satisfactory. However, conventional angiography was still required, particularly to clearly delineate small cerebral aneurysms. No major complications were experienced. It was concluded that this procedure is useful, particularly for screening and postoperative follow-up studies, and can also be applied to outpatients. (author)

  15. Breeding approaches in simultaneous selection for multiple stress tolerance of maize in tropical environments

    Directory of Open Access Journals (Sweden)

    Denić M.

    2007-01-01

    Full Text Available Maize is the principal crop and major staple food in most countries of Sub-Saharan Africa. However, due to the influence of abiotic and biotic stress factors, maize production faces serious constraints. Among the agro-ecological conditions, the main constraints are: lack and poor distribution of rainfall; low soil fertility; diseases (maize streak virus, downy mildew, leaf blights, rusts, gray leaf spot, stem/cob rots) and pests (borers and storage pests). Among the socio-economic production constraints are: poor economy, serious shortage of trained manpower, insufficient management expertise, lack of use of improved varieties and poor cultivation practices. To develop desirable varieties, and thus consequently alleviate some of these constraints, appropriate breeding approaches and field-based methodologies in selection for multiple stress tolerance were implemented. These approaches are mainly based on: (a) crossing selected genotypes with more desirable stress tolerance and other agronomic traits; (b) using the disease/pest spreader row method, combined with testing and selection of created progenies under strong to intermediate pressure of drought and low soil fertility in nurseries; and (c) evaluation of the varieties developed in multi-location trials under low and "normal" inputs. These approaches provide testing and selection of a large number of progenies, which is required for simultaneous selection for multiple stress tolerance. Data obtained revealed that remarkable improvement of the traits under selection was achieved. The biggest progress was obtained in selection for maize streak virus and downy mildew resistance, flintiness and earliness. In the case of drought stress, statistical analyses revealed significant negative correlations between yield and anthesis-silking interval, and between yield and days to silk, but a positive correlation between yield and grain weight per ear.

  16. Preparation of Al-Ti-B grain refiner by SHS technology

    International Nuclear Information System (INIS)

    Nikitin, V.I.; Wanqi, J.I.E.; Kandalova, E.G.; Makarenko, A.G.; Yong, L.

    2000-01-01

    Since the discovery of the grain refinement effect of titanium on aluminum, especially in the presence of B or C, in 1950, grain refiners have been widely accepted in industry for microstructure control of aluminum alloys. Research on this topic aims to obtain the highest grain refinement efficiency with the lowest possible addition of master alloy. It is widely accepted that the morphology and size of TiAl3 particles, which are known as heterogeneous nucleation centers, are important factors determining the grain refinement efficiency; fine TiAl3 particles are favorable. The grain refinement process shows a heredity phenomenon, which means that structural information from initial materials transfers through a melt to the final product. It is important to find the connection between microstructural parameters of the master alloy and the final product. To improve the quality of Al-Ti-B master alloys for use as grain refiners, a new method based on SHS (self-propagating high-temperature synthesis) technology has been developed at Samara State Technical University to produce the master alloys. SHS, as a new method for preparation of materials, was first utilized by Merzhanov in 1967. This method uses the energy from highly exothermic reactions to sustain the chemical reaction in a combustion wave. The advantages of SHS include simplicity, low energy requirement, and higher product purity. Because SHS reactions can take place between elemental reactants, it is easy to control product composition. The purposes of this investigation were to fabricate an SHS Al-5%Ti-1%B master alloy, to analyze its structure and to test its grain refining performance

  17. Mirror of the refined topological vertex from a matrix model

    CERN Document Server

    Eynard, B

    2011-01-01

    We find an explicit matrix model computing the refined topological vertex, starting from its representation in terms of plane partitions. We then find the spectral curve of that matrix model, and thus the mirror symmetry of the refined vertex. With the same method we also find a matrix model for the strip geometry, and we find its mirror curve. The fact that there is a matrix model shows that the refined topological string amplitudes also satisfy the "remodeling the B-model" construction.

  18. Structural Refinement of Proteins by Restrained Molecular Dynamics Simulations with Non-interacting Molecular Fragments.

    Directory of Open Access Journals (Sweden)

    Rong Shen

    2015-10-01

    Full Text Available The knowledge of multiple conformational states is a prerequisite to understand the function of membrane transport proteins. Unfortunately, the determination of detailed atomic structures for all these functionally important conformational states with conventional high-resolution approaches is often difficult and unsuccessful. In some cases, biophysical and biochemical approaches can provide important complementary structural information that can be exploited with the help of advanced computational methods to derive structural models of specific conformational states. In particular, functional and spectroscopic measurements in combination with site-directed mutations constitute one important source of information to obtain these mixed-resolution structural models. A very common problem with this strategy, however, is the difficulty to simultaneously integrate all the information from multiple independent experiments involving different mutations or chemical labels to derive a unique structural model consistent with the data. To resolve this issue, a novel restrained molecular dynamics structural refinement method is developed to simultaneously incorporate multiple experimentally determined constraints (e.g., engineered metal bridges or spin-labels), each treated as an individual molecular fragment with all atomic details. The internal structure of each of the molecular fragments is treated realistically, while there is no interaction between different molecular fragments to avoid unphysical steric clashes. The information from all the molecular fragments is exploited simultaneously to constrain the backbone to refine a three-dimensional model of the conformational state of the protein. The method is illustrated by refining the structure of the voltage-sensing domain (VSD) of the Kv1.2 potassium channel in the resting state and by exploring the distance histograms between spin-labels attached to T4 lysozyme. The resulting VSD structures are in good

  19. Real-time fiber selection using the Wii remote

    Science.gov (United States)

    Klein, Jan; Scholl, Mike; Köhn, Alexander; Hahn, Horst K.

    2010-02-01

    In the last few years, fiber tracking tools have become popular in clinical contexts, e.g., for pre- and intraoperative neurosurgical planning. The efficient, intuitive, and reproducible selection of fiber bundles still constitutes one of the main issues. In this paper, we present a framework for the real-time selection of axonal fiber bundles using a Wii remote control, a wireless controller for Nintendo's gaming console. It enables the user to select fiber bundles without any other input devices. To achieve smooth interaction, we propose a novel space-partitioning data structure for efficient 3D range queries in a data set consisting of precomputed fibers. The data structure, which is adapted to the special geometry of fiber tracts, allows for queries that are many times faster than previous state-of-the-art approaches. In order to reliably extract fibers for further processing, e.g., for quantification purposes or comparisons with preoperatively tracked fibers, we developed an expectation-maximization clustering algorithm that can refine the range queries. Our initial experiments have shown that white matter fiber bundles can be reliably selected within a few seconds by the Wii, which has been placed in a sterile plastic bag to simulate usage under surgical conditions.
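The core of such a framework is a spatial index that answers "which fibers pass within radius r of this point?" quickly. The abstract does not describe the actual structure, so the sketch below uses a plain uniform grid (spatial hash) as a hypothetical stand-in; the paper's structure is specially adapted to fiber-tract geometry and will differ.

```python
from collections import defaultdict

class FiberGrid:
    """Uniform-grid spatial index over fiber points.

    A hypothetical sketch of a 3D range-query structure; NOT the
    paper's data structure, which is tailored to fiber geometry."""

    def __init__(self, cell=5.0):
        self.cell = cell
        self.cells = defaultdict(list)  # grid key -> [(fiber_id, point)]

    def _key(self, p):
        # Map a 3D point to its integer grid-cell coordinates.
        return tuple(int(c // self.cell) for c in p)

    def add_fiber(self, fiber_id, points):
        for p in points:
            self.cells[self._key(p)].append((fiber_id, p))

    def range_query(self, center, radius):
        """Return ids of fibers with at least one point within `radius`."""
        r2 = radius * radius
        lo = self._key(tuple(c - radius for c in center))
        hi = self._key(tuple(c + radius for c in center))
        hits = set()
        # Visit only the cells overlapping the query sphere's bounding box.
        for i in range(lo[0], hi[0] + 1):
            for j in range(lo[1], hi[1] + 1):
                for k in range(lo[2], hi[2] + 1):
                    for fid, p in self.cells.get((i, j, k), ()):
                        if sum((a - b) ** 2 for a, b in zip(p, center)) <= r2:
                            hits.add(fid)
        return hits
```

Because only cells near the query point are inspected, query cost is independent of the total number of fibers, which is what makes interactive (pointer-driven) selection feasible.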

  20. Effect of Chemical Refining on Citrullus Colocynthis and Pongamia ...

    African Journals Online (AJOL)

    Oil from both plant seeds was evaluated (both before and after refining) for different physico-chemical parameters such as free fatty acids, iodine value, peroxide value, saponification value, unsaponifiable matter and fatty acid composition. The oil yield (30-35%) in both plants was found to be average. After refining, per cent reduction ...

  1. Linkages between the markets for crude oil and the markets for refined products

    International Nuclear Information System (INIS)

    Didziulis, V.S.

    1990-01-01

    To understand the crude oil price determination process it is necessary to extend the analysis beyond the markets for petroleum. Crude oil prices are determined in two closely related markets: the markets for crude oil and the markets for refined products. An econometric-linear programming model was developed to capture the linkages between the markets for crude oil and refined products. In the LP, refiners maximize profits given crude oil supplies, refining capacities, and prices of refined products. The objective function is profit maximization net of crude oil prices. The shadow price on crude oil gives the netback price. Refined product prices are obtained from the econometric models. The model covers the free world divided into five regions. The model is used to analyze the impacts on the markets of policies that affect crude oil supplies, the demands for refined products, and the refining industry. For each scenario analyzed, the demand for crude oil is derived from the equilibrium conditions in the markets for products. The demand curve is confronted with a supply curve which maximizes revenues, providing an equilibrium solution for both crude oil and product markets. The model also captures crude oil price differentials by quality. The results show that the demands for crude oil differ across regions due to the structure of the refining industries and the characteristics of the demands for refined products. Changes in the demands for products have a larger impact on the markets than changes in the refining industry. Since the markets for refined products and crude oil are interrelated, they cannot be analyzed individually if an accurate and complete assessment of a policy is to be made. Changes in only one product market in one region affect the other product markets and the prices of crude oil.

  2. Interactive visual exploration and refinement of cluster assignments.

    Science.gov (United States)

    Kern, Michael; Lex, Alexander; Gehlenborg, Nils; Johnson, Chris R

    2017-09-12

    With ever-increasing amounts of data produced in biology research, scientists are in need of efficient data analysis methods. Cluster analysis, combined with visualization of the results, is one such method that can be used to make sense of large data volumes. At the same time, cluster analysis is known to be imperfect and depends on the choice of algorithms, parameters, and distance measures. Most clustering algorithms don't properly account for ambiguity in the source data, as records are often assigned to discrete clusters, even if an assignment is unclear. While there are metrics and visualization techniques that allow analysts to compare clusterings or to judge cluster quality, there is no comprehensive method that allows analysts to evaluate, compare, and refine cluster assignments based on the source data, derived scores, and contextual data. In this paper, we introduce a method that explicitly visualizes the quality of cluster assignments, allows comparisons of clustering results and enables analysts to manually curate and refine cluster assignments. Our methods are applicable to matrix data clustered with partitional, hierarchical, and fuzzy clustering algorithms. Furthermore, we enable analysts to explore clustering results in context of other data, for example, to observe whether a clustering of genomic data results in a meaningful differentiation in phenotypes. Our methods are integrated into Caleydo StratomeX, a popular, web-based, disease subtype analysis tool. We show in a usage scenario that our approach can reveal ambiguities in cluster assignments and produce improved clusterings that better differentiate genotypes and phenotypes.

  3. Refined analysis results for multimedia network costs and profits

    DEFF Research Database (Denmark)

    Tahkokorpi, M.; Falch, Morten; Skouby, Knud Erik

    This deliverable describes the techno-economic business model developed in EURORIM WP3 and presents the refined results of the multimedia service delivery cost-profit calculations.

  4. Refining and petrochemicals

    International Nuclear Information System (INIS)

    Benazzi, E.

    2003-01-01

    Down sharply in 2002, refining margins showed a clear improvement in the first half-year of 2003. As a result, the earnings reported by oil companies for financial year 2002 were significantly lower than in 2001, but the prospects are brighter for 2003. In the petrochemicals sector, slow demand and higher feedstock prices eroded margins in 2002, especially in Europe and the United States. The financial results for the first part of 2003 seem to indicate that sector profitability will not improve before 2004. (author)

  5. Strategic project selection based on evidential reasoning approach for high-end equipment manufacturing industry

    Directory of Open Access Journals (Sweden)

    Lu Guangyan

    2017-01-01

    Full Text Available With the rapid development of science and technology, emerging information technologies have significantly changed the daily life of people. In this context, strategic project selection for high-end equipment manufacturing industries faces more and more complexities and uncertainties given the need to consider several complex criteria. For example, a group of experts rather than a single expert should be invited to select strategic projects for high-end equipment manufacturing industries, and the experts may find it difficult to express their preferences towards different strategic projects due to their limited cognitive capabilities. In order to handle these complexities and uncertainties, the criteria framework of strategic project selection is first constructed based on the characteristics of high-end equipment manufacturing industries, and then the evidential reasoning (ER) approach is introduced in this paper to help experts express their uncertain preferences and aggregate these preferences to identify an appropriate strategic project. A real case of strategic project selection in a high-speed train manufacturing enterprise is investigated to demonstrate the validity of the ER approach in solving the strategic project selection problem.

  6. Refining of biodiesel by ceramic membrane separation

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yong; Ou, Shiyi; Tan, Yanlai; Tang, Shuze [Department of Food Science and Engineering, Jinan University, Guangzhou 510632 (China); Wang, Xingguo; Liu, Yuanfa [School of Food Science and Technology, Jiangnan University, Wuxi 214112 (China)

    2009-03-15

    A ceramic membrane separation process for biodiesel refining was developed to reduce the considerable amount of water needed in the conventional water-washing process. Crude biodiesel produced from refined palm oil was micro-filtered by ceramic membranes with pore sizes of 0.6, 0.2 and 0.1 μm to remove the residual soap and free glycerol, at a transmembrane pressure of 0.15 MPa and a temperature of 60 °C. The flux through the membrane was maintained at 300 L m⁻² h⁻¹ when the volumetric concentration ratio reached 4. The contents of potassium, sodium, calcium and magnesium in the whole permeate were 1.40, 1.78, 0.81 and 0.20 mg/kg respectively, as determined by inductively coupled plasma-atomic emission spectroscopy. These values are lower than the EN 14538 specifications. The residual free glycerol in the permeate, estimated by water extraction, was 0.0108 wt.%. This ceramic membrane technology is a promising environmentally friendly process for the refining of biodiesel. (author)

  7. Production of Al-Ti-C grain refiners with the addition of elemental carbon

    International Nuclear Information System (INIS)

    Gezer, Berke Turgay; Toptan, Fatih; Daglilar, Sibel; Kerti, Isil

    2010-01-01

    The grain refining process used in aluminium alloys plays an important role in preventing columnar, coarse grains and encouraging fine, equiaxed grain formation. Al-Ti-B master alloys are widely used as aluminium grain refiners, but owing to problems in their application, Al-Ti-C refiners have seen increasing demand in recent years. In the present work, Al-Ti-C grain refiners with different Ti:C ratios were produced by an in situ method with the addition of elemental carbon. Microstructures were characterised by optical microscopy and scanning electron microscopy equipped with energy-dispersive spectroscopy. The effects of temperature, holding time and Ti:C ratio on the grain refinement process were investigated and optimum conditions were determined.

  8. European oil refining: strategies for a competitive future

    International Nuclear Information System (INIS)

    MacDonald, James.

    1997-07-01

    European Oil Refining investigates how the industry came to be in crisis and what the future holds. As well as an extensive analysis of past and present market shifts, the report predicts likely future developments and their consequences for investors. The report reviews the European oil sector in a global context, calculates the cost to refiners of key environmental legislation, assesses the problems caused by changing product demand and crude supply, examines possible solutions to the problems of low margins and overcapacity, evaluates the key players' main strategies to increase their competitiveness, analyses the western European oil refining industry by country, details the refinery operations of the major countries of central and eastern Europe, profiles 15 of the major oil companies and estimates the increase in investment required as a result of legislative and demand changes. (author)

  9. GPGPU Implementation of a Genetic Algorithm for Stereo Refinement

    Directory of Open Access Journals (Sweden)

    Álvaro Arranz

    2015-03-01

    Full Text Available During the last decade, general-purpose computing on graphics processing units (GPGPU) has turned out to be a useful tool for speeding up many scientific calculations. Computer vision is known to be one of the fields with greater penetration of these new techniques. This paper explores the advantages of using a GPGPU implementation to speed up a genetic algorithm used for stereo refinement. The main contribution of this paper is an analysis of which genetic operators take advantage of a parallel approach and the description of an efficient state-of-the-art implementation for each one. As a result, speed-ups close to 80× can be achieved, demonstrating that this is the only practical way of achieving close to real-time performance.

  10. A hybrid agent-based computational economics and optimization approach for supplier selection problem

    Directory of Open Access Journals (Sweden)

    Zahra Pourabdollahi

    2017-12-01

    Full Text Available The supplier evaluation and selection problem is among the most important logistics decisions and has been addressed extensively in supply chain management. The same logistics decision is also important in freight transportation since it identifies trade relationships between business establishments and determines commodity flows between production and consumption points. The commodity flows are then used as input to freight transportation models to determine cargo movements and their characteristics, including mode choice and shipment size. Various approaches have been proposed to explore this latter problem in previous studies. Traditionally, potential suppliers are evaluated and selected using only price/cost as the influential criterion, together with state-of-practice methods. This paper introduces a hybrid agent-based computational economics and optimization approach for supplier selection. The proposed model combines an agent-based multi-criteria supplier evaluation approach with a multi-objective optimization model to capture both the behavioral and economic aspects of the supplier selection process. The model uses a system of ordered response models to determine importance weights of the different criteria in supplier evaluation from the buyers' point of view. The estimated weights are then used to calculate a utility for each potential supplier in the market and rank them. The calculated utilities are then entered into a mathematical programming model in which the best suppliers are selected by maximizing the total accrued utility for all buyers and minimizing total shipping costs while balancing the capacity of potential suppliers to ensure market clearing mechanisms. The proposed model was implemented under an operational agent-based supply chain and freight transportation framework for the Chicago Metropolitan Area.
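The evaluation step described above — criterion weights applied to each supplier's scores to produce a utility, then a ranking — can be sketched as a weighted sum. The weights and scores below are invented for illustration; in the paper the weights come from ordered response models and the ranking then feeds an optimization model.

```python
def supplier_utilities(weights, scores):
    """Weighted-sum utility for each supplier (illustrative sketch;
    the paper derives the weights econometrically, assumed given here)."""
    return {s: sum(weights[c] * v for c, v in crit.items())
            for s, crit in scores.items()}

# Hypothetical criteria weights and normalized supplier scores.
weights = {"price": 0.5, "quality": 0.3, "delivery": 0.2}
scores = {
    "A": {"price": 0.9, "quality": 0.6, "delivery": 0.7},
    "B": {"price": 0.6, "quality": 0.9, "delivery": 0.8},
}

# Rank suppliers by utility, best first.
ranked = sorted(supplier_utilities(weights, scores).items(),
                key=lambda kv: kv[1], reverse=True)
```

In the full model these utilities become coefficients in the objective of a mathematical program that also enforces supplier capacities and shipping costs.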

  11. Analysis model for forecasting extreme temperature using refined rank set pair

    Directory of Open Access Journals (Sweden)

    Qiao Ling-Xia

    2013-01-01

    Full Text Available In order to improve the precision of forecasting extreme temperature time series, a refined rank set pair analysis model with a refined rank transformation function is proposed. The measured values of the annual highest July temperature of two Chinese cities, Taiyuan and Shijiazhuang, are used to examine the performance of the refined rank set pair model.

  12. A grey DEMATEL approach to develop third-party logistics provider selection criteria

    DEFF Research Database (Denmark)

    Govindan, Kannan; Khodaverdi, Roohollah; Vafadarnikjoo, Amin

    2016-01-01

    Purpose - This paper identifies important criteria for 3PL provider selection and evaluation, and the purpose of this paper is to select 3PL providers from the viewpoint of firms which were already outsourcing their logistics services. Design/methodology/approach - This study utilized the grey decision-making trial... Practical implications - The paper's results help managers of automotive industries, particularly in developing countries, to outsource logistics activities to 3PL providers effectively and to create a significant competitive advantage. Originality/value - The main contributions of this paper are twofold. First, this paper proposes an integrated grey DEMATEL method to consider interdependent relationships among the 3PL provider selection criteria. Second, this study is one of the first studies to consider 3PL provider selection in a developing country like Iran.

  13. Trends in heavy oil production and refining in California

    International Nuclear Information System (INIS)

    Olsen, D.K.; Ramzel, E.B.; Pendergrass, R.A. II.

    1992-07-01

    This report is one of a series of publications assessing the feasibility of increasing domestic heavy oil production and is part of a study being conducted for the US Department of Energy. This report summarizes trends in oil production and refining in California. Heavy oil (10 degrees to 20 degrees API gravity) production in California has increased from 20% of the state's total oil production in the early 1940s to 70% in the late 1980s. In each of the three principal petroleum producing districts (Los Angeles Basin, Coastal Basin, and San Joaquin Valley) oil production has peaked and then declined at different times throughout the past 30 years. Thermal production of heavy oil has helped make California the largest producer of oil by enhanced oil recovery processes in spite of low prices for heavy oil and stringent environmental regulation. The opening of the Naval Petroleum Reserve No. 1, Elk Hills (CA) field in 1976 brought about a major new source of light oil at a time when light oil production had greatly declined. Although California is a major petroleum-consuming state - in 1989 the state used 13.3 billion gallons of gasoline, or 11.5% of US demand - it contributes substantially to the Nation's energy production and refining capability. California receives and refines most of Alaska's 1.7 million barrel per day oil production. With California production, Alaskan oil, and imports brought into California for refining, California has an excess of oil and refined products and is a net exporter to other states. The local surplus of oil inhibits exploitation of California heavy oil resources even though the heavy oil resources exist. Transportation, refining, and competition in the market limit full development of California heavy oil resources.

  14. Computation-aware algorithm selection approach for interlaced-to-progressive conversion

    Science.gov (United States)

    Park, Sang-Jun; Jeon, Gwanggil; Jeong, Jechang

    2010-05-01

    We discuss deinterlacing results in a computationally constrained and varied environment. The proposed computation-aware algorithm selection approach (CASA) for fast interlaced-to-progressive conversion consists of three methods: the line-averaging (LA) method for plain regions, the modified edge-based line-averaging (MELA) method for medium regions, and the proposed covariance-based adaptive deinterlacing (CAD) method for complex regions. CASA uses two criteria, mean-squared error (MSE) and CPU time, for assigning the method. The principal idea of CAD is the correspondence between the high- and low-resolution covariances. We estimated the local covariance coefficients from an interlaced image using Wiener filtering theory and then used these optimal minimum-MSE interpolation coefficients to obtain a deinterlaced image. The CAD method, though more robust than most known methods, is not very fast compared with the others. To alleviate this issue, we proposed an adaptive selection approach that uses a fast deinterlacing algorithm where possible rather than relying on the CAD algorithm alone. This hybrid approach of switching between the conventional schemes (LA and MELA) and CAD reduces the overall computational load. A reliable condition for switching the schemes was derived from a wide set of initial training processes. The results of computer simulations showed that the proposed methods outperformed a number of methods presented in the literature.
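The dispatch logic above — cheap interpolation for plain regions, heavier machinery only where the image demands it — can be sketched as follows. The LA interpolation is the standard average of the lines above and below; the activity measure and thresholds are invented here for illustration, and MELA/CAD are only named, not implemented.

```python
def line_average(line_above, line_below):
    """LA method for plain regions: each missing pixel is the average of
    the pixels directly above and below it."""
    return [(a + b) / 2 for a, b in zip(line_above, line_below)]

def select_method(region_activity, t_plain=5.0, t_medium=20.0):
    """Hypothetical CASA-style dispatcher: route a region to LA, MELA or
    CAD by a local-activity measure (thresholds are illustrative; the
    paper assigns methods using MSE and CPU-time criteria)."""
    if region_activity < t_plain:
        return "LA"      # plain region: cheap line averaging suffices
    if region_activity < t_medium:
        return "MELA"    # medium region: edge-aware averaging
    return "CAD"         # complex region: covariance-based interpolation
```

The saving comes from the fact that most frame area is plain or medium, so the expensive CAD path runs only on the small fraction of complex regions.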

  15. Achieving aesthetic results in facial reconstructive microsurgery: planning and executing secondary refinements.

    Science.gov (United States)

    Haddock, Nicholas T; Saadeh, Pierre B; Siebert, John W

    2012-12-01

    Free tissue transfer to improve bulk and contour in facial deformities has been proven useful, yet refinements that turn an acceptable result into an excellent result are essential to reconstruction. The authors reviewed their experience and described these refinements. The charts of 371 free tissue transfer cases (1989 to 2010) performed by the senior author (J.W.S.) were reviewed. Free tissue transfer of a circumflex scapular variant flap or a superficial inferior epigastric flap was performed to treat deformities arising from hemifacial atrophy (n = 126), hemifacial microsomia (n = 89), radiation therapy (n = 40), bilateral malformations including lupus and polymyositis (n = 50), other congenital anomalies (n = 25), facial palsy (n = 17), and burns and trauma (n = 24). Revision surgery planning began at the initial flap operation, where the flap was stretched maximally and interdigitated with recipient tissue. More tissue was required in the malar region. Revision refinement was indicated in all cases (after 6 months). Flap revision involved liposuction, debulking, reelevation, and release of tethering, followed by tissue rearrangement by means of advancement, rotation, transposition, and/or turnover flaps of subcutaneous tissues from the previous free flap. The jawline frequently required more debulking. Periorbital reconstruction was combined with lower lid support with or without canthal repositioning. Conventional face-lift techniques with the flap as superficial musculoaponeurotic system augmented the result. Autologous fat injection to the alar rim, medial canthus, upper eyelid, and lip was a useful adjunct. Severe lip deficiencies were addressed with local flaps. The keys to improving results were continual critical reassessment, open-mindedness to new approaches, and maintaining high expectations. Clinical question/level of evidence: Therapeutic, V.

  16. An angularly refineable phase space finite element method with approximate sweeping procedure

    International Nuclear Information System (INIS)

    Kophazi, J.; Lathouwers, D.

    2013-01-01

    An angularly refineable phase space finite element method is proposed to solve the neutron transport equation. The method combines the advantages of two recently published schemes. The angular domain is discretized into small patches and patch-wise discontinuous angular basis functions are restricted to these patches, i.e. there is no overlap between basis functions corresponding to different patches. This approach yields block diagonal Jacobians with small block size and retains the possibility for Sn-like approximate sweeping of the spatially discontinuous elements in order to provide efficient preconditioners for the solution procedure. On the other hand, the preservation of the full FEM framework (as opposed to collocation into a high-order Sn scheme) retains the possibility of the Galerkin interpolated connection between phase space elements at arbitrary levels of discretization. Since the basis vectors are not orthonormal, a generalization of the Riemann procedure is introduced to separate the incoming and outgoing contributions in case of unstructured meshes. However, due to the properties of the angular discretization, the Riemann procedure can be avoided at a large fraction of the faces and this fraction rapidly increases as the level of refinement increases, contributing to the computational efficiency. In this paper the properties of the discretization scheme are studied with uniform refinement using an iterative solver based on the S2 sweep order of the spatial elements. The fourth order convergence of the scalar flux is shown as anticipated from earlier schemes and the rapidly decreasing fraction of required Riemann faces is illustrated. (authors)

  17. Fetching and Parsing Data from the Web with OpenRefine

    Directory of Open Access Journals (Sweden)

    Evan Peter Williamson

    2017-08-01

    Full Text Available OpenRefine is a powerful tool for exploring, cleaning, and transforming data. An earlier Programming Historian lesson, “Cleaning Data with OpenRefine”, introduced the basic functionality of Refine to efficiently discover and correct inconsistency in a data set. Building on those essential data wrangling skills, this lesson focuses on Refine’s ability to fetch URLs and parse web content. Examples introduce some of the advanced features to transform and enhance a data set, including:
    - fetch URLs using Refine
    - construct URL queries to retrieve information from a simple web API
    - parse HTML and JSON responses to extract relevant data
    - use array functions to manipulate string values
    - use Jython to extend Refine’s functionality
    It will be helpful to have basic familiarity with OpenRefine, HTML, and programming concepts such as variables and loops to complete this lesson.
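The parse step of this workflow — walking a JSON API response stored in a Refine column and extracting one field — can be illustrated outside Refine in plain Python. The payload and field names below are invented for the sketch; in Refine itself this would be done with a GREL expression such as `value.parseJson()` or with Jython.

```python
import json

# A canned response standing in for the cell value that Refine's
# "Add column by fetching URLs" step would store (contents invented).
response_text = '{"query": {"results": [{"title": "Example", "year": 1998}]}}'

def parse_titles(payload):
    """Mirror of the lesson's parse step: decode the JSON response and
    pull one field out of each result record."""
    data = json.loads(payload)
    return [item["title"] for item in data["query"]["results"]]
```

The same pattern — fetch once, keep the raw response in its own column, then parse into derived columns — is what makes the Refine workflow reproducible and re-runnable.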

  18. A deductive approach to select or rank journals in multifaceted subject, Oceanography

    Digital Repository Service at National Institute of Oceanography (India)

    Sahu, S.R.; Panda, K.C.

    journal) whereas Bradford’s differential approach (articles in the bibliographies of specific subject field) to account/rank the core journals. Both these methods make sense in the journal selection/ranking process to a specific subject field...

  19. High-capacity, selective solid sequestrants for innovative chemical separation: Inorganic ion exchange approach

    International Nuclear Information System (INIS)

    Bray, L.

    1995-01-01

    The approach of this task is to develop high-capacity, selective solid inorganic ion exchangers for the recovery of cesium and strontium from nuclear alkaline and acid wastes. To achieve this goal, Pacific Northwest Laboratories (PNL) is collaborating with industry and university participants to develop high capacity, selective, solid ion exchangers for the removal of specific contaminants from nuclear waste streams

  20. Genomic multiple sequence alignments: refinement using a genetic algorithm

    Directory of Open Access Journals (Sweden)

    Lefkowitz Elliot J

    2005-08-01

    Full Text Available Abstract Background Genomic sequence data cannot be fully appreciated in isolation. Comparative genomics – the practice of comparing genomic sequences from different species – plays an increasingly important role in understanding the genotypic differences between species that result in phenotypic differences as well as in revealing patterns of evolutionary relationships. One of the major challenges in comparative genomics is producing a high-quality alignment between two or more related genomic sequences. In recent years, a number of tools have been developed for aligning large genomic sequences. Most utilize heuristic strategies to identify a series of strong sequence similarities, which are then used as anchors to align the regions between the anchor points. The resulting alignment is globally correct, but in many cases is suboptimal locally. We describe a new program, GenAlignRefine, which improves the overall quality of global multiple alignments by using a genetic algorithm to improve local regions of alignment. Regions of low quality are identified, realigned using the program T-Coffee, and then refined using a genetic algorithm. Because a better COFFEE (Consistency based Objective Function For alignmEnt Evaluation) score generally reflects greater alignment quality, the algorithm searches for an alignment that yields a better COFFEE score. To improve the intrinsic slowness of the genetic algorithm, GenAlignRefine was implemented as a parallel, cluster-based program. Results We tested the GenAlignRefine algorithm by running it on a Linux cluster to refine sequences from a simulation, as well as to refine a multiple alignment of 15 Orthopoxvirus genomic sequences approximately 260,000 nucleotides in length that initially had been aligned by Multi-LAGAN. It took approximately 150 minutes for a 40-processor Linux cluster to optimize some 200 fuzzy (poorly aligned) regions of the orthopoxvirus alignment. Overall sequence identity increased only
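The refinement loop described above — score candidate alignments, keep the best, mutate, repeat — is the generic genetic-algorithm skeleton. The sketch below illustrates that skeleton only; it is not GenAlignRefine, and the toy match-count objective merely stands in for the COFFEE score.

```python
import random

def refine(population, score, generations=50, mut_rate=0.1, seed=1):
    """Minimal genetic-algorithm refinement loop (illustrative sketch,
    not GenAlignRefine): keep the better half each generation, produce
    mutated children, and return the best candidate found."""
    rng = random.Random(seed)
    alphabet = "ACGT-"          # nucleotides plus a gap character
    pop = list(population)
    for _ in range(generations):
        pop.sort(key=score, reverse=True)
        parents = pop[: max(2, len(pop) // 2)]   # elitist selection
        children = []
        for p in parents:
            # Each character mutates with probability mut_rate.
            child = [c if rng.random() > mut_rate else rng.choice(alphabet)
                     for c in p]
            children.append("".join(child))
        pop = parents + children
    return max(pop, key=score)

# Toy objective: count matches against a fixed target sequence,
# standing in for the COFFEE consistency score.
target = "ACGTACGT"
best = refine(["AAAAAAAA"] * 8,
              score=lambda s: sum(a == b for a, b in zip(s, target)))
```

Because parents are carried over unchanged (elitism), the best score never decreases, which is the property that makes the loop a refinement rather than a random search.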

  1. Genetic variation of Lymnaea stagnalis tolerance to copper: A test of selection hypotheses and its relevance for ecological risk assessment

    International Nuclear Information System (INIS)

    Côte, Jessica; Bouétard, Anthony; Pronost, Yannick; Besnard, Anne-Laure; Coke, Maïra; Piquet, Fabien; Caquet, Thierry; Coutellec, Marie-Agnès

    2015-01-01

    The use of standardized monospecific testing to assess the ecological risk of chemicals implicitly relies on the strong assumption that intraspecific variation in sensitivity is negligible or irrelevant in this context. In this study, we investigated genetic variation in copper sensitivity of the freshwater snail Lymnaea stagnalis, using lineages stemming from eight natural populations or strains found to be genetically differentiated at neutral markers. Copper-induced mortality varied widely among populations, as did the estimated daily death rate and time to 50% mortality (LT50). Population genetic divergence in copper sensitivity was compared to neutral differentiation using the QST-FST approach. No evidence for homogenizing selection could be detected. This result demonstrates that species-level extrapolations from single population studies are highly unreliable. The study provides a simple example of how evolutionary principles could be incorporated into ecotoxicity testing in order to refine ecological risk assessment. - Highlights: • Genetic variation in copper tolerance occurs between Lymnaea stagnalis populations. • We used the QST-FST approach to test evolutionary patterns in copper tolerance. • No evidence for uniform selection was found. • Results suggest that extrapolations to the species level are not safe. • A method is proposed to refine ecological risk assessment using genetic parameters. - Genetic variation in copper tolerance occurs in Lymnaea stagnalis. A method is proposed for considering evolutionary parameters in ecological risk assessment
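The QST-FST comparison rests on a simple quantity: under the standard quantitative-genetics definition for outbred populations, QST = VB / (VB + 2·VW), where VB and VW are the between- and within-population components of variance in the trait (here, copper tolerance). A minimal sketch of the computation (not code from the paper):

```python
def q_st(v_between, v_within):
    """Q_ST = V_B / (V_B + 2 * V_W) for outbred populations
    (standard quantitative-genetics definition)."""
    return v_between / (v_between + 2 * v_within)

# Interpretation of the comparison against neutral F_ST:
#   Q_ST well below F_ST  -> homogenizing (uniform) selection suggested
#   Q_ST ~ F_ST           -> drift alone cannot be rejected
#   Q_ST well above F_ST  -> divergent selection suggested
```

In this study the absence of evidence for homogenizing selection corresponds to QST not falling significantly below the neutral FST.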

  2. Need for refining capacity creates opportunities for producers in Middle East

    International Nuclear Information System (INIS)

    Ali, M.S.S.

    1994-01-01

    Oil industry interest in refining has revived in the past few years in response to rising oil consumption. The trend creates opportunities for countries in the Middle East, which do not own refining assets nearly in proportion to their crude oil reserves. By closing this gap between reserves and refining capacity, these countries can ease some of the instability now characteristic of the oil market. Some major oil producing countries have begun to move downstream. During the 1980s, Venezuela, Kuwait, Saudi Arabia, Libya, and other members of the Organization of Petroleum Exporting Countries acquired refining assets through outright purchase or joint ventures. Nevertheless, the oil industry remains largely unintegrated, with the Middle East holding two thirds of worldwide oil reserves but only a small share downstream. As worldwide refining capacity swings from a period of surplus toward one of shortage, the question becomes where the needed new capacity will be built. The paper discusses the background of the situation, the shrinking surplus, investment requirements, sources of capital, and shipping concerns.

  3. A grid-enabled web service for low-resolution crystal structure refinement.

    Science.gov (United States)

    O'Donovan, Daniel J; Stokes-Rees, Ian; Nam, Yunsun; Blacklow, Stephen C; Schröder, Gunnar F; Brunger, Axel T; Sliz, Piotr

    2012-03-01

    Deformable elastic network (DEN) restraints have proved to be a powerful tool for refining structures from low-resolution X-ray crystallographic data sets. Unfortunately, optimal refinement using DEN restraints requires extensive calculations and is often hindered by a lack of access to sufficient computational resources. The DEN web service presented here is intended to provide structural biologists with access to resources for running computationally intensive DEN refinements in parallel on the Open Science Grid, the US cyberinfrastructure. Access to the grid is provided through a simple and intuitive web interface integrated into the SBGrid Science Portal. Using this portal, refinements combined with full parameter optimization that would take many thousands of hours on standard computational resources can now be completed in several hours. An example of the successful application of DEN restraints to the human Notch1 transcriptional complex using the grid resource, and summaries of all submitted refinements, are presented as justification.

  4. Refining and petrochemicals

    International Nuclear Information System (INIS)

    Benazzi, E.; Alario, F.

    2004-01-01

    In 2003, refining margins showed a clear improvement that continued throughout the first three quarters of 2004. Oil companies posted significantly higher earnings in 2003 compared to 2002, with the results of first quarter 2004 confirming this trend. Due to higher feedstock prices, the implementation of new capacity and more intense competition, the petrochemicals industry was not able to boost margins in 2003. In such difficult business conditions, aggravated by soaring crude prices, the petrochemicals industry is not likely to see any improvement in profitability before the second half of 2004. (author)

  5. Initiating technical refinements in high-level golfers: Evidence for contradictory procedures.

    Science.gov (United States)

    Carson, Howie J; Collins, Dave; Richards, Jim

    2016-01-01

    When developing motor skills there are several outcomes available to an athlete depending on their skill status and needs. Whereas the skill acquisition and performance literature is abundant, an under-researched outcome relates to the refinement of already acquired and well-established skills. Contrary to current recommendations for athletes to employ an external focus of attention and a representative practice design, Carson and Collins' (2011) [Refining and regaining skills in fixation/diversification stage performers: The Five-A Model. International Review of Sport and Exercise Psychology, 4, 146-167. doi: 10.1080/1750984x.2011.613682] Five-A Model requires an initial narrowed internal focus on the technical aspect needing refinement: the implication being that environments which limit external sources of information would be beneficial to achieving this task. Therefore, the purpose of this paper was to (1) provide a literature-based explanation for why techniques counter to current recommendations may be (temporarily) appropriate within the skill refinement process and (2) provide empirical evidence for such efficacy. Kinematic data and self-perception reports are provided from high-level golfers attempting to consciously initiate technical refinements while executing shots onto a driving range and into a close proximity net (i.e. with limited knowledge of results). It was hypothesised that greater control over intended refinements would occur when environmental stimuli were reduced in the most unrepresentative practice condition (i.e. hitting into a net). Results confirmed this, as evidenced by reduced intra-individual movement variability for all participants' individual refinements, despite little or no difference in mental effort reported. This research offers coaches guidance when working with performers who may find conscious recall difficult during the skill refinement process.

  6. MULTIPLE CRITERIA DECISION MAKING APPROACH FOR INDUSTRIAL ENGINEER SELECTION USING FUZZY AHP-FUZZY TOPSIS

    OpenAIRE

    Deliktaş, Derya; ÜSTÜN, Özden

    2018-01-01

    In this study, a fuzzy multiple criteria decision-making approach is proposed to select an industrial engineer among ten candidates in a manufacturing environment. The industrial engineer selection problem is a special case of the personnel selection problem. The problem has a hierarchical structure with many criteria and involves multiple decision makers, whose evaluations include ambiguous parameters. The fuzzy AHP is used to determin...

  7. Future prospects for palm oil refining and modifications

    Directory of Open Access Journals (Sweden)

    Gibon Véronique

    2009-07-01

    Full Text Available Palm oil is rich in minor components that impart unique nutritional properties and need to be preserved. In this context, refining technologies have been improved, with the dual temperature deodorizer, the double condensing unit and the ice condensing system. The DOBI is a good tool to assess the quality of the crude palm oil and its ability to be properly refined. Specially refined oils open a market for new high quality products (golden palm oil, red palm oil, white soaps, etc.). Palm oil is a good candidate for the multi-step dry fractionation process, aiming at the production of commodity oils and specialty fats (cocoa butter replacers). New technological developments allow quality and yield improvements. Palm oil and its fractions are also valuable feedstocks for enzymatic interesterification, with applications in commodity oils (low-trans margarines and shortenings) and special products (cocoa butter equivalents, infant formulation, …).

  8. Unilever food safety assurance system for refined vegetable oils and fats

    Directory of Open Access Journals (Sweden)

    van Duijn Gerrit

    2010-03-01

    Full Text Available The Unilever Food Safety Assurance system for refined oils and fats is based on risk assessments for the presence of contaminants or pesticide residues in crude oils, and refining process studies to validate the removal of these components. Crude oil risk assessments were carried out by combining supply chain visits, and analyses of the contaminant and pesticide residue levels in a large number of crude oil samples. Contaminants like poly-aromatic hydrocarbons and hydrocarbons of mineral origin, and pesticide residues can largely be removed by refining. For many years, this Food Safety Assurance System has proven to be effective in controlling contaminant levels in refined vegetable oils and fats.

  9. Hirshfeld atom refinement for modelling strong hydrogen bonds.

    Science.gov (United States)

    Woińska, Magdalena; Jayatilaka, Dylan; Spackman, Mark A; Edwards, Alison J; Dominiak, Paulina M; Woźniak, Krzysztof; Nishibori, Eiji; Sugimoto, Kunihisa; Grabowsky, Simon

    2014-09-01

    High-resolution low-temperature synchrotron X-ray diffraction data of the salt L-phenylalaninium hydrogen maleate are used to test the new automated iterative Hirshfeld atom refinement (HAR) procedure for the modelling of strong hydrogen bonds. The HAR models used present the first examples of Z' > 1 treatments in the framework of wavefunction-based refinement methods. L-Phenylalaninium hydrogen maleate exhibits several hydrogen bonds in its crystal structure, of which the shortest and the most challenging to model is the O-H...O intramolecular hydrogen bond present in the hydrogen maleate anion (O...O distance is about 2.41 Å). In particular, the reconstruction of the electron density in the hydrogen maleate moiety and the determination of hydrogen-atom properties [positions, bond distances and anisotropic displacement parameters (ADPs)] are the focus of the study. For comparison to the HAR results, different spherical (independent atom model, IAM) and aspherical (free multipole model, MM; transferable aspherical atom model, TAAM) X-ray refinement techniques as well as results from a low-temperature neutron-diffraction experiment are employed. Hydrogen-atom ADPs are furthermore compared to those derived from a TLS/rigid-body (SHADE) treatment of the X-ray structures. The reference neutron-diffraction experiment reveals a truly symmetric hydrogen bond in the hydrogen maleate anion. Only with HAR is it possible to freely refine hydrogen-atom positions and ADPs from the X-ray data, which leads to the best electron-density model and the closest agreement with the structural parameters derived from the neutron-diffraction experiment, e.g. the symmetric hydrogen position can be reproduced. The multipole-based refinement techniques (MM and TAAM) yield slightly asymmetric positions, whereas the IAM yields a significantly asymmetric position.

  10. Oil price scenarios and refining profitability

    International Nuclear Information System (INIS)

    Sweeney, B.

    1993-01-01

    Currently refining profitability is low because there has been an overbuilding of conversion capacity in Western Europe in the last round. Oil marketing, the chemicals business and the fundamental economy itself are at low points in their cycles which have not coincided, at least in the UK, since 1975. Against that gloomy background, it is predicted that downstream profitability will recover in the mid-1990s. Crude oil prices will remain low until the call on OPEC crude increases again and takes up the capacity which has been brought on stream in response to the Gulf War. When this happens, it is likely to trigger another price spike and another round of investment in production capacity. Environmentally driven investments in desulphurisation or emissions reduction will be poorly remunerated all the way through the value chain. Refining margins will recover when white oil demand growth tightens up the need for conversion capacity. Marketing will need to reduce the retail network overcapacity in the mature markets if it is to improve its profitability. In this period of low profitability, even with the light at the end of the tunnel for refiners in the middle of the decade, the industry structure is under threat. There is a strong argument for new modes of competitive behaviour which are backed by strong elements of cooperation. (author)

  11. Influences of different degassing processes on refining effect and properties of 4004 Al alloy

    Directory of Open Access Journals (Sweden)

    Wang Liping

    2013-03-01

    Full Text Available In order to improve the plasticity of 4004 Al alloy and subsequently the productivity of 4004 Al foil, this research studied in detail the influence of the rotary impeller degassing process on the refining effect of 4004 Al alloy, in which the impacts of four major parameters (gas flow, rotational speed, refining time, and stewing time) on the degassing rate of 4004 Al alloy were systematically studied by using an orthogonal experiment methodology. Results show that the rotational speed has the greatest impact on the degassing of 4004 Al alloy, followed by gas flow and refining time; stewing time has the least impact. The optimum purification parameters obtained by the orthogonal analysis were: rotor speed of 500 r·min-1, inert gas flow of 0.4 mL·h-1, refining time of 15 min, and stewing time of 6 min. The degassing rate using the optimum parameters reaches 68%. In addition, comparison experiments among C2Cl6 refining, rotary impeller degassing, and the combined treatment of C2Cl6 refining and rotary impeller degassing for 4004 Al alloy were performed. The experimental data indicated that the combined treatment of C2Cl6 refining and rotary impeller degassing has the best degassing effect. The degassing rate of C2Cl6 refining, rotary impeller degassing and the combined refining treatment is 39%, 69.1% and 76.9%, respectively. The mechanical properties of the specimens refined by rotary impeller degassing were higher than those by C2Cl6 refining, but lower than those by the combined refining treatment.
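The factor ranking reported above (rotational speed > gas flow > refining time > stewing time) is the kind of output produced by the range analysis conventionally paired with orthogonal arrays: average the response at each level of a factor, then rank factors by the spread of those level means. A minimal sketch, using hypothetical factor levels and degassing rates rather than the study's data:

```python
# Range analysis for an orthogonal (Taguchi-style) experiment: for each
# factor, average the response at each level and take the spread (range);
# a larger range means a stronger influence on the degassing rate.
from collections import defaultdict

def range_analysis(runs, response):
    """runs: list of dicts mapping factor name -> level; response: list of
    measured degassing rates (%). Returns {factor: range of level means}."""
    ranges = {}
    for f in runs[0].keys():
        level_vals = defaultdict(list)
        for run, y in zip(runs, response):
            level_vals[run[f]].append(y)
        means = [sum(v) / len(v) for v in level_vals.values()]
        ranges[f] = max(means) - min(means)
    return ranges

# Hypothetical two-factor, two-level design (levels and rates are illustrative):
runs = [
    {"speed": 400, "flow": 0.2}, {"speed": 400, "flow": 0.4},
    {"speed": 500, "flow": 0.2}, {"speed": 500, "flow": 0.4},
]
rates = [52.0, 58.0, 63.0, 68.0]
print(range_analysis(runs, rates))  # prints {'speed': 10.5, 'flow': 5.5}
```

Here the larger range for rotor speed would rank it as the more influential factor, mirroring the kind of conclusion drawn in the abstract.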

  12. A Feasibility Assessment of Behavioral-based Interviewing to Improve Candidate Selection for a Pulmonary and Critical Care Medicine Fellowship Program.

    Science.gov (United States)

    Tatem, Geneva; Kokas, Maria; Smith, Cathy L; DiGiovine, Bruno

    2017-04-01

    Traditional interviews for residency and fellowship training programs are an important component in the selection process, but can be of variable value due to a nonstandardized approach. We redesigned the candidate interview process for our large pulmonary and critical care medicine fellowship program in the United States using a behavioral-based interview (BBI) structure. The primary goal of this approach was to standardize the assessment of candidates within noncognitive domains with the goal of selecting those with the best fit for our institution's fellowship program. Eight faculty members attended two BBI workshops. The first workshop identified our program's "best fit" criteria using the framework of the Accreditation Council for Graduate Medical Education's six core competencies and additional behaviors that fit within our programs. BBI questions were then selected from a national database and refined based on the attributes deemed most important by our faculty. In the second workshop, faculty practiced the BBI format in mock interviews with third-year fellows. The interview process was further refined based on feedback from the interviewees, and then applied with fellowship candidates for the 2014 recruitment season. The 1-year pilot of behavioral-based interviewing allowed us to achieve consensus on the traits sought for our incoming fellows and to standardize the interview process for our program using the framework of the Accreditation Council for Graduate Medical Education core competencies. Although the effects of this change on the clinical performance of our fellows have not yet been assessed, this description of our development and implementation processes may be helpful for programs seeking to redesign their applicant interviews.

  13. Utilization integrated Fuzzy-QFD and TOPSIS approach in supplier selection

    Directory of Open Access Journals (Sweden)

    2016-02-01

    Full Text Available Supplier selection is a typical multi-attribute problem that involves both qualitative and quantitative factors. To deal with this problem, different techniques have been suggested. Being based on purely mathematical data, these techniques have significant drawbacks, especially when qualitative factors must be considered, which are very important in supplier selection and are not easy to measure. Some innovative approaches based on artificial intelligence techniques, such as fuzzy logic, match decision-making situations very well, especially when decision makers express heterogeneous judgments. In this research, by combining fuzzy logic and the House of Quality (HOQ), qualitative criteria are considered in the early stages of the car supplier selection process in the Sazehgostar SAIPA Company. Then, the TOPSIS technique is adopted to consider quantitative metrics. Finally, by combining the fuzzy QFD and TOPSIS techniques, the suppliers are selected and ranked. Attention to both qualitative and quantitative criteria is the key point of this research, and the methodology employed constitutes its innovative aspect. The limited number of experts associated with each part and the unavailability of some quantitative criteria were limitations of this study.

  14. The Analysis of the Refined Financial Management of Modern Enterprises

    Directory of Open Access Journals (Sweden)

    Li Ran

    2016-01-01

    Full Text Available This paper briefly introduces the concept of refined financial management, elaborates on its characteristics and puts forward some main points about it. It also offers some suggestions for reference on effective ways of refining financial management.

  15. Profex: a graphical user interface for the Rietveld refinement program BGMN

    OpenAIRE

    Doebelin, Nicola; Kleeberg, Reinhard

    2015-01-01

    Profex is a graphical user interface for the Rietveld refinement program BGMN. Its interface focuses on preserving BGMN’s powerful and flexible scripting features by giving direct access to BGMN input files. Very efficient workflows for single or batch refinements are achieved by managing refinement control files and structure files, by providing dialogues and shortcuts for many operations, by performing operations in the background, and by providing import filters for CIF and XML crystal str...

  16. Process for refining hydrocarbons

    Energy Technology Data Exchange (ETDEWEB)

    Risenfeld, E H

    1924-11-26

    A process is disclosed for the refining of hydrocarbons or other mixtures through treatment in vapor form with metal catalysts, characterized in that the catalysts used are metals obtained by reduction of the oxides of minerals containing the iron group, and in that the vapors of the hydrocarbons, in the presence of water vapor, are led over these catalysts at temperatures from 200 to 300°C.

  17. A robust fuzzy possibilistic AHP approach for partner selection in international strategic alliance

    Directory of Open Access Journals (Sweden)

    Vahid Reza Salamat

    2018-09-01

    Full Text Available The international strategic alliance is an inevitable solution for building competitive advantage and reducing risk in today's business environment. Partner selection is an important part of the success of partnerships, and meanwhile it is a complicated decision because of the various dimensions of the problem and the inherent conflicts of stakeholders. The purpose of this paper is to provide a practical approach to the problem of partner selection in international strategic alliances, which fills the gap between theories of inter-organizational relationships and quantitative models. Thus, a novel robust fuzzy possibilistic AHP approach is proposed, combining the benefits of two complementary theories of inter-organizational relationships, (1) the resource-based view and (2) transaction-cost theory, and considering fit theory as the prerequisite of alliance success. The robust fuzzy possibilistic AHP approach is a novel development of the Interval-AHP technique employing a robust formulation, aimed at handling the ambiguity of the problem and allowing the use of intervals as pairwise judgments. The proposed approach was compared with existing approaches, and the results show that it provides the best quality solutions in terms of minimum error degree. Moreover, the framework was implemented in a case study and its applicability discussed.

  18. Selection of suitable e-learning approach using TOPSIS technique with best ranked criteria weights

    Science.gov (United States)

    Mohammed, Husam Jasim; Kasim, Maznah Mat; Shaharanee, Izwan Nizal Mohd

    2017-11-01

    This paper compares the performance of four rank-based weighting assessment techniques, Rank Sum (RS), Rank Reciprocal (RR), Rank Exponent (RE), and Rank Order Centroid (ROC), on five identified e-learning criteria to select the best weighting method. A total of 35 experts in a public university in Malaysia were asked to rank the criteria and to evaluate five e-learning approaches: blended learning, flipped classroom, ICT-supported face-to-face learning, synchronous learning, and asynchronous learning. The best ranked criteria weights, defined as the weights with the least total absolute difference from the geometric mean of all weights, were then used to select the most suitable e-learning approach by using the TOPSIS method. The results show that the RR weights are the best, while the flipped classroom is the most suitable approach to implement. This paper develops a decision framework to aid decision makers (DMs) in choosing the most suitable weighting method for solving MCDM problems.
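As a rough illustration of the two components named above, the sketch below derives Rank Reciprocal weights (w_i = (1/r_i) / Σ_j 1/r_j) and feeds them into a standard benefit-criteria TOPSIS with vector normalization. The 3×3 decision matrix is hypothetical, not the experts' data, and all criteria are assumed to be benefit-type.

```python
import math

def rank_reciprocal_weights(ranks):
    """Rank Reciprocal (RR) weights: w_i = (1/r_i) / sum_j (1/r_j)."""
    inv = [1.0 / r for r in ranks]
    s = sum(inv)
    return [v / s for v in inv]

def topsis(matrix, weights):
    """Classic TOPSIS with vector normalization; all criteria assumed
    benefit-type. Returns closeness coefficients (higher = better)."""
    cols = list(zip(*matrix))
    norms = [math.sqrt(sum(x * x for x in c)) for c in cols]
    v = [[w * x / n for x, w, n in zip(row, weights, norms)] for row in matrix]
    best = [max(c) for c in zip(*v)]    # positive ideal solution
    worst = [min(c) for c in zip(*v)]   # negative ideal solution
    d_best = [math.dist(row, best) for row in v]
    d_worst = [math.dist(row, worst) for row in v]
    return [dw / (db + dw) for db, dw in zip(d_best, d_worst)]

# Hypothetical scores for three alternatives on three ranked criteria:
w = rank_reciprocal_weights([1, 2, 3])           # criteria ranked 1..3
cc = topsis([[7, 8, 6], [9, 6, 7], [6, 7, 9]], w)
print(max(range(3), key=cc.__getitem__))         # prints 1 (best alternative)
```

Swapping `rank_reciprocal_weights` for an RS, RE, or ROC formula changes only the weighting step; the TOPSIS ranking machinery stays the same, which is what makes the paper's comparison of the four schemes straightforward.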

  19. Environmental monitoring program design for uranium refining and conversion operations

    International Nuclear Information System (INIS)

    1984-08-01

    The objective of this study was to develop recommendations for the design of environmental monitoring programs at Canadian uranium refining and conversion operations. In order to develop monitoring priorities, chemical and radioactive releases to the air and water were developed for reference uranium refining and conversion facilities. The relative significance of the radioactive releases was evaluated through a pathways analysis which estimated dose to individual members of the critical receptor group. The effects of chemical releases to the environment were assessed by comparing predicted air and water contaminant levels to appropriate standards or guidelines. For the reference facilities studied, the analysis suggested that environmental effects are likely to be dominated by airborne release of both radioactive and nonradioactive contaminants. Uranium was found to be the most important radioactive species released to the air and can serve as an overall indicator of radiological impacts for any of the plants considered. The most important nonradioactive air emission was found to be fluoride (as hydrogen fluoride) from the uranium hexafluoride plant. For the uranium trioxide and uranium dioxide plants, air emissions of oxides of nitrogen were considered to be most important. The study recommendations for the design of an environmental monitoring program are based on consideration of those factors most likely to affect local air and water quality, and human radiation exposure. Site- and facility-specific factors will affect monitoring program design and the selection of components such as sampling media, locations and frequency, and analytical methods

  20. Investment in exploration-production and refining 2014

    International Nuclear Information System (INIS)

    Hureau, Geoffroy; Serbutoviez, Sylvain; Silva, Constancio; Maisonnier, Guy

    2014-11-01

    IFPEN analyses in this study the 2014 evolution of global investment in the field of exploration-production and refining: 1 - Changes in oil and gas prices: General background: weak economy and global disorder, Oil prices: fundamentals that could help to relax oil prices?, Gas prices: fall in Europe, stability in Japan, increase in the US; 2 - Exploration and production - Slowdown in growth: moderate rise in investment in 2014, exploration - Discoveries in 2014, Russia: sanctions will have limited short term impact, implications of the reforms to the Mexican energy sector; 3 - Drilling activity and market throughout the world: onshore and offshore drilling (Number of wells drilled throughout the world, Number of onshore wells, Number of offshore wells, Drilling, equipment and well services markets, Onshore drilling market, Offshore drilling market, Fracking market), Geophysical activity and the geophysical market, Offshore construction activity and the offshore construction market (Offshore construction activities, Rig construction activity, Floating Platform Systems (FPS), Sub-sea constructions, Offshore construction market); 4 - Refining - Significant increase in spending: increase in industrial costs, a slowdown in the increase in excess capacity in the future?, A bleak future for the European refining sector

  1. P-Refinement and P-Threads (Preprint)

    National Research Council Canada - National Science Library

    Dong, Steven; Karniadakis, George E

    2002-01-01

    ...]) in d dimensions, which is higher than lower-order methods. In this paper, we demonstrate that by employing multi-threading within MPI processes we manage to counterbalance the cost increase associated with P-refinement...

  2. Optimization of breast reconstruction results using TMG flap in 30 cases: Evaluation of several refinements addressing flap design, shaping techniques, and reduction of donor site morbidity.

    Science.gov (United States)

    Nickl, Stefanie; Nedomansky, Jakob; Radtke, Christine; Haslik, Werner; Schroegendorfer, Klaus F

    2018-01-31

    The transverse myocutaneous gracilis (TMG) flap is a widely used alternative to abdominal flaps in autologous breast reconstruction. However, secondary procedures for aesthetic refinement are frequently necessary. Herein, we present our experience with an optimized approach in TMG breast reconstruction to enhance aesthetic outcome and to reduce the need for secondary refinements. We retrospectively analyzed 37 immediate or delayed reconstructions with TMG flaps in 34 women, performed between 2009 and 2015. Four patients (5 flaps) constituted the conventional group (non-optimized approach). Thirty patients (32 flaps; modified group) underwent an optimized procedure consisting of modified flap harvesting and shaping techniques and methods utilized to reduce denting after rib resection and to diminish donor site morbidity. Statistically significant fewer secondary procedures (0.6 ± 0.9 versus 4.8 ± 2.2; P < .001) and fewer trips to the OR (0.4 ± 0.7 versus 2.3 ± 1.0 times; P = .001) for aesthetic refinement were needed in the modified group as compared to the conventional group. In the modified group, 4 patients (13.3%) required refinement of the reconstructed breast, 7 patients (23.3%) underwent mastopexy/mammoplasty or lipofilling of the contralateral breast, and 4 patients (13.3%) required refinement of the contralateral thigh. Total flap loss did not occur in any patient. Revision surgery was needed once. Compared to the conventional group, enhanced aesthetic results with consecutive reduction of secondary refinements could be achieved when using our modified flap harvesting and shaping techniques, as well as our methods for reducing contour deformities after rib resection and for overcoming donor site morbidities. © 2017 Wiley Periodicals, Inc.

  3. Carcinogenicity of petroleum lubricating oil distillates: effects of solvent refining, hydroprocessing, and blending.

    Science.gov (United States)

    Halder, C A; Warne, T M; Little, R Q; Garvin, P J

    1984-01-01

    Certain refining processes were investigated to determine their influence on the dermal carcinogenic activity of petroleum-derived lubricating oil distillates. Specifically, the effects of solvent refining, hydroprocessing, a combination of both processes, and the blending of oils processed using each technique were evaluated in standard mouse skin-painting bioassays. The refining process used as well as the level or severity of treatment greatly influenced the carcinogenic outcome of processed lubricating oils. Solvent refining at severities normally used appeared to eliminate carcinogenicity. In contrast, hydroprocessing alone at mild levels of treatment was successful only in reducing the carcinogenic potency; severe hydroprocessing conditions were necessary to eliminate carcinogenic activity without the use of additional refining processes. Carcinogenic activity could also be eliminated by following moderate solvent refining with mild hydroprocessing. Blending of hydroprocessed oils with solvent-refined oils resulted in a substantial reduction or even elimination of carcinogenic activity. However, the degree of protection obtained varied with the particular distillates used and appeared largely dependent on the inherent biological activity of the hydroprocessed oil.

  4. Refining in the 1990's: Restructuring and resurgence

    International Nuclear Information System (INIS)

    Cobb, C.B.

    1994-01-01

    After two years of uncertainty in dealing with the 1990 Clean Air Act Amendments coupled with the shutdown of 5% of total US refining capacity, the industry is now positioning itself for continued operations throughout the remainder of the decade. However, refineries are experiencing a shift in the mode of operations to a period of more restructuring (closings, ventures, alliances, etc.) followed by a resurgence in financial performance. The purpose of this paper is to examine the current industry and highlight the reasons for industry's current plans. The authors also speculate about the strategies companies will choose to better their financial performance. Fundamentally, the characteristics of a mature domestic business remain the driving force that shape decision making. In responding to the maturing of refining, the authors suggest that refiners will change the way they conduct business over the next few years. Building on the theme of the 1993 NPRA paper, strategies will target the domestic side of the business while simultaneously shifting to a global perspective

  5. India beckons participants in burgeoning refining sector

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    This paper reports that India has opened its refining sector to full private investment for the first time in more than 2 decades. The government again gave a green light to construction of three 120,000 b/d grassroots refineries in East, West, and Central India. The projects had won various governmental approvals in the past few years, but never moved off high center for a variety of economic and regulatory reasons. The difference this time is that the government is offering interests in the projects to private foreign and domestic investors. It's part of India's push to boost overall refining capacity by more than 80% this century

  6. Evolutionarily conserved bias of amino-acid usage refines the definition of PDZ-binding motif

    Directory of Open Access Journals (Sweden)

    Launey Thomas

    2011-06-01

    Full Text Available Abstract Background: The interactions between PDZ (PSD-95, Dlg, ZO-1) domains and PDZ-binding motifs play central roles in signal transduction within cells. Proteins with PDZ domains bind to PDZ-binding motifs almost exclusively when the motifs are located at the carboxyl (C-) terminal ends of their binding partners. However, it remains little explored whether PDZ-binding motifs show any preferential location at the C-terminal ends of proteins at the genome level. Results: Here, we examined the distribution of the type-I (x-x-S/T-x-I/L/V) or type-II (x-x-V-x-I/V) PDZ-binding motifs in proteins encoded in the genomes of five different species (human, mouse, zebrafish, fruit fly and nematode). We first established that these PDZ-binding motifs are indeed preferentially present at their C-terminal ends. Moreover, we found specific amino acid (AA) bias for the 'x' positions in the motifs at the C-terminal ends. In general, hydrophilic AAs were favored. Our genomics-based findings confirm and largely extend the results of previous interaction-based studies, allowing us to propose refined consensus sequences for all of the examined PDZ-binding motifs. An ontological analysis revealed that the refined motifs are functionally relevant, since a large fraction of the proteins bearing the motif appear to be involved in signal transduction. Furthermore, co-precipitation experiments confirmed two new protein interactions predicted by our genomics-based approach. Finally, we show that influenza virus pathogenicity can be correlated with the PDZ-binding motif, with high-virulence viral proteins bearing a refined PDZ-binding motif. Conclusions: Our refined definition of PDZ-binding motifs should provide important clues for identifying functional PDZ-binding motifs and proteins involved in signal transduction.
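The two C-terminal patterns quoted above ('x' = any residue) translate directly into anchored regular expressions, which is essentially how such a genome scan can be implemented. A minimal sketch with hypothetical sequences (the helper name and the example sequences are assumptions, not from the paper):

```python
import re

# C-terminal PDZ-binding motifs as written in the abstract (read N->C,
# with the last residue at the C-terminus):
TYPE_I = re.compile(r"[A-Z]{2}[ST][A-Z][ILV]$")   # x-x-S/T-x-I/L/V
TYPE_II = re.compile(r"[A-Z]{2}V[A-Z][IV]$")      # x-x-V-x-I/V

def classify_cterm(seq):
    """Classify the C-terminal pentapeptide of an uppercase protein sequence."""
    if TYPE_I.search(seq):
        return "type-I"
    if TYPE_II.search(seq):
        return "type-II"
    return None

# Hypothetical sequences:
print(classify_cterm("MAGWIRKQTSV"))  # prints type-I  (T at position -2)
print(classify_cterm("MKLPAFSVEV"))   # prints type-II (V at position -2)
print(classify_cterm("MAAAAG"))       # prints None
```

The `$` anchor is what encodes the paper's central observation: the motif only counts when it sits at the extreme C-terminus, so an internal occurrence of the same pentapeptide is deliberately not matched.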

  7. An Approach to Quantifying Pokemon's Entertainment Impact with focus on Battle

    OpenAIRE

    Panumate, Chetprayoon; Xiong, Shuo; Iida, Hiroyuki

    2015-01-01

    This paper explores the attractiveness of Pokemon, a turn-based Role Playing Game (RPG) that has been very popular for decades. In this study we focus on the Pokemon battle, which is the most important component of Pokemon. The game refinement theory is used as a tool to assess the degree of sophistication of Pokemon battle. For this purpose we apply two approaches of game progress modeling to derive the game refinement measure, i.e., the score limit sports approach and the board game approach. We ...

  8. Horn clause verification with convex polyhedral abstraction and tree automata-based refinement

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2017-01-01

    In this paper we apply tree-automata techniques to refinement of abstract interpretation in Horn clause verification. We go beyond previous work on refining trace abstractions; firstly we handle tree automata rather than string automata and thereby can capture traces in any Horn clause derivations...... underlying the Horn clauses. Experiments using linear constraint problems and the abstract domain of convex polyhedra show that the refinement technique is practical and that iteration of abstract interpretation with tree automata-based refinement solves many challenging Horn clause verification problems. We...... compare the results with other state-of-the-art Horn clause verification tools....

  9. Unstructured Cartesian refinement with sharp interface immersed boundary method for 3D unsteady incompressible flows

    Science.gov (United States)

    Angelidis, Dionysios; Chawdhary, Saurabh; Sotiropoulos, Fotis

    2016-11-01

    A novel numerical method is developed for solving the 3D, unsteady, incompressible Navier-Stokes equations on locally refined fully unstructured Cartesian grids in domains with arbitrarily complex immersed boundaries. Owing to the utilization of the fractional step method on an unstructured Cartesian hybrid staggered/non-staggered grid layout, flux mismatch and pressure discontinuity issues are avoided and the divergence-free constraint is inherently satisfied to machine zero. Auxiliary/hanging nodes are used to facilitate the discretization of the governing equations. The second-order accuracy of the solver is ensured by using multi-dimensional Lagrange interpolation operators and appropriate differencing schemes at the interface of regions with different levels of refinement. The sharp interface immersed boundary method is augmented with local near-boundary refinement to handle arbitrarily complex boundaries. The discrete momentum equation is solved with the matrix-free Newton-Krylov method and a Krylov-subspace method is employed to solve the Poisson equation. The second-order accuracy of the proposed method on unstructured Cartesian grids is demonstrated by solving the Poisson equation with a known analytical solution. A number of three-dimensional laminar flow simulations of increasing complexity illustrate the ability of the method to handle flows across a range of Reynolds numbers and flow regimes. Laminar steady and unsteady flows past a sphere and the oblique vortex shedding from a circular cylinder mounted between two end walls demonstrate the accuracy, the efficiency and the smooth transition of scales and coherent structures across refinement levels. Large-eddy simulation (LES) past a miniature wind turbine rotor, parameterized using the actuator line approach, indicates the ability of the fully unstructured solver to simulate complex turbulent flows. Finally, a geometry resolving LES of turbulent flow past a complete hydrokinetic turbine illustrates
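
    The Krylov-subspace Poisson solve at the heart of such fractional-step methods can be illustrated with a minimal conjugate-gradient sketch on a 1D Poisson problem with a known analytical solution (a toy stand-in for the paper's 3D unstructured solver; the grid size and tolerance are arbitrary):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Minimal Krylov-subspace (CG) solver for an SPD system A x = b."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# 1D Poisson problem -u'' = f with homogeneous Dirichlet BCs; the exact
# solution u = sin(pi x) verifies second-order accuracy of the scheme.
n = 50
h = 1.0 / (n + 1)
A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2
x_grid = np.linspace(h, 1 - h, n)
f = np.pi**2 * np.sin(np.pi * x_grid)
u = conjugate_gradient(A, f)
err = float(np.max(np.abs(u - np.sin(np.pi * x_grid))))
```

    The maximum error against the analytical solution is O(h²), mirroring the verification strategy described in the abstract.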

  10. Refining capacity in Quebec : the Quebec market, industry viewpoint

    International Nuclear Information System (INIS)

    Trudelle, F.

    2004-01-01

    Canada's second largest oil refinery is operated by Ultramar Ltd. in Levis, Quebec. The refinery, which supplies 45 per cent of all Quebec's petroleum, distributes its products to Quebec, Ontario and the Maritime provinces. The refinery currently produces 215,000 barrels per day. Ultramar is a division of Valero, an American firm which has a total refining capacity of 2.4 million barrels per day. A summary of the company's petroleum energy distribution, consumption, and reserves was presented and compared with the national and global position. It was noted that world demand for petroleum products increases yearly by 1 to 1.5 per cent, while the demand for automobile fuel in North America increases by 2.5 to 3.5 per cent. In the last 10 years, world demand has increased by more than 22 per cent but refining capacity has only increased by 12 per cent. The average profitability of the refining industry from 1992 to 2002 was approximately 5.5 per cent, which limited investments in new refinery installations. Much of the industry's profits have been used to modify installations in order to reduce the sulphur content in gasoline and diesel fuels. Furthermore, obtaining permits to construct new refining installations has become a major obstacle, and the ratification of the Kyoto Protocol may signify a 30 per cent reduction in automotive fuel demand. Given this, the refining industry is currently hesitant and uncertain about proceeding with major new installations. tabs., figs

  11. Integrative approaches to the prediction of protein functions based on the feature selection

    Directory of Open Access Journals (Sweden)

    Lee Hyunju

    2009-12-01

    Full Text Available Abstract Background Protein function prediction has been one of the most important issues in functional genomics. With the current availability of various genomic data sets, many researchers have attempted to develop integration models that combine all available genomic data for protein function prediction. These efforts have resulted in improved prediction quality and extended prediction coverage. However, it has also been observed that integrating more data sources does not always increase prediction quality. Selecting the data sources that contribute most to protein function prediction has therefore become an important issue. Results We present systematic feature selection methods that assess the contribution of genome-wide data sets to predicting protein functions, and then investigate the relationship between genomic data sources and protein functions. In this study, we use ten different genomic data sources in Mus musculus, including protein domains, protein-protein interactions, gene expression, phenotype ontology, phylogenetic profiles and disease data sources, to predict protein functions labelled with Gene Ontology (GO) terms. We then apply two approaches to feature selection: exhaustive-search feature selection using kernel-based logistic regression (KLR), and kernel-based L1-norm regularized logistic regression (KL1LR). In the first approach, we exhaustively measure the contribution of each data set to each function based on its prediction quality. In the second approach, we use the estimated coefficients of features as measures of the contribution of data sources. Our results show that the proposed methods improve prediction quality compared to the full integration of all data sources and to other filter-based feature selection methods. We also show that the contributing data sources can differ depending on the protein function.
Furthermore, we observe that highly contributing data sets can be similar among
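
    The L1-regularization idea behind KL1LR, reading near-zero coefficients as data sources that contribute little, can be sketched on synthetic data (the feature blocks, labels and regularization strength C below are assumptions for illustration, not the paper's kernel-based model):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for three "data sources": only the first is informative.
rng = np.random.default_rng(0)
n = 400
informative = rng.normal(size=(n, 1))
noise = rng.normal(size=(n, 2))
X = np.hstack([informative, noise])
y = (informative[:, 0] + 0.3 * rng.normal(size=n) > 0).astype(int)

# L1-regularized logistic regression: the coefficient magnitudes serve as
# contribution scores, and sources with near-zero weight can be dropped.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)
coef = np.abs(clf.coef_[0])
```

    On this toy problem the informative source receives a much larger coefficient than the noise sources, which is the selection signal the second approach exploits.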

  12. Selection of engineering materials for heat exchangers (An expert system approach)

    International Nuclear Information System (INIS)

    Ahmed, K.; Abou-Ali, M.; Bassuni, M.

    1997-01-01

    Materials selection, as part of the design process of heat exchangers, is one of the most important steps in the whole industry. Clear recognition of the service requirements of the different types of heat exchangers is essential to selecting adequate and economic materials that meet those requirements. Of course, the manufacturer should ensure that failure does not occur in service, especially since the heat exchanger is one of the main and most vital components of a pressurized water reactor (PWR). It is therefore necessary to know the possible mechanisms of failure. The achievement of materials selection using an expert system approach within the process sequence of heat exchanger manufacturing is also introduced. The different parameters and requirements controlling each process, and the linkage between these parameters and the final product, are shown. 2 figs., 3 tabs

  13. Tumor recognition in wireless capsule endoscopy images using textural features and SVM-based feature selection.

    Science.gov (United States)

    Li, Baopu; Meng, Max Q-H

    2012-05-01

    Tumors of the digestive tract are a common disease, and wireless capsule endoscopy (WCE) is a relatively new technology for examining diseases of the digestive tract, especially the small intestine. This paper addresses the problem of automatic recognition of tumors in WCE images. A candidate color texture feature that integrates the uniform local binary pattern and the wavelet transform is proposed to characterize WCE images. The proposed features are invariant to illumination change and describe the multiresolution characteristics of WCE images. Two feature selection approaches based on support vector machines, sequential forward floating selection and recursive feature elimination, are further employed to refine the proposed features and improve the detection accuracy. Extensive experiments validate that the proposed computer-aided diagnosis system achieves a promising tumor recognition accuracy of 92.4% in WCE images on our collected data.
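
    The recursive feature elimination step can be sketched with scikit-learn's `RFE` wrapper around a linear SVM (the synthetic features below stand in for the paper's color texture descriptors; `n_features_to_select` is an assumed setting):

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.feature_selection import RFE

rng = np.random.default_rng(1)
n = 300
signal = rng.normal(size=(n, 2))   # two informative "texture" features
noise = rng.normal(size=(n, 8))    # eight uninformative ones
X = np.hstack([signal, noise])
y = (signal[:, 0] + signal[:, 1] > 0).astype(int)

# RFE repeatedly fits the SVM and drops the feature with the smallest
# weight, keeping only the most discriminative descriptors.
selector = RFE(LinearSVC(dual=False), n_features_to_select=2).fit(X, y)
kept = selector.support_
```

    On this toy data the two informative features survive elimination, mirroring how RFE prunes the candidate color texture features down to a discriminative subset.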

  14. Assessing food allergy risks from residual peanut protein in highly refined vegetable oil

    NARCIS (Netherlands)

    Blom, W.M.; Kruizinga, A.G.; Rubingh, C.M.; Remington, B.C.; Crevel, R.W.R.; Houben, G.F.

    2017-01-01

    Refined vegetable oils including refined peanut oil are widely used in foods. Due to shared production processes, refined non-peanut vegetable oils can contain residual peanut proteins. We estimated the predicted number of allergic reactions to residual peanut proteins using probabilistic risk

  15. A synbio approach for selection of highly expressed gene variants in Gram-positive bacteria

    DEFF Research Database (Denmark)

    Ferro, Roberto; Rennig, Maja; Hernández Rollán, Cristina

    2018-01-01

    with a long history in food fermentation. We have developed a synbio approach for increasing gene expression in two Gram-positive bacteria. First of all, the gene of interest was coupled to an antibiotic resistance gene to create a growth-based selection system. We then randomised the translation initiation region (TIR) preceding the gene of interest and selected clones that produced high protein titres, as judged by their ability to survive on high concentrations of antibiotic. Using this approach, we were able to significantly increase production of two industrially relevant proteins; sialidase in B. subtilis and tyrosine ammonia lyase in L. lactis. Gram-positive bacteria are widely used to produce industrial enzymes. High titres are necessary to make the production economically feasible. The synbio approach presented here is a simple and inexpensive way to increase protein titres, which can be carried...

  16. The changing face of U.S. refining: Ominous notes

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    As environmental protection comes of age in the US, a complex series of structural changes is also expected - in enforcement bureaucracy, manufacturing, and in energy consumption. It is already quite obvious in the petroleum refining industry. A side effect may be the export of jobs. Buyouts and closures are expected, as is increased refined product import dependency. This issue updates expected changes in gasoline and distillate product requirements in the US, and reports some ominous statements from some of the oil industry's affected parties. This issue also presented the following: (1) the ED Refining Netback Data Series for the US Gulf and West Coasts, Rotterdam, and Singapore as of Jan. 24, 1992; and (2) the ED Fuel Price Tax Series for countries of the Eastern Hemisphere, Jan. 1992 edition

  17. The stock selection problem: Is the stock selection approach more important than the optimization method? Evidence from the Danish stock market

    OpenAIRE

    Grobys, Klaus

    2011-01-01

    Passive investment strategies basically aim to replicate an underlying benchmark. Thereby, the management usually selects a subset of stocks to be employed in the optimization procedure. Apart from the optimization procedure, the stock selection approach determines the stock portfolios' out-of-sample performance. The empirical study here takes into account the Danish stock market from 2000-2010 and gives evidence that stock portfolios including small companies' stocks being estimated via coin...

  18. Realization of the Zone Length Measurement during Zone Refining Process via Implementation of an Infrared Camera

    Directory of Open Access Journals (Sweden)

    Danilo C. Curtolo

    2018-05-01

    Full Text Available Zone refining, currently the most common industrial process for attaining ultrapure metals, is influenced by a variety of factors. One of these parameters, the so-called “zone length”, affects not only the ultimate concentration distribution of impurities, but also the rate at which this distribution is approached. This important parameter has, however, neither been investigated experimentally nor varied for the purpose of optimization. This gap may be due to the difficulty of measuring the temperature of a moving molten zone inside the vacuum system that the zone refining setup comprises. Until now, numerical simulation, combining complex mathematical calculations with many assumptions, has been the only way to estimate it. This paper proposes an experimental method to accurately measure the molten zone length and to extract helpful information on the thermal gradient, temperature profile and real growth rate in the zone refining of an exemplary metal, in this case aluminum. The thermographic method is based on measuring the molten surface temperature with an infrared camera, followed by data analysis in the mathematical software MATLAB. The obtained results correlate well with visual observations of the zone length and provide helpful information for determining the thermal gradient and real growth rate during the whole process. The investigations in this paper confirm that an infrared camera is a promising means of automatically controlling the zone length during a zone refining process.
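
    A minimal sketch of the underlying idea, estimating the zone length from an IR line profile as the extent of the region above the melting temperature (the Gaussian profile and the bare threshold at 660 °C for aluminium are assumptions, not the paper's MATLAB pipeline):

```python
import numpy as np

def molten_zone_length(profile_celsius, positions_mm, t_melt=660.0):
    """Estimate the molten zone length from an IR line profile by
    thresholding at the melting temperature (660 C for aluminium)."""
    molten = profile_celsius >= t_melt
    if not molten.any():
        return 0.0
    idx = np.flatnonzero(molten)
    return positions_mm[idx[-1]] - positions_mm[idx[0]]

# Synthetic Gaussian hot zone standing in for a camera line scan.
x = np.linspace(0.0, 200.0, 2001)   # position along the bar, mm
profile = 400.0 + 350.0 * np.exp(-((x - 100.0) / 15.0) ** 2)
length = molten_zone_length(profile, x)
```

    A real implementation would also need emissivity calibration of the camera before the temperature threshold is meaningful.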

  19. Taiwan: refined need for consuming population

    International Nuclear Information System (INIS)

    Hayes, David.

    1995-01-01

    A brief discussion is given of the oil and gas industry in Taiwan. Topics covered include the possibility of privatization, refineries and refining contracts overseas, plans for a new petrochemical complex and an offshore submarine transmission pipeline. (UK)

  20. Effect of some grain refiners on the mechanical properties of aluminum

    International Nuclear Information System (INIS)

    Zaid, A.I.O.

    2001-01-01

    It is well established that aluminum and its alloys are grain refined by some refractory metals to enhance their surface qualities and mechanical strength. In this paper, the literature on grain refining and its mechanism is reviewed and discussed. Also, the effect of grain refining of commercially pure aluminum by the addition of titanium, boron, vanadium, molybdenum, and zirconium is investigated. The effect of each of these elements on grain size, hardness and mechanical behavior is presented and discussed. It was found that the addition of any of these elements except zirconium resulted in refinement of the grain size and enhancement of hardness and mechanical strength. An increase of 2.1 % in the flow stress of Al grain refined by Ti+B was achieved by the addition of 0.1 % V at 0.2 strain. (author)

  1. Refined 3d-3d correspondence

    Energy Technology Data Exchange (ETDEWEB)

    Alday, Luis F.; Genolini, Pietro Benetti; Bullimore, Mathew; Loon, Mark van [Mathematical Institute, University of Oxford, Andrew Wiles Building,Radcliffe Observatory Quarter, Woodstock Road, Oxford, OX2 6GG (United Kingdom)

    2017-04-28

    We explore aspects of the correspondence between Seifert 3-manifolds and 3d N=2 supersymmetric theories with a distinguished abelian flavour symmetry. We give a prescription for computing the squashed three-sphere partition functions of such 3d N=2 theories constructed from boundary conditions and interfaces in a 4d N=2{sup ∗} theory, mirroring the construction of Seifert manifold invariants via Dehn surgery. This is extended to include links in the Seifert manifold by the insertion of supersymmetric Wilson-’t Hooft loops in the 4d N=2{sup ∗} theory. In the presence of a mass parameter for the distinguished flavour symmetry, we recover aspects of refined Chern-Simons theory with complex gauge group, and in particular construct an analytic continuation of the S-matrix of refined Chern-Simons theory.

  2. Using atomic energy in the oil refining and petrochemical industry

    Energy Technology Data Exchange (ETDEWEB)

    Feigin, E.A.; Barashkov, R.Ia.; Raud, E.A.

    1982-01-01

    A short description of the basic large scale processes for oil refining and petrochemistry in which nuclear reactors can be used is given. The possible industrial plans for using nuclear reactors are examined together with the problems in using the advances in atomic technology in oil refining and petrochemical processes.

  3. An Adaptive Learning Based Network Selection Approach for 5G Dynamic Environments

    Directory of Open Access Journals (Sweden)

    Xiaohong Li

    2018-03-01

    Full Text Available Networks will continue to become increasingly heterogeneous as we move toward 5G. Meanwhile, the intelligent programming of the core network makes the available radio resources changeable rather than static. In such a dynamic and heterogeneous network environment, how to help terminal users select the optimal networks to access is challenging. Prior implementations of network selection are usually applicable to environments with static radio resources and cannot handle the unpredictable dynamics of 5G network environments. To this end, this paper considers both the fluctuation of radio resources and the variation of user demand. We model the access network selection scenario as a multiagent coordination problem, in which a set of rational terminal users competes to maximize their benefits with incomplete information about the environment (no prior knowledge of network resources or other users’ choices). Then, an adaptive learning based strategy is proposed, which enables users to adaptively adjust their selections in response to a gradually or abruptly changing environment. The system is experimentally shown to converge to a Nash equilibrium, which also turns out to be both Pareto optimal and socially optimal. Extensive simulation results show that our approach achieves significantly better performance compared with two learning and non-learning based approaches in terms of load balancing, user payoff and overall bandwidth utilization efficiency. In addition, the system is robust in the presence of non-compliant terminal users.
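
    The adaptive-selection idea can be sketched with a simple epsilon-greedy learner for a single terminal (a toy stand-in for the paper's multiagent scheme; the bandwidth values, noise level and hyperparameters below are assumed):

```python
import random

def select_network(bandwidths, episodes=2000, eps=0.1, seed=0):
    """Epsilon-greedy sketch of adaptive network selection: the terminal
    keeps a running payoff estimate per network, mostly exploits the best
    one, and occasionally explores to track a changing environment."""
    rng = random.Random(seed)
    q = [0.0] * len(bandwidths)       # payoff estimate per network
    counts = [0] * len(bandwidths)
    for _ in range(episodes):
        if rng.random() < eps:
            a = rng.randrange(len(bandwidths))          # explore
        else:
            a = max(range(len(bandwidths)), key=q.__getitem__)  # exploit
        payoff = bandwidths[a] + rng.gauss(0.0, 0.5)    # noisy observation
        counts[a] += 1
        q[a] += (payoff - q[a]) / counts[a]             # running mean
    return max(range(len(bandwidths)), key=q.__getitem__)

# Three candidate networks with different mean available bandwidth (Mbit/s):
best = select_network([2.0, 5.0, 3.0])
```

    With many such learners competing for shared bandwidth, the payoff of each network would additionally depend on the load the other users place on it, which is what drives the system toward the Nash equilibrium described above.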

  4. Geometrical approach to central molecular chirality: a chirality selection rule

    OpenAIRE

    Capozziello, S.; Lattanzi, A.

    2004-01-01

    Chirality is of primary importance in many areas of chemistry and has been extensively investigated since its discovery. We introduce here the description of central chirality for tetrahedral molecules using a geometrical approach based on complex numbers. According to this representation, for a molecule having n chiral centres, it is possible to define an index of chirality. Consequently a chirality selection rule has been derived which allows the characterization of a molecule as achiral, e...

  5. Refining borders of genome-rearrangements including repetitions

    Directory of Open Access Journals (Sweden)

    JA Arjona-Medina

    2016-10-01

    Full Text Available Abstract Background DNA rearrangement events have been widely studied in comparative genomics for many years. The importance of these events resides not only in the study of relatedness among different species, but also in determining the mechanisms behind evolution. Although there are many methods to identify genome rearrangements (GR), the refinement of their borders remains a huge challenge. Until now no accepted method exists to achieve accurate fine-tuning: i.e. the notion of a breakpoint (BP) is still an open issue, and although repeated regions are vital to understanding evolution, they are not taken into account in most GR detection and refinement methods. Methods and results We propose a method to refine the borders of GRs, including repeated regions. Instead of removing these repetitions to facilitate computation, we take advantage of them, using a consensus alignment sequence of the repeated region in between two blocks. Using the concept of identity vectors for Synteny Blocks (SB) and repetitions, a finite state machine is designed to detect transition points in the difference between such vectors. The method does not force the BP to be a region or a point, but depends on the alignment transitions within the SBs and repetitions. Conclusion The accurate definition of the borders of SBs and repeated genomic regions, and consequently the detection of BPs, might help in understanding the evolutionary model of species. In this manuscript we present a new proposal for such a refinement. The features of SB borders and BPs differ and fit with what is expected: SBs show more diversity in annotations, while BPs are short and richer in DNA replication and stress response annotations, which are strongly linked with rearrangements.
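
    The finite-state-machine detection of transition points in the difference between identity vectors might be sketched as follows (the threshold and the run-length stability criterion are assumptions for illustration, not the paper's exact automaton):

```python
def transition_points(diff, high=0.5, run=3):
    """Tiny finite state machine over the difference between identity
    vectors: report positions where the signal crosses `high` and stays
    there for `run` consecutive samples (an assumed stability criterion)."""
    points, state, streak = [], "low", 0
    for i, v in enumerate(diff):
        if state == "low":
            if v >= high:
                streak += 1
                if streak == run:       # stable high run: record transition
                    points.append(i - run + 1)
                    state, streak = "high", 0
            else:
                streak = 0
        else:  # state == "high": wait for a stable return to low
            if v < high:
                streak += 1
                if streak == run:
                    state, streak = "low", 0
            else:
                streak = 0
    return points

# Identity difference jumping at index 5: one candidate breakpoint there.
bps = transition_points([0.1] * 5 + [0.9] * 6)
```

    Requiring a stable run before switching states keeps single noisy samples in the alignment from being reported as breakpoints.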

  6. Development of a fraction collection approach in capillary electrophoresis SELEX for aptamer selection.

    Science.gov (United States)

    Luo, Zhaofeng; Zhou, Hongmin; Jiang, Hao; Ou, Huichao; Li, Xin; Zhang, Liyun

    2015-04-21

    Aptamers have attracted much attention due to their ability to bind to target molecules with high affinity and specificity. The development of an approach capable of efficiently generating aptamers through systematic evolution of ligands by exponential enrichment (SELEX) is particularly challenging. Herein, a fraction collection approach in capillary electrophoresis SELEX (FCE-SELEX) for the partition of a bound DNA-target complex is developed. By integrating fraction collection with a facile oil seal method for avoiding contamination while amplifying the bound DNA-target complex, in a single round of selection, a streptavidin-binding aptamer (SBA) has been generated. The affinity of aptamer SBA-36 for streptavidin (SA) is determined as 30.8 nM by surface plasmon resonance (SPR). Selectivity and biotin competition experiments demonstrate that the SBA-36 aptamer selected by FCE-SELEX is as efficient as those from other methods. Based on the ability of fraction collection in partition and collection of the aptamer-target complex from the original DNA library, FCE-SELEX can be a universal tool for the development of aptamers.

  7. Grain refinement of AZ31 by (SiC)P: Theoretical calculation and experiment

    International Nuclear Information System (INIS)

    Guenther, R.; Hartig, Ch.; Bormann, R.

    2006-01-01

    Grain refinement of gravity die-cast Mg-alloys can be achieved via two methods: in situ refinement by primary precipitated metallic or intermetallic phases, and inoculation of the melt via ceramic particles that remain stable in the melt due to their high thermodynamic stability. In order to clarify grain refinement mechanisms and optimize possible potent refiners in Mg-alloys, a simulation method for heterogeneous nucleation based on a free growth model has been developed. It allows the prediction of the grain size as a function of the particle size distribution, the volumetric content of ceramic inoculants, the cooling rate and the alloy constitution. The model assumptions were examined experimentally by a study of the grain refinement of (SiC)P in AZ31. Additions of (SiC)P result in significant grain refinement, if appropriate parameters for ceramic particles are chosen.
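
    In the free growth model, a particle of diameter d becomes an active nucleant once the undercooling exceeds ΔT = 4σ/(ΔS_v·d). A sketch with assumed order-of-magnitude constants (not measured values for SiC in AZ31):

```python
def free_growth_undercooling(d_m, sigma=0.15, dS_v=3.5e5):
    """Free growth model: dT = 4*sigma / (dS_v * d) for a particle of
    diameter d (m). sigma (J/m^2, solid-liquid interfacial energy) and
    dS_v (J/K/m^3, entropy of fusion per unit volume) are assumed,
    order-of-magnitude values, not measured data for SiC in AZ31."""
    return 4.0 * sigma / (dS_v * d_m)

# Larger inoculant particles become active at smaller undercoolings, which
# is why the particle size distribution controls the final grain size.
dt_1um = free_growth_undercooling(1e-6)
dt_5um = free_growth_undercooling(5e-6)
```

    Coupling this per-particle criterion to the particle size distribution and the cooling rate is what lets the model predict grain size.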

  8. Process for refining shale bitumen

    Energy Technology Data Exchange (ETDEWEB)

    Plauson, H

    1920-09-19

    A process is disclosed for refining shale bitumen for use as heavy mineral oil, characterized by mixtures of blown hard shale pitch and heavy mineral oil being blown with hot air at temperatures of 120 to 150° with 1 to 3 percent sulfur, and if necessary with 0.5 to 3 percent of an aldehyde.

  9. Multigrid for refined triangle meshes

    Energy Technology Data Exchange (ETDEWEB)

    Shapira, Yair

    1997-02-01

    A two-level preconditioning method for the solution of (locally) refined finite element schemes using triangle meshes is introduced. In the isotropic SPD case, it is shown that the condition number of the preconditioned stiffness matrix is bounded uniformly for all sufficiently regular triangulations. This is also verified numerically for an isotropic diffusion problem with highly discontinuous coefficients.

  10. Unit-cell refinement from powder diffraction scans

    International Nuclear Information System (INIS)

    Pawley, G.S.

    1981-01-01

    A procedure for the refinement of the crystal unit cell from a powder diffraction scan is presented. In this procedure knowledge of the crystal structure is not required, and at the end of the refinement a list of indexed intensities is produced. This list may well be usable as the starting point for the application of direct methods. The problems of least-squares ill-conditioning due to overlapping reflections are overcome by constraints. An example using decafluorocyclohexene, C6F10, shows the quality of fit obtained in a case which may even be a false minimum. The method should become more relevant as powder scans of improved resolution become available, through the use of pulsed neutron sources. (Auth.)

  11. Refining crude oils and gasolines, etc

    Energy Technology Data Exchange (ETDEWEB)

    1931-11-23

    A process of refining crude oils and gasolines distilled from shale and the like is described, consisting of submitting them to a prewash with soda, an oxidation preferably with hypochlorite solution, a hydrogenation with nascent hydrogen, and finally rectification and neutralization.

  12. A conformation-dependent stereochemical library improves crystallographic refinement even at atomic resolution

    International Nuclear Information System (INIS)

    Tronrud, Dale E.; Karplus, P. Andrew

    2011-01-01

    A script was created to allow SHELXL to use the new CDL v.1.2 stereochemical library which defines the target values for main-chain bond lengths and angles as a function of the residue’s ϕ/ψ angles. Test refinements using this script show that the refinement behavior of structures at resolutions even better than 1 Å is substantially enhanced by the use of the new conformation-dependent ideal geometry paradigm. To utilize a new conformation-dependent backbone-geometry library (CDL) in protein refinements at atomic resolution, a script was written that creates a restraint file for the SHELXL refinement program. It was found that the use of this library allows models to be created that have a substantially better fit to main-chain bond angles and lengths without degrading their fit to the X-ray data even at resolutions near 1 Å. For models at much higher resolution (∼0.7 Å), the refined model for parts adopting single well occupied positions is largely independent of the restraints used, but these structures still showed much smaller r.m.s.d. residuals when assessed with the CDL. Examination of the refinement tests across a wide resolution range from 2.4 to 0.65 Å revealed consistent behavior supporting the use of the CDL as a next-generation restraint library to improve refinement. CDL restraints can be generated using the service at http://pgd.science.oregonstate.edu/cdl_shelxl/

  13. Comparisons of single-stage and two-stage approaches to genomic selection.

    Science.gov (United States)

    Schulz-Streeck, Torben; Ogutu, Joseph O; Piepho, Hans-Peter

    2013-01-01

    Genomic selection (GS) is a method for predicting breeding values of plants or animals using many molecular markers that is commonly implemented in two stages. In plant breeding the first stage usually involves computation of adjusted means for genotypes which are then used to predict genomic breeding values in the second stage. We compared two classical stage-wise approaches, which either ignore or approximate correlations among the means by a diagonal matrix, and a new method, to a single-stage analysis for GS using ridge regression best linear unbiased prediction (RR-BLUP). The new stage-wise method rotates (orthogonalizes) the adjusted means from the first stage before submitting them to the second stage. This makes the errors approximately independently and identically normally distributed, which is a prerequisite for many procedures that are potentially useful for GS such as machine learning methods (e.g. boosting) and regularized regression methods (e.g. lasso). This is illustrated in this paper using componentwise boosting. The componentwise boosting method minimizes squared error loss using least squares and iteratively and automatically selects markers that are most predictive of genomic breeding values. Results are compared with those of RR-BLUP using fivefold cross-validation. The new stage-wise approach with rotated means was slightly more similar to the single-stage analysis than the classical two-stage approaches based on non-rotated means for two unbalanced datasets. This suggests that rotation is a worthwhile pre-processing step in GS for the two-stage approaches for unbalanced datasets. Moreover, the predictive accuracy of stage-wise RR-BLUP was higher (5.0-6.1%) than that of componentwise boosting.
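
    The RR-BLUP step can be approximated with plain ridge regression on marker dosages (a sketch on simulated data; the marker counts, effect sizes and the `alpha` shrinkage value are assumptions, not the paper's settings):

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(2)
n_geno, n_markers = 200, 500
X = rng.choice([0.0, 1.0, 2.0], size=(n_geno, n_markers))  # marker dosages
true_effects = np.zeros(n_markers)
true_effects[:20] = rng.normal(scale=0.5, size=20)          # 20 causal markers
y = X @ true_effects + rng.normal(scale=1.0, size=n_geno)   # phenotype

# Ridge regression as a stand-in for RR-BLUP: all marker effects are
# shrunk toward zero; alpha plays the role of the variance-component ratio.
model = Ridge(alpha=50.0).fit(X[:150], y[:150])
r = float(np.corrcoef(model.predict(X[150:]), y[150:])[0, 1])
```

    Here `r` is the predictive correlation on held-out genotypes, the same accuracy criterion the cross-validation comparison above uses; a two-stage analysis would first replace `y` with adjusted genotype means from the phenotypic trials.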

  14. A Canadian refiner's perspective of synthetic crudes

    International Nuclear Information System (INIS)

    Halford, T.L.; McIntosh, A.P.; Rasmussen

    1997-01-01

    Some of the factors affecting a refiner's choice of crude oil include refinery hardware (particularly gas oil crackers), product slate and product specifications, crude availability, relative crude price and crude quality. An overview of synthetic crude, the use of synthetic crude combined with other crudes, and a comparison of synthetic crude with conventional crude oil was given. Synthetic crude has two main groups of users: a large group of refiners who use synthetic crude combined with other crudes, and a smaller group who run synthetic crude as a sole feed on specially designed units. The effects of changes in fuel legislation were reviewed. It was predicted that the changes will have a mixed impact on the value of synthetic crude, but low sulphur diesel regulations and gasoline sulphur regulations will make current synthetic crudes attractive. The big future change with a negative impact will be diesel cetane increases to reduce engine emissions. This will reduce synthetic crude attractiveness due to distillate yields and quality and high gas oil yields. Similarly, any legislation limiting aromatics in diesel fuel will also make synthetic crudes less attractive. Problems experienced by refiners with hardware dedicated to synthetic crude (salt, naphthenic acid, fouling, quality variations) were also reviewed. 3 tabs

  15. Comparing the steam and electric heat tracing solutions for petrochemical or refining facilities

    Energy Technology Data Exchange (ETDEWEB)

    Young, Joseph G.; McQueen, Greg [Tyco Thermal Controls, Belgie (Belgium)

    2012-07-01

    In this era of energy conservation and cost reduction, the ability to effectively select the optimal solution to meet the heat management system needs of petrochemical or refining facilities is becoming increasingly important. Depending on the type and location of the plant, a heat management system (HMS) can comprise a significant portion of the overall capital expenditure, as well as the ongoing operating and maintenance costs. Several important heat management system design decisions affect the financial operations of a facility, including the selection of the heat tracing technology, the utility distribution scheme, and the insulation system criteria, among others. However, most of these decisions are made early in the project life-cycle without thorough analysis of the various options available. From a high level perspective, numerous heat trace media should be considered, including electric, steam, tempered water, and glycol. These systems also have different impacts on piping systems within the plant battery limits (ISBL) and transfer lines outside of the battery limits (OSBL). This paper takes a careful look at two of the predominant heat tracing technologies - electric heat tracing and steam tracing - and compares these within the larger framework of the heat management system, and relative to petrochemical or refining facilities within the general Brazil geography. In the broader context, a heat management system is defined as the heat tracing technology itself, the utility distribution associated with that technology, the control and monitoring scheme associated with that technology, and the insulation system. We will evaluate the capital expenditure cost, operating expenditure cost, and overall reliability of the electric and steam tracing mediums in both the ISBL and OSBL environments within this broader context. (author)

  16. Grain Refinement of Commercial EC Grade 1070 Aluminium Alloy for Electrical Application

    OpenAIRE

    Hassanabadi, Massoud

    2015-01-01

    The aluminium alloys for electrical conductivity applications are generally not grain refined, since the addition of grain refiners lowers the electrical conductivity by introducing impurities into the melt. Non-grain-refined aluminium, however, may lead to bar fracture and cracks during the metalworking process. The present study focuses on finding an optimum balance between grain refiner addition and the electrical conductivity of commercial EC grade 1070 aluminium alloy for electrical application. In orde...

  17. Refined Fuchs inequalities for systems of linear differential equations

    International Nuclear Information System (INIS)

    Gontsov, R R

    2004-01-01

    We refine the Fuchs inequalities obtained by Corel for systems of linear meromorphic differential equations given on the Riemann sphere. Fuchs inequalities enable one to estimate the sum of exponents of the system over all its singular points. We refine these well-known inequalities by considering the Jordan structure of the leading coefficient of the Laurent series for the matrix of the right-hand side of the system in the neighbourhood of a singular point

  18. Rising costs call for new European refining strategies

    International Nuclear Information System (INIS)

    Sweeney, B.N.C.

    1993-01-01

    The outlook for the global refining industry is for increased spending and reduced margins, largely because of efforts to improve the environment. A look at these trends through the end of the decade is thus in order. Three major industry thrusts are proposed to see refiners through this uncertain period. First, fixed costs must be reduced by re-engineering business processes and re-examining noncore business units against total and marginal costs; in this respect the best refiners are well ahead of the good ones. Second, new cooperative ways of meeting regulations must be sought to avoid wasteful overcapacity; joint ventures and alliances with competitors will be needed. Third, the cooperative principle must be extended upstream, and new strategies must be sought to meet product demand changes and reduce feedstock costs. The picture that is presented is tough, largely because of the wish to improve the environment. The question that must be continually reviewed is: "Have governments got the right balance in these regulations between the environment and the downstream industry?"

  19. Stability studies on refined soybean oil stored in various conditions

    International Nuclear Information System (INIS)

    Arawande, J.O.; Amoo, I.A.

    2008-01-01

    The 12-month stability study of freshly produced refined soybean oil revealed that oil stored in plastic containers in the dark was more hydrolytically and oxidatively stable than oil stored in other containers under light. There was no significant difference (P < 0.05) in the free fatty acids and acid value of oil stored under light and dark conditions in tin and glass containers, but there was a significant difference (P < 0.05) in the peroxide value of oil stored under light and dark conditions in all the storage containers. Light increased the degree of oxidative rancidity of refined soybean oil, the most in tin containers, followed by glass containers, and the least in plastic containers. (author)

  20. Atlantic Basin refining profitability

    International Nuclear Information System (INIS)

    Jones, R.J.

    1998-01-01

    A review of the profitability margins of oil refining in the Atlantic Basin was presented. Petroleum refiners face the continuous challenge of balancing supply with demand. It would appear that the profitability margins in the Atlantic Basin will increase significantly in the near future because of shrinking supply surpluses. Refinery capacity utilization has reached higher levels than ever before. The American Petroleum Institute reported that in August 1997, U.S. refineries used 99 per cent of their capacity for several weeks in a row. U.S. gasoline inventories have also declined as the industry has focused on reducing capital costs. This is further evidence that supply and demand are tightly balanced. Some of the reasons for tightening supplies were reviewed. It was predicted that U.S. gasoline demand will continue to grow in the near future. Gasoline demand has not declined as expected because new vehicles are not any more fuel efficient today than they were a decade ago. Although federally-mandated fuel efficiency standards were designed to lower gasoline consumption, they may actually have prevented consumption from falling. Atlantic margins were predicted to continue moving up because of the supply and demand evidence: high capacity utilization rates, low operating inventories, limited capacity addition resulting from lower capital spending, continued U.S. gasoline demand growth, and steady total oil demand growth. 11 figs

  1. The effect of refining step on the changes in viscosity values of vegetable oils

    International Nuclear Information System (INIS)

    Ergonul, P.G.

    2013-01-01

    In this work, the viscosity values of chemically refined vegetable oils (sunflower, corn, soybean and rapeseed) and physically refined vegetable oils (olive and palm) were determined during the refining processes. To this end, the fatty acid compositions and viscosity values of the oil samples were determined. The edible vegetable oils showed Newtonian behavior over the shear rate range of 6.28-20.93 s^-1. Palm oil was observed to be more viscous than the others. During physical refining, the effects of both oil type and refining step were statistically significant, whereas in chemical refining only the effect of oil type was statistically significant (p<0.01). The correlation between fatty acid composition and viscosity differed according to oil type. (author)

  2. Hybrid direct and iterative solvers for h refined grids with singularities

    KAUST Repository

    Paszyński, Maciej R.; Paszyńska, Anna; Dalcin, Lisandro; Calo, Victor M.

    2015-01-01

    ... on top of it. The hybrid solver is applied for two- or three-dimensional grids automatically h refined towards point or edge singularities. The automatic refinement is based on the relative error estimations between the coarse and fine mesh solutions [2

  3. Panorama 2016 - Refining outlook for 2035

    International Nuclear Information System (INIS)

    Marion, Pierre; Saint-Antonin, Valerie

    2015-12-01

    The rising influence of objectives intended to address the energy transition in global industry helps to perpetuate a high degree of uncertainty about changes in the transportation sector, currently a bastion of the oil industry. How can the growing need for individual mobility be met while reducing Greenhouse Gas (GHG) emissions in a world of open international competition? The refining sector is gaining strength in Asia and the Middle East to the detriment of Europe and North America, reflecting demand and the intrinsic competitiveness of various geographic regions. The 2025 worldwide roll-out (2020 in Europe) of a bunker fuel grade below 0.5 wt% (percentage by weight) in sulphur could experience delays, given the number of installations to be completed. Finally, the reversal of the 'all diesel' trend in the European transport market is a positive change for the European refining industry. (authors)

  4. A robust optimisation approach to the problem of supplier selection and allocation in outsourcing

    Science.gov (United States)

    Fu, Yelin; Keung Lai, Kin; Liang, Liang

    2016-03-01

    We formulate the supplier selection and allocation problem in outsourcing under an uncertain environment as a stochastic programming problem. Both the decision-maker's attitude towards risk and the penalty parameters for demand deviation are considered in the objective function. A service level agreement, upper bound for each selected supplier's allocation and the number of selected suppliers are considered as constraints. A novel robust optimisation approach is employed to solve this problem under different economic situations. Illustrative examples are presented with managerial implications highlighted to support decision-making.
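    As a rough sketch of the kind of model described (all data and parameter names are hypothetical; the paper's actual robust formulation, including the risk-attitude term, is richer), a two-scenario supplier allocation with a penalty for unmet demand can be written as a linear program:

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative data (invented for this sketch).
cost = np.array([4.0, 5.0, 6.5])      # unit purchase cost per supplier
cap = np.array([60.0, 50.0, 80.0])    # allocation upper bound per supplier
demand = np.array([100.0, 140.0])     # uncertain demand: two scenarios
prob = np.array([0.6, 0.4])           # scenario probabilities
penalty = 20.0                        # penalty per unit of demand shortfall

n, s = len(cost), len(demand)
# Decision vector: [x_1..x_n (allocations), u_1..u_s (shortfalls)]
c = np.concatenate([cost, prob * penalty])

# For each scenario: total allocation + shortfall >= scenario demand,
# written as -(sum_j x_j) - u_s <= -d_s for linprog's A_ub x <= b_ub form.
A_ub = np.hstack([-np.ones((s, n)), -np.eye(s)])
b_ub = -demand
bounds = [(0, ci) for ci in cap] + [(0, None)] * s

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
alloc, short = res.x[:n], res.x[n:]
print(alloc, short)
```

    Here covering a marginal unit of scenario-2 demand costs 6.5 but saves an expected 0.4 x 20 = 8 in penalty, so the optimum allocates the full 140 units from the cheapest suppliers and incurs no shortfall; constraints such as the service-level agreement and the number of selected suppliers would be added on top.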

  5. Tree automata-based refinement with application to Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2015-01-01

    In this paper we apply tree-automata techniques to refinement of abstract interpretation in Horn clause verification. We go beyond previous work on refining trace abstractions: firstly, we handle tree automata rather than string automata and can thereby capture traces in any Horn clause derivation. ... We compare the results with other state-of-the-art Horn clause verification tools.

  6. The regulations concerning refining business of nuclear source material and nuclear fuel materials

    International Nuclear Information System (INIS)

    1979-01-01

    The regulations are provided for under the law for the regulations of nuclear source materials, nuclear fuel materials and reactors and provisions concerning refining business in the enforcement order for the law. The basic concepts and terms are defined, such as: exposure dose; accumulative dose; controlled area; inspected surrounding area; and employee. Refining facilities listed in the application for designation shall be classified into crushing and leaching, thickening and refining facilities, storage facilities of nuclear source materials and nuclear fuel materials, disposal facilities of contaminated substances and buildings for refining, etc. The business program attached to the application shall include the expected time of beginning of refining, the estimated production amount of nuclear source materials or nuclear fuel materials for the first three years and the funds necessary for construction, etc. Records shall be made and kept for particular periods on delivery and storage of nuclear source materials and nuclear fuel materials, control of radiation, and maintenance and accidents of refining facilities. Safety securing, application of internationally regulated substances and measures in dangerous situations are stipulated respectively. The exposure dose of employees and other specified matters shall be reported by the refiner yearly to the Director General of the Science and Technology Agency and the Minister of International Trade and Industry. (Okada, K.)

  7. Refinement in black chrome for use as a solar selective coating

    Science.gov (United States)

    Mcdonald, G. E.

    1974-01-01

    Black chrome is significant as a solar selective coating because its current extensive use in the electroplating industry as a durable decorative finish makes it widely available on a commercial scale and potentially low in cost as a solar selective coating. Black-chrome deposits were modified by underplating with dull nickel or by being plated on rough surfaces. Both of these procedures increased the visible absorptance. There was no change in the infrared reflectance for the dull-nickel - black-chrome combination from that reported for the bright-nickel - black-chrome combination. However, the bright-nickel - black-chrome coating plated on rough surfaces showed a slight decrease in infrared reflectance. As integrated over the solar spectrum for air mass 2, the reflectance of the dull-nickel - black-chrome coating was 0.077; that of the bright-nickel - black-chrome coating plated on a 0.75-micron (30-microinch) surface was 0.070; and that of the bright-nickel - black-chrome coating plated on a 2.5-micron (100-microinch) surface was 0.064. The corresponding values for the bright-nickel - black-chrome coating on a 0.0125-micron (0.5-microinch) surface, two samples of black nickel, and two samples of Nextel black paint were 0.132, 0.123, 0.133, and 0.033, respectively.

  8. A QM/MM refinement of an experimental DNA structure with metal-mediated base pairs.

    Science.gov (United States)

    Kumbhar, Sadhana; Johannsen, Silke; Sigel, Roland K O; Waller, Mark P; Müller, Jens

    2013-10-01

    A series of hybrid quantum mechanical/molecular mechanical (QM/MM) calculations was performed on models of a DNA duplex with artificial silver(I)-mediated imidazole base pairs. The optimized structures were compared to the original experimental NMR structure (Nat. Chem. 2 (2010) 229-234). The metal⋯metal distances are significantly shorter (by ~0.5 Å) in the QM/MM model than in the original NMR structure. As a result, argentophilic interactions are feasible between the silver(I) ions of neighboring metal-mediated base pairs. Using the computationally determined metal⋯metal distances, a re-refined NMR solution structure of the DNA duplex was obtained. In this new NMR structure, all experimental constraints remain fulfilled, and the structure shows less deviation from the regular B-type conformation than the original one. This investigation shows that using QM/MM models to generate additional constraints during NMR structural refinement is an elegant approach to obtaining high-resolution NMR structures.

  9. Refining waste hardmetals into tungsten oxide nanosheets via facile method

    Energy Technology Data Exchange (ETDEWEB)

    Li, Zhifei; Zheng, Guangwei; Wang, Jinshu, E-mail: wangjsh@bjut.edu.cn; Li, Hongyi, E-mail: lhy06@bjut.edu.cn; Wu, Junshu; Du, Yucheng [Beijing University of Technology, Key Laboratory of Advanced Functional Materials, School of Materials Science and Engineering (China)

    2016-04-15

    A new hydrothermal system has been designed to recycle waste WC–Co hardmetal with low cobalt (Co) content (3 %). In the solution system, nitric acid was used to dissolve Co, H{sub 2}O{sub 2} served as an oxidant to accelerate the oxidation of the WC–Co hardmetals, and fluorine (F{sup −}) was used to dissolve and recrystallize the generated tungsten oxides, which were found to possess a layered structure using scanning electron microscopy and transmission electron microscopy. The obtained tungsten oxides were identified as WO{sub 3}·0.33H{sub 2}O by X-ray diffraction and their specific surface area was measured as 89.2 m{sup 2} g{sup −1} via N{sub 2} adsorption–desorption techniques. The layered-structure tungsten oxides exhibited a promising capability for removing lead ions (Pb{sup 2+}) and organic species such as methylene blue. The adsorption behavior was found to be in agreement with the Langmuir isotherm model. Given the facile synthesis procedure and the promising properties of the final products, this new approach should have great potential for refining other waste hardmetals or tungsten products. Graphical Abstract: A new hydrothermal system was designed to recycle waste hardmetal with low cobalt content. Through this method, waste hardmetal was refined into WO{sub 3}·0.33H{sub 2}O nanosheets, which show excellent adsorption capacities toward methylene blue and lead ions (Pb{sup 2+}).
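    The Langmuir isotherm mentioned above relates equilibrium uptake q to solute concentration C as q = q_max·K·C/(1 + K·C). A hedged sketch of fitting it to adsorption data (the concentrations and parameters below are invented for illustration, not taken from the study):

```python
import numpy as np
from scipy.optimize import curve_fit

# Langmuir isotherm: q = q_max * K * C / (1 + K * C)
def langmuir(C, q_max, K):
    return q_max * K * C / (1.0 + K * C)

# Synthetic equilibrium data (hypothetical values).
C = np.array([5, 10, 25, 50, 100, 200], dtype=float)   # mg/L
q_obs = langmuir(C, 150.0, 0.05)                       # mg/g; in practice, measured uptake

# Nonlinear least-squares fit of the two Langmuir parameters.
(q_max, K), _ = curve_fit(langmuir, C, q_obs, p0=(100.0, 0.01))
print(round(q_max, 1), round(K, 3))
```

    On noiseless synthetic data the fit recovers the generating parameters; with real measurements one would compare the residuals against alternative models (e.g. Freundlich) as adsorption studies typically do.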

  10. A Fuzzy MCDM Approach for Green Supplier Selection from the Economic and Environmental Aspects

    Directory of Open Access Journals (Sweden)

    Hsiu Mei Wang Chen

    2016-01-01

    Due to the challenge of rising public awareness of environmental issues and governmental regulations, green supply chain management (SCM has become an important issue for companies to gain environmental sustainability. Supplier selection is one of the key operational tasks necessary to construct a green SCM. To select the most suitable suppliers, many economic and environmental criteria must be considered in the decision process. Although numerous studies have used economic criteria such as cost, quality, and lead time in the supplier selection process, only some studies have taken into account the environmental issues. This study proposes a comprehensive fuzzy multicriteria decision making (MCDM approach for green supplier selection and evaluation, using both economic and environmental criteria. In the proposed approach, a fuzzy analytic hierarchy process (AHP is employed to determine the important weights of criteria under vague environment. In addition, a fuzzy technique for order performance by similarity to ideal solution (TOPSIS is used to evaluate and rank the potential suppliers. Finally, a case study in Luminance Enhancement Film (LEF industry is presented to illustrate the applicability and efficiency of the proposed method.
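    The crisp core of the TOPSIS step can be sketched as follows. This is a simplified, non-fuzzy illustration with invented supplier scores and weights (the paper itself uses fuzzy AHP weights and fuzzy TOPSIS):

```python
import numpy as np

# Hypothetical scores of 3 suppliers on 4 criteria (rows = suppliers).
# Criteria: cost (lower is better), quality, delivery, environmental score.
X = np.array([[3.0, 7.0, 6.0, 8.0],
              [5.0, 9.0, 7.0, 6.0],
              [4.0, 6.0, 9.0, 9.0]])
w = np.array([0.3, 0.3, 0.2, 0.2])    # criteria weights (e.g. from AHP)
benefit = np.array([False, True, True, True])

# 1. Vector-normalize each criterion column and apply the weights.
V = w * X / np.linalg.norm(X, axis=0)

# 2. Ideal and anti-ideal solutions (max/min flipped for cost criteria).
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))

# 3. Closeness coefficient: distance to anti-ideal over total distance.
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)

ranking = np.argsort(-closeness)      # best supplier first
print(closeness.round(3), ranking)
```

    In the fuzzy variant, the entries of `X` and `w` are triangular fuzzy numbers and the distances are computed between fuzzy quantities, but the normalize/weight/rank pipeline is the same.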

  11. Assessment of Energy Efficiency Improvement in the United States Petroleum Refining Industry

    Energy Technology Data Exchange (ETDEWEB)

    Morrow, William R. [Ernest Orlando Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States); Marano, John [JM Energy Consulting, Inc.; Sathaye, Jayant [Ernest Orlando Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States); Hasanbeigi, Ali [Ernest Orlando Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States); Xu, Tengfang [Ernest Orlando Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States)

    2013-02-01

    Adoption of efficient process technologies is an important approach to reducing CO2 emissions, in particular those associated with combustion. In many cases, implementing energy efficiency measures is among the most cost-effective approaches that any refiner can take, improving productivity while reducing emissions. Therefore, careful analysis of the options and costs associated with efficiency measures is required to establish sound carbon policies addressing global climate change, and is the primary focus of LBNL’s current petroleum refining sector analysis for the U.S. Environmental Protection Agency. The analysis is aimed at identifying energy efficiency-related measures and developing energy abatement supply curves and CO2 emissions reduction potential for the U.S. refining industry. A refinery model has been developed for this purpose that is a notional aggregation of the U.S. petroleum refining sector. It consists of twelve processing units and accounts for the additional energy requirements from steam generation, hydrogen production and water utilities required by each of the twelve processing units. The model is carbon and energy balanced such that crude oil inputs and major refinery sector outputs (fuels) are benchmarked to 2010 data. Estimates of the current penetration of the identified energy efficiency measures benchmark the energy requirements to those reported in U.S. DOE 2010 data. The remaining energy efficiency potential for each of the measures is estimated and compared to U.S. DOE fuel prices, resulting in estimates of cost-effective energy efficiency opportunities for each of the twelve major processes. A combined cost-of-conserved-energy supply curve is also presented, along with the CO2 emissions abatement opportunities that exist in the U.S. petroleum refinery sector. Roughly 1,200 PJ per year of primary fuels savings and close to 500 GWh per year of electricity savings are potentially cost-effective.

  12. Profex: a graphical user interface for the Rietveld refinement program BGMN.

    Science.gov (United States)

    Doebelin, Nicola; Kleeberg, Reinhard

    2015-10-01

    Profex is a graphical user interface for the Rietveld refinement program BGMN . Its interface focuses on preserving BGMN 's powerful and flexible scripting features by giving direct access to BGMN input files. Very efficient workflows for single or batch refinements are achieved by managing refinement control files and structure files, by providing dialogues and shortcuts for many operations, by performing operations in the background, and by providing import filters for CIF and XML crystal structure files. Refinement results can be easily exported for further processing. State-of-the-art graphical export of diffraction patterns to pixel and vector graphics formats allows the creation of publication-quality graphs with minimum effort. Profex reads and converts a variety of proprietary raw data formats and is thus largely instrument independent. Profex and BGMN are available under an open-source license for Windows, Linux and OS X operating systems.

  13. Stable grid refinement and singular source discretization for seismic wave simulations

    Energy Technology Data Exchange (ETDEWEB)

    Petersson, N A; Sjogreen, B

    2009-10-30

    An energy conserving discretization of the elastic wave equation in second order formulation is developed for a composite grid, consisting of a set of structured rectangular component grids with hanging nodes on the grid refinement interface. Previously developed summation-by-parts properties are generalized to devise a stable second order accurate coupling of the solution across mesh refinement interfaces. The discretization of singular source terms of point force and point moment tensor type are also studied. Based on enforcing discrete moment conditions that mimic properties of the Dirac distribution and its gradient, previous single grid formulas are generalized to work in the vicinity of grid refinement interfaces. These source discretization formulas are shown to give second order accuracy in the solution, with the error being essentially independent of the distance between the source and the grid refinement boundary. Several numerical examples are given to illustrate the properties of the proposed method.
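    The moment-condition idea for the singular sources can be illustrated in one dimension: choose weights on a few grid nodes so that the discrete sums reproduce the moments of the Dirac distribution. The sketch below is illustrative only (the paper's actual formulas also handle the vicinity of refinement interfaces); it solves a 3-node stencil for the zeroth, first and second moments:

```python
import numpy as np

def delta_weights(xs, grid):
    """Discrete delta at xs: weights on 3 neighbouring nodes satisfying the
    moment conditions sum w = 1, sum w*(x - xs) = 0, sum w*(x - xs)^2 = 0,
    mimicking the Dirac distribution (illustrative sketch only)."""
    i = int(np.argmin(np.abs(grid - xs)))      # nearest node to the source
    i = int(np.clip(i, 1, len(grid) - 2))      # keep the 3-node stencil inside
    x = grid[i - 1:i + 2] - xs                 # node offsets from the source
    # Vandermonde system for the three moment conditions.
    A = np.vstack([np.ones(3), x, x**2])
    w = np.linalg.solve(A, np.array([1.0, 0.0, 0.0]))
    return i, w

h = 0.1
grid = np.arange(0.0, 1.0 + h / 2, h)
i, w = delta_weights(0.437, grid)              # source located off-node
x = grid[i - 1:i + 2]
print(w, w.sum(), (w * (x - 0.437)).sum())
```

    Note that enforcing the higher moments generally makes some weights negative; that is expected for higher-order regularized delta functions.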

  14. Demand, deregulation may attract more refiners to Asia

    International Nuclear Information System (INIS)

    Rhodes, A.K.

    1995-01-01

    As a result of rapidly increasing demand, major oil companies are eyeing Asian oil and gas markets more closely than ever. Higher earnings can be expected there than in the US or Europe, where product markets--especially light products--are expected to tighten long-term. Of the nations with growing requirements for refined products, China and India offer greater opportunity for foreign investors to enter downstream projects. Also offering excellent business prospects are Thailand, Malaysia, and Indonesia. The paper discusses oil demand, products, refining capacity, and capacity additions in Japan, South Korea, Taiwan, Singapore, India, Indonesia, Malaysia, and China

  15. China's refiners face massive overhaul, expansion to meet demand growth, new crude slate

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    China's refining industry has embarked on a massive overhaul and expansion to accommodate soaring domestic growth in refined products demand. Currently that growth in demand is being met by increasing imports of refined products, in recent years attaining triple digit growth rates and squeezing direly needed foreign exchange. The focus is on adding refining capacity of about 1.4 million b/d to the current capacity of about 3.2 million b/d by 2000. Priority for increasing capacity is being given to expanding existing refineries and participating in foreign joint venture grassroots refineries along China's booming coastal regions as well as hiking output. A major challenge for China's refineries is that country's reentry into the General Agreement on Tariffs and Trade (GATT), recently signed in Morocco by more than 100 nations. The accompanying reduction of tariffs on imported refined products will make it more difficult for China's marginal refineries to compete in the domestic market. The paper discusses imports and exports, LPG outlook, refining capacity, revamps needed, third party processing, China's first joint venture refinery, industry plans, and GATT challenges

  16. The Refining Mechanism of Super Gravity on the Solidification Structure of Al-Cu Alloys

    Directory of Open Access Journals (Sweden)

    Yuhou Yang

    2016-12-01

    There is far less study of the refining effect of super gravity fields on solidification structures of metals than of the effects of electrical currents, magnetic and ultrasonic fields. Moreover, the refining mechanisms of super gravity are far from clear. This study applied a super gravity field to Al-Cu alloys to investigate its effect on refining their structures and the mechanism of interaction. The experimental results showed that the solidification structure of Al-Cu alloys can be greatly refined by a super gravity field. The major refining effect was mainly achieved when super gravity was applied at the initial solidification stage; only slight refinement could be obtained towards the end of solidification. No refinement was obtained by the super gravity treatment on pure liquid or solid stages. The effectiveness of super gravity results from its promoting the multiplication of crystal nuclei, which we call “Heavy Crystal Rain”, thereby greatly strengthening the migration of crystal nuclei within the alloy. Increasing the solute Cu content can increase nucleation density and restrict the growth of crystals, which further increases the refining effect of super gravity. Within this paper, we also discuss the motile behavior of crystals in a field of super gravity.

  17. Effect of grain refinement on the fluidity of two commercial Al-Si foundry alloys

    Science.gov (United States)

    Dahle, A. K.; Tøndel, P. A.; Paradies, C. J.; Arnberg, L.

    1996-08-01

    The effect of grain refinement on the fluidity of AlSi7Mg and AlSi11Mg has been investigated by spiral tests. Two different types of grain refiners have been evaluated. An AlTi5B1 master alloy was added to achieve different Ti contents. Since the commercial alloys had a high initial content of titanium, model alloys were made to investigate the fluidity at low grain refiner additions. Commercial alloys grain refined only by boron additions have also been investigated. The results from the fluidity measurements have been verified by measuring the dendrite coherency point of the different cast alloys. Although different, the two methods show similar trends. The spirals cast with each grain refiner fraction were subsequently examined metallographically at the tip of the spiral and at a reference point some distance behind it, but no obvious difference in structure was observed. For both alloys, an increase in fluidity is observed as the content of grain refiner increases above 0.12 pct Ti, while fluidity is impaired with increased grain refinement below 0.12 pct Ti. The alloys grain refined with ~0.015 pct B show the highest fraction solid at dendrite coherency, the smallest grain size, and the best fluidity.

  18. Approaches to Refining Estimates of Global Burden and Economics of Dengue

    Science.gov (United States)

    Shepard, Donald S.; Undurraga, Eduardo A.; Betancourt-Cravioto, Miguel; Guzmán, María G.; Halstead, Scott B.; Harris, Eva; Mudin, Rose Nani; Murray, Kristy O.; Tapia-Conyer, Roberto; Gubler, Duane J.

    2014-01-01

    Dengue presents a formidable and growing global economic and disease burden, with around half the world's population estimated to be at risk of infection. There is wide variation and substantial uncertainty in current estimates of dengue disease burden and, consequently, on economic burden estimates. Dengue disease varies across time, geography and persons affected. Variations in the transmission of four different viruses and interactions among vector density and host's immune status, age, pre-existing medical conditions, all contribute to the disease's complexity. This systematic review aims to identify and examine estimates of dengue disease burden and costs, discuss major sources of uncertainty, and suggest next steps to improve estimates. Economic analysis of dengue is mainly concerned with costs of illness, particularly in estimating total episodes of symptomatic dengue. However, national dengue disease reporting systems show a great diversity in design and implementation, hindering accurate global estimates of dengue episodes and country comparisons. A combination of immediate, short-, and long-term strategies could substantially improve estimates of disease and, consequently, of economic burden of dengue. Suggestions for immediate implementation include refining analysis of currently available data to adjust reported episodes and expanding data collection in empirical studies, such as documenting the number of ambulatory visits before and after hospitalization and including breakdowns by age. Short-term recommendations include merging multiple data sources, such as cohort and surveillance data to evaluate the accuracy of reporting rates (by health sector, treatment, severity, etc.), and using covariates to extrapolate dengue incidence to locations with no or limited reporting. Long-term efforts aim at strengthening capacity to document dengue transmission using serological methods to systematically analyze and relate to epidemiologic data. 
As promising tools ...

  19. Fuzzy Axiomatic Design approach based green supplier selection

    DEFF Research Database (Denmark)

    Kannan, Devika; Govindan, Kannan; Rajendran, Sivakumar

    2015-01-01

    Green Supply Chain Management (GSCM) is a developing concept recently utilized by manufacturing firms of all sizes. All industries, small or large, seek improvements in the purchasing of raw materials, manufacturing, allocation, transportation efficiency, in curbing storage time, importing ... responsible in addition to being efficiently managed. A significant way to implement responsible GSCM is to reconsider, in innovative ways, the purchase and supply cycle, and a preliminary step would be to ensure that the supplier of goods successfully incorporates green criteria. Therefore, this paper proposes a multi-criteria decision-making (MCDM) approach called Fuzzy Axiomatic Design (FAD) to select the best green supplier for a Singapore-based plastic manufacturing company. At first, the environmental criteria were developed along with the traditional criteria based on the literature review ...

  20. Application of methodological approach to selection of sportswomen to calisthenics teams for group exercises, considering compatibility factor

    Directory of Open Access Journals (Sweden)

    O.S. Kozhanova

    2015-04-01

    Purpose: to substantiate a methodological approach to the selection of sportswomen to calisthenics teams for group exercises, considering the compatibility factor. Material: 40 highly qualified sportswomen aged 17-23 years, with 11-16 years of sport experience, participated in the research. Using cluster analysis, 10 gymnasts whose morphological indicators meet the modern standards of group exercises were selected. Results: we found 5 generalized factors, which characterize the structure of selection to teams and explain 72% of the dispersion. The influence of the kinds of compatibility, and of the criteria connected with them, on the efficiency of the gymnasts’ competitive performance was also determined. The authors substantiate a methodological approach to the selection of sportswomen to calisthenics teams for group exercises that considers the compatibility factor. Conclusions: in selection to calisthenics teams for group exercises, it is purposeful to carry out a complex registration of the kinds of compatibility, considering the gymnasts’ similarity by the recommended indicators.