WorldWideScience

Sample records for selective refinement approach

  1. Refining processes of selected copper alloys

    Directory of Open Access Journals (Sweden)

    S. Rzadkosz

    2009-04-01

    An analysis of the effectiveness of refining liquid copper and selected copper alloys with various micro-additions and special refining substances was performed. The influence of purifying, modifying and deoxidation operations carried out in the metal bath on the properties of selected copper-matrix alloys was examined. Refining substances, protective-purifying slags, and deoxidising and modifying agents containing micro-additions of elements such as zirconium, boron, phosphorus, sodium and lithium, or their compounds, were applied to change the microstructures and properties of the alloys. Particular attention was paid to the macro- and microstructures of the alloys, their tensile strength and elongation, and their sensitivity to hot cracking. Refining effects were estimated by comparing the effectiveness of microstructure changes with property changes in copper and in selected alloys from the group of tin bronzes.

  2. A refined approach: Saudi Arabia moves beyond crude

    International Nuclear Information System (INIS)

    Krane, Jim

    2015-01-01

    Saudi Arabia's role in global energy markets is changing. The kingdom is reshaping itself as a supplier of refined petroleum products while moving beyond its long-held role as a simple exporter of crude oil. This change is commensurate with the typical development trajectory of a state progressing to a more advanced stage of global economic integration. Gains from increased refining include reducing fuel imports and capturing margins now bequeathed to competitors. Refining also allows the kingdom to export its heavy crude oil to a wider array of customers, beyond select importers configured to handle heavy crudes. However, the move also presents strategic complications. The world's 'swing supplier' of oil may grow less willing or able to adjust supply to suit market demands. In the process, Saudi Arabia may have to update the old "oil for security" relationship that links it with Washington, augmenting it with a more diverse set of economic and investment ties with individual companies and countries, including China.

    Highlights:
    • Saudi Arabia is diverting crude oil into an expanding refining sector.
    • In doing so, the kingdom is moving beyond its role as global "swing supplier" of crude oil.
    • The kingdom will benefit from increased refining, including enhanced demand for heavy crude.
    • Strategic complications may force it to seek security partners beyond Washington.

  3. Risk as economic category: systematics scientific approach and refinement contents

    OpenAIRE

    V.G. Vygovskyy

    2015-01-01

    The paper studies the categorical-conceptual apparatus of risk and refines it on the basis of a critical analysis of existing systematic scientific approaches. It finds that refining the economic nature of risk raises a number of controversial issues: the definition of the objective or subjective nature of risk; the matching of concepts such as «risk», «danger», «loss» and «probability of loss»; the definition of negative or positive consequences of risk; and the identification of risk with its conse...

  4. Risk as economic category: systematics scientific approach and refinement contents

    Directory of Open Access Journals (Sweden)

    V.G. Vygovskyy

    2015-03-01

    The paper studies the categorical-conceptual apparatus of risk and refines it on the basis of a critical analysis of existing systematic scientific approaches. It finds that refining the economic nature of risk raises a number of controversial issues: the definition of the objective or subjective nature of risk; the matching of concepts such as «risk», «danger», «loss» and «probability of loss»; the definition of negative or positive consequences of risk; and the identification of risk with its consequences or with its source of origin, all of which make the research topic relevant. As a result of the research, the interpretation of risk as an economic category was refined: a characteristic of the company associated with the probability of unforeseen situations that may lead to negative or positive impacts, the assessment of which requires the development of alternative management decisions. The clarified definition focuses on the possibility (probability) of favourable or unfavourable events that require corrective action by the management of the enterprise. The author emphasizes the mandatory features of the category of «risk», in particular: the concept of risk is always associated with the uncertainty of the future; an occurring event has consequences for the enterprise (both negative and positive); these consequences necessitate the development of a number of alternative solutions for eliminating the negative consequences of risky events; risk is a mandatory attribute of modern management (its importance is enhanced under market conditions); and risk is subject to assessment and management by the company. The features identified and updated here contribute to clarifying the nature of economic risk and the categorical-conceptual apparatus of risk management.

  5. An approach of requirements tracing in formal refinement

    DEFF Research Database (Denmark)

    Jastram, Michael; Hallerstede, Stefan; Leuschel, Michael

    2010-01-01

    Formal modeling of computing systems yields models that are intended to be correct with respect to the requirements that have been formalized. The complexity of typical computing systems can be addressed by formal refinement introducing all the necessary details piecemeal. We report on preliminar...... changes, making use of corresponding techniques already built into the Event-B method....

  6. Decadal climate prediction with a refined anomaly initialisation approach

    Science.gov (United States)

    Volpi, Danila; Guemas, Virginie; Doblas-Reyes, Francisco J.; Hawkins, Ed; Nichols, Nancy K.

    2017-03-01

    In decadal prediction, the objective is to exploit both the sources of predictability from the external radiative forcings and from the internal variability to provide the best possible climate information for the next decade. Predicting the climate system internal variability relies on initialising the climate model from observational estimates. We present a refined method of anomaly initialisation (AI) applied to the ocean and sea ice components of the global climate forecast model EC-Earth, with the following key innovations: (1) the use of a weight applied to the observed anomalies, in order to avoid the risk of introducing anomalies recorded in the observed climate, whose amplitude does not fit in the range of the internal variability generated by the model; (2) the AI of the ocean density, instead of calculating it from the anomaly initialised state of temperature and salinity. An experiment initialised with this refined AI method has been compared with a full field and standard AI experiment. Results show that the use of such refinements enhances the surface temperature skill over part of the North and South Atlantic, part of the South Pacific and the Mediterranean Sea for the first forecast year. However, part of such improvement is lost in the following forecast years. For the tropical Pacific surface temperature, the full field initialised experiment performs the best. The prediction of the Arctic sea-ice volume is improved by the refined AI method for the first three forecast years and the skill of the Atlantic multidecadal oscillation is significantly increased compared to a non-initialised forecast, along the whole forecast time.
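
    The weighted anomaly initialisation of innovation (1) amounts to starting the model from its own climatology plus a damped observed anomaly. Below is a minimal numpy sketch of that idea; the field layout and the variance-ratio weight are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def weighted_anomaly_init(obs, obs_clim, model_clim, sigma_model, sigma_obs):
    """Refined anomaly initialisation (hedged sketch).

    The observed anomaly is damped so that its amplitude stays within
    the range of internal variability the model can generate.
    """
    anomaly = obs - obs_clim                              # observed anomaly
    weight = np.clip(sigma_model / sigma_obs, 0.0, 1.0)   # damp oversized anomalies
    return model_clim + weight * anomaly                  # initial model state
```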

  7. Use of nutrient self selection as a diet refining tool in Tenebrio molitor (Coleoptera: Tenebrionidae)

    Science.gov (United States)

    A new method to refine existing dietary supplements for improving production of the yellow mealworm, Tenebrio molitor L. (Coleoptera: Tenebrionidae), was tested. Self selected ratios of 6 dietary ingredients by T. molitor larvae were used to produce a dietary supplement. This supplement was compared...

  8. Spatially adaptive hp refinement approach for PN neutron transport equation using spectral element method

    International Nuclear Information System (INIS)

    Nahavandi, N.; Minuchehr, A.; Zolfaghari, A.; Abbasi, M.

    2015-01-01

    Highlights:
    • A powerful hp-SEM refinement approach for the PN neutron transport equation is presented.
    • The method provides great geometrical flexibility and lower computational cost.
    • Arbitrarily high orders and non-uniform meshes can be used.
    • Both a posteriori and a priori local error estimation approaches are employed.
    • Highly accurate results are compared against other common adaptive and uniform grids.

    Abstract: In this work we present an adaptive hp-SEM approach obtained by combining the Spectral Element Method (SEM) with adaptive hp refinement. SEM nodal discretization together with adaptive hp grid refinement of the even-parity Boltzmann neutron transport equation yields a powerful refinement approach with highly accurate solutions. A computer code has been developed to solve the multi-group neutron transport equation in one-dimensional geometry using even-parity transport theory. The spatial dependence of the flux is represented via SEM with Lobatto orthogonal polynomials. Two common error estimation approaches, a posteriori and a priori, have been implemented. The combination of SEM nodal discretization and adaptive hp grid refinement leads to highly accurate solutions; the efficiency of coarser meshes and the significant reduction of program runtime relative to other common refinement methods and uniform meshing approaches are demonstrated on several well-known transport benchmarks.
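
    As a rough illustration of the hp decision logic described above (not the authors' code; the smoothness indicator, thresholds and element layout are assumptions), an adaptive pass estimates a local error per element and then either splits the element (h-refinement) or raises its polynomial order (p-refinement):

```python
def adapt_hp(elements, estimate_error, estimate_smoothness,
             tol=1e-4, smooth_cut=0.8):
    """One pass of hp adaptivity (hedged sketch).

    elements: dicts with 'nodes' = (x_left, x_right) and 'order' = p.
    estimate_error / estimate_smoothness: user-supplied local indicators.
    """
    refined = []
    for el in elements:
        if estimate_error(el) <= tol:
            refined.append(el)                              # converged element
        elif estimate_smoothness(el) > smooth_cut:
            refined.append(dict(el, order=el["order"] + 1)) # smooth: p-refine
        else:
            xl, xr = el["nodes"]
            xm = 0.5 * (xl + xr)                            # non-smooth: h-refine
            refined.append(dict(el, nodes=(xl, xm)))
            refined.append(dict(el, nodes=(xm, xr)))
    return refined
```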

  9. Repetitive Identification of Structural Systems Using a Nonlinear Model Parameter Refinement Approach

    Directory of Open Access Journals (Sweden)

    Jeng-Wen Lin

    2009-01-01

    This paper proposes a statistical-confidence-interval-based nonlinear model parameter refinement approach for the health monitoring of structural systems subjected to seismic excitations. The developed model refinement approach uses the 95% confidence intervals of the estimated structural parameters to determine their statistical significance in a least-squares regression setting. When a parameter's confidence interval covers zero, it is statistically justifiable to truncate that parameter. The remaining parameters repeatedly undergo this parameter-sifting process for model refinement until the parameters' statistical significance can no longer be improved. This newly developed model refinement approach is implemented for series models of multivariable polynomial expansions: the linear, Taylor series, and power series models, leading to more accurate identification as well as a more controllable design for system vibration control. Because the statistical-regression-based model refinement approach intrinsically processes a "batch" of data and yields ensemble-average estimates such as the structural stiffness, the Kalman filter and one of its extended versions are introduced into the refined power series model for structural health monitoring.
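
    The confidence-interval sifting step lends itself to a compact sketch. The following uses ordinary least squares from statsmodels on placeholder data; it is a schematic of the truncation rule, not the paper's implementation (which couples it to nonlinear structural models):

```python
import numpy as np
import statsmodels.api as sm

def sift_parameters(X, y, alpha=0.05):
    """Iteratively drop regressors whose 95% CI covers zero."""
    cols = list(range(X.shape[1]))        # active parameter indices
    while True:
        res = sm.OLS(y, X[:, cols]).fit()
        ci = res.conf_int(alpha)          # per-parameter [lower, upper]
        covers_zero = (ci[:, 0] < 0) & (ci[:, 1] > 0)
        if not covers_zero.any():
            return cols, res              # all remaining terms significant
        cols.pop(int(np.argmax(res.pvalues)))   # truncate weakest term, refit
```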

  10. Refining the Committee Approach and Uncertainty Prediction in Hydrological Modelling

    NARCIS (Netherlands)

    Kayastha, N.

    2014-01-01

    Due to the complexity of hydrological systems, a single model may be unable to capture the full range of a catchment response and accurately predict the streamflows. The multi-modelling approach opens up possibilities for handling such difficulties and makes it possible to improve the predictive capability of...

  11. A Novel Approach to Veterinary Spatial Epidemiology: Dasymetric Refinement of the Swiss Dog Tumor Registry Data

    Science.gov (United States)

    Boo, G.; Fabrikant, S. I.; Leyk, S.

    2015-08-01

    In spatial epidemiology, disease incidence and demographic data are commonly summarized within larger regions such as administrative units because of privacy concerns. As a consequence, analyses using these aggregated data are subject to the Modifiable Areal Unit Problem (MAUP) as the geographical manifestation of ecological fallacy. In this study, we create small-area disease estimates through dasymetric refinement, and investigate the effects on predictive epidemiological models. We perform a binary dasymetric refinement of municipality-aggregated dog tumor incidence counts in Switzerland for the year 2008 using residential land as a limiting ancillary variable. This refinement is expected to improve the quality of spatial data originally aggregated within arbitrary administrative units by deconstructing them into discontinuous subregions that better reflect the underlying population distribution. To shed light on the effects of this refinement, we compare a predictive statistical model that uses unrefined administrative units with one that uses dasymetrically refined spatial units. Model diagnostics and spatial distributions of model residuals are assessed to evaluate the model performances in different regions. In particular, we explore changes in the spatial autocorrelation of the model residuals due to spatial refinement of the enumeration units in a selected mountainous region, where the rugged topography induces great shifts of the analytical units, i.e., residential land. Such spatial data quality refinement results in a more realistic estimation of the population distribution within administrative units, and thus, in a more accurate modeling of dog tumor incidence patterns. Our results emphasize the benefits of implementing a dasymetric modeling framework in veterinary spatial epidemiology.
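
    A minimal numpy sketch of the binary dasymetric step (hypothetical rasterized inputs; the study itself works with municipality polygons and residential land in a GIS):

```python
import numpy as np

def binary_dasymetric(counts, unit_id, residential):
    """Redistribute unit-level counts over residential cells only.

    counts: dict mapping unit label -> aggregated incidence count
    unit_id: 2-D array of administrative-unit labels per raster cell
    residential: 2-D boolean array, True on residential land
    """
    out = np.zeros(unit_id.shape, dtype=float)
    for unit, total in counts.items():
        mask = (unit_id == unit) & residential
        n = mask.sum()
        if n:
            out[mask] = total / n   # spread the count uniformly over
    return out                      # the unit's residential cells
```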

  12. Refinement of Triple-Negative Breast Cancer Molecular Subtypes: Implications for Neoadjuvant Chemotherapy Selection.

    Directory of Open Access Journals (Sweden)

    Brian D Lehmann

    Triple-negative breast cancer (TNBC) is a heterogeneous disease that can be classified into distinct molecular subtypes by gene expression profiling. Considered a difficult-to-treat cancer, a fraction of TNBC patients benefit significantly from neoadjuvant chemotherapy and have far better overall survival. Outside of BRCA1/2 mutation status, biomarkers do not exist to identify patients most likely to respond to current chemotherapy; and, to date, no FDA-approved targeted therapies are available for TNBC patients. Previously, we developed an approach to identify six molecular subtypes of TNBC (TNBCtype), with each subtype displaying unique ontologies and differential response to standard-of-care chemotherapy. Given the complexity of the varying histological landscape of tumor specimens, we used histopathological quantification and laser-capture microdissection to determine that transcripts in the previously described immunomodulatory (IM) and mesenchymal stem-like (MSL) subtypes were contributed by infiltrating lymphocytes and tumor-associated stromal cells, respectively. Therefore, we refined TNBC molecular subtypes from six (TNBCtype) into four (TNBCtype-4) tumor-specific subtypes (BL1, BL2, M and LAR) and demonstrate differences in diagnosis age, grade, local and distant disease progression and histopathology. Using five publicly available neoadjuvant chemotherapy breast cancer gene expression datasets, we retrospectively evaluated chemotherapy response of over 300 TNBC patients from pretreatment biopsies subtyped using either the intrinsic (PAM50) or TNBCtype approaches. Combined analysis of TNBC patients demonstrated that TNBC subtypes significantly differ in response to similar neoadjuvant chemotherapy, with 41% of BL1 patients achieving a pathological complete response compared to 18% for BL2 and 29% for LAR (95% confidence intervals (CIs): [33, 51], [9, 28] and [17, 41], respectively). Collectively, we provide pre-clinical data that could inform
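
    The quoted response rates are binomial proportions with confidence intervals; they can be reproduced approximately as follows (the counts below are illustrative back-calculations from the percentages, not the study's actual patient numbers):

```python
from statsmodels.stats.proportion import proportion_confint

# hypothetical pCR counts per subtype: (responders, patients)
subtype_counts = {"BL1": (40, 97), "BL2": (9, 50), "LAR": (16, 55)}

for name, (k, n) in subtype_counts.items():
    lo, hi = proportion_confint(k, n, alpha=0.05, method="wilson")
    print(f"{name}: pCR {k / n:.0%}, 95% CI [{lo:.0%}, {hi:.0%}]")
```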

  13. Refining a Tool for the Selection of Experts in Educational Research

    Directory of Open Access Journals (Sweden)

    Miguel Cruz Ramírez

    2012-11-01

    In this paper we report a research study geared toward refining an empirical instrument for the selection of experts for educational research, according to its reliability and internal consistency. To this end we used a three-round Delphi technique and subjected the results to a factor analysis. Latent variables were determined that explain the nature of the sources of argumentation necessary for ensuring an adequate level of competence on the part of the experts.

  14. Gradual approach to refinement of the nasal tip: surgical results

    Directory of Open Access Journals (Sweden)

    Thiago Bittencourt Ottoni de Carvalho

    2015-02-01

    Introduction: The complexity of the nasal tip structures and the impact of surgical maneuvers make the prediction of the final outcome very difficult. Therefore, no single technique is enough to correct the several anatomical presentations, and adequate preoperative planning represents the basis of rhinoplasty. Objective: To present results of rhinoplasty through a gradual surgical approach to nasal tip definition based on anatomical features, and to evaluate the degree of patient satisfaction after the surgical procedure. Methods: A longitudinal retrospective cohort study of the medical charts of 533 patients of both genders who underwent rhinoplasty from January of 2005 to January of 2012 was performed. Cases were allocated into seven groups: (1) no surgery on nasal tip; (2) interdomal breakup; (3) cephalic trim; (4) domal suture; (5) shield-shaped graft; (6) vertical dome division; (7) replacement of lower lateral cartilages. Results: Group 4 was the most prevalent. The satisfaction rate was 96% and revision surgery occurred in 4% of cases. Conclusion: The protocol used allowed the implementation of a gradual surgical approach to nasal tip definition according to the nasal anatomical characteristics, with a high rate of patient satisfaction with the surgical outcome and a low rate of revision.

  15. Refining mass formulas for astrophysical applications: A Bayesian neural network approach

    Science.gov (United States)

    Utama, R.; Piekarewicz, J.

    2017-10-01

    Background: Exotic nuclei, particularly those near the drip lines, are at the core of one of the fundamental questions driving nuclear structure and astrophysics today: What are the limits of nuclear binding? Exotic nuclei play a critical role in both informing theoretical models as well as in our understanding of the origin of the heavy elements. Purpose: Our aim is to refine existing mass models through the training of an artificial neural network that will mitigate the large model discrepancies far away from stability. Methods: The basic paradigm of our two-pronged approach is an existing mass model that captures as much as possible of the underlying physics followed by the implementation of a Bayesian neural network (BNN) refinement to account for the missing physics. Bayesian inference is employed to determine the parameters of the neural network so that model predictions may be accompanied by theoretical uncertainties. Results: Despite the undeniable quality of the mass models adopted in this work, we observe a significant improvement (of about 40%) after the BNN refinement is implemented. Indeed, in the specific case of the Duflo-Zuker mass formula, we find that the rms deviation relative to experiment is reduced from σrms=0.503 MeV to σrms=0.286 MeV. These newly refined mass tables are used to map the neutron drip lines (or rather "drip bands") and to study a few critical r -process nuclei. Conclusions: The BNN approach is highly successful in refining the predictions of existing mass models. In particular, the large discrepancy displayed by the original "bare" models in regions where experimental data are unavailable is considerably quenched after the BNN refinement. This lends credence to our approach and has motivated us to publish refined mass tables that we trust will be helpful for future astrophysical applications.
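
    The two-pronged strategy is to keep the physics-based mass model and learn only its residuals. A rough stand-in is sketched below with an ensemble of small networks supplying the uncertainty estimate; a true BNN samples network weights instead, and scikit-learn is an assumption, not the authors' tooling:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def refine_mass_model(ZN, m_exp, m_model, n_members=20):
    """Train an ensemble on the residuals delta = M_exp - M_model.

    ZN: array (n, 2) of proton and neutron numbers; masses in MeV.
    Returns a predictor giving (mean correction, spread) for new nuclei.
    """
    residual = m_exp - m_model
    members = [MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=5000,
                            random_state=i).fit(ZN, residual)
               for i in range(n_members)]

    def predict(zn):
        preds = np.array([m.predict(zn) for m in members])
        return preds.mean(axis=0), preds.std(axis=0)   # correction +/- spread
    return predict
```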

  16. An adaptive mesh refinement approach for average current nodal expansion method in 2-D rectangular geometry

    International Nuclear Information System (INIS)

    Poursalehi, N.; Zolfaghari, A.; Minuchehr, A.

    2013-01-01

    Highlights:
    ► A new adaptive h-refinement approach has been developed for a class of nodal methods.
    ► The resulting system of nodal equations is more amenable to efficient numerical solution.
    ► The benefit of the approach is reduced computational effort relative to uniform fine mesh modeling.
    ► The spatially adaptive approach greatly enhances the accuracy of the solution.

    Abstract: The aim of this work is to develop a spatially adaptive coarse mesh strategy that progressively refines the nodes in appropriate regions of the domain to solve the neutron balance equation by the zeroth-order nodal expansion method. A flux-gradient-based a posteriori estimation scheme has been utilized for checking the approximate solutions for the various nodes, with the relative surface net leakage of the nodes taken as the assessment criterion. In this approach, the core module is called by the adaptive mesh generator to determine the gradients of node surface fluxes and explore the possibility of node refinement in appropriate regions and directions of the problem. The benefit of the approach is reduced computational effort relative to uniform fine mesh modeling. For this purpose, a computer program, ANRNE-2D (Adaptive Node Refinement Nodal Expansion), has been developed to solve the neutron diffusion equation using the average current nodal expansion method for 2D rectangular geometries. Implementing the adaptive algorithm confirms its superiority in enhancing the accuracy of the solution without using fine nodes throughout the domain or increasing the number of unknowns. Some well-known benchmarks have been investigated and improvements are reported.
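
    The marking step with the surface net leakage criterion might look as follows; the node data layout, the leakage normalization and the threshold are assumptions for illustration:

```python
def mark_nodes_for_refinement(nodes, threshold=0.05):
    """Flag nodes whose relative surface net leakage is large.

    nodes: list of dicts with partial currents 'inflow' and 'outflow'
    summed over each node's surfaces.
    """
    marked = []
    for i, node in enumerate(nodes):
        net = node["outflow"] - node["inflow"]
        total = node["outflow"] + node["inflow"]
        if total > 0 and abs(net) / total > threshold:
            marked.append(i)        # refine this node in the next pass
    return marked
```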

  17. Application of multi-criteria material selection techniques to constituent refinement in biobased composites

    International Nuclear Information System (INIS)

    Miller, Sabbie A.; Lepech, Michael D.; Billington, Sarah L.

    2013-01-01

    Highlights:
    • Biobased composites have the potential to replace certain engineered materials.
    • Woven reinforcement can provide better material properties in biobased composites.
    • Short fiber filler can provide lower environmental impact in biobased composites.
    • Per function, different fibers are desired to lower composite environmental impact.

    Abstract: Biobased composites offer a potentially low environmental impact material option for the construction industries. Designing these materials to meet both performance requirements for an application and minimize environmental impacts requires the ability to refine composite constituents based on environmental impact and mechanical properties. In this research, biobased composites with varying natural fiber reinforcement in a poly(β-hydroxybutyrate)-co-(β-hydroxyvalerate) matrix were characterized based on material properties through experiments and environmental impact through life cycle assessments. Using experimental results, these biobased composites were found to have competitive flexural properties and thermal conductivity with certain short-chopped glass fiber reinforced plastics. Multi-criteria material selection techniques were applied to weigh desired material properties with greenhouse gas emissions, fossil fuel demand, and Eco-Indicator '99 score. The effects of using different reinforcing fibers in biobased composites were analyzed using the developed selection scheme as a tool for choosing constituents. The use of multi-criteria material selection provided the ability to select fiber reinforcement for biobased composites and showed when it would be more appropriate to use a novel biobased composite or a currently available engineered material.
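
    A common multi-criteria scheme normalizes each property, flips the "smaller is better" criteria, applies decision weights and ranks the candidates. The sketch below uses made-up values and weights purely to show the mechanics, not the paper's data:

```python
import numpy as np

# rows: candidate composites; columns: flexural strength (maximize),
# GHG emissions (minimize), fossil fuel demand (minimize)
values = np.array([[60.0, 2.1, 45.0],    # hemp-fiber composite (hypothetical)
                   [75.0, 3.4, 80.0],    # glass-fiber composite (hypothetical)
                   [55.0, 1.8, 40.0]])   # flax-fiber composite (hypothetical)
maximize = np.array([True, False, False])
weights = np.array([0.5, 0.3, 0.2])

span = values.max(axis=0) - values.min(axis=0)
norm = (values - values.min(axis=0)) / span      # scale criteria to [0, 1]
norm[:, ~maximize] = 1.0 - norm[:, ~maximize]    # flip minimization criteria
scores = norm @ weights                          # weighted-sum score
print("best candidate index:", int(scores.argmax()))
```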

  18. Modulation wave approach to the structural parameterization and Rietveld refinement of low carnegieite

    International Nuclear Information System (INIS)

    Withers, R.L.; Thompson, J.G.

    1993-01-01

    The crystal structure of low carnegieite, NaAlSiO4 [Mr = 142.05, orthorhombic, Pb21a, a = 10.261(1), b = 14.030(2), c = 5.1566(6) Å, Dx = 2.542 g cm−3, Z = 4, Cu Kα1, λ = 1.5406 Å, μ = 77.52 cm−1, F(000) = 559.85], is determined via Rietveld refinement from powder data, Rp = 0.057, Rwp = 0.076, RBragg = 0.050. Given that there are far too many parameters to be determined via unconstrained Rietveld refinement, a group theoretical or modulation wave approach is used in order to parameterize the structural deviation of low carnegieite from its underlying C9 aristotype. Appropriate crystal chemical constraints are applied in order to provide two distinct plausible starting models for the structure of the aluminosilicate framework. The correct starting model for the aluminosilicate framework as well as the ordering and positions of the non-framework Na atoms are then determined via Rietveld refinement. At all stages, chemical plausibility is checked via the use of the bond-length-bond-valence formalism. The JCPDS file number for low carnegieite is 44-1496. (orig.)

  19. A Novel Admixture-Based Pharmacogenetic Approach to Refine Warfarin Dosing in Caribbean Hispanics

    Science.gov (United States)

    Claudio-Campos, Karla; Rivera-Miranda, Giselle; Bermúdez-Bosch, Luis; Renta, Jessicca Y.; Cadilla, Carmen L.; Cruz, Iadelisse; Feliu, Juan F.; Vergara, Cunegundo; Ruaño, Gualberto

    2016-01-01

    Aim: This study is aimed at developing a novel admixture-adjusted pharmacogenomic approach to individually refine warfarin dosing in Caribbean Hispanic patients. Patients & Methods: A multiple linear regression analysis of effective warfarin doses versus relevant genotypes, admixture, clinical and demographic factors was performed in 255 patients and further validated externally in another cohort of 55 individuals. Results: The admixture-adjusted, genotype-guided warfarin dosing refinement algorithm developed in Caribbean Hispanics showed better predictability (R2 = 0.70, MAE = 0.72 mg/day) than a clinical algorithm that excluded genotypes and admixture (R2 = 0.60, MAE = 0.99 mg/day), and outperformed two prior pharmacogenetic algorithms in predicting effective dose in this population. For patients at the highest risk of adverse events, 45.5% of the dose predictions using the developed pharmacogenetic model resulted in ideal dose as compared with only 29% when using the clinical non-genetic algorithm (p < 0.001). The admixture-driven pharmacogenetic algorithm predicted 58% of warfarin dose variance when externally validated in 55 individuals from an independent validation cohort (MAE = 0.89 mg/day, 24% mean bias). Conclusions: Results supported our rationale to incorporate individuals' genotypes and unique admixture metrics into pharmacogenetic refinement models in order to increase predictability when expanding them to admixed populations like Caribbean Hispanics. Trial Registration: ClinicalTrials.gov NCT01318057 PMID:26745506
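
    Fitting and validating such a dosing algorithm reduces to multiple linear regression plus hold-out metrics; a hedged sketch (the predictor columns, e.g. CYP2C9/VKORC1 genotypes, admixture proportions and clinical covariates, are assumed, and scikit-learn is illustrative tooling):

```python
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error, r2_score

def fit_dosing_algorithm(X_train, dose_train, X_val, dose_val):
    """Regress effective warfarin dose on genotype, admixture and
    clinical covariates, then report external-validation metrics."""
    model = LinearRegression().fit(X_train, dose_train)
    pred = model.predict(X_val)
    print(f"R2  = {r2_score(dose_val, pred):.2f}")
    print(f"MAE = {mean_absolute_error(dose_val, pred):.2f} mg/day")
    return model
```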

  20. A Novel Admixture-Based Pharmacogenetic Approach to Refine Warfarin Dosing in Caribbean Hispanics.

    Directory of Open Access Journals (Sweden)

    Jorge Duconge

    This study is aimed at developing a novel admixture-adjusted pharmacogenomic approach to individually refine warfarin dosing in Caribbean Hispanic patients. A multiple linear regression analysis of effective warfarin doses versus relevant genotypes, admixture, clinical and demographic factors was performed in 255 patients and further validated externally in another cohort of 55 individuals. The admixture-adjusted, genotype-guided warfarin dosing refinement algorithm developed in Caribbean Hispanics showed better predictability (R2 = 0.70, MAE = 0.72 mg/day) than a clinical algorithm that excluded genotypes and admixture (R2 = 0.60, MAE = 0.99 mg/day), and outperformed two prior pharmacogenetic algorithms in predicting effective dose in this population. For patients at the highest risk of adverse events, 45.5% of the dose predictions using the developed pharmacogenetic model resulted in ideal dose as compared with only 29% when using the clinical non-genetic algorithm (p < 0.001). The admixture-driven pharmacogenetic algorithm predicted 58% of warfarin dose variance when externally validated in 55 individuals from an independent validation cohort (MAE = 0.89 mg/day, 24% mean bias). Results supported our rationale to incorporate individuals' genotypes and unique admixture metrics into pharmacogenetic refinement models in order to increase predictability when expanding them to admixed populations like Caribbean Hispanics. ClinicalTrials.gov NCT01318057.

  1. Grid refinement for aeroacoustics in the lattice Boltzmann method: A directional splitting approach

    Science.gov (United States)

    Gendre, Félix; Ricot, Denis; Fritz, Guillaume; Sagaut, Pierre

    2017-08-01

    This study focuses on grid refinement techniques for the direct simulation of aeroacoustics, when using weakly compressible lattice Boltzmann models, such as the D3Q19 athermal velocity set. When it comes to direct noise computation, very small errors on the density or pressure field may have great negative consequences. Even strong acoustic density fluctuations have indeed a clearly lower amplitude than the hydrodynamic ones. This work deals with such very weak spurious fluctuations that emerge when a vortical structure crosses a refinement interface, which may contaminate the resulting aeroacoustic field. We show through an extensive literature review that, within the framework described above, this issue has never been addressed before. To tackle this problem, we develop an alternative algorithm and compare its behavior to a classical one, which fits our in-house vertex-centered data structure. Our main idea relies on a directional splitting of the continuous discrete velocity Boltzmann equation, followed by an integration over specific characteristics. This method can be seen as a specific coupling between finite difference and lattice Boltzmann, locally on the interface between the two grids. The method is assessed considering two cases: an acoustic pulse and a convected vortex. We show how very small errors on the density field arise and propagate throughout the domain when a vortical flow crosses the refinement interface. We also show that an increased free stream Mach number (but still within the weakly compressible regime) strongly deteriorates the situation, although the magnitude of the errors may remain negligible for purely aerodynamic studies. A drastically reduced level of error for the near-field spurious noise is obtained with our approach, especially for under-resolved simulations, a situation that is crucial for industrial applications. Thus, the vortex case is proved useful for aeroacoustic validations of any grid refinement algorithm.

  2. A Semi-Supervised Approach for Refining Transcriptional Signatures of Drug Response and Repositioning Predictions.

    Directory of Open Access Journals (Sweden)

    Francesco Iorio

    We present a novel strategy to identify drug-repositioning opportunities. The starting point of our method is the generation of a signature summarising the consensual transcriptional response of multiple human cell lines to a compound of interest (the seed compound). This signature can be derived from data in existing databases, such as the connectivity-map, and it is used in the first instance to query a network interlinking all the connectivity-map compounds based on the similarity of their transcriptional responses. This provides a drug neighbourhood composed of compounds predicted to share some effects with the seed one. The original signature is then refined by systematically reducing its overlap with the transcriptional responses induced by drugs in this neighbourhood that are known to share a secondary effect with the seed compound. Finally, the drug network is queried again with the resulting refined signatures, and the whole process is carried out for a number of iterations. Drugs in the final refined neighbourhood are then predicted to exert the principal mode of action of the seed compound. We illustrate our approach using paclitaxel (a microtubule stabilising agent) as seed compound. Our method predicts that glipizide and splitomicin perturb microtubule function in human cells: a result that could not be obtained through standard signature matching methods. In agreement, we find that glipizide and splitomicin reduce interphase microtubule growth rates and transiently increase the percentage of mitotic cells, consistent with our prediction. Finally, we validated the refined signatures of paclitaxel response by mining a large drug screening dataset, showing that human cancer cell lines whose basal transcriptional profile is anti-correlated with them are significantly more sensitive to paclitaxel and docetaxel.
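
    The refinement loop can be sketched with signatures as score vectors: at each iteration the component shared with a confounder drug's response is projected out, and the network is re-queried with the cleaned signature. This is a schematic of the idea only; the paper's similarity measure and overlap-reduction rule differ in detail:

```python
import numpy as np

def refine_signature(seed_sig, drug_sigs, confounder_ids, n_iter=5, top_k=50):
    """Reduce a seed signature's overlap with confounder responses.

    seed_sig: 1-D array of gene scores; drug_sigs: dict id -> array;
    confounder_ids: neighbours sharing only a secondary effect.
    """
    sig = seed_sig.astype(float).copy()
    for _ in range(n_iter):
        for cid in confounder_ids:
            conf = drug_sigs[cid]
            sig -= (sig @ conf) / (conf @ conf) * conf   # project out overlap
        sig /= np.linalg.norm(sig)
    # re-query the network: drugs most similar to the refined signature
    sims = {d: (s @ sig) / (np.linalg.norm(s) * np.linalg.norm(sig))
            for d, s in drug_sigs.items()}
    return sig, sorted(sims, key=sims.get, reverse=True)[:top_k]
```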

  3. Comparison of Two Grid Refinement Approaches for High Resolution Regional Climate Modeling: MPAS vs WRF

    Science.gov (United States)

    Leung, L.; Hagos, S. M.; Rauscher, S.; Ringler, T.

    2012-12-01

    This study compares two grid refinement approaches for high-resolution regional climate modeling: a global variable-resolution model and nesting. The global variable-resolution model, the Model for Prediction Across Scales (MPAS), and the limited-area model, the Weather Research and Forecasting (WRF) model, are compared in an idealized aqua-planet context with a focus on the spatial and temporal characteristics of tropical precipitation simulated by the models using the same physics package from the Community Atmosphere Model (CAM4). For MPAS, simulations have been performed with a quasi-uniform resolution global domain at coarse (1 degree) and high (0.25 degree) resolution, and with a variable-resolution domain with a high-resolution region at 0.25 degree configured inside a coarse-resolution global domain at 1 degree resolution. Similarly, WRF has been configured to run on a coarse (1 degree) and high (0.25 degree) resolution tropical channel domain, as well as on a nested domain with a high-resolution region at 0.25 degree nested two-way inside the coarse-resolution (1 degree) tropical channel. The variable-resolution or nested simulations are compared against the high-resolution simulations that serve as virtual reality. Both MPAS and WRF simulate 20-day Kelvin waves propagating through the high-resolution domains fairly unaffected by the change in resolution. In addition, both models respond to increased resolution with enhanced precipitation. Grid refinement induces zonal asymmetry in precipitation (heating), accompanied by anomalous zonal Walker-like circulations and standing Rossby wave signals. However, there are important differences between the anomalous patterns in MPAS and WRF due to differences in the grid refinement approaches and the sensitivity of model physics to grid resolution. This study highlights the need for "scale-aware" parameterizations in variable-resolution and nested regional models.

  4. Structure refinement and membrane positioning of selectively labeled OmpX in phospholipid nanodiscs

    Energy Technology Data Exchange (ETDEWEB)

    Hagn, Franz, E-mail: franz.hagn@tum.de; Wagner, Gerhard, E-mail: gerhard-wagner@hms.harvard.edu [Harvard Medical School, Department of Biological Chemistry and Molecular Pharmacology (United States)

    2015-04-15

    NMR structural studies on membrane proteins are often complicated by their large size, taking into account the contribution of the membrane mimetic. Therefore, classical resonance assignment approaches often fail. The large size of phospholipid nanodiscs, a detergent-free phospholipid bilayer mimetic, prevented their use in high-resolution solution-state NMR spectroscopy so far. We recently introduced smaller nanodiscs that are suitable for NMR structure determination. However, side-chain assignments of a membrane protein in nanodiscs still remain elusive. Here, we utilized a NOE-based approach to assign (stereo-)specifically labeled Ile, Leu, Val and Ala methyl labeled and uniformly ¹⁵N-Phe and ¹⁵N-Tyr labeled OmpX and calculated a refined high-resolution structure. In addition, we were able to obtain residual dipolar couplings (RDCs) of OmpX in nanodiscs using Pf1 phage medium for the induction of weak alignment. Back-calculated NOESY spectra of the obtained NMR structures were compared to experimental NOESYs in order to validate the quality of these structures. We further used NOE information between protonated lipid head groups and side-chain methyls to determine the position of OmpX in the phospholipid bilayer. These data were verified by paramagnetic relaxation enhancement (PRE) experiments obtained with Gd³⁺-modified lipids. Taken together, this study emphasizes the need for the (stereo-)specific labeling of membrane proteins in a highly deuterated background for high-resolution structure determination, particularly in large membrane mimicking systems like phospholipid nanodiscs. Structure validation by NOESY back-calculation will be helpful for the structure determination and validation of membrane proteins where NOE assignment is often difficult. The use of protein-to-lipid NOEs will be beneficial for the positioning of a membrane protein in the lipid bilayer without the need for preparing multiple protein samples.

  5. A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process

    Science.gov (United States)

    Wang, Yi; Tamai, Tetsuo

    2009-01-01

    Since the complexity of software systems continues to grow, most engineers face two serious problems: the state space explosion problem and the problem of how to debug systems. In this paper, we propose a game-theoretic approach to full branching time model checking on three-valued semantics. The three-valued models and logics provide successful abstraction that overcomes the state space explosion problem. The game style model checking that generates counter-examples can guide refinement or identify validated formulas, which solves the system debugging problem. Furthermore, output of our game style method will give significant information to engineers in detecting where errors have occurred and what the causes of the errors are.
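
    Three-valued semantics assigns each formula true, false, or unknown: definite answers transfer to the concrete system, while unknown signals that the abstraction is too coarse and must be refined. A minimal encoding of the standard Kleene connectives (not the paper's implementation):

```python
# truth values: 0.0 = false, 0.5 = unknown, 1.0 = true (Kleene logic)
def k_not(a): return 1.0 - a
def k_and(a, b): return min(a, b)
def k_or(a, b): return max(a, b)

print(k_and(1.0, 0.5))   # 0.5: no verdict on the abstraction -> refine
print(k_or(1.0, 0.5))    # 1.0: property holds, result transfers
```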

  6. “New” Antigenic Targets and Methodological Approaches for Refining Laboratory Diagnosis of Antiphospholipid Syndrome

    Science.gov (United States)

    Misasi, Roberta; Capozzi, Antonella; Longo, Agostina; Recalchi, Serena; Lococo, Emanuela; Alessandri, Cristiano; Conti, Fabrizio; Valesini, Guido

    2015-01-01

    Antiphospholipid antibodies (aPLs) are a heterogeneous group of antibodies directed against phospholipids or protein/phospholipid complexes. Currently, aPLs are assessed using either “solid-phase” assays that identify anticardiolipin antibodies and anti-β2-glycoprotein I antibodies or “liquid-phase” assay that identifies lupus anticoagulant. However, in the last few years, “new” antigenic targets and methodological approaches have been employed for refining laboratory diagnosis of antiphospholipid syndrome (APS). In this review the potential diagnostic value of antibodies to domains of β2-GPI, prothrombin/phosphatidylserine, vimentin/cardiolipin, protein S, protein C, annexin A2, annexin A5, and phospholipid antigens is discussed. Moreover, new technical approaches, including chemiluminescence, multiline dot assay, and thin layer chromatography (TLC) immunostaining, which utilize different supports for detection of aPL, have been developed. A special focus has been dedicated on “seronegative” APS, that is, those patients with a clinical profile suggestive of APS (thromboses, recurrent miscarriages, or foetal loss), who are persistently negative for the routinely used aPL. Recent findings suggest that, in sera from patients with SN-APS, antibodies may be detected using “new” antigenic targets (mainly vimentin/cardiolipin) or methodological approaches different from traditional techniques (TLC immunostaining). Thus, APS represents a mosaic, in which antibodies against different antigenic targets may be detected thanks to the continuously evolving new technologies. PMID:25874238

  7. Action Refinement

    NARCIS (Netherlands)

    Gorrieri, R.; Rensink, Arend; Bergstra, J.A.; Ponse, A.; Smolka, S.A.

    2001-01-01

    In this chapter, we give a comprehensive overview of the research results in the field of action refinement during the past 12 years. The different approaches that have been followed are outlined in detail and contrasted to each other in a uniform framework. We use two running examples to discuss

  8. Comparison of geometrical isomerization of unsaturated fatty acids in selected commercially refined oils

    Directory of Open Access Journals (Sweden)

    Tasan, M.

    2011-09-01

    Four different commercially refined vegetable oils were analyzed by capillary gas-liquid chromatography for their trans fatty acid contents. The results obtained showed that the total trans FA contents in refined sunflower, corn, soybean, and hazelnut oils were 0.68 ± 0.41, 0.51 ± 0.24, 1.27 ± 0.57, and 0.26 ± 0.07% of total FA, respectively. The total trans FA comprised isomers of the C18:1, C18:2 and C18:3 FA. Five brands of the refined sunflower oil and two brands of hazelnut oil contained no measurable amounts of total trans C18:3 acids. Total trans C18:2 acid was the predominant trans FA found in the refined sunflower and corn oils, while trans polyunsaturated FAs were found at high levels in the refined soybean oils. However, total trans C18:1 acid was the major trans FA in refined hazelnut oils. The commercially refined vegetable oils with relatively high total polyunsaturated FA contained considerable amounts of trans polyunsaturated isomers. This study indicates that it is necessary to optimize industrial deodorization, especially the time and temperature, for each different FA composition of the oil used.


  9. A spatially adaptive grid-refinement approach for the finite element solution of the even-parity Boltzmann transport equation

    International Nuclear Information System (INIS)

    Mirza, Anwar M.; Iqbal, Shaukat; Rahman, Faizur

    2007-01-01

    A spatially adaptive grid-refinement approach has been investigated to solve the even-parity Boltzmann transport equation. A residual based a posteriori error estimation scheme has been utilized for checking the approximate solutions for various finite element grids. The local particle balance has been considered as an error assessment criterion. To implement the adaptive approach, a computer program ADAFENT (adaptive finite elements for neutron transport) has been developed to solve the second order even-parity Boltzmann transport equation using the K⁺ variational principle for slab geometry. The program has a core K⁺ module which employs Lagrange polynomials as spatial basis functions for the finite element formulation and Legendre polynomials for the directional dependence of the solution. The core module is called in by the adaptive grid generator to determine local gradients and residuals to explore the possibility of grid refinements in appropriate regions of the problem. The a posteriori error estimation scheme has been implemented in the outer grid refining iteration module. Numerical experiments indicate that local errors are large in regions where the flux gradients are large. A comparison of the spatially adaptive grid-refinement approach with that of the uniform meshing approach for various benchmark cases confirms its superiority in greatly enhancing the accuracy of the solution without increasing the number of unknown coefficients. A reduction in the local errors of the order of 10² has been achieved using the new approach in some cases.

  10. A spatially adaptive grid-refinement approach for the finite element solution of the even-parity Boltzmann transport equation

    Energy Technology Data Exchange (ETDEWEB)

    Mirza, Anwar M. [Department of Computer Science, National University of Computer and Emerging Sciences, NUCES-FAST, A.K. Brohi Road, H-11, Islamabad (Pakistan)], E-mail: anwar.m.mirza@gmail.com; Iqbal, Shaukat [Faculty of Computer Science and Engineering, Ghulam Ishaq Khan (GIK) Institute of Engineering Science and Technology, Topi-23460, Swabi (Pakistan)], E-mail: shaukat@giki.edu.pk; Rahman, Faizur [Department of Physics, Allama Iqbal Open University, H-8 Islamabad (Pakistan)

    2007-07-15

    A spatially adaptive grid-refinement approach has been investigated to solve the even-parity Boltzmann transport equation. A residual based a posteriori error estimation scheme has been utilized for checking the approximate solutions for various finite element grids. The local particle balance has been considered as an error assessment criterion. To implement the adaptive approach, a computer program ADAFENT (adaptive finite elements for neutron transport) has been developed to solve the second order even-parity Boltzmann transport equation using the K⁺ variational principle for slab geometry. The program has a core K⁺ module which employs Lagrange polynomials as spatial basis functions for the finite element formulation and Legendre polynomials for the directional dependence of the solution. The core module is called in by the adaptive grid generator to determine local gradients and residuals to explore the possibility of grid refinements in appropriate regions of the problem. The a posteriori error estimation scheme has been implemented in the outer grid refining iteration module. Numerical experiments indicate that local errors are large in regions where the flux gradients are large. A comparison of the spatially adaptive grid-refinement approach with that of the uniform meshing approach for various benchmark cases confirms its superiority in greatly enhancing the accuracy of the solution without increasing the number of unknown coefficients. A reduction in the local errors of the order of 10² has been achieved using the new approach in some cases.

  11. A novel non-uniform control vector parameterization approach with time grid refinement for flight level tracking optimal control problems.

    Science.gov (United States)

    Liu, Ping; Li, Guodong; Liu, Xinggao; Xiao, Long; Wang, Yalin; Yang, Chunhua; Gui, Weihua

    2018-02-01

    A high-quality control method is essential for the implementation of an aircraft autopilot system. An optimal control problem model considering the safe aerodynamic envelope is therefore established to improve the control quality of aircraft flight level tracking. A novel non-uniform control vector parameterization (CVP) method with time grid refinement is then proposed for solving the optimal control problem. By introducing Hilbert-Huang transform (HHT) analysis, an efficient time grid refinement approach is presented and an adaptive time grid is automatically obtained. With this refinement, the proposed method needs fewer optimization parameters to achieve better control quality than the uniform refinement CVP method, while the computational cost is lower. Two well-known flight level tracking problems and one minimum-time problem are tested as illustrations, with the uniform refinement control vector parameterization method adopted as the comparative base. Numerical results show that the proposed method achieves better performance in terms of optimization accuracy and computational cost; meanwhile, the control quality is efficiently improved. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
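
    The core idea, refining the time grid where the control carries high-frequency content, can be approximated with the analytic signal from scipy; the paper's HHT includes empirical mode decomposition, so using the Hilbert transform directly, as below, is a simplification, and the frequency threshold is an assumption:

```python
import numpy as np
from scipy.signal import hilbert

def refine_time_grid(t, u, freq_cut=0.5):
    """Insert extra grid points where the instantaneous frequency is high.

    t: coarse uniform time grid; u: control profile sampled on t.
    """
    phase = np.unwrap(np.angle(hilbert(u)))
    inst_freq = np.abs(np.diff(phase)) / np.diff(t) / (2.0 * np.pi)
    new_t = [t[0]]
    for k in range(len(t) - 1):
        if inst_freq[k] > freq_cut:                # rapidly varying control:
            new_t.append(0.5 * (t[k] + t[k + 1]))  # halve this interval
        new_t.append(t[k + 1])
    return np.asarray(new_t)
```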

  12. Modeling pH-zone refining countercurrent chromatography: a dynamic approach.

    Science.gov (United States)

    Kotland, Alexis; Chollet, Sébastien; Autret, Jean-Marie; Diard, Catherine; Marchal, Luc; Renault, Jean-Hugues

    2015-04-24

    A model based on mass transfer resistances and acid-base equilibria at the liquid-liquid interface was developed for the pH-zone refining mode when it is used in countercurrent chromatography (CCC). The binary separation of catharanthine and vindoline, two alkaloids used as starting material for the semi-synthesis of chemotherapy drugs, was chosen for the model validation. Toluene/CH3CN/water (4/1/5, v/v/v) was selected as the biphasic solvent system. First, hydrodynamics and mass transfer were studied by using chemical tracers. Trypan blue, present only in the aqueous phase, allowed the determination of the parameters τextra and Pe for hydrodynamic characterization, whereas acetone, which partitioned between the two phases, allowed the determination of the transfer parameter k0a. It was shown that mass transfer was improved by increasing both flow rate and rotational speed, which is consistent with the observed mobile phase dispersion. Then, the different transfer parameters of the model (i.e. the local transfer coefficients for the different species involved in the process) were determined by fitting experimental concentration profiles. The model accurately predicted the variation of both equilibrium and dynamic factors (i.e. local mass transfer coefficients and the acid-base equilibrium constant) with the CCC operating conditions (cell number, flow rate, rotational speed and thus stationary phase retention). The initial hypotheses (the acid-base reactions occur instantaneously at the interface and the process is mainly governed by mass transfer) are thus validated. Finally, the model was used as a tool to predict the separation of catharanthine and vindoline over the whole experimental domain, corresponding to flow rates between 20 and 60 mL/min and rotational speeds from 900 to 2100 rotations per minute. Copyright © 2015 Elsevier B.V. All rights reserved.
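
    The underlying dynamic picture is a cascade of mixing cells with interphase transfer driven by k0a toward partition equilibrium. The sketch below captures only that skeleton and deliberately omits the interfacial acid-base reactions that pH-zone refining adds on top; all parameter values are placeholders:

```python
import numpy as np

def simulate_cells(n_cells=50, n_steps=20000, dt=0.01,
                   k0a=0.5, K=2.0, flow=0.2, feed=1.0):
    """Mobile-phase (c_m) and stationary-phase (c_s) solute profiles.

    Interphase transfer rate: k0a * (c_m - c_s / K), with partition
    coefficient K; the mobile phase is convected cell to cell.
    """
    c_m = np.zeros(n_cells)
    c_s = np.zeros(n_cells)
    for _ in range(n_steps):
        transfer = k0a * (c_m - c_s / K) * dt
        c_m -= transfer
        c_s += transfer
        c_m[1:] += flow * (c_m[:-1] - c_m[1:]) * dt   # downstream convection
        c_m[0] += flow * (feed - c_m[0]) * dt         # feed at the inlet
    return c_m, c_s
```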

  13. Operator Product Formulas in the Algebraic Approach of the Refined Topological Vertex

    International Nuclear Information System (INIS)

    Cai Li-Qiang; Wang Li-Fang; Wu Ke; Yang Jie

    2013-01-01

    The refined topological vertex of Iqbal-Kozçaz-Vafa has been investigated from the viewpoint of the quantum algebra of type W1+∞ by Awata, Feigin, and Shiraishi. They introduced the trivalent intertwining operator Φ, which is normal ordered along with some prefactors. We manage to establish formulas from the infinite operator product of the vertex operators and the generalized ones to restore this prefactor, and obtain an explicit formula for the vertex realization of the topological vertex as well as the refined topological vertex.

  14. A new approach to grain refinement of an Mg-Li-Al cast alloy

    International Nuclear Information System (INIS)

    Jiang, B.; Qiu, D.; Zhang, M.-X.; Ding, P.D.; Gao, L.

    2010-01-01

    Crystallographic calculation based on the edge-to-edge matching model predicted that both TiB2 and Al3Ti intermetallic compounds have strong potential to be effective grain refiners for the β phase in the Mg-14Li-1Al alloy due to the small atomic matching misfit across the interface between the compounds and the β phase. Experimental results showed that addition of 1.25 wt% Al-5Ti-1B master alloy reduced the grain size of the β phase in the alloy from 1750 to 500 μm. The possible grain refining mechanisms were also discussed.

  15. Using toxicokinetic-toxicodynamic modeling as an acute risk assessment refinement approach in vertebrate ecological risk assessment.

    Science.gov (United States)

    Ducrot, Virginie; Ashauer, Roman; Bednarska, Agnieszka J; Hinarejos, Silvia; Thorbek, Pernille; Weyman, Gabriel

    2016-01-01

    Recent guidance identified toxicokinetic-toxicodynamic (TK-TD) modeling as a relevant approach for risk assessment refinement. Yet, its added value compared to other refinement options is not detailed, and how to conduct the modeling appropriately is not explained. This case study addresses these issues through 2 examples of individual-level risk assessment for 2 hypothetical plant protection products: 1) evaluating the risk for small granivorous birds and small omnivorous mammals of a single application, as a seed treatment in winter cereals, and 2) evaluating the risk for fish after a pulsed treatment in the edge-of-field zone. Using acute test data, we conducted the first tier risk assessment as defined in the European Food Safety Authority (EFSA) guidance. When first tier risk assessment highlighted a concern, refinement options were discussed. Cases where the use of models should be preferred over other existing refinement approaches were highlighted. We then practically conducted the risk assessment refinement by using 2 different models as examples. In example 1, a TK model accounting for toxicokinetics and relevant feeding patterns in the skylark and in the wood mouse was used to predict internal doses of the hypothetical active ingredient in individuals, based on relevant feeding patterns in an in-crop situation, and identify the residue levels leading to mortality. In example 2, a TK-TD model accounting for toxicokinetics, toxicodynamics, and relevant exposure patterns in the fathead minnow was used to predict the time-course of fish survival for relevant FOCUS SW exposure scenarios and identify which scenarios might lead to mortality. Models were calibrated using available standard data and implemented to simulate the time-course of internal dose of active ingredient or survival for different exposure scenarios. Simulation results were discussed and used to derive the risk assessment refinement endpoints used for decision. Finally, we compared the
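
    In the spirit of the TK-TD (GUTS-type) models discussed here, the sketch below couples a one-compartment toxicokinetic model to a stochastic-death hazard; all rate constants and the threshold are illustrative, not calibrated to any product:

```python
import numpy as np

def tktd_survival(t, c_ext, k_in=0.5, k_out=0.3, threshold=1.0, kk=0.2):
    """One-compartment TK + stochastic-death TD (hedged sketch).

    t: time points (days); c_ext: external exposure at those times.
    Returns internal dose and survival probability over time.
    """
    dt = np.diff(t, prepend=t[0])
    c_int = np.zeros_like(c_ext, dtype=float)
    surv = np.ones_like(c_ext, dtype=float)
    hazard = 0.0
    for i in range(1, len(t)):
        # toxicokinetics: uptake minus elimination
        c_int[i] = c_int[i-1] + (k_in * c_ext[i-1] - k_out * c_int[i-1]) * dt[i]
        # toxicodynamics: hazard accrues above the internal threshold
        hazard += kk * max(0.0, c_int[i] - threshold) * dt[i]
        surv[i] = np.exp(-hazard)
    return c_int, surv
```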

  16. Refining the Classification of Children with Selective Mutism: A Latent Profile Analysis

    Science.gov (United States)

    Cohan, Sharon L.; Chavira, Denise A.; Shipon-Blum, Elisa; Hitchcock, Carla; Roesch, Scott C.; Stein, Murray B.

    2008-01-01

    The goal of this study was to develop an empirically derived classification system for selective mutism (SM) using parent-report measures of social anxiety, behavior problems, and communication delays. The sample consisted of parents of 130 children (ages 5-12) with SM. Results from latent profile analysis supported a 3-class solution made up of…
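
    Latent profile analysis on continuous indicators is closely related to fitting Gaussian mixtures and selecting the class count by an information criterion; a hedged sketch with scikit-learn (not the software used in the study, and the indicator columns are assumed):

```python
from sklearn.mixture import GaussianMixture

def fit_profiles(X, max_classes=5, seed=0):
    """Choose the number of latent profiles by BIC.

    X: array (n_children, n_scales), e.g. social anxiety, behavior
    problem and communication-delay scores.
    """
    fits = [GaussianMixture(k, covariance_type="diag",
                            random_state=seed).fit(X)
            for k in range(1, max_classes + 1)]
    best = min(fits, key=lambda m: m.bic(X))
    return best.n_components, best.predict(X)   # class count, assignments
```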

  17. Post-Processing Approach for Refining Raw Land Cover Change Detection of Very High-Resolution Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Zhiyong Lv

    2018-03-01

    In recent decades, land cover change detection (LCCD) using very high spatial resolution (VHR) remote sensing images has been a major research topic. However, VHR remote sensing images usually contain a large amount of spectral noise, which reduces the reliability of the detected results. To solve this problem, this study proposes an object-based expectation maximization (OBEM) post-processing approach for enhancing raw LCCD results. OBEM defines a refinement of the labeling in a detected map to enhance its raw detection accuracies. Current mainstream change detection (preprocessing) techniques concentrate on proposing a change magnitude measurement or considering image spatial features to obtain a change detection map. The proposed OBEM approach is a new solution that enhances change detection accuracy by refining the raw result. Post-processing approaches can achieve accuracies competitive with preprocessing methods, but in a direct and succinct manner. The proposed OBEM post-processing method synthetically considers multi-scale segmentation and expectation maximization algorithms to refine the raw change detection result. The influence of the segmentation scale on the LCCD accuracy of the proposed OBEM is then investigated. Four pairs of remote sensing images, one of two pairs (aerial images with 0.5 m/pixel resolution) depicting two landslide sites on Lantau Island, Hong Kong, China, are used in the experiments to evaluate the effectiveness of the proposed approach. In addition, the proposed approach is applied to, and validated by, two case studies: LCCD in Tianjin City, China (SPOT-5 satellite images with 2.5 m/pixel resolution) and a Mexico forest fire case (Landsat TM images with 30 m/pixel resolution), respectively. Quantitative evaluations show that the proposed OBEM post-processing approach can achieve better performance and higher accuracies than several commonly used preprocessing methods. To the best of the authors' knowledge, this type
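
    The post-processing idea, relabeling a raw change map within image objects, is illustrated below with a per-segment majority vote; this is a simplification, since OBEM combines multi-scale segmentation with an expectation-maximization step rather than a plain vote:

```python
import numpy as np

def object_based_relabel(raw_change, segments):
    """Refine a raw binary change map using object boundaries.

    raw_change: 2-D {0, 1} array from any pixel-based detector.
    segments: 2-D integer array of object labels from segmentation.
    Each object takes its majority class, suppressing salt-and-pepper
    noise inside homogeneous objects.
    """
    refined = np.empty_like(raw_change)
    for seg_id in np.unique(segments):
        mask = segments == seg_id
        refined[mask] = int(raw_change[mask].mean() >= 0.5)
    return refined
```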

  18. Refinement in black chrome for use as a solar selective coating

    Science.gov (United States)

    Mcdonald, G. E.

    1974-01-01

    Black chrome is significant as a solar selective coating because its current extensive use in the electroplating industry as a durable decorative finish makes it widely available on a commercial scale and potentially low in cost. Black-chrome deposits were modified by underplating with dull nickel or by being plated on rough surfaces. Both of these procedures increased the visible absorptance. There was no change in the infrared reflectance for the dull-nickel black-chrome combination from that reported for the bright-nickel black-chrome combination. However, the bright-nickel black-chrome coating plated on rough surfaces showed a slight decrease in infrared reflectance. As integrated over the solar spectrum for air mass 2, the reflectance of the dull-nickel black-chrome coating was 0.077; that of the bright-nickel black-chrome coating plated on a 0.75-micron (30-microinch) surface was 0.070; and that of the bright-nickel black-chrome coating plated on a 2.5-micron (100-microinch) surface was 0.064. The corresponding values for the bright-nickel black-chrome coating on a 0.0125-micron (0.5-microinch) surface, two samples of black nickel, and two samples of Nextrel black paint were 0.132, 0.123, 0.133, and 0.033, respectively.

  19. Involving users in the refinement of the competency-based achievement system: an innovative approach to competency-based assessment.

    Science.gov (United States)

    Ross, Shelley; Poth, Cheryl-Anne; Donoff, Michel G; Papile, Chiara; Humphries, Paul; Stasiuk, Samantha; Georgis, Rebecca

    2012-01-01

    Competency-based assessment innovations are being implemented to address concerns about the effectiveness of traditional approaches to medical training and the assessment of competence. Integrating intended users' perspectives during the piloting and refinement of an innovation is necessary to ensure the innovation meets users' needs. Failure to do so leaves users no opportunity to influence the innovation, and developers no means to assess why an innovation works or does not work in different contexts. A qualitative participatory action research approach was used. Sixteen first-year residents participated in three focus groups and two interviews during piloting. Verbatim transcripts were analyzed individually and then across all transcripts using a constant comparison approach. The analysis revealed three key characteristics that shaped the residents' acceptance of the innovation as a worthwhile investment of time and effort: access to feedback from preceptors that is frequent, timely, and specific. Findings were used to refine the innovation further. This study highlights the necessary conditions for assessing the success of implementation of educational innovations. Reciprocal communication between users and developers is vital. This reflects the approaches recommended in the Ottawa Consensus Statement on research in assessment published in Medical Teacher in March 2011.

  20. A practical approach towards energy conversion through bio-refining of mixed kraft pulps

    Energy Technology Data Exchange (ETDEWEB)

    Dharm, D.; Upadhyaya, J.S.; Tyagi, C.H.; Ahamad, S. (Dept. of Paper Technology, Indian Inst. of Technology Roorkee, Saharanpur (India))

    2007-07-01

    The pulp and paper industry is an energy-intensive process industry in which energy contributes about 16-20% of the manufacturing cost. Due to shortages in energy availability and increases in energy cost, energy conservation has become a necessity in the paper industry. A laboratory study on bleached and unbleached kraft pulps containing 15% bamboo, 15% eucalyptus, 20% poplar waste and 50% veneer waste was conducted using two distinct commercial enzymes, i.e. cellulase and xylanase, and their effects on slowness, drainage time, beating time, mechanical strength properties and power consumption were studied. Enzymatic pretreatment of chemical pulp in the laboratory improves degSR by 5 and 4 points for bleached and unbleached pulps, respectively, at the same beating time. Breaking length improves by up to 4.0% at a constant beating level. The application of cellulase during refining saves 18.5% of the energy at a constant refining level, i.e. 28 degSR. The enzymatic treatment shows a power saving of 1390 kWh per 100 metric tonnes of paper, which corresponds to a saving of EUR 160.70 per 100 metric tonnes of paper. The net cost saving, after deducting the cost of the enzyme, is thus EUR 157.90 per 100 metric tonnes of paper. (orig.)

  1. An Approximate Approach to Automatic Kernel Selection.

    Science.gov (United States)

    Ding, Lizhong; Liao, Shizhong

    2016-02-02

    Kernel selection is a fundamental problem of kernel-based learning algorithms. In this paper, we propose an approximate approach to automatic kernel selection for regression from the perspective of kernel matrix approximation. We first introduce multilevel circulant matrices into automatic kernel selection, and develop two approximate kernel selection algorithms by exploiting the computational virtues of multilevel circulant matrices. The complexity of the proposed algorithms is quasi-linear in the number of data points. Then, we prove an approximation error bound to measure the effect of the approximation in kernel matrices by multilevel circulant matrices on the hypothesis and further show that the approximate hypothesis produced with multilevel circulant matrices converges to the accurate hypothesis produced with kernel matrices. Experimental evaluations on benchmark datasets demonstrate the effectiveness of approximate kernel selection.

  2. A new approach to texture measurements: Orientation distribution function (ODF) determination by Rietveld refinement

    International Nuclear Information System (INIS)

    Vondreele, R.; Larson, A.; Lawson, A.; Sheldon, R.; Wright, S.

    1996-01-01

    The preferred orientation of crystal grains within a manufactured part is described most fully by its orientation distribution function (ODF), which is a mapping of the probability of each of the possible grain orientations with respect to the exterior dimensions. Traditionally, an ODF is determined from pole figures for a relatively small number of reflections. These pole figures are measured with x-rays or neutrons using short detector scans over the center of an individual diffraction peak for a large number of different sample orientations. This is efficient if the selected diffraction peaks are reasonably strong (relative to background) and well separated, such as in pure fcc and bcc metals. It is also appropriate for constant wavelength sources, where collection of individual diffraction peak intensities is a reasonably efficient use of the source. However, the traditional method is not very efficient for neutron diffraction at a spallation source such as LANSCE, where the entire diffraction pattern is accessible for each sample setting. Moreover, a different approach is necessary for complicated diffraction patterns, such as from composite materials, intermetallic compounds, high-Tc ceramics, polyphasic minerals and polymers, where heavy overlap of adjacent diffraction peaks is expected. In addition, the large number of settings normally collected for an individual pole figure may not be necessary, since the entire pattern is obtained at each setting. Thus, a new method of ODF analysis needs to be developed to handle the more complex diffraction patterns obtained from modern technological materials as well as take advantage of the particular characteristics of spallation neutron sources. This project sought to develop the experimental procedures and the mathematical treatment needed to produce an orientation distribution function (ODF) directly from full diffraction patterns from a sample in a limited number of orientations.

  3. Approaches to Refining Estimates of Global Burden and Economics of Dengue

    Science.gov (United States)

    Shepard, Donald S.; Undurraga, Eduardo A.; Betancourt-Cravioto, Miguel; Guzmán, María G.; Halstead, Scott B.; Harris, Eva; Mudin, Rose Nani; Murray, Kristy O.; Tapia-Conyer, Roberto; Gubler, Duane J.

    2014-01-01

    Dengue presents a formidable and growing global economic and disease burden, with around half the world's population estimated to be at risk of infection. There is wide variation and substantial uncertainty in current estimates of dengue disease burden and, consequently, on economic burden estimates. Dengue disease varies across time, geography and persons affected. Variations in the transmission of four different viruses and interactions among vector density and host's immune status, age, pre-existing medical conditions, all contribute to the disease's complexity. This systematic review aims to identify and examine estimates of dengue disease burden and costs, discuss major sources of uncertainty, and suggest next steps to improve estimates. Economic analysis of dengue is mainly concerned with costs of illness, particularly in estimating total episodes of symptomatic dengue. However, national dengue disease reporting systems show a great diversity in design and implementation, hindering accurate global estimates of dengue episodes and country comparisons. A combination of immediate, short-, and long-term strategies could substantially improve estimates of disease and, consequently, of economic burden of dengue. Suggestions for immediate implementation include refining analysis of currently available data to adjust reported episodes and expanding data collection in empirical studies, such as documenting the number of ambulatory visits before and after hospitalization and including breakdowns by age. Short-term recommendations include merging multiple data sources, such as cohort and surveillance data to evaluate the accuracy of reporting rates (by health sector, treatment, severity, etc.), and using covariates to extrapolate dengue incidence to locations with no or limited reporting. Long-term efforts aim at strengthening capacity to document dengue transmission using serological methods to systematically analyze and relate to epidemiologic data. As promising tools

  4. Stock selection using a hybrid MCDM approach

    Directory of Open Access Journals (Sweden)

    Tea Poklepović

    2014-12-01

    Full Text Available The problem of selecting the right stocks to invest in is of immense interest for investors on both emerging and developed capital markets. Moreover, an investor should take into account all available data regarding stocks on the particular market. This includes fundamental and stock market indicators. The decision-making process includes several stocks to invest in and more than one criterion. Therefore, the task of selecting the stocks to invest in can be viewed as a multiple criteria decision making (MCDM) problem. Using several MCDM methods often leads to divergent rankings. The goal of this paper is to resolve these possible divergent results obtained from different MCDM methods using a hybrid MCDM approach based on Spearman's rank correlation coefficient. Five MCDM methods are selected: COPRAS, linear assignment, PROMETHEE, SAW and TOPSIS. The weights for all criteria are obtained by using the AHP method. Data for this study include information on stock returns and traded volumes from March 2012 to March 2014 for 19 stocks on the Croatian capital market, as well as the most important fundamental and stock market indicators for the selected stocks. Rankings using the five selected MCDM methods in the stock selection problem yield divergent results; however, after applying the proposed approach the final hybrid rankings are obtained. The results show that the worst stocks to invest in are the same whether or not industry is taken into account, whereas the best stocks to invest in differ slightly when industry is considered, because some industries are more profitable than others.
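
    One plausible reading of the Spearman-based hybrid step is sketched below: each method's ranking is weighted by its average rank correlation with the other methods, and a weighted mean rank yields the final ordering. This is an illustrative reconstruction, not the paper's exact procedure.

    # Aggregate divergent MCDM rankings using Spearman's rho (illustrative).
    import numpy as np
    from scipy.stats import spearmanr

    def hybrid_ranking(rank_matrix):
        # rank_matrix: (n_methods, n_alternatives) array of 1-based ranks
        m = rank_matrix.shape[0]
        rho = np.array([[spearmanr(rank_matrix[i], rank_matrix[j])[0]
                         for j in range(m)] for i in range(m)])
        w = (rho.sum(axis=1) - 1.0) / (m - 1)    # mean agreement with others
        w = w / w.sum()
        combined = w @ rank_matrix               # weighted mean rank
        return combined.argsort().argsort() + 1  # final 1-based ranking

    ranks = np.array([[1, 2, 3, 4], [2, 1, 3, 4], [1, 3, 2, 4]])
    print(hybrid_ranking(ranks))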

  5. Refining mortality estimates in shark demographic analyses: a Bayesian inverse matrix approach.

    Science.gov (United States)

    Smart, Jonathan J; Punt, André E; White, William T; Simpfendorfer, Colin A

    2018-01-18

    Leslie matrix models are an important analysis tool in conservation biology, applied to a diversity of taxa. The standard approach estimates the finite rate of population growth (λ) from a set of vital rates. In some instances, an estimate of λ is available, but the vital rates are poorly understood and can be solved for using an inverse matrix approach. However, these approaches are rarely attempted because they require information on the structure of age or stage classes. This study addressed this issue by using a combination of Monte Carlo simulations and the sample-importance-resampling (SIR) algorithm to solve the inverse matrix problem without data on population structure. This approach was applied to the grey reef shark (Carcharhinus amblyrhynchos) from the Great Barrier Reef (GBR) in Australia to determine the demography of this population. Additionally, these outputs were applied to another heavily fished population from Papua New Guinea (PNG) that requires estimates of λ for fisheries management. The SIR analysis determined that natural mortality (M) and total mortality (Z) based on indirect methods have previously been overestimated for C. amblyrhynchos, leading to an underestimated λ. The updated Z distributions determined using SIR provided λ estimates that matched an empirical λ for the GBR population and corrected obvious error in the demographic parameters for the PNG population. This approach provides an opportunity for the inverse matrix approach to be applied more broadly to situations where information on population structure is lacking. © 2018 by the Ecological Society of America.
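
    The SIR idea can be sketched as follows: candidate mortality rates are drawn from a prior, each is scored by how closely the implied Leslie matrix growth rate matches an empirical λ, and draws are resampled in proportion to that score. The ages, fecundities, and priors below are hypothetical, not the shark data.

    # SIR sketch for an inverse Leslie matrix problem (illustrative priors).
    import numpy as np
    rng = np.random.default_rng(0)

    ages = 20
    fecundity = np.concatenate([np.zeros(10), np.full(10, 1.5)])  # hypothetical
    lam_obs, lam_sd = 1.05, 0.02                                  # empirical lambda

    def leslie_lambda(M):
        surv = np.exp(-M)                     # survival from natural mortality M
        L = np.zeros((ages, ages))
        L[0, :] = fecundity
        L[np.arange(1, ages), np.arange(ages - 1)] = surv
        return np.max(np.real(np.linalg.eigvals(L)))

    M_draws = rng.uniform(0.05, 0.6, size=5000)            # prior on M
    lams = np.array([leslie_lambda(M) for M in M_draws])
    w = np.exp(-0.5 * ((lams - lam_obs) / lam_sd) ** 2)    # importance weights
    post = rng.choice(M_draws, size=2000, p=w / w.sum())   # resampling step
    print(f"posterior M: {post.mean():.3f} +/- {post.std():.3f}")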

  6. Refining revolution

    Energy Technology Data Exchange (ETDEWEB)

    Fesharaki, F.; Isaak, D.

    1984-01-01

    A review of changes in the oil refining industry since 1973 examines the drop in capacity use and its effect on profits in the Organisation for Economic Co-operation and Development (OECD) countries compared to world refining. OPEC countries used their new oil revenues to expand Gulf refineries, which put additional pressure on OECD refiners. OPEC involvement in global marketing, however, could help to secure supplies. Scrapping some older OECD refineries could improve the percentage of capacity in use if new construction is kept to a minimum. Other issues facing refiners are the changes in oil demand patterns and government responses to the market. 2 tables.

  7. Innovative Approach for IBS Vendor Selection Problem

    Directory of Open Access Journals (Sweden)

    Omar Mohd Faizal

    2016-01-01

    Full Text Available Supply chain management in Industrialised Building System (IBS) construction has a significant bearing on company and project performance. Due to the wide variety of criteria and vendors available, the vendor selection process for a specific project is becoming more difficult. The need for decision aids in vendor selection in other areas is widely discussed in previous research; however, studies on vendor selection for IBS projects are largely neglected. A Decision Support System (DSS) is proposed for this purpose. Yet most DSS models are impractical, since they are complicated and difficult for a layman such as a project manager to use. Research indicates that the rapid development of ICT has high potential for simple and effective DSS. Thus, this paper highlights the importance of, and a research approach for, vendor selection in IBS project management. The study is based on the Design Science Research Methodology in combination with case studies. It is anticipated that this study will yield an effective value-for-money decision-making platform to manage the vendor selection process.

  8. A new physical mapping approach refines the sex-determining gene positions on the Silene latifolia Y-chromosome

    Science.gov (United States)

    Kazama, Yusuke; Ishii, Kotaro; Aonuma, Wataru; Ikeda, Tokihiro; Kawamoto, Hiroki; Koizumi, Ayako; Filatov, Dmitry A.; Chibalina, Margarita; Bergero, Roberta; Charlesworth, Deborah; Abe, Tomoko; Kawano, Shigeyuki

    2016-01-01

    Sex chromosomes are particularly interesting regions of the genome for both molecular genetics and evolutionary studies; yet, for most species, we lack basic information, such as the gene order along the chromosome. Because they lack recombination, Y-linked genes cannot be mapped genetically, leaving physical mapping as the only option for establishing the extent of synteny and homology with the X chromosome. Here, we developed a novel and general method for deletion mapping of non-recombining regions by solving “the travelling salesman problem”, and evaluate its accuracy using simulated datasets. Unlike the existing radiation hybrid approach, this method allows us to combine deletion mutants from different experiments and sources. We applied our method to a set of newly generated deletion mutants in the dioecious plant Silene latifolia and refined the locations of the sex-determining loci on its Y chromosome map.
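
    A toy version of the deletion-mapping-as-travelling-salesman idea: each gene has a presence/absence profile across deletion mutants, physically adjacent genes share breakpoints and hence similar profiles, and a short "tour" under Hamming distance recovers a plausible gene order. A greedy nearest-neighbour heuristic stands in for a real TSP solver; all data are simulated.

    # Deletion mapping as a travelling-salesman-style ordering (toy data).
    import numpy as np
    rng = np.random.default_rng(3)

    n_genes, n_mutants = 12, 40
    profiles = rng.random((n_genes, n_mutants)) < 0.5   # presence/absence
    for g in range(1, n_genes):                         # neighbours share breakpoints
        keep = rng.random(n_mutants) < 0.8
        profiles[g, keep] = profiles[g - 1, keep]

    def greedy_tour(P):
        d = (P[:, None, :] != P[None, :, :]).sum(-1)    # Hamming distances
        tour, left = [0], set(range(1, len(P)))
        while left:
            nxt = min(left, key=lambda j: d[tour[-1], j])
            tour.append(nxt)
            left.discard(nxt)
        return tour

    print(greedy_tour(profiles))  # should roughly recover the order 0..11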

  9. Positioning oneself within an epistemology: refining our thinking about integrative approaches.

    Science.gov (United States)

    Dickerson, Victoria C

    2010-09-01

    Integrative approaches seem to be paramount in the current climate of family therapy and other psychotherapies. However, integration between and among theories and practices can only occur within a specific epistemology. This article makes a distinction between three different epistemologies: individualizing, systems, and poststructural. It then makes the argument that one can integrate theories within epistemologies and one can adopt practices and some theoretical concepts across theories and across epistemologies, but that it is impossible to integrate theories across epistemologies. It further states that although social constructionism has influenced much of contemporary psychological thinking, because of the divergence between a structural and a poststructural approach, constructionism looks different depending upon one's epistemological stance. Examples of integration within epistemologies and of what looks like integration across epistemologies (but is not) further illustrate these important distinctions. The conclusions reached here are crucial to our philosophical considerations, our pedagogical assumptions, and implications for both research and a reflexive clinical practice. 2010 © FPI, Inc.

  10. Quantum mechanics new approaches to selected topics

    CERN Document Server

    Lipkin, Harry Jeannot

    1973-01-01

    Acclaimed as "excellent" (Nature) and "very original and refreshing" (Physics Today), this collection of self-contained studies is geared toward advanced undergraduates and graduate students. Its broad selection of topics includes the Mössbauer effect, many-body quantum mechanics, scattering theory, Feynman diagrams, and relativistic quantum mechanics. Author Harry J. Lipkin, a well-known teacher at Israel's Weizmann Institute, takes an unusual approach by introducing many interesting physical problems and mathematical techniques at a much earlier point than in conventional texts. This meth

  11. Transbrachial artery approach for selective cerebral angiography

    International Nuclear Information System (INIS)

    Touho, Hajime; Karasawa, Jun; Shishido, Hisashi; Morisako, Toshitaka; Numazawa, Shinichi; Yamada, Keisuke; Nagai, Shigeki; Shibamoto, Kenji

    1990-01-01

    Transaxillary or transbrachial approaches to the cerebral vessels have been reported, but selective angiography of all four vessels has not been possible through one route. In this report, a new technique for selective cerebral angiography via a transbrachial approach is described. One hundred and twenty-three patients with cerebral infarction, vertebrobasilar insufficiency, intracerebral hemorrhage, epilepsy, or cerebral tumor were examined. The patients consisted of 85 outpatients and 38 inpatients whose ages ranged from 15 to 82 years. The patients were examined via the transbrachial approach (97 cases via the right brachial artery, 29 cases via the left). Materials included a DSA system (Digital Fluorikon 5000, General Electric Co.), a 4 French tight J-curved Simmons 80-cm catheter, a 19-gauge extra-thin-wall Seldinger needle, and a J/Straight floppy 125-cm guide-wire. Generally, the volume of the contrast agent (300 mgI/ml iopamidol) used in the common carotid artery angiogram was 6 ml, while that used in the vertebral artery angiogram was 4 ml. If catheterization of the vertebral artery or right common carotid artery was unsuccessful, about 8 ml of the contrast agent was injected into the subclavian or brachiocephalic artery. Definitive diagnosis and a decision on proper treatment of the patients can be easily obtained, and the results were clinically satisfactory. Moreover, no complications were encountered in this study. This new technique, a transbrachial approach to the cerebral vessels using the DSA system, is introduced here. Neurosurgeons can use this technique easily and will find that it provides all the information they need about the patient. (author)

  12. Refinement by interface instantiation

    DEFF Research Database (Denmark)

    Hallerstede, Stefan; Hoang, Thai Son

    2012-01-01

    be easily refined. Our first contribution hence is a proposal for a new construct called interface that encapsulates the external variables, along with a mechanism for interface instantiation. Using the new construct and mechanism, external variables can be refined consistently. Our second contribution...... is an approach for verifying the correctness of Event-B extensions using the supporting Rodin tool. We illustrate our approach by proving the correctness of interface instantiation....

  13. EVALUATING AND REFINING THE ‘ENTERPRISE ARCHITECTURE AS STRATEGY’ APPROACH AND ARTEFACTS

    Directory of Open Access Journals (Sweden)

    M. De Vries

    2012-01-01

    Full Text Available

    Enterprise Architecture (EA) is a new discipline that has emerged from the need to create a holistic view of an enterprise, and thereby to discover business/IT integration and alignment opportunities across enterprise structures. Previous EA value propositions that merely focus on IT cost reductions will no longer convince management to invest in EA. Today, EA should enable business strategy in the organisation to create value. This resides in the ability to do enterprise optimisation through process standardisation and integration. In order to do this, a new approach is required to integrate EA into the strategy planning process of the organisation.
    This article explores the use of three key artefacts – operating models, core diagrams, and an operating maturity assessment as defined by Ross, Weill & Robertson [1] – as the basis of this new approach. Action research is applied to a research group to obtain qualitative feedback on the practicality of the artefacts.

  14. A Systems Approach to Refine Disease Taxonomy by Integrating Phenotypic and Molecular Networks

    Directory of Open Access Journals (Sweden)

    Xuezhong Zhou

    2018-05-01

    Full Text Available The International Classification of Diseases (ICD relies on clinical features and lags behind the current understanding of the molecular specificity of disease pathobiology, necessitating approaches that incorporate growing biomedical data for classifying diseases to meet the needs of precision medicine. Our analysis revealed that the heterogeneous molecular diversity of disease chapters and the blurred boundary between disease categories in ICD should be further investigated. Here, we propose a new classification of diseases (NCD by developing an algorithm that predicts the additional categories of a disease by integrating multiple networks consisting of disease phenotypes and their molecular profiles. With statistical validations from phenotype-genotype associations and interactome networks, we demonstrate that NCD improves disease specificity owing to its overlapping categories and polyhierarchical structure. Furthermore, NCD captures the molecular diversity of diseases and defines clearer boundaries in terms of both phenotypic similarity and molecular associations, establishing a rational strategy to reform disease taxonomy. Keywords: Disease taxonomy, Network medicine, Disease phenotypes, Molecular profiles, Precision medicine

  15. A refined regional modeling approach for the Corn Belt - Experiences and recommendations for large-scale integrated modeling

    Science.gov (United States)

    Panagopoulos, Yiannis; Gassman, Philip W.; Jha, Manoj K.; Kling, Catherine L.; Campbell, Todd; Srinivasan, Raghavan; White, Michael; Arnold, Jeffrey G.

    2015-05-01

    Nonpoint source pollution from agriculture is the main source of nitrogen and phosphorus in the stream systems of the Corn Belt region in the Midwestern US. This region is comprised of two large river basins, the intensely row-cropped Upper Mississippi River Basin (UMRB) and Ohio-Tennessee River Basin (OTRB), which are considered the key contributing areas for the Northern Gulf of Mexico hypoxic zone according to the US Environmental Protection Agency. Thus, in this area it is of utmost importance to ensure that intensive agriculture for food, feed and biofuel production can coexist with a healthy water environment. To address these objectives within a river basin management context, an integrated modeling system has been constructed with the hydrologic Soil and Water Assessment Tool (SWAT) model, capable of estimating river basin responses to alternative cropping and/or management strategies. To improve modeling performance compared to previous studies and provide a spatially detailed basis for scenario development, this SWAT Corn Belt application incorporates a greatly refined subwatershed structure based on 12-digit hydrologic units or 'subwatersheds' as defined by the US Geological Survey. The model setup, calibration and validation are time-demanding and challenging tasks for these large systems, given the scale-intensive data requirements and the need to ensure the reliability of flow and pollutant load predictions at multiple locations. Thus, the objectives of this study are both to comprehensively describe this large-scale modeling approach, providing estimates of pollution and crop production in the region, and to present strengths and weaknesses of integrated modeling at such a large scale, along with how it can be improved on the basis of the current modeling structure and results. The predictions were based on a semi-automatic hydrologic calibration approach for large-scale and spatially detailed modeling studies, with the use of the Sequential

  16. Refining cost-effectiveness analyses using the net benefit approach and econometric methods: an example from a trial of anti-depressant treatment.

    Science.gov (United States)

    Sabes-Figuera, Ramon; McCrone, Paul; Kendricks, Antony

    2013-04-01

    Economic evaluation analyses can be enhanced by employing regression methods, which allow for the identification of important sub-groups, adjustment for imperfect randomisation in clinical trials, and the analysis of non-randomised data. The aim was to explore the benefits of combining regression techniques and the standard Bayesian approach to refine cost-effectiveness analyses using data from randomised clinical trials. Data from a randomised trial of anti-depressant treatment were analysed, and a regression model was used to explore the factors that have an impact on the net benefit (NB) statistic, with the aim of using these findings to adjust the cost-effectiveness acceptability curves. Exploratory sub-sample analyses were carried out to explore possible differences in cost-effectiveness. The analysis found that having suffered a previous similar depression is strongly correlated with a lower NB, independent of the outcome measure or follow-up point. In patients with previous similar depression, adding a selective serotonin reuptake inhibitor (SSRI) to supportive care for mild-to-moderate depression is probably cost-effective at the level used by the English National Institute for Health and Clinical Excellence to make recommendations. This analysis highlights the need for incorporation of econometric methods into cost-effectiveness analyses using the NB approach.
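
    A minimal sketch of the net-benefit regression idea, on simulated data: per-patient NB_i = λ·effect_i − cost_i at a chosen willingness-to-pay λ, regressed on treatment arm and a hypothetical prior-episode indicator. All numbers are placeholders, not the trial's data.

    # Net-benefit regression on simulated trial data (illustrative).
    import numpy as np
    import statsmodels.api as sm
    rng = np.random.default_rng(1)

    n, lam = 200, 20000.0                     # lam: willingness to pay
    treat = rng.integers(0, 2, n)             # randomised arm
    prior = rng.integers(0, 2, n)             # hypothetical prior-episode flag
    effect = 0.5 + 0.05 * treat - 0.08 * prior + rng.normal(0, 0.1, n)
    cost = 800 + 300 * treat + rng.normal(0, 100, n)

    nb = lam * effect - cost                  # per-patient net benefit
    X = sm.add_constant(np.column_stack([treat, prior]))
    res = sm.OLS(nb, X).fit()
    print(dict(zip(["const", "treat", "prior_episode"], res.params.round(1))))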

  17. Survey for service selection approaches in dynamic environments

    CSIR Research Space (South Africa)

    Manqele, Lindelweyizizwe S

    2017-09-01

    Full Text Available The use of service selection approaches across different dynamic service-provisioning environments has increased the challenge of finding an effective method for selecting a relevant service. The use of service selection...

  18. Structural exploration for the refinement of anticancer matrix metalloproteinase-2 inhibitor designing approaches through robust validated multi-QSARs

    Science.gov (United States)

    Adhikari, Nilanjan; Amin, Sk. Abdul; Saha, Achintya; Jha, Tarun

    2018-03-01

    Matrix metalloproteinase-2 (MMP-2) is a promising pharmacological target for designing potential anticancer drugs. MMP-2 plays a critical role in apoptosis by cleaving the DNA repair enzyme poly (ADP-ribose) polymerase (PARP). Moreover, MMP-2 expression triggers the vascular endothelial growth factor (VEGF), which has a positive influence on tumor size, invasion, and angiogenesis. Therefore, there is an urgent need to develop potential MMP-2 inhibitors with no toxicity and better pharmacokinetic properties. In this article, robust validated multi-quantitative structure-activity relationship (QSAR) modeling approaches were applied to a dataset of 222 MMP-2 inhibitors to explore the important structural and pharmacophoric requirements for higher MMP-2 inhibition. Different validated regression and classification-based QSARs, pharmacophore mapping and 3D-QSAR techniques were performed. These results were challenged and subjected to further validation to explain 24 in-house MMP-2 inhibitors and to judge the reliability of the models further. All these models were individually validated internally as well as externally and were supported and validated by each other. These results were further justified by molecular docking analysis. The modeling techniques adopted here help not only to explore the necessary structural and pharmacophoric requirements but also the overall validation and refinement techniques for designing potential MMP-2 inhibitors.

  19. An analytical approach to elucidate the mechanism of grain refinement in calcium added Mg-Al alloys

    International Nuclear Information System (INIS)

    Nagasivamuni, B.; Ravi, K.R.

    2015-01-01

    Highlights: • Minor additions of Ca (<0.2%) refine the grain structure in Mg-(3, 6 and 9)Al alloys. • An analytical model elucidates that nucleation potency is enhanced after Ca addition. • Ternary Mg-Al-xCa growth restriction values (Q_t) are computed using Scheil equations. • Grain size predictions elucidate that nucleation events dominate grain refinement. • Growth restriction due to higher Ca addition has no significant effect on grain refinement. - Abstract: The present study investigates the grain refinement of Mg-3Al, Mg-6Al and Mg-9Al alloys by calcium addition. The maximum reduction in grain size has been observed at 0.2% Ca addition in Mg-Al alloys, beyond which any further addition (up to 0.4%) yields only marginal improvement in grain refinement. The mechanism associated with the grain refinement of Mg-Al alloys by Ca addition is discussed in terms of the growth restriction factor (Q) and constitutional undercooling (ΔT_CS) using an analytical model. The influence of the growth restriction factor (Q) on the final grain size of Ca-added Mg-Al alloys is calculated with the help of the analytical model by assuming that the number of nucleant particles is not altered through Ca addition. For accurate grain size calculations, the value of Q has been estimated with a reliable thermodynamic database using Scheil solidification simulation. The comparison of predicted and experimental grain size results indicates that constitutional-undercooling-driven activation of nucleation events plays the dominant role in the grain refinement of Mg-Al alloys by calcium addition, whereas the increase in the growth restriction value has a negligible effect
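
    For orientation, the dilute-limit form of the growth restriction factor commonly quoted in such studies is the additive expression below (a textbook simplification; the study itself computes Q_t from Scheil simulations with a thermodynamic database):

        Q = \sum_i m_i \, c_{0,i} \, (k_i - 1)

    where m_i is the liquidus slope, c_{0,i} the initial concentration, and k_i the equilibrium partition coefficient of solute element i.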

  20. Spanish Refining

    International Nuclear Information System (INIS)

    Lores, F.R.

    2001-01-01

    An overview of petroleum refining in Spain is presented (by Repsol YPF) and some views on future trends are discussed. Spain depends heavily on imports. Sub-headings in the article cover sources of crude imports, investments, and logistics and marketing; detailed data for each are shown diagrammatically. Tables show: (1) economic indicators (e.g. total GDP, vehicle numbers and inflation) for 1998-2000; (2) crude oil imports for 1995-2000; (3) oil products balance for 1995-2000; (4) commodities demand, by product; (5) refining in Spain in terms of capacity per region; (6) outlets in Spain and other European countries in 2002 and (7) sales distribution channel by product

  1. Supplier selection problem: A fuzzy multicriteria approach

    African Journals Online (AJOL)

    kirstam

    simultaneously: maximising the total value of purchases, minimising ... Keywords: Supplier selection, multi-criteria decision-making, fuzzy logic, satisfaction ... includes both qualitative and quantitative factors, and it is necessary to make a.

  2. Ab-initio crystal structure analysis and refinement approaches of oligo p-benzamides based on electron diffraction data

    DEFF Research Database (Denmark)

    Gorelik, Tatiana E; van de Streek, Jacco; Kilbinger, Andreas F M

    2012-01-01

    Ab-initio crystal structure analysis of organic materials from electron diffraction data is presented. The data were collected using the automated electron diffraction tomography (ADT) technique. The structure solution and refinement route is first validated on the basis of the known crystal stru...

  3. CADDIS Volume 4. Data Analysis: Selecting an Analysis Approach

    Science.gov (United States)

    An approach for selecting statistical analyses to inform causal analysis. Describes methods for determining whether test site conditions differ from reference expectations. Describes an approach for estimating stressor-response relationships.

  4. Supplier selection an MCDA-based approach

    CERN Document Server

    Mukherjee, Krishnendu

    2017-01-01

    The purpose of this book is to present a comprehensive review of the latest research and development trends at the international level for modeling and optimization of the supplier selection process for different industrial sectors. It is targeted to serve two audiences: the MBA and PhD student interested in procurement, and the practitioner who wishes to gain a deeper understanding of procurement analysis with multi-criteria based decision tools to avoid upstream risks to get better supply chain visibility. The book is expected to serve as a ready reference for supplier selection criteria and various multi-criteria based supplier’s evaluation methods for forward, reverse and mass customized supply chain. This book encompasses several criteria, methods for supplier selection in a systematic way based on extensive literature review from 1998 to 2012. It provides several case studies and some useful links which can serve as a starting point for interested researchers. In the appendix several computer code wri...

  5. Categorization and selection of regulatory approaches for nuclear power plants

    International Nuclear Information System (INIS)

    Sugaya, Junko; Harayama, Yuko

    2009-01-01

    Several new regulatory approaches have been introduced to Japanese nuclear safety regulations, in which a prescriptive and deterministic approach had traditionally predominated. However, the options of regulatory approaches that can possibly be applied to nuclear safety regulations as well as the methodology for selecting the options are not systematically defined. In this study, various regulatory approaches for nuclear power plants are categorized as prescriptive or nonprescriptive, outcome-based or process-based, and deterministic or risk-informed. 18 options of regulatory approaches are conceptually developed and the conditions for selecting the appropriate regulatory approaches are identified. Current issues on nuclear regulations regarding responsibilities, transparency, consensus standards and regulatory inspections are examined from the viewpoints of regulatory approaches to verify usefulness of the categorization and selection concept of regulatory approaches. Finally, some of the challenges at the transitional phase of regulatory approaches are discussed. (author)

  6. A Ranking Approach to Genomic Selection.

    Science.gov (United States)

    Blondel, Mathieu; Onogi, Akio; Iwata, Hiroyoshi; Ueda, Naonori

    2015-01-01

    Genomic selection (GS) is a recent selective breeding method which uses predictive models based on whole-genome molecular markers. Until now, existing studies formulated GS as the problem of modeling an individual's breeding value for a particular trait of interest, i.e., as a regression problem. To assess predictive accuracy of the model, the Pearson correlation between observed and predicted trait values was used. In this paper, we propose to formulate GS as the problem of ranking individuals according to their breeding value. Our proposed framework allows us to employ machine learning methods for ranking which had previously not been considered in the GS literature. To assess ranking accuracy of a model, we introduce a new measure originating from the information retrieval literature called normalized discounted cumulative gain (NDCG). NDCG rewards more strongly models which assign a high rank to individuals with high breeding value. Therefore, NDCG reflects a prerequisite objective in selective breeding: accurate selection of individuals with high breeding value. We conducted a comparison of 10 existing regression methods and 3 new ranking methods on 6 datasets, consisting of 4 plant species and 25 traits. Our experimental results suggest that tree-based ensemble methods including McRank, Random Forests and Gradient Boosting Regression Trees achieve excellent ranking accuracy. RKHS regression and RankSVM also achieve good accuracy when used with an RBF kernel. Traditional regression methods such as Bayesian lasso, wBSR and BayesC were found less suitable for ranking. Pearson correlation was found to correlate poorly with NDCG. Our study suggests two important messages. First, ranking methods are a promising research direction in GS. Second, NDCG can be a useful evaluation measure for GS.
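
    A minimal implementation of the NDCG measure described above (the linear-gain form), on hypothetical breeding values and model scores:

    # NDCG (linear-gain form) for ranking by predicted breeding value.
    import numpy as np

    def ndcg(y_true, y_score, k=None):
        order = np.argsort(y_score)[::-1][:k]           # predicted ranking
        gains = y_true[order]                           # true values, as ranked
        discounts = 1.0 / np.log2(np.arange(2, gains.size + 2))
        ideal = np.sort(y_true)[::-1][:gains.size]      # best possible ordering
        return np.sum(gains * discounts) / np.sum(ideal * discounts)

    y_true = np.array([3.0, 1.0, 2.0, 0.5])   # hypothetical breeding values
    y_pred = np.array([2.5, 0.7, 2.9, 0.1])   # model scores
    print(f"NDCG = {ndcg(y_true, y_pred):.3f}")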

  7. NPP site selection: A systems engineering approach

    Energy Technology Data Exchange (ETDEWEB)

    Pwani, Henry; Kamanja, Florah; Zolkaffly, Zulfakar; Jung, J. C. [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2012-10-15

    The necessity for improved decision making concerning the siting and licensing of major power facilities has been accelerated in the past decade by the increased environmental consciousness of the public and by the energy crisis. These problems are exceedingly complex due to their multiple-objective nature, the many interest groups, the long-range time horizons, and the inherent uncertainties of the potential impacts of any decision. Along with the relatively objective economic and engineering concerns, the more subjective factors involving safety, environmental, and social issues are crucial to the problem. The preferences of the general public as consumers, the utility companies as builders and operators of power plant facilities, environmentalists, and the government must be accounted for in analyzing power plant siting and licensing issues. We advocate a systems engineering approach that articulates stakeholders' requirements, expert judgements, and a systems decision-making approach. The appropriateness and application of the systems decision-making process are illustrated in this paper.

  8. NPP site selection: A systems engineering approach

    International Nuclear Information System (INIS)

    Pwani, Henry; Kamanja, Florah; Zolkaffly, Zulfakar; Jung, J. C.

    2012-01-01

    The necessity for improved decision making concerning the siting and licensing of major power facilities has been accelerated in the past decade by the increased environmental consciousness of the public and by the energy crisis. These problems are exceedingly complex due to their multiple-objective nature, the many interest groups, the long-range time horizons, and the inherent uncertainties of the potential impacts of any decision. Along with the relatively objective economic and engineering concerns, the more subjective factors involving safety, environmental, and social issues are crucial to the problem. The preferences of the general public as consumers, the utility companies as builders and operators of power plant facilities, environmentalists, and the government must be accounted for in analyzing power plant siting and licensing issues. We advocate a systems engineering approach that articulates stakeholders' requirements, expert judgements, and a systems decision-making approach. The appropriateness and application of the systems decision-making process are illustrated in this paper

  9. Selective adsorption resonances: Quantum and stochastic approaches

    International Nuclear Information System (INIS)

    Sanz, A.S.; Miret-Artes, S.

    2007-01-01

    In this review we cover recent advances in the theory of the selective adsorption phenomenon that appears in light atom/molecule scattering off solid surfaces. Due to the universal attractive van der Waals interaction, incoming gas particles can get trapped by the surface, giving rise to the formation of quasi-bound states or resonances. Knowledge of the position and width of these resonances provides relevant direct information about the nature of the gas-surface interaction as well as about the evaporation and desorption mechanisms. This information can be obtained by means of a plethora of theoretical methods developed in both the energy and time domains, which we analyze and discuss here in detail. In particular, special emphasis is given to close-coupling, wave-packet, and trajectory-based formalisms. Furthermore, a novel description of selective adsorption resonances from a stochastic quantum perspective within the density matrix and Langevin formalisms, when correlations and fluctuations of the surface (considered as a thermal bath) are taken into account, is also proposed and discussed

  10. A systematic review of COTS evaluation and selection approaches

    Directory of Open Access Journals (Sweden)

    Rakesh Garg

    2017-11-01

    Full Text Available In the past decades, a number of researchers have made significant contributions to the development of different approaches for solving the very challenging problem of commercial off-the-shelf (COTS) selection. The development of software with high quality and minimum development time has always been a difficult job for software developers. Therefore, in today's scenario, software developers move towards the implementation of component-based software engineering, which relies on the integration of small pieces of code, namely COTS components. In this study, we present a comprehensive descriptive explanation of the various COTS evaluation and selection approaches developed by various researchers in the past, to aid understanding of the concept of COTS selection. The advantages and disadvantages of each COTS selection approach are also provided, which will give readers a better perspective on the various existing COTS evaluation and selection approaches.

  11. Fourier Collocation Approach With Mesh Refinement Method for Simulating Transit-Time Ultrasonic Flowmeters Under Multiphase Flow Conditions.

    Science.gov (United States)

    Simurda, Matej; Duggen, Lars; Basse, Nils T; Lassen, Benny

    2018-02-01

    A numerical model for transit-time ultrasonic flowmeters operating under multiphase flow conditions previously presented by us is extended by mesh refinement and grid point redistribution. The method solves modified first-order stress-velocity equations of elastodynamics with additional terms to account for the effect of the background flow. Spatial derivatives are calculated by a Fourier collocation scheme allowing the use of the fast Fourier transform, while the time integration is realized by the explicit third-order Runge-Kutta finite-difference scheme. The method is compared against analytical solutions and experimental measurements to verify the benefit of using mapped grids. Additionally, a study of clamp-on and in-line ultrasonic flowmeters operating under multiphase flow conditions is carried out.
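
    The core ingredient of a Fourier collocation scheme, spectral differentiation via the FFT, can be sketched on a periodic grid as follows; the full stress-velocity flowmeter model and the mesh refinement are beyond this illustration.

    # Spectral differentiation on a periodic grid via the FFT.
    import numpy as np

    N, L = 128, 2 * np.pi
    x = np.arange(N) * L / N
    ik = 2j * np.pi * np.fft.fftfreq(N, d=L / N)   # i * wavenumber

    u = np.sin(3 * x)
    du = np.real(np.fft.ifft(ik * np.fft.fft(u)))  # spectral d/dx

    print(np.max(np.abs(du - 3 * np.cos(3 * x))))  # error near machine precision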

  12. AHRQ series paper 3: identifying, selecting, and refining topics for comparative effectiveness systematic reviews: AHRQ and the effective health-care program.

    Science.gov (United States)

    Whitlock, Evelyn P; Lopez, Sarah A; Chang, Stephanie; Helfand, Mark; Eder, Michelle; Floyd, Nicole

    2010-05-01

    This article discusses the identification, selection, and refinement of topics for comparative effectiveness systematic reviews within the Agency for Healthcare Research and Quality's Effective Health Care (EHC) program. The EHC program seeks to align its research topic selection with the overall goals of the program, impartially and consistently apply predefined criteria to potential topics, involve stakeholders to identify high-priority topics, be transparent and accountable, and continually evaluate and improve processes. A topic prioritization group representing stakeholder and scientific perspectives evaluates topic nominations that fit within the EHC program (are "appropriate") to determine how "important" topics are as considered against seven criteria. The group then judges whether a new comparative effectiveness systematic review would be a duplication of existing research syntheses, and if not duplicative, if there is adequate type and volume of research to conduct a new systematic review. Finally, the group considers the "potential value and impact" of a comparative effectiveness systematic review. As the EHC program develops, ongoing challenges include ensuring the program addresses truly unmet needs for synthesized research because national and international efforts in this arena are uncoordinated, as well as engaging a range of stakeholders in program decisions while also achieving efficiency and timeliness.

  13. Selective amygdalohippocampectomy via trans-superior temporal gyrus keyhole approach

    OpenAIRE

    Mathon , Bertrand; Clemenceau , Stéphane

    2016-01-01

    International audience; Background: Hippocampal sclerosis is the most common cause of drug-resistant epilepsy amenable to surgical treatment and seizure control. The rationale of the selective amygdalohippocampectomy is to spare cerebral tissue not included in the seizure generator. Method: Describe the selective amygdalohippocampectomy through the trans-superior temporal gyrus keyhole approach. Conclusion: Selective amygdalohippocampectomy for temporal lobe epilepsy is performed when the data (semi...

  14. A simulated annealing approach to supplier selection aware inventory planning

    OpenAIRE

    Turk, Seda; Miller, Simon; Özcan, Ender; John, Robert

    2015-01-01

    Selection of an appropriate supplier is a crucial and challenging task in the effective management of a supply chain. Also, appropriate inventory management is critical to the success of a supply chain operation. In recent years, there has been a growing interest in the area of selection of an appropriate vendor and creating good inventory planning using supplier selection information. In this paper, we consider both of these tasks in a two-stage approach employing Interval Type-2 Fuzzy Sets ...

  15. Developing, Approving and Maintaining Qualifications: Selected International Approaches. Research Report

    Science.gov (United States)

    Misko, Josie

    2015-01-01

    There are lessons for Australia in the key approaches to the development, approval, maintenance and quality assurance of qualifications adopted in countries overseas. This research takes into account a range of approaches used in selected European Union (EU) member states (Germany, Finland and Sweden), the United Kingdom (England, Northern Ireland…

  16. Supplier selection problem: A fuzzy multicriteria approach | Allouche ...

    African Journals Online (AJOL)

    The purpose of this paper is to suggest a fuzzy multi-criteria approach to solve the supplier selection problem, an approach based on the fuzzy analytic hierarchy process and imprecise goal programming. To deal with decision-maker (DM) preferences, the concept of satisfaction function is introduced. The proposed ...

  17. Selective amygdalohippocampectomy via trans-superior temporal gyrus keyhole approach.

    Science.gov (United States)

    Mathon, Bertrand; Clemenceau, Stéphane

    2016-04-01

    Hippocampal sclerosis is the most common cause of drug-resistant epilepsy amenable to surgical treatment and seizure control. The rationale of the selective amygdalohippocampectomy is to spare cerebral tissue not included in the seizure generator. We describe the selective amygdalohippocampectomy through the trans-superior temporal gyrus keyhole approach. Selective amygdalohippocampectomy for temporal lobe epilepsy is performed when the data (semiology, neuroimaging, electroencephalography) point to the mesial temporal structures. The trans-superior temporal gyrus keyhole approach is a minimally invasive and safe technique that allows disconnection of the temporal stem and resection of temporomesial structures.

  18. External and Internal Citation Analyses Can Provide Insight into Serial/Monograph Ratios when Refining Collection Development Strategies in Selected STEM Disciplines

    Directory of Open Access Journals (Sweden)

    Stephanie Krueger

    2016-12-01

    Full Text Available A Review of: Kelly, M. (2015). Citation patterns of engineering, statistics, and computer science researchers: An internal and external citation analysis across multiple engineering subfields. College and Research Libraries, 76(7), 859-882. http://doi.org/10.5860/crl.76.7.859 Objective – To determine internal and external citation analysis methods and their potential applicability to the refinement of collection development strategies at both the institutional and cross-institutional levels for selected science, technology, engineering, and mathematics (STEM) subfields. Design – Multidimensional citation analysis; specifically, analysis of citations from 1) key scholarly journals in selected STEM subfields (external analysis) compared to those from 2) local doctoral dissertations in similar subfields (internal analysis). Setting – Medium-sized, STEM-dominant public research university in the United States of America. Subjects – Two citation datasets: 1) 14,149 external citations from 16 journals (i.e., 2 journals per subfield; citations from 2012 volumes) representing bioengineering, civil engineering, computer science (CS), electrical engineering, environmental engineering, operations research, statistics (STAT), and systems engineering; and 2) 8,494 internal citations from 99 doctoral dissertations (18-22 per subfield) published between 2008-2012 for CS, electrical and computer engineering (ECE), and applied information technology (AIT), and published between 2005-2012 for systems engineering and operations research (SEOR) and STAT. Methods – Citations, including titles and publication dates, were harvested from source materials and stored in Excel and then manually categorized according to format (book, book chapter, journal, conference proceeding, website, and several others). To analyze citations, percentages of occurrence by subfield were calculated for variables including format, age (years since date cited), journal distribution, and the

  19. The effects of predictor method factors on selection outcomes: A modular approach to personnel selection procedures.

    Science.gov (United States)

    Lievens, Filip; Sackett, Paul R

    2017-01-01

    Past reviews and meta-analyses typically conceptualized and examined selection procedures as holistic entities. We draw on the product design literature to propose a modular approach as a complementary perspective to conceptualizing selection procedures. A modular approach means that a product is broken down into its key underlying components. Therefore, we start by presenting a modular framework that identifies the important measurement components of selection procedures. Next, we adopt this modular lens for reviewing the available evidence regarding each of these components in terms of affecting validity, subgroup differences, and applicant perceptions, as well as for identifying new research directions. As a complement to the historical focus on holistic selection procedures, we posit that the theoretical contributions of a modular approach include improved insight into the isolated workings of the different components underlying selection procedures and greater theoretical connectivity among different selection procedures and their literatures. We also outline how organizations can put a modular approach into operation to increase the variety in selection procedures and to enhance the flexibility in designing them. Overall, we believe that a modular perspective on selection procedures will provide the impetus for programmatic and theory-driven research on the different measurement components of selection procedures. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  20. A benefit/risk approach towards selecting appropriate pharmaceutical dosage forms - an application for paediatric dosage form selection.

    Science.gov (United States)

    Sam, Tom; Ernest, Terry B; Walsh, Jennifer; Williams, Julie L

    2012-10-05

    The design and selection of new pharmaceutical dosage forms involves the careful consideration and balancing of a quality target product profile against technical challenges and development feasibility. Paediatric dosage forms present particular complexity due to the diverse patient population, patient compliance challenges and safety considerations of this vulnerable population. This paper presents a structured framework for assessing the comparative benefits and risks of different pharmaceutical design options against pre-determined criteria relating to (1) efficacy, (2) safety and (3) patient access. This benefit/risk framework has then been applied to three hypothetical, but realistic, scenarios for paediatric dosage forms in order to explore its utility in guiding dosage form design and formulation selection. The approach allows a rigorous, systematic and qualitative assessment of the merits and disadvantages of each dosage form option and helps identify mitigating strategies to modify risk. The application of a weighting and scoring system to the criteria depending on the specific case could further refine the analysis and aid decision-making. In this paper, one case study is scored for illustrative purposes. However, it is acknowledged that in real development scenarios, the generation of actual data considering the very specific situation for the patient/product/developer would come into play to drive decisions on the most appropriate dosage form strategy. Copyright © 2012 Elsevier B.V. All rights reserved.
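
    A weighting and scoring system of the kind mentioned can be sketched as a simple weighted sum over the three criteria; the weights, options, and scores below are hypothetical placeholders, not the paper's case study.

    # Weighted scoring of dosage-form options (hypothetical numbers).
    import numpy as np

    weights = np.array([0.4, 0.4, 0.2])  # efficacy, safety, patient access
    options = {"oral liquid": [4, 3, 5],
               "mini-tablet": [4, 5, 3],
               "dispersible powder": [3, 4, 4]}

    for name, scores in options.items():
        print(f"{name:18s} weighted score = {weights @ np.array(scores):.2f}")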

  1. Statistical approach for selection of biologically informative genes.

    Science.gov (United States)

    Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N

    2018-05-20

    Selection of informative genes from high-dimensional gene expression data has emerged as an important research area in genomics. Most gene selection techniques proposed so far are based on either a relevancy or a redundancy measure, and their performance has been adjudged through post-selection classification accuracy computed by a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biologically sufficient criteria, i.e. Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique against 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g. subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple-criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes which are more biologically relevant. The proposed technique is also quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that under the multiple-criteria decision-making setup, the proposed technique is best for informative gene selection over the available alternatives. Based on the proposed approach, an R package, BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR. This study will provide a practical guide to select statistical techniques for selecting informative genes
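
The BootMRMR package linked above is the reference implementation. As a language-neutral illustration of the underlying idea (score each gene by relevance to the class label minus redundancy with the other genes, then aggregate the rankings over bootstrap resamples), a minimal sketch follows. The correlation-based scores, the vote-counting aggregation, and the toy data are simplifying assumptions for the demo, not the package's exact statistics.

```python
import numpy as np

def mrmr_scores(X, y):
    """Relevance (squared correlation with the label) minus redundancy
    (mean absolute pairwise correlation with the other genes)."""
    yc = y - y.mean()
    Xc = X - X.mean(axis=0)
    rel = (Xc.T @ yc) ** 2 / ((Xc ** 2).sum(axis=0) * (yc ** 2).sum() + 1e-12)
    C = np.corrcoef(X, rowvar=False)
    red = (np.abs(C).sum(axis=1) - 1.0) / (X.shape[1] - 1)
    return rel - red

def boot_mrmr(X, y, n_boot=50, top_k=20, seed=0):
    """Select genes that rank in the top-k most often across bootstrap resamples."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    votes = np.zeros(X.shape[1], dtype=int)
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)            # bootstrap sample of subjects
        top = np.argsort(mrmr_scores(X[idx], y[idx]))[::-1][:top_k]
        votes[top] += 1
    return np.argsort(votes)[::-1][:top_k]          # most frequently selected genes

# Toy usage: 100 subjects x 500 genes, binary class labels.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 500))
y = rng.integers(0, 2, size=100)
X[:, :5] += y[:, None]                              # plant 5 informative genes
print(boot_mrmr(X, y))
```

Genes carrying a planted class signal should dominate the vote counts, mirroring how stable selection across resamples is meant to indicate informativeness.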

  2. A study of selective precipitation techniques used to recover refined iron oxide pigments for the production of paint from a synthetic acid mine drainage solution

    International Nuclear Information System (INIS)

    Ryan, M.J.; Kney, A.D.; Carley, T.L.

    2017-01-01

    New resource recovery methods of acid mine drainage (AMD) treatment aim to reduce waste by extracting iron contaminants in usable forms, specifically iron oxides as industrial inorganic pigments, which can be marketed and sold to subsidize treatment costs. In this study, iron oxide pigments of varying colors and properties were recovered from a synthetic AMD solution through a stepwise selective precipitation process using oxidation, pH adjustment, and filtration. Chemical and physical design variables within the process, such as alkaline addition rate, reaction temperature, drying duration, and target pH, were altered and observed for their effects on iron oxide morphology as a means of reducing—or even eliminating—the need for refining after synthesis. Resulting iron oxide pigment powders were analyzed with X-ray diffraction (XRD) and energy dispersive spectroscopy (EDS), and visually evaluated for color and coating ability. Drying duration resulted in increased redness in paint streaks and enhanced crystallinity, as amorphous phases of iron oxide transformed into hematite. Alkaline addition rate showed no effect on the crystallinity of the powders and no consistent effect on color. Conversely, increasing reaction temperature darkened the color of pigments and increased surface area of pigment particles (thus improving coating ability) without changing the crystallinity of the samples. Iron oxides precipitated at pH 3 displayed the highest purity and possessed a distinct yellow color suggestive of jarosite, while other paint streaks darkened in color as trace metal impurities increased. The choice to use lower pH for higher quality iron oxides comes with the compromise of reduced iron recovery efficiency. Manganese and nickel did not begin to precipitate out of solution up to pH 7 and thus require increased pH neutralization in the field if natural AMD is found to contain those metals. All pigments developed in this study were found to be adequate for use as

  3. Synthesizing dimensional and categorical approaches to personality disorders: refining the research agenda for DSM-V Axis II.

    Science.gov (United States)

    Krueger, Robert F; Skodol, Andrew E; Livesley, W John; Shrout, Patrick E; Huang, Yueqin

    2007-01-01

    Personality disorder researchers have long considered the utility of dimensional approaches to diagnosis, signaling the need to consider a dimensional approach for personality disorders in the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-V). Nevertheless, a dimensional approach to personality disorders in DSM-V is more likely to succeed if it represents an orderly and logical progression from the categorical system in DSM-IV. With these considerations and opportunities in mind, the authors sought to delineate ways of synthesizing categorical and dimensional approaches to personality disorders that could inform the construction of DSM-V. This discussion resulted in (1) the idea of having a set of core descriptive elements of personality for DSM-V, (2) an approach to rating those elements for specific patients, (3) a way of combining those elements into personality disorder prototypes, and (4) a revised conception of personality disorder as a construct separate from personality traits. Copyright (c) 2007 John Wiley & Sons, Ltd.

  4. Refining mineral oils

    Energy Technology Data Exchange (ETDEWEB)

    1946-07-05

    A process is described for refining raw oils such as mineral oils, shale oils, tar, and their fractions and derivatives by extraction with a selected solvent, or a mixture of solvents, containing water, the solvent being more favorable to hydrocarbons poor in hydrogen than to hydrocarbons rich in hydrogen. The process is characterized by the addition of an auxiliary solvent, miscible with or soluble in both the water and the solvent or dissolving mixture, which increases the solubility of the water in the solvent or dissolving mixture.

  5. A new approach to the LILW repository site selection

    International Nuclear Information System (INIS)

    Mele, I.; Zeleznik, N.

    1998-01-01

    After the failure of the site selection performed between 1990 and 1993, the Agency for Radwaste Management was urged to start a new site selection process for low and intermediate level waste (LILW). Since this is the most sensitive and delicate phase of the whole disposal project, extensive analyses of foreign and domestic experiences in siting were performed. Three different models were studied and discussed at a workshop on preparation of the siting procedure for the LILW repository. The participants invited to the workshop supported the combined approach to site selection, which is presented in this paper. (author)

  6. Refining discordant gene trees.

    Science.gov (United States)

    Górecki, Pawel; Eulenstein, Oliver

    2014-01-01

    Evolutionary studies are complicated by discordance between gene trees and the species tree in which they evolved. Dealing with discordant trees often relies on comparison costs between gene and species trees, including the well-established Robinson-Foulds, gene duplication, and deep coalescence costs. While these costs have provided credible results for binary rooted gene trees, corresponding cost definitions for non-binary unrooted gene trees, which occur frequently in practice, are challenged by biological realism. We propose a natural extension of the well-established costs for comparing unrooted and non-binary gene trees with rooted binary species trees using a binary refinement model. For the duplication cost we describe an efficient algorithm that is based on a linear time reduction and also computes an optimal rooted binary refinement of the given gene tree. Finally, we show that similar reductions lead to solutions for computing the deep coalescence and the Robinson-Foulds costs. Our binary refinement of Robinson-Foulds, gene duplication, and deep coalescence costs for unrooted and non-binary gene trees together with the linear time reductions provided here for computing these costs significantly extends the range of trees that can be incorporated into approaches dealing with discordance.
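
For rooted binary trees the gene duplication cost has a compact classical definition via the least common ancestor (LCA) mapping, which is the quantity the paper's refinement model extends to unrooted, non-binary inputs. The sketch below illustrates only that classical definition (not the authors' linear-time reduction); the nested-tuple tree encoding and the example trees are assumptions for the demo.

```python
# Trees as nested tuples: a leaf is a species name, an internal node a pair.
def leafset(t):
    return {t} if isinstance(t, str) else leafset(t[0]) | leafset(t[1])

def postorder(t):
    if not isinstance(t, str):
        yield from postorder(t[0])
        yield from postorder(t[1])
    yield t

def duplication_cost(gene_tree, species_tree):
    # Every species-tree node with its clade; the LCA of a species set is
    # the smallest clade containing it.
    clades = [(node, leafset(node)) for node in postorder(species_tree)]
    def lca(species):
        return min((c for c in clades if species <= c[1]), key=lambda c: len(c[1]))[0]
    dups, mapping = 0, {}
    for g in postorder(gene_tree):
        mapping[id(g)] = lca(leafset(g))
        # Classical rule: g is a duplication if it maps to the same species
        # node as one of its children.
        if not isinstance(g, str) and any(
                mapping[id(g)] is mapping[id(child)] for child in g):
            dups += 1
    return dups

species = ((("A", "B"), "C"), "D")
gene = ((("A", "B"), ("A", "C")), "D")   # the repeated 'A' forces one duplication
print(duplication_cost(gene, species))   # -> 1
```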

  7. A “genetics first” approach to selection

    Science.gov (United States)

    A different approach for using genomic information in genetic improvement is proposed. Past research in population genetics and animal breeding combined with information on sequence variants suggest the possibility that selection might be able to capture a portion of inbreeding and heterosis effect...

  8. practical common weight maximin approach for technology selection

    African Journals Online (AJOL)

    2014-06-30

    Jun 30, 2014 ... proposes a multi-objective decision tool for industrial robot selection, which does not require subjective assessments ... Over the past several decades, manufacturers who have been faced with intense competition ... three groups: economic analysis techniques, analytical methods and strategic approaches.

  9. A Multiscale Adaptive Mesh Refinement Approach to Architectured Steel Specification in the Design of a Frameless Stressed Skin Structure

    DEFF Research Database (Denmark)

    Nicholas, Paul; Stasiuk, David; Nørgaard, Esben

    2015-01-01

    This paper describes the development of a modelling approach for the design and fabrication of an incrementally formed, stressed skin metal structure. The term incremental forming refers to a progression of localised plastic deformation to impart 3D form onto a 2D metal sheet, directly from 3D design data. A brief introduction presents this fabrication concept, as well as the context of structures whose skin plays a significant structural role. Existing research into ISF privileges either the control of forming parameters to minimise geometric deviation, or the more accurate measurement of the impact of the forming process at the scale of the grain. But to enhance structural performance for architectural applications requires that both aspects are considered synthetically. We demonstrate a mesh-based approach that incorporates critical parameters at the scales of structure, element...

  10. China’s Comprehensive Approach: Refining the U.S. Targeting Process to Inform U.S. Strategy

    Science.gov (United States)

    2018-04-20

    engagement of key PRC leaders with other world leaders and multinational organizations. The bottom-up approach relies upon the vast Chinese population... internal affairs, equality and mutual benefit for all parties, and a peaceful coexistence. Much like Chinese leaders that preceded him, Xi Jinping's... autocratic structure. For example, Xi Jinping's ability to control the buildup of China's military capabilities and economic expansions designed to benefit

  11. Evaluating secondary neutron doses of a refined shielded design for a medical cyclotron using the TLD approach

    International Nuclear Information System (INIS)

    Lin, Jye-Bin; Tseng, Hsien-Chun; Liu, Wen-Shan; Lin, Ding-Bang; Hsieh, Teng-San; Chen, Chien-Yi

    2013-01-01

    An increasing number of cyclotrons have been installed at medical centers in Taiwan to generate radiopharmaceutical products. An operating cyclotron generates immense numbers of secondary neutrons from reactions such as ¹⁸O(p,n)¹⁸F, used in the production of FDG. This intense radiation can be hazardous to public health, particularly to medical personnel. To increase the yield of ¹⁸F-FDG from 4200 GBq in 2005 to 48,600 GBq in 2011, Chung Shan Medical University Hospital (CSMUH) prolonged the irradiation time without changing the target or target current to meet the requirements for ¹⁸F production, and redesigned the CTI Radioisotope Delivery System shield. Data on the secondary neutron doses possible in such newly designed cyclotron rooms have been lacking. This work evaluates secondary neutron doses at a CTI cyclotron center using thermoluminescent dosimeters (TLD-600). Two-dimensional neutron dose maps indicated that doses were high where neutrons leaked through the self-shielded blocks and through the L-shaped concrete shield in vault rooms. These neutron doses varied markedly among locations close to the H₂¹⁸O target. The Monte Carlo simulation and the minimum detectable dose are also discussed and demonstrate the reliability of the TLD-600 approach. The findings can be adopted by medical centers to identify radioactive hot spots and develop radiation protection. - Highlights: • Neutron doses were verified using the TLD approach. • Neutron doses have increased at cyclotron centers. • The revised L-shaped shield effectively suppresses the neutrons. • Neutron dose can be attenuated to 1.13×10⁻⁶ %

  12. An integrated approach towards future ballistic neck protection materials selection.

    Science.gov (United States)

    Breeze, John; Helliker, Mark; Carr, Debra J

    2013-05-01

    Ballistic protection for the neck has historically taken the form of collars attached to the ballistic vest (removable or fixed), but other approaches, including the development of prototypes incorporating ballistic material into the collar of an under body armour shirt, are now being investigated. Current neck collars incorporate the same ballistic protective fabrics as the soft armour of the remaining vest, reflecting how ballistic protective performance alone has historically been perceived as the most important property for neck protection. However, the neck has fundamental differences from the thorax in terms of anatomical vulnerability, flexibility and equipment integration, necessitating a separate solution from the thorax in terms of optimal materials selection. An integrated approach towards the selection of the most appropriate combination of materials to be used for each of the two potential designs of future neck protection has been developed. This approach requires evaluation of the properties of each potential material in addition to ballistic performance alone, including flexibility, mass, wear resistance and thermal burden. The aim of this article is to provide readers with an overview of this integrated approach towards ballistic materials selection and an update of its current progress in the development of future ballistic neck protection.

  13. Six Sigma Project Selection Using Fuzzy TOPSIS Decision Making Approach

    Directory of Open Access Journals (Sweden)

    Rajeev Rathi

    2015-05-01

    Full Text Available Six Sigma is considered a logical business strategy that attempts to identify and eliminate defects or failures in order to improve the quality of products and processes. The decision on project selection in Six Sigma is always critical; it plays a key role in successful implementation of Six Sigma. Selecting the right Six Sigma project is essential for an automotive company because it greatly influences manufacturing costs. This paper discusses an approach to Six Sigma project selection in the automotive industry using a fuzzy logic-based TOPSIS method. Fuzzy TOPSIS is a well-recognized tool for handling the fuzziness of the data involved in expressing preferences. In this context, evaluation criteria have been designed for selection of the best alternative. The weights of the evaluation criteria are calculated using the MDL (modified digital logic) method, and the final ranking is obtained from a priority index computed with the fuzzy TOPSIS method. In the selected case study, this approach helped identify the right project for implementing Six Sigma to achieve improvement in productivity.
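
As a compact illustration of the TOPSIS core that such project-selection studies build on, the sketch below ranks candidate projects using crisp scores; the paper's fuzzy variant replaces each rating with a fuzzy number and uses MDL-derived weights. The criteria, weights, and score matrix here are invented for the demo.

```python
import numpy as np

def topsis(scores, weights, benefit):
    """Rank alternatives: rows = alternatives, cols = criteria.
    benefit[j] is True for criteria to maximize, False to minimize."""
    X = scores / np.linalg.norm(scores, axis=0)        # vector normalization
    V = X * weights                                     # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    nadir = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - nadir, axis=1)
    closeness = d_neg / (d_pos + d_neg)                 # priority index in [0, 1]
    return np.argsort(closeness)[::-1], closeness

# Four candidate Six Sigma projects scored on cost saving, defect impact,
# implementation cost (to minimize), and duration (to minimize).
scores = np.array([[7, 8, 5, 6],
                   [9, 6, 7, 4],
                   [6, 9, 4, 8],
                   [8, 7, 6, 5]], dtype=float)
weights = np.array([0.35, 0.30, 0.20, 0.15])
ranking, ci = topsis(scores, weights, benefit=np.array([True, True, False, False]))
print("best project:", ranking[0], "closeness:", ci.round(3))
```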

  14. Grain refinement of aluminum and its alloys

    International Nuclear Information System (INIS)

    Zaid, A.I.O.

    2001-01-01

    Grain refinement of aluminum and its alloys by binary Al-Ti and ternary Al-Ti-B master alloys is reviewed and discussed. The importance of grain refining to the casting industry and the parameters affecting it are presented and discussed. These include parameters related to the casting, parameters related to the grain-refining alloy, and parameters related to the process. The different mechanisms suggested in the literature for the grain-refining process are presented and discussed, from which it is found that although the mechanism of refining by binary Al-Ti is well established, the mechanism of grain refining by ternary Al-Ti-B remains controversial and some research is still needed in this area. The effect of the addition of other alloying elements in the presence of the grain refiner on the grain-refining efficiency is also reviewed and discussed. It is found that some elements, e.g. V, Mo and C, improve the grain-refining efficiency, whereas other elements, e.g. Cr, Zr and Ta, poison the grain refinement. Based on the parameters affecting grain refinement and its mechanism, a criterion for selecting the optimum grain refiner is put forward and discussed. (author)

  15. Source selection for analogical reasoning an empirical approach

    Energy Technology Data Exchange (ETDEWEB)

    Stubblefield, W.A. [Sandia National Labs., Albuquerque, NM (United States); Luger, G.F. [Univ. of New Mexico, Albuquerque, NM (United States)

    1996-12-31

    The effectiveness of an analogical reasoner depends upon its ability to select a relevant analogical source. In many problem domains, however, too little is known about target problems to support effective source selection. This paper describes the design and evaluation of SCAVENGER, an analogical reasoner that applies two techniques to this problem: (1) An assumption-based approach to matching that allows properties of candidate sources to match unknown target properties in the absence of evidence to the contrary. (2) The use of empirical learning to improve memory organization based on problem solving experience.

  16. Model selection and inference a practical information-theoretic approach

    CERN Document Server

    Burnham, Kenneth P

    1998-01-01

    This book is unique in that it covers the philosophy of model-based data analysis and an omnibus strategy for the analysis of empirical data. The book introduces information-theoretic approaches and focuses critical attention on a priori modeling and the selection of a good approximating model that best represents the inference supported by the data. Kullback-Leibler information represents a fundamental quantity in science and is Hirotugu Akaike's basis for model selection. The maximized log-likelihood function can be bias-corrected to provide an estimate of expected, relative Kullback-Leibler information. This leads to Akaike's Information Criterion (AIC) and various extensions, and these are relatively simple and easy to use in practice, but little taught in statistics classes and far less understood in the applied sciences than should be the case. The information-theoretic approaches provide a unified and rigorous theory, an extension of likelihood theory, an important application of information theory, and are ...
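
Applying Akaike's criterion is mechanically simple; the sketch below selects a polynomial regression degree using the Gaussian least-squares form AIC = n·ln(RSS/n) + 2k, a standard formulation. The toy dataset and the parameter-count convention (coefficients plus the error variance) are assumptions for the demo.

```python
import numpy as np

def aic_gaussian(y, y_hat, k):
    """AIC for least-squares fits: n*ln(RSS/n) + 2k, with k parameters."""
    n = len(y)
    rss = np.sum((y - y_hat) ** 2)
    return n * np.log(rss / n) + 2 * k

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 40)
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(scale=0.1, size=x.size)  # true model: quadratic

for degree in range(1, 6):
    coeffs = np.polyfit(x, y, degree)
    y_hat = np.polyval(coeffs, x)
    k = degree + 2                        # polynomial coefficients + error variance
    print(f"degree {degree}: AIC = {aic_gaussian(y, y_hat, k):.1f}")
# The quadratic fit should attain (near) the smallest AIC; higher degrees
# reduce RSS slightly but pay the 2k complexity penalty.
```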

  17. Ecosystem-based management and refining governance of wind energy in the Massachusetts coastal zone: A case study approach

    Science.gov (United States)

    Kumin, Enid C.

    While there are as yet no wind energy facilities in New England coastal waters, a number of wind turbine projects are now operating on land adjacent to the coast. In the Gulf of Maine region (from Maine to Massachusetts), at least two such projects, one in Falmouth, Massachusetts, and another on the island of Vinalhaven, Maine, began operation with public backing only to face subsequent opposition from some who were initially project supporters. I investigate the reasons for this dynamic using content analysis of documents related to wind energy facility development in three case study communities. For comparison and contrast with the Vinalhaven and Falmouth case studies, I examine materials from Hull, Massachusetts, where wind turbine construction and operation has received steady public support and acceptance. My research addresses the central question: What does case study analysis of the siting and initial operation of three wind energy projects in the Gulf of Maine region reveal that can inform future governance of wind energy in Massachusetts state coastal waters? I consider the question with specific attention to governance of wind energy in Massachusetts, then explore ways in which the research results may be broadly transferable in the U.S. coastal context. I determine that the change in local response noted in Vinalhaven and Falmouth may have arisen from a failure of consistent inclusion of stakeholders throughout the entire scoping-to-siting process, especially around the reporting of environmental impact studies. I find that, consistent with the principles of ecosystem-based and adaptive management, design of governance systems may require on-going cycles of review and adjustment before the implementation of such systems as intended is achieved in practice. I conclude that evolving collaborative processes must underlie science and policy in our approach to complex environmental and wind energy projects; indeed, collaborative process is fundamental to

  18. Geometrical approach to central molecular chirality: a chirality selection rule

    OpenAIRE

    Capozziello, S.; Lattanzi, A.

    2004-01-01

    Chirality is of primary importance in many areas of chemistry and has been extensively investigated since its discovery. We introduce here the description of central chirality for tetrahedral molecules using a geometrical approach based on complex numbers. According to this representation, for a molecule having n chiral centres, it is possible to define an index of chirality. Consequently a chirality selection rule has been derived which allows the characterization of a molecule as achiral, e...

  19. Log-Optimal Portfolio Selection Using the Blackwell Approachability Theorem

    OpenAIRE

    V'yugin, Vladimir

    2014-01-01

    We present a method for constructing the log-optimal portfolio using the well-calibrated forecasts of market values. Dawid's notion of calibration and the Blackwell approachability theorem are used for computing well-calibrated forecasts. We select a portfolio using this "artificial" probability distribution of market values. Our portfolio performs asymptotically at least as well as any stationary portfolio that redistributes the investment at each round using a continuous function of side in...

  20. An integrated MCDM approach to green supplier selection

    Directory of Open Access Journals (Sweden)

    Morteza Yazdani

    2014-06-01

    Full Text Available Supplier selection management is considered an important subject for industrial organizations. In order to remain in the market, gain profitability, and retain competitive advantage, business units need to establish an integrated and structured supplier selection system. In addition, environmental protection concerns have pushed organizations to consider a green approach to the supplier selection problem. However, finding proper suppliers involves several variables and is a critically complex process. In this paper, the main attention is focused on finding the right supplier through a fuzzy multi-criteria decision-making (MCDM) process. The weights of the criteria are calculated by the analytical hierarchy process (AHP), and the final ranking is obtained with the fuzzy technique for order preference by similarity to an ideal solution (TOPSIS). The advantage of TOPSIS over similar methods is that it identifies the best solution as the one closest to the ideal solution. The paper illustrates the approach with an example from an automobile manufacturing supply chain.
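
The AHP weighting step can be illustrated on its own: criterion weights are the normalized principal eigenvector of a pairwise-comparison matrix, checked with Saaty's consistency ratio. The comparison matrix below is an invented example, not data from the paper.

```python
import numpy as np

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}   # Saaty's random indices (n <= 5 here)

def ahp_weights(A):
    """Principal-eigenvector weights and consistency ratio for a pairwise matrix."""
    eigvals, eigvecs = np.linalg.eig(A)
    i = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, i].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (eigvals[i].real - n) / (n - 1)            # consistency index
    return w, ci / RI[n]                             # (weights, consistency ratio)

# Pairwise comparisons for four criteria: price, quality, delivery, environment.
A = np.array([[1,   3,   5,   2],
              [1/3, 1,   3,   1/2],
              [1/5, 1/3, 1,   1/4],
              [1/2, 2,   4,   1]], dtype=float)
w, cr = ahp_weights(A)
print("weights:", w.round(3), "CR:", round(cr, 3))  # CR < 0.1 is conventionally acceptable
```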

  1. An intuitionistic fuzzy optimization approach to the vendor selection problem

    Directory of Open Access Journals (Sweden)

    Prabjot Kaur

    2016-09-01

    Full Text Available Selecting the right vendor is an important business decision made by any organization. The decision involves multiple criteria, and if the objectives vary in preference and scope, the decision becomes multiobjective in nature. In this paper, a vendor selection problem is formulated as an intuitionistic fuzzy multiobjective optimization in which an appropriate number of vendors is to be selected and orders allocated to them. The multiobjective problem includes three objectives: minimizing the net price, maximizing the quality, and maximizing on-time deliveries, subject to the suppliers' constraints. The objective functions and the demand are treated as intuitionistic fuzzy sets, which can handle uncertainty with additional degrees of freedom. The intuitionistic fuzzy optimization (IFO) problem is converted into a crisp linear form and solved using the optimization software TORA. The advantage of IFO is that it gives better results than fuzzy or crisp optimization. The proposed approach is explained by a numerical example.
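
The conversion to a crisp linear form follows the familiar max-min pattern: each fuzzy objective gets a linear membership function, and a single satisfaction level λ is maximized subject to every membership exceeding it (the intuitionistic version adds analogous constraints for a non-membership level). A simplified, ordinary-fuzzy sketch with invented vendor data follows, using scipy rather than TORA.

```python
from scipy.optimize import linprog

# Two vendors, demand of 100 units; vendor 1 is cheaper (price 10 vs. 12)
# but lower quality (score 0.8 vs. 0.95). Variables: z = [x1, x2, lam].
# Invented membership bounds: total price in [1000, 1200] (minimize),
# total quality score in [80, 95] (maximize); memberships are linear.
p_min, p_max = 1000.0, 1200.0
q_min, q_max = 80.0, 95.0

c = [0.0, 0.0, -1.0]                      # maximize the satisfaction level lam
A_ub = [[10.0, 12.0, p_max - p_min],      # price membership   >= lam
        [-0.8, -0.95, q_max - q_min]]     # quality membership >= lam
b_ub = [p_max, -q_min]
A_eq = [[1.0, 1.0, 0.0]]                  # meet the demand exactly
b_eq = [100.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None), (0, None), (0, 1)])
x1, x2, lam = res.x
print(f"vendor1={x1:.1f}, vendor2={x2:.1f}, satisfaction lambda={lam:.2f}")
# -> a 50/50 split with lambda = 0.5: the best compromise between both goals
```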

  2. Colorectal cancer chemoprevention: the potential of a selective approach.

    Science.gov (United States)

    Ben-Amotz, Oded; Arber, Nadir; Kraus, Sarah

    2010-10-01

    Colorectal cancer (CRC) is a leading cause of cancer death, and therefore demands special attention. Novel recent approaches for the chemoprevention of CRC focus on selective targeting of key pathways. We review the study by Zhang and colleagues, evaluating a selective approach targeting APC-deficient premalignant cells using retinoid-based therapy and TNF-related apoptosis-inducing ligand (TRAIL). This study demonstrates that induction of TRAIL-mediated death signaling contributes to the chemopreventive value of all-trans-retinyl acetate (RAc) by sensitizing premalignant adenoma cells to apoptosis without affecting normal cells. We discuss these important findings and raise a few points that deserve consideration and may further contribute to the development of RAc-based combination therapies with improved efficacy. The authors clearly demonstrate a synergistic interaction between TRAIL, RAc and APC, which leads to the specific cell death of premalignant target cells. The study adds to the growing body of literature on CRC chemoprevention and provides solid data supporting a potentially selective approach for preventing CRC using RAc and TRAIL.

  3. The effects of carbon prices and anti-leakage policies on selected industrial sectors in Spain – Cement, steel and oil refining

    International Nuclear Information System (INIS)

    Santamaría, Alberto; Linares, Pedro; Pintos, Pablo

    2014-01-01

    This paper assesses the impacts on the cement, steel and oil refining sectors in Spain of the carbon prices derived from the European Emissions Trading Scheme (EU ETS), and the potential effect on these sectors of the European Union anti-leakage policy measures. The assessment is carried out by means of three engineering models developed for this purpose. Our results show a high exposure to leakage of cement in coastal regions; a smaller risk in the steel sector, and non-negligible risk of leakage for the oil refining sector when carbon allowance prices reach high levels. We also find that the risk of leakage could be better handled with other anti-leakage policies than those currently in place in the EU. - Highlights: • We simulate the impact of carbon prices on the risk of leakage in the cement, steel and oil refining sectors. • We also assess the effectiveness of different anti-leakage policies in Europe. • Cement production in coastal areas is highly exposed. • The risk of leakage for steel and oil refining is smaller. • Anti-leakage policies should be modified to be efficient

  4. Studies on the value and circulation of Selections of Refined Literature in the ancient Korean Peninsula

    Institute of Scientific and Technical Information of China (English)

    季南

    2012-01-01

    Selections of Refined Literature, celebrated as the crown of anthologies and a treasury of literary works, spread to the ancient Korean Peninsula through the cultural exchange between China and Korea. With changes in political systems and literary concepts, the status of Selections of Refined Literature across the dynasties of the ancient Korean Peninsula remained unbalanced and unstable. Selections of Refined Literature holds important historical, literary, and academic value for the ancient Korean Peninsula.

  5. Molecular genetics and livestock selection. Approaches, opportunities and risks

    International Nuclear Information System (INIS)

    Williams, J.L.

    2005-01-01

    Following domestication, livestock were selected both naturally through adaptation to their environments and by man so that they would fulfil a particular use. As selection methods have become more sophisticated, rapid progress has been made in improving those traits that are easily measured. However, selection has also resulted in decreased diversity. In some cases, improved breeds have replaced local breeds, risking the loss of important survival traits. The advent of molecular genetics provides the opportunity to identify the genes that control particular traits by a gene mapping approach. However, as with selection, the early mapping studies focused on traits that are easy to measure. Where molecular genetics can play a valuable role in livestock production is by providing the means to select effectively for traits that are difficult to measure. Identifying the genes underpinning particular traits requires a population in which these traits are segregating. Fortunately, several experimental populations have been created that have allowed a wide range of traits to be studied. Gene mapping work in these populations has shown that the role of particular genes in controlling variation in a given trait can depend on the genetic background. A second finding is that the most favourable alleles for a trait may in fact be present in animals that perform poorly for the trait. In the long term, knowledge of the genes controlling particular traits, and the way they interact with the genetic background, will allow introgression between breeds and the assembly of genotypes that are best suited to particular environments, producing animals with the desired characteristics. If used wisely, this approach will maintain genetic diversity while improving performance over a wide range of desired traits. (author)

  6. On the refinement calculus

    CERN Document Server

    Vickers, Trevor

    1992-01-01

    On the Refinement Calculus gives one view of the development of the refinement calculus and its attempt to bring together - among other things - Z specifications and Dijkstra's programming language. It is an excellent source of reference material for all those seeking the background and mathematical underpinnings of the refinement calculus.

  7. Cell-Averaged discretization for incompressible Navier-Stokes with embedded boundaries and locally refined Cartesian meshes: a high-order finite volume approach

    Science.gov (United States)

    Bhalla, Amneet Pal Singh; Johansen, Hans; Graves, Dan; Martin, Dan; Colella, Phillip; Applied Numerical Algorithms Group Team

    2017-11-01

    We present a consistent cell-averaged discretization for incompressible Navier-Stokes equations on complex domains using embedded boundaries. The embedded boundary is allowed to freely cut the locally-refined background Cartesian grid. An implicit-function representation is used for the embedded boundary, which allows us to convert the required geometric moments in the Taylor series expansion (up to arbitrary order) of polynomials into an algebraic problem in lower dimensions. The computed geometric moments are then used to construct stencils for various operators like the Laplacian, divergence, gradient, etc., by solving a least-squares system locally. We also construct the inter-level data-transfer operators like prolongation and restriction for multigrid solvers using the same least-squares system approach. This allows us to retain high order of accuracy near coarse-fine interfaces and near embedded boundaries. Canonical problems like Taylor-Green vortex flow and flow past bluff bodies will be presented to demonstrate the proposed method. U.S. Department of Energy, Office of Science, ASCR (Award Number DE-AC02-05CH11231).
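
The local least-squares construction can be shown in miniature: fit a second-order Taylor expansion to neighboring values and read the Laplacian weights off the pseudoinverse rows. The sketch below uses point values at scattered 2D locations, not the paper's cell averages and geometric moments, so it only illustrates the stencil-by-least-squares idea.

```python
import numpy as np

def lstsq_laplacian(center, neighbors):
    """Weights w such that lap(u) ~ w @ (u(neighbors) - u(center)), from a
    least-squares fit of u(x) - u(x0) ~ [dx, dy, dx^2/2, dx*dy, dy^2/2] @ d."""
    d = np.asarray(neighbors) - np.asarray(center)
    A = np.column_stack([d[:, 0], d[:, 1],
                         0.5 * d[:, 0]**2, d[:, 0] * d[:, 1], 0.5 * d[:, 1]**2])
    P = np.linalg.pinv(A)          # rows of P extract the Taylor coefficients
    return P[2] + P[4]             # u_xx + u_yy weights

# Irregular neighborhood, as arises near an embedded boundary cut.
center = (0.0, 0.0)
nbrs = [(0.9, 0.1), (-1.0, 0.0), (0.1, 1.1), (0.0, -0.8), (0.7, -0.9), (-0.6, 0.8)]
w = lstsq_laplacian(center, nbrs)

u = lambda x, y: 1 + 2*x - y + 3*x*x + x*y - 2*y*y     # exact Laplacian = 6 - 4 = 2
vals = np.array([u(x, y) for x, y in nbrs]) - u(*center)
print(w @ vals)                    # -> 2.0 (exact for quadratics)
```

The operator is exact for quadratics by construction, which is the property that helps preserve accuracy on the irregular neighborhoods arising at coarse-fine interfaces and cut cells.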

  8. Using a systematic approach to select flagship species for bird conservation.

    Science.gov (United States)

    Veríssimo, Diogo; Pongiluppi, Tatiana; Santos, Maria Cintia M; Develey, Pedro F; Fraser, Iain; Smith, Robert J; MacMilan, Douglas C

    2014-02-01

    Conservation marketing campaigns that focus on flagship species play a vital role in biological diversity conservation because they raise funds and change people's behavior. However, most flagship species are selected without considering the target audience of the campaign, which can hamper the campaign's effectiveness. To address this problem, we used a systematic and stakeholder-driven approach to select flagship species for a conservation campaign in the Serra do Urubu in northeastern Brazil. We based our techniques on environmental economic and marketing methods. We used choice experiments to examine the species attributes that drive preference and latent-class models to segment respondents into groups by preferences and socioeconomic characteristics. We used respondent preferences and information on bird species inhabiting the Serra do Urubu to calculate a flagship species suitability score. We also asked respondents to indicate their favorite species from a set list to enable comparison between methods. The species' traits that drove audience preference were geographic distribution, population size, visibility, attractiveness, and survival in captivity. However, the importance of these factors differed among groups and groups differed in their views on whether species with small populations and the ability to survive in captivity should be prioritized. The popularity rankings of species differed between approaches, a result that was probably related to the different ways in which the 2 methods measured preference. Our new approach is a transparent and evidence-based method that can be used to refine the way stakeholders are engaged in the design of conservation marketing campaigns. © 2013 Society for Conservation Biology.

  9. An integrated approach to route selection in slurry pipeline design

    Energy Technology Data Exchange (ETDEWEB)

    Betinol, Roy G.; Altmann, Nara [Brass Chile S.A., Santiago (Chile)

    2009-12-19

    The pressure to get engineering projects designed and constructed as fast as possible, in order to take advantage of high prices in metals and petrochemicals, has been driving companies to skip the conceptual phase and go straight into basic engineering with cost estimates at the 15% accuracy level. By-passing early engineering while demanding higher cost-estimating accuracy is a contradiction. In most cases, the savings made on capital investment are much higher when money has been spent on conceptual studies, which allow the optimal solution to be found. This paper reviews one of the key aspects in conceptual engineering of slurry pipeline designs: route selection. This activity is often overlooked, causing capital costs and operating difficulties to rise unnecessarily. This paper describes and gives examples of how an integrated client/engineering-company approach to route selection can produce significant savings in pipeline construction and operating costs. (author)

  10. A unified conformational selection and induced fit approach to protein-peptide docking.

    Directory of Open Access Journals (Sweden)

    Mikael Trellet

    Full Text Available Protein-peptide interactions are vital for the cell. They mediate, inhibit or serve as structural components in nearly 40% of all macromolecular interactions, and are often associated with diseases, making them interesting leads for protein drug design. In recent years, large-scale technologies have enabled exhaustive studies on the peptide recognition preferences for a number of peptide-binding domain families. Yet, the paucity of data regarding their molecular binding mechanisms together with their inherent flexibility makes the structural prediction of protein-peptide interactions very challenging. This leaves flexible docking as one of the few amenable computational techniques to model these complexes. We present here an ensemble, flexible protein-peptide docking protocol that combines conformational selection and induced fit mechanisms. Starting from an ensemble of three peptide conformations (extended, α-helix, polyproline-II), flexible docking with HADDOCK generates 79.4% of high quality models for bound/unbound and 69.4% for unbound/unbound docking when tested against the largest protein-peptide complexes benchmark dataset available to date. Conformational selection at the rigid-body docking stage successfully recovers the most relevant conformation for a given protein-peptide complex and the subsequent flexible refinement further improves the interface by up to 4.5 Å interface RMSD. Cluster-based scoring of the models results in a selection of near-native solutions in the top three for ∼75% of the successfully predicted cases. This unified conformational selection and induced fit approach to protein-peptide docking should open the route to the modeling of challenging systems such as disorder-order transitions taking place upon binding, significantly expanding the applicability limit of biomolecular interaction modeling by docking.

  11. Refining the Results of a Classical SELEX Experiment by Expanding the Sequence Data Set of an Aptamer Pool Selected for Protein A

    Directory of Open Access Journals (Sweden)

    Regina Stoltenburg

    2018-02-01

    Full Text Available New, as yet undiscovered aptamers for Protein A were identified by applying next generation sequencing (NGS) to a previously selected aptamer pool. This pool was obtained in a classical SELEX (Systematic Evolution of Ligands by EXponential enrichment) experiment using the FluMag-SELEX procedure followed by cloning and Sanger sequencing. PA#2/8 was identified as the only Protein A-binding aptamer from the Sanger sequence pool, and was shown to be able to bind intact cells of Staphylococcus aureus. In this study, we show the extension of the SELEX results by re-sequencing of the same aptamer pool using a medium throughput NGS approach and data analysis. Both data pools were compared. They confirm the selection of a highly complex and heterogeneous oligonucleotide pool and show consistently a high content of orphans as well as a similar relative frequency of certain sequence groups. But in contrast to the Sanger data pool, the NGS pool was clearly dominated by one sequence group containing the known Protein A-binding aptamer PA#2/8 as the most frequent sequence in this group. In addition, we found two new sequence groups in the NGS pool represented by PA-C10 and PA-C8, respectively, which also have high specificity for Protein A. Comparative affinity studies reveal differences between the aptamers and confirm that PA#2/8 remains the most potent sequence within the selected aptamer pool reaching affinities in the low nanomolar range of KD = 20 ± 1 nM.

  12. Refining the Results of a Classical SELEX Experiment by Expanding the Sequence Data Set of an Aptamer Pool Selected for Protein A.

    Science.gov (United States)

    Stoltenburg, Regina; Strehlitz, Beate

    2018-02-24

    New, as yet undiscovered aptamers for Protein A were identified by applying next generation sequencing (NGS) to a previously selected aptamer pool. This pool was obtained in a classical SELEX (Systematic Evolution of Ligands by EXponential enrichment) experiment using the FluMag-SELEX procedure followed by cloning and Sanger sequencing. PA#2/8 was identified as the only Protein A-binding aptamer from the Sanger sequence pool, and was shown to be able to bind intact cells of Staphylococcus aureus. In this study, we show the extension of the SELEX results by re-sequencing of the same aptamer pool using a medium throughput NGS approach and data analysis. Both data pools were compared. They confirm the selection of a highly complex and heterogeneous oligonucleotide pool and show consistently a high content of orphans as well as a similar relative frequency of certain sequence groups. But in contrast to the Sanger data pool, the NGS pool was clearly dominated by one sequence group containing the known Protein A-binding aptamer PA#2/8 as the most frequent sequence in this group. In addition, we found two new sequence groups in the NGS pool represented by PA-C10 and PA-C8, respectively, which also have high specificity for Protein A. Comparative affinity studies reveal differences between the aptamers and confirm that PA#2/8 remains the most potent sequence within the selected aptamer pool reaching affinities in the low nanomolar range of KD = 20 ± 1 nM.
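
The pool comparison in both records reduces to counting identical random-region reads and ranking the resulting sequence groups by frequency; a minimal counting sketch follows. The toy read list stands in for the NGS output, and grouping is by exact sequence only (the clustering of point variants into sequence groups is omitted).

```python
from collections import Counter

# Toy random-region reads standing in for the NGS output of the aptamer pool.
reads = [
    "ACGTGGTCAATGC", "ACGTGGTCAATGC", "ACGTGGTCAATGC",   # dominant group (cf. PA#2/8)
    "TTGACCGTAGGAT", "TTGACCGTAGGAT",
    "GGCATTACGTTCA",                                      # orphan: seen only once
    "ACGTGGTCAATGC",
]

counts = Counter(reads)
total = sum(counts.values())
for rank, (seq, n) in enumerate(counts.most_common(), start=1):
    print(f"{rank}. {seq}  n={n}  frequency={n/total:.1%}")

orphans = sum(1 for n in counts.values() if n == 1)
print(f"orphan fraction of unique sequences: {orphans/len(counts):.1%}")
```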

  13. The transradial approach for selective carotid and vertebral angiography

    Energy Technology Data Exchange (ETDEWEB)

    Iwasaki, S.; Ueda, K.; Sueyosi, S.; Nagasawa, M.; Ude, K. [Higashiosaka City General Hospital, Osaka (Japan). Dept. of Radiology; Yokoyama, K. [Higashiosaka City General Hospital, Osaka (Japan). Dept. of Neurosurgery; Takayama, K.; Nakagawa, H.; Kichikawa, K. [Nara Medical Univ., Osaka (Japan). Dept. of Radiology

    2002-11-01

    Purpose: The transradial approach is not very popular in cerebral angiography. The purpose of this study was therefore to present our experience of the success rate and safety of this method. Material and Methods: From December 1998 to June 2001, 526 carotid and vertebral angiographies with DSA were performed via the radial artery. A 1.4-mm catheter was used through a 1.4-mm introducer sheath. We evaluated the procedure as successful if sufficient images for diagnosis were obtained of the bilateral carotid arteries and unilateral vertebral artery. Each patient was reassessed for any complications occurring up to the next morning. The length of time needed for an examination was measured in the last 10 cases. Results: In all but 5 cases, the procedures were evaluated as successful (99.0%). Unsuccessful cases manifested severe pain at the radial puncture, angiospasm at the radial artery, loop formation at the radial artery, occlusion at the subclavian artery, and an aberrant right subclavian artery. No severe complications, including neurological ones, were encountered. Minor complications were noted in 17 cases (3.2%): 4 cases of thrombus at the ulnar artery, 1 angiospasm at the radial artery, and 12 cases of small hematoma at the puncture site. The radial approach took 14 min less for the common carotid study and 3 min 30 s less for the internal carotid study than the femoral approach. Conclusion: The transradial approach enabled selective studies for carotid and vertebral angiography with a high success rate and safety, with few complications.

  14. The transradial approach for selective carotid and vertebral angiography

    International Nuclear Information System (INIS)

    Iwasaki, S.; Ueda, K.; Sueyosi, S.; Nagasawa, M.; Ude, K.; Yokoyama, K.; Takayama, K.; Nakagawa, H.; Kichikawa, K.

    2002-01-01

    Purpose: The transradial approach is not very popular in cerebral angiography. The purpose of this study was therefore to present our experience of the success rate and safety of this method. Material and Methods: From December 1998 to June 2001, 526 carotid and vertebral angiographies with DSA were performed via the radial artery. A 1.4-mm catheter was used through a 1.4-mm introducer sheath. We evaluated the procedure as successful if sufficient images for diagnosis were obtained of the bilateral carotid arteries and unilateral vertebral artery. Each patient was reassessed for any complications occurring up to the next morning. The length of time needed for an examination was measured in the last 10 cases. Results: In all but 5 cases, the procedures were evaluated as successful (99.0%). Unsuccessful cases manifested severe pain at the radial puncture, angiospasm at the radial artery, loop formation at the radial artery, occlusion at the subclavian artery, and an aberrant right subclavian artery. No severe complications, including neurological ones, were encountered. Minor complications were noted in 17 cases (3.2%): 4 cases of thrombus at the ulnar artery, 1 angiospasm at the radial artery, and 12 cases of small hematoma at the puncture site. The radial approach took 14 min less for the common carotid study and 3 min 30 s less for the internal carotid study than the femoral approach. Conclusion: The transradial approach enabled selective studies for carotid and vertebral angiography with a high success rate and safety, with few complications.

  15. A Grey Fuzzy Logic Approach for Cotton Fibre Selection

    Science.gov (United States)

    Chakraborty, Shankar; Das, Partha Protim; Kumar, Vidyapati

    2017-06-01

    It is a well known fact that the quality of ring spun yarn predominantly depends on various physical properties of cotton fibre. Any variation in these fibre properties may affect the strength and unevenness of the final yarn. Thus, so as to achieve the desired yarn quality and characteristics, it becomes imperative for the spinning industry personnel to identify the most suitable cotton fibre from a set of feasible alternatives in presence of several conflicting properties/attributes. This cotton fibre selection process can be modelled as a Multi-Criteria Decision Making (MCDM) problem. In this paper, a grey fuzzy logic-based approach is proposed for selection of the most apposite cotton fibre from 17 alternatives evaluated based on six important fibre properties. It is observed that the preference order of the top-ranked cotton fibres derived using the grey fuzzy logic approach closely matches with that attained by the past researchers which proves the application potentiality of this method in solving varying MCDM problems in textile industries.
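
The grey-relational half of such a method is compact: normalize each fibre property, compute grey relational coefficients against the ideal reference sequence, and average them into a grade per fibre. The sketch below uses equal criteria weights and invented fibre data, and omits the fuzzy-logic layer the paper couples to the coefficients.

```python
import numpy as np

def grey_relational_grade(X, benefit, zeta=0.5):
    """X: alternatives x criteria. Returns one grey relational grade per alternative."""
    # Normalize to [0, 1]: larger-is-better or smaller-is-better per criterion.
    lo, hi = X.min(axis=0), X.max(axis=0)
    N = np.where(benefit, (X - lo) / (hi - lo), (hi - X) / (hi - lo))
    delta = np.abs(1.0 - N)                   # deviation from the ideal sequence (all ones)
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff.mean(axis=1)                 # equal criteria weights

# Invented data: 5 cotton fibres x 4 properties
# (strength, length, elongation: larger better; short-fibre content: smaller better).
X = np.array([[28.1, 29.0, 6.5, 7.9],
              [30.2, 28.1, 6.9, 9.1],
              [26.7, 30.5, 7.2, 6.8],
              [29.5, 27.6, 6.1, 8.4],
              [27.9, 29.8, 6.8, 7.2]])
grades = grey_relational_grade(X, benefit=np.array([True, True, True, False]))
print("ranking (best first):", np.argsort(grades)[::-1], grades.round(3))
```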

  16. Optimal Subinterval Selection Approach for Power System Transient Stability Simulation

    Directory of Open Access Journals (Sweden)

    Soobae Kim

    2015-10-01

    Full Text Available Power system transient stability analysis requires an appropriate integration time step to avoid numerical instability as well as to reduce computational demands. For fast system dynamics, which vary more rapidly than the time step can resolve, a fraction of the time step, called a subinterval, is used. However, the optimal value of this subinterval is not easily determined because analysis of the system dynamics might be required. This selection is usually made from engineering experience, and perhaps trial and error. This paper proposes an optimal subinterval selection approach for power system transient stability analysis, which is based on modal analysis using a single machine infinite bus (SMIB) system. Fast system dynamics are identified with the modal analysis, and the SMIB system is used focusing on fast local modes. An appropriate subinterval time step from the proposed approach can reduce the computational burden and achieve accurate simulation responses as well. The performance of the proposed method is demonstrated with the GSO 37-bus system.
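
The core of the proposal (identify the fastest local mode from a linearized SMIB model, then size the subinterval from its period) can be sketched as follows. The machine parameters and the resolve-the-period-with-20-steps rule are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

# Linearized SMIB swing dynamics: M*d2(delta) + D*d(delta) + K*delta = 0,
# with synchronizing coefficient K = Pmax*cos(delta0). Values are illustrative.
M, D = 0.05, 0.4                     # inertia (scaled) and damping
Pmax, delta0 = 1.8, np.deg2rad(35)   # electrical power ceiling, operating angle
K = Pmax * np.cos(delta0)

A = np.array([[0.0, 1.0],
              [-K / M, -D / M]])     # state matrix for x = [d_delta, d_omega]
eigs = np.linalg.eigvals(A)
omega_max = np.abs(eigs.imag).max()            # fastest oscillatory mode (rad/s)
period = 2 * np.pi / omega_max

# Rule of thumb assumed here: resolve the fastest period with ~20 subintervals.
dt_sub = period / 20
print(f"fastest mode: {omega_max:.2f} rad/s, period {period:.3f} s, "
      f"suggested subinterval {dt_sub*1000:.1f} ms")
```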

  17. Comparison of Resource Platform Selection Approaches for Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Ramakrishnan, Lavanya

    2010-03-05

    Cloud computing is increasingly considered as an additional computational resource platform for scientific workflows. The cloud offers opportunity to scale-out applications from desktops and local cluster resources. At the same time, it can eliminate the challenges of restricted software environments and queue delays in shared high performance computing environments. Choosing from these diverse resource platforms for a workflow execution poses a challenge for many scientists. Scientists are often faced with deciding resource platform selection trade-offs with limited information on the actual workflows. While many workflow planning methods have explored task scheduling onto different resources, these methods often require fine-scale characterization of the workflow that is onerous for a scientist. In this position paper, we describe our early exploratory work into using blackbox characteristics to do a cost-benefit analysis of using cloud platforms. We use only very limited high-level information on the workflow length, width, and data sizes. The length and width are indicative of the workflow duration and parallelism. The data size characterizes the IO requirements. We compare the effectiveness of this approach to other resource selection models using two exemplar scientific workflows scheduled on desktops, local clusters, HPC centers, and clouds. Early results suggest that the blackbox model often makes the same resource selections as a more fine-grained whitebox model. We believe the simplicity of the blackbox model can help inform a scientist on the applicability of cloud computing resources even before porting an existing workflow.
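
A blackbox comparison of this kind needs only a few numbers per platform and per workflow. The toy cost model below, with entirely invented rates and platform characteristics, picks the cheapest platform that meets a deadline from just the workflow length (critical path hours), width (parallel tasks), and data size.

```python
from dataclasses import dataclass

@dataclass
class Platform:
    name: str
    cores: int              # usable parallelism
    core_hour_cost: float   # $ per core-hour (0 for owned resources)
    queue_wait_h: float     # typical scheduling delay
    transfer_s_per_gb: float

def estimate(p, length_h, width, data_gb):
    """Blackbox estimate: runtime from workflow length/width only, IO from data size."""
    runtime_h = length_h * max(1, width / p.cores) + data_gb * p.transfer_s_per_gb / 3600
    makespan_h = p.queue_wait_h + runtime_h
    cost = runtime_h * min(width, p.cores) * p.core_hour_cost
    return makespan_h, cost

platforms = [
    Platform("desktop", 8, 0.0, 0.0, 2.0),
    Platform("local cluster", 128, 0.0, 1.0, 5.0),
    Platform("HPC center", 4096, 0.0, 12.0, 20.0),
    Platform("cloud", 512, 0.09, 0.1, 30.0),
]

# Workflow characterized only by length (critical path), width, and data volume.
length_h, width, data_gb, deadline_h = 3.0, 256, 50.0, 8.0
candidates = [(estimate(p, length_h, width, data_gb), p.name) for p in platforms]
ok = [(cost, mk, name) for (mk, cost), name in candidates if mk <= deadline_h]
print(min(ok) if ok else "no platform meets the deadline")
```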

  18. Creating value in refining

    International Nuclear Information System (INIS)

    Cobb, C.B.

    2001-01-01

    This article focuses on recent developments in the US refining industry and presents a model for improving the performance of refineries based on the analysis of the refining industry by Cap Gemini Ernst and Young. The identification of refineries in risk of failing, the construction of pipelines for refinery products from Gulf State refineries, mergers and acquisitions, and poor financial performance are discussed. Current challenges concerning the stagnant demand for refinery products, environmental regulations, and shareholder value are highlighted. The structure of the industry, the creation of value in refining, and the search for business models are examined. The top 25 US companies and US refining business groups are listed

  19. Fuzzy Axiomatic Design approach based green supplier selection

    DEFF Research Database (Denmark)

    Kannan, Devika; Govindan, Kannan; Rajendran, Sivakumar

    2015-01-01

    Green Supply Chain Management (GSCM) is a developing concept recently utilized by manufacturing firms of all sizes. All industries, small or large, seek improvements in the purchasing of raw materials, manufacturing, allocation, transportation efficiency, in curbing storage time, importing... responsible in addition to being efficiently managed. A significant way to implement responsible GSCM is to reconsider, in innovative ways, the purchase and supply cycle, and a preliminary step would be to ensure that the supplier of goods successfully incorporates green criteria. Therefore, this paper proposes a multi-criteria decision-making (MCDM) approach called Fuzzy Axiomatic Design (FAD) to select the best green supplier for a Singapore-based plastic manufacturing company. At first, the environmental criteria were developed along with the traditional criteria based on the literature review...

  20. Application of the Sampling Selection Technique in Approaching Financial Audit

    Directory of Open Access Journals (Sweden)

    Victor Munteanu

    2018-03-01

    Full Text Available In his professional approach, the financial auditor has a wide range of working techniques at his disposal, including selection techniques. They are applied depending on the nature of the information available to the financial auditor, the manner in which it is presented (paper or electronic format) and, last but not least, the time available. Several techniques are applied, successively or in parallel, to increase the confidence in the expressed opinion and to provide the audit report with a solid basis of information. Sampling is used in the phase of control or clarification of identified errors. Its main purpose is to corroborate or measure the degree of risk detected following a pertinent analysis. Since the auditor has neither the time nor the means to rebuild the information thoroughly, the sampling technique can provide an effective response to the need for valorization.
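
Among the selection techniques available to the auditor, monetary unit sampling is the most mechanical to illustrate: every monetary unit in the population is equally likely to be selected, so large items are proportionally more likely to be tested. A minimal sketch with an invented invoice ledger follows; the systematic-interval variant shown here is one common formulation, not necessarily the one the article has in mind.

```python
import random

def monetary_unit_sample(items, n, seed=42):
    """Systematic monetary unit sampling: items = [(id, amount), ...]."""
    total = sum(amount for _, amount in items)
    interval = total / n
    start = random.Random(seed).uniform(0, interval)
    picks = [start + k * interval for k in range(n)]    # one pick per interval
    cumulative, selected = 0.0, []
    it = iter(picks)
    target = next(it, None)
    for item_id, amount in items:
        cumulative += amount
        while target is not None and target <= cumulative:
            selected.append(item_id)        # large amounts can be hit repeatedly
            target = next(it, None)
    return selected

ledger = [(f"INV-{i:03d}", amt) for i, amt in
          enumerate([120, 4500, 80, 950, 13200, 310, 770, 2600, 45, 1800])]
print(monetary_unit_sample(ledger, n=4))
```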

  1. Analysis of Trust-Based Approaches for Web Service Selection

    DEFF Research Database (Denmark)

    Dragoni, Nicola; Miotto, Nicola

    2011-01-01

    The basic tenet of Service-Oriented Computing (SOC) is the possibility of building distributed applications on the Web by using Web services as fundamental building blocks. The proliferation of such services is considered the second wave of evolution in the Internet age, moving the Web from a collection of pages to a collection of services. Consensus is growing that this Web service revolution won't eventuate until we resolve trust-related issues. Indeed, the intrinsic openness of the SOC vision makes it crucial to locate useful services and recognize them as trustworthy. In this paper we review the field of trust-based Web service selection, providing a structured classification of current approaches and highlighting the main limitations of each class and of the overall field.

  2. Transbasal versus endoscopic endonasal versus combined approaches for olfactory groove meningiomas: importance of approach selection.

    Science.gov (United States)

    Liu, James K; Silva, Nicole A; Sevak, Ilesha A; Eloy, Jean Anderson

    2018-04-01

    OBJECTIVE There has been much debate regarding the optimal surgical approach for resecting olfactory groove meningiomas (OGMs). In this paper, the authors analyzed the factors involved in approach selection and reviewed the surgical outcomes in a series of OGMs. METHODS A retrospective review of 28 consecutive OGMs from a prospective database was conducted. Each tumor was treated via one of 3 approaches: transbasal approach (n = 15), pure endoscopic endonasal approach (EEA; n = 5), and combined (endoscope-assisted) transbasal-EEA (n = 8). RESULTS The mean tumor volume was greatest in the transbasal (92.02 cm 3 ) and combined (101.15 cm 3 ) groups. Both groups had significant lateral dural extension over the orbits (transbasal 73.3%, p 95%) was achieved in 20% of transbasal and 37.5% of combined cases, all due to tumor adherence to the critical neurovascular structures. The rate of CSF leakage was 0% in the transbasal and combined groups, and there was 1 leak in the EEA group (20%), resulting in an overall CSF leakage rate of 3.6%. Olfaction was preserved in 66.7% in the transbasal group. There was no significant difference in length of stay or 30-day readmission rate between the 3 groups. The mean modified Rankin Scale score was 0.79 after the transbasal approach, 2.0 after EEA, and 2.4 after the combined approach (p = 0.0604). The mean follow-up was 14.5 months (range 1-76 months). CONCLUSIONS The transbasal approach provided the best clinical outcomes with the lowest rate of complications for large tumors (> 40 mm) and for smaller tumors (OGMs invading the sinonasal cavity. Careful patient selection using an individualized, tailored strategy is important to optimize surgical outcomes.

  3. Selection of treatment and surgical approach for vestibular schwannomas

    International Nuclear Information System (INIS)

    Eguchi, Kuniki; Yamaguchi, Satoshi; Sakoda, Eiichiro

    2007-01-01

    This article describes the current state of selection between stereotactic radiotherapy and surgical treatment, and their combination, for vestibular schwannomas, together with the authors' policy of surgery as the first-choice treatment. The policy rests on the concept that surgery aids subsequent control of tumors initially larger than 25 mm in diameter, while radiotherapy such as the gamma knife is applicable to residual tissue regrowing after operation and to tumors smaller than the pre-surgical size, since control by the knife is thought to improve as the tumor shrinks (reportedly 100% for tumors of less than 14 mm diameter). The basis for the authors' selection between the two surgical approaches, lateral suboccipital and translabyrinthine, and outcomes such as hearing loss are described and discussed in detail for 24 patients (two of whom underwent radiotherapy before surgery) treated at the authors' hospital between September 2003 and August 2006. Radiotherapy is thought essentially useful for controlling tumors of small or surgically reduced size. (R.T.)

  4. An Ensemble-Based Training Data Refinement for Automatic Crop Discrimination Using WorldView-2 Imagery

    DEFF Research Database (Denmark)

    Chellasamy, Menaka; Ferre, Ty Paul; Greve, Mogens Humlekrog

    2015-01-01

    This paper presents a new approach for refining and selecting training data for satellite imagery-based crop discrimination. The goal of this approach is to automate the pixel-based “multievidence crop classification approach,” proposed by the authors in their previous research. The present study...

  5. Using a modified intervention mapping approach to develop and refine a single-session motivational intervention for methamphetamine-using men who have sex with men.

    Science.gov (United States)

    Zule, William A; Coomes, Curtis M; Karg, Rhonda; Harris, Jennie L; Orr, Alex; Wechsberg, Wendee M

    2010-05-14

    There is an ongoing need for the development and adaptation of behavioral interventions to address behaviors related to acquisition and transmission of infectious diseases and for preventing the onset of chronic diseases. This paper describes the application of an established systematic approach to the development of a behavioral intervention to reduce sexual risk behaviors for HIV among men who have sex with men and who use methamphetamine. The approach includes six steps: (1) a needs assessment; (2) preparing matrices of proximal program objectives; (3) selecting theory-based methods and practical strategies; (4) producing program components and materials; (5) planning for program adoption, implementation, and sustainability; and (6) planning for evaluation. The focus of this article is on the intervention development process; therefore the article does not describe steps 5 and 6. Overall the process worked well, although it had to be adapted to fit the sequence of events associated with a funded research project. This project demonstrates that systematic approaches to intervention development can be applied even in research projects where some of the steps occur during the proposal writing process rather than during the actual project. However, intervention developers must remain flexible and be prepared to adapt the process to the situation. This includes being ready to make choices regarding intervention efficacy versus feasibility and being willing to select the best intervention that is likely to be delivered with available resources rather than an ideal intervention that may not be practical.

  6. Estimates of dietary exposure to bisphenol A (BPA) from light metal packaging using food consumption and packaging usage data: a refined deterministic approach and a fully probabilistic (FACET) approach.

    Science.gov (United States)

    Oldring, P K T; Castle, L; O'Mahony, C; Dixon, J

    2014-01-01

    The FACET tool is a probabilistic model to estimate exposure to chemicals in foodstuffs, originating from flavours, additives and food contact materials. This paper demonstrates the use of the FACET tool to estimate exposure to BPA (bisphenol A) from light metal packaging. For exposure to migrants from food packaging, FACET uses industry-supplied data on the occurrence of substances in the packaging, their concentrations and construction of the packaging, which were combined with data from a market research organisation and food consumption data supplied by national database managers. To illustrate the principles, UK packaging data were used together with consumption data from the UK National Diet and Nutrition Survey (NDNS) dietary survey for 19-64 year olds for a refined deterministic verification. The UK data were chosen mainly because the consumption surveys are detailed, data for UK packaging at a detailed level were available and, arguably, the UK population is composed of high consumers of packaged foodstuffs. Exposures were run for each food category that could give rise to BPA from light metal packaging. Consumer loyalty to a particular type of packaging, commonly referred to as packaging loyalty, was set. The BPA extraction levels used for the 15 types of coating chemistries that could release BPA were in the range of 0.00005-0.012 mg dm(-2). The estimates of exposure to BPA using FACET for the total diet were 0.0098 (mean) and 0.0466 (97.5th percentile) mg/person/day, corresponding to 0.00013 (mean) and 0.00059 (97.5th percentile) mg kg(-1) body weight day(-1) for consumers of foods packed in light metal packaging. This is well below the current EFSA (and other recognised bodies) TDI of 0.05 mg kg(-1) body weight day(-1). These probabilistic estimates were compared with estimates using a refined deterministic approach drawing on the same input data. The results from FACET for the mean, 95th and 97.5th percentile exposures to BPA lay between the
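
    As a quick plausibility check on the figures quoted above, the per-person estimates can be converted to per-kilogram doses and compared against the TDI. The sketch below assumes an average adult body weight of about 78 kg, which approximately reproduces the per-kg values in the abstract (the FACET tool itself uses individual body weights from the dietary survey); only the exposure values and the TDI are taken from the abstract.

```python
# Back-of-envelope check of the exposure figures quoted above.
# Body weight is an assumed average; FACET uses per-individual weights.

TDI = 0.05  # EFSA tolerable daily intake, mg per kg body weight per day
body_weight_kg = 78.0  # assumed average adult body weight

exposures_mg_per_person_day = {"mean": 0.0098, "97.5th percentile": 0.0466}

for label, exposure in exposures_mg_per_person_day.items():
    per_kg = exposure / body_weight_kg
    print(f"{label}: {per_kg:.5f} mg/kg bw/day "
          f"({100 * per_kg / TDI:.2f}% of the TDI)")
```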

  7. A multivariate-utility approach for selection of energy sources

    International Nuclear Information System (INIS)

    Ahmed, S; Husseiny, A.A.

    1978-01-01

    A deterministic approach is devised to compare the safety features of various energy sources. The approach is based on multiattribute utility theory. The method is used in evaluating the safety aspects of alternative energy sources used for the production of electrical energy. Four alternative energy sources are chosen which could be considered for the production of electricity to meet the national energy demand. These are nuclear, coal, solar, and geothermal energy. For simplicity, a total electrical system is considered in each case. A computer code is developed to evaluate the overall utility function for each alternative from the utility patterns corresponding to 23 energy attributes, mostly related to safety. The model can accommodate other attributes assuming that these are independent. The technique is kept flexible so that virtually any decision problem with various attributes can be attacked and optimal decisions can be reached. The selected data resulted in preference of geothermal and nuclear energy over other sources, and the method is found viable in making decisions on energy uses based on quantified and subjective attributes. (author)
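
    The abstract describes an additive multiattribute utility evaluation under an independence assumption. The sketch below shows the general shape of such a calculation; the four attributes, their weights, and the single-attribute utilities are illustrative placeholders, not the 23 attributes or elicited values of the study.

```python
# Minimal sketch of an additive multiattribute utility comparison.
# All weights and single-attribute utilities below are hypothetical.

weights = {"public_risk": 0.4, "worker_risk": 0.2,
           "env_impact": 0.2, "resource_use": 0.2}

# Single-attribute utilities on [0, 1] (1 = best); illustrative numbers only.
alternatives = {
    "nuclear":    {"public_risk": 0.8, "worker_risk": 0.7, "env_impact": 0.8, "resource_use": 0.9},
    "coal":       {"public_risk": 0.4, "worker_risk": 0.3, "env_impact": 0.3, "resource_use": 0.6},
    "solar":      {"public_risk": 0.9, "worker_risk": 0.8, "env_impact": 0.7, "resource_use": 0.4},
    "geothermal": {"public_risk": 0.9, "worker_risk": 0.7, "env_impact": 0.8, "resource_use": 0.7},
}

def overall_utility(scores):
    # The additive form assumes independence between attributes,
    # mirroring the independence assumption stated in the abstract.
    return sum(weights[a] * u for a, u in scores.items())

for name, scores in sorted(alternatives.items(),
                           key=lambda kv: -overall_utility(kv[1])):
    print(f"{name}: {overall_utility(scores):.2f}")
```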

  8. Relational Demonic Fuzzy Refinement

    Directory of Open Access Journals (Sweden)

    Fairouz Tchier

    2014-01-01

    Full Text Available We use relational algebra to define a refinement fuzzy order called demonic fuzzy refinement and also the associated fuzzy operators, which are fuzzy demonic join (⊔fuz), fuzzy demonic meet (⊓fuz), and fuzzy demonic composition (□fuz). Our definitions and properties are illustrated by some examples using Mathematica software (fuzzy logic).
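
    For readers unfamiliar with demonic refinement, the crisp (non-fuzzy) order that this work generalizes is commonly formulated in relational algebra as follows; this is one standard textbook formulation, not the paper's fuzzy definition, which replaces relations, domains, and inclusion by their fuzzy counterparts.

```latex
% One common formulation of crisp demonic refinement: Q refines the
% specification P (written P \sqsubseteq Q) iff Q is defined wherever P
% is, and on P's domain produces only outputs allowed by P.
% Here dom(P) \lhd Q denotes Q restricted to the domain of P.
P \sqsubseteq Q \;\iff\;
  \operatorname{dom}(P) \subseteq \operatorname{dom}(Q)
  \;\wedge\;
  \operatorname{dom}(P) \lhd Q \subseteq P
```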

  9. An approach to selecting routes over which to transport excess salt from the Deaf Smith County Site

    International Nuclear Information System (INIS)

    1987-09-01

    This report presents an approach to be utilized in the identification of rail and/or highway routes for the disposal of waste salt and other salt contaminated material from repository construction. Relevant issues regarding salt transport also are identified. The report identifies a sequence of activities that precede actual route selection, i.e., final selection of a salt disposal method and its location, refined estimates of salt shipment volume and schedule, followed by selection of rail or truck or a combination thereof, as the preferred transport mode. After these factors are known, the route selection process can proceed. Chapter 2.0 of this report identifies directives and requirements that potentially could affect salt transport from the Deaf Smith site. A summary of salt disposal alternatives and reference cases is contained in Chapter 3.0. Chapter 4.0 identifies and discusses current methods of salt handling and transport in the United States, and also provides some perspective as to the volume of excess salt to be transported from the Deaf Smith site relative to current industry practices. Chapter 5.0 identifies an approach to the salt transportation issue, and suggests one system for evaluating alternative highway routes for truck shipments

  10. Characterization and process development for the selective removal of Sn, Sb, and As from anode slime obtained from electrolytic copper refining

    Directory of Open Access Journals (Sweden)

    Steinlechner S.

    2018-01-01

    Full Text Available The aim of this work was to develop a process for the removal of Sn, Sb, and As from anode slime arising from copper refining, in order to disburden subsequent pyrometallurgical processing for precious-metal refinement. For this reason, a detailed literature survey was conducted, followed by a characterization to identify the compounds/alloys present and their morphology. A newly developed process concept for the separate extraction of the aforementioned three target metals was developed and verified by leaching experiments, combined with thermodynamic calculations of their behavior under varying conditions. In this context, the influence of leaching temperature, alkalinity of the leaching solution, and solid-to-liquid ratio on the extraction yields of Sn, As, and Sb was evaluated, as well as how to exploit these findings to obtain separate streams enriched in the respective metals.

  11. Practical common weight maximin approach for technology selection

    African Journals Online (AJOL)

    2014-06-30

    Keywords: Technology selection, Robot selection, Maximin, data envelopment analysis.

  12. Differentiation of Bread Made with Whole Grain and Refined Wheat (T. aestivum) Flour Using LC/MS-based chromatographic Fingerprinting and Chemometric Approaches

    Science.gov (United States)

    A fuzzy chromatography mass spectrometric (FCMS) fingerprinting method combined with chemometric analysis was established to differentiate between whole wheat (WW) flours and refined wheat (RW) flour, and the breads made from them. The chemical compositions of the bread samples were profiled using h...

  13. Unearthing how, why, for whom and under what health system conditions the antiretroviral treatment adherence club intervention in South Africa works: A realist theory refining approach.

    Science.gov (United States)

    Mukumbang, Ferdinand C; Marchal, Bruno; Van Belle, Sara; van Wyk, Brian

    2018-05-09

    Poor retention in care and suboptimal adherence to antiretroviral treatment (ART) undermine its successful rollout in South Africa. The adherence club intervention was designed as an adherence-enhancing intervention to enhance the retention in care of patients on ART and their adherence to medication. Although empirical evidence suggests the effective superiority of the adherence club intervention to standard clinic ART care schemes, it is poorly understood exactly how and why it works, and under what health system contexts. To this end, we aimed to develop a refined programme theory explicating how, why, for whom and under what health system contexts the adherence club intervention works (or not). We undertook a realist evaluation study to uncover the programme theory of the adherence club intervention. We elicited an initial programme theory of the adherence club intervention and tested the initial programme theory in three contrastive sites. Using a cross-case analysis approach, we delineated the conceptualisation of the intervention, context, actor and mechanism components of the three contrastive cases to explain the outcomes of the adherence club intervention, guided by retroductive inferencing. We found that an intervention that groups clinically stable patients on ART in a convenient space to receive a quick and uninterrupted supply of medication, health talks, counselling, and immediate access to a clinician when required works because patients' self-efficacy improves and they become motivated and nudged to remain in care and adhere to medication. The successful implementation and rollout of the adherence club intervention are contingent on the separation of the adherence club programme from other patients who are HIV-negative. In addition, there should be available convenient space for the adherence club meetings, continuous support of the adherence club facilitators by clinicians and buy-in from the health workers at the health-care facility and the

  14. The economics of window selection: An incremental approach

    International Nuclear Information System (INIS)

    Dixon, W.T.

    1993-01-01

    The options available to Energy Service Companies when improving the energy performance of an existing building are often driven by short-term payback cycles. The value of a measure is based on how quickly it pays for itself. The more quickly the energy savings created by the measure exceed the cost of purchasing and installing the measure, the more comfortable the engineer feels recommending that improvement. In the best cases, the short-term approach will quickly retire the debts associated with a particular retrofit and provide a dependable, albeit limited, net savings stream for the property owner. The engineer has obtained energy savings for his client. The problem with this short-term approach is that it automatically eliminates other conservation measures which, over longer time horizons, could add far more value for the customer. The installation of new, extremely energy-efficient replacement windows is a case in point. During preliminary discussions with our clients (typically Public Housing Authorities or owners of subsidized, multi-family housing), the conversation eventually turns to the issue of replacement windows. The perception is that new windows are a luxury. The decision to install new windows is driven by maintenance costs and, in some cases, resident complaints over operability or draftiness associated with the existing windows. Typically the windows are not handled as part of the mainstream energy conservation program. If the client has already installed new windows, he probably based his selection on the low bidder for a unit with marginal thermal performance. Every property has a budget, and compromises must often be made to meet budgets. The purchaser may not have gotten the Cadillac of windows, but at least he got a good deal on the window that he did buy. His maintenance problems have been solved for the near term and resident complaints have gone down, for now

  15. Computational Approach for Epitaxial Polymorph Stabilization through Substrate Selection

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Hong; Dwaraknath, Shyam S.; Garten, Lauren; Ndione, Paul; Ginley, David; Persson, Kristin A.

    2016-05-25

    With the ultimate goal of finding new polymorphs through targeted synthesis conditions and techniques, we outline a computational framework to select optimal substrates for epitaxial growth using first-principles calculations of formation energies, elastic strain energy, and topological information. To demonstrate the approach, we study the stabilization of metastable VO2 compounds, which provide a rich chemical and structural polymorph space. We find that common polymorph statistics, lattice matching, and energy-above-hull considerations recommend homostructural growth on TiO2 substrates, where the VO2 brookite phase would be preferentially grown on the a-c TiO2 brookite plane while the columbite and anatase structures favor the a-b plane on the respective TiO2 phases. Overall, we find that a model which incorporates a geometric unit-cell area matching between the substrate and the target film, as well as the resulting strain energy density of the film, provides qualitative agreement with experimental observations for the heterostructural growth of known VO2 polymorphs: rutile, A and B phases. The minimal interfacial geometry matching and estimated strain energy criteria provide several suggestions for substrates and substrate-film orientations for the heterostructural growth of the hitherto hypothetical anatase, brookite, and columbite polymorphs. These criteria serve as preliminary guidance for experimental efforts at stabilizing new materials and/or polymorphs through epitaxy. The screening algorithm is being integrated within the Materials Project online framework, and the data are hence publicly available.
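
    The two screening quantities named in the abstract, geometric unit-cell area matching and film strain energy density, can be sketched in a few lines. The formulas below use an isotropic biaxial-film approximation (u = E·ε²/(1−ν)) with made-up lattice and elastic parameters; the study itself computes these quantities from first-principles data and the full elastic tensor.

```python
# Sketch of the two screening quantities described above. Numbers are
# illustrative placeholders, not values from the paper.

def area_mismatch(a_film, b_film, a_sub, b_sub):
    """Relative mismatch of the in-plane unit-cell areas (dimensionless)."""
    film_area, sub_area = a_film * b_film, a_sub * b_sub
    return abs(film_area - sub_area) / film_area

def biaxial_strain_energy_density(a_film, a_sub, E, nu):
    """Energy per volume of an isotropically, biaxially strained film.

    Uses u = M * eps**2 with biaxial modulus M = E / (1 - nu), an
    isotropic approximation to the full elastic tensor.
    """
    eps = (a_sub - a_film) / a_film  # in-plane misfit strain
    return (E / (1.0 - nu)) * eps**2

# Hypothetical film/substrate lattice parameters (angstroms) and moduli (GPa).
print(area_mismatch(4.59, 2.96, 4.65, 2.96))
print(biaxial_strain_energy_density(4.59, 4.65, E=140.0, nu=0.3))  # GPa
```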

  16. Agricultural Tractor Selection: A Hybrid and Multi-Attribute Approach

    Directory of Open Access Journals (Sweden)

    Jorge L. García-Alcaraz

    2016-02-01

    Full Text Available Usually, agricultural tractor investments are assessed using traditional economic techniques that only involve financial attributes, resulting in reductionist evaluations. However, tractors have qualitative and quantitative attributes that must be simultaneously integrated into the evaluation process. This article reports a hybrid and multi-attribute approach to assessing a set of agricultural tractors based on AHP-TOPSIS. To identify the attributes in the model, a survey including eighteen attributes was given to agricultural machinery salesmen and farmers for determining their importance. The list of attributes was presented to a decision group for a case of study, and their importance was estimated using AHP and integrated into the TOPSIS technique. In this case, one tractor was selected from a set of six alternatives, integrating six attributes in the model: initial cost, annual maintenance cost, liters of diesel per hour, safety of the operator, maintainability and after-sale customer service offered by the supplier. Based on the results obtained, the model can be considered easy to apply and to have good acceptance among farmers and salesmen, as there are no special software requirements for the application.
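
    To make the AHP-TOPSIS pipeline concrete, the sketch below performs the TOPSIS ranking step once AHP has produced attribute weights. The decision matrix, weights, and benefit/cost designations are hypothetical stand-ins, not the tractor data from the study.

```python
import numpy as np

# Minimal TOPSIS ranking step, applied after AHP has produced weights.
# rows = tractor alternatives, cols = attributes (illustrative data).
X = np.array([
    [52000, 1200,  9.5, 8, 7, 9],
    [48000, 1500, 10.2, 7, 6, 7],
    [61000, 1000,  8.8, 9, 8, 8],
], dtype=float)
weights = np.array([0.25, 0.15, 0.15, 0.20, 0.10, 0.15])  # assumed AHP output
benefit = np.array([False, False, False, True, True, True])  # cost vs benefit

R = X / np.linalg.norm(X, axis=0)          # vector-normalize each column
V = R * weights                            # weighted normalized matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
d_pos = np.linalg.norm(V - ideal, axis=1)  # distance to ideal solution
d_neg = np.linalg.norm(V - anti, axis=1)   # distance to anti-ideal solution
closeness = d_neg / (d_pos + d_neg)        # higher = closer to ideal
print("ranking (best first):", np.argsort(-closeness))
```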

  17. Refining margins and prospects

    International Nuclear Information System (INIS)

    Baudouin, C.; Favennec, J.P.

    1997-01-01

    Refining margins throughout the world have remained low in 1996. In Europe, in spite of an improvement, particularly during the last few weeks, they are still not high enough to finance new investments. Although the demand for petroleum products is increasing, experts are still sceptical about any rapid recovery due to prevailing overcapacity and to continuing capacity growth. After a historical review of margins and an analysis of margins by region, we analyse refining overcapacities in Europe and the imbalances between production and demand. Then we discuss the current situation concerning barriers to rationalization, agreements between oil companies, and the consequences for the future of refining capacities and margins. (author)

  18. North American refining

    International Nuclear Information System (INIS)

    Osten, James; Haltmaier, Susan

    2000-01-01

    This article examines the current status of the North American refining industry, and considers the North American economy and the growth in demand in the petroleum industry, petroleum product demand and quality, crude oil upgrading to meet product standards, and changes in crude oil feedstocks such as the use of heavier crudes and bitumens. Refining expansion, the declining profits in refining, and changes due to environmental standards are discussed. The Gross Domestic Product and oil demand for the USA, Canada, Mexico, and Venezuela for the years 1995-2020 are tabulated

  19. International market selection and subsidiary performance : A neural network approach

    NARCIS (Netherlands)

    Brouthers, L.E.; Wilkinson, T.; Mukhopadhyay, S.; Brouthers, K.D.

    2009-01-01

    How should multinational enterprises (MNEs) select international markets? We develop a model of international market selection that adds firm-specific advantages and transaction cost considerations to previously explored target market factors based on Dunning's Eclectic Framework. Results obtained

  20. Blind Measurement Selection: A Random Matrix Theory Approach

    KAUST Repository

    Elkhalil, Khalil; Kammoun, Abla; Al-Naffouri, Tareq Y.; Alouini, Mohamed-Slim

    2016-01-01

    ...implemented in blind and channel-aware fashions. We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also...

  1. Grain refinement of zinc-aluminium alloys

    International Nuclear Information System (INIS)

    Zaid, A.I.O.

    2006-01-01

    It is now well established that the structure of the zinc-aluminum die casting alloys can be modified by the binary Al-Ti or the ternary Al-Ti-B master alloys. In this paper, grain refinement of zinc-aluminum alloys by rare earth materials is reviewed and discussed. The importance of grain refining of these alloys and the parameters affecting it are presented and discussed. These include parameters related to the Zn-Al alloy cast, parameters related to the grain refining elements or alloys, and parameters related to the process. The effect of the addition of other alloying elements, e.g. Zr, either alone or in the presence of the main grain refiners Ti or Ti + B, on the grain refining efficiency is also reviewed and discussed. Furthermore, based on the grain refinement and the parameters affecting it, a criterion for selection of the optimum grain refiner is suggested. Finally, recent research work on the effect of grain refiners on the mechanical behaviour, impact strength, wear resistance, and fatigue life of these alloys is presented and discussed. (author)

  2. Linearly Refined Session Types

    Directory of Open Access Journals (Sweden)

    Pedro Baltazar

    2012-11-01

    Full Text Available Session types capture precise protocol structure in concurrent programming, but do not specify properties of the exchanged values beyond their basic type. Refinement types are a form of dependent types that can address this limitation, combining types with logical formulae that may refer to program values and can constrain types using arbitrary predicates. We present a pi calculus with assume and assert operations, typed using a session discipline that incorporates refinement formulae written in a fragment of Multiplicative Linear Logic. Our original combination of session and refinement types, together with the well established benefits of linearity, allows very fine-grained specifications of communication protocols in which refinement formulae are treated as logical resources rather than persistent truths.

  3. Relational Demonic Fuzzy Refinement

    OpenAIRE

    Tchier, Fairouz

    2014-01-01

    We use relational algebra to define a refinement fuzzy order called demonic fuzzy refinement and also the associated fuzzy operators, which are fuzzy demonic join $(\sqcup_{\mathrm{fuz}})$, fuzzy demonic meet $(\sqcap_{\mathrm{fuz}})$, and fuzzy demonic composition $(\Box_{\mathrm{fuz}})$. Our definitions and properties are illustrated by some examples using ma...

  4. Refining Automatically Extracted Knowledge Bases Using Crowdsourcing

    Directory of Open Access Journals (Sweden)

    Chunhua Li

    2017-01-01

    Full Text Available Machine-constructed knowledge bases often contain noisy and inaccurate facts. There exists significant work in developing automated algorithms for knowledge base refinement. Automated approaches improve the quality of knowledge bases but are far from perfect. In this paper, we leverage crowdsourcing to improve the quality of automatically extracted knowledge bases. As human labelling is costly, an important research challenge is how we can use limited human resources to maximize the quality improvement for a knowledge base. To address this problem, we first introduce a concept of semantic constraints that can be used to detect potential errors and do inference among candidate facts. Then, based on semantic constraints, we propose rank-based and graph-based algorithms for crowdsourced knowledge refining, which judiciously select the most beneficial candidate facts to conduct crowdsourcing and prune unnecessary questions. Our experiments show that our method improves the quality of knowledge bases significantly and outperforms state-of-the-art automatic methods under a reasonable crowdsourcing cost.

  5. Refining Automatically Extracted Knowledge Bases Using Crowdsourcing.

    Science.gov (United States)

    Li, Chunhua; Zhao, Pengpeng; Sheng, Victor S; Xian, Xuefeng; Wu, Jian; Cui, Zhiming

    2017-01-01

    Machine-constructed knowledge bases often contain noisy and inaccurate facts. There exists significant work in developing automated algorithms for knowledge base refinement. Automated approaches improve the quality of knowledge bases but are far from perfect. In this paper, we leverage crowdsourcing to improve the quality of automatically extracted knowledge bases. As human labelling is costly, an important research challenge is how we can use limited human resources to maximize the quality improvement for a knowledge base. To address this problem, we first introduce a concept of semantic constraints that can be used to detect potential errors and do inference among candidate facts. Then, based on semantic constraints, we propose rank-based and graph-based algorithms for crowdsourced knowledge refining, which judiciously select the most beneficial candidate facts to conduct crowdsourcing and prune unnecessary questions. Our experiments show that our method improves the quality of knowledge bases significantly and outperforms state-of-the-art automatic methods under a reasonable crowdsourcing cost.

  6. Petroleum refining fitness assessment to the sectoral approaches to address climate change; Analise da aptidao do setor refino de petroleo as abordagens setoriais para lidar com as mudancas climaticas globais

    Energy Technology Data Exchange (ETDEWEB)

    Merschmann, Paulo Roberto de Campos

    2010-03-15

    The climate agreement that will take effect from 2013 onwards needs to address some of the concerns that were not considered in the Kyoto Protocol. Such concerns include the absence of emission targets for big-emitter developing countries and the impacts of uneven carbon policies on the competitiveness of Annex 1 energy-intensive sectors. Sectoral approaches for energy-intensive sectors can be a solution to both concerns, mainly if they address climate change issues involving all the countries in which these sectors have a significant presence. A sector is a good candidate for sectoral approaches if it has certain characteristics: a high impact on the competitiveness of Annex 1 enterprises arising from the absence of commitments by enterprises located in non-Annex 1 countries, a high level of opportunity to mitigate GHG emissions through the application of sectoral approaches, and ease of implementing sectoral approaches in the sector. This work then assesses the petroleum refining sector's fitness for sectoral approaches to address climate change. The dissertation also compares the petroleum refining sector's characteristics to those of sectors well suited to sectoral approaches. (author)

  7. Applying Fuzzy Decision Making Approach to IT Outsourcing Supplier Selection

    OpenAIRE

    Gülcin Büyüközkan; Mehmet Sakir Ersoy

    2009-01-01

    The decision of information technology (IT) outsourcing requires close attention to the evaluation of supplier selection process because the selection decision involves conflicting multiple criteria and is replete with complex decision making problems. Selecting the most appropriate suppliers is considered an important strategic decision that may impact the performance of outsourcing engagements. The objective of this paper is to aid decision makers to evaluate and assess possible IT outsourc...

  8. Selection bias in species distribution models: An econometric approach on forest trees based on structural modeling

    Science.gov (United States)

    Martin-StPaul, N. K.; Ay, J. S.; Guillemot, J.; Doyen, L.; Leadley, P.

    2014-12-01

    Species distribution models (SDMs) are widely used to study and predict the outcome of global changes on species. In human-dominated ecosystems the presence of a given species is the result of both its ecological suitability and the human footprint on nature, such as land use choices. Land use choices may thus be responsible for a selection bias in the presence/absence data used in SDM calibration. We present a structural modelling approach (i.e. based on structural equation modelling) that accounts for this selection bias. The new structural species distribution model (SSDM) estimates simultaneously land use choices and species responses to bioclimatic variables. A land use equation based on an econometric model of landowner choices was joined to an equation of species response to bioclimatic variables. SSDM allows the residuals of both equations to be dependent, taking into account the possibility of shared omitted variables and measurement errors. We provide a general description of the statistical theory and a set of applications on forest trees over France using databases of climate and forest inventory at different spatial resolutions (from 2 km to 8 km). We also compared the outputs of the SSDM with outputs of a classical SDM (i.e. Biomod ensemble modelling) in terms of bioclimatic response curves and potential distributions under current climate and climate change scenarios. The shapes of the bioclimatic response curves and the modelled species distribution maps differed markedly between the SSDM and classical SDMs, with contrasting patterns according to species and spatial resolutions. The magnitude and direction of these differences were dependent on the correlations between the errors from both equations and were highest for higher spatial resolutions. A first conclusion is that the use of classical SDMs can potentially lead to strong misestimation of the actual and future probability of presence modelled. Beyond this selection bias, the SSDM we propose represents
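
    The selection-bias mechanism the SSDM is built to correct can be illustrated with a short simulation: a land-use equation and a suitability equation share correlated errors, and observed presence confounds suitability with the land-use choice. The sketch below only demonstrates the bias a classical SDM would incur; it is not the authors' structural estimator, and all coefficients are made up.

```python
import numpy as np

# Toy simulation of the two-equation structure described above.
rng = np.random.default_rng(0)
n = 50_000
climate = rng.normal(size=n)   # bioclimatic covariate
soil = rng.normal(size=n)      # land-use covariate

# Correlated disturbances (shared omitted variables / measurement error).
rho = 0.6
u, e = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n).T

forest_kept = (0.5 * soil + u) > 0   # landowner keeps the land forested
suitable = (1.0 * climate + e) > 0   # true ecological suitability
presence = forest_kept & suitable    # species observable only on forest land

# A classical SDM fitted to presence/absence data mixes the land-use
# process into the estimated climate response.
true_corr = np.corrcoef(climate, suitable.astype(float))[0, 1]
naive_corr = np.corrcoef(climate, presence.astype(float))[0, 1]
print(f"corr(climate, true suitability):  {true_corr:.3f}")
print(f"corr(climate, observed presence): {naive_corr:.3f}")
```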

  9. Clustering based gene expression feature selection method: A computational approach to enrich the classifier efficiency of differentially expressed genes

    KAUST Repository

    Abusamra, Heba

    2016-07-20

    The native high-dimension, low-sample-size nature of gene expression data makes the classification task challenging. Therefore, feature (gene) selection becomes an apparent need. Selecting meaningful and relevant genes for a classifier not only decreases the computational time and cost, but also improves the classification performance. Among the different feature selection approaches, however, most suffer from several problems such as lack of robustness, validation issues, etc. Here, we present a new feature selection technique that takes advantage of clustering both samples and genes. Materials and methods: We used a leukemia gene expression dataset [1]. The effectiveness of the selected features was evaluated by four different classification methods: support vector machines, k-nearest neighbor, random forest, and linear discriminant analysis. The method evaluates the importance and relevance of each gene cluster by summing the expression levels of the genes belonging to that cluster. A gene cluster is considered important if it satisfies conditions depending on thresholds and percentages; otherwise it is eliminated. Results: Initial analysis identified 7120 differentially expressed genes of leukemia (Fig. 15a); after applying our feature selection methodology we ended up with 1117 specific genes discriminating two classes of leukemia (Fig. 15b). Further applying the same method with more stringent higher positive and lower negative threshold conditions reduced the number to 58 genes, which were tested to evaluate the effectiveness of the method (Fig. 15c). The results of the four classification methods are summarized in Table 11. Conclusions: The feature selection method gave good results with minimum classification error. Our heat-map result shows a distinct pattern of refined genes discriminating between the two classes of leukemia.
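
    A minimal sketch of the cluster-level gene filtering described above might look as follows: genes are clustered, each cluster is scored by its summed expression difference between the two classes, and only clusters passing thresholds are retained. The cluster count, thresholds, and random data are placeholders, not the study's settings.

```python
import numpy as np
from sklearn.cluster import KMeans

# Sketch of cluster-based gene filtering; all settings are assumed.
rng = np.random.default_rng(1)
X = rng.normal(size=(72, 7120))   # samples x genes (leukemia-sized shape)
y = rng.integers(0, 2, size=72)   # two leukemia classes

k = 50
# Cluster genes by their expression profiles across samples.
clusters = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X.T)

selected_genes = []
pos_thr, neg_thr = 50.0, -50.0    # assumed importance thresholds
for c in range(k):
    genes = np.where(clusters == c)[0]
    # Summed expression of the cluster, compared between the two classes.
    score = X[y == 1][:, genes].sum() - X[y == 0][:, genes].sum()
    if score > pos_thr or score < neg_thr:   # cluster deemed important
        selected_genes.extend(genes.tolist())

print(f"kept {len(selected_genes)} of {X.shape[1]} genes")
```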

  10. A Framework for Six Sigma Project Selection in Higher Educational Institutions, Using a Weighted Scorecard Approach

    Science.gov (United States)

    Holmes, Monica C.; Jenicke, Lawrence O.; Hempel, Jessica L.

    2015-01-01

    Purpose: This paper discusses the importance of the Six Sigma selection process, describes a Six Sigma project in a higher educational institution and presents a weighted scorecard approach for project selection. Design/Methodology/Approach: A case study of the Six Sigma approach being used to improve student support at a university computer help…

  11. Refining the Results of a Classical SELEX Experiment by Expanding the Sequence Data Set of an Aptamer Pool Selected for Protein A

    OpenAIRE

    Regina Stoltenburg; Beate Strehlitz

    2018-01-01

    New, as yet undiscovered aptamers for Protein A were identified by applying next generation sequencing (NGS) to a previously selected aptamer pool. This pool was obtained in a classical SELEX (Systematic Evolution of Ligands by EXponential enrichment) experiment using the FluMag-SELEX procedure followed by cloning and Sanger sequencing. PA#2/8 was identified as the only Protein A-binding aptamer from the Sanger sequence pool, and was shown to be able to bind intact cells of Staphylococcus aur...

  12. A fuzzy hybrid approach for project manager selection

    Directory of Open Access Journals (Sweden)

    Ahmad Jafarnejad Chaghooshi

    2016-09-01

    Full Text Available A suitable project manager has a significant impact on the successful accomplishment of a project. Managers should possess such skills in order to effectively cope with the competition. In this respect, selecting managers based on their skills can lead to a competitive advantage towards the achievement of organizational goals. Selection of a suitable project manager can be viewed as a multi-criteria decision making (MCDM) problem, and an extensive evaluation of criteria, such as technical skills, experience, and personal qualities, must be considered in the selection process. Fuzzy set theory and MCDM methods appear as essential tools to provide a decision framework that incorporates imprecise judgments and the multi-criteria nature inherent in the project manager selection process. This paper proposes the joint use of the fuzzy DEMATEL (FDEMATEL) and fuzzy VIKOR (FVIKOR) methods for the decision-making process of selecting the most suitable managers for projects. First, drawing on the opinions of senior managers and based on the project management competency model (ICB-IPMA), all the criteria required for the selection are gathered. Then the FDEMATEL method is used to prioritize the importance of the various criteria, and FVIKOR is used to rank the alternatives in a preferred order so as to select the best project managers from a number of alternatives. Next, a real case study is used to illustrate the process of the proposed method. Finally, some conclusions are discussed at the end of this study.

  13. Practical Approaches for Detecting Selection in Microbial Genomes.

    Directory of Open Access Journals (Sweden)

    Jessica Hedge

    2016-02-01

    Full Text Available Microbial genome evolution is shaped by a variety of selective pressures. Understanding how these processes occur can help to address important problems in microbiology by explaining observed differences in phenotypes, including virulence and resistance to antibiotics. Greater access to whole-genome sequencing provides microbiologists with the opportunity to perform large-scale analyses of selection in novel settings, such as within individual hosts. This tutorial aims to guide researchers through the fundamentals underpinning popular methods for measuring selection in pathogens. These methods are transferable to a wide variety of organisms, and the exercises provided are designed for researchers with any level of programming experience.

  14. Practical Approaches for Detecting Selection in Microbial Genomes.

    Science.gov (United States)

    Hedge, Jessica; Wilson, Daniel J

    2016-02-01

    Microbial genome evolution is shaped by a variety of selective pressures. Understanding how these processes occur can help to address important problems in microbiology by explaining observed differences in phenotypes, including virulence and resistance to antibiotics. Greater access to whole-genome sequencing provides microbiologists with the opportunity to perform large-scale analyses of selection in novel settings, such as within individual hosts. This tutorial aims to guide researchers through the fundamentals underpinning popular methods for measuring selection in pathogens. These methods are transferable to a wide variety of organisms, and the exercises provided are designed for researchers with any level of programming experience.

  15. Refining margins: recent trends

    International Nuclear Information System (INIS)

    Baudoin, C.; Favennec, J.P.

    1999-01-01

    Despite a business environment that was globally mediocre, due primarily to the Asian crisis and to a mild winter in the northern hemisphere, the signs of improvement noted in the refining activity in 1996 were borne out in 1997. But the situation is not yet satisfactory in this sector: the low return on invested capital and the financing of environmental protection expenditure are giving cause for concern. In 1998, the drop in crude oil prices and the concomitant fall in petroleum product prices were ultimately rather favorable to margins. Two elements tended to put a damper on this relative optimism. First of all, margins continue to be extremely volatile and, secondly, the worsening of the economic and financial crisis observed during the summer made for a sharp decline in margins in all geographic regions, especially Asia. Since the beginning of 1999, refining margins have been weak and utilization rates of refining capacities have decreased. (authors)

  16. Refining and petrochemicals

    Energy Technology Data Exchange (ETDEWEB)

    Constancio, Silva

    2006-07-01

    In 2004, refining margins showed a clear improvement that persisted throughout the first three quarters of 2005. This enabled oil companies to post significantly higher earnings for their refining activity in 2004 compared to 2003, with the results of the first half of 2005 confirming this trend. As for petrochemicals, despite a steady rise in the naphtha price, higher cash margins enabled a turnaround in 2004 as well as a clear improvement in oil company financial performance that should continue in 2005, judging by the net income figures reported for the first half-year. Despite this favorable business environment, capital expenditure in refining and petrochemicals remained at a low level, especially investment in new capacity, but a number of projects are being planned for the next five years. (author)

  17. Refining and petrochemicals

    International Nuclear Information System (INIS)

    Constancio, Silva

    2006-01-01

    In 2004, refining margins showed a clear improvement that persisted throughout the first three quarters of 2005. This enabled oil companies to post significantly higher earnings for their refining activity in 2004 compared to 2003, with the results of the first half of 2005 confirming this trend. As for petrochemicals, despite a steady rise in the naphtha price, higher cash margins enabled a turnaround in 2004 as well as a clear improvement in oil company financial performance that should continue in 2005, judging by the net income figures reported for the first half-year. Despite this favorable business environment, capital expenditure in refining and petrochemicals remained at a low level, especially investment in new capacity, but a number of projects are being planned for the next five years. (author)

  18. Indian refining industry

    International Nuclear Information System (INIS)

    Singh, I.J.

    2002-01-01

    The author discusses the history of the Indian refining industry and ongoing developments under the headings: the present state; refinery configuration; Indian capabilities for refinery projects; and reforms in the refining industry. Tables list India's petroleum refineries, giving location and capacity; new refinery projects together with location and capacity; and expansion projects of Indian petroleum refineries. The Indian refining industry has undergone substantial expansion as well as technological changes over the past years. There has been progressive technology upgrading, energy efficiency, better environmental control and improved capacity utilisation. Major reform processes have been set in motion by the government of India: converting the refining industry from a centrally controlled, public-sector-dominated industry to a delicensed regime in a competitive market economy with the introduction of a liberal exploration policy; dismantling the administered price mechanism; and a 25-year hydrocarbon vision. (UK)

  19. Practical Approaches for Detecting Selection in Microbial Genomes

    OpenAIRE

    Hedge, Jessica; Wilson, Daniel J.

    2016-01-01

    Microbial genome evolution is shaped by a variety of selective pressures. Understanding how these processes occur can help to address important problems in microbiology by explaining observed differences in phenotypes, including virulence and resistance to antibiotics. Greater access to whole-genome sequencing provides microbiologists with the opportunity to perform large-scale analyses of selection in novel settings, such as within individual hosts. This tutorial aims to guide researchers th...

  20. Refining - Panorama 2008

    International Nuclear Information System (INIS)

    2008-01-01

    Investment rallied in 2007, and many distillation and conversion projects likely to reach the industrial stage were announced. With economic growth sustained in 2006 and still pronounced in 2007, oil demand remained strong - especially in emerging countries - and refining margins stayed high. Despite these favorable business conditions, tensions persisted in the refining sector, which has fallen far behind in terms of investing in refinery capacity. It will take renewed efforts over a long period to catch up. Looking at recent events that have affected the economy in many countries (e.g. the sub-prime crisis), prudence remains advisable

  1. A Permutation Approach for Selecting the Penalty Parameter in Penalized Model Selection

    Science.gov (United States)

    Sabourin, Jeremy A; Valdar, William; Nobel, Andrew B

    2015-01-01

    Summary We describe a simple, computationally efficient, permutation-based procedure for selecting the penalty parameter in LASSO penalized regression. The procedure, permutation selection, is intended for applications where variable selection is the primary focus, and can be applied in a variety of structural settings, including that of generalized linear models. We briefly discuss connections between permutation selection and existing theory for the LASSO. In addition, we present a simulation study and an analysis of real biomedical data sets in which permutation selection is compared with selection based on the following: cross-validation (CV), the Bayesian information criterion (BIC), Scaled Sparse Linear Regression, and a selection method based on recently developed testing procedures for the LASSO. PMID:26243050
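
    The core idea can be sketched as follows: under a permuted (null) response no variable should enter the model, so one summarizes, over many permutations, the smallest penalty that yields an empty LASSO fit. This is a simplified reading of the procedure; the exact summary statistic and the GLM extension follow the paper, and the median and simulated data below are assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Sketch of permutation-based penalty selection for the LASSO.
rng = np.random.default_rng(0)
n, p = 200, 500
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = 1.0                       # five truly active variables
y = X @ beta + rng.normal(size=n)

X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize predictors
yc = y - y.mean()                          # center the response

def lambda_null(y_perm):
    # For centered, standardized data and the sklearn objective
    # (1/(2n))||y - Xb||^2 + alpha*||b||_1, the smallest penalty with an
    # all-zero solution is max_j |x_j' y| / n.
    return np.max(np.abs(X.T @ y_perm)) / n

lams = [lambda_null(rng.permutation(yc)) for _ in range(100)]
lam = np.median(lams)                # summary over permutations (assumed)

fit = Lasso(alpha=lam).fit(X, yc)
print(f"penalty = {lam:.3f}, selected {np.sum(fit.coef_ != 0)} variables")
```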

  2. Reactor technology assessment and selection utilizing systems engineering approach

    Science.gov (United States)

    Zolkaffly, Muhammed Zulfakar; Han, Ki-In

    2014-02-01

    The first nuclear power plant (NPP) deployment in a country is a complex process that needs to consider technical, economic and financial aspects along with other aspects like public acceptance. Increased interest in the deployment of new NPPs, both among newcomer countries and those with expanding programs, necessitates the selection of a reactor technology from among the commercially available technologies. This paper reviews the Systems Decision Process (SDP) of systems engineering and applies it to selecting the most appropriate reactor technology for deployment in Malaysia. The integrated qualitative and quantitative analyses employed in the SDP are explored to perform reactor technology assessment and to select the most feasible technology, whose design must also comply with the IAEA standard requirements and other relevant requirements established in this study. A quick Malaysian case study suggests that the country should proceed with PWR (pressurized water reactor) technologies, with a more detailed study to be performed in the future for the selection of the most appropriate reactor technology for Malaysia. The demonstrated technology assessment also proposes an alternative method to systematically and quantitatively select the most appropriate reactor technology.

  3. Knowledge based expert system approach to instrumentation selection (INSEL)

    Directory of Open Access Journals (Sweden)

    S. Barai

    2004-08-01

    Full Text Available The selection of appropriate instrumentation for any structural measurement of a civil engineering structure is a complex task. Recent developments in Artificial Intelligence (AI) can help in an organized use of the experiential knowledge available on instrumentation for laboratory and in-situ measurement. Usually, the instrumentation decision is based on the experience and judgment of experimentalists. The heuristic knowledge available for different types of measurement is domain dependent and the information is scattered in varied knowledge sources. Knowledge engineering techniques can help in capturing this experiential knowledge. This paper demonstrates a prototype knowledge-based system for INstrument SELection (INSEL) assistant, where the experiential knowledge for various structural domains can be captured and utilized for making instrumentation decisions. In particular, this Knowledge Based Expert System (KBES) encodes the heuristics on measurement and demonstrates the instrument selection process with reference to steel bridges. INSEL runs on a microcomputer and uses an INSIGHT 2+ environment.

  4. Panorama 2012 - Refining 2030

    International Nuclear Information System (INIS)

    Marion, Pierre; Saint-Antonin, Valerie

    2011-11-01

    The major uncertainty characterizing the global energy landscape impacts particularly on transport, which remains the virtually-exclusive bastion of the oil industry. The industry must therefore respond to increasing demand for mobility against a background marked by the emergence of alternatives to oil-based fuels and the need to reduce emissions of pollutants and greenhouse gases (GHG). It is in this context that the 'Refining 2030' study conducted by IFP Energies Nouvelles (IFPEN) forecasts what the global supply and demand balance for oil products could be, and highlights the type and geographical location of the refinery investment required. Our study shows that the bulk of the refining investment will be concentrated in the emerging countries (mainly those in Asia), whilst the areas historically strong in refining (Europe and North America) face reductions in capacity. In this context, the drastic reduction in the sulphur specification of bunker oil emerges as a structural issue for European refining, in the same way as increasingly restrictive regulation of refinery CO 2 emissions (quotas/taxation) and the persistent imbalance between gasoline and diesel fuels. (authors)

  5. Selective laser sintering: A qualitative and objective approach

    Science.gov (United States)

    Kumar, Sanjay

    2003-10-01

    This article presents an overview of selective laser sintering (SLS) work as reported in various journals and proceedings. Selective laser sintering was first done mainly on polymers and nylon to create prototypes for audio-visual help and fit-to-form tests. Gradually it was expanded to include metals and alloys to manufacture functional prototypes and develop rapid tooling. The growth gained momentum with the entry of commercial entities such as DTM Corporation and EOS GmbH Electro Optical Systems. Computational modeling has been used to understand the SLS process, optimize the process parameters, and enhance the efficiency of the sintering machine.

  6. An integrated approach to site selection for nuclear power plants

    International Nuclear Information System (INIS)

    Hassan, E.M.A.

    1975-01-01

    A method of analysing and evaluating the large number of factors influencing site selection is proposed, which can interrelate these factors and associated problems in an integrated way and at the same time establish a technique for site evaluation. The objective is to develop an integrated programme that illustrates the complexity and dynamic interrelationships of the various factors to develop an improved understanding of the functions and objectives of siting nuclear power plants and would aim finally at the development of an effective procedure and technique for site evaluation and/or comparative evaluation for making rational site-selection decisions. (author)

  7. Refinement from a control problem to program

    DEFF Research Database (Denmark)

    Schenke, Michael; Ravn, Anders P.

    1996-01-01

    The distinguishing feature of the presented refinement approach is that it links formalisms from a top-level requirements notation down to programs together in a mathematically coherent development trajectory. The approach uses Duration Calculus, a real-time interval logic, to specify requirements...

  8. A compensatory approach to optimal selection with mastery scores

    NARCIS (Netherlands)

    van der Linden, Willem J.; Vos, Hendrik J.

    1994-01-01

    This paper presents some Bayesian theories of simultaneous optimization of decision rules for test-based decisions. Simultaneous decision making arises when an institution has to make a series of selection, placement, or mastery decisions with respect to subjects from a population. An obvious

  9. A case on vendor selection methodology: An integrated approach

    Directory of Open Access Journals (Sweden)

    Nikhil C. Shil

    2009-11-01

    Full Text Available Vendor selection methodology is a highly researched area in the supply chain management literature and a very significant decision taken by supply chain managers, due to technological advances in the manufacturing process. Such research has two basic dimensions: one is related to the identification of variables affecting the performance of the vendors, and the other deals with the methodology to be applied. Most of the research conducted in this area deals with the upfront selection of vendors. However, it is very common to have a list of dedicated vendors due to the development of sophisticated production technologies like just in time (JIT), and lean or agile manufacturing processes where a continuous flow of materials is a requirement. This paper addresses the issue of selecting the optimal vendor from the internal database of a company. Factor analysis, the analytical hierarchy process, and regression analysis are used in an integrated way to supplement the vendor selection process. The methodology presented here is simply a proposal where every possible room for adjustment is available.

  10. Selective Attention and Attention Switching: Towards a Unified Developmental Approach

    Science.gov (United States)

    Hanania, Rima; Smith, Linda B.

    2010-01-01

    We review and relate two literatures on the development of attention in children: one concerning flexible attention switching and the other concerning selective attention. The first is a growing literature on preschool children's performances in an attention-switching task indicating that children become more flexible in their attentional control…

  11. An approach to product selection for insomnia | Smith | South African ...

    African Journals Online (AJOL)

    South African Family Practice, Vol 55, No 5 (2013).

  12. Approaches taken by South African advertisers to select and appoint ...

    African Journals Online (AJOL)

    Pitch and industry guidelines play an important role in awarding advertising agency contracts, but agencies must take into account that not all advertisers will adhere to these guidelines. The exploratory research study on which this article reports provides insight into the appointment process and selection criteria applied ...

  13. A Hybrid Feature Selection Approach for Arabic Documents Classification

    NARCIS (Netherlands)

    Habib, Mena Badieh; Sarhan, Ahmed A. E.; Salem, Abdel-Badeeh M.; Fayed, Zaki T.; Gharib, Tarek F.

    Text Categorization (classification) is the process of classifying documents into a predefined set of categories based on their content. Text categorization algorithms usually represent documents as bags of words and consequently have to deal with huge number of features. Feature selection tries to

  14. Approaches taken by South African advertisers to select and appoint ...

    African Journals Online (AJOL)


  15. Blind Measurement Selection: A Random Matrix Theory Approach

    KAUST Repository

    Elkhalil, Khalil

    2016-12-14

    This paper considers the problem of selecting a set of $k$ measurements from $n$ available sensor observations. The selected measurements should minimize a certain error function assessing the error in estimating a certain $m$ dimensional parameter vector. The exhaustive search inspecting each of the $\binom{n}{k}$ possible choices would require a very high computational complexity and as such is not practical for large $n$ and $k$. Alternative methods with low complexity have recently been investigated but their main drawbacks are that 1) they require perfect knowledge of the measurement matrix and 2) they need to be applied at the pace of change of the measurement matrix. To overcome these issues, we consider the asymptotic regime in which $k$, $n$ and $m$ grow large at the same pace. Tools from random matrix theory are then used to approximate in closed-form the most important error measures that are commonly used. The asymptotic approximations are then leveraged to select properly $k$ measurements exhibiting low values for the asymptotic error measures. Two heuristic algorithms are proposed: the first one merely consists in applying the convex optimization artifice to the asymptotic error measure. The second algorithm is a low-complexity greedy algorithm that attempts to look for a sufficiently good solution for the original minimization problem. The greedy algorithm can be applied to both the exact and the asymptotic error measures and can be thus implemented in blind and channel-aware fashions. We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also presented and sustain the efficiency of the proposed blind methods in reaching the performances of channel-aware algorithms.
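
    The greedy algorithm mentioned in the abstract can be sketched in its channel-aware form using an A-optimality-style error measure, trace((H_S' H_S + reg·I)^{-1}); the paper's blind variants replace this exact measure with closed-form random matrix approximations. The error proxy, regularizer, and data below are illustrative assumptions showing only the greedy loop.

```python
import numpy as np

# Skeleton of channel-aware greedy measurement selection: at each step,
# add the candidate row whose inclusion most reduces an estimation-error
# proxy (here an A-optimality measure).
def greedy_select(H, k, reg=1e-6):
    n, m = H.shape
    selected, remaining = [], list(range(n))
    for _ in range(k):
        best_row, best_err = None, np.inf
        for i in remaining:
            rows = selected + [i]
            G = H[rows].T @ H[rows] + reg * np.eye(m)
            err = np.trace(np.linalg.inv(G))  # error proxy for this subset
            if err < best_err:
                best_row, best_err = i, err
        selected.append(best_row)
        remaining.remove(best_row)
    return selected

rng = np.random.default_rng(0)
H = rng.normal(size=(60, 8))   # n = 60 candidate measurements, m = 8 parameters
print(greedy_select(H, k=12))
```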

  16. An opinion formation based binary optimization approach for feature selection

    Science.gov (United States)

    Hamedmoghadam, Homayoun; Jalili, Mahdi; Yu, Xinghuo

    2018-02-01

    This paper proposes a novel optimization method based on opinion formation in complex network systems. The proposed optimization technique mimics the human-human interaction mechanism based on a mathematical model derived from the social sciences. Our method encodes a subset of selected features as the opinion of an artificial agent and simulates the opinion formation process among a population of agents to solve the feature selection problem. The agents interact using an underlying interaction network structure and come to a consensus in their opinions while finding better solutions to the problem. A number of mechanisms are employed to avoid getting trapped in local minima. We compare the performance of the proposed method with a number of classical population-based optimization methods and a state-of-the-art opinion formation based method. Our experiments on a number of high-dimensional datasets reveal the outperformance of the proposed algorithm over others.
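
    A highly simplified sketch of such an opinion-dynamics binary optimizer follows: each agent's "opinion" is a feature mask, agents copy bits from better-scoring neighbors on a ring network, and occasional mutations prevent premature consensus. The interaction rule, network, fitness function, and all parameters are stand-ins, not the calibrated social-science model used in the paper.

```python
import numpy as np

# Simplified opinion-dynamics binary optimizer for feature selection.
rng = np.random.default_rng(0)
n_agents, n_features, n_iters = 30, 40, 200

def fitness(mask, informative=range(5)):
    # Toy objective: reward informative features, penalize mask size.
    return sum(mask[i] for i in informative) - 0.05 * mask.sum()

opinions = rng.integers(0, 2, size=(n_agents, n_features))
for _ in range(n_iters):
    scores = np.array([fitness(o) for o in opinions])
    for a in range(n_agents):
        neighbor = (a + 1) % n_agents      # ring interaction network
        if scores[neighbor] > scores[a]:
            # Adopt a random subset of the better neighbor's opinion bits.
            idx = rng.random(n_features) < 0.3
            opinions[a, idx] = opinions[neighbor, idx]
        if rng.random() < 0.05:            # occasional mutation
            flip = rng.integers(n_features)
            opinions[a, flip] ^= 1

best = opinions[np.argmax([fitness(o) for o in opinions])]
print("selected features:", np.flatnonzero(best))
```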

  17. Selective Attention and Attention Switching: Toward a Unified Developmental Approach

    OpenAIRE

    Hanania, Rima; Smith, Linda B.

    2010-01-01

    We review and relate two literatures on the development of attention in children: one concerning flexible attention switching and the other concerning selective attention. The first is a growing literature on preschool children’s performances in an attention switching task indicating that children become more flexible in their attentional control during the preschool years. The second literature encompasses a large and robust set of phenomena for the same developmental period that indicate a ...

  18. Sensory Evaluation of the Selected Coffee Products Using Fuzzy Approach

    OpenAIRE

    M.A. Lazim; M. Suriani

    2009-01-01

    Knowing consumers' preferences and perceptions of the sensory evaluation of drink products is very significant to manufacturers and retailers alike. With no appropriate sensory analysis, there is a high risk of market disappointment. This paper aims to rank the selected coffee products and also to determine the best quality attribute through sensory evaluation using a fuzzy decision making model. Three products of coffee drinks were used for sensory evaluation. Data wer...

  19. A Fuzzy-MOORA approach for ERP system selection

    Directory of Open Access Journals (Sweden)

    Prasad Karande

    2012-07-01

    Full Text Available In today’s global and dynamic business environment, manufacturing organizations face the tremendous challenge of expanding markets and meeting the customer expectations. It compels them to lower total cost in the entire supply chain, shorten throughput time, reduce inventory, expand product choice, provide more reliable delivery dates and better customer service, improve quality, and efficiently coordinate demand, supply and production. In order to accomplish these objectives, the manufacturing organizations are turning to enterprise resource planning (ERP system, which is an enterprise-wide information system to interlace all the necessary business functions, such as product planning, purchasing, inventory control, sales, financial and human resources into a single system having a shared database. Thus to survive in the global competitive environment, implementation of a suitable ERP system is mandatory. However, selecting a wrong ERP system may adversely affect the manufacturing organization’s overall performance. Due to limitations in available resources, complexity of ERP systems and diversity of alternatives, it is often difficult for a manufacturing organization to select and install the most suitable ERP system. In this paper, two ERP system selection problems are solved using fuzzy multi-objective optimization on the basis of ratio analysis (MOORA method and it is observed that in both the cases, SAP is the best solution.
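    The crisp MOORA ratio system at the core of the fuzzy-MOORA method is easy to state in code. The sketch below assumes fuzzy ratings have already been defuzzified to crisp scores; the matrix values are hypothetical.

```python
import numpy as np

def moora_rank(X, beneficial):
    """Rank alternatives with the MOORA ratio system.

    X: (alternatives x criteria) decision matrix of crisp scores
       (fuzzy ratings would first be defuzzified).
    beneficial: boolean array, True for benefit criteria, False for cost.
    """
    norm = X / np.sqrt((X ** 2).sum(axis=0))          # vector normalization
    y = norm[:, beneficial].sum(axis=1) - norm[:, ~beneficial].sum(axis=1)
    return np.argsort(-y), y                          # best alternative first

# Hypothetical scores for 3 ERP systems on 4 criteria (last one is a cost).
X = np.array([[7, 8, 6, 3.0],
              [8, 7, 7, 4.0],
              [6, 6, 8, 2.5]])
order, scores = moora_rank(X, np.array([True, True, True, False]))
print(order, scores)
```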

  20. ANALYSIS, SELECTION AND RANKING OF FOREIGN MARKETS. A COMPREHENSIVE APPROACH

    Directory of Open Access Journals (Sweden)

    LIVIU NEAMŢU

    2013-12-01

    Full Text Available Choosing the appropriate markets for growth and development is essential for a company that wishes to expand its business through international economic exchanges. But in this case foreign market research alone is not sufficient, even though it is an important chapter in decision technology and an indispensable condition for achieving the firm's objectives. While in marketing on the national market that market is already defined, requiring no more than prospection and segmentation, in the case of the international market the research process must be preceded by a selection of markets and their classification. Companies that have this intention know little or nothing about the conditions offered by one new market or another. Therefore, they must go, step by step, through a complex, multilevel analysis process composed of selection and ranking of markets, followed by proper research through exploration and segmentation, which can lead to choosing the most profitable markets. In this regard, within this study, we propose a multi-criteria model for the selection and ranking of international development markets, allowing companies access to those markets which are in compliance with the company's development strategy.

  1. A refinement methodology for object-oriented programs

    OpenAIRE

    Tafat , Asma; Boulmé , Sylvain; Marché , Claude

    2010-01-01

    Refinement is a well-known approach for developing correct-by-construction software. It has been very successful for producing high quality code, e.g., as implemented in the B tool. Yet, such refinement techniques are restricted in the sense that they forbid aliasing (and more generally sharing of data-structures), which often happens in usual programming languages. We propose a sound approach for refinement in presence of aliases. Suitable abstractions of programs are d...

  2. US refining reviewed

    International Nuclear Information System (INIS)

    Yamaguchi, N.D.

    1998-01-01

    The paper reviews the history, present position and future prospects of the petroleum industry in the USA. The main focus is on supply and demand, the high quality of the products, refinery capacity and product trade balances. Diagrams show historical trends in output, product demand, demand for transport fuels and oil, refinery capacity, refinery closures, and imports and exports. Some particularly salient points brought out were (i) production of US crude shows a marked downward trend but imports of crude will continue to increase, (ii) product demand will continue to grow even though the levels are already high, (iii) the demand is dominated by those products that typically yield the highest income for the refiner, (i.e. high quality transport fuels for environmental compliance), (iv) refinery capacity has decreased since 1980 and (v) refining will continue to have financial problems but will still be profitable. (UK)

  3. Outlook for Canadian refining

    International Nuclear Information System (INIS)

    Boje, G.

    1998-01-01

    The petroleum supply and demand balance was discussed and a comparison between Canadian and U.S. refineries was provided. The impact of changing product specifications on the petroleum industry was also discussed. The major changes include sulphur reductions in gasoline, benzene and MMT additives. These changes have been made in an effort to satisfy environmental needs. Geographic margin variations in refineries between east and west were reviewed. An overview of findings from the Solomon Refining Study of Canadian and American refineries, which has been very complimentary of the Canadian refining industry, was provided. From this writer's point of view refinery utilization has improved but there is a threat from increasing efficiency of US competitors. Environmental issues will continue to impact upon the industry and while the chances for making economic returns on investment are good for the years ahead, it will be a challenge to maintain profitability

  4. Diversification Strategies and Firm Performance: A Sample Selection Approach

    OpenAIRE

    Santarelli, Enrico; Tran, Hien Thu

    2013-01-01

    This paper is based upon the assumption that firm profitability is determined by its degree of diversification which in turn is strongly related to the antecedent decision to carry out diversification activities. This calls for an empirical approach that permits the joint analysis of the three interrelated and consecutive stages of the overall diversification process: diversification decision, degree of diversification, and outcome of diversification. We apply parametric and semiparametric ap...

  5. Fuzzy Investment Portfolio Selection Models Based on Interval Analysis Approach

    Directory of Open Access Journals (Sweden)

    Haifeng Guo

    2012-01-01

    Full Text Available This paper employs fuzzy set theory to solve the unintuitive problem of the Markowitz mean-variance (MV) portfolio model and extend it to a fuzzy investment portfolio selection model. Our model establishes intervals for expected returns and risk preference, which can take into account investors' different investment appetites and thus can find the optimal solution for each interval. In the empirical part, we test this model on Chinese stock investments and find that this model can fulfill different kinds of investors' objectives. Finally, investment risk can be decreased when we add an investment limit to each stock in the portfolio, which indicates our model is useful in practice.
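    One plausible reading of such an interval model can be prototyped with a standard constrained-optimization call. In the sketch below, the pessimism weight lam (standing in for the investor's risk preference within the return interval) and the per-stock cap w_max are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np
from scipy.optimize import minimize

def interval_mv_portfolio(mu_lo, mu_hi, cov, target, lam=0.7, w_max=1.0):
    """Markowitz-style selection when expected returns are intervals
    [mu_lo, mu_hi]: lam picks a point inside each interval, and w_max caps
    each stock's weight, mirroring the paper's per-stock investment limit.
    """
    mu = lam * mu_lo + (1.0 - lam) * mu_hi       # point chosen inside the interval
    n = len(mu)
    cons = [{"type": "eq", "fun": lambda w: w.sum() - 1.0},
            {"type": "ineq", "fun": lambda w: mu @ w - target}]
    res = minimize(lambda w: w @ cov @ w,        # minimize portfolio variance
                   np.full(n, 1.0 / n),
                   bounds=[(0.0, w_max)] * n, constraints=cons)
    return res.x

mu_lo = np.array([0.04, 0.07, 0.10])
mu_hi = np.array([0.06, 0.10, 0.15])
cov = np.diag([0.01, 0.04, 0.09])                # toy diagonal covariance
print(interval_mv_portfolio(mu_lo, mu_hi, cov, target=0.07, w_max=0.6).round(3))
```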

  6. Future of French refining

    International Nuclear Information System (INIS)

    Calvet, B.

    1993-01-01

    Over recent years, the refining industry has had to grapple with a growing burden of environmental and safety regulations concerning not only its plants and other facilities, but also its end products. At the same time, it has had to bear the effects of the reduction of the special status that used to apply to petroleum, and the consequences of economic freedom, to which we should add, as specifically concerns the French market, the impact of energy policy and the pro-nuclear option. The result is a drop in heavy fuel oil from 36 million tonnes per year in 1973 to 6.3 million in 1992, and in home-heating fuel from 37 to 18 million per year. This fast-moving market is highly competitive. The French market in particular is wide open to imports, but the refining companies are still heavy exporters for those products with high added-value, like lubricants, jet fuel, and lead-free gasolines. The competition has led the refining companies to commit themselves to quality, and to publicize their efforts in this direction. This is why the long-term perspectives for petroleum fuels are still wide open. This is supported by the probable expectation that the goal of economic efficiency is likely to soften the effects of the energy policy, which penalizes petroleum products, in that they have now become competitive again. In the European context, with the challenge of environmental protection and the decline in heavy fuel outlets, French refining has to keep on improving the quality of its products and plants, which means major investments. The industry absolutely must return to a more normal level of profitability, in order to sustain this financial effort, and generate the prosperity of its high-performance plants and equipment. 1 fig., 5 tabs

  7. Process for refining hydrocarbons

    Energy Technology Data Exchange (ETDEWEB)

    Risenfeld, E H

    1924-11-26

    A process is disclosed for the refining of hydrocarbons or other mixtures through treatment in vapor form with metal catalysts, characterized by such metals being used as catalysts as are obtained by reduction of the oxides of minerals containing the iron group, and by the vapors of the hydrocarbons, in the presence of water vapor, being led over these catalysts at temperatures from 200 to 300°C.

  8. Adaptive mesh refinement for storm surge

    KAUST Repository

    Mandli, Kyle T.; Dawson, Clint N.

    2014-01-01

    An approach to utilizing adaptive mesh refinement algorithms for storm surge modeling is proposed. Currently numerical models exist that can resolve the details of coastal regions but are often too costly to be run in an ensemble forecasting framework without significant computing resources. The application of adaptive mesh refinement algorithms substantially lowers the computational cost of a storm surge model run while retaining much of the desired coastal resolution. The approach presented is implemented in the GeoClaw framework and compared to ADCIRC for Hurricane Ike along with observed tide gauge data and the computational cost of each model run. © 2014 Elsevier Ltd.
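    The flag-and-refine cycle at the heart of AMR can be illustrated in one dimension. The sketch below is a toy gradient-based refinement pass, not GeoClaw's patch-based algorithm.

```python
import numpy as np

def refine_1d(x, u, tol):
    """One adaptive-refinement pass on a 1D grid: flag cells where the
    solution jump between neighbours exceeds tol, then halve flagged cells.
    A toy illustration of the flag-and-refine cycle; production AMR codes
    such as GeoClaw use nested patches, clustering and richer criteria.
    """
    flags = np.abs(np.diff(u)) > tol                # cells with steep gradients
    new_x = [x[0]]
    for i in range(len(x) - 1):
        if flags[i]:
            new_x.append(0.5 * (x[i] + x[i + 1]))   # split flagged interval
        new_x.append(x[i + 1])
    return np.array(new_x)

x = np.linspace(0.0, 1.0, 21)
u = np.tanh(40.0 * (x - 0.5))                       # sharp front near x = 0.5
print(len(refine_1d(x, u, tol=0.2)), "points after refinement")
```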

  10. Panorama 2009 - refining

    International Nuclear Information System (INIS)

    2008-01-01

    For oil companies to invest in new refining and conversion capacity, favorable conditions over time are required. In other words, refining margins must remain high and demand sustained over a long period. That was the situation prevailing before the onset of the financial crisis in the second half of 2008. The economic conjuncture has taken a substantial turn for the worse since then and the forecasts for 2009 do not look bright. Oil demand is expected to decrease in the OECD countries and to grow much more slowly in the emerging countries. It is anticipated that refining margins will fall in 2009 - in 2008, they slipped significantly in the United States - as a result of increasingly sluggish demand, especially for light products. The next few months will probably be unfavorable to investment. In addition to a gloomy business outlook, there may also be a problem of access to sources of financing. As for investment projects, a mainstream trend has emerged in the last few years: a shift away from the regions that have historically been most active (the OECD countries) towards certain emerging countries, mostly in Asia or the Middle East. The new conjuncture will probably not change this trend

  11. Technological studies on uranium refining at nuclear materials authority, Egypt

    International Nuclear Information System (INIS)

    Mohammed, H.S.

    1997-01-01

    In 1992 the Nuclear Materials Authority (NMA) took a decision to establish a yellow cake refining unit so as to study refining of El-Atshan yellow cake, which was recently produced by the ion-exchange pilot plant of the production sector. The research studies followed the conventional refining route to produce nuclear grade UO3. This implies investigations on some common solvents to refine the cake, viz. trialkyl phosphates, trialkyl phosphine oxides and dialkyl phosphoric acid, as well as high-molecular-weight long-chain tertiary amines. Moreover, a non-conventional refining process has also been presented, depending on the selectivity of the uranyl ion to be dissolved by carbonate and to be precipitated by hydrogen peroxide. Most of the proposed processes were found feasible for refining El-Atshan yellow cake; however, the non-conventional refining process appears to be the most promising, owing to its superior performance and economy

  12. Accounting for linkage disequilibrium in genome scans for selection without individual genotypes: The local score approach.

    Science.gov (United States)

    Fariello, María Inés; Boitard, Simon; Mercier, Sabine; Robelin, David; Faraut, Thomas; Arnould, Cécile; Recoquillay, Julien; Bouchez, Olivier; Salin, Gérald; Dehais, Patrice; Gourichon, David; Leroux, Sophie; Pitel, Frédérique; Leterrier, Christine; SanCristobal, Magali

    2017-07-01

    Detecting genomic footprints of selection is an important step in the understanding of evolution. Accounting for linkage disequilibrium in genome scans increases detection power, but haplotype-based methods require individual genotypes and are not applicable on pool-sequenced samples. We propose to take advantage of the local score approach to account for linkage disequilibrium in genome scans for selection, cumulating (possibly small) signals from single markers over a genomic segment, to clearly pinpoint a selection signal. Using computer simulations, we demonstrate that this approach detects selection with higher power than several state-of-the-art single-marker, windowing or haplotype-based approaches. We illustrate this on two benchmark data sets including individual genotypes, for which we obtain similar results with the local score and one haplotype-based approach. Finally, we apply the local score approach to Pool-Seq data obtained from a divergent selection experiment on behaviour in quail and obtain precise and biologically coherent selection signals: while competing methods fail to highlight any clear selection signature, our method detects several regions involving genes known to act on social responsiveness or autistic traits. Although we focus here on the detection of positive selection from multiple population data, the local score approach is general and can be applied to other genome scans for selection or other genomewide analyses such as GWAS. © 2017 John Wiley & Sons Ltd.
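    The local score itself reduces to a Lindley-type recursion over per-marker scores. The sketch below assumes scores are penalized by a constant ξ so that background markers drift negative; the segment bookkeeping is illustrative.

```python
import numpy as np

def local_score(scores, xi):
    """Local score of a sequence of per-marker scores.

    Implements the Lindley recursion h[k] = max(0, h[k-1] + s[k] - xi):
    the local score is the highest excursion of this partial sum, and the
    excursion interval pinpoints the candidate selection signal. xi is the
    penalty making typical (non-selected) markers score negatively.
    """
    h, best, start, seg = 0.0, 0.0, 0, (0, 0)
    for k, s in enumerate(scores):
        h = max(0.0, h + s - xi)
        if h == 0.0:
            start = k + 1          # excursion can only restart after this marker
        elif h > best:
            best, seg = h, (start, k)
    return best, seg               # markers in seg form the detected segment

rng = np.random.default_rng(1)
s = rng.exponential(0.5, 500)
s[200:230] += 2.0                  # a stretch of elevated single-marker signals
print(local_score(s, xi=1.0))
```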

  13. A systematic community-based participatory approach to refining an evidence-based community-level intervention: the HOLA intervention for Latino men who have sex with men.

    Science.gov (United States)

    Rhodes, Scott D; Daniel, Jason; Alonzo, Jorge; Duck, Stacy; García, Manuel; Downs, Mario; Hergenrather, Kenneth C; Alegría-Ortega, José; Miller, Cindy; Boeving Allen, Alex; Gilbert, Paul A; Marsiglia, Flavio F

    2013-07-01

    Our community-based participatory research partnership engaged in a multistep process to refine a culturally congruent intervention that builds on existing community strengths to promote sexual health among immigrant Latino men who have sex with men (MSM). The steps were the following: (1) increase Latino MSM participation in the existing partnership, (2) establish an Intervention Team, (3) review the existing sexual health literature, (4) explore needs and priorities of Latino MSM, (5) narrow priorities based on what is important and changeable, (6) blend health behavior theory with Latino MSM's lived experiences, (7) design an intervention conceptual model, (8) develop training modules and (9) resource materials, and (10) pretest and (11) revise the intervention. The developed intervention contains four modules to train Latino MSM to serve as lay health advisors known as Navegantes. These modules synthesize locally collected data with other local and national data; blend health behavior theory, the lived experiences, and cultural values of immigrant Latino MSM; and harness the informal social support Latino MSM provide one another. This community-level intervention is designed to meet the expressed sexual health priorities of Latino MSM. It frames disease prevention within sexual health promotion.

  14. Towards automated crystallographic structure refinement with phenix.refine

    OpenAIRE

    Afonine, Pavel V.; Grosse-Kunstleve, Ralf W.; Echols, Nathaniel; Headd, Jeffrey J.; Moriarty, Nigel W.; Mustyakimov, Marat; Terwilliger, Thomas C.; Urzhumtsev, Alexandre; Zwart, Peter H.; Adams, Paul D.

    2012-01-01

    phenix.refine is a program within the PHENIX package that supports crystallographic structure refinement against experimental data with a wide range of upper resolution limits using a large repertoire of model parameterizations. It has several automation features and is also highly flexible. Several hundred parameters enable extensive customizations for complex use cases. Multiple user-defined refinement strategies can be applied to specific parts of the model in a single refinement run. An i...

  15. Quantitative Framework and Management Expectation Tool for the Selection of Bioremediation Approaches at Chlorinated Solvent Sites

    Science.gov (United States)

    2015-03-19

    SERDP & ESTCP Webinar Series (#11), March 19, 2015: Quantitative Framework and Management Expectation Tool for the Selection of Bioremediation Approaches at Chlorinated Solvent Sites. Ms. Carmen Lebrón, Independent Consultant (20 minutes + Q&A); Dr. ...

  16. Selection and approbation of new technical approaches in innovation projects

    International Nuclear Information System (INIS)

    Matvienko, V.A.; Pastushenko, V.N.; Zvorykin, K.O.; Zvorykin, L.O.

    1999-01-01

    Large-scale technical approaches based on the application of new technologies, equipment, tools and other production equipping means (PEM) must first be tested in technical and engineering practice and thoroughly approved. These circumstances create objective prerequisites for organizing, within the 'OU' infrastructure, a special sector to solve this problem. A structure-and-logic model of this infrastructure sector is presented. The principal activities of the proposed new sector of the 'OU' infrastructure may include: personnel training, skill raising and attestation; adaptation of new samples of universal and auxiliary equipment; approbation of new technologies and equipment; and approbation of actions in non-standard situations

  17. Towards automated crystallographic structure refinement with phenix.refine

    Energy Technology Data Exchange (ETDEWEB)

    Afonine, Pavel V., E-mail: pafonine@lbl.gov; Grosse-Kunstleve, Ralf W.; Echols, Nathaniel; Headd, Jeffrey J.; Moriarty, Nigel W. [Lawrence Berkeley National Laboratory, One Cyclotron Road, MS64R0121, Berkeley, CA 94720 (United States); Mustyakimov, Marat; Terwilliger, Thomas C. [Los Alamos National Laboratory, M888, Los Alamos, NM 87545 (United States); Urzhumtsev, Alexandre [CNRS–INSERM–UdS, 1 Rue Laurent Fries, BP 10142, 67404 Illkirch (France); Université Henri Poincaré, Nancy 1, BP 239, 54506 Vandoeuvre-lès-Nancy (France); Zwart, Peter H. [Lawrence Berkeley National Laboratory, One Cyclotron Road, MS64R0121, Berkeley, CA 94720 (United States); Adams, Paul D. [Lawrence Berkeley National Laboratory, One Cyclotron Road, MS64R0121, Berkeley, CA 94720 (United States); University of California Berkeley, Berkeley, CA 94720 (United States)

    2012-04-01

    phenix.refine is a program within the PHENIX package that supports crystallographic structure refinement against experimental data with a wide range of upper resolution limits using a large repertoire of model parameterizations. It has several automation features and is also highly flexible. Several hundred parameters enable extensive customizations for complex use cases. Multiple user-defined refinement strategies can be applied to specific parts of the model in a single refinement run. An intuitive graphical user interface is available to guide novice users and to assist advanced users in managing refinement projects. X-ray or neutron diffraction data can be used separately or jointly in refinement. phenix.refine is tightly integrated into the PHENIX suite, where it serves as a critical component in automated model building, final structure refinement, structure validation and deposition to the wwPDB. This paper presents an overview of the major phenix.refine features, with extensive literature references for readers interested in more detailed discussions of the methods.

  18. A Fuzzy Decision Making Approach for Supplier Selection in Healthcare Industry

    OpenAIRE

    Zeynep Sener; Mehtap Dursun

    2014-01-01

    Supplier evaluation and selection is one of the most important components of an effective supply chain management system. Due to the expanding competition in healthcare, selecting the right medical device suppliers offers great potential for increasing quality while decreasing costs. This paper proposes a fuzzy decision making approach for medical supplier selection. A real-world medical device supplier selection problem is presented to illustrate the application of the proposed decision meth...

  19. A Potential Approach for Low Flow Selection in Water Resource Supply and Management

    Science.gov (United States)

    Ying Ouyang

    2012-01-01

    Low flow selections are essential to water resource management, water supply planning, and watershed ecosystem restoration. In this study, a new approach, namely the frequent-low (FL) approach (or frequent-low index), was developed based on the minimum frequent-low flow or level used in minimum flows and/or levels program in northeast Florida, USA. This FL approach was...

  20. The European refining and distribution industry at the 2010 vista

    International Nuclear Information System (INIS)

    Lacour, J.J.; Tessmer, G.; Ward, I.

    1998-01-01

    Oil company chairmen belonging to the AFTP, DGMK and IP associations met together to debate about the future of the European refining industry. The following topics were discussed: is it the end of the refining crisis? Which uncertainties will have to be met? What is the situation of petroleum products supply and demand? What are the consumers' expectations? How to face the environmental constraints? Which future for the refining activities in Europe? Seven round-tables took place with the following themes: the factors of uncertainty in the future of refining activities, the petroleum products supply and demand (automotive fuels, fuel oils, lubricants), the refining activities and the supply of consumers (service stations and supermarkets), the situation of the European petroleum policy, the European refining industry and the public regulations (development of more efficient environmental approaches), the impact of environmental constraints and the technical solutions, and the future of the refining industry. (J.S.)

  1. Breast cancer tumor classification using LASSO method selection approach

    International Nuclear Information System (INIS)

    Celaya P, J. M.; Ortiz M, J. A.; Martinez B, M. R.; Solis S, L. O.; Castaneda M, R.; Garza V, I.; Martinez F, M.; Ortiz R, J. M.

    2016-10-01

    Breast cancer is one of the leading causes of deaths worldwide among women. Early tumor detection is key in reducing breast cancer deaths and screening mammography is the widest available method for early detection. Mammography is the most common and effective breast cancer screening test. However, the rate of positive findings is very low, making the radiologic interpretation monotonous and biased toward errors. In an attempt to alleviate radiological workload, this work presents a computer-aided diagnosis (CADx) method aimed to automatically classify tumor lesions into malign or benign as a means to a second opinion. The CADx method extracts image features and classifies the screening mammogram abnormality into one of two categories: subject at risk of having malignant tumor (malign), and healthy subject (benign). In this study, 143 abnormal segmentations (57 malign and 86 benign) from the Breast Cancer Digital Repository (BCDR) public database were used to train and evaluate the CADx system. Percentile-rank (p-rank) was used to standardize the data. Using the LASSO feature selection methodology, the model achieved a leave-one-out cross-validation area under the receiver operating characteristic curve (AUC) of 0.950. The proposed method has the potential to rank abnormal lesions with high probability of malignant findings, aiding in the detection of potential malign cases as a second opinion to the radiologist. (Author)
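    The reported pipeline (p-rank standardization, L1-penalized selection, leave-one-out AUC) can be approximated with standard scikit-learn components. This is a generic re-creation on synthetic data, not the authors' code; for simplicity the p-rank transform is applied once to the whole matrix rather than refit inside each fold.

```python
import numpy as np
from scipy.stats import rankdata
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut
from sklearn.metrics import roc_auc_score

def loocv_auc_lasso(X, y, C=1.0):
    """Leave-one-out AUC for an L1-penalized (LASSO-style) logistic
    classifier, after percentile-rank standardization of each feature."""
    Xp = np.apply_along_axis(lambda c: rankdata(c) / len(c), 0, X)  # p-rank
    probs = np.empty(len(y), dtype=float)
    for train, test in LeaveOneOut().split(Xp):
        clf = LogisticRegression(penalty="l1", solver="liblinear", C=C)
        clf.fit(Xp[train], y[train])
        probs[test] = clf.predict_proba(Xp[test])[:, 1]
    return roc_auc_score(y, probs)

rng = np.random.default_rng(0)
X = rng.standard_normal((143, 20))                 # 143 cases, 20 candidate features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.standard_normal(143) > 0).astype(int)
print(round(loocv_auc_lasso(X, y), 3))
```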

  2. Fuzzy Goal Programming Approach in Selective Maintenance Reliability Model

    Directory of Open Access Journals (Sweden)

    Neha Gupta

    2013-12-01

    Full Text Available In the present paper, we have considered the allocation problem of repairable components for a parallel-series system as a multi-objective optimization problem and have discussed two different models. In the first model the reliabilities of the subsystems are considered as different objectives. In the second model the cost and time spent on repairing the components are considered as two different objectives. These two models are formulated as multi-objective nonlinear programming problems (MONLPP) and a fuzzy goal programming method is used to work out the compromise allocation in the multi-objective selective maintenance reliability model, in which we define the membership functions of each objective function, transform the membership functions into equivalent linear membership functions by first-order Taylor series, and finally, by forming a fuzzy goal programming model, obtain a desired compromise allocation of maintenance components. A numerical example is also worked out to illustrate the computational details of the method.

  3. Breast cancer tumor classification using LASSO method selection approach

    Energy Technology Data Exchange (ETDEWEB)

    Celaya P, J. M.; Ortiz M, J. A.; Martinez B, M. R.; Solis S, L. O.; Castaneda M, R.; Garza V, I.; Martinez F, M.; Ortiz R, J. M., E-mail: morvymm@yahoo.com.mx [Universidad Autonoma de Zacatecas, Av. Ramon Lopez Velarde 801, Col. Centro, 98000 Zacatecas, Zac. (Mexico)

    2016-10-15

    Breast cancer is one of the leading causes of deaths worldwide among women. Early tumor detection is key in reducing breast cancer deaths and screening mammography is the widest available method for early detection. Mammography is the most common and effective breast cancer screening test. However, the rate of positive findings is very low, making the radiologic interpretation monotonous and biased toward errors. In an attempt to alleviate radiological workload, this work presents a computer-aided diagnosis (CADx) method aimed to automatically classify tumor lesions into malign or benign as a means to a second opinion. The CADx method extracts image features and classifies the screening mammogram abnormality into one of two categories: subject at risk of having malignant tumor (malign), and healthy subject (benign). In this study, 143 abnormal segmentations (57 malign and 86 benign) from the Breast Cancer Digital Repository (BCDR) public database were used to train and evaluate the CADx system. Percentile-rank (p-rank) was used to standardize the data. Using the LASSO feature selection methodology, the model achieved a leave-one-out cross-validation area under the receiver operating characteristic curve (AUC) of 0.950. The proposed method has the potential to rank abnormal lesions with high probability of malignant findings, aiding in the detection of potential malign cases as a second opinion to the radiologist. (Author)

  4. Guided-wave approaches to spectrally selective energy absorption

    Science.gov (United States)

    Stegeman, G. I.; Burke, J. J.

    1987-01-01

    Results of experiments designed to demonstrate spectrally selective absorption in dielectric waveguides on semiconductor substrates are reported. These experiments were conducted with three waveguides formed by sputtering films of PSK2 glass onto silicon-oxide layers grown on silicon substrates. The three waveguide samples were studied at 633 and 532 nm. The samples differed only in the thickness of the silicon-oxide layer, specifically 256 nm, 506 nm, and 740 nm. Agreement between theoretical predictions and measurements of propagation constants (mode angles) of the six or seven modes supported by these samples was excellent. However, the loss measurements were inconclusive because of high scattering losses in the structures fabricated (in excess of 10 dB/cm). Theoretical calculations indicated that the power distribution among all the modes supported by these structures will reach its steady-state value after a propagation length of only 1 mm. Accordingly, the measured loss rates were found to be almost independent of which mode was initially excited. The excellent agreement between theory and experiment leads to the conclusion that low-loss waveguides would confirm the predicted loss rates.

  5. Innovation During the Supplier Selection Process

    DEFF Research Database (Denmark)

    Pilkington, Alan; Pedraza, Isabel

    2014-01-01

    Established ideas on supplier selection have not moved much from the original premise of how to choose between bidders. Whilst we have added many different tools and refinements to choose between alternative suppliers, its nature has not evolved. We move the original selection process approach... observed through an ethnographic embedded-researcher study, which has refined the selection process into two selection stages: one for first supply, covering tool/process development, and another, later, for resupply of mature parts. We report the details of the process, those involved, the criteria employed... and identify the benefits and weaknesses of this enhanced selection process.

  6. An Approach to Addressing Selection Bias in Survival Analysis

    Science.gov (United States)

    Carlin, Caroline S.; Solid, Craig A.

    2014-01-01

    This work proposes a frailty model that accounts for non-random treatment assignment in survival analysis. Using Monte Carlo simulation, we found that estimated treatment parameters from our proposed endogenous selection survival model (esSurv) closely parallel the consistent two-stage residual inclusion (2SRI) results, while offering computational and interpretive advantages. The esSurv method greatly enhances computational speed relative to 2SRI by eliminating the need for bootstrapped standard errors, and generally results in smaller standard errors than those estimated by 2SRI. In addition, esSurv explicitly estimates the correlation of unobservable factors contributing to both treatment assignment and the outcome of interest, providing an interpretive advantage over the residual parameter estimate in the 2SRI method. Comparisons with commonly used propensity score methods and with a model that does not account for non-random treatment assignment show clear bias in these methods that is not mitigated by increased sample size. We illustrate using actual dialysis patient data comparing mortality of patients with mature arteriovenous grafts for venous access to mortality of patients with grafts placed but not yet ready for use at the initiation of dialysis. We find strong evidence of endogeneity (with estimate of correlation in unobserved factors ρ̂ = 0.55), and estimate a mature-graft hazard ratio of 0.197 in our proposed method, with a similar 0.173 hazard ratio using 2SRI. The 0.630 hazard ratio from a frailty model without a correction for the non-random nature of treatment assignment illustrates the importance of accounting for endogeneity. PMID:24845211
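    For comparison, the 2SRI benchmark the authors evaluate against can be sketched in a few lines: a first-stage treatment model whose raw residual is carried into a second-stage Cox regression. The data-generating process below is synthetic, and the esSurv frailty model itself is not reproduced here.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 1000
z = rng.standard_normal(n)                      # predictor of treatment assignment
u = rng.standard_normal(n)                      # unobserved confounder
treat = (z + u + rng.standard_normal(n) > 0).astype(int)
time = rng.exponential(1.0 / np.exp(-1.0 * treat + 0.8 * u), n)
event = np.ones(n, dtype=int)                   # no censoring in this toy example

# Stage 1: model treatment assignment, keep the raw residual.
stage1 = LogisticRegression().fit(z.reshape(-1, 1), treat)
resid = treat - stage1.predict_proba(z.reshape(-1, 1))[:, 1]

# Stage 2: Cox model with the residual included to absorb endogeneity.
df = pd.DataFrame({"time": time, "event": event, "treat": treat, "resid": resid})
CoxPHFitter().fit(df, duration_col="time", event_col="event").print_summary()
```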

  7. Dose selection based on physiologically based pharmacokinetic (PBPK) approaches.

    Science.gov (United States)

    Jones, Hannah M; Mayawala, Kapil; Poulin, Patrick

    2013-04-01

    Physiologically based pharmacokinetic (PBPK) models are built using differential equations to describe the physiology/anatomy of different biological systems. Readily available in vitro and in vivo preclinical data can be incorporated into these models to not only estimate pharmacokinetic (PK) parameters and plasma concentration-time profiles, but also to gain mechanistic insight into compound properties. They provide a mechanistic framework to understand and extrapolate PK and dose across in vitro and in vivo systems and across different species, populations and disease states. Using small molecule and large molecule examples from the literature and our own company, we have shown how PBPK techniques can be utilised for human PK and dose prediction. Such approaches have the potential to increase efficiency, reduce the need for animal studies, replace clinical trials and increase PK understanding. Given the mechanistic nature of these models, the future use of PBPK modelling in drug discovery and development is promising, however some limitations need to be addressed to realise its application and utility more broadly.

  8. Hirshfeld atom refinement.

    Science.gov (United States)

    Capelli, Silvia C; Bürgi, Hans-Beat; Dittrich, Birger; Grabowsky, Simon; Jayatilaka, Dylan

    2014-09-01

    Hirshfeld atom refinement (HAR) is a method which determines structural parameters from single-crystal X-ray diffraction data by using an aspherical atom partitioning of tailor-made ab initio quantum mechanical molecular electron densities without any further approximation. Here the original HAR method is extended by implementing an iterative procedure of successive cycles of electron density calculations, Hirshfeld atom scattering factor calculations and structural least-squares refinements, repeated until convergence. The importance of this iterative procedure is illustrated via the example of crystalline ammonia. The new HAR method is then applied to X-ray diffraction data of the dipeptide Gly-l-Ala measured at 12, 50, 100, 150, 220 and 295 K, using Hartree-Fock and BLYP density functional theory electron densities and three different basis sets. All positions and anisotropic displacement parameters (ADPs) are freely refined without constraints or restraints - even those for hydrogen atoms. The results are systematically compared with those from neutron diffraction experiments at the temperatures 12, 50, 150 and 295 K. Although non-hydrogen-atom ADPs differ by up to three combined standard uncertainties (csu's), all other structural parameters agree within less than 2 csu's. Using our best calculations (BLYP/cc-pVTZ, recommended for organic molecules), the accuracy of determining bond lengths involving hydrogen atoms from HAR is better than 0.009 Å for temperatures of 150 K or below; for hydrogen-atom ADPs it is better than 0.006 Å² as judged from the mean absolute X-ray minus neutron differences. These results are among the best ever obtained. Remarkably, the precision of determining bond lengths and ADPs for the hydrogen atoms from the HAR procedure is comparable with that from the neutron measurements - an outcome which is obtained with a routinely achievable resolution of the X-ray data of 0.65 Å.

  9. Refining and petrochemicals

    International Nuclear Information System (INIS)

    Benazzi, E.

    2003-01-01

    Down sharply in 2002, refining margins showed a clear improvement in the first half-year of 2003. As a result, the earnings reported by oil companies for financial year 2002 were significantly lower than in 2001, but the prospects are brighter for 2003. In the petrochemicals sector, slow demand and higher feedstock prices eroded margins in 2002, especially in Europe and the United States. The financial results for the first part of 2003 seem to indicate that sector profitability will not improve before 2004. (author)

  10. Refining and petrochemicals

    International Nuclear Information System (INIS)

    Benazzi, E.; Alario, F.

    2004-01-01

    In 2003, refining margins showed a clear improvement that continued throughout the first three quarters of 2004. Oil companies posted significantly higher earnings in 2003 compared to 2002, with the results of first quarter 2004 confirming this trend. Due to higher feedstock prices, the implementation of new capacity and more intense competition, the petrochemicals industry was not able to boost margins in 2003. In such difficult business conditions, aggravated by soaring crude prices, the petrochemicals industry is not likely to see any improvement in profitability before the second half of 2004. (author)

  11. Refining shale-oil distillates

    Energy Technology Data Exchange (ETDEWEB)

    Altpeter, J

    1952-03-17

    A process is described for refining distillates from shale oil, brown coal, tar, and other tar products by extraction with selective solvents, such as lower alcohols, halogen-hydrins, dichlorodiethyl ether, liquid sulfur dioxide, and so forth, as well as treating with alkali solution, characterized in that the distillate is first treated with completely or almost completely recovered phenol or cresotate solution, the oil is separated from the phenolate with solvent, for example concentrated or adjusted to a determined water content of lower alcohol, furfural, halogen-hydrin, dichlorodiethyl ether, liquid sulfur dioxide, or the like, extracted, and the raffinate separated from the extract layer, if necessary after distillation or washing out of solvent, and freeing with alkali solution from residual phenol or creosol.

  12. Selected approaches to determining the purpose of emergency planning zones

    Science.gov (United States)

    Dobeš, Pavel; Baudišová, Barbora; Sluka, Vilém; Skřínský, Jan; Danihelka, Pavel; Dlabka, Jakub; Řeháček, Jakub

    2013-04-01

    One of the major accident hazards (hereinafter referred to as "MAH") tools to determine the range of effects of a major accident and consequent protection of the public is the determination of the emergency planning zone (hereinafter referred to as "zone"). In the Czech Republic, the determination of the zone is regulated by the Decree No. 103/2006 Coll. laying down the principles for determination of the emergency planning zone and the extent and manner of elaborating the external emergency plan (hereinafter referred to as "Decree") 3. The Decree is based on the principles of the IAEA-TECDOC-727 method - Manual for the Classification and Prioritization of Risks Due to Major Accidents in Process and Related Industries (hereinafter referred to as "method" and "manual", respectively)3. In the manual, it is pointed out that the method used is not suitable for making emergency plans for special situations (industrial activities in an inhabited area). Nonetheless, its principles and procedures are still used for such purposes in the Czech Republic. The expert scientific community dealing with MAH issues in the Czech Republic, however, realizes that the procedure of the zone boundary delineation should be modified to reflect up-to-date knowledge in protection of the public and its enhancement. Therefore, the OPTIZON Project (Optimization of the Emergency Planning Zone Designation and Elaboration of Emergency Plans Based on Threatening Effects of Dangerous Chemical Substances at Operational Accidents with Respect to Inhabitant Protection Enhancement) was developed and approved for the Program of Security Research of the Czech Republic 2010 - 2015 (BV II/2-VS). One of the main project's objectives is to define clearly the purpose of the zone because at present it is not quite apparent. From the general view, this step may seem insignificant or trivial, but the reverse is true. It represents one of the most important stages in seeking the approach to the zone designation as

  13. Genetic engineering of industrial Saccharomyces cerevisiae strains using a selection/counter-selection approach.

    Science.gov (United States)

    Kutyna, Dariusz R; Cordente, Antonio G; Varela, Cristian

    2014-01-01

    Gene modification of laboratory yeast strains is currently a very straightforward task thanks to the availability of the entire yeast genome sequence and the high frequency with which yeast can incorporate exogenous DNA into its genome. Unfortunately, laboratory strains do not perform well in industrial settings, indicating the need for strategies to modify industrial strains to enable strain development for industrial applications. Here we describe approaches we have used to genetically modify industrial strains used in winemaking.

  14. Atlantic Basin refining profitability

    International Nuclear Information System (INIS)

    Jones, R.J.

    1998-01-01

    A review of the profitability margins of oil refining in the Atlantic Basin was presented. Petroleum refiners face the continuous challenge of balancing supply with demand. It would appear that the profitability margins in the Atlantic Basin will increase significantly in the near future because of shrinking supply surpluses. Refinery capacity utilization has reached higher levels than ever before. The American Petroleum Institute reported that in August 1997, U.S. refineries used 99 per cent of their capacity for several weeks in a row. U.S. gasoline inventories have also declined as the industry has focused on reducing capital costs. This is further evidence that supply and demand are tightly balanced. Some of the reasons for tightening supplies were reviewed. It was predicted that U.S. gasoline demand will continue to grow in the near future. Gasoline demand has not declined as expected because new vehicles are not any more fuel efficient today than they were a decade ago. Although federally-mandated fuel efficiency standards were designed to lower gasoline consumption, they may actually have prevented consumption from falling. Atlantic margins were predicted to continue moving up because of the supply and demand evidence: high capacity utilization rates, low operating inventories, limited capacity addition resulting from lower capital spending, continued U.S. gasoline demand growth, and steady total oil demand growth. 11 figs

  15. Adaptive mesh refinement in titanium

    Energy Technology Data Exchange (ETDEWEB)

    Colella, Phillip; Wen, Tong

    2005-01-21

    In this paper, we evaluate Titanium's usability as a high-level parallel programming language through a case study in which we implement a subset of Chombo's functionality in Titanium. Chombo is a software package applying the Adaptive Mesh Refinement methodology to numerical Partial Differential Equations at the production level. In Chombo, the library approach to parallel programming is used (C++ and Fortran, with MPI), whereas Titanium is a Java dialect designed for high-performance scientific computing. The performance of our implementation is studied and compared with that of Chombo in solving Poisson's equation based on two grid configurations from a real application. Also provided are the counts of lines of code from both sides.

  16. Petroleum refining industry in China

    International Nuclear Information System (INIS)

    Walls, W.D.

    2010-01-01

    The oil refining industry in China has faced rapid growth in oil imports of increasingly sour grades of crude with which to satisfy growing domestic demand for a slate of lighter and cleaner finished products sold at subsidized prices. At the same time, the world petroleum refining industry has been moving from one that serves primarily local and regional markets to one that serves global markets for finished products, as world refining capacity utilization has increased. Globally, refined product markets are likely to experience continued globalization until refining investments significantly expand capacity in key demand regions. We survey the oil refining industry in China in the context of the world market for heterogeneous crude oils and growing world trade in refined petroleum products. (author)

  17. A deductive approach to select or rank journals in multifaceted subject, Oceanography

    Digital Repository Service at National Institute of Oceanography (India)

    Sahu, S.R.; Panda, K.C.

    journal) whereas Bradford’s differential approach (articles in the bibliographies of specific subject field) to account/rank the core journals. Both these methods make sense in the journal selection/ranking process to a specific subject field...

  18. Hybrid direct and iterative solvers for h refined grids with singularities

    KAUST Repository

    Paszyński, Maciej R.

    2015-04-27

    This paper describes a hybrid direct and iterative solver for two and three dimensional h adaptive grids with point singularities. The point singularities are eliminated by using a sequential linear computational cost solver O(N) on CPU [1]. The remaining Schur complements are submitted to an incomplete LU preconditioned conjugate gradient (ILUPCG) iterative solver. The approach is compared to the standard algorithm performing static condensation over the entire mesh and executing the ILUPCG algorithm on top of it. The hybrid solver is applied to two or three dimensional grids automatically h refined towards point or edge singularities. The automatic refinement is based on the relative error estimations between the coarse and fine mesh solutions [2], and the optimal refinements are selected using the projection based interpolation. The computational mesh is partitioned into sub-meshes with local point and edge singularities separated. This is done by using the following greedy algorithm.
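    The elimination-plus-ILUPCG structure can be mimicked with SciPy's sparse toolbox. The sketch below forms a small Schur complement explicitly for clarity (large problems would apply it matrix-free) and uses a 1D Laplacian with arbitrary "interface" unknowns standing in for the separated singularity sub-meshes.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def schur_ilu_pcg(A, f, interior, interface):
    """Eliminate 'interior' unknowns with a sparse direct factorization,
    then solve the interface Schur complement with ILU-preconditioned CG."""
    A = A.tocsr()
    Aii = A[interior][:, interior].tocsc()
    Aib = A[interior][:, interface].toarray()
    Abi = A[interface][:, interior].toarray()
    Abb = A[interface][:, interface].toarray()
    lu = spla.splu(Aii)                            # direct elimination of interiors
    S = sp.csc_matrix(Abb - Abi @ lu.solve(Aib))   # Schur complement on the interface
    g = f[interface] - Abi @ lu.solve(f[interior])
    ilu = spla.spilu(S, drop_tol=1e-4)             # incomplete LU preconditioner
    M = spla.LinearOperator(S.shape, ilu.solve)
    xb, info = spla.cg(S, g, M=M)
    assert info == 0, "CG did not converge"
    x = np.empty(A.shape[0])
    x[interface] = xb
    x[interior] = lu.solve(f[interior] - Aib @ xb)  # back-substitute interiors
    return x

n = 100
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
idx = np.arange(n)
interface = idx[::10]                              # stand-in for singularity dofs
interior = np.setdiff1d(idx, interface)
x = schur_ilu_pcg(A, np.ones(n), interior, interface)
print(np.linalg.norm(A @ x - np.ones(n)))          # residual of the hybrid solve
```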

  19. Materials selection in micromechanical design: an application of the Ashby approach

    OpenAIRE

    Srikar, V.T.; Spearing, S.M.

    2003-01-01

    The set of materials available to microsystems designers is rapidly expanding. Techniques now exist to introduce and integrate a large number of metals, alloys, ceramics, glasses, polymers, and elastomers into microsystems, motivating the need for a rational approach for materials selection in microsystems design. As a step toward such an approach, we focus on the initial stages of materials selection for micromechanical structures with minimum feature sizes greater than 1 μm. The vari...

  20. Application of integrated QFD and fuzzy AHP approach in selection of suppliers

    Directory of Open Access Journals (Sweden)

    Bojana Jovanović

    2014-10-01

    Full Text Available Supplier selection is a widely considered issue in the field of management, especially in quality management. In this paper, in the selection of suppliers of electronic components we used the integrated QFD and fuzzy AHP approaches. The QFD method is used as a tool for translating stakeholder needs into evaluating criteria for suppliers. The fuzzy AHP approach is used as a tool for prioritizing stakeholders, stakeholders’ requirements, evaluating criteria and, finally, for prioritizing suppliers. The paper showcases a case study of implementation of the integrated QFD and fuzzy AHP approaches in the selection of the electronic components supplier in one Serbian company that produces electronic devices. Also presented is the algorithm of implementation of the proposed approach. To the best of our knowledge, this is the first implementation of the proposed approach in a Serbian company.
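    The fuzzy-AHP weighting step can be illustrated with Buckley's geometric-mean method over triangular fuzzy judgements, a common variant that may differ in detail from the one used in the paper; the comparison matrix below is hypothetical.

```python
import numpy as np

def fuzzy_ahp_weights(M):
    """Criteria weights from a triangular-fuzzy pairwise comparison matrix
    using Buckley's geometric-mean method. M has shape (n, n, 3) holding
    (l, m, u) judgements; weights are defuzzified by the centroid and
    normalized."""
    n = M.shape[0]
    r = np.prod(M, axis=1) ** (1.0 / n)        # fuzzy geometric mean per row (l, m, u)
    total = r.sum(axis=0)
    w = np.empty((n, 3))
    w[:, 0] = r[:, 0] / total[2]               # divide by the swapped bounds
    w[:, 1] = r[:, 1] / total[1]
    w[:, 2] = r[:, 2] / total[0]
    crisp = w.mean(axis=1)                     # centroid defuzzification
    return crisp / crisp.sum()

# 3 criteria; entry (i, j) is the fuzzy judgement "i over j".
one = (1, 1, 1)
M = np.array([[one,             (2, 3, 4),       (4, 5, 6)],
              [(1/4, 1/3, 1/2), one,             (1, 2, 3)],
              [(1/6, 1/5, 1/4), (1/3, 1/2, 1),   one]], dtype=float)
print(fuzzy_ahp_weights(M))
```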

  1. High-capacity, selective solid sequestrants for innovative chemical separation: Inorganic ion exchange approach

    International Nuclear Information System (INIS)

    Bray, L.

    1995-01-01

    The approach of this task is to develop high-capacity, selective solid inorganic ion exchangers for the recovery of cesium and strontium from nuclear alkaline and acid wastes. To achieve this goal, Pacific Northwest Laboratories (PNL) is collaborating with industry and university participants to develop high capacity, selective, solid ion exchangers for the removal of specific contaminants from nuclear waste streams

  2. A Diagnostic Approach to Increase Reusable Dinnerware Selection in a Cafeteria

    Science.gov (United States)

    Manuel, Jennifer C.; Sunseri, Mary Anne; Olson, Ryan; Scolari, Miranda

    2007-01-01

    The current project tested a diagnostic approach to selecting interventions to increase patron selection of reusable dinnerware in a cafeteria. An assessment survey, completed by a sample of 43 patrons, suggested that the primary causes of wasteful behavior were (a) environmental arrangement of dinnerware options and (b) competing motivational…

  3. MULTIPLE CRITERIA DECISION MAKING APPROACH FOR INDUSTRIAL ENGINEER SELECTION USING FUZZY AHP-FUZZY TOPSIS

    OpenAIRE

    Deliktaş, Derya; ÜSTÜN, Özden

    2018-01-01

    In this study, a fuzzy multiple criteria decision-making approach is proposed to select an industrial engineer among ten candidates in a manufacturing environment. The industrial engineer selection problem is a special case of the personnel selection problem. This problem involves many criteria organized in a hierarchical structure and many decision makers, and the decision makers' evaluation process also includes ambiguous parameters. The fuzzy AHP is used to determin...

  4. Feature Selection using Multi-objective Genetic Algorithm: A Hybrid Approach

    OpenAIRE

    Ahuja, Jyoti; GJUST - Guru Jambheshwar University of Sciecne and Technology; Ratnoo, Saroj Dahiya; GJUST - Guru Jambheshwar University of Sciecne and Technology

    2015-01-01

    Feature selection is an important pre-processing task for building accurate and comprehensible classification models. Several researchers have applied filter, wrapper or hybrid approaches using genetic algorithms which are good candidates for optimization problems that involve large search spaces like in the case of feature selection. Moreover, feature selection is an inherently multi-objective problem with many competing objectives involving size, predictive power and redundancy of the featu...

  5. Comparison of a rational vs. high throughput approach for rapid salt screening and selection.

    Science.gov (United States)

    Collman, Benjamin M; Miller, Jonathan M; Seadeek, Christopher; Stambek, Julie A; Blackburn, Anthony C

    2013-01-01

    In recent years, high throughput (HT) screening has become the most widely used approach for early phase salt screening and selection in a drug discovery/development setting. The purpose of this study was to compare a rational approach for salt screening and selection to those results previously generated using a HT approach. The rational approach involved a much smaller number of initial trials (one salt synthesis attempt per counterion) that were selected based on a few strategic solubility determinations of the free form combined with a theoretical analysis of the ideal solvent solubility conditions for salt formation. Salt screening results for sertraline, tamoxifen, and trazodone using the rational approach were compared to those previously generated by HT screening. The rational approach produced similar results to HT screening, including identification of the commercially chosen salt forms, but with a fraction of the crystallization attempts. Moreover, the rational approach provided enough solid from the very initial crystallization of a salt for more thorough and reliable solid-state characterization and thus rapid decision-making. The crystallization techniques used in the rational approach mimic larger-scale process crystallization, allowing smoother technical transfer of the selected salt to the process chemist.

  6. Refining of raw materials, lignite present economic problems

    Energy Technology Data Exchange (ETDEWEB)

    Schirmer, G.

    1985-06-01

    East Germany seeks an economic intensification program that involves refining raw materials to a higher level. Lignite briquetting prior to liquefaction and gasification illustrates both the theoretical and practical aspects of that goal and also introduces questions of secure supplies. The author describes the special labor processes, use of technology, recycling of waste materials, and other new problems that the approach entails as the refined raw materials become new materials or energy sources. Economics based on the value of the refined product and the cost of the materials determine the degree of refinement. The concept also involves the relationship of producer and user as profits increase.

  7. A potential approach for low flow selection in water resource supply and management

    Science.gov (United States)

    Ouyang, Ying

    2012-08-01

    Low flow selections are essential to water resource management, water supply planning, and watershed ecosystem restoration. In this study, a new approach, namely the frequent-low (FL) approach (or frequent-low index), was developed based on the minimum frequent-low flow or level used in minimum flows and/or levels program in northeast Florida, USA. This FL approach was then compared to the conventional 7Q10 approach for low flow selections prior to its applications, using the USGS flow data from the freshwater environment (Big Sunflower River, Mississippi) as well as from the estuarine environment (St. Johns River, Florida). Unlike the FL approach that is associated with the biological and ecological impacts, the 7Q10 approach could lead to the selections of extremely low flows (e.g., near-zero flows) that may hinder its use for establishing criteria to prevent streams from significant harm to biological and ecological communities. Additionally, the 7Q10 approach could not be used when the period of data records is less than 10 years by definition, while this may not be the case for the FL approach. Results from both approaches showed that the low flows from the Big Sunflower River and the St. Johns River decreased as time elapsed, demonstrating that these two rivers have become drier during the last several decades with a potential of salted water intrusion to the St. Johns River. Results from the FL approach further revealed that the recurrence probability of low flow increased while the recurrence interval of low flow decreased as time elapsed in both rivers, indicating that low flows occurred more frequently in these rivers as time elapsed. This report suggests that the FL approach, developed in this study, is a useful alternative for low flow selections in addition to the 7Q10 approach.
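    For contrast with the FL index, the conventional 7Q10 statistic is simple to compute from a daily flow record. The sketch below uses an empirical 10% quantile of annual 7-day minima on synthetic data; practice typically fits a log-Pearson III distribution, and the FL index would instead be anchored to a biologically based minimum frequent-low level.

```python
import numpy as np
import pandas as pd

def seven_q_ten(daily_flow):
    """Classical 7Q10 low flow: for each year take the minimum 7-day
    average flow, then estimate the value with a 10-year recurrence
    (10% annual non-exceedance) from the empirical distribution."""
    q7 = daily_flow.rolling(7).mean()
    annual_min = q7.groupby(daily_flow.index.year).min().dropna()
    return float(np.quantile(annual_min, 0.10))

idx = pd.date_range("1990-01-01", "2009-12-31", freq="D")
rng = np.random.default_rng(0)
flow = pd.Series(50 + 30 * np.sin(2 * np.pi * idx.dayofyear / 365)
                 + rng.gamma(2.0, 5.0, len(idx)), index=idx)  # synthetic record
print(round(seven_q_ten(flow), 2), "flow units")
```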

  8. Comparing Refinements for Failure and Bisimulation Semantics

    NARCIS (Netherlands)

    Eshuis, H.; Fokkinga, M.M.

    2002-01-01

    Refinement in bisimulation semantics is defined differently from refinement in failure semantics: in bisimulation semantics refinement is based on simulations between labelled transition systems, whereas in failure semantics refinement is based on inclusions between failure systems. There exist

  9. Biofuels Refining Engineering

    Energy Technology Data Exchange (ETDEWEB)

    Lobban, Lance [Univ. of Oklahoma, Norman, OK (United States)

    2017-03-28

    carbon capture and hydrogen efficiency. Our research approach combined catalyst synthesis, measurements of catalyst activity and selectivity in different reactor systems and conditions, and detailed catalyst characterization to develop fundamental understanding of reaction pathways and the capability to predict product distributions. Nearly all of the candidate catalysts were prepared in-house via standard techniques such as impregnation, co-impregnation, or chemical vapor deposition. Supports were usually purchased, but in some cases coprecipitation was used to simultaneously create the support and active component, which can be advantageous for strong active component-support interactions and for achieving high active component dispersion. In-house synthesis also allowed for studies of the effects on catalyst activity and selectivity of such factors as support porosity, calcination temperature, and reduction/activation conditions. Depending on the physical characteristics of the molecule, catalyst activity measurements were carried out in tubular flow reactors (for vapor phase reactions) or stirred tank reactors (for liquid phase reactions) over a wide range of pressures and temperatures. Reactant and product concentrations were measured using gas chromatography (both on-line and off-line, with TCD, FID, and/or mass spectrometric detection). For promising catalysts, detailed physicochemical characterization was carried out using FTIR, Raman, and XPS spectroscopies and XRD (all available in our laboratories) and transmission electron microscopy (TEM, available at OU). Additional methods included temperature programmed techniques (TPD, TPO) and surface area measurements by nitrogen adsorption techniques.

  10. Commercial refining in the Mediterranean

    International Nuclear Information System (INIS)

    Packer, P.

    1999-01-01

    About 9% of the world's oil refining capacity is on the Mediterranean: some of the world's biggest and most advanced refineries are on Sicily and Sardinia. The Mediterranean refineries are important suppliers to southern Europe and N. Africa. The article discusses commercial refining in the Mediterranean under the headings of (i) historic development, (ii) product demand, (iii) refinery configurations, (iv) refined product trade, (v) financial performance and (vi) future outlook. Although some difficulties are foreseen, refining in the Mediterranean is likely to continue to be important well into the 21st century. (UK)

  11. ERP system implementation costs and selection factors of an implementation approach

    DEFF Research Database (Denmark)

    Johansson, Björn; Sudzina, Frantisek; Newman, Mike

    2011-01-01

    Different approaches to implementation of enterprise resource planning (ERP) systems exist. In this article, we investigate the relationship between factors influencing the selection of an implementation approach and companies' ability to stay within budget when implementing ERPs. The question is: do the factors which influence the implementation approach in an ERP project also cause an increase of the project cost in a European context? Our survey was conducted in Denmark, Slovakia and Slovenia and focused on this issue. Our main findings are that: 1) the number of implemented modules influences the selection of an implementation approach; 2) companies with information strategies are more likely to stay within budget regarding ERP systems implementation. However, we also found that: 3) implementation approach does not significantly influence ability to stay within budget; 4) a clear relationship between factors influencing …

  12. METHOD FOR SELECTION OF PROJECT MANAGEMENT APPROACH BASED ON FUZZY CONCEPTS

    Directory of Open Access Journals (Sweden)

    Igor V. KONONENKO

    2017-03-01

    Full Text Available A literature analysis of works devoted to research on the selection of a project management approach and to the development of effective methods for solving this problem is given. A mathematical model and a method for selecting a project management approach with fuzzy concepts of applicability of existing approaches are proposed. The selection is made among such approaches as the PMBOK Guide, the ISO21500 standard, the PRINCE2 methodology, the SWEBOK Guide, and the agile methodologies Scrum, XP, and Kanban. The project parameters which have the greatest impact on the result of the selection, and the measure of their impact, are determined. The project parameters relate to information about the project, the team, communication, and critical project risks. They include the number of people involved in the project, the customer's experience with this project team, the project team's experience in this field, the project team's understanding of requirements, adapting ability, initiative, and others. The suggested method is illustrated by the example of its application to selecting a project management approach for a software development project.

  13. Application of the Sensor Selection Approach in Polymer Electrolyte Membrane Fuel Cell Prognostics and Health Management

    Directory of Open Access Journals (Sweden)

    Lei Mao

    2017-09-01

    Full Text Available In this paper, the sensor selection approach is investigated with the aim of using fewer sensors to provide reliable fuel cell diagnostic and prognostic results. The sensitivity of the sensors is first calculated with a developed fuel cell model. With sensor sensitivities to different fuel cell failure modes, the available sensors can be ranked. A sensor selection algorithm is used in the analysis, which considers both sensor sensitivity to fuel cell performance and resistance to noise. The performance of the selected sensors in polymer electrolyte membrane (PEM) fuel cell prognostics is also evaluated with an adaptive neuro-fuzzy inference system (ANFIS), and results show that the fuel cell voltage can be predicted with good quality using the selected sensors. Furthermore, a fuel cell test is performed to investigate the effectiveness of the selected sensors in fuel cell fault diagnosis. From the results, different fuel cell states can be distinguished with good quality using the selected sensors.

  14. A MOORA based fuzzy multi-criteria decision making approach for supply chain strategy selection

    Directory of Open Access Journals (Sweden)

    Bijan Sarkar

    2012-08-01

    Full Text Available To acquire competitive advantages in order to survive in the global business scenario, modern companies now face the problem of selecting key supply chain strategies. Strategy selection becomes difficult as the number of alternatives and conflicting criteria increases. Multi-criteria decision making (MCDM) methodologies help supply chain managers take a lead in a complex industrial set-up. The present investigation applies a fuzzy MCDM technique entailing multi-objective optimization on the basis of ratio analysis (MOORA) to the selection of alternatives in a supply chain. The MOORA method is applied to three suitable numerical examples for the selection of supply chain strategies (warehouse location selection and vendor/supplier selection). The results obtained using the current approach closely match those of previous research works published in various open journals. The empirical study demonstrates the simplicity and applicability of this method as a strategic decision making tool in a supply chain.
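
    The MOORA ratio system named above is simple enough to state in a few lines: each criterion column is vector-normalized, and an alternative's score is the sum of its normalized benefit criteria minus the sum of its normalized cost criteria. The following is a minimal sketch with invented supplier data, not the paper's examples.

```python
import numpy as np

def moora_rank(matrix, benefit_mask):
    """MOORA ratio system: vector-normalize each criterion column, then
    score alternatives as (sum of benefit ratios) - (sum of cost ratios)."""
    X = np.asarray(matrix, dtype=float)
    R = X / np.sqrt((X ** 2).sum(axis=0))
    y = R[:, benefit_mask].sum(axis=1) - R[:, ~benefit_mask].sum(axis=1)
    return np.argsort(-y), y  # best alternative first

# Placeholder data: 3 strategies on cost (min), quality and delivery (max).
scores = [[200, 8, 7], [180, 6, 9], [220, 9, 8]]
order, y = moora_rank(scores, np.array([False, True, True]))
print(order, y.round(3))
```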

  15. A patient and community-centered approach selecting endpoints for a randomized trial of a novel advance care planning tool

    Directory of Open Access Journals (Sweden)

    Bridges JFP

    2018-02-01

    Full Text Available John FP Bridges,1,2 Norah L Crossnohere,2 Anne L Schuster,1 Judith A Miller,3 Carolyn Pastorini,3,† Rebecca A Aslakson2,4,5 1Department of Health Policy and Management, The Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, 2Department of Health, Behavior, and Society, The Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, 3Patient-Centered Outcomes Research Institute (PCORI) Project, Baltimore, MD, 4Department of Anesthesiology and Critical Care Medicine, The Johns Hopkins School of Medicine, Baltimore, MD, 5Armstrong Institute for Patient Safety and Quality, The Johns Hopkins School of Medicine, Baltimore, MD, USA †Carolyn Pastorini passed away on August 24, 2015. Background: Despite a movement toward patient-centered outcomes, best practices on how to gather and refine patients' perspectives on research endpoints are limited. Advance care planning (ACP) is inherently patient centered and would benefit from patient prioritization of endpoints for ACP-related tools and studies. Objective: This investigation sought to prioritize patient-centered endpoints for the content and evaluation of an ACP video being developed for patients undergoing major surgery. We also sought to highlight an approach using complementary engagement and research strategies to document priorities and preferences of patients and other stakeholders. Materials and methods: Endpoints identified from a previously published environmental scan were operationalized following rating by a caregiver co-investigator, refinement by a patient co-investigator, review by a stakeholder committee, and validation by patients and family members. Finalized endpoints were taken to a state fair where members of the public who indicated that they or a loved one had undergone major surgery prioritized their most relevant endpoints and provided comments. Results: Of the initial 50 ACP endpoints identified from the review, 12 endpoints were selected for public

  16. On Modal Refinement and Consistency

    DEFF Research Database (Denmark)

    Nyman, Ulrik; Larsen, Kim Guldstrand; Wasowski, Andrzej

    2007-01-01

    Almost 20 years after the original conception, we revisit several fundamental questions about modal transition systems. First, we demonstrate the incompleteness of the standard modal refinement using a counterexample due to Hüttel. Deciding any refinement, complete with respect to the standard …

  17. Multi criteria decision making approaches for green supplier evaluation and selection

    DEFF Research Database (Denmark)

    Govindan, Kannan; Rajendran, S.; Sarkis, J.

    2015-01-01

    A large and growing body of literature on supplier evaluation and selection exists. Literature on green supplier evaluation that considers environmental factors is relatively limited. Recently, in supply chain management decision making, approaches for evaluating green supplier performance have used both qualitative and quantitative environmental data. Given this evolving research field, the goal and purpose of this paper is to analyze research in international scientific journals and international conference proceedings that focus on green supplier selection. We propose the following …, allowing us to identify improvements for the green supplier selection process and possible future directions.

  18. Selecting measures to prevent deleterious alkali-silica reaction in concrete : rationale for the AASHTO PP65 prescriptive approach.

    Science.gov (United States)

    2012-10-01

    PP65-11 provides two approaches for selecting preventive measures: (i) a performance approach based on laboratory testing, and (ii) a prescriptive approach based on a consideration of the reactivity of the aggregate, type and size of structure, expos...

  19. An Examination of HR Strategic Recruitment and Selection Approaches in China

    OpenAIRE

    Zhou, Guozhen

    2006-01-01

    Abstract In the past two decades, the manner in which organisations in the People's Republic of China (PRC) managed their human resources has changed dramatically (Braun and Warner, 2002). As the economy grows and moves into higher value-added work, strategic recruitment and selection are vital to an organisation's success. This dissertation seeks to examine the recruitment and selection strategy approaches in China. This research is based on 15 well-known firms, of which 11 are multinati...

  20. Crystal structure refinement with SHELXL

    Energy Technology Data Exchange (ETDEWEB)

    Sheldrick, George M., E-mail: gsheldr@shelx.uni-ac.gwdg.de [Department of Structural Chemistry, Georg-August Universität Göttingen, Tammannstraße 4, Göttingen 37077 (Germany)

    2015-01-01

    New features added to the refinement program SHELXL since 2008 are described and explained. The improvements in the crystal structure refinement program SHELXL have been closely coupled with the development and increasing importance of the CIF (Crystallographic Information Framework) format for validating and archiving crystal structures. An important simplification is that now only one file in CIF format (for convenience, referred to simply as ‘a CIF’) containing embedded reflection data and SHELXL instructions is needed for a complete structure archive; the program SHREDCIF can be used to extract the .hkl and .ins files required for further refinement with SHELXL. Recent developments in SHELXL facilitate refinement against neutron diffraction data, the treatment of H atoms, the determination of absolute structure, the input of partial structure factors and the refinement of twinned and disordered structures. SHELXL is available free to academics for the Windows, Linux and Mac OS X operating systems, and is particularly suitable for multiple-core processors.

  1. Ranking and selection of commercial off-the-shelf using fuzzy distance based approach

    Directory of Open Access Journals (Sweden)

    Rakesh Garg

    2015-06-01

    Full Text Available There is tremendous growth in the use of the component based software engineering (CBSE) approach for the development of software systems. The selection of the best suited COTS components which fulfil the necessary requirements for the development of software has become a major challenge for software developers. The complexity of the optimal selection problem increases with an increase in alternative potential COTS components and the corresponding selection criteria. In this research paper, the problem of ranking and selection of Data Base Management System (DBMS) components is modeled as a multi-criteria decision making problem. A 'Fuzzy Distance Based Approach' (FDBA) method is proposed for the optimal ranking and selection of DBMS COTS components of an e-payment system based on 14 selection criteria grouped under three major categories, i.e., 'Vendor Capabilities', 'Business Issues' and 'Cost'. The results of this method are compared with those of the Analytical Hierarchy Process (AHP), a typical multi-criteria decision making approach. The proposed methodology is explained with an illustrated example.

  2. Selection of suitable e-learning approach using TOPSIS technique with best ranked criteria weights

    Science.gov (United States)

    Mohammed, Husam Jasim; Kasim, Maznah Mat; Shaharanee, Izwan Nizal Mohd

    2017-11-01

    This paper compares the performance of four rank-based weighting assessment techniques, Rank Sum (RS), Rank Reciprocal (RR), Rank Exponent (RE), and Rank Order Centroid (ROC), on five identified e-learning criteria to select the best weighting method. A total of 35 experts in a public university in Malaysia were asked to rank the criteria and to evaluate five e-learning approaches, which include blended learning, flipped classroom, ICT supported face to face learning, synchronous learning, and asynchronous learning. The best ranked criteria weights, defined as the weights that have the least total absolute difference from the geometric mean of all weights, were then used to select the most suitable e-learning approach by using the TOPSIS method. The results show that the RR weights are the best, while the flipped classroom is the most suitable approach for implementation. This paper has developed a decision framework to aid decision makers (DMs) in choosing the most suitable weighting method for solving MCDM problems.
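
    For reference, two of the rank-based weighting formulas compared in this record and the TOPSIS ranking step can be sketched as follows. ROC and RR weights follow their textbook definitions (w_i = (1/n) Σ_{k=r_i..n} 1/k and w_i ∝ 1/r_i); the criteria data below are placeholders, not the study's expert rankings.

```python
import numpy as np

def roc_weights(ranks):
    """Rank Order Centroid: w_i = (1/n) * sum_{k=r_i}^{n} 1/k."""
    ranks = np.asarray(ranks)
    n = len(ranks)
    return np.array([np.sum(1.0 / np.arange(r, n + 1)) / n for r in ranks])

def rr_weights(ranks):
    """Rank Reciprocal: w_i = (1/r_i) / sum_j (1/r_j)."""
    inv = 1.0 / np.asarray(ranks, dtype=float)
    return inv / inv.sum()

def topsis(matrix, weights, benefit_mask):
    """Relative closeness of each alternative to the ideal solution."""
    X = np.asarray(matrix, dtype=float)
    R = X / np.sqrt((X ** 2).sum(axis=0))       # vector normalization
    V = R * weights                              # weighted normalized matrix
    ideal = np.where(benefit_mask, V.max(axis=0), V.min(axis=0))
    worst = np.where(benefit_mask, V.min(axis=0), V.max(axis=0))
    d_plus = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_minus = np.sqrt(((V - worst) ** 2).sum(axis=1))
    return d_minus / (d_plus + d_minus)          # higher is better

# Placeholder data: 3 e-learning approaches scored on 5 ranked criteria.
w = rr_weights([1, 2, 3, 4, 5])
scores = [[7, 8, 6, 9, 7], [8, 6, 7, 7, 8], [6, 9, 8, 6, 7]]
print(topsis(scores, w, np.array([True] * 5)).round(3))
```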

  3. Flanking region sequence information to refine microRNA target ...

    Indian Academy of Sciences (India)

    Prakash

    A support vector machine (SVM)-based target prediction refinement approach has been introduced through … SVMs are kernel-based statistical learning machines, where a discriminant …

  4. A robust optimisation approach to the problem of supplier selection and allocation in outsourcing

    Science.gov (United States)

    Fu, Yelin; Keung Lai, Kin; Liang, Liang

    2016-03-01

    We formulate the supplier selection and allocation problem in outsourcing under an uncertain environment as a stochastic programming problem. Both the decision-maker's attitude towards risk and the penalty parameters for demand deviation are considered in the objective function. A service level agreement, upper bound for each selected supplier's allocation and the number of selected suppliers are considered as constraints. A novel robust optimisation approach is employed to solve this problem under different economic situations. Illustrative examples are presented with managerial implications highlighted to support decision-making.

  5. Refining Automatically Extracted Knowledge Bases Using Crowdsourcing

    OpenAIRE

    Li, Chunhua; Zhao, Pengpeng; Sheng, Victor S.; Xian, Xuefeng; Wu, Jian; Cui, Zhiming

    2017-01-01

    Machine-constructed knowledge bases often contain noisy and inaccurate facts. There exists significant work in developing automated algorithms for knowledge base refinement. Automated approaches improve the quality of knowledge bases but are far from perfect. In this paper, we leverage crowdsourcing to improve the quality of automatically extracted knowledge bases. As human labelling is costly, an important research challenge is how we can use limited human resources to maximize the quality i...

  6. Formal refinement of extended state machines

    Directory of Open Access Journals (Sweden)

    Thomas Fayolle

    2016-06-01

    Full Text Available In a traditional formal development process, e.g. using the B method, the informal user requirements are (manually) translated into a global abstract formal specification. This translation is especially difficult to achieve. The Event-B method was developed to incrementally and formally construct such a specification using stepwise refinement. Each increment takes into account new properties and system aspects. In this paper, we propose to couple a graphical notation called Algebraic State-Transition Diagrams (ASTD) with an Event-B specification in order to provide a better understanding of the software behaviour. The dynamic behaviour is captured by the ASTD, which is based on automata and process algebra operators, while the data model is described by means of an Event-B specification. We propose a methodology to incrementally refine such specification couplings, taking into account new refinement relations and consistency conditions between the control specification and the data specification. We compare the specifications obtained using each approach for readability and proof complexity. The advantages and drawbacks of the traditional approach and of our methodology are discussed. The whole process is illustrated by a railway CBTC-like case study. Our approach is supported by tools for translating ASTDs into B and Event-B into B.

  7. Mapping quantitative trait loci in a selectively genotyped outbred population using a mixture model approach

    NARCIS (Netherlands)

    Johnson, David L.; Jansen, Ritsert C.; Arendonk, Johan A.M. van

    1999-01-01

    A mixture model approach is employed for the mapping of quantitative trait loci (QTL) for the situation where individuals, in an outbred population, are selectively genotyped. Maximum likelihood estimation of model parameters is obtained from an Expectation-Maximization (EM) algorithm facilitated by

  8. A New Spectral Shape-Based Record Selection Approach Using Np and Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Edén Bojórquez

    2013-01-01

    Full Text Available With the aim of improving code-based real record selection criteria, an approach inspired by a proxy parameter of spectral shape, named Np, is analyzed. The procedure is based on several objectives aimed at minimizing the record-to-record variability of the ground motions selected for seismic structural assessment. In order to select the best set of ground motion records to be used as input for nonlinear dynamic analysis, an optimization approach is applied using genetic algorithms focused on finding the set of records most compatible with a target spectrum and target Np values. The results of the new Np-based approach suggest that the real accelerograms obtained with this procedure reduce the scatter of the response spectra compared with the traditional approach; furthermore, the mean spectrum of the set of records is very similar to the target seismic design spectrum in the period range of interest, and at the same time, similar Np values are obtained for the selected records and the target spectrum.
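
    A hedged sketch of the kind of objective such a genetic algorithm might minimize is given below: misfit of the selected set's mean spectrum to the target plus a penalty on Np mismatch, with Np approximated here as the geometric mean of Sa over the sampled periods divided by Sa(T1). The weighting constant `alpha` and both function names are our assumptions, not the paper's formulation.

```python
import numpy as np

def np_proxy(sa, t1_index=0):
    """Spectral-shape proxy: geometric mean of Sa over the sampled
    periods divided by Sa(T1). An assumed simplification of Np."""
    sa = np.asarray(sa, dtype=float)
    return np.exp(np.mean(np.log(sa))) / sa[t1_index]

def record_set_misfit(sa_records, idx, sa_target, alpha=1.0):
    """Candidate GA fitness (to be minimized): mean-spectrum misfit to
    the target plus an Np-mismatch penalty. `alpha` is an assumed
    trade-off constant, not the paper's calibration."""
    subset = np.asarray(sa_records, dtype=float)[list(idx)]
    spectrum_err = np.mean((subset.mean(axis=0) - np.asarray(sa_target)) ** 2)
    np_target = np_proxy(sa_target)
    np_err = np.mean([(np_proxy(r) - np_target) ** 2 for r in subset])
    return spectrum_err + alpha * np_err

# A GA would evolve the index sets `idx`; here we just score one candidate.
rng = np.random.default_rng(1)
library = rng.lognormal(mean=-1.0, sigma=0.3, size=(100, 20))  # 100 records
target = library.mean(axis=0)
print(record_set_misfit(library, [3, 17, 42, 58, 76, 91, 99], target))
```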

  9. Selection and provisioning of services in a cloud using recommender systems approach for SMME

    CSIR Research Space (South Africa)

    Manqele, S

    2013-06-01

    Full Text Available Peninsula University of Technology, 10 September 2013.

  10. Multi-Layer Approach for the Detection of Selective Forwarding Attacks.

    Science.gov (United States)

    Alajmi, Naser; Elleithy, Khaled

    2015-11-19

    Security breaches are a major threat in wireless sensor networks (WSNs). WSNs are increasingly used due to their broad range of important applications in both military and civilian domains. WSNs are prone to several types of security attacks. Sensor nodes have limited capacities and are often deployed in dangerous locations; therefore, they are vulnerable to different types of attacks, including wormhole, sinkhole, and selective forwarding attacks. Security attacks are classified as data traffic and routing attacks. These security attacks could affect the most significant applications of WSNs, namely, military surveillance, traffic monitoring, and healthcare. Therefore, there are different approaches to detecting security attacks on the network layer in WSNs. Reliability, energy efficiency, and scalability are strong constraints on sensor nodes that affect the security of WSNs. Because sensor nodes have limited capabilities in most of these areas, selective forwarding attacks cannot be easily detected in networks. In this paper, we propose an approach to selective forwarding detection (SFD). The approach has three layers: MAC pool IDs, rule-based processing, and anomaly detection. It maintains the safety of data transmission between a source node and base station while detecting selective forwarding attacks. Furthermore, the approach is reliable, energy efficient, and scalable.

  11. Multi-Layer Approach for the Detection of Selective Forwarding Attacks

    Directory of Open Access Journals (Sweden)

    Naser Alajmi

    2015-11-01

    Full Text Available Security breaches are a major threat in wireless sensor networks (WSNs). WSNs are increasingly used due to their broad range of important applications in both military and civilian domains. WSNs are prone to several types of security attacks. Sensor nodes have limited capacities and are often deployed in dangerous locations; therefore, they are vulnerable to different types of attacks, including wormhole, sinkhole, and selective forwarding attacks. Security attacks are classified as data traffic and routing attacks. These security attacks could affect the most significant applications of WSNs, namely, military surveillance, traffic monitoring, and healthcare. Therefore, there are different approaches to detecting security attacks on the network layer in WSNs. Reliability, energy efficiency, and scalability are strong constraints on sensor nodes that affect the security of WSNs. Because sensor nodes have limited capabilities in most of these areas, selective forwarding attacks cannot be easily detected in networks. In this paper, we propose an approach to selective forwarding detection (SFD). The approach has three layers: MAC pool IDs, rule-based processing, and anomaly detection. It maintains the safety of data transmission between a source node and base station while detecting selective forwarding attacks. Furthermore, the approach is reliable, energy efficient, and scalable.

  12. Empowering breeding programs with new approaches to overcome constraints for selecting superior quality traits of rice

    NARCIS (Netherlands)

    Calingacion, M.N.

    2015-01-01

    Empowering breeding programs with new approaches to overcome constraints for selecting superior quality traits of rice

    Mariafe N. Calingacion

    Most rice breeding programs have focused on improving agronomic traits such as yield, while enhancing grain quality traits

  13. Multi-Objective Particle Swarm Optimization Approach for Cost-Based Feature Selection in Classification.

    Science.gov (United States)

    Zhang, Yong; Gong, Dun-Wei; Cheng, Jian

    2017-01-01

    Feature selection is an important data-preprocessing technique in classification problems such as bioinformatics and signal processing. Generally, there are some situations where a user is interested in not only maximizing the classification performance but also minimizing the cost that may be associated with features. This kind of problem is called cost-based feature selection. However, most existing feature selection approaches treat this task as a single-objective optimization problem. This paper presents the first study of multi-objective particle swarm optimization (PSO) for cost-based feature selection problems. The task of this paper is to generate a Pareto front of nondominated solutions, that is, feature subsets, to meet different requirements of decision-makers in real-world applications. In order to enhance the search capability of the proposed algorithm, a probability-based encoding technology and an effective hybrid operator, together with the ideas of the crowding distance, the external archive, and the Pareto domination relationship, are applied to PSO. The proposed PSO-based multi-objective feature selection algorithm is compared with several multi-objective feature selection algorithms on five benchmark datasets. Experimental results show that the proposed algorithm can automatically evolve a set of nondominated solutions, and it is a highly competitive feature selection method for solving cost-based feature selection problems.
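
    The core of any such multi-objective formulation is the Pareto dominance test on (classification error, feature cost) pairs. A minimal filter, with invented example subsets, is sketched below; the full PSO machinery (probability-based encoding, crowding distance, external archive) is omitted.

```python
def pareto_front(solutions):
    """Keep nondominated (subset, error, cost) tuples; both objectives
    are minimized. O(n^2), which is fine for illustration."""
    front = []
    for cand in solutions:
        _, e1, c1 = cand
        dominated = any(
            e2 <= e1 and c2 <= c1 and (e2 < e1 or c2 < c1)
            for _, e2, c2 in solutions
        )
        if not dominated:
            front.append(cand)
    return front

# Example: feature subsets with (classification error, total feature cost).
cands = [({1, 2}, 0.10, 5.0), ({1}, 0.15, 2.0), ({1, 2, 3}, 0.10, 7.0)]
print(pareto_front(cands))  # {1, 2} and {1} survive; {1, 2, 3} is dominated
```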

  14. Accounting for selection bias in species distribution models: An econometric approach on forested trees based on structural modeling

    Science.gov (United States)

    Ay, Jean-Sauveur; Guillemot, Joannès; Martin-StPaul, Nicolas K.; Doyen, Luc; Leadley, Paul

    2015-04-01

    Species distribution models (SDMs) are widely used to study and predict the outcome of global change on species. In human dominated ecosystems the presence of a given species is the result of both its ecological suitability and the human footprint on nature, such as land use choices. Land use choices may thus be responsible for a selection bias in the presence/absence data used in SDM calibration. We present a structural modelling approach (i.e. based on structural equation modelling) that accounts for this selection bias. The new structural species distribution model (SSDM) estimates simultaneously land use choices and species responses to bioclimatic variables. A land use equation based on an econometric model of landowner choices was joined to an equation of species response to bioclimatic variables. The SSDM allows the residuals of both equations to be dependent, taking into account the possibility of shared omitted variables and measurement errors. We provide a general description of the statistical theory and a set of applications to forested trees over France using databases of climate and forest inventory at different spatial resolutions (from 2 km to 8 km). We also compared the output of the SSDM with the outputs of a classical SDM in terms of bioclimatic response curves and potential distribution under current climate. Depending on the species and the spatial resolution of the calibration dataset, the shapes of the bioclimatic response curves and the modelled species distribution maps differed markedly between the SSDM and classical SDMs. The magnitude and direction of these differences were dependent on the correlations between the errors from both equations and were highest for higher spatial resolutions. A first conclusion is that the use of classical SDMs can potentially lead to strong misestimation of the actual and future probability of presence modelled. Beyond this selection bias, the SSDM we propose represents a crucial step to account for economic constraints on tree

  15. A Novel Approach to Selecting Contractor in Agent-based Multi-sensor Battlefield Reconnaissance Simulation

    Directory of Open Access Journals (Sweden)

    Xiong Li

    2012-11-01

    Full Text Available This paper presents a novel approach showing how a contractor in agent-based simulation of a complex warfare system, such as a multi-sensor battlefield reconnaissance system, can be selected in the Contract Net Protocol (CNP) with high efficiency. We first analyze the agent and agent-based simulation framework, the CNP and collaborators, and present an agent interaction chain used to actualize the CNP and establish an agent trust network. We then obtain the contractor's importance weight and dynamic trust by presenting a fuzzy similarity-based algorithm and a trust modifying algorithm; thus we propose a contractor selection approach based on maximum dynamic integrative trust. We validate the feasibility and capability of this approach by implementing the simulation, analyzing the compared results and checking the model.

  16. Selecting concepts for a concept-based curriculum: application of a benchmark approach.

    Science.gov (United States)

    Giddens, Jean Foret; Wright, Mary; Gray, Irene

    2012-09-01

    In response to a transformational movement in nursing education, faculty across the country are considering changes to curricula and approaches to teaching. As a result, an emerging trend in many nursing programs is the adoption of a concept-based curriculum. As part of the curriculum development process, the selection of concepts, competencies, and exemplars on which to build courses and base content is needed. This article presents a benchmark approach used to validate and finalize concept selection among educators developing a concept-based curriculum for a statewide nursing consortium. These findings are intended to inform other nurse educators who are currently involved with or are considering this curriculum approach. Copyright 2012, SLACK Incorporated.

  17. A synbio approach for selection of highly expressed gene variants in Gram-positive bacteria

    DEFF Research Database (Denmark)

    Ferro, Roberto; Rennig, Maja; Hernández Rollán, Cristina

    2018-01-01

    We have developed a synbio approach for increasing gene expression in two Gram-positive bacteria with a long history in food fermentation. First of all, the gene of interest was coupled to an antibiotic resistance gene to create a growth-based selection system. We then randomised the translation initiation region (TIR) preceding the gene of interest and selected clones that produced high protein titres, as judged by their ability to survive on high concentrations of antibiotic. Using this approach, we were able to significantly increase production of two industrially relevant proteins: sialidase in B. subtilis and tyrosine ammonia lyase in L. lactis. Gram-positive bacteria are widely used to produce industrial enzymes. High titres are necessary to make the production economically feasible. The synbio approach presented here is a simple and inexpensive way to increase protein titres, which can be carried …

  18. Breeding approaches in simultaneous selection for multiple stress tolerance of maize in tropical environments

    Directory of Open Access Journals (Sweden)

    Denić M.

    2007-01-01

    Full Text Available Maize is the principal crop and major staple food in most countries of Sub-Saharan Africa. However, due to the influence of abiotic and biotic stress factors, maize production faces serious constraints. Among the agro-ecological conditions, the main constraints are: lack and poor distribution of rainfall; low soil fertility; diseases (maize streak virus, downy mildew, leaf blights, rusts, gray leaf spot, stem/cob rots); and pests (borers and storage pests). Among the socio-economic production constraints are: a poor economy; a serious shortage of trained manpower; insufficient management expertise; lack of use of improved varieties; and poor cultivation practices. To develop desirable varieties, and thus consequently alleviate some of these constraints, appropriate breeding approaches and field-based methodologies for selection for multiple stress tolerance were implemented. These approaches are mainly based on: (a) crossing selected genotypes with more desirable stress tolerant and other agronomic traits; (b) using the disease/pest spreader row method, combined with testing and selection of created progenies under strong to intermediate pressure of drought and low soil fertility in nurseries; and (c) evaluating the varieties developed in multi-location trials under low and "normal" inputs. These approaches provide the testing and selection of the large number of progenies required for simultaneous selection for multiple stress tolerance. Data obtained revealed that remarkable improvement of the traits under selection was achieved. The biggest progress was obtained in selection for maize streak virus and downy mildew resistance, flintiness and earliness. In the case of drought stress, statistical analyses revealed significant negative correlations between yield and anthesis-silking interval and between yield and days to silk, but a positive correlation between yield and grain weight per ear.

  19. THE STAKEHOLDER MODEL REFINED

    OpenAIRE

    Y. FASSIN

    2008-01-01

    The popularity of the stakeholder model has been achieved thanks to its powerful visual scheme and its very simplicity. Stakeholder management has become an important tool to transfer ethics to management practice and strategy. Nevertheless, legitimate criticism continues to insist on clarification and emphasises on the perfectible nature of the model. Here, rather than building on the discussion from a philosophical or theoretical point of view, a different and innovative approach has been c...

  20. Novel Harmonic Regularization Approach for Variable Selection in Cox’s Proportional Hazards Model

    Directory of Open Access Journals (Sweden)

    Ge-Jin Chu

    2014-01-01

    Full Text Available Variable selection is an important issue in regression, and a number of variable selection methods involving nonconvex penalty functions have been proposed. In this paper, we investigate a novel harmonic regularization method, which can approximate nonconvex Lq (1/2 < q < 1) regularizations, to select key risk factors in Cox's proportional hazards model using microarray gene expression data. The harmonic regularization method can be efficiently solved using our proposed direct path seeking approach, which can produce solutions that closely approximate those for the convex loss function and the nonconvex regularization. Simulation results based on artificial datasets and four real microarray gene expression datasets, such as the diffuse large B-cell lymphoma (DLBCL), lung cancer, and AML datasets, show that the harmonic regularization method can be more accurate for variable selection than existing Lasso series methods.
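
    For orientation, variable selection in this setting maximizes a penalized Cox partial log-likelihood. A generic Lq-penalized form consistent with the record's description is shown below; the paper's specific harmonic penalty, which approximates this nonconvex family, is not reproduced here. δ_i is the event indicator and R(t_i) the risk set at event time t_i.

```latex
\hat{\beta} = \arg\max_{\beta} \sum_{i:\,\delta_i = 1}
  \left[ x_i^{\top}\beta - \log \sum_{j \in R(t_i)} e^{x_j^{\top}\beta} \right]
  - \lambda \sum_{k=1}^{p} |\beta_k|^{q},
\qquad \tfrac{1}{2} < q < 1 .
```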

  1. An Integrated Fuzzy Approach for Strategic Alliance Partner Selection in Third-Party Logistics

    Directory of Open Access Journals (Sweden)

    Burak Erkayman

    2012-01-01

    Full Text Available Outsourcing some of the logistic activities is a useful strategy for companies in recent years. This makes it possible for firms to concentrate on their main issues and processes and presents facility to improve logistics performance, to reduce costs, and to improve quality. Therefore provider selection and evaluation in third-party logistics become important activities for companies. Making a strategic decision like this is significantly hard and crucial. In this study we proposed a fuzzy multicriteria decision making (MCDM) approach to effectively select the most appropriate provider. First we identify the provider selection criteria and build the hierarchical structure of decision model. After building the hierarchical structure we determined the selection criteria weights by using fuzzy analytical hierarchy process (AHP) technique. Then we applied fuzzy technique for order preference by similarity to ideal solution (TOPSIS) to obtain final rankings for providers. And finally an illustrative example is also given to demonstrate the effectiveness of the proposed model.

  2. An integrated fuzzy approach for strategic alliance partner selection in third-party logistics.

    Science.gov (United States)

    Erkayman, Burak; Gundogar, Emin; Yilmaz, Aysegul

    2012-01-01

    Outsourcing some of the logistic activities is a useful strategy for companies in recent years. This makes it possible for firms to concentrate on their main issues and processes and presents facility to improve logistics performance, to reduce costs, and to improve quality. Therefore provider selection and evaluation in third-party logistics become important activities for companies. Making a strategic decision like this is significantly hard and crucial. In this study we proposed a fuzzy multicriteria decision making (MCDM) approach to effectively select the most appropriate provider. First we identify the provider selection criteria and build the hierarchical structure of decision model. After building the hierarchical structure we determined the selection criteria weights by using fuzzy analytical hierarchy process (AHP) technique. Then we applied fuzzy technique for order preference by similarity to ideal solution (TOPSIS) to obtain final rankings for providers. And finally an illustrative example is also given to demonstrate the effectiveness of the proposed model.

  3. An Integrated Fuzzy Approach for Strategic Alliance Partner Selection in Third-Party Logistics

    Science.gov (United States)

    Gundogar, Emin; Yılmaz, Aysegul

    2012-01-01

    Outsourcing some of the logistic activities is a useful strategy for companies in recent years. This makes it possible for firms to concentrate on their main issues and processes and presents facility to improve logistics performance, to reduce costs, and to improve quality. Therefore provider selection and evaluation in third-party logistics become important activities for companies. Making a strategic decision like this is significantly hard and crucial. In this study we proposed a fuzzy multicriteria decision making (MCDM) approach to effectively select the most appropriate provider. First we identify the provider selection criteria and build the hierarchical structure of decision model. After building the hierarchical structure we determined the selection criteria weights by using fuzzy analytical hierarchy process (AHP) technique. Then we applied fuzzy technique for order preference by similarity to ideal solution (TOPSIS) to obtain final rankings for providers. And finally an illustrative example is also given to demonstrate the effectiveness of the proposed model. PMID:23365520

  4. A grey DEMATEL approach to develop third-party logistics provider selection criteria

    DEFF Research Database (Denmark)

    Govindan, Kannan; Khodaverdi, Roohollah; Vafadarnikjoo, Amin

    2016-01-01

    Purpose - This paper identifies important criteria for 3PL provider selection and evaluation, and the purpose of this paper is to select 3PL providers from the viewpoint of firms which were already outsourcing their logistics services. Design/methodology/approach - This study utilized the grey decision-making trial and evaluation laboratory (grey DEMATEL) method … Practical implications - The paper's results help managers of automotive industries, particularly in developing countries, to outsource logistics activities to 3PL providers effectively and to create a significant competitive advantage. Originality/value - The main contributions of this paper are twofold. First, this paper proposes an integrated grey DEMATEL method to consider interdependent relationships among the 3PL provider selection criteria. Second, this study is one of the first studies to consider 3PL provider selection in a developing country like Iran.

  5. Comparing geological and statistical approaches for element selection in sediment tracing research

    Science.gov (United States)

    Laceby, J. Patrick; McMahon, Joe; Evrard, Olivier; Olley, Jon

    2015-04-01

    Elevated suspended sediment loads reduce reservoir capacity and significantly increase the cost of operating water treatment infrastructure, making the management of sediment supply to reservoirs increasingly important. Sediment fingerprinting techniques can be used to determine the relative contributions of different sources of sediment accumulating in reservoirs. The objective of this research is to compare geological and statistical approaches to element selection for sediment fingerprinting modelling. Time-integrated samplers (n=45) were used to obtain source samples from four major subcatchments flowing into the Baroon Pocket Dam in South East Queensland, Australia. The geochemistry of potential sources was compared to the geochemistry of sediment cores (n=12) sampled in the reservoir. The geological approach selected elements for modelling that provided expected, observed and statistical discrimination between sediment sources. Two statistical approaches selected elements for modelling with the Kruskal-Wallis H-test and Discriminant Function Analysis (DFA). In particular, two different significance levels (0.05 & 0.35) for the DFA were included to investigate the importance of element selection on modelling results. A distribution model determined the relative contributions of different sources to sediment sampled in the Baroon Pocket Dam. Elemental discrimination was expected between one subcatchment (Obi Obi Creek) and the remaining subcatchments (Lexys, Falls and Bridge Creek). Six major elements were expected to provide discrimination. Of these six, only Fe2O3 and SiO2 provided expected, observed and statistical discrimination. Modelling results with this geological approach indicated 36% (+/- 9%) of sediment sampled in the reservoir cores was from mafic-derived sources and 64% (+/- 9%) was from felsic-derived sources. The geological and the first statistical approach (DFA0.05) differed by only 1% (σ 5%) for 5 out of 6 model groupings with only
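
    Once tracer elements (here Fe2O3 and SiO2) are selected, source contributions are typically estimated with an unmixing model. The sketch below is a standard constrained least-squares formulation with invented concentrations; it is not the paper's distribution model, which propagates distributions rather than point values.

```python
import numpy as np
from scipy.optimize import minimize

def unmix(source_means, mixture):
    """Estimate relative source contributions for a sediment sample.

    source_means: (n_sources x n_elements) mean source concentrations.
    mixture: (n_elements,) concentrations measured in the sediment core.
    """
    S = np.asarray(source_means, dtype=float)
    c = np.asarray(mixture, dtype=float)
    n = S.shape[0]

    def objective(p):
        # Sum of squared relative errors between modelled and measured
        # mixture concentrations across the selected elements.
        return np.sum(((c - p @ S) / c) ** 2)

    cons = ({"type": "eq", "fun": lambda p: p.sum() - 1.0},)
    res = minimize(objective, np.full(n, 1.0 / n), method="SLSQP",
                   bounds=[(0.0, 1.0)] * n, constraints=cons)
    return res.x

# Example: two sources (mafic, felsic) fingerprinted with two elements.
print(unmix([[9.0, 45.0], [3.0, 70.0]], [5.2, 60.8]))
```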

  6. Basic effects of pulp refining on fiber properties--a review.

    Science.gov (United States)

    Gharehkhani, Samira; Sadeghinezhad, Emad; Kazi, Salim Newaz; Yarmand, Hooman; Badarudin, Ahmad; Safaei, Mohammad Reza; Zubir, Mohd Nashrul Mohd

    2015-01-22

    The requirement for high quality pulps, which are widely used in paper industries, has increased the demand for the pulp refining (beating) process. Pulp refining is a promising approach to improving pulp quality by changing the fiber characteristics. The diversity of research on the effect of refining on fiber properties, which is due to the different pulp sources, pulp consistencies and refining equipment, has motivated us to provide a review of the studies over the last decade. In this article, the influence of pulp refining on structural properties, i.e., fibrillation, fine formation, fiber length, fiber curl, crystallinity and the distribution of surface chemical compositions, is reviewed. The effect of pulp refining on the electrokinetic properties of fiber, e.g., the surface and total charges of pulps, is discussed. In addition, an overview of different refining theories and refiners, as well as some tests for assessing pulp refining, is presented. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. South Korea - oil refining overview

    International Nuclear Information System (INIS)

    Hayes, D.

    1999-01-01

    Following the economic problems of the 1990s, the petroleum refining industry of South Korea underwent much involuntary restructuring in 1999 with respect to takeovers and mergers, and these are discussed. The demand for petroleum has now largely recovered. The reasons for fluctuating prices in the 1990s, how the new structure should be cushioned against changes in the future, and the potential for South Korea to export refined petroleum are all discussed

  8. Adaptive Mesh Refinement in CTH

    International Nuclear Information System (INIS)

    Crawford, David

    1999-01-01

    This paper reports progress on implementing a new capability of adaptive mesh refinement into the Eulerian multimaterial shock-physics code CTH. The adaptivity is block-based, with refinement and unrefinement occurring in an isotropic 2:1 manner. The code is designed to run on serial, multiprocessor and massively parallel platforms. An approximate factor of three improvement in memory and performance over comparable-resolution non-adaptive calculations has been demonstrated for a number of problems

  9. Steel refining possibilities in LF

    Science.gov (United States)

    Dumitru, M. G.; Ioana, A.; Constantin, N.; Ciobanu, F.; Pollifroni, M.

    2018-01-01

    This article presents the main possibilities for steel refining in the Ladle Furnace (LF). The following are presented: steelmaking stages; steel refining through argon bottom stirring; online control of the bottom stirring; the bottom stirring diagram during LF treatment of a heat; the influence of the porous plug on argon stirring; the bottom stirring porous plug; analysis of the placement of porous plugs on the ladle bottom surface; bottom stirring simulation with ANSYS; and bottom stirring simulation with Autodesk CFD.

  10. Parallel Adaptive Mesh Refinement for High-Order Finite-Volume Schemes in Computational Fluid Dynamics

    Science.gov (United States)

    Schwing, Alan Michael

    For computational fluid dynamics, the governing equations are solved on a discretized domain of nodes, faces, and cells. The quality of the grid or mesh can be a driving source of error in the results. While refinement studies can help guide the creation of a mesh, grid quality is largely determined by user expertise and understanding of the flow physics. Adaptive mesh refinement is a technique for enriching the mesh during a simulation based on metrics for error, impact on important parameters, or location of important flow features. This can offload from the user some of the difficult and ambiguous decisions necessary when discretizing the domain. This work explores the implementation of adaptive mesh refinement in an implicit, unstructured, finite-volume solver. Consideration is made for applying modern computational techniques in the presence of hanging nodes and refined cells. The approach is developed to be independent of the flow solver in order to provide a path for augmenting existing codes. It is designed to be applicable for unsteady simulations, and refinement and coarsening of the grid do not impact the conservatism of the underlying numerics. The effects on high-order numerical fluxes of fourth and sixth order are explored. Provided the criteria for refinement are appropriately selected, solutions obtained using adapted meshes have no additional error when compared to results obtained on traditional, unadapted meshes. In order to leverage the large-scale computational resources common today, the methods are parallelized using MPI. Parallel performance is considered for several test problems in order to assess the scalability of both adapted and unadapted grids. Dynamic repartitioning of the mesh during refinement is crucial for load balancing an evolving grid. Development of the methods outlined here depends on a dual-memory approach that is described in detail. Validation of the solver developed here against a number of motivating problems shows favorable
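
    The refinement-criteria point above can be illustrated with a toy flagging step: cells whose solution gradient exceeds a threshold are marked, and the flagged region is buffered so refined patches transition smoothly (a crude 1-D stand-in for 2:1 level balancing). Everything here, including the gradient-based indicator, is a simplified assumption rather than the dissertation's method.

```python
import numpy as np

def flag_for_refinement(u, dx, threshold):
    """Mark cells whose gradient magnitude exceeds `threshold`, then
    buffer the flags by one cell on each side so that the refined patch
    comfortably contains the feature."""
    flags = np.abs(np.gradient(u, dx)) > threshold
    buffered = flags.copy()
    buffered[:-1] |= flags[1:]
    buffered[1:] |= flags[:-1]
    return buffered

# Example: a smeared step profile; only cells near the front get flagged.
x = np.linspace(0.0, 1.0, 200)
u = np.tanh((x - 0.5) / 0.02)
print(np.flatnonzero(flag_for_refinement(u, x[1] - x[0], threshold=5.0)))
```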

  11. Automata Learning through Counterexample Guided Abstraction Refinement

    DEFF Research Database (Denmark)

    Aarts, Fides; Heidarian, Faranak; Kuppens, Harco

    2012-01-01

    Abstraction is the key when learning behavioral models of realistic systems. Hence, in most practical applications where automata learning is used to construct models of software components, researchers manually define abstractions which, depending on the history, map a large set of concrete events to a small set of abstract events that can be handled by automata learning tools. In this article, we show how such abstractions can be constructed fully automatically for a restricted class of extended finite state machines in which one can test for equality of data parameters, but no operations on data are allowed. Our approach uses counterexample-guided abstraction refinement: whenever the current abstraction is too coarse and induces nondeterministic behavior, the abstraction is refined automatically. Using Tomte, a prototype tool implementing our algorithm, we have succeeded to learn – fully …

  12. A Bayesian Approach to Model Selection in Hierarchical Mixtures-of-Experts Architectures.

    Science.gov (United States)

    Tanner, Martin A.; Peng, Fengchun; Jacobs, Robert A.

    1997-03-01

    There does not exist a statistical model that shows good performance on all tasks. Consequently, the model selection problem is unavoidable; investigators must decide which model is best at summarizing the data for each task of interest. This article presents an approach to the model selection problem in hierarchical mixtures-of-experts architectures. These architectures combine aspects of generalized linear models with those of finite mixture models in order to perform tasks via a recursive "divide-and-conquer" strategy. Markov chain Monte Carlo methodology is used to estimate the distribution of the architectures' parameters. One part of our approach to model selection attempts to estimate the worth of each component of an architecture so that relatively unused components can be pruned from the architecture's structure. A second part of this approach uses a Bayesian hypothesis testing procedure in order to differentiate inputs that carry useful information from nuisance inputs. Simulation results suggest that the approach presented here adheres to the dictum of Occam's razor; simple architectures that are adequate for summarizing the data are favored over more complex structures. Copyright 1997 Elsevier Science Ltd. All Rights Reserved.

  13. Refinement-Animation for Event-B - Towards a Method of Validation

    DEFF Research Database (Denmark)

    Hallerstede, Stefan; Leuschel, Michael; Plagge, Daniel

    2010-01-01

    We provide a detailed description of refinement in Event-B, both as a contribution in itself and as a foundation for the approach to simultaneous animation of multiple levels of refinement that we propose. We present an algorithm for simultaneous multi-level animation of refinement, and show how ...

  14. Strategic project selection based on evidential reasoning approach for high-end equipment manufacturing industry

    Directory of Open Access Journals (Sweden)

    Lu Guangyan

    2017-01-01

    Full Text Available With the rapid development of science and technology, emerging information technologies have significantly changed the daily life of people. In such a context, strategic project selection for high-end equipment manufacturing industries faces more and more complexities and uncertainties, with several complex criteria to consider. For example, a group of experts rather than a single expert should be invited to select a strategic project for high-end equipment manufacturing industries, and the experts may find it difficult to express their preferences towards different strategic projects due to their limited cognitive capabilities. In order to handle these complexities and uncertainties, the criteria framework of strategic project selection is first constructed based on the characteristics of high-end equipment manufacturing industries, and the evidential reasoning (ER) approach is then introduced in this paper to help experts express their uncertain preferences and aggregate these preferences to generate an appropriate strategic project. A real case of strategic project selection in a high-speed train manufacturing enterprise is investigated to demonstrate the validity of the ER approach in solving the strategic project selection problem.

  15. Safety approach to the selection of design criteria for the CRBRP reactor refueling system

    International Nuclear Information System (INIS)

    Meisl, C.J.; Berg, G.E.; Sharkey, N.F.

    1979-01-01

    The selection of safety design criteria for Liquid Metal Fast Breeder Reactor (LMFBR) refueling systems required the extrapolation of regulations and guidelines intended for Light Water Reactor refueling systems and was encumbered by the lack of benefit from a commercially licensed predecessor other than Fermi. The overall approach and underlying logic are described for developing safety design criteria for the reactor refueling system (RRS) of the Clinch River Breeder Reactor Plant (CRBRP). The complete selection process used to establish the criteria is presented, from the definition of safety functions to the finalization of safety design criteria in the appropriate documents. The process steps are illustrated by examples

  16. Refinement of Parallel and Reactive Programs

    OpenAIRE

    Back, R. J. R.

    1992-01-01

    We show how to apply the refinement calculus to stepwise refinement of parallel and reactive programs. We use action systems as our basic program model. Action systems are sequential programs which can be implemented in a parallel fashion. Hence refinement calculus methods, originally developed for sequential programs, carry over to the derivation of parallel programs. Refinement of reactive programs is handled by data refinement techniques originally developed for the sequential refinement c...

  17. Refinement of boards' role required.

    Science.gov (United States)

    Umbdenstock, R J

    1987-01-01

    The governing board's role in health care is not changing, but new competitive forces necessitate a refinement of the board's approach to fulfilling its role. In a free-standing, community, not-for-profit hospital, the board functions as though it were the "owner." Although it does not truly own the facility in the legal sense, the board does have legal, fiduciary, and financial responsibilities conferred on it by the state. In a religious-sponsored facility, the board fulfills these same obligations on behalf of the sponsoring institute, subject to the institute's reserved powers. In multi-institutional systems, the hospital board's power and authority depend on the role granted it by the system. Boards in all types of facilities are currently faced with the following challenges: Fulfilling their basic responsibilities, such as legal requirements, financial duties, and obligations for the quality of care. Encouraging management and the board itself to "think strategically" in attacking new competitive market forces while protecting the organization's traditional mission and values. Assessing recommended strategies in light of consequences if constituencies think the organization is abandoning its commitments. Boards can take several steps to match their mode of operation with the challenges of the new environment. Boards must rededicate themselves to the hospital's mission. Trustees must expand their understanding of health care trends and issues and their effect on the organization. Boards must evaluate and help strengthen management's performance, rather than acting as a "watchdog" in an adversarial position. Boards must think strategically, rather than focusing solely on operational details. Boards must evaluate the methods they use for conducting business.

  18. Selecting appropriate wastewater treatment technologies using a choosing-by-advantages approach.

    Science.gov (United States)

    Arroyo, Paz; Molinos-Senante, María

    2018-06-01

    Selecting the most sustainable wastewater treatment (WWT) technology among possible alternatives is a very complex task because the choice must integrate economic, environmental, and social criteria. Traditionally, several multi-criteria decision-making approaches have been applied, with the most often used being the analytical hierarchical process (AHP). However, AHP allows users to offset poor environmental and/or social performance with low cost. To overcome this limitation, our study examines a choosing-by-advantages (CBA) approach to rank seven WWT technologies for secondary WWT. CBA results were compared with results obtained by using the AHP approach. The rankings of WWT alternatives differed, depending on whether the CBA or AHP approach was used, which highlights the importance of the method used to support decision-making processes, particularly ones that rely on subjective interpretations by experts. This paper uses a holistic perspective to demonstrate the benefits of using the CBA approach to support a decision-making process when a group of experts must come to a consensus in selecting the most suitable WWT technology among several available.

  19. A robust fuzzy possibilistic AHP approach for partner selection in international strategic alliance

    Directory of Open Access Journals (Sweden)

    Vahid Reza Salamat

    2018-09-01

    Full Text Available The international strategic alliance is an inevitable solution for creating competitive advantage and reducing risk in today’s business environment. Partner selection is an important part of the success of partnerships, and meanwhile it is a complicated decision because of the various dimensions of the problem and the inherent conflicts of stakeholders. The purpose of this paper is to provide a practical approach to the problem of partner selection in international strategic alliances, one that fills the gap between theories of inter-organizational relationships and quantitative models. Thus, a novel Robust Fuzzy Possibilistic AHP approach is proposed for combining the benefits of two complementary theories of inter-organizational relationships, namely (1) the resource-based view and (2) transaction-cost theory, while considering fit theory as the prerequisite of alliance success. The Robust Fuzzy Possibilistic AHP approach is a novel development of the Interval-AHP technique employing a robust formulation, aimed at handling the ambiguity of the problem and allowing the use of intervals as pairwise judgments. The proposed approach was compared with existing approaches, and the results show that it provides the best quality solutions in terms of minimum error degree. Moreover, the framework was implemented in a case study and its applicability was discussed.

  20. Robot Evaluation and Selection with Entropy-Based Combination Weighting and Cloud TODIM Approach

    Directory of Open Access Journals (Sweden)

    Jing-Jing Wang

    2018-05-01

    Full Text Available Nowadays robots have been commonly adopted in various manufacturing industries to improve product quality and productivity. The selection of the best robot to suit a specific production setting is a difficult decision-making task for manufacturers because of the increasing complexity and number of robot systems. In this paper, we explore two key issues of robot evaluation and selection: the representation of decision makers’ diversified assessments and the determination of the ranking of available robots. Specifically, a decision support model which utilizes the cloud model and TODIM (an acronym in Portuguese for interactive and multi-criteria decision making) is developed for the purpose of handling robot selection problems with hesitant linguistic information. Besides, we use an entropy-based combination weighting technique to estimate the weights of the evaluation criteria. Finally, we illustrate the proposed cloud TODIM approach with a robot selection example for an automobile manufacturer, and further validate its effectiveness and benefits via a comparative analysis. The results show that the proposed robot selection model has some unique advantages, being more realistic and flexible for robot selection under a complex and uncertain environment.
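
    The entropy part of the weighting scheme is easy to make concrete. Below is a minimal sketch of classical Shannon-entropy criteria weighting on a crisp decision matrix (the paper combines this with other weights and with cloud-model assessments, which are not reproduced here); the matrix values are hypothetical.

```python
import numpy as np

# Hypothetical decision matrix: rows = candidate robots, columns = criteria
X = np.array([
    [80., 12., 0.90],
    [65., 10., 0.85],
    [90., 15., 0.80],
    [70., 11., 0.95],
])
m, n = X.shape

# Normalize each column so entries behave like probabilities
P = X / X.sum(axis=0)

# Shannon entropy per criterion, scaled to [0, 1] by k = 1/ln(m)
k = 1.0 / np.log(m)
entropy = -k * np.nansum(P * np.log(P), axis=0)

# Lower entropy -> more discriminating criterion -> larger weight
d = 1.0 - entropy
weights = d / d.sum()
print("entropy-based criteria weights:", np.round(weights, 3))
```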

  1. Local multigrid mesh refinement in view of nuclear fuel 3D modelling in pressurised water reactors

    International Nuclear Information System (INIS)

    Barbie, L.

    2013-01-01

    The aim of this study is to improve the performance, in terms of memory space and computational time, of the current modelling of the Pellet-Cladding mechanical Interaction (PCI), a complex phenomenon which may occur during high power rises in pressurised water reactors. Among mesh refinement methods - methods dedicated to efficiently treating local singularities - a local multi-grid approach was selected because it enables the use of a black-box solver while dealing with few degrees of freedom at each level. The Local Defect Correction (LDC) method, well suited to a finite element discretization, was first analysed and checked in linear elasticity, on configurations resulting from the PCI, since its use in solid mechanics is not widespread. Various strategies concerning the implementation of the multilevel algorithm were also compared. Coupling the LDC method with the Zienkiewicz-Zhu a posteriori error estimator, in order to automatically detect the zones to be refined, was then tested. Performances obtained on two-dimensional and three-dimensional cases are very satisfactory, since the algorithm proposed is more efficient than h-adaptive refinement methods. Lastly, the LDC algorithm was extended to nonlinear mechanics. Space/time refinement as well as transmission of the initial conditions during the re-meshing step were examined. The first results obtained are encouraging and show the interest of using the LDC method for PCI modelling. (author) [fr
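
    To make the Local Defect Correction idea concrete, here is a minimal 1D sketch, far removed from the thesis's finite element setting: a Poisson problem with a steep source near the left boundary, a global coarse grid, one local fine patch, and the defect correction loop. Grid sizes, the patch location, and the source term are all hypothetical.

```python
import numpy as np

def laplacian(n, h):
    # 1D finite-difference operator for -u'' with Dirichlet BCs, n interior nodes
    return (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2

# -u'' = f on (0,1), u(0) = u(1) = 0, with a steep feature near x = 0.1
f = lambda x: 1e4 * np.exp(-((x - 0.1) / 0.02) ** 2)

NC = 19                        # interior coarse nodes -> H = 0.05
H = 1.0 / (NC + 1)
xc = H * np.arange(1, NC + 1)
Ac, fc = laplacian(NC, H), f(xc)

a, r = 0.25, 4                 # fine patch (0, a), refinement factor r
h = H / r
nf = int(round(a / h)) - 1     # interior fine nodes in (0, a)
xf = h * np.arange(1, nf + 1)
Af = laplacian(nf, h)

defect = np.zeros(NC)
for it in range(5):
    # 1) global coarse solve with the current defect correction
    uc = np.linalg.solve(Ac, fc + defect)
    # 2) local fine solve; interface value at x = a taken from the coarse solution
    u_a = np.interp(a, np.r_[0.0, xc, 1.0], np.r_[0.0, uc, 0.0])
    rhs = f(xf)
    rhs[-1] += u_a / h**2      # lift the Dirichlet interface condition
    uf = np.linalg.solve(Af, rhs)
    # 3) defect: coarse operator applied to the fine solution restricted to the
    #    coarse nodes inside the patch, fed back into the coarse right-hand side
    inside = xc < a - 1e-12
    u_comp = np.where(inside, np.interp(xc, xf, uf), uc)
    defect = np.where(inside, Ac @ u_comp - fc, 0.0)

print("corrected coarse solution near the peak:", np.round(uc[:4], 4))
```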

  2. Predictive Validity of an Empirical Approach for Selecting Promising Message Topics: A Randomized-Controlled Study

    Science.gov (United States)

    Lee, Stella Juhyun; Brennan, Emily; Gibson, Laura Anne; Tan, Andy S. L.; Kybert-Momjian, Ani; Liu, Jiaying; Hornik, Robert

    2016-01-01

    Several message topic selection approaches propose that messages based on beliefs pretested and found to be more strongly associated with intentions will be more effective in changing population intentions and behaviors when used in a campaign. This study aimed to validate the underlying causal assumption of these approaches which rely on cross-sectional belief–intention associations. We experimentally tested whether messages addressing promising themes as identified by the above criterion were more persuasive than messages addressing less promising themes. Contrary to expectations, all messages increased intentions. Interestingly, mediation analyses showed that while messages deemed promising affected intentions through changes in targeted promising beliefs, messages deemed less promising also achieved persuasion by influencing nontargeted promising beliefs. Implications for message topic selection are discussed. PMID:27867218

  3. A hybrid agent-based computational economics and optimization approach for supplier selection problem

    Directory of Open Access Journals (Sweden)

    Zahra Pourabdollahi

    2017-12-01

    Full Text Available The supplier evaluation and selection problem is among the most important logistics decisions and has been addressed extensively in supply chain management. The same logistics decision is also important in freight transportation, since it identifies trade relationships between business establishments and determines commodity flows between production and consumption points. The commodity flows are then used as input to freight transportation models to determine cargo movements and their characteristics, including mode choice and shipment size. Various approaches have been proposed to explore this problem in previous studies. Traditionally, potential suppliers are evaluated and selected using only price/cost as the influential criterion, via state-of-practice methods. This paper introduces a hybrid agent-based computational economics and optimization approach for supplier selection. The proposed model combines an agent-based multi-criteria supplier evaluation approach with a multi-objective optimization model to capture both behavioral and economic aspects of the supplier selection process. The model uses a system of ordered response models to determine the importance weights of the different criteria in supplier evaluation from a buyer’s point of view. The estimated weights are then used to calculate a utility for each potential supplier in the market and rank them. The calculated utilities are then entered into a mathematical programming model in which the best suppliers are selected by maximizing the total accrued utility for all buyers and minimizing total shipping costs, while respecting the capacity of potential suppliers to ensure market clearing. The proposed model was implemented under an operational agent-based supply chain and freight transportation framework for the Chicago Metropolitan Area.
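
    The final optimization step — pick suppliers to maximize total buyer utility net of shipping cost, subject to supplier capacity — can be sketched as a small linear program. The utilities, costs, demands, and capacities below are hypothetical stand-ins for the model's estimated quantities, and the binary choice is relaxed to a fractional assignment.

```python
import numpy as np
from scipy.optimize import linprog

B, S = 3, 2                                     # buyers, suppliers
u = np.array([[5., 3.], [4., 6.], [7., 2.]])    # utility of buyer b using supplier s
c = np.array([[1., 2.], [2., 1.], [1., 3.]])    # shipping cost per unit of demand
demand = np.array([10., 20., 15.])
cap = np.array([30., 25.])

# Decision variables x[b, s] in [0, 1], flattened row-major
obj = -(u - c).ravel()          # linprog minimizes, so negate the net utility

# Each buyer's demand is fully assigned: sum_s x[b, s] = 1
A_eq = np.zeros((B, B * S))
for b in range(B):
    A_eq[b, b * S:(b + 1) * S] = 1.0
b_eq = np.ones(B)

# Capacity: sum_b demand[b] * x[b, s] <= cap[s]
A_ub = np.zeros((S, B * S))
for s in range(S):
    A_ub[s, s::S] = demand
b_ub = cap

res = linprog(obj, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, 1))
print("assignment fractions:\n", res.x.reshape(B, S).round(2))
```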

  4. Logistics Service Provider Selection through an Integrated Fuzzy Multicriteria Decision Making Approach

    OpenAIRE

    Gülşen Akman; Kasım Baynal

    2014-01-01

    Nowadays, the demand for third-party logistics providers has become an increasingly important issue for companies seeking to improve their customer service and to decrease logistics costs. This paper presents an integrated fuzzy approach for the evaluation and selection of 3rd party logistics service providers. This method consists of two techniques: (1) use the fuzzy analytic hierarchy process to identify the weights of the evaluation criteria; (2) apply the fuzzy technique for order preference by similarity to ideal so...

  5. A behavioural approach to financial portfolio selection problem: an empirical study using heuristics

    OpenAIRE

    Grishina, Nina

    2014-01-01

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. The behaviourally based portfolio selection problem, with the investor's loss aversion and risk aversion biases in portfolio choice under uncertainty, is studied. The main results of this work are developed heuristic approaches for the prospect theory and cumulative prospect theory models proposed by Kahneman and Tversky in 1979 and 1992, as well as an empirical comparative analysis of these models ...

  6. A trait-based approach reveals the feeding selectivity of a small endangered Mediterranean fish

    OpenAIRE

    Rodriguez-Lozano, Pablo; Verkaik, Iraima; Maceda Veiga, Alberto; Monroy, Mario; de Sostoa, Adolf; Rieradevall, Maria; Prat, Narcis

    2016-01-01

    Abstract Functional traits are growing in popularity in modern ecology, but feeding studies remain primarily rooted in a taxonomic-based perspective. However, consumers do not have any reason to select their prey using a taxonomic criterion, and prey assemblages are variable in space and time, which makes taxon-based studies assemblage-specific. To illustrate the benefits of the trait-based approach to assessing food choice, we studied the feeding ecology of the endangered freshwater fish Bar...

  7. Selection of engineering materials for heat exchangers (An expert system approach)

    International Nuclear Information System (INIS)

    Ahmed, K.; Abou-Ali, M.; Bassuni, M.

    1997-01-01

    The selection of materials, as part of the design process for heat exchangers, is one of the most important steps in the whole industry. Clear recognition of the service requirements of the different types of heat exchangers is very important in order to select adequate and economic materials to meet those requirements. Of course, the manufacturer should ensure that failure does not occur in service, especially as the heat exchanger is one of the main and most critical components of a nuclear reactor of the pressurized water type (PWR). It is necessary to know the possible mechanisms of failure. The achievement of materials selection using an expert system approach in the process sequence of heat exchanger manufacturing is also introduced. The different parameters and requirements controlling each process, and the linkage between these parameters and the final product, are shown. 2 figs., 3 tabs

  8. A semiparametric graphical modelling approach for large-scale equity selection.

    Science.gov (United States)

    Liu, Han; Mulvey, John; Zhao, Tianqi

    2016-01-01

    We propose a new stock selection strategy that exploits rebalancing returns and improves portfolio performance. To effectively harvest rebalancing gains, we apply ideas from elliptical-copula graphical modelling and stability inference to select stocks that are as independent as possible. The proposed elliptical-copula graphical model has a latent Gaussian representation; its structure can be effectively inferred using the regularized rank-based estimators. The resulting algorithm is computationally efficient and scales to large data-sets. To show the efficacy of the proposed method, we apply it to conduct equity selection based on a 16-year health care stock data-set and a large 34-year stock data-set. Empirical tests show that the proposed method is superior to alternative strategies including a principal component analysis-based approach and the classical Markowitz strategy based on the traditional buy-and-hold assumption.
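
    A rough sketch of the selection idea: estimate a latent correlation matrix with a rank-based (Kendall's tau) estimator, run the graphical lasso on it, and prefer stocks with few edges in the resulting conditional-dependence graph, i.e., stocks that are as independent as possible. This uses generic SciPy/scikit-learn tools rather than the authors' regularized rank-based estimators, and the return data are synthetic.

```python
import numpy as np
from scipy.stats import kendalltau
from sklearn.covariance import graphical_lasso

rng = np.random.default_rng(0)
n, p = 400, 8                            # trading days, stocks
returns = rng.standard_normal((n, p))
returns[:, 1] += 0.8 * returns[:, 0]     # build in some dependence

# Rank-based latent correlation: sin(pi/2 * Kendall's tau), robust to
# non-Gaussian marginals under an elliptical-copula model
R = np.eye(p)
for i in range(p):
    for j in range(i + 1, p):
        tau = kendalltau(returns[:, i], returns[:, j])[0]
        R[i, j] = R[j, i] = np.sin(np.pi * tau / 2.0)

# Sparse inverse-correlation (conditional dependence) graph
_, precision = graphical_lasso(R, alpha=0.2)
adj = (np.abs(precision) > 1e-6) & ~np.eye(p, dtype=bool)

# Prefer stocks with the fewest conditional dependencies
degree = adj.sum(axis=1)
print("stocks ranked by independence:", np.argsort(degree))
```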

  9. Usefulness of selective cerebral intra-arterial digital subtraction angiography by transbrachial approach

    International Nuclear Information System (INIS)

    Matsunaga, Naofumi; Hayashi, Kuniaki; Uetani, Masataka; Hirao, Koichi; Fukuda, Toshio; Aikawa, Hisayuki; Iwao, Masaaki; Hombo, Zen-ichiro

    1988-01-01

    Selective cerebral intra-arterial digital subtraction angiography (IA-DSA) by the transbrachial approach was performed on 53 patients (including 34 outpatients) with suspected cerebrovascular diseases or brain tumors. An 80-cm-long 4F modified Simmons catheter was used. Success rates of selective catheterization of the common carotid and vertebral arteries were 86.0 % from the right transbrachial approach (35 cases) and 79.6 % from the left approach (18 cases). Successful catheterization of the common carotid and ipsilateral vertebral arteries was obtained in 91.3 % from the right transbrachial approach, and in 78.7 % from the left approach. The right common carotid artery could be catheterized in all 55 cases from the right transbrachial approach, but in only 6 of 15 patients (40 %) from the left approach. As for contrast material, 4 or 6 ml of Iopamidol 300 mgI/ml were mechanically injected into the common carotid artery at a flow rate of 2 - 3 ml/sec, and 9 ml of two-fold diluted Iopamidol were injected into the vertebral artery at a flow rate of 6 ml/sec. There was no recoil of the catheter. Visualization of relatively small vessels such as cortical branches was excellent in most cases. However, smaller vessels such as the meningohypophyseal trunk were not well visualized with IA-DSA. Spatial resolution of IA-DSA was generally satisfactory. However, conventional angiography was still required, particularly to clearly delineate small cerebral aneurysms. Major complications were never experienced. It was concluded that this procedure is useful, particularly for screening and postoperative follow-up studies, and can also be applied to outpatients. (author)

  10. Fuzzy hybrid MCDM approach for selection of wind turbine service technicians

    Directory of Open Access Journals (Sweden)

    Goutam Kumar Bose

    2016-01-01

    Full Text Available This research paper presents a fuzzy hybrid multi-criteria decision making (MCDM) methodology for selecting employees. The study takes a tactical viewpoint, using a hybrid of fuzzy MCDM techniques to support the recruitment of wind turbine service technicians. The methodology is based on the application of Fuzzy ARAS (Additive Ratio Assessment) and Fuzzy MOORA (Multi-Objective Optimization on the basis of Ratio Analysis), which are integrated through a group decision making (GDM) method in the model for ranking wind turbine service technicians. Here a group of experts from different fields of expertise is engaged to finalize the decision. A series of tests is conducted covering physical fitness, a technical written test, and a practical test, along with a general interview and medical examination, to facilitate the final selection using the above techniques. In contrast to single decision making approaches, the proposed group decision making model efficiently supports the ranking process for wind turbine service technicians. The effectiveness of the proposed approach is demonstrated by a case study of service technicians required for the maintenance department of a wind power plant using Fuzzy ARAS and Fuzzy MOORA. The set of potential technicians is evaluated against five main criteria.

  11. A Fuzzy MCDM Approach for Green Supplier Selection from the Economic and Environmental Aspects

    Directory of Open Access Journals (Sweden)

    Hsiu Mei Wang Chen

    2016-01-01

    Full Text Available Due to the challenge of rising public awareness of environmental issues and governmental regulations, green supply chain management (SCM) has become an important issue for companies seeking environmental sustainability. Supplier selection is one of the key operational tasks necessary to construct a green SCM. To select the most suitable suppliers, many economic and environmental criteria must be considered in the decision process. Although numerous studies have used economic criteria such as cost, quality, and lead time in the supplier selection process, only some studies have taken the environmental issues into account. This study proposes a comprehensive fuzzy multicriteria decision making (MCDM) approach for green supplier selection and evaluation, using both economic and environmental criteria. In the proposed approach, a fuzzy analytic hierarchy process (AHP) is employed to determine the importance weights of the criteria under a vague environment. In addition, a fuzzy technique for order performance by similarity to ideal solution (TOPSIS) is used to evaluate and rank the potential suppliers. Finally, a case study in the Luminance Enhancement Film (LEF) industry is presented to illustrate the applicability and efficiency of the proposed method.
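
    A compact sketch of the fuzzy TOPSIS step with triangular fuzzy numbers (TFNs): ratings are normalized, weighted, measured against fuzzy positive and negative ideal solutions by vertex distance, and collapsed into a closeness coefficient. The ratings and weights below are hypothetical, all criteria are treated as benefit criteria for brevity, and the elementwise max/min ideals are a common simplification.

```python
import numpy as np

# Triangular fuzzy ratings: shape (suppliers, criteria, 3) for (l, m, u)
X = np.array([
    [[5, 7, 9], [3, 5, 7], [7, 9, 9]],
    [[3, 5, 7], [5, 7, 9], [5, 7, 9]],
    [[7, 9, 9], [1, 3, 5], [3, 5, 7]],
], dtype=float)
W = np.array([[0.3, 0.5, 0.7], [0.1, 0.3, 0.5], [0.5, 0.7, 0.9]])  # fuzzy weights

# Linear-scale normalization for benefit criteria: divide by the max upper value
u_max = X[..., 2].max(axis=0)                  # per criterion
V = X / u_max[None, :, None] * W[None, :, :]   # weighted normalized TFNs

# Fuzzy ideals per criterion, and vertex distance between TFNs
fpis = V.max(axis=0)                           # fuzzy positive ideal solution
fnis = V.min(axis=0)                           # fuzzy negative ideal solution
dist = lambda a, b: np.sqrt(((a - b) ** 2).mean(axis=-1))

d_plus = dist(V, fpis[None]).sum(axis=1)       # distance to FPIS per supplier
d_minus = dist(V, fnis[None]).sum(axis=1)      # distance to FNIS per supplier
cc = d_minus / (d_plus + d_minus)              # closeness coefficient
print("supplier ranking (best first):", np.argsort(-cc))
```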

  12. An Adaptive Learning Based Network Selection Approach for 5G Dynamic Environments

    Directory of Open Access Journals (Sweden)

    Xiaohong Li

    2018-03-01

    Full Text Available Networks will continue to become increasingly heterogeneous as we move toward 5G. Meanwhile, intelligent programming of the core network makes the available radio resources changeable rather than static. In such a dynamic and heterogeneous network environment, helping terminal users select optimal networks to access is challenging. Prior implementations of network selection are usually applicable to environments with static radio resources, and they cannot handle the unpredictable dynamics of 5G network environments. To this end, this paper considers both the fluctuation of radio resources and the variation of user demand. We model the access network selection scenario as a multiagent coordination problem, in which a group of rational terminal users compete to maximize their benefits with incomplete information about the environment (no prior knowledge of network resources or other users’ choices). Then, an adaptive learning based strategy is proposed, which enables users to adaptively adjust their selections in response to a gradually or abruptly changing environment. The system is experimentally shown to converge to a Nash equilibrium, which also turns out to be both Pareto optimal and socially optimal. Extensive simulation results show that our approach achieves significantly better performance compared with two learning and non-learning based approaches in terms of load balancing, user payoff and overall bandwidth utilization efficiency. In addition, the system remains robust in the presence of non-compliant terminal users.
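
    As a toy illustration of adaptive, payoff-driven network selection (not the paper's exact algorithm), each user keeps a probability vector over networks and nudges it toward choices that yielded high payoff — a linear reward-inaction learning automaton. The payoff here is a hypothetical congestion-sensitive bandwidth share.

```python
import numpy as np

rng = np.random.default_rng(1)
n_users, n_nets, lr = 20, 3, 0.1
capacity = np.array([10.0, 6.0, 4.0])      # hypothetical network capacities

# Each user starts with a uniform selection probability vector
probs = np.full((n_users, n_nets), 1.0 / n_nets)

for step in range(2000):
    # Every user samples a network from its current mixed strategy
    choices = np.array([rng.choice(n_nets, p=p) for p in probs])
    load = np.bincount(choices, minlength=n_nets)
    payoff = capacity[choices] / np.maximum(load[choices], 1)  # shared bandwidth
    reward = payoff / capacity.max()                           # scale to [0, 1]
    # Linear reward-inaction update toward the chosen action
    for i, a in enumerate(choices):
        probs[i] *= 1.0 - lr * reward[i]
        probs[i, a] += lr * reward[i]

# Loads should settle roughly in proportion to capacity (about 10:6:4 users)
print("final load split:", np.bincount(choices, minlength=n_nets))
```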

  13. Romanian refining industry assesses restructuring

    International Nuclear Information System (INIS)

    Tanasescu, D.G.

    1991-01-01

    The Romanian crude oil refining industry, as all the other economic sectors, faces the problems accompanying the transition from a centrally planned economy to a market economy. At present, all refineries have registered as joint-stock companies and all are coordinated and assisted by Rafirom S.A., from both a legal and a production point of view. Rafirom S.A. is a joint-stock company that holds shares in refineries and other stock companies with activities related to oil refining. Such activities include technological research, development, design, transportation, storage, and domestic and foreign marketing. This article outlines the market forces that are expected to: drive rationalization and restructuring of refining operations and define the targets toward which the reconfigured refineries should strive

  14. Data refinement for true concurrency

    Directory of Open Access Journals (Sweden)

    Brijesh Dongol

    2013-05-01

    Full Text Available The majority of modern systems exhibit sophisticated concurrent behaviour, where several system components modify and observe the system state with fine-grained atomicity. Many systems (e.g., multi-core processors, real-time controllers) also exhibit truly concurrent behaviour, where multiple events can occur simultaneously. This paper presents data refinement defined in terms of an interval-based framework, which includes high-level operators that capture non-deterministic expression evaluation. By modifying the type of an interval, our theory may be specialised to cover data refinement of both discrete and continuous systems. We present an interval-based encoding of forward simulation, then prove that our forward simulation rule is sound with respect to our data refinement definition. A number of rules for decomposing forward simulation proofs over both sequential and parallel composition are developed.

  15. Bauxite Mining and Alumina Refining

    Science.gov (United States)

    Frisch, Neale; Olney, David

    2014-01-01

    Objective: To describe bauxite mining and alumina refining processes and to outline the relevant physical, chemical, biological, ergonomic, and psychosocial health risks. Methods: Review article. Results: The most important risks relate to noise, ergonomics, trauma, and caustic soda splashes of the skin/eyes. Other risks of note relate to fatigue, heat, and solar ultraviolet and for some operations tropical diseases, venomous/dangerous animals, and remote locations. Exposures to bauxite dust, alumina dust, and caustic mist in contemporary best-practice bauxite mining and alumina refining operations have not been demonstrated to be associated with clinically significant decrements in lung function. Exposures to bauxite dust and alumina dust at such operations are also not associated with the incidence of cancer. Conclusions: A range of occupational health risks in bauxite mining and alumina refining require the maintenance of effective control measures. PMID:24806720

  16. An improved discriminative filter bank selection approach for motor imagery EEG signal classification using mutual information.

    Science.gov (United States)

    Kumar, Shiu; Sharma, Alok; Tsunoda, Tatsuhiko

    2017-12-28

    Common spatial pattern (CSP) has been an effective technique for feature extraction in electroencephalography (EEG) based brain computer interfaces (BCIs). However, motor imagery EEG signal feature extraction using CSP generally depends to a great extent on the selection of the frequency bands. In this study, we propose a mutual information based frequency band selection approach. The idea of the proposed method is to utilize the information from all the available channels for effectively selecting the most discriminative filter banks. CSP features are extracted from multiple overlapping sub-bands. An additional sub-band has been introduced that covers the wide frequency band (7-30 Hz), and two different types of features are extracted using CSP and common spatio-spectral pattern techniques, respectively. Mutual information is then computed from the extracted features of each of these bands, and the top filter banks are selected for further processing. Linear discriminant analysis is applied to the features extracted from each of the filter banks. The scores are fused together, and classification is done using a support vector machine. The proposed method is evaluated using BCI Competition III dataset IVa, BCI Competition IV dataset I and BCI Competition IV dataset IIb, and it outperformed all other competing methods, achieving the lowest misclassification rate and the highest kappa coefficient on all three datasets. By introducing a wide sub-band and using mutual information for selecting the most discriminative sub-bands, the proposed method shows improvement in motor imagery EEG signal classification.
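
    A condensed sketch of the band-scoring idea: compute CSP features for each candidate sub-band and keep the bands whose features carry the most mutual information about the class labels. The EEG data here are synthetic, band-pass filtering is assumed to have already produced `band_data`, and the rest of the pipeline (LDA per band, score fusion, SVM) is omitted.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.feature_selection import mutual_info_classif

def csp_features(trials, labels, n_pairs=2):
    # trials: (n_trials, n_channels, n_samples), binary labels in {0, 1}
    covs = [np.mean([t @ t.T / np.trace(t @ t.T)
                     for t in trials[labels == c]], axis=0) for c in (0, 1)]
    # Generalized eigenproblem: cov0 w = lambda (cov0 + cov1) w
    _, vecs = eigh(covs[0], covs[0] + covs[1])
    W = np.hstack([vecs[:, :n_pairs], vecs[:, -n_pairs:]])  # extreme filters
    # Log-variance of the spatially filtered signals as features
    return np.array([np.log(np.var(W.T @ t, axis=1)) for t in trials])

rng = np.random.default_rng(2)
n_bands, n_trials, n_ch, n_s = 3, 80, 8, 200
labels = rng.integers(0, 2, n_trials)
# Hypothetical sub-band data, as if already band-pass filtered
band_data = rng.standard_normal((n_bands, n_trials, n_ch, n_s))
band_data[1, labels == 1, 0] *= 2.0    # make band 1 class-discriminative

scores = [mutual_info_classif(csp_features(band_data[b], labels),
                              labels, random_state=0).sum()
          for b in range(n_bands)]
print("band MI scores:", np.round(scores, 3))
print("most discriminative band:", int(np.argmax(scores)))
```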

  17. SIMULATION EXPERIMENT ON LANDING SITE SELECTION USING A SIMPLE GEOMETRIC APPROACH

    Directory of Open Access Journals (Sweden)

    W. Zhao

    2017-07-01

    Full Text Available Safe landing is an important part of a planetary exploration mission. Even fine-scale terrain hazards (such as rocks, small craters, and steep slopes, which would not be accurately detected from orbital reconnaissance) can pose a serious risk to a planetary lander or rover and the scientific instruments on board it. In this paper, a simple geometric approach to planetary landing hazard detection and safe landing site selection is proposed. In order to achieve full implementation of this algorithm, two easy-to-compute metrics are presented for extracting terrain slope and roughness information. Unlike conventional methods, which must do robust plane fitting and elevation interpolation for DEM generation, in this work hazards are identified by processing the LiDAR point cloud directly. For safe landing site selection, a Generalized Voronoi Diagram is constructed. Based on the idea of the maximum empty circle, the safest landing site can be determined. In this algorithm, hazards are treated as general polygons, without special simplification (e.g. regarding hazards as discrete circles or ellipses). Processing hazards this way conforms more closely to the real planetary exploration scenario. For validating the approach, a simulated planetary terrain model was constructed using volcanic ash with rocks in an indoor environment. A commercial laser scanner mounted on a rail was used to scan the terrain surface at different hanging positions. The results demonstrate fair hazard detection capability and reasonable site selection compared with a conventional method, with less computational time and memory usage. Hence, it is a feasible candidate approach for future precision landing site selection on a planetary surface.

  18. Simulation Experiment on Landing Site Selection Using a Simple Geometric Approach

    Science.gov (United States)

    Zhao, W.; Tong, X.; Xie, H.; Jin, Y.; Liu, S.; Wu, D.; Liu, X.; Guo, L.; Zhou, Q.

    2017-07-01

    Safe landing is an important part of a planetary exploration mission. Even fine-scale terrain hazards (such as rocks, small craters, and steep slopes, which would not be accurately detected from orbital reconnaissance) can pose a serious risk to a planetary lander or rover and the scientific instruments on board it. In this paper, a simple geometric approach to planetary landing hazard detection and safe landing site selection is proposed. In order to achieve full implementation of this algorithm, two easy-to-compute metrics are presented for extracting terrain slope and roughness information. Unlike conventional methods, which must do robust plane fitting and elevation interpolation for DEM generation, in this work hazards are identified by processing the LiDAR point cloud directly. For safe landing site selection, a Generalized Voronoi Diagram is constructed. Based on the idea of the maximum empty circle, the safest landing site can be determined. In this algorithm, hazards are treated as general polygons, without special simplification (e.g. regarding hazards as discrete circles or ellipses). Processing hazards this way conforms more closely to the real planetary exploration scenario. For validating the approach, a simulated planetary terrain model was constructed using volcanic ash with rocks in an indoor environment. A commercial laser scanner mounted on a rail was used to scan the terrain surface at different hanging positions. The results demonstrate fair hazard detection capability and reasonable site selection compared with a conventional method, with less computational time and memory usage. Hence, it is a feasible candidate approach for future precision landing site selection on a planetary surface.
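
    The maximum-empty-circle step has a classical computational-geometry shortcut: the center of the largest circle empty of hazard points lies on a vertex of their Voronoi diagram (or on the boundary of the search region). The sketch below treats hazards as points rather than the general polygons the paper handles; the hazard locations, bounds, and units are hypothetical.

```python
import numpy as np
from scipy.spatial import Voronoi, cKDTree

rng = np.random.default_rng(3)
hazards = rng.uniform(0, 100, size=(40, 2))   # hypothetical hazard locations (m)

vor = Voronoi(hazards)
tree = cKDTree(hazards)

# Keep only Voronoi vertices inside the landing-zone bounding box
v = vor.vertices
inside = (v[:, 0] >= 0) & (v[:, 0] <= 100) & (v[:, 1] >= 0) & (v[:, 1] <= 100)
candidates = v[inside]

# The safest site maximizes the clearance to the nearest hazard
clearance, _ = tree.query(candidates)
best = candidates[np.argmax(clearance)]
print(f"safest site: {best.round(2)}, clearance: {clearance.max():.2f} m")
```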

  19. Combining epidemiology and biomechanics in sports injury prevention research: a new approach for selecting suitable controls.

    Science.gov (United States)

    Finch, Caroline F; Ullah, Shahid; McIntosh, Andrew S

    2011-01-01

    Several important methodological issues need to be considered when designing sports injury case-control studies. Major design goals for case-control studies include the accounting for prior injury risk exposure, and optimal definitions of both cases and suitable controls are needed to ensure this. This article reviews methodological aspects of published sports injury case-control studies, particularly with regard to the selection of controls. It argues for a new approach towards selecting controls for case-control studies that draws on an interface between epidemiological and biomechanical concepts. A review was conducted to identify sport injury case-control studies published in the peer-review literature during 1985-2008. Overall, 32 articles were identified, of which the majority related to upper or lower extremity injuries. Matching considerations were used for control selection in 16 studies. Specific mention of application of biomechanical principles in the selection of appropriate controls was absent from all studies, including those purporting to evaluate the benefits of personal protective equipment to protect against impact injury. This is a problem because it could lead to biased conclusions, as cases and controls are not fully comparable in terms of similar biomechanical impact profiles relating to the injury incident, such as site of the impact on the body. The strength of the conclusions drawn from case-control studies, and the extent to which results can be generalized, is directly influenced by the definition and recruitment of cases and appropriate controls. Future studies should consider the interface between epidemiological and biomechanical concepts when choosing appropriate controls to ensure that proper adjustment of prior exposure to injury risk is made. To provide necessary guidance for the optimal selection of controls in case-control studies of interventions to prevent sports-related impact injury, this review outlines a new case

  20. Refinement of RAIM via Implementation of Implicit Euler Method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yoonhee; Kim, Han-Chul [Korea Institute of Nuclear and Safety, Daejeon (Korea, Republic of)]

    2016-10-15

    The first approach is a mechanistic approach, used in LIRIC, in which more than 200 reactions are modeled in detail. This approach enables detailed analysis, but it imposes a huge computational burden. The other approach is a simplified model approach, used in IMOD, ASTEC/IODE, etc. Recently, KINS has developed RAIM (Radio-Active Iodine chemistry Model) based on the simplified model approach. Since the numerical analysis module in RAIM is based on the explicit Euler method, there are major issues with the stability of the module. Therefore, implementation of a stable numerical method becomes essential. In this study, RAIM is refined via implementation of the implicit Euler method, in which the Newton method is used to find the solutions at each time step. The refined RAIM is tested by comparison with RAIM based on the explicit Euler method. In this paper, RAIM was refined by implementing the implicit Euler method. At each time step of the method in the refined RAIM, the reaction kinetics equations are solved by the Newton method, in which the elements of the Jacobian matrix are expressed analytically. With the results of the OECD-BIP P10T2 test, the refined RAIM was compared to RAIM with the explicit Euler method. The refined RAIM shows better agreement with the experimental data than the explicit Euler method. For the rapid change of pH during the experiment, the refined RAIM gives more realistic changes in the concentrations of chemical species than the explicit Euler method. In addition, in terms of computing time, the refined RAIM is comparable to the explicit Euler method. This is attributed to the ~10 times larger time step size used in the implicit Euler method, even though the computational burden at each time step in the refined RAIM is much higher than that of the explicit Euler method. Compared to the experimental data, the refined RAIM still shows discrepancy, which are attributed

  1. Refinement of RAIM via Implementation of Implicit Euler Method

    International Nuclear Information System (INIS)

    Lee, Yoonhee; Kim, Han-Chul

    2016-01-01

    The first approach is a mechanistic approach, used in LIRIC, in which more than 200 reactions are modeled in detail. This approach enables detailed analysis, but it imposes a huge computational burden. The other approach is a simplified model approach, used in IMOD, ASTEC/IODE, etc. Recently, KINS has developed RAIM (Radio-Active Iodine chemistry Model) based on the simplified model approach. Since the numerical analysis module in RAIM is based on the explicit Euler method, there are major issues with the stability of the module. Therefore, implementation of a stable numerical method becomes essential. In this study, RAIM is refined via implementation of the implicit Euler method, in which the Newton method is used to find the solutions at each time step. The refined RAIM is tested by comparison with RAIM based on the explicit Euler method. In this paper, RAIM was refined by implementing the implicit Euler method. At each time step of the method in the refined RAIM, the reaction kinetics equations are solved by the Newton method, in which the elements of the Jacobian matrix are expressed analytically. With the results of the OECD-BIP P10T2 test, the refined RAIM was compared to RAIM with the explicit Euler method. The refined RAIM shows better agreement with the experimental data than the explicit Euler method. For the rapid change of pH during the experiment, the refined RAIM gives more realistic changes in the concentrations of chemical species than the explicit Euler method. In addition, in terms of computing time, the refined RAIM is comparable to the explicit Euler method. This is attributed to the ~10 times larger time step size used in the implicit Euler method, even though the computational burden at each time step in the refined RAIM is much higher than that of the explicit Euler method. Compared to the experimental data, the refined RAIM still shows discrepancy, which are attributed
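
    A minimal sketch of the numerical refinement described here: one implicit Euler step solves y_{n+1} = y_n + h f(y_{n+1}) by Newton iteration with an analytic Jacobian. The kinetics below are a hypothetical two-species reversible reaction, not RAIM's iodine chemistry, and the step size deliberately exceeds the explicit Euler stability limit (about 2/(k1+k2) here).

```python
import numpy as np

k1, k2 = 50.0, 1.0                 # hypothetical forward/backward rate constants

def f(y):
    a, b = y
    return np.array([-k1 * a + k2 * b, k1 * a - k2 * b])

def jac(y):
    # Analytic Jacobian of f, mirroring the refined RAIM's analytic elements
    return np.array([[-k1, k2], [k1, -k2]])

def implicit_euler_step(y, h, tol=1e-12, max_iter=20):
    # Solve F(z) = z - y - h f(z) = 0 by Newton's method
    z = y.copy()
    for _ in range(max_iter):
        F = z - y - h * f(z)
        J = np.eye(len(y)) - h * jac(z)
        dz = np.linalg.solve(J, -F)
        z += dz
        if np.linalg.norm(dz) < tol:
            break
    return z

y, h = np.array([1.0, 0.0]), 0.1   # h far beyond the explicit stability limit
for _ in range(50):
    y = implicit_euler_step(y, h)
print("steady state [A, B]:", y.round(4))   # -> k2/(k1+k2), k1/(k1+k2)
```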

  2. Integrative approaches to the prediction of protein functions based on the feature selection

    Directory of Open Access Journals (Sweden)

    Lee Hyunju

    2009-12-01

    Full Text Available Abstract Background Protein function prediction has been one of the most important issues in functional genomics. With the current availability of various genomic data sets, many researchers have attempted to develop integration models that combine all available genomic data for protein function prediction. These efforts have resulted in the improvement of prediction quality and the extension of prediction coverage. However, it has also been observed that integrating more data sources does not always increase the prediction quality. Therefore, selecting data sources that contribute strongly to the protein function prediction has become an important issue. Results We present systematic feature selection methods that assess the contribution of genome-wide data sets to predicting protein functions and then investigate the relationship between genomic data sources and protein functions. In this study, we use ten different genomic data sources in Mus musculus, including protein domains, protein-protein interactions, gene expression, phenotype ontology, phylogenetic profiles and disease data sources, to predict protein functions that are labelled with Gene Ontology (GO) terms. We then apply two approaches to feature selection: exhaustive search feature selection using a kernel based logistic regression (KLR), and a kernel based L1-norm regularized logistic regression (KL1LR). In the first approach, we exhaustively measure the contribution of each data set for each function based on its prediction quality. In the second approach, we use the estimated coefficients of features as measures of the contribution of data sources. Our results show that the proposed methods improve the prediction quality compared to the full integration of all data sources and other filter-based feature selection methods. We also show that contributing data sources can differ depending on the protein function. Furthermore, we observe that highly contributing data sets can be similar among
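
    The second approach — reading data-source contributions off the coefficients of an L1-regularized logistic regression — can be sketched with scikit-learn. Here each "feature" is a hypothetical per-source score for a gene (standing in for the kernel-derived features), and sparsity in the fitted coefficients marks sources that contribute little.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
sources = ["domains", "ppi", "expression", "phenotype", "phylo", "disease"]
n_genes = 300

# Hypothetical per-source scores; only two sources truly carry signal
X = rng.standard_normal((n_genes, len(sources)))
y = (0.9 * X[:, 1] + 0.8 * X[:, 2] + 0.3 * rng.standard_normal(n_genes)) > 0

# The L1 penalty drives coefficients of uninformative sources toward zero
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.3).fit(X, y)
for name, coef in zip(sources, clf.coef_[0]):
    print(f"{name:10s} contribution: {coef:+.3f}")
```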

  3. Refining Nodes and Edges of State Machines

    DEFF Research Database (Denmark)

    Hallerstede, Stefan; Snook, Colin

    2011-01-01

    State machines are hierarchical automata that are widely used to structure complex behavioural specifications. We develop two notions of refinement of state machines, node refinement and edge refinement. We compare the two notions by means of examples and argue that, by adopting simple conventions ... refinement theory and UML-B state machine refinement influences the style of node refinement. Hence we propose a method with direct proof of state machine refinement, avoiding the detour via Event-B that is needed by UML-B.

  4. Current approach to male infertility treatment: sperm selection procedure based on hyaluronic acid binding ability

    Directory of Open Access Journals (Sweden)

    A. V. Zobova

    2015-01-01

    Full Text Available Intracytoplasmic sperm injection into an oocyte is widely used throughout the world in assisted reproductive technology programs in the presence of a male infertility factor. However, this approach can result in the selection of a single sperm carrying various types of pathologies. Minimizing any potential risks entailing abnormalities in embryo development (apoptosis, embryo fragmentation, alterations in gene expression, aneuploidies) is a very important condition for reducing the potential negative consequences of manipulating gametes. Processes that can be influenced by the embryologist must be performed in as safe and physiological a way as possible. Data from numerous publications reporting the positive effects of sperm selection by hyaluronic acid binding support the high promise of this approach in the treatment of male infertility by in vitro fertilization methods. The selection of sperm with improved characteristics, which determine maturity and genetic integrity, provides an opportunity to improve the parameters of pre-implantation embryogenesis, thus having a positive effect on the clinical outcomes of assisted reproductive technology programs.

  5. Development of a fraction collection approach in capillary electrophoresis SELEX for aptamer selection.

    Science.gov (United States)

    Luo, Zhaofeng; Zhou, Hongmin; Jiang, Hao; Ou, Huichao; Li, Xin; Zhang, Liyun

    2015-04-21

    Aptamers have attracted much attention due to their ability to bind to target molecules with high affinity and specificity. The development of an approach capable of efficiently generating aptamers through systematic evolution of ligands by exponential enrichment (SELEX) is particularly challenging. Herein, a fraction collection approach in capillary electrophoresis SELEX (FCE-SELEX) for the partition of a bound DNA-target complex is developed. By integrating fraction collection with a facile oil seal method for avoiding contamination while amplifying the bound DNA-target complex, in a single round of selection, a streptavidin-binding aptamer (SBA) has been generated. The affinity of aptamer SBA-36 for streptavidin (SA) is determined as 30.8 nM by surface plasmon resonance (SPR). Selectivity and biotin competition experiments demonstrate that the SBA-36 aptamer selected by FCE-SELEX is as efficient as those from other methods. Based on the ability of fraction collection in partition and collection of the aptamer-target complex from the original DNA library, FCE-SELEX can be a universal tool for the development of aptamers.

  6. Evaluation of peptide selection approaches for epitope‐based vaccine design

    DEFF Research Database (Denmark)

    Schubert, B.; Lund, Ole; Nielsen, Morten

    2013-01-01

    A major challenge in epitope-based vaccine (EV) design stems from the vast genomic variation of pathogens and the diversity of the host cellular immune system. Several computational approaches have been published to assist the selection of potential T cell epitopes for EV design. So far, no thorough ... in terms of in silico measurements simulating important vaccine properties like the ability of inducing protection against a multivariant pathogen in a population; the predicted immunogenicity; pathogen, allele, and population coverage; as well as the conservation of selected epitopes. Additionally, we evaluate the use of human leukocyte antigen (HLA) supertypes with regards to their applicability for population-spanning vaccine design. The results showed that in terms of induced protection, methods that simultaneously aim to optimize pathogen and HLA coverage significantly outperform methods focusing

  7. Process for refining shale bitumen

    Energy Technology Data Exchange (ETDEWEB)

    Plauson, H

    1920-09-19

    A process is disclosed for refining shale bitumen for use as heavy mineral oil, characterized by mixtures of blown hard shale pitch and heavy mineral oil being blown with hot air at temperatures of 120 to 150° with 1 to 3 percent sulfur, and if necessary with 0.5 to 3 percent of an aldehyde.

  8. Panorama 2007: Refining and Petrochemicals

    International Nuclear Information System (INIS)

    Silva, C.

    2007-01-01

    The year 2005 saw a new improvement in refining margins that continued during the first three quarters of 2006. The restoration of margins over the last three years has allowed the refining sector to regain its profitability. In this context, the oil companies reported earnings for fiscal year 2005 that were up significantly compared to 2004, and the figures for the first half of 2006 confirm this trend. Despite this favorable business environment, investments saw only a minimal increase in 2005, and the improvement expected for 2006 should remain fairly limited. Looking to 2010-2015, it would appear that the planned investment projects with the highest probability of reaching completion will be barely adequate to cover the increase in demand. The refining sector should thus continue to find itself under pressure. As for petrochemicals, despite a steady up-trend in the naphtha price, the restoration of margins consolidated a comeback that started in 2005. All in all, capital expenditure remained fairly low in both the refining and petrochemical sectors, but many projects are planned for the next ten years. (author)

  9. Multigrid for refined triangle meshes

    Energy Technology Data Exchange (ETDEWEB)

    Shapira, Yair

    1997-02-01

    A two-level preconditioning method for the solution of (locally) refined finite element schemes using triangle meshes is introduced. In the isotropic SPD case, it is shown that the condition number of the preconditioned stiffness matrix is bounded uniformly for all sufficiently regular triangulations. This is also verified numerically for an isotropic diffusion problem with highly discontinuous coefficients.

  10. A synbio approach for selection of highly expressed gene variants in Gram-positive bacteria.

    Science.gov (United States)

    Ferro, Roberto; Rennig, Maja; Hernández-Rollán, Cristina; Daley, Daniel O; Nørholm, Morten H H

    2018-03-08

    The market for recombinant proteins is on the rise, and Gram-positive strains are widely exploited for this purpose. Bacillus subtilis is a profitable host for protein production thanks to its ability to secrete large amounts of proteins, and Lactococcus lactis is an attractive production organism with a long history in food fermentation. We have developed a synbio approach for increasing gene expression in two Gram-positive bacteria. First of all, the gene of interest was coupled to an antibiotic resistance gene to create a growth-based selection system. We then randomised the translation initiation region (TIR) preceding the gene of interest and selected clones that produced high protein titres, as judged by their ability to survive on high concentrations of antibiotic. Using this approach, we were able to significantly increase production of two industrially relevant proteins; sialidase in B. subtilis and tyrosine ammonia lyase in L. lactis. Gram-positive bacteria are widely used to produce industrial enzymes. High titres are necessary to make the production economically feasible. The synbio approach presented here is a simple and inexpensive way to increase protein titres, which can be carried out in any laboratory within a few days. It could also be implemented as a tool for applications beyond TIR libraries, such as screening of synthetic, homologous or domain-shuffled genes.

  11. Research Notes ~ Selecting Research Areas and Research Design Approaches in Distance Education: Process Issues

    Directory of Open Access Journals (Sweden)

    Sudarshan Mishra

    2004-11-01

    Full Text Available The purpose of this paper is to study the process used for selecting research areas and methodological approaches in distance education in India. Experts from the field of distance education in India were interviewed at length, with the aim of collecting qualitative data on opinions on process issues for selecting areas for research, research design, and appropriate methodological approaches in distance education. Data collected from these interviews were subjected to content analysis; triangulation and peer consultation techniques were used for cross-checking and data verification. While the findings and recommendations of this study have limited application, in that they can only be used in the specific context outlined in this paper, respondents in this study nonetheless revealed the pressing need for more process-oriented research in examining media and technology, learners and learning, and distance learning evaluation processes. Our research, which yielded interesting empirical findings, also determined that a mixed approach – one that involves both quantitative and qualitative methods – is more appropriate for conducting research in distance education in India. Qualitative evidence from our research also indicates that the respondents interviewed felt that emphasis should be placed on interdisciplinary and systemic research over traditional disciplinary research. Research methods such as student self-reporting, extensive and highly targeted interviews, and conversation and discourse analysis were determined to be useful for data collection in this study.

  12. Computation-aware algorithm selection approach for interlaced-to-progressive conversion

    Science.gov (United States)

    Park, Sang-Jun; Jeon, Gwanggil; Jeong, Jechang

    2010-05-01

    We discuss deinterlacing results in a computationally constrained and varied environment. The proposed computation-aware algorithm selection approach (CASA) for fast interlaced-to-progressive conversion consists of three methods: the line-averaging (LA) method for plain regions, the modified edge-based line-averaging (MELA) method for medium regions, and the proposed covariance-based adaptive deinterlacing (CAD) method for complex regions. The proposed CASA uses two criteria, mean-squared error (MSE) and CPU time, for assigning the method. The principal idea of CAD is based on the correspondence between the high- and low-resolution covariances. We estimated the local covariance coefficients from an interlaced image using Wiener filtering theory and then used these optimal minimum-MSE interpolation coefficients to obtain a deinterlaced image. The CAD method, though more robust than most known methods, was not found to be very fast compared to the others. To alleviate this issue, we proposed an adaptive selection approach that uses fast deinterlacing algorithms rather than relying on the CAD algorithm alone. The proposed hybrid approach of switching between the conventional schemes (LA and MELA) and our CAD was designed to reduce the overall computational load. A reliable condition for switching the schemes was derived after a wide set of initial training processes. The results of computer simulations showed that the proposed methods outperformed a number of methods presented in the literature.

  13. An Appraisal Model Based on a Synthetic Feature Selection Approach for Students’ Academic Achievement

    Directory of Open Access Journals (Sweden)

    Ching-Hsue Cheng

    2017-11-01

    Full Text Available Obtaining necessary information (and even extracting hidden messages) from existing big data, and then transforming it into knowledge, is an important skill. Data mining technology has received increased attention in various fields in recent years because it can be used to find historical patterns and employ machine learning to aid in decision-making. When we find unexpected rules or patterns in data, they are likely to be of high value. This paper proposes a synthetic feature selection approach (SFSA), which is combined with a support vector machine (SVM) to extract patterns and find the key features that influence students’ academic achievement. To verify the proposed model, two databases, namely “Student Profile” and “Tutorship Record”, were collected from an elementary school in Taiwan and concatenated into an integrated dataset, based on students’ names, as the research dataset. The results indicate the following: (1) the accuracy of the proposed feature selection approach is better than that of the Minimum-Redundancy-Maximum-Relevance (mRMR) approach; (2) the proposed model is better than the listed methods when the six least influential features have been deleted; and (3) the proposed model can enhance the accuracy and facilitate the interpretation of patterns from a hybrid-type dataset of students’ academic achievement.

  14. Implementation of multi-criteria decision making approach for the team leader selection in IT sector

    Directory of Open Access Journals (Sweden)

    Sandhya

    2016-12-01

    Full Text Available In the era of technology, demand for software development is increasing at a very high speed, as software has touched human life in all aspects. Developing better quality software in minimum development time calls for teamwork, in which a group of people works together on the software development. One of the most significant issues in effective and efficient teamwork is team leader selection, because the team leader is the person in any team who handles all types of managerial activities, such as leadership and motivating others. The team leader selection process may depend on numerous conflicting selection indexes, which makes it a Multi-Criteria Decision Making (MCDM) problem. In the present research, an MCDM approach, namely Euclidean Distance Based Approximation (EDBA), which is based on the calculation of a composite distance value for each alternative from a hypothetical optimal point, is presented. The result of this study provides a comprehensive ranking of team leaders that leads to the right selection of a team leader in the information technology (IT) sector.
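
    The core of EDBA is compact enough to sketch directly: normalize the decision matrix, place a hypothetical optimal point from the best value of each criterion, and rank the alternatives by their weighted Euclidean distance from it. The candidate scores and weights below are hypothetical.

```python
import numpy as np

# Rows = candidate team leaders, columns = criteria (all benefit-type here)
X = np.array([
    [7.0, 8.5, 6.0, 9.0],
    [8.0, 7.0, 7.5, 6.5],
    [6.5, 9.0, 8.0, 7.0],
])
w = np.array([0.4, 0.2, 0.2, 0.2])     # criteria weights

# Vector normalization, then the hypothetical optimal point per criterion
N = X / np.linalg.norm(X, axis=0)
ideal = N.max(axis=0)

# Composite weighted Euclidean distance to the optimal point; smaller is better
d = np.sqrt((w * (N - ideal) ** 2).sum(axis=1))
ranking = np.argsort(d)
print("distances:", d.round(4), "-> best candidate:", ranking[0] + 1)
```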

  15. Utilization integrated Fuzzy-QFD and TOPSIS approach in supplier selection

    Directory of Open Access Journals (Sweden)

    2016-02-01

    Full Text Available Supplier selection is a typical multi-attribute problem that involves both qualitative and quantitative factors. To deal with this problem, different techniques have been suggested. Being based on purely mathematical data, these techniques have significant drawbacks, especially when we want to consider qualitative factors, which are very important in supplier selection and are not easy to measure. Some innovative approaches based on artificial intelligence techniques, such as fuzzy logic, match decision-making situations very well, especially when decision makers express heterogeneous judgments. In this research, by combining fuzzy logic and the House of Quality (HOQ), qualitative criteria are considered in the early stages of the car supplier selection process in the Sazehgostar SAIPA Company. Then, the TOPSIS technique is adopted to consider quantitative metrics. Finally, by combining the fuzzy QFD and TOPSIS techniques, suppliers are selected and ranked for this company. Attention to both qualitative and quantitative criteria is the important point of this research, and the methodology used counts as its innovative aspect. The limited number of experts associated with each part, and the unavailability of some quantitative criteria, were limitations in carrying out this study.

  16. Fingerprint-Based Machine Learning Approach to Identify Potent and Selective 5-HT2BR Ligands

    Directory of Open Access Journals (Sweden)

    Krzysztof Rataj

    2018-05-01

    Full Text Available The identification of subtype-selective GPCR (G-protein coupled receptor) ligands is a challenging task. In this study, we developed a computational protocol to find compounds with 5-HT2BR versus 5-HT1BR selectivity. Our approach employs a hierarchical combination of machine learning methods, docking, and multiple scoring methods. First, we applied machine learning tools to filter a large database of druglike compounds by the new Neighbouring Substructures Fingerprint (NSFP). This two-dimensional fingerprint contains information on the connectivity of the substructural features of a compound. Preselected subsets of the database were then subjected to docking calculations. The main indicators of compounds’ selectivity were their different interactions with the secondary binding pockets of both target proteins, while binding modes within the orthosteric binding pocket were preserved. The combined methodology of ligand-based and structure-based methods was validated prospectively, resulting in the identification of hits with nanomolar affinity and ten-fold to ten-thousand-fold selectivities.

  17. A two-phased multi-criteria decision-making approach for selecting the best smartphone

    Directory of Open Access Journals (Sweden)

    Yildiz, Aytac

    2015-11-01

    Full Text Available In the last 20 years, rapid and significant developments have occurred in communication and information technologies. In parallel with these developments, the importance of smartphones has increased. In addition, many smartphone manufacturers have launched, and continue to launch, a number of new models with many features. People who want to buy a new smartphone have difficulty selecting the best smartphone among the numerous models available on the technology market. Therefore, smartphone selection has become a complex multi-criteria decision-making (MCDM) problem. Hence, decision-making processes can be facilitated by using MCDM methods, which will provide the most appropriate decision. In this paper, the best smartphone among the 28 alternatives determined by the prospective buyer is selected by using three main criteria and 17 sub-criteria with the help of a two-phased MCDM approach. In the first phase, the 28 smartphone alternatives are ranked using the analytic network process (ANP). In the second phase, a model that includes the best four alternatives from the ANP is created. Afterwards, the best smartphone is selected using the generalised Choquet integral (GCI) method according to this model. Finally, the findings and results are given.
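
    The second phase rests on the Choquet integral, which aggregates criterion scores with respect to a fuzzy measure and so can model interaction between criteria. A minimal discrete Choquet integral is sketched below; the fuzzy measure and scores are hypothetical, and the paper's generalised variant adds machinery not shown here.

```python
criteria = ["price", "battery", "camera"]

# Hypothetical fuzzy measure mu on subsets of criteria (monotone, mu(all) = 1).
# Note mu({battery, price}) < mu({battery}) + mu({price}): redundant criteria.
mu = {
    (): 0.0,
    ("price",): 0.5, ("battery",): 0.4, ("camera",): 0.3,
    ("battery", "price"): 0.7, ("camera", "price"): 0.8,
    ("battery", "camera"): 0.6,
    ("battery", "camera", "price"): 1.0,
}

def choquet(scores):
    # Sort criteria by ascending score and telescope the measure of upper sets
    order = sorted(criteria, key=lambda c: scores[c])
    total, prev = 0.0, 0.0
    for i, c in enumerate(order):
        upper = tuple(sorted(order[i:]))       # criteria scoring >= scores[c]
        total += (scores[c] - prev) * mu[upper]
        prev = scores[c]
    return total

phones = {
    "A": {"price": 0.9, "battery": 0.6, "camera": 0.5},
    "B": {"price": 0.7, "battery": 0.8, "camera": 0.7},
}
for name, s in phones.items():
    print(name, round(choquet(s), 3))
```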

  18. Transshipment site selection using the AHP and TOPSIS approaches under fuzzy environment

    International Nuclear Information System (INIS)

    Önüt, Semih; Soner, Selin

    2008-01-01

    Site selection is an important issue in waste management. Selection of the appropriate solid waste site requires consideration of multiple alternative solutions and evaluation criteria because of system complexity. Evaluation procedures involve several objectives, and it is often necessary to compromise among possibly conflicting tangible and intangible factors. For these reasons, multiple criteria decision-making (MCDM) has been found to be a useful approach to solve this kind of problem. Different MCDM models have been applied to solve this problem, but most of them are basically mathematical and ignore qualitative and often subjective considerations. It is easier for a decision-maker to describe a value for an alternative by using linguistic terms. In the fuzzy-based method, the rating of each alternative is described using linguistic terms, which can also be expressed as triangular fuzzy numbers. Furthermore, there have not been any studies focused on site selection in waste management using both fuzzy TOPSIS (technique for order preference by similarity to ideal solution) and AHP (analytical hierarchy process) techniques. In this paper, a fuzzy TOPSIS based methodology is applied to solve the solid waste transshipment site selection problem in Istanbul, Turkey. The criteria weights are calculated by using the AHP.
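
    A minimal sketch of the AHP weighting step; the pairwise-comparison judgments below are hypothetical, and the paper pairs weights of this kind with fuzzy TOPSIS for the final ranking:

        import numpy as np

        def ahp_weights(P):
            # P: positive reciprocal pairwise-comparison matrix on Saaty's 1-9 scale.
            P = np.asarray(P, dtype=float)
            vals, vecs = np.linalg.eig(P)
            k = np.argmax(vals.real)               # principal eigenvalue
            w = np.abs(vecs[:, k].real)
            w /= w.sum()                           # normalized priority weights
            n = P.shape[0]
            ci = (vals.real[k] - n) / (n - 1)      # consistency index
            ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]    # Saaty's random index
            return w, ci / ri                      # weights and consistency ratio

        # Hypothetical judgments over three siting criteria.
        w, cr = ahp_weights([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]])
        print(w, cr)                               # CR < 0.1 means acceptably consistent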

  19. Approaches to LLW disposal site selection and current progress of host states

    International Nuclear Information System (INIS)

    Walsh, J.J.; Kerr, T.A.

    1990-11-01

    In accordance with the Low-Level Radioactive Waste Policy Amendments Act of 1985 and under the guidance of 10 CFR 61, States have begun entering into compacts to establish and operate regional disposal facilities for low-level radioactive waste. The progress a state makes in implementing a process to identify a specific location for a disposal site is one indication of the level of that state's commitment to meeting its responsibilities under Federal law and interstate compact agreements. During the past few years, several States have been engaged in site selection processes. The purpose of this report is to summarize the site selection approaches of some of the Host States (California, Michigan, Nebraska, New York, North Carolina, Texas, and Illinois), and their progress to date. An additional purpose of the report is to discern whether the Host States' site selection processes were heavily influenced by any common factors. One factor the states held in common was that political and public processes exerted a powerful influence on site selection at virtually every stage. 1 ref

  20. Guided wave mode selection for inhomogeneous elastic waveguides using frequency domain finite element approach.

    Science.gov (United States)

    Chillara, Vamshi Krishna; Ren, Baiyang; Lissenden, Cliff J

    2016-04-01

    This article describes the use of the frequency domain finite element (FDFE) technique for guided wave mode selection in inhomogeneous waveguides. Problems with Rayleigh-Lamb and shear-horizontal mode excitation in isotropic homogeneous plates are first studied to demonstrate the application of the approach. Then, two specific cases of inhomogeneous waveguides are studied using FDFE. Finally, an example of guided wave mode selection for inspecting disbonds in composites is presented, and the identification of modes that are sensitive or insensitive to the defect is demonstrated. Because the discretization parameters affect the accuracy of the results obtained from FDFE, the effects of the spatial discretization and of the length of the domain used for the spatial fast Fourier transform are studied, and recommendations on the choice of these parameters are provided.
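
    A toy illustration of the spatial FFT step the authors discuss: given a displacement snapshot along the waveguide, the wavenumber spectrum separates the propagating modes. The field below is synthetic, with wavelengths chosen to fall on exact FFT bins:

        import numpy as np

        dx = 0.5e-3                           # 0.5 mm spatial sampling
        x = np.arange(0, 0.4, dx)             # 0.4 m spatial FFT domain
        u = np.sin(2 * np.pi * x / 8e-3) + 0.3 * np.sin(2 * np.pi * x / 2e-3)

        U = np.abs(np.fft.rfft(u))
        k = np.fft.rfftfreq(len(u), d=dx)     # spatial frequency = 1/wavelength
        top2 = k[np.argsort(U)[-2:]]
        print(sorted(1.0 / top2))             # recovered wavelengths: 0.002 m and 0.008 m

    Shortening the FFT domain smears these wavenumber peaks, which is exactly the discretization trade-off the article quantifies.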

  1. Material Selection for Dye Sensitized Solar Cells Using Multiple Attribute Decision Making Approach

    Directory of Open Access Journals (Sweden)

    Sarita Baghel

    2014-01-01

    Full Text Available Dye sensitized solar cells (DSCs) provide a potential alternative to conventional p-n junction photovoltaic devices. The semiconductor thin film plays a crucial role in the working of a DSC. This paper aims at formulating a process for the selection of the optimum semiconductor material for the nanostructured thin film using a multiple attribute decision making (MADM) approach. Various available semiconducting materials and their properties, such as band gap, cost, mobility, rate of electron injection, and static dielectric constant, are considered, and the MADM technique is applied to select the best suited material. It was found that, out of all possible candidates, titanium dioxide (TiO2) is the best semiconductor material for application in DSCs. It was observed that the proposed results are in good agreement with the experimental findings.
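
    A minimal simple-additive-weighting sketch (one common MADM scheme); the attribute values and weights below are illustrative placeholders, not the paper's data:

        import numpy as np

        def saw_rank(matrix, weights, benefit):
            # Min-max normalize each attribute, flip cost attributes, weighted sum.
            M = np.asarray(matrix, dtype=float)
            lo, hi = M.min(axis=0), M.max(axis=0)
            N = np.where(benefit, (M - lo) / (hi - lo), (hi - M) / (hi - lo))
            return N @ weights

        # Candidate semiconductors x [band gap eV, cost, mobility, injection rate].
        M = [[3.2, 1.0, 10.0, 0.9],
             [3.3, 1.5, 200.0, 0.6],
             [3.8, 2.0, 8.0, 0.5]]
        scores = saw_rank(M, weights=np.array([0.3, 0.2, 0.2, 0.3]),
                          benefit=np.array([True, False, True, True]))
        print(scores.argmax())                # index of the top-ranked material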

  2. The teacher's role in selecting a methodological approach to the interpretation of a literary work

    Directory of Open Access Journals (Sweden)

    Stakić Mirjana M.

    2016-01-01

    Full Text Available The paper looks at the teacher's role in selecting a methodological approach to the interpretation of a literary work. The choice of methodological approach depends on: (1) the semiotic structure of the literary text; (2) the specific educational goals of interpretation; (3) the students' age, psychophysical abilities and knowledge; and (4) the planned circumstances of instruction. In selecting a method of interpretation, the teacher should take into consideration not only these factors, but also contemporary literary theory and its methodological apparatus. This can be a challenging task whose fulfillment does not guarantee that the interpretation will be successful, since the validity and functionality of the methodological approach cannot be established in theory but rather through teaching practice. It is up to the teacher to be creative, because a literary work cannot be interpreted by means of a single method but always through a combination of methods, certain of which have their origins in literary theory. There is a widespread belief among teachers that these methods, which have the status of technical/special methods in literary methodology, cannot be used in the first four grades of elementary school. This paper offers an example illustrating that the interpretive model can be used as early as first grade. A teacher's knowledge, as well as their creativity in selecting a method and their openness to creative methodological combinations and skill in applying them, directly affect the effectiveness of interpretation, either succeeding in developing a fondness for books and reading, or, failing that, resulting in a permanent loss of interest in the world of literature.

  3. Heuristic Optimization Approach to Selecting a Transport Connection in City Public Transport

    Directory of Open Access Journals (Sweden)

    Kul’ka Jozef

    2017-02-01

    Full Text Available The article presents a heuristic optimization approach to selecting a suitable transport connection within a city public transport network. The methodology was applied to part of the public transport system in Košice, the second largest city in the Slovak Republic, whose public transport network forms a complex transport system consisting of three different transport modes: bus, tram, and trolley-bus. The solution focuses on examining the individual transport services and their interconnection at the relevant interchange points.
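
    The abstract does not spell out the heuristic, but the core sub-problem is an interchange-aware shortest path. A sketch with a fixed mode-transfer penalty follows; the stops, travel times and penalty are made up:

        import heapq

        def best_connection(graph, source, target):
            # graph: node -> list of (neighbor, minutes, mode); a fixed transfer
            # penalty is charged whenever the transport mode changes.
            TRANSFER_PENALTY = 4
            pq = [(0, source, None)]                 # (cost, node, mode of arrival)
            best = {}
            while pq:
                cost, node, mode = heapq.heappop(pq)
                if node == target:
                    return cost
                if best.get((node, mode), float("inf")) <= cost:
                    continue
                best[(node, mode)] = cost
                for nxt, minutes, nxt_mode in graph.get(node, []):
                    extra = TRANSFER_PENALTY if (mode and nxt_mode != mode) else 0
                    heapq.heappush(pq, (cost + minutes + extra, nxt, nxt_mode))
            return None

        # Hypothetical stops in a tram/bus/trolley-bus network.
        graph = {"A": [("B", 6, "tram"), ("C", 4, "bus")],
                 "B": [("D", 5, "tram")],
                 "C": [("D", 9, "trolley-bus")],
                 "D": []}
        print(best_connection(graph, "A", "D"))      # 11 via tram, no transfer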

  4. A Genetic Algorithm-based Antenna Selection Approach for Large-but-Finite MIMO Networks

    KAUST Repository

    Makki, Behrooz

    2016-12-29

    We study the performance of antenna selection-based multiple-input-multiple-output (MIMO) networks with a large but finite number of transmit antennas and receivers. Considering continuous and bursty communication scenarios with different users’ data request probabilities, we develop an efficient antenna selection scheme using genetic algorithms (GA). As demonstrated, the proposed algorithm is generic in the sense that it can be used in cases with different objective functions, precoding methods, levels of available channel state information and channel models. Our results show that the proposed GA-based algorithm reaches (almost) the same throughput as the exhaustive search-based optimal approach, with substantially less implementation complexity.
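
    A bare-bones sketch of GA-based antenna subset selection; the fitness function here is a made-up per-antenna gain sum, where the paper would instead evaluate throughput under the chosen precoding and channel model:

        import random

        def ga_select(n_antennas, k, fitness, pop=30, gens=60, pmut=0.2):
            # Evolve bit-strings with exactly k ones (the selected antennas).
            def individual():
                bits = [1] * k + [0] * (n_antennas - k)
                random.shuffle(bits)
                return bits

            def crossover(a, b):
                child = [x if random.random() < 0.5 else y for x, y in zip(a, b)]
                ones = [i for i, v in enumerate(child) if v]
                zeros = [i for i, v in enumerate(child) if not v]
                while len(ones) > k:                 # repair: keep exactly k ones
                    child[ones.pop(random.randrange(len(ones)))] = 0
                while len(ones) < k:
                    i = zeros.pop(random.randrange(len(zeros)))
                    child[i] = 1
                    ones.append(i)
                return child

            def mutate(c):
                if random.random() < pmut:           # swap a selected/unselected pair
                    i = random.choice([j for j, v in enumerate(c) if v])
                    j = random.choice([j for j, v in enumerate(c) if not v])
                    c[i], c[j] = 0, 1
                return c

            population = [individual() for _ in range(pop)]
            for _ in range(gens):
                population.sort(key=fitness, reverse=True)
                elite = population[: pop // 2]       # keep the better half
                children = [mutate(crossover(*random.sample(elite, 2)))
                            for _ in range(pop - len(elite))]
                population = elite + children
            return max(population, key=fitness)

        gain = [random.random() for _ in range(16)]  # toy per-antenna gains
        best = ga_select(16, k=4,
                         fitness=lambda c: sum(g for g, v in zip(gain, c) if v))
        print([i for i, v in enumerate(best) if v])  # chosen antenna indices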

  5. A Genetic Algorithm-based Antenna Selection Approach for Large-but-Finite MIMO Networks

    KAUST Repository

    Makki, Behrooz; Ide, Anatole; Svensson, Tommy; Eriksson, Thomas; Alouini, Mohamed-Slim

    2016-01-01

    We study the performance of antenna selection-based multiple-input-multiple-output (MIMO) networks with a large but finite number of transmit antennas and receivers. Considering continuous and bursty communication scenarios with different users’ data request probabilities, we develop an efficient antenna selection scheme using genetic algorithms (GA). As demonstrated, the proposed algorithm is generic in the sense that it can be used in cases with different objective functions, precoding methods, levels of available channel state information and channel models. Our results show that the proposed GA-based algorithm reaches (almost) the same throughput as the exhaustive search-based optimal approach, with substantially less implementation complexity.

  6. Decision-Making Approach to Selecting Optimal Platform of Service Variants

    Directory of Open Access Journals (Sweden)

    Vladimir Modrak

    2016-01-01

    Full Text Available Nowadays, it is anticipated that service sector companies will be inspired to follow the mass customization trends of the industrial sector. However, services are more abstract than products, and therefore concepts for mass customization in the manufacturing domain cannot be transferred without methodical changes. This paper focuses on the development of a methodological framework to support decisions in the selection of an optimal platform of service variants when compatibility problems between service options occur. The approach is based on mutual relations between waste and constrained design space entropy. For this purpose, software for the quantification of constrained and waste design space is developed. The practicability of the methodology is demonstrated on a realistic case.

  7. An integrated approach to selecting materials for fuel cladding in advanced high-temperature reactors

    Energy Technology Data Exchange (ETDEWEB)

    Rangacharyulu, C., E-mail: chary.r@usask.ca [Univ. of Saskatchewan, Saskatoon, SK (Canada); Guzonas, D.A.; Pencer, J.; Nava-Dominguez, A.; Leung, L.K.H. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    An integrated approach has been developed for selection of fuel cladding materials for advanced high-temperature reactors. Reactor physics, thermalhydraulic and material analyses are being integrated in a systematic study comparing various candidate fuel-cladding alloys. The analyses established the axial and radial neutron fluxes, power distributions, axial and radial temperature distributions, rates of defect formation and helium production using AECL analytical toolsets and experimentally measured corrosion rates to optimize the material composition for fuel cladding. The project has just been initiated at University of Saskatchewan. Some preliminary results of the analyses are presented together with the path forward for the project. (author)

  8. Local Refinement of the Super Element Model of Oil Reservoir

    Directory of Open Access Journals (Sweden)

    A.B. Mazo

    2017-12-01

    Full Text Available In this paper, we propose a two-stage method for petroleum reservoir simulation. The method uses two models with different degrees of detail to describe hydrodynamic processes of different space-time scales. At the first stage, the global dynamics of the energy state of the deposit and its reserves is modeled (the characteristic scale of such changes is km/year). The two-phase flow equations in the model of global dynamics operate with smooth averaged pressure and saturation fields, and they are solved numerically on a large computational grid of super-elements with a characteristic cell size of 200-500 m. The tensor coefficients of the super-element model are calculated using special upscaling procedures for absolute and relative phase permeabilities. At the second stage, a local refinement of the super-element model is constructed for calculating small-scale processes (with a scale of m/day), which take place, for example, during various geological and technical measures aimed at increasing the oil recovery of a reservoir. We then solve the two-phase flow problem in the selected area affected by the measure on a detailed three-dimensional grid, which resolves the geological structure of the reservoir, with a time step sufficient for describing fast-flowing processes. The initial and boundary conditions of the local problem are formulated on the basis of the super-element solution. This approach allows us to reduce the computational cost of designing and monitoring the oil reservoir. To demonstrate the proposed approach, we give an example of the two-stage modeling of the development of a layered reservoir with local refinement of the model during the isolation of a water-saturated high-permeability interlayer. We show good agreement between the locally refined solution of the super-element model in the affected area and the results of numerical modeling of the whole history of reservoir development.

  9. A Statistical Approach for Selecting Buildings for Experimental Measurement of HVAC Needs

    Directory of Open Access Journals (Sweden)

    Malinowski Paweł

    2017-03-01

    Full Text Available This article presents a statistical methodology for selecting representative buildings for experimentally evaluating the performance of HVAC systems, especially in terms of energy consumption. The proposed approach is based on the k-means method. The algorithm for this method is conceptually simple, allowing it to be easily implemented. The method can be applied to large quantities of data with unknown distributions. The method was tested using numerical experiments to determine the hourly, daily, and yearly heat values and the domestic hot water demands of residential buildings in Poland. Due to its simplicity, the proposed approach is very promising for use in engineering applications and is applicable to testing the performance of many HVAC systems.
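
    A short sketch of the selection step, assuming scikit-learn's k-means and made-up building descriptors: cluster the building stock, then instrument the real building nearest each centroid:

        import numpy as np
        from sklearn.cluster import KMeans

        # Hypothetical building descriptors: [floor area m2, heated volume m3,
        # yearly heat demand kWh/m2, occupants]; one row per building.
        rng = np.random.default_rng(0)
        X = rng.normal(loc=[1200, 4000, 120, 60],
                       scale=[300, 900, 25, 15], size=(200, 4))

        # Standardize so no single attribute dominates the Euclidean distance.
        Z = (X - X.mean(axis=0)) / X.std(axis=0)
        km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(Z)

        # Per cluster, pick the real building closest to the centroid: these are
        # the representative buildings to equip with HVAC measurements.
        for c in range(km.n_clusters):
            members = np.where(km.labels_ == c)[0]
            d = np.linalg.norm(Z[members] - km.cluster_centers_[c], axis=1)
            print("cluster", c, "-> measure building", members[d.argmin()])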

  10. Comparisons of single-stage and two-stage approaches to genomic selection.

    Science.gov (United States)

    Schulz-Streeck, Torben; Ogutu, Joseph O; Piepho, Hans-Peter

    2013-01-01

    Genomic selection (GS) is a method for predicting breeding values of plants or animals using many molecular markers that is commonly implemented in two stages. In plant breeding the first stage usually involves computation of adjusted means for genotypes which are then used to predict genomic breeding values in the second stage. We compared two classical stage-wise approaches, which either ignore or approximate correlations among the means by a diagonal matrix, and a new method, to a single-stage analysis for GS using ridge regression best linear unbiased prediction (RR-BLUP). The new stage-wise method rotates (orthogonalizes) the adjusted means from the first stage before submitting them to the second stage. This makes the errors approximately independently and identically normally distributed, which is a prerequisite for many procedures that are potentially useful for GS such as machine learning methods (e.g. boosting) and regularized regression methods (e.g. lasso). This is illustrated in this paper using componentwise boosting. The componentwise boosting method minimizes squared error loss using least squares and iteratively and automatically selects markers that are most predictive of genomic breeding values. Results are compared with those of RR-BLUP using fivefold cross-validation. The new stage-wise approach with rotated means was slightly more similar to the single-stage analysis than the classical two-stage approaches based on non-rotated means for two unbalanced datasets. This suggests that rotation is a worthwhile pre-processing step in GS for the two-stage approaches for unbalanced datasets. Moreover, the predictive accuracy of stage-wise RR-BLUP was higher (5.0-6.1%) than that of componentwise boosting.
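
    A minimal RR-BLUP sketch on simulated data; the shrinkage parameter lam plays the role of the error-to-marker variance ratio and is assumed here rather than estimated:

        import numpy as np

        def rr_blup(Z, y, lam):
            # Ridge-regression BLUP of marker effects: solve (Z'Z + lam*I) u = Z'y.
            # Z: genotypes coded -1/0/1 (lines x markers); y: adjusted means.
            m = Z.shape[1]
            return np.linalg.solve(Z.T @ Z + lam * np.eye(m), Z.T @ y)

        rng = np.random.default_rng(1)
        Z = rng.integers(-1, 2, size=(100, 500)).astype(float)
        true_u = np.zeros(500)
        true_u[:20] = rng.normal(size=20)            # 20 causal markers
        y = Z @ true_u + rng.normal(scale=1.0, size=100)

        u_hat = rr_blup(Z - Z.mean(axis=0), y - y.mean(), lam=50.0)
        gebv = Z @ u_hat                             # genomic breeding values
        print(np.corrcoef(gebv, y)[0, 1])            # in-sample predictive check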

  11. Approaches to sampling and case selection in qualitative research: examples in the geography of health.

    Science.gov (United States)

    Curtis, S; Gesler, W; Smith, G; Washburn, S

    2000-04-01

    This paper focuses on the question of sampling (or selection of cases) in qualitative research. Although the literature includes some very useful discussions of qualitative sampling strategies, the question of sampling often seems to receive less attention in methodological discussion than questions of how data is collected or is analysed. Decisions about sampling are likely to be important in many qualitative studies (although it may not be an issue in some research). There are varying accounts of the principles applicable to sampling or case selection. Those who espouse 'theoretical sampling', based on a 'grounded theory' approach, are in some ways opposed to those who promote forms of 'purposive sampling' suitable for research informed by an existing body of social theory. Diversity also results from the many different methods for drawing purposive samples which are applicable to qualitative research. We explore the value of a framework suggested by Miles and Huberman [Miles, M., Huberman,, A., 1994. Qualitative Data Analysis, Sage, London.], to evaluate the sampling strategies employed in three examples of research by the authors. Our examples comprise three studies which respectively involve selection of: 'healing places'; rural places which incorporated national anti-malarial policies; young male interviewees, identified as either chronically ill or disabled. The examples are used to show how in these three studies the (sometimes conflicting) requirements of the different criteria were resolved, as well as the potential and constraints placed on the research by the selection decisions which were made. We also consider how far the criteria Miles and Huberman suggest seem helpful for planning 'sample' selection in qualitative research.

  12. Game Theoretic Approach for Systematic Feature Selection; Application in False Alarm Detection in Intensive Care Units

    Directory of Open Access Journals (Sweden)

    Fatemeh Afghah

    2018-03-01

    Full Text Available Intensive Care Units (ICUs) are equipped with many sophisticated sensors and monitoring devices to provide the highest quality of care for critically ill patients. However, these devices might generate false alarms that reduce the standard of care and result in the desensitization of caregivers to alarms. Therefore, reducing the number of false alarms is of great importance. Many approaches, such as signal processing, machine learning, and the design of more accurate sensors, have been developed for this purpose. However, the significant intrinsic correlation among the features extracted from different sensors has been mostly overlooked. A majority of current data mining techniques fail to capture such correlation among the collected signals, which limits their alarm recognition capabilities. Here, we propose a novel information-theoretic predictive modeling technique based on the idea of coalition game theory to enhance the accuracy of false alarm detection in ICUs by accounting for the synergistic power of signal attributes in the feature selection stage. This approach brings together techniques from information theory and game theory to account for inter-feature mutual information in determining the predictors most correlated with false alarms, by calculating the Banzhaf power of each feature. The numerical results show that the proposed method can enhance classification accuracy and improve the area under the ROC (receiver operating characteristic) curve compared to other feature selection techniques, when integrated in classifiers such as Bayes-Net that consider inter-feature dependencies.
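
    A small sketch of the Banzhaf power computation over feature coalitions; the payoff table is a made-up stand-in for the mutual information between a feature set and the false-alarm label, and exact enumeration only scales to small feature counts:

        from itertools import combinations

        def banzhaf(features, v):
            # v: characteristic function mapping a frozenset of features to a payoff.
            n = len(features)
            power = {}
            for f in features:
                others = [g for g in features if g != f]
                total = 0.0
                for r in range(len(others) + 1):
                    for S in combinations(others, r):
                        S = frozenset(S)
                        total += v(S | {f}) - v(S)   # marginal contribution of f
                power[f] = total / 2 ** (n - 1)      # averaged over all coalitions
            return power

        # Toy payoffs: a and b are individually strong but partly redundant.
        base = {frozenset(): 0.0, frozenset("a"): 0.5, frozenset("b"): 0.5,
                frozenset("c"): 0.1, frozenset("ab"): 0.55, frozenset("ac"): 0.8,
                frozenset("bc"): 0.8, frozenset("abc"): 0.9}
        print(banzhaf(["a", "b", "c"], lambda S: base[S]))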

  13. Fishmeal Supplier Evaluation and Selection for Aquaculture Enterprise Sustainability with a Fuzzy MCDM Approach

    Directory of Open Access Journals (Sweden)

    Tsung-Hsien Wu

    2017-11-01

    Full Text Available In the aquaculture industry, feed that is of poor quality or nutritionally imbalanced can cause problems including low weight, poor growth, poor palatability, and increased mortality, all of which can induce a decrease in aquaculture production. Fishmeal is considered a better source of protein, and its addition as an ingredient in aquafeed makes aquatic animals grow fast and healthy. This makes fishmeal the most important feed ingredient in aquafeed for the aquaculture industry. For the aquaculture industry in Taiwan, about 144,000 tons (USD 203,245,000) of fishmeal were imported, mostly from Peru, in 2016. Therefore, the evaluation and selection of fishmeal suppliers is a very important part of the decision-making process for a Taiwanese aquaculture enterprise. This study constructed a multiple criteria decision-making evaluation model for the selection of fishmeal suppliers using the VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR) approach, based on weights obtained with the entropy method in a fuzzy decision-making environment. This hybrid approach can effectively and conveniently measure the comprehensive performance of the main Peruvian fishmeal suppliers in practical applications. In addition, the results and processes described herein function as a good reference for an aquaculture enterprise making purchasing decisions for fishmeal.
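
    A minimal sketch of the entropy-weighting step (the supplier figures are invented); VIKOR would then use these objective weights to rank the suppliers:

        import numpy as np

        def entropy_weights(M):
            # Shannon-entropy objective weights: criteria whose values differ
            # more across suppliers carry more weight.
            M = np.asarray(M, dtype=float)
            P = M / M.sum(axis=0)                      # column-wise proportions
            n = M.shape[0]
            E = -(P * np.log(P)).sum(axis=0) / np.log(n)
            d = 1.0 - E                                # degree of diversification
            return d / d.sum()

        # Hypothetical fishmeal suppliers x criteria: protein %, price, delivery days.
        M = [[65, 1450, 30],
             [68, 1520, 25],
             [62, 1380, 40]]
        print(entropy_weights(M))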

  14. European refining: evolution or revolution?

    International Nuclear Information System (INIS)

    Cuthbert, N.

    1999-01-01

    A recent detailed analysis of the refining business in Europe (by Purvin & Gertz) was used to highlight some key issues facing the industry. The article is organized in five sub-sections: (i) economic environment (an assessment of the economic prospects for Europe), (ii) energy efficiency and global warming (listing the four points of the EU car makers' voluntary agreement), (iii) fuel quality and refinery investment, (iv) refinery capacity and utilisation and (v) industry structure and development. Diagrams show GDP per capita for East and West, European road fuel demand to 2015, European net trade and European refinery ownership by crude capacity. It was concluded that the future of refining in Europe is 'exciting and challenging' and there are likely to be more large joint venture refineries. (UK)

  15. Prediction of selective estrogen receptor beta agonist using open data and machine learning approach

    Directory of Open Access Journals (Sweden)

    Niu AQ

    2016-07-01

    Full Text Available Ai-qin Niu,1 Liang-jun Xie,2 Hui Wang,1 Bing Zhu,1 Sheng-qi Wang3 1Department of Gynecology, the First People’s Hospital of Shangqiu, Shangqiu, Henan, People’s Republic of China; 2Department of Image Diagnoses, the Third Hospital of Jinan, Jinan, Shandong, People’s Republic of China; 3Department of Mammary Disease, Guangdong Provincial Hospital of Chinese Medicine, the Second Clinical College of Guangzhou University of Chinese Medicine, Guangzhou, People’s Republic of China Background: Estrogen receptors (ERs) are nuclear transcription factors that are involved in the regulation of many complex physiological processes in humans. ERs have been validated as important drug targets for the treatment of various diseases, including breast cancer, ovarian cancer, osteoporosis, and cardiovascular disease. ERs have two subtypes, ER-α and ER-β. Emerging data suggest that the development of subtype-selective ligands that specifically target ER-β could be a more optimal approach to elicit beneficial estrogen-like activities and reduce side effects. Methods: Herein, we focused on ER-β and developed its in silico quantitative structure-activity relationship models using machine learning (ML) methods. Results: The chemical structures and ER-β bioactivity data were extracted from public chemogenomics databases. Four types of popular fingerprint generation methods including MACCS fingerprint, PubChem fingerprint, 2D atom pairs, and Chemistry Development Kit extended fingerprint were used as descriptors. Four ML methods including Naïve Bayesian classifier, k-nearest neighbor, random forest, and support vector machine were used to train the models. The range of classification accuracies was 77.10% to 88.34%, and the range of area under the ROC (receiver operating characteristic) curve values was 0.8151 to 0.9475, evaluated by the 5-fold cross-validation. Comparison analysis suggests that both the random forest and the support vector machine are superior for the classification of selective ER-β agonists.

  16. Uranium refining by solvent extraction

    International Nuclear Information System (INIS)

    Kraikaew, J.

    1996-01-01

    Yellow cake refining was studied on both laboratory and semi-pilot scales. The process mainly consists of dissolution and filtration, solvent extraction, and precipitation and filtration units. The effect of the flow ratio (organic flow rate / aqueous flow rate) on the working efficiency of the solvent extraction process was studied, with detailed studies of the extraction, scrubbing and stripping steps. The purity of the yellow cake product obtained is as high as 90.32% U3O8.

  17. Process for refining naphthalene, etc

    Energy Technology Data Exchange (ETDEWEB)

    Petroff, G

    1922-05-13

    A process is described for the refining of naphthalene, its distillates, and mineral oils by the use of dilute sulfuric acid, characterized in that the oils are oxidized with oxygen of the air and thereafter are treated with 65 to 75 percent sulfuric acid to separate the unsaturated hydrocarbons in the form of polymerized products whereby, if necessary, heating and application of usual or higher pressure can take place.

  18. Preparation of refined oils, etc

    Energy Technology Data Exchange (ETDEWEB)

    1931-02-03

    A process is disclosed for the preparation of refined sulfur-containing oils from sulfur-containing crude oils obtained by distillation of bituminous limestone, characterized by this crude oil being first subjected to a purification by distillation with steam in the known way, then treated with lime and chloride of lime and distilled preferably in the presence of zinc powder, whereby in this purification a rectification can be added for the purpose of recovering definite fractions.

  19. Bauxite Mining and Alumina Refining

    OpenAIRE

    Donoghue, A. Michael; Frisch, Neale; Olney, David

    2014-01-01

    Objective: To describe bauxite mining and alumina refining processes and to outline the relevant physical, chemical, biological, ergonomic, and psychosocial health risks. Methods: Review article. Results: The most important risks relate to noise, ergonomics, trauma, and caustic soda splashes of the skin/eyes. Other risks of note relate to fatigue, heat, and solar ultraviolet and for some operations tropical diseases, venomous/dangerous animals, and remote locations. Exposures to bauxite dust,...

  20. A novel variable selection approach that iteratively optimizes variable space using weighted binary matrix sampling.

    Science.gov (United States)

    Deng, Bai-chuan; Yun, Yong-huan; Liang, Yi-zeng; Yi, Lun-zhao

    2014-10-07

    In this study, a new optimization algorithm called the Variable Iterative Space Shrinkage Approach (VISSA) that is based on the idea of model population analysis (MPA) is proposed for variable selection. Unlike most of the existing optimization methods for variable selection, VISSA statistically evaluates the performance of variable space in each step of optimization. Weighted binary matrix sampling (WBMS) is proposed to generate sub-models that span the variable subspace. Two rules are highlighted during the optimization procedure. First, the variable space shrinks in each step. Second, the new variable space outperforms the previous one. The second rule, which is rarely satisfied in most of the existing methods, is the core of the VISSA strategy. Compared with some promising variable selection methods such as competitive adaptive reweighted sampling (CARS), Monte Carlo uninformative variable elimination (MCUVE) and iteratively retaining informative variables (IRIV), VISSA showed better prediction ability for the calibration of NIR data. In addition, VISSA is user-friendly; only a few insensitive parameters are needed, and the program terminates automatically without any additional conditions. The Matlab codes for implementing VISSA are freely available on the website: https://sourceforge.net/projects/multivariateanalysis/files/VISSA/.
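
    A rough sketch of one weighted-binary-matrix-sampling iteration in the VISSA spirit, assuming scikit-learn for the sub-model scoring; the data are simulated and the update rule simplified:

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import cross_val_score

        def wbms_step(X, y, w, n_models=200, top_frac=0.1, rng=None):
            # Sample sub-models by weighted binary matrix sampling, keep the best
            # fraction, and return updated per-variable inclusion weights.
            rng = rng if rng is not None else np.random.default_rng(0)
            B = rng.random((n_models, X.shape[1])) < w      # binary sampling matrix
            B[~B.any(axis=1), 0] = True                     # avoid empty sub-models
            scores = np.array([
                cross_val_score(LinearRegression(), X[:, row], y, cv=5).mean()
                for row in B])
            best = B[np.argsort(scores)[-int(n_models * top_frac):]]
            return best.mean(axis=0)     # frequency of each variable in best models

        rng = np.random.default_rng(42)
        X = rng.normal(size=(80, 20))
        y = X[:, 3] - 2 * X[:, 7] + 0.1 * rng.normal(size=80)   # 2 informative vars

        w = np.full(20, 0.5)             # every variable starts at p = 0.5
        for _ in range(5):               # the variable space shrinks each step
            w = wbms_step(X, y, w, rng=rng)
        print(np.where(w > 0.8)[0])      # should converge towards variables 3 and 7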

  1. Lactic Acid Bacteria Selection for Biopreservation as a Part of Hurdle Technology Approach Applied on Seafood

    Directory of Open Access Journals (Sweden)

    Norman Wiernasz

    2017-05-01

    Full Text Available As fragile food commodities, the microbial and organoleptic qualities of fishery and seafood products can quickly deteriorate. In this context, improving microbial quality and safety along the whole food processing chain (from catch to plate) using hurdle technology, a combination of mild preserving technologies such as biopreservation, modified atmosphere packaging, and superchilling, is of great interest. As natural flora and producers of antimicrobial metabolites, lactic acid bacteria (LAB) are commonly studied for food biopreservation. Thirty-five LAB known to possess interesting antimicrobial activity were selected for their potential application as bioprotective agents as part of a hurdle technology approach applied to fishery products. The selection approach was based on seven criteria, including antimicrobial activity, spoilage potential, tolerance to chitosan coating and the superchilling process, cross inhibition, biogenic amine production (histamine, tyramine), and antibiotic resistance. Antimicrobial activity was assessed against six common spoilage bacteria of fishery products (Shewanella baltica, Photobacterium phosphoreum, Brochothrix thermosphacta, Lactobacillus sakei, Hafnia alvei, Serratia proteamaculans) and one pathogenic bacterium (Listeria monocytogenes) in co-culture inhibitory assays miniaturized in 96-well microtiter plates. Antimicrobial activity and spoilage evaluation, both performed in cod and salmon juice, highlighted the existence of sensory signatures and inhibition profiles, which seem to be species related. Finally, six LAB with no unusual antibiotic resistance profile and no histamine production ability were selected as bioprotective agents for further in situ inhibitory assays in cod and salmon based products, alone or in combination with other hurdles (chitosan, modified atmosphere packaging, and superchilling).

  2. A Selectivity based approach to Continuous Pattern Detection in Streaming Graphs

    Energy Technology Data Exchange (ETDEWEB)

    Choudhury, Sutanay; Holder, Larry; Chin, George; Agarwal, Khushbu; Feo, John T.

    2015-05-27

    Cyber security is one of the most significant technical challenges in current times. Detecting adversarial activities and preventing the theft of intellectual property and customer data is a high priority for corporations and government agencies around the world. Cyber defenders need to analyze massive-scale, high-resolution network flows to identify, categorize, and mitigate attacks involving networks spanning institutional and national boundaries. Many cyber attacks can be described as subgraph patterns, with prominent examples being insider infiltrations (path queries), denial of service (parallel paths) and malicious spreads (tree queries). This motivates us to explore subgraph matching on streaming graphs in a continuous setting. The novelty of our work lies in using the subgraph distributional statistics collected from the streaming graph to determine the query processing strategy. We introduce a "Lazy Search" algorithm where the search strategy is decided on a vertex-to-vertex basis depending on the likelihood of a match in the vertex neighborhood. We also propose a metric named "Relative Selectivity" that is used to select between different query processing strategies. Our experiments performed on real online news, network traffic streams and a synthetic social network benchmark demonstrate 10-100x speedups over non-incremental, selectivity-agnostic approaches.

  3. Brake fault diagnosis using Clonal Selection Classification Algorithm (CSCA – A statistical learning approach

    Directory of Open Access Journals (Sweden)

    R. Jegadeeshwaran

    2015-03-01

    Full Text Available In an automobile, the brake system is an essential part responsible for control of the vehicle. Any failure in the brake system impacts the vehicle's motion and can have catastrophic effects on vehicle and passenger safety. Thus the brake system plays a vital role in an automobile, and condition monitoring of the brake system is essential. Vibration based condition monitoring using machine learning techniques is gaining momentum. This study is one such attempt to perform condition monitoring of a hydraulic brake system through vibration analysis. In this research, the performance of a Clonal Selection Classification Algorithm (CSCA) for brake fault diagnosis is reported. A hydraulic brake system test rig was fabricated. Under good and faulty conditions of the brake system, vibration signals were acquired using a piezoelectric transducer. Statistical parameters were extracted from the vibration signal, and the best feature set was identified for classification using an attribute evaluator. The selected features were then classified using the CSCA. The classification accuracy of this artificial intelligence technique has been compared with other machine learning approaches and discussed. The Clonal Selection Classification Algorithm performed better and gave the maximum classification accuracy (96%) for the fault diagnosis of a hydraulic brake system.

  4. A DYNAMIC FEATURE SELECTION METHOD FOR DOCUMENT RANKING WITH RELEVANCE FEEDBACK APPROACH

    Directory of Open Access Journals (Sweden)

    K. Latha

    2010-07-01

    Full Text Available Ranking search results is essential for information retrieval and Web search. Search engines need to not only return highly relevant results, but also be fast to satisfy users. As a result, not all available features can be used for ranking, and in fact only a small percentage of these features can be used. Thus, it is crucial to have a feature selection mechanism that can find a subset of features that both meets latency requirements and achieves high relevance. In this paper we describe a 0/1 knapsack procedure for automatically selecting features to use within a generalization model for document ranking. We propose an approach to relevance feedback using the expectation maximization method and evaluate the algorithm on the TREC collection for describing classes of feedback textual information retrieval features. Experimental results, evaluated on the standard TREC-9 part of the OHSUMED collection, show that our feature selection algorithm produces models that are either significantly more effective than, or equally effective as, models such as the Markov random field model, the correlation coefficient and the count difference method.
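
    A compact sketch of the 0/1 knapsack selection step; the feature names, relevance gains and latency costs are hypothetical:

        def knapsack_select(features, budget):
            # features: list of (name, relevance_gain, latency_cost); budget: total
            # latency allowed at ranking time. Classic 0/1 DP over integer costs.
            best = [0.0] * (budget + 1)
            keep = [[] for _ in range(budget + 1)]
            for name, gain, cost in features:
                for b in range(budget, cost - 1, -1):    # descending: each item once
                    if best[b - cost] + gain > best[b]:
                        best[b] = best[b - cost] + gain
                        keep[b] = keep[b - cost] + [name]
            return best[budget], keep[budget]

        features = [("bm25", 4.0, 2), ("pagerank", 2.5, 1),
                    ("proximity", 3.0, 3), ("click_model", 5.0, 4)]
        print(knapsack_select(features, budget=6))       # best gain within budget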

  5. Prediction of selective estrogen receptor beta agonist using open data and machine learning approach.

    Science.gov (United States)

    Niu, Ai-Qin; Xie, Liang-Jun; Wang, Hui; Zhu, Bing; Wang, Sheng-Qi

    2016-01-01

    Estrogen receptors (ERs) are nuclear transcription factors that are involved in the regulation of many complex physiological processes in humans. ERs have been validated as important drug targets for the treatment of various diseases, including breast cancer, ovarian cancer, osteoporosis, and cardiovascular disease. ERs have two subtypes, ER-α and ER-β. Emerging data suggest that the development of subtype-selective ligands that specifically target ER-β could be a more optimal approach to elicit beneficial estrogen-like activities and reduce side effects. Herein, we focused on ER-β and developed its in silico quantitative structure-activity relationship models using machine learning (ML) methods. The chemical structures and ER-β bioactivity data were extracted from public chemogenomics databases. Four types of popular fingerprint generation methods including MACCS fingerprint, PubChem fingerprint, 2D atom pairs, and Chemistry Development Kit extended fingerprint were used as descriptors. Four ML methods including Naïve Bayesian classifier, k-nearest neighbor, random forest, and support vector machine were used to train the models. The range of classification accuracies was 77.10% to 88.34%, and the range of area under the ROC (receiver operating characteristic) curve values was 0.8151 to 0.9475, evaluated by the 5-fold cross-validation. Comparison analysis suggests that both the random forest and the support vector machine are superior for the classification of selective ER-β agonists. Chemistry Development Kit extended fingerprints and MACCS fingerprint performed better in structural representation between active and inactive agonists. These results demonstrate that combining the fingerprint and ML approaches leads to robust ER-β agonist prediction models, which are potentially applicable to the identification of selective ER-β agonists.

  6. The Charfuel coal refining process

    International Nuclear Information System (INIS)

    Meyer, L.G.

    1991-01-01

    The patented Charfuel coal refining process employs fluidized hydrocracking to produce char and liquid products from virtually all types of volatile-containing coals, including low rank coal and lignite. It is not gasification or liquefaction which require the addition of expensive oxygen or hydrogen or the use of extreme heat or pressure. It is not the German pyrolysis process that merely 'cooks' the coal, producing coke and tar-like liquids. Rather, the Charfuel coal refining process involves thermal hydrocracking which results in the rearrangement of hydrogen within the coal molecule to produce a slate of co-products. In the Charfuel process, pulverized coal is rapidly heated in a reducing atmosphere in the presence of internally generated process hydrogen. This hydrogen rearrangement allows refinement of various ranks of coals to produce a pipeline transportable, slurry-type, environmentally clean boiler fuel and a slate of value-added traditional fuel and chemical feedstock co-products. Using coal and oxygen as the only feedstocks, the Charfuel hydrocracking technology economically removes much of the fuel nitrogen, sulfur, and potential air toxics (such as chlorine, mercury, beryllium, etc.) from the coal, resulting in a high heating value, clean burning fuel which can increase power plant efficiency while reducing operating costs. The paper describes the process, its thermal efficiency, its use in power plants, its pipeline transport, co-products, environmental and energy benefits, and economics

  7. A Macdonald refined topological vertex

    Science.gov (United States)

    Foda, Omar; Wu, Jian-Feng

    2017-07-01

    We consider the refined topological vertex of Iqbal et al (2009 J. High Energy Phys. JHEP10(2009)069), as a function of two parameters (x, y), and deform it by introducing the Macdonald parameters (q, t), as in the work of Vuletić on plane partitions (Vuletić M 2009 Trans. Am. Math. Soc. 361 2789-804), to obtain ‘a Macdonald refined topological vertex’. In the limit q → t, we recover the refined topological vertex of Iqbal et al, and in the limit x → y, we obtain a qt-deformation of the original topological vertex of Aganagic et al (2005 Commun. Math. Phys. 254 425-78). Copies of the vertex can be glued to obtain qt-deformed 5D instanton partition functions that have well-defined 4D limits and, for generic values of (q, t), contain infinite towers of poles for every pole present in the limit q → t.

  8. Refining's-clean new jingle

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    This paper reports that at a time when profit margins are slim and gasoline demand is down, the U.S. petroleum-refining industry is facing one of its greatest challenges: how to meet new federal and state laws for reformulated gasoline, oxygenated fuels, low-sulfur diesel and other measures to improve the environment. The American Petroleum Institute (API) estimates that industry will spend between $15 and $23 billion by the end of the decade to meet the U.S. Clean Air Act Amendments (CAAA) of 1990 and other legislation. ENSR Consulting and Engineering's capital-spending figure runs to between $70 and $100 billion this decade, including $24 billion to produce reformulated fuels and $10-12 billion to reduce refinery emissions. M.W. Kellogg Co. estimates that refiners may have to spend up to $30 billion this decade to meet the demand for reformulated gasoline. The estimates are wide-ranging because refiners are still studying their options and delaying final decisions as long as they can, to try to ensure they make the best and least-costly decisions. Oxygenated fuels will be required next winter, but federal regulations for reformulated gasoline won't go into effect until 1995, while California's tougher reformulated-fuels law will kick in the following year.

  9. Southeast Asian oil markets and refining

    Energy Technology Data Exchange (ETDEWEB)

    Yamaguchi, N.D. [FACTS, Inc., Honolulu, Hawaii (United States)

    1999-09-01

    An overview of the Southeast Asian oil markets and refining is presented, concentrating on refiners in Brunei, Malaysia, the Philippines, Singapore and Thailand. Key statistics of the refiners in this region are tabulated. The demand for and quality of Indonesian, Malaysian, Philippine, Singapore and Thai petroleum products are analysed. Crude distillation unit capacity trends in the Southeast Asian refining industry are discussed, along with cracking to distillation ratios, refining in these countries, and the impact of changes in demand and refining on the product trade.

  10. Southeast Asian oil markets and refining

    International Nuclear Information System (INIS)

    Yamaguchi, N.D.

    1999-01-01

    An overview of the Southeast Asian oil markets and refining is presented, concentrating on refiners in Brunei, Malaysia, the Philippines, Singapore and Thailand. Key statistics of the refiners in this region are tabulated. The demand for and quality of Indonesian, Malaysian, Philippine, Singapore and Thai petroleum products are analysed. Crude distillation unit capacity trends in the Southeast Asian refining industry are discussed, along with cracking to distillation ratios, refining in these countries, and the impact of changes in demand and refining on the product trade.

  11. Hybrid Feature Selection Approach Based on GRASP for Cancer Microarray Data

    Directory of Open Access Journals (Sweden)

    Arpita Nagpal

    2017-01-01

    Full Text Available Microarray data usually contain a large number of genes but a small number of samples. Feature subset selection for microarray data aims at reducing the number of genes so that useful information can be extracted from the samples. Reducing the dimension of the data sets further helps in improving the computational efficiency of the learning model. In this paper, we propose a modified algorithm that adds tabu search as the local search procedure of a Greedy Randomized Adaptive Search Procedure (GRASP) for high dimensional microarray data sets. The proposed tabu-based Greedy Randomized Adaptive Search Procedure algorithm is named TGRASP. In TGRASP, a new parameter named tabu tenure has been introduced, and the existing parameters NumIter and size have been modified. We observed that different parameter settings affect the quality of the optimum. The second proposed algorithm, FFGRASP (Firefly Greedy Randomized Adaptive Search Procedure), uses a firefly optimization algorithm in the local search optimization phase of GRASP. The firefly algorithm is one of the powerful algorithms for the optimization of multimodal applications. Experimental results show that the proposed TGRASP and FFGRASP algorithms are much better than the existing algorithm with respect to three performance parameters: accuracy, run time, and the size of the selected feature subset. We have also compared both approaches with a unified metric (Extended Adjusted Ratio of Ratios), which shows that TGRASP outperforms the existing approach on six out of nine cancer microarray datasets and FFGRASP performs better on seven out of nine datasets.

  12. The stock selection problem: Is the stock selection approach more important than the optimization method? Evidence from the Danish stock market

    OpenAIRE

    Grobys, Klaus

    2011-01-01

    Passive investment strategies basically aim to replicate an underlying benchmark. Thereby, the management usually selects a subset of stocks being employed in the optimization procedure. Apart from the optimization procedure, the stock selection approach determines the stock portfolios' out-of-sample performance. The empirical study here takes into account the Danish stock market from 2000-2010 and gives evidence that stock portfolios including small companies' stocks being estimated via coin...

  13. Identifying applicants suitable to a career in nursing: a value-based approach to undergraduate selection.

    Science.gov (United States)

    Traynor, Marian; Galanouli, Despina; Roberts, Martin; Leonard, Lawrence; Gale, Thomas

    2017-06-01

    The aim of this study was to complement existing evidence on the suitability of Multiple Mini Interviews as a potential tool for the selection of nursing candidates on to a BSc (Hons) nursing programme. This study aimed to trial the Multiple Mini Interview approach to recruitment with a group of first year nursing students (already selected using traditional interviews). Cross-sectional validation study. This paper reports on the evaluation of the participants' detailed scores from the Multiple Mini Interview stations, their original interview scores and their end of year results. This study took place in March 2015. Scores from the seven Multiple Mini Interview stations were analysed to show the internal structure, reliability and generalizability of the stations. Original selection scores from interviews and in-course assessment were correlated with the MMI scores, and variation by students' age, gender and disability status was explored. Reliability of the Multiple Mini Interview score was moderate (G = 0.52). The Multiple Mini Interview score provided better differentiation between more able students than did the original interview score, but neither score was correlated with the module results. Multiple Mini Interview scores were positively associated with students' age but not their gender or disability status. The Multiple Mini Interview reported in this study offers a selection process that is based on the values and personal attributes regarded as desirable for a career in nursing and does not necessarily predict academic success. Its moderate reliability indicates the need for further improvement, but it is capable of discriminating between candidates and shows little evidence of bias.

  14. Review of tri-generation technologies: Design evaluation, optimization, decision-making, and selection approach

    International Nuclear Information System (INIS)

    Al Moussawi, Houssein; Fardoun, Farouk; Louahlia-Gualous, Hasna

    2016-01-01

    diagram of the main CHP/CCHP system components is summarized. A general selection approach of the appropriate CCHP system according to specific needs is finally suggested. In almost all reviewed works, CCHP systems are found to have positive technical and performance impacts.

  15. An engineering thermodynamic approach to select the electromagnetic wave effective on cell growth.

    Science.gov (United States)

    Lucia, Umberto; Grisolia, Giulia; Ponzetto, Antonio; Silvagno, Francesca

    2017-09-21

    To date, the choice of the characteristics of the extremely low-frequency (ELF) electromagnetic field beneficial in proliferative disorders is still empirical. In order to make the ELF interaction selective, we applied thermodynamic and biochemical principles to the analysis of the thermo-chemical output generated by the cell in its environment. The theoretical analysis applied a recently developed engineering bio-thermodynamic approach to obtain a physical-mathematical model that calculates the frequency of the field able to maximize the mean entropy changes as a function of cellular parameters. The combined biochemical approach envisioned the changes of entropy as a metabolic shift leading to a reduction of cell growth. The proliferation of six human cancer cell lines was evaluated as the output signal able to confirm the correctness of the mathematical model. By considering the cell as a reactive system able to respond to unbalancing external stimuli, for the first time we could calculate and validate the frequencies of the field specifically effective on distinct cells.

  16. The influence of parent's body mass index on peer selection: an experimental approach using virtual reality.

    Science.gov (United States)

    Martarelli, Corinna S; Borter, Natalie; Bryjova, Jana; Mast, Fred W; Munsch, Simone

    2015-11-30

    Relatively little is known about the influence of psychosocial factors, such as familial role modeling and social network on the development and maintenance of childhood obesity. We investigated peer selection using an immersive virtual reality environment. In a virtual schoolyard, children were confronted with normal weight and overweight avatars either eating or playing. Fifty-seven children aged 7-13 participated. Interpersonal distance to the avatars, child's BMI, self-perception, eating behavior and parental BMI were assessed. Parental BMI was the strongest predictor for the children's minimal distance to the avatars. Specifically, a higher mothers' BMI was associated with greater interpersonal distance and children approached closer to overweight eating avatars. A higher father's BMI was associated with a lower interpersonal distance to the avatars. These children approached normal weight playing and overweight eating avatar peers closest. The importance of parental BMI for the child's social approach/avoidance behavior can be explained through social modeling mechanisms. Differential effects of paternal and maternal BMI might be due to gender specific beauty ideals. Interventions to promote social interaction with peer groups could foster weight stabilization or weight loss in children.

  17. Application of site and haplotype-frequency based approaches for detecting selection signatures in cattle

    Directory of Open Access Journals (Sweden)

    Moore Stephen

    2011-06-01

    Full Text Available Abstract Background 'Selection signatures' delimit regions of the genome that are, or have been, functionally important and have therefore been under either natural or artificial selection. In this study, two different and complementary methods--the integrated Haplotype Homozygosity Score (|iHS|) and the population differentiation index (FST)--were applied to identify traces of decades of intensive artificial selection for traits of economic importance in modern cattle. Results We scanned the genome of a diverse set of dairy and beef breeds from Germany, Canada and Australia genotyped with a 50 K SNP panel. Across breeds, a total of 109 extreme |iHS| values exceeded the empirical threshold level of 5%, with 19, 27, 9, 10 and 17 outliers in Holstein, Brown Swiss, Australian Angus, Hereford and Simmental, respectively. Annotating the regions harboring clustered |iHS| signals revealed a panel of interesting candidate genes, such as SPATA17, MGAT1, PGRMC2 and ACTC1, COL23A1, MATN2, respectively, in the context of reproduction and muscle formation. In a further step, a new Bayesian FST-based approach was applied with a set of geographically separated populations including Holstein, Brown Swiss, Simmental, North American Angus and Piedmontese for detecting differentiated loci. In total, 127 regions exceeding the 2.5 per cent threshold of the empirical posterior distribution were identified as extremely differentiated. In a substantial number of cases (56 out of 127), the extreme FST values were positioned in regions of poor gene content, which deviated significantly from expectation; other extreme FST values were found in regions of relevant genes such as SMCP and FGF1. Conclusions Overall, 236 regions putatively subject to recent positive selection in the cattle genome were detected. Both |iHS| and FST suggested selection in the vicinity of the Sialic acid binding Ig-like lectin 5 gene on BTA18. This region was recently reported to be a major QTL with strong effects on productive life.
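
    For the FST side, a minimal per-SNP sketch using Hudson's estimator (in the Bhatia et al. 2013 formulation, rather than the Bayesian approach of the paper); the allele frequencies and sample sizes below are invented:

        import numpy as np

        def hudson_fst(p1, p2, n1, n2):
            # Two-population FST per SNP from allele frequencies p1, p2 and
            # sample sizes n1, n2, with a correction for sampling noise.
            num = ((p1 - p2) ** 2
                   - p1 * (1 - p1) / (n1 - 1)
                   - p2 * (1 - p2) / (n2 - 1))
            den = p1 * (1 - p2) + p2 * (1 - p1)
            return num / den

        # Hypothetical allele frequencies at 5 SNPs in two cattle breeds.
        p_holstein = np.array([0.10, 0.50, 0.85, 0.30, 0.62])
        p_angus    = np.array([0.12, 0.48, 0.15, 0.35, 0.60])
        print(hudson_fst(p_holstein, p_angus, n1=50, n2=50))
        # The differentiated SNP (0.85 vs 0.15) stands out as an outlier.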

  18. Clustering based gene expression feature selection method: A computational approach to enrich the classifier efficiency of differentially expressed genes

    KAUST Repository

    Abusamra, Heba; Bajic, Vladimir B.

    2016-01-01

    decrease the computational time and cost, but also improve the classification performance. Among the different feature selection approaches, however, most suffer from several problems, such as lack of robustness and validation issues. Here, we

  19. Task Refinement for Autonomous Robots using Complementary Corrective Human Feedback

    Directory of Open Access Journals (Sweden)

    Cetin Mericli

    2011-06-01

    Full Text Available A robot can perform a given task through a policy that maps its sensed state to appropriate actions. We assume that a hand-coded controller can achieve such a mapping only for the basic cases of the task. Refining the controller becomes harder and gets more tedious and error prone as the complexity of the task increases. In this paper, we present a new learning from demonstration approach to improve the robot's performance through the use of corrective human feedback as a complement to an existing hand-coded algorithm. The human teacher observes the robot as it performs the task using the hand-coded algorithm and takes over control to correct the behavior when the robot selects a wrong action to be executed. Corrections are captured as new state-action pairs, and the default controller output is replaced by the demonstrated corrections during autonomous execution when the current state of the robot is found to be similar to a previously corrected state in the correction database. The proposed approach is applied to a complex ball dribbling task performed against stationary defender robots in a robot soccer scenario, where physical Aldebaran Nao humanoid robots are used. The results of our experiments show an improvement in the robot's performance when the default hand-coded controller is augmented with corrective human demonstration.
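
    A minimal sketch of the replay rule described above: fall back to the hand-coded policy unless the current state is within a distance threshold of a previously corrected state. The states, actions and threshold are illustrative:

        import numpy as np

        class CorrectiveController:
            # base_policy is an assumed callable mapping a state vector to an action.
            def __init__(self, base_policy, threshold=0.15):
                self.base_policy = base_policy
                self.threshold = threshold
                self.states, self.actions = [], []   # correction database

            def add_correction(self, state, action):
                self.states.append(np.asarray(state, dtype=float))
                self.actions.append(action)

            def act(self, state):
                state = np.asarray(state, dtype=float)
                if self.states:
                    d = np.linalg.norm(np.vstack(self.states) - state, axis=1)
                    i = d.argmin()
                    if d[i] < self.threshold:        # similar to a corrected state
                        return self.actions[i]       # replay the human correction
                return self.base_policy(state)       # otherwise use the default

        ctrl = CorrectiveController(base_policy=lambda s: "dribble_forward")
        ctrl.add_correction([0.9, 0.1], "turn_left") # teacher fixed a bad choice here
        print(ctrl.act([0.88, 0.12]), ctrl.act([0.2, 0.7]))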

  20. A hybrid gene selection approach for microarray data classification using cellular learning automata and ant colony optimization.

    Science.gov (United States)

    Vafaee Sharbaf, Fatemeh; Mosafer, Sara; Moattar, Mohammad Hossein

    2016-06-01

    This paper proposes an approach for gene selection in microarray data. The proposed approach consists of a primary filter based on the Fisher criterion, which reduces the initial genes and hence the search space and time complexity. Then, a wrapper approach based on cellular learning automata (CLA) optimized with the ant colony method (ACO) is used to find the set of features which improve the classification accuracy. CLA is applied due to its capability to learn and model complicated relationships. The selected features from the last phase are evaluated using the ROC curve, and the most effective yet smallest feature subset is determined. The classifiers evaluated in the proposed framework are K-nearest neighbor, support vector machine and naïve Bayes. The proposed approach is evaluated on 4 microarray datasets. The evaluations confirm that the proposed approach can find the smallest subset of genes while approaching the maximum accuracy.

  1. A Holistic Quality Evaluation, Selection and Improvement Approach driven by Multilevel Goals and Strategies

    Directory of Open Access Journals (Sweden)

    Belen Rivera

    2016-12-01

    Full Text Available Organizations should establish business goals and check for their achievement in a systematic and disciplined way. To know whether a business goal has been achieved, it is necessary to consider information-need goals, which in turn may require satisfying measurement and evaluation goals at the operational level. Furthermore, if measurement and evaluation goals are not aligned with top-level business goals, such as tactical- or strategic-level goals, the organization could waste its effort and resources. Usually, the different goals established in an organization are operationalized through projects. For a given project, strategies should be used to help achieve the goal. A strategy defines a set of activities and methods to be followed for a specific goal purpose. Ultimately, to engineer all these issues in a systematic way, organizations should adopt a holistic evaluation approach supported by a set of integrated strategies. By means of a systematic literature review as the research method, we have observed that very few approaches support integrated strategies and multilevel goals. To bridge this gap, we have developed a holistic, multilevel and multipurpose quality evaluation approach that ties together multilevel goals, projects and integrated strategies. As contributions, this paper discusses an enhanced conceptual base (specified by ontologies) for linking business and information-need goal concepts with project, strategy and nonfunctional requirements concepts. Then, it defines, step by step, our holistic quality evaluation approach, listing the activities necessary to establish goals and projects at different organizational levels. Lastly, it specifies and illustrates evaluation scenarios for business/information-need goal purposes such as understanding, improving, monitoring and controlling, and comparing and selecting entities, which are supported by strategies and strategy patterns.

  2. Latin American oil markets and refining

    International Nuclear Information System (INIS)

    Yamaguchi, N.D.; Obadia, C.

    1999-01-01

    This paper provides an overview of the oil markets and refining in Argentina, Brazil, Chile, Colombia, Ecuador, Mexico, Peru and Venezuela, and examines the production of crude oil in these countries. Details are given of Latin American refiners highlighting trends in crude distillation unit capacity, cracking to distillation ratios, and refining in the different countries. Latin American oil trade is discussed, and charts are presented illustrating crude production, oil consumption, crude refining capacity, cracking to distillation ratios, and oil imports and exports

  3. Potency of high-intensity ultrasonic treatment for grain refinement of magnesium alloys

    International Nuclear Information System (INIS)

    Ramirez, A.; Qian Ma; Davis, B.; Wilks, T.; StJohn, D.H.

    2008-01-01

    High-intensity ultrasonic treatment (UT) for grain refinement of magnesium alloys has been investigated using a novel theoretical approach in order to better understand its grain-refining potential and the mechanism of nucleation. The process demonstrated significantly superior grain-refining potency to carbon inoculation for Al-containing magnesium alloys but inferior potency to zirconium for Al-free alloys. Details revealed by applying the theoretical approach to ultrasonic grain refinement provide new clues to understanding the mechanism of grain nucleation by UT

  4. New Molecules and Old Drugs as Emerging Approaches to Selectively Target Human Glioblastoma Cancer Stem Cells

    Directory of Open Access Journals (Sweden)

    Roberto Würth

    2014-01-01

    Full Text Available Despite relevant progress obtained by multimodal treatment, glioblastoma (GBM), the most aggressive primary brain tumor, is still incurable. The most encouraging advancement of GBM drug research derives from the identification of cancer stem cells (CSCs), since these cells appear to represent the determinants of resistance to current standard therapies. The goal of most ongoing studies is to identify drugs able to affect CSC biology, either inducing selective toxicity or differentiating this tumor cell population into nontumorigenic cells. Moreover, the therapeutic approach for GBM could be improved by interfering with chemo- or radioresistance mechanisms, microenvironment signals, and the neoangiogenic process. During the last years, molecularly targeted compounds such as sorafenib, and old drugs, like metformin, displayed interesting efficacy in preclinical studies towards several tumors, including GBM, preferentially affecting CSC viability. In this review, the latest experimental results, controversies, and prospective applications concerning these promising anticancer drugs will be discussed.

  5. Project Management Consultancy (PMC) procurement approach: Supplier's evaluation and selection dilemma

    Science.gov (United States)

    Nawi, Mohd Nasrun Mohd; Azimi, Mohd Azrulfitri; Pozin, Mohd Affendi Ahmad; Osman, Wan Nadri; Anuar, Herman Shah

    2016-08-01

    Project Management Consultancy (PMC) is part of the management-oriented procurement methods, in which a sole consultant is hired by the client to deal with the contractors on the client's behalf. Appointing contractors under this approach is interesting because the client can play a pivotal role in evaluating and selecting the supplier/contractor for the work package. In some cases the client authorizes the PMC to hire the supplier/contractor of its choice, while in other cases the client makes the decision. This research paper investigates the dilemma arising from this situation; a real case study was examined to assess the impact of such a dilemma on the performance of the project. Recommendations on how to tackle the dilemma are also addressed in the later part of this paper.

  6. Multi-tier sustainable global supplier selection using a fuzzy AHP-VIKOR based approach

    DEFF Research Database (Denmark)

    Awasthi, Anjali; Govindan, Kannan; Gold, Stefan

    2018-01-01

    Politico-economic deregulation, new communication technologies, and cheap transport have pushed companies to increasingly outsource business activities to geographically distant countries. Such outsourcing has often resulted in complex supply chain configurations. Because social and environmental...... and global risk displayed the least weight. This result clearly shows that global risks are still not considered a major criterion for supplier selection. Further, the proposed framework may serve as a starting point for developing managerial decision-making tools to help companies more effectively address...... regulations in those countries are often weak or poorly enforced, stakeholders impose responsibility on focal companies to ensure socially and environmentally sustainable production standards throughout their supply chains. In this paper, we present an integrated fuzzy AHP-VIKOR approach-based framework...

  7. Examining Mechanical Strength Characteristics of Selective Inhibition Sintered HDPE Specimens Using RSM and Desirability Approach

    Science.gov (United States)

    Rajamani, D.; Esakki, Balasubramanian

    2017-09-01

    Selective inhibition sintering (SIS) is a powder-based additive manufacturing (AM) technique to produce functional parts with an inexpensive system compared with other AM processes. Mechanical properties of SIS-fabricated parts depend strongly on various process parameters, importantly layer thickness, heat energy, heater feedrate, and printer feedrate. In this paper, the influence of these process parameters on mechanical properties such as tensile and flexural strength is examined using Response Surface Methodology (RSM). The test specimens are fabricated using high-density polyethylene (HDPE), and mathematical models are developed to correlate the control factors to the respective experimental design response. Further, optimal SIS process parameters are determined using the desirability approach to enhance the mechanical properties of HDPE specimens. Optimization studies reveal that a combination of high heat energy, low layer thickness, and medium heater and printer feedrates yields superior mechanical strength characteristics.
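    A minimal sketch of the Derringer-style desirability calculation behind such an optimization; the bounds and response values below are illustrative, not taken from the paper:

    ```python
    import numpy as np

    def desirability_max(y, low, target, weight=1.0):
        """Derringer-Suich desirability for a larger-the-better response:
        0 below `low`, 1 above `target`, a power ramp in between."""
        return float(np.clip((y - low) / (target - low), 0.0, 1.0)) ** weight

    def composite_desirability(ds):
        """Overall desirability: geometric mean of the individual values."""
        ds = np.asarray(ds, dtype=float)
        return float(np.prod(ds) ** (1.0 / len(ds)))

    # Responses predicted by the fitted RSM models at one parameter setting
    # (numbers are placeholders for illustration only):
    D = composite_desirability([
        desirability_max(24.1, low=15.0, target=30.0),   # tensile strength, MPa
        desirability_max(31.5, low=20.0, target=40.0),   # flexural strength, MPa
    ])
    ```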

  8. Adaptive temporal refinement in injection molding

    Science.gov (United States)

    Karyofylli, Violeta; Schmitz, Mauritius; Hopmann, Christian; Behr, Marek

    2018-05-01

    Mold filling is an injection molding stage of great significance, because many defects of the plastic components (e.g. weld lines, burrs or insufficient filling) can occur during this process step. Therefore, it plays an important role in determining the quality of the produced parts. Our goal is temporal refinement in the vicinity of the evolving melt front, in the context of 4D simplex-type space-time grids [1, 2]. This novel discretization method has an inherent flexibility to employ completely unstructured meshes with varying levels of resolution both in the spatial dimensions and in the time dimension, thus allowing the use of local time-stepping during the simulations. This can lead to higher simulation precision while preserving calculation efficiency. A 3D benchmark case, which concerns the filling of a plate-shaped geometry, is used for verifying our numerical approach [3]. The simulation results obtained with the fully unstructured space-time discretization are compared to those obtained with the standard space-time method and to Moldflow simulation results. This example also provides reliable timing measurements and illustrates the efficiency of filling simulations of complex 3D molds under adaptive temporal refinement.

  9. The Correction of a Secondary Bilateral Cleft Lip Nasal Deformity Using Refined Open Rhinoplasty with Reverse-U Incision, V-Y Plasty, and Selective Combination with Composite Grafting: Long-term Results

    Directory of Open Access Journals (Sweden)

    Byung Chae Cho

    2012-05-01

    Full Text Available Background This article presents long-term outcomes after correcting secondary bilateral cleft lip nasal deformities using a refined reverse-U incision and V-Y plasty, alone or in combination with a composite graft, in order to elongate the short columella. Methods A total of forty-six patients underwent surgery between September 1996 and December 2008. The age of the patients ranged from 3 to 19 years. A bilateral reverse-U incision and V-Y plasty were used in 24 patients. A composite graft from the helical root was combined with a bilateral reverse-U incision in the 22 patients who possessed a severely shortened columella. The follow-up period ranged between 2 and 10 years. Results A total of 32 patients out of 46 were evaluated postoperatively. The average columella length improved significantly, from an average of 3.7 mm preoperatively to 8.5 mm postoperatively. The average ratio of the columella height to the alar base width was 0.18 preoperatively and 0.29 postoperatively. The postoperative basal and lateral views revealed a better shape of the nostrils and columella. The elongated columella, combined with a composite graft, presented good maintenance of the corrected position with no growth disturbance. A composite graft showed color mismatching in several patients. Twenty-six patients demonstrated no alar-columella web deformity and satisfactory symmetry of the nostrils. Four patients experienced a drooping and overhanging of the corrected alar-columella web. Conclusions A bilateral reverse-U incision with V-Y plasty, alone or in combination with a composite graft, was effective in correcting secondary bilateral cleft lip nasal deformity.

  11. Neutron Powder Diffraction and Constrained Refinement

    DEFF Research Database (Denmark)

    Pawley, G. S.; Mackenzie, Gordon A.; Dietrich, O. W.

    1977-01-01

    The first use of a new program, EDINP, is reported. This program allows the constrained refinement of molecules in a crystal structure with neutron diffraction powder data. The structures of p-C6F4Br2 and p-C6F4I2 are determined by packing considerations and then refined with EDINP. Refinement is...

  12. Game Theoretical Approaches for Transport-Aware Channel Selection in Cognitive Radio Networks

    Directory of Open Access Journals (Sweden)

    Chen Shih-Ho

    2010-01-01

    Full Text Available Effectively sharing channels among secondary users (SUs) is one of the greatest challenges in cognitive radio networks (CRNs). In the past, many studies have proposed channel selection schemes at the physical or the MAC layer that allow SUs to respond swiftly to spectrum states. However, they may not enhance performance, owing to the slow response of the transport-layer flow control mechanism. This paper presents a cross-layer design framework called Transport Aware Channel Selection (TACS) to optimize transport throughput based on the state of the TCP flow control mechanism, such as RTT and congestion window size. We formulate the TACS problem as two different game-theoretic approaches, the Selfish Spectrum Sharing Game (SSSG) and the Cooperative Spectrum Sharing Game (CSSG), and present novel distributed heuristic algorithms to optimize TCP throughput. Computer simulations show that SSSG and CSSG could double the SUs' throughput over the current MAC-based scheme when primary users (PUs) use their channels infrequently, with 12% to 100% throughput increases when PUs are more active. The simulation results also illustrate that CSSG performs up to 20% better than SSSG in terms of throughput.

  13. Recent advances in magnesium assessment: From single selective sensors to multisensory approach.

    Science.gov (United States)

    Lvova, Larisa; Gonçalves, Carla Guanais; Di Natale, Corrado; Legin, Andrey; Kirsanov, Dmitry; Paolesse, Roberto

    2018-03-01

    The development of efficient analytical procedures for the selective detection of magnesium is an important analytical task, since this element is one of the most abundant metals in cells and plays an essential role in a multitude of cellular processes. Magnesium imbalance has been related to several pathologies and diseases in plants and animals as well as in humans, but suitable methods for magnesium detection, especially in live samples and biological environments, are scarce. Chemical sensors, owing to their high reliability, simplicity of handling and instrumentation, and capacity for fast, real-time, in situ and on-site analysis, are promising candidates for magnesium analysis and represent an attractive alternative to standard instrumental methods. Here the achievements in the development of chemical sensors for magnesium ion detection over the last decade are reviewed. The working principles and the main types of sensors applied are described. Focus is placed on optical sensors and multisensory system applications for magnesium assessment in different media. Further, a critical outlook on the employment of the multisensory approach, in comparison to the application of single selective sensors in biological samples, is presented. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. A Hybrid MCDM Approach for Strategic Project Portfolio Selection of Agro By-Products

    Directory of Open Access Journals (Sweden)

    Animesh Debnath

    2017-07-01

    Full Text Available Owing to the increasing size of the population, society faces several challenges in strategic project portfolio selection (SPPS) for sustainable and adequate agricultural production, quality, distribution, and food safety. The initial adaptation of strategic portfolio management of genetically modified (GM) agro by-products (Ab-Ps) is a huge challenge in terms of processing agro food product supply-chain practices in an environmentally nonthreatening way. As a solution to these challenges, the socio-economic characteristics of SPPS for GM food purchasing scenarios are studied. Evaluation and selection of the GM agro portfolio are dynamic issues, owing to the physical and immaterial criteria involved, and are addressed with a hybrid multiple-criteria decision-making (MCDM) approach combining modified grey Decision-Making Trial and Evaluation Laboratory (DEMATEL), Multi-Attributive Border Approximation area Comparison (MABAC) and sensitivity analysis. Evaluation criteria are grouped into social, differential and beneficial clusters, and the modified DEMATEL procedure is used to derive the criteria weights. The MABAC method is applied to rank the strategic project portfolios according to the aggregated preferences of decision makers (DMs). The usefulness of the proposed research framework is validated with a case study. The GM by-products portfolio is found to be the best. Moreover, this framework can unify the policies of agro-technological improvement, corporate social responsibility (CSR) and agro export promotion.

  15. Prediction of Protein Structural Class Based on Gapped-Dipeptides and a Recursive Feature Selection Approach

    Directory of Open Access Journals (Sweden)

    Taigang Liu

    2015-12-01

    Full Text Available The prior knowledge of protein structural class may offer useful clues on understanding its functionality as well as its tertiary structure. Though various significant efforts have been made to find a fast and effective computational approach to address this problem, it is still a challenging topic in the field of bioinformatics. The position-specific score matrix (PSSM profile has been shown to provide a useful source of information for improving the prediction performance of protein structural class. However, this information has not been adequately explored. To this end, in this study, we present a feature extraction technique which is based on gapped-dipeptides composition computed directly from PSSM. Then, a careful feature selection technique is performed based on support vector machine-recursive feature elimination (SVM-RFE. These optimal features are selected to construct a final predictor. The results of jackknife tests on four working datasets show that our method obtains satisfactory prediction accuracies by extracting features solely based on PSSM and could serve as a very promising tool to predict protein structural class.
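    A minimal sketch of the SVM-RFE selection step using scikit-learn, assuming the gapped-dipeptide features have already been computed from the PSSM profiles; names and parameter values are illustrative:

    ```python
    from sklearn.svm import SVC
    from sklearn.feature_selection import RFE

    def svm_rfe_select(X, y, n_features=50):
        """SVM-RFE: repeatedly fit a linear SVM and discard the features
        with the smallest weights until n_features remain.
        X: (n_proteins, n_gapped_dipeptide_features); y: structural classes."""
        selector = RFE(estimator=SVC(kernel="linear"),
                       n_features_to_select=n_features,
                       step=0.1)                       # drop 10% of features per round
        selector.fit(X, y)
        return selector.support_                       # boolean mask of kept features
    ```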

  16. Multi-criteria approach for selecting the best solid waste management technologies

    International Nuclear Information System (INIS)

    Latifah, A.M.; Hassan Basri; Noor Ezlin Ahmad Basri

    2010-01-01

    The growth in urbanization and industrial activities has caused solid waste management problems. As a solution, an integrated approach has been chosen to manage solid waste. Developing and implementing integrated solid waste management involves combining technologies and alternatives that suit local laws and conditions. This research showed that the Analytical Hierarchy Process (AHP) has potential as a decision-making tool for selecting solid waste management technologies. A three-level hierarchy was developed, with the goal at the top level, followed by criteria and alternatives. Using this technique, the priority of each candidate technology is determined, and the technology with the highest priority is the most suitable for development. Sensitivity analysis was carried out to test the sensitivity of the final decision to inconsistency of judgement. The application of AHP to determine priorities in selecting solid waste management technology is explained in this research through a case study of the Port Dickson Municipal Council. Analysis of the results showed that a combination of recycling and composting technologies is suitable for the Port Dickson district. (author)
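    A minimal sketch of the AHP priority computation at the heart of such a hierarchy: the weights are the principal eigenvector of a pairwise-comparison matrix, and the consistency index flags inconsistent judgements. The 3x3 matrix below is illustrative, not the study's data:

    ```python
    import numpy as np

    def ahp_priorities(pairwise):
        """Priority weights of an AHP pairwise-comparison matrix: the
        principal eigenvector, normalized to sum to 1, plus the consistency
        index CI = (lambda_max - n) / (n - 1).  The consistency ratio is
        CI divided by Saaty's random index RI for the matrix size."""
        A = np.asarray(pairwise, dtype=float)
        vals, vecs = np.linalg.eig(A)
        k = np.argmax(vals.real)
        w = np.abs(vecs[:, k].real)
        w /= w.sum()
        n = A.shape[0]
        ci = (vals.real[k] - n) / (n - 1)
        return w, ci

    # Illustrative comparison of three candidate technologies:
    w, ci = ahp_priorities([[1,   3,   5],
                            [1/3, 1,   2],
                            [1/5, 1/2, 1]])
    ```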

  17. Structured methodology review identified seven (RETREAT) criteria for selecting qualitative evidence synthesis approaches.

    Science.gov (United States)

    Booth, Andrew; Noyes, Jane; Flemming, Kate; Gerhardus, Ansgar; Wahlster, Philip; van der Wilt, Gert Jan; Mozygemba, Kati; Refolo, Pietro; Sacchini, Dario; Tummers, Marcia; Rehfuess, Eva

    2018-07-01

    To compare and contrast different methods of qualitative evidence synthesis (QES) against criteria identified from the literature and to map their attributes to inform selection of the most appropriate QES method to answer research questions addressed by qualitative research. Electronic databases, citation searching, and a study register were used to identify studies reporting QES methods. Attributes compiled from 26 methodological papers (2001-2014) were used as a framework for data extraction. Data were extracted into summary tables by one reviewer and then considered within the author team. We identified seven considerations determining choice of methods from the methodological literature, encapsulated within the mnemonic Review question-Epistemology-Time/Timescale-Resources-Expertise-Audience and purpose-Type of data. We mapped 15 different published QES methods against these seven criteria. The final framework focuses on stand-alone QES methods but may also hold potential when integrating quantitative and qualitative data. These findings offer a contemporary perspective as a conceptual basis for future empirical investigation of the advantages and disadvantages of different methods of QES. It is hoped that this will inform appropriate selection of QES approaches. Copyright © 2018 Elsevier Inc. All rights reserved.

  18. Sensory evaluation of selected formulated milk barberry drinks using the fuzzy approach.

    Science.gov (United States)

    Tahsiri, Zahra; Niakousari, Mehrdad; Khoshnoudi-Nia, Sara; Hosseini, Seyed Mohamad H

    2017-05-01

    Amid stiff competition to meet customers' needs, the cost of consumer disappointment is high. In an effort to escape market disappointment, one option to consider is probing for customer satisfaction through sensory evaluation. This study aims to rank six selected milk-barberry drink formulae out of 24 (code numbers S3, S4, S15, S16, S17 and S18), having milk:barberry:pectin ratios of 7:3:0.2, 6:4:0.2, 7:3:0.4, 6:4:0.4, 5:5:0.4 and 6:4:0.4, respectively, and to determine the most important quality attribute through sensory evaluation, using the fuzzy decision-making model. The selection was based on pH, total solid content, degree of serum separation and rheological properties of the drinks. The results showed that S4 had the highest acceptability, rated under the "very good" category, whereas the lowest acceptability was reported for S3, which was classified under the "satisfactory" category. In summary, the ranking of the milk-barberry drinks was S4 > S17 > S16 > S15 > S18 > S3. Furthermore, quality attributes were ranked as taste > mouth feel > aroma > color. The results suggest that the fuzzy approach can be appropriately used to evaluate this type of sensory data.

  19. Bayesian ensemble refinement by replica simulations and reweighting

    Science.gov (United States)

    Hummer, Gerhard; Köfinger, Jürgen

    2015-12-01

    We describe different Bayesian ensemble refinement methods, examine their interrelation, and discuss their practical application. With ensemble refinement, the properties of dynamic and partially disordered (bio)molecular structures can be characterized by integrating a wide range of experimental data, including measurements of ensemble-averaged observables. We start from a Bayesian formulation in which the posterior is a functional that ranks different configuration space distributions. By maximizing this posterior, we derive an optimal Bayesian ensemble distribution. For discrete configurations, this optimal distribution is identical to that obtained by the maximum entropy "ensemble refinement of SAXS" (EROS) formulation. Bayesian replica ensemble refinement enhances the sampling of relevant configurations by imposing restraints on averages of observables in coupled replica molecular dynamics simulations. We show that the strength of the restraints should scale linearly with the number of replicas to ensure convergence to the optimal Bayesian result in the limit of infinitely many replicas. In the "Bayesian inference of ensembles" method, we combine the replica and EROS approaches to accelerate the convergence. An adaptive algorithm can be used to sample directly from the optimal ensemble, without replicas. We discuss the incorporation of single-molecule measurements and dynamic observables such as relaxation parameters. The theoretical analysis of different Bayesian ensemble refinement approaches provides a basis for practical applications and a starting point for further investigations.
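    A minimal sketch of ensemble refinement by direct reweighting, in the spirit of the maximum-entropy/Bayesian formulations discussed in this record: the fit to ensemble-averaged data is balanced against the relative entropy to the reference weights via a confidence parameter theta. The log-weight parametrization and the optimizer are illustrative choices, not the authors' implementation:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def reweight_ensemble(obs_calc, obs_exp, sigma, w0, theta=1.0):
        """Minimize 0.5 * chi^2 + theta * S_rel, where chi^2 measures the
        misfit of ensemble-averaged observables to experiment and S_rel is
        the relative entropy of the refined weights w to the reference w0
        (strictly positive, e.g. uniform).
        obs_calc has shape (n_configurations, n_observables)."""
        def objective(g):
            w = np.exp(g)
            w /= w.sum()                              # positive, normalized weights
            avg = w @ obs_calc                        # ensemble averages
            chi2 = np.sum(((avg - obs_exp) / sigma) ** 2)
            s_rel = np.sum(w * np.log(w / w0))
            return 0.5 * chi2 + theta * s_rel

        res = minimize(objective, np.log(w0), method="L-BFGS-B")
        w = np.exp(res.x)
        return w / w.sum()
    ```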

  20. Adaptive mesh refinement for shocks and material interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Dai, William Wenlong [Los Alamos National Laboratory

    2010-01-01

    There are three kinds of adaptive mesh refinement (AMR) in structured meshes. Block-based AMR sometimes over-refines meshes. Cell-based AMR treats the mesh cell by cell and thus loses the advantage of the nature of structured meshes. Patch-based AMR is intended to combine the advantages of block- and cell-based AMR, i.e., the nature of structured meshes and sharp regions of refinement. But patch-based AMR has its own difficulties; for example, it typically cannot preserve the symmetries of physics problems. In this paper, we present an approach for patch-based AMR for hydrodynamics simulations. The approach consists of clustering, symmetry preserving, mesh continuity, flux correction, communications, management of patches, and load balance. The special features of this patch-based AMR include symmetry preserving, efficiency of refinement across shock fronts and material interfaces, special implementation of flux correction, and patch management in parallel computing environments. To demonstrate the capability of the AMR framework, we show both two- and three-dimensional hydrodynamics simulations with many levels of refinement.

  1. Selection of Sustainable Technology for VOC Abatement in an Industry: An Integrated AHP-QFD Approach

    Science.gov (United States)

    Gupta, Alok Kumar; Modi, Bharat A.

    2018-04-01

    Volatile organic compounds (VOCs) are universally present among global atmospheric pollutants. These VOCs drive photochemical reactions in the atmosphere, leading to serious harmful effects on human health and the environment. VOCs are produced from both natural and man-made sources and may have good commercial value if they can be utilized as alternative fuel. As per data from the US EPA, 15% of total VOC emissions are generated by the surface coating industry, but VOC concentration and exhaust air volume vary to a great extent and depend on the processes used by the industry. Various technologies are available for the abatement of VOCs: physical, chemical and biological technologies can remove VOCs by either recovery or destruction, each with its own advantages and limitations. With growing environmental awareness, and considering the resource limitations of medium- and small-scale industries, a tool is needed for selecting an appropriate, techno-economically viable solution for removing VOCs from industrial process exhaust. The aim of the present study is to provide management with a tool to determine the overall effect of implementing a VOC abatement technology on business performance and VOC emissions. The primary purpose of this work is to outline a methodology to rate various VOC abatement technologies with respect to the constraints of meeting current and foreseeable future regulatory requirements, operational flexibility, and overall economic parameters, considering conservation of energy. In this paper an integrated approach is proposed to strategically select the most appropriate abatement technology: the analytic hierarchy process and quality function deployment have been integrated for techno-commercial evaluation. A case study on the selection of a VOC abatement technology for a leading aluminium foil surface coating, lamination and printing facility using this methodology is presented.

  2. Big Web data, small focus: An ethnosemiotic approach to culturally themed selective Web archiving

    Directory of Open Access Journals (Sweden)

    Saskia Huc-Hepher

    2015-07-01

    Full Text Available This paper proposes a multimodal ethnosemiotic conceptual framework for culturally themed selective Web archiving, taking as a practical example the curation of the London French Special Collection (LFSC) in the UK Web Archive. Its focus on a particular ‘community’ is presented as advantageous in overcoming the sheer scale of data available on the Web; yet, it is argued that these ethnographic boundaries may be flawed if they do not map onto the collective self-perception of the London French. The approach establishes several theoretical meeting points between Pierre Bourdieu’s ethnography and Gunther Kress’s multimodal social semiotics, notably the foregrounding of practice and the meaning-making potentialities of the everyday; the implications of language and categorisation; the interplay between (curating/researching) subject and (curated/researched) object; evolving notions of agency, authorship and audience; together with social engagement, and the archive as dynamic process and product. The curation rationale proposed stems from Bourdieu’s three-stage field analysis model, which places a strong emphasis on habitus, considered to be most accurately (re)presented through blogs, yet necessitates its contextualisation within the broader (diasporic) field(s), through institutional websites, for example, whilst advocating a reflexive awareness of the researcher/curator’s (subjective) role. This, alongside the Kressian acknowledgement of the inherent multimodality of on-line resources, lends itself convincingly to selection and valuation strategies, whilst the discussion of language, genre, authorship and audience is relevant to the potential cataloguing of Web objects. By conceptualising the culturally themed selective Web-archiving process within the ethnosemiotic framework constructed, concrete recommendations emerge regarding curation, classification and crowd-sourcing.

  3. An Objective Approach to Select Climate Scenarios when Projecting Species Distribution under Climate Change.

    Directory of Open Access Journals (Sweden)

    Nicolas Casajus

    Full Text Available An impressive number of new climate change scenarios have recently become available to assess the ecological impacts of climate change. Among these impacts, shifts in species range analyzed with species distribution models are the most widely studied. Whereas it is widely recognized that the uncertainty in future climatic conditions must be taken into account in impact studies, many assessments of species range shifts still rely on just a few climate change scenarios, often selected arbitrarily. We describe a method to select objectively a subset of climate change scenarios among a large ensemble of available ones. Our k-means clustering approach reduces the number of climate change scenarios needed to project species distributions, while retaining the coverage of uncertainty in future climate conditions. We first show, for three biologically-relevant climatic variables, that a reduced number of six climate change scenarios generates average climatic conditions very close to those obtained from a set of 27 scenarios available before reduction. A case study on potential gains and losses of habitat by three northeastern American tree species shows that potential future species distributions projected from the selected six climate change scenarios are very similar to those obtained from the full set of 27, although with some spatial discrepancies at the edges of species distributions. In contrast, projections based on just a few climate models vary strongly according to the initial choice of climate models. We give clear guidance on how to reduce the number of climate change scenarios while retaining the central tendencies and coverage of uncertainty in future climatic conditions. This should be particularly useful during future climate change impact studies as more than twice as many climate models were reported in the fifth assessment report of IPCC compared to the previous one.

  4. An Objective Approach to Select Climate Scenarios when Projecting Species Distribution under Climate Change.

    Science.gov (United States)

    Casajus, Nicolas; Périé, Catherine; Logan, Travis; Lambert, Marie-Claude; de Blois, Sylvie; Berteaux, Dominique

    2016-01-01

    An impressive number of new climate change scenarios have recently become available to assess the ecological impacts of climate change. Among these impacts, shifts in species range analyzed with species distribution models are the most widely studied. Whereas it is widely recognized that the uncertainty in future climatic conditions must be taken into account in impact studies, many assessments of species range shifts still rely on just a few climate change scenarios, often selected arbitrarily. We describe a method to select objectively a subset of climate change scenarios among a large ensemble of available ones. Our k-means clustering approach reduces the number of climate change scenarios needed to project species distributions, while retaining the coverage of uncertainty in future climate conditions. We first show, for three biologically-relevant climatic variables, that a reduced number of six climate change scenarios generates average climatic conditions very close to those obtained from a set of 27 scenarios available before reduction. A case study on potential gains and losses of habitat by three northeastern American tree species shows that potential future species distributions projected from the selected six climate change scenarios are very similar to those obtained from the full set of 27, although with some spatial discrepancies at the edges of species distributions. In contrast, projections based on just a few climate models vary strongly according to the initial choice of climate models. We give clear guidance on how to reduce the number of climate change scenarios while retaining the central tendencies and coverage of uncertainty in future climatic conditions. This should be particularly useful during future climate change impact studies as more than twice as many climate models were reported in the fifth assessment report of IPCC compared to the previous one.
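    A minimal sketch of the clustering step described in this record: cluster the scenario ensemble on its climatic variables and keep, per cluster, the scenario nearest the centroid. The choice of k=6 mirrors the reduced set reported above; the other details are illustrative:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def select_scenarios(features, k=6, seed=0):
        """features: (n_scenarios, n_climate_variables) array.  Cluster the
        scenario ensemble and keep, for each cluster, the scenario closest
        to the centroid, preserving the spread of future conditions."""
        X = np.asarray(features, dtype=float)
        km = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(X)
        keep = []
        for c in range(k):
            members = np.flatnonzero(km.labels_ == c)
            d = np.linalg.norm(X[members] - km.cluster_centers_[c], axis=1)
            keep.append(int(members[np.argmin(d)]))
        return sorted(keep)
    ```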

  5. A heuristic approach using multiple criteria for environmentally benign 3PLs selection

    Science.gov (United States)

    Kongar, Elif

    2005-11-01

    Maintaining competitiveness in an environment where price and quality differences between competing products are disappearing depends on the company's ability to reduce costs and supply time. Timely responses to rapidly changing market conditions require efficient Supply Chain Management (SCM). Outsourcing logistics to third-party logistics service providers (3PLs) is one commonly used way of increasing the efficiency of logistics operations while creating a more "core competency focused" business environment. However, this alone may not be sufficient. Due to recent environmental regulations and growing public awareness regarding environmental issues, 3PLs need to be not only efficient but also environmentally benign to maintain companies' competitiveness. Even though an efficient and environmentally benign combination of 3PLs can theoretically be obtained using exhaustive search algorithms, heuristic approaches to the selection process may be superior in terms of computational complexity. In this paper, a hybrid approach for selecting efficient and environmentally benign 3PLs is proposed, combining a multiple-criteria Genetic Algorithm (GA) with the Linear Physical Programming Weighting (LPPW) algorithm. A numerical example is also provided to illustrate the method and the analyses.
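    A toy sketch of the GA side of such a hybrid: binary chromosomes select a fixed number of 3PLs, and fitness is a weighted sum of normalized criterion scores standing in for the LPPW-derived priorities. All names, operators and parameters here are illustrative, not the paper's design:

    ```python
    import numpy as np

    def ga_select_3pls(scores, weights, n_select, pop=60, gens=200, seed=0):
        """Toy multiple-criteria GA: a chromosome is a binary vector picking
        exactly n_select providers; fitness is a weighted sum of the selected
        providers' normalized criterion scores (higher is better).
        scores: (n_providers, n_criteria); weights: (n_criteria,)."""
        rng = np.random.default_rng(seed)
        n = scores.shape[0]
        fitness = lambda ind: float(ind @ scores @ weights)
        popu = np.array([rng.permutation([1] * n_select + [0] * (n - n_select))
                         for _ in range(pop)])
        for _ in range(gens):
            f = np.array([fitness(ind) for ind in popu])
            parents = popu[np.argsort(f)[-pop // 2:]]     # truncation selection
            children = parents.copy()
            for c in children:                            # swap mutation keeps the
                on, off = np.flatnonzero(c == 1), np.flatnonzero(c == 0)  # count fixed
                if rng.random() < 0.5 and len(on) and len(off):
                    c[rng.choice(on)], c[rng.choice(off)] = 0, 1
            popu = np.vstack([parents, children])
        best = popu[np.argmax([fitness(ind) for ind in popu])]
        return np.flatnonzero(best == 1)                  # indices of chosen 3PLs
    ```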

  6. An Integrated Approach to Mitigation Wetland Site Selection: A Case Study in Gwacheon, Korea

    Directory of Open Access Journals (Sweden)

    Junga Lee

    2015-03-01

    Full Text Available This paper presents an integrated approach to mitigation wetland site selection using functional landscape connectivity and landscape structure. This approach enables landscape designers to evaluate the relative priorities of mitigation wetland areas based on functional landscape connectivity and wildlife mobility, as well as landscape structure, composition, and configuration. The least-cost path method is used to evaluate candidate sites for mitigation wetlands with regard to wildlife movement. A set of assessments for landscape indices using FRAGSTATS was applied to identify suitable mitigation wetland areas on the basis of landscape connectivity, composition, and configuration. The study was conducted in Gwacheon, Korea, where there are plans for regional development that will change the landscape. In the first step, a group of 14 candidate sites is identified via analysis of functional landscape connectivity using the least-cost path method. In the second step, candidate mitigation wetland areas are ranked according to landscape connectivity and composition. The five mitigation wetland areas that were found to be suitable were analyzed based on landscape configuration at the class level. This study demonstrates that functional landscape connectivity and landscape structure are important aspects to consider when identifying suitable sites for mitigation wetland planning and restoration.

  7. A multi-user selective undo/redo approach for collaborative CAD systems

    Directory of Open Access Journals (Sweden)

    Yuan Cheng

    2014-04-01

    Full Text Available The engineering design process is a creative process, and designers must repeatedly apply Undo/Redo operations to modify CAD models and explore new solutions. Undo/Redo has become one of the most important functions in interactive graphics and CAD systems. Undo/Redo in a collaborative CAD system is also very helpful for collaborative awareness among a group of cooperative designers, to eliminate misunderstanding and to recover from design errors. However, Undo/Redo in a collaborative CAD system is much more complicated, because a single erroneous operation is propagated to other remote sites and operations are interleaved at different sites. This paper presents a multi-user selective Undo/Redo approach for fully distributed collaborative CAD systems. We use site IDs and State Vectors to locate the Undo/Redo target at each site. By analyzing the composition of the complex CAD model, a tree-like structure called the Feature Combination Hierarchy is presented to describe the decomposition of a CAD model. Based on this structure, the dependency relationship among features is clarified. B-Rep re-evaluation is simplified with the assistance of the Feature Combination Hierarchy. It can be proven that the proposed Undo/Redo approach satisfies the intention preservation and consistency maintenance correctness criteria for collaborative systems.

  8. A simple approach to quantitative analysis using three-dimensional spectra based on selected Zernike moments.

    Science.gov (United States)

    Zhai, Hong Lin; Zhai, Yue Yuan; Li, Pei Zhen; Tian, Yue Li

    2013-01-21

    A very simple approach to quantitative analysis is proposed based on digital image processing of three-dimensional (3D) spectra obtained by high-performance liquid chromatography coupled with a diode array detector (HPLC-DAD). As region-based shape features of a grayscale image, Zernike moments, with their inherent invariance property, were employed to establish the linear quantitative models. This approach was applied to the quantitative analysis of three compounds in mixed samples using 3D HPLC-DAD spectra, and three linear models were obtained, one for each compound. The correlation coefficients (R2) for the training and test sets were more than 0.999, and the statistical parameters and strict validation supported the reliability of the established models. The analytical results suggest that the Zernike moments selected by stepwise regression can be used in the quantitative analysis of target compounds. Our study provides a new idea for quantitative analysis using 3D spectra, which can be extended to the analysis of other 3D spectra obtained by different methods or instruments.
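    A minimal sketch of the pipeline: treat each 3D spectrum as a grayscale image, extract Zernike moments, and fit a linear model. The mahotas library is assumed here simply as one available implementation of Zernike moments, and the paper's stepwise moment selection is replaced by a plain regression for brevity:

    ```python
    import numpy as np
    import mahotas
    from sklearn.linear_model import LinearRegression

    def zernike_features(spectrum, radius, degree=8):
        """Treat a 3D HPLC-DAD spectrum (retention time x wavelength
        intensity matrix) as a grayscale image and extract its Zernike
        moments, which are invariant to rotation of the image content."""
        return mahotas.features.zernike_moments(np.asarray(spectrum), radius,
                                                degree=degree)

    def fit_quantitative_model(spectra, concentrations, radius=64):
        """Fit one linear calibration model: Zernike moments -> concentration."""
        X = np.array([zernike_features(s, radius) for s in spectra])
        return LinearRegression().fit(X, concentrations)
    ```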

  9. An integrated approach of topology optimized design and selective laser melting process for titanium implants materials.

    Science.gov (United States)

    Xiao, Dongming; Yang, Yongqiang; Su, Xubin; Wang, Di; Sun, Jianfeng

    2013-01-01

    Load-bearing bone implant materials should have sufficient stiffness and large porosity, which interact, since larger porosity causes lower mechanical properties. This paper seeks the maximum-stiffness architecture under the constraint of a specific volume fraction using a topology optimization approach; that is, maximum porosity can be achieved with predefined stiffness properties. The effective elastic moduli of conventional cubic and topology-optimized scaffolds were calculated using the finite element analysis (FEA) method; in addition, specimens with porosities of 41.1%, 50.3%, 60.2% and 70.7% were fabricated by the Selective Laser Melting (SLM) process and tested in compression. Results showed that the computational effective elastic modulus of the optimized scaffolds was approximately 13% higher than that of the cubic scaffolds, while the experimental stiffness values were about 76% lower than the computational ones. The combination of the topology optimization approach and the SLM process would be suitable for the development of titanium implant materials that balance porosity and mechanical stiffness.

  10. Disaggregated seismic hazard and the elastic input energy spectrum: An approach to design earthquake selection

    Science.gov (United States)

    Chapman, Martin Colby

    1998-12-01

    The design earthquake selection problem is fundamentally probabilistic. Disaggregation of a probabilistic model of the seismic hazard offers a rational and objective approach that can identify the most likely earthquake scenario(s) contributing to hazard. An ensemble of time series can be selected on the basis of the modal earthquakes derived from the disaggregation. This gives a useful time-domain realization of the seismic hazard, to the extent that a single motion parameter captures the important time-domain characteristics. A possible limitation to this approach arises because most currently available motion prediction models for peak ground motion or oscillator response are essentially independent of duration, and modal events derived using the peak motions may not represent the optimal characterization of the hazard. The elastic input energy spectrum is an alternative to the elastic response spectrum for these types of analyses. The input energy combines the elements of amplitude and duration into a single-parameter description of the ground motion that can be readily incorporated into standard probabilistic seismic hazard analysis methodology. This use of the elastic input energy spectrum is examined. Regression analysis is performed using strong motion data from Western North America and consistent data processing procedures for both the absolute input energy equivalent velocity (V_ea) and the elastic pseudo-relative velocity response (PSV) in the frequency range 0.5 to 10 Hz. The results show that the two parameters can be successfully fit with identical functional forms. The dependence of V_ea and PSV upon (NEHRP) site classification is virtually identical. The variance of V_ea is uniformly less than that of PSV, indicating that V_ea can be predicted with slightly less uncertainty as a function of magnitude, distance and site classification. The effects of site class are important at frequencies less than a few Hertz. The regression

  11. Gene selection for the reconstruction of stem cell differentiation trees: a linear programming approach.

    Science.gov (United States)

    Ghadie, Mohamed A; Japkowicz, Nathalie; Perkins, Theodore J

    2015-08-15

    Stem cell differentiation is largely guided by master transcriptional regulators, but it also depends on the expression of other types of genes, such as cell cycle genes, signaling genes, metabolic genes, trafficking genes, etc. Traditional approaches to understanding gene expression patterns across multiple conditions, such as principal components analysis or K-means clustering, can group cell types based on gene expression, but they do so without knowledge of the differentiation hierarchy. Hierarchical clustering can organize cell types into a tree, but in general this tree is different from the differentiation hierarchy itself. Given the differentiation hierarchy and gene expression data at each node, we construct a weighted Euclidean distance metric such that the minimum spanning tree with respect to that metric is precisely the given differentiation hierarchy. We provide a set of linear constraints that are provably sufficient for the desired construction and a linear programming approach to identify sparse sets of weights, effectively identifying genes that are most relevant for discriminating different parts of the tree. We apply our method to microarray gene expression data describing 38 cell types in the hematopoiesis hierarchy, constructing a weighted Euclidean metric that uses just 175 genes. However, we find that there are many alternative sets of weights that satisfy the linear constraints. Thus, in the style of random-forest training, we also construct metrics based on random subsets of the genes and compare them to the metric of 175 genes. We then report on the selected genes and their biological functions. Our approach offers a new way to identify genes that may have important roles in stem cell differentiation. tperkins@ohri.ca Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
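    A compact sketch of the linear-programming construction described in this record, using the cycle property of minimum spanning trees: every non-tree pair of cell types must be at least as far apart, in weighted squared Euclidean distance (which is linear in the weights), as each tree edge on the path joining it, and an L1 objective promotes sparse gene weights. The exact constraint set and margin handling in the paper may differ; this is an assumption-laden illustration:

    ```python
    import numpy as np
    import networkx as nx
    from scipy.optimize import linprog

    def tree_metric_weights(X, tree_edges, margin=1e-3):
        """Find sparse nonnegative gene weights w so that the given
        differentiation tree is the minimum spanning tree under
        d(u,v) = sum_g w_g * (x_u - x_v)_g^2.
        X: (n_cell_types, n_genes) expression matrix;
        tree_edges: list of (u, v) index pairs forming the hierarchy."""
        X = np.asarray(X, dtype=float)
        n, n_genes = X.shape
        T = nx.Graph(tree_edges)
        sq = lambda u, v: (X[u] - X[v]) ** 2        # per-gene squared differences
        A_ub, b_ub = [], []
        for u in range(n):
            for v in range(u + 1, n):
                if T.has_edge(u, v):
                    continue
                path = nx.shortest_path(T, u, v)
                for a, b in zip(path, path[1:]):
                    # d(u,v) >= d(a,b) + margin  <=>  w @ (sq(a,b) - sq(u,v)) <= -margin
                    A_ub.append(sq(a, b) - sq(u, v))
                    b_ub.append(-margin)
        res = linprog(np.ones(n_genes), A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      bounds=[(0, None)] * n_genes, method="highs")
        return res.x                                 # None if the LP is infeasible
    ```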

  12. Individual capacity-building approaches in a global pharmaceutical systems strengthening program: a selected review.

    Science.gov (United States)

    Konduri, Niranjan; Rauscher, Megan; Wang, Shiou-Chu Judy; Malpica-Llanos, Tanya

    2017-01-01

    Medicines-use-related challenges such as inadequate adherence, high levels of antimicrobial resistance and preventable adverse drug reactions have underscored the need to incorporate pharmaceutical services to help achieve desired treatment outcomes and protect patients from inappropriate use of medicines. The situation is further constrained by insufficient numbers of pharmaceutical personnel and an inappropriate skill mix. Studies have addressed individual capacity-building approaches for logistics, supply chain or disease-specific interventions, but few have documented those involving pharmacy assistants/professionals or health workers/professionals charged with improving access to and provision of pharmaceutical services. We examined how different training modalities have been employed and adapted to meet country-specific contexts and needs by a global pharmaceutical systems strengthening program in collaboration with a country's Ministry of Health and local stakeholders. The methods comprised a structured content analysis of training approaches from twelve selected countries and a survey among conveniently selected trainees in Bangladesh and Ethiopia. Case-based learning, practice and feedback, and repetitive interventions such as post-training action plans, supportive supervision and mentoring are effective, evidence-based training techniques. In Ethiopia and Bangladesh, over 94% of respondents indicated that they had improved or developed skills or competencies as a result of the program's training activities. Supportive supervision structures and mentorship have been institutionalized with appropriate management structures. National authorities have been sensitized to secure funding from domestic resources or from Global Fund grants for post-training follow-up initiatives. The Pharmaceutical Leadership Development Program is an effective, case-based training modality that motivates staff to develop quality-improvement interventions and solve specific challenges

  13. Niobium-base grain refiner for aluminium

    International Nuclear Information System (INIS)

    Silva Pontes, P. da; Robert, M.H.; Cupini, N.L.

    1980-01-01

    A new chemical grain refiner for aluminium has been developed, based on inoculation with a niobium-base compound. When a bath of molten aluminium is inoculated with this refiner, an intermetallic aluminium-niobium compound is formed which acts as a powerful nucleant, producing extremely fine structures comparable to those obtained by means of the traditional grain refiner based on titanium and boron. It was found that the refinement of the structure depends upon the weight percentage of the new refiner inoculated, as well as the time the bath is held after inoculation and before pouring, but mainly on the inoculating temperature. (Author) [pt

  14. Highly selective BSA imprinted polyacrylamide hydrogels facilitated by a metal-coding MIP approach.

    Science.gov (United States)

    El-Sharif, H F; Yapati, H; Kalluru, S; Reddy, S M

    2015-12-01

    We report the fabrication of metal-coded molecularly imprinted polymers (MIPs) using hydrogel-based protein imprinting techniques. A Co(II) complex was prepared using (E)-2-((2 hydrazide-(4-vinylbenzyl)hydrazono)methyl)phenol; along with iron(III) chloroprotoporphyrin (Hemin), vinylferrocene (VFc), zinc(II) protoporphyrin (ZnPP) and protoporphyrin (PP), these complexes were introduced into the MIPs as co-monomers for metal-coding of non-metalloprotein imprints. Results indicate a 66% enhancement for bovine serum albumin (BSA) protein binding capacities (Q, mg/g) via metal-ion/ligand exchange properties within the metal-coded MIPs. Specifically, Co(II)-complex-based MIPs exhibited 92 ± 1% specific binding with Q values of 5.7 ± 0.45 mg BSA/g polymer and imprinting factors (IF) of 14.8 ± 1.9 (MIP/non-imprinted (NIP) control). The selectivity of our Co(II)-coded BSA MIPs were also tested using bovine haemoglobin (BHb), lysozyme (Lyz), and trypsin (Tryp). By evaluating imprinting factors (K), each of the latter proteins was found to have lower affinities in comparison to cognate BSA template. The hydrogels were further characterised by thermal analysis and differential scanning calorimetry (DSC) to assess optimum polymer composition. The development of hydrogel-based molecularly imprinted polymer (HydroMIPs) technology for the memory imprinting of proteins and for protein biosensor development presents many possibilities, including uses in bio-sample clean-up or selective extraction, replacement of biological antibodies in immunoassays and biosensors for medicine and the environment. Biosensors for proteins and viruses are currently expensive to develop because they require the use of expensive antibodies. Because of their biomimicry capabilities (and their potential to act as synthetic antibodies), HydroMIPs potentially offer a route to the development of new low-cost biosensors. Herein, a metal ion-mediated imprinting approach was employed to metal-code our

  15. Grain refinement through severe plastic deformation (SPD) processing

    International Nuclear Information System (INIS)

    Izairi, N.; Vevecka - Priftaj, A.

    2012-01-01

    There is considerable current interest in processing metallic samples through procedures involving the imposition of severe plastic deformation (SPD). These procedures lead to very significant grain refinement, to the submicrometer or even the nanometer level, resulting in advanced physical properties. Among the various SPD processes, Equal-Channel Angular Pressing, High-Pressure Torsion and Accumulative Roll Bonding have been widely used for many metals and alloys. In the present work, we give an overview of the most widely used SPD methods for grain refinement and the production of bulk nanostructured materials with enhanced mechanical and functional properties. In order to examine the potential for using ECAP to refine the grain size and improve the mechanical properties, two commercial alloys, Al 5754 and AA 3004, were selected for study. Processing by ECAP gives a reduction in the grain size and an increase in the microhardness. (Author)

  16. Application of methodological approach to selection of sportswomen to calisthenics teams for group exercises, considering compatibility factor

    Directory of Open Access Journals (Sweden)

    O.S. Kozhanova

    2015-04-01

    Full Text Available Purpose: to substantiate a methodological approach to the selection of sportswomen for calisthenics teams in group exercises, considering the compatibility factor. Material: 40 highly qualified sportswomen aged 17-23 years, with 11-16 years of sport experience, participated in the research. Using cluster analysis, 10 gymnasts whose morphological indicators meet the modern standards of group exercises were selected. Results: we found 5 generalized factors that characterize the structure of selection to teams and account for 72% of the dispersion. The influence of the kinds of compatibility, and the criteria connected with them, on the efficiency of the gymnasts' competitive functioning was also determined. The authors substantiated a methodological approach to the selection of sportswomen for calisthenics teams in group exercises that takes the compatibility factor into account. Conclusions: in selection for calisthenics teams in group exercises, it is purposeful to take complex account of the kinds of compatibility, considering the gymnasts' similarity on the recommended indicators.

  17. Retrograde approach for the recanalization of coronary chronic total occlusion: collateral selection and collateral related complication.

    Science.gov (United States)

    Ma, Jian-Ying; Qian, Ju-Ying; Ge, Lei; Fan, Bing; Wang, Qi-Bing; Yan, Yan; Zhang, Feng; Yao, Kang; Huang, Dong; Ge, Jun-Bo

    2013-03-01

    The retrograde approach through collaterals has been applied in the treatment of chronic total occlusion (CTO) lesions during percutaneous recanalization of coronary arteries. This study investigated the success rate of recanalization and collateral-related complications in patients treated using the retrograde approach. Eighty-four cases treated via the retrograde approach between July 2005 and July 2012 were included in this study. Patient characteristics, procedural outcomes and in-hospital clinical events were evaluated. Mean patient age was 59.6 ± 11.2 years, and 91.7% were men. The target CTO lesions were distributed among the left anterior descending artery in 45 cases (53.5%), left circumflex artery in one case (1.2%), right coronary artery in 34 cases (40.5%), and left main in four cases (4.8%). The overall success rate of recanalization was 79.8%. The septal collateral was three times more frequently used for retrograde access than the epicardial collateral, 68/84 (81%) vs. 16/84 (19%). Successful wire passage through the collateral channel was achieved in 58 (72.6%) patients. The success rate of recanalization was 93.1% (54/58) in patients with, and 50% (13/26) in patients without, successful retrograde wire passage of the collateral channel. Successful wire crossing was achieved in 49 of 68 septal collaterals (72.1%) and in 9 of 16 epicardial collaterals (56.3%) (P = NS). There was no significant difference between the septal and epicardial collateral groups in the success rate of recanalization after retrograde wire crossing of the collaterals (91.8% vs. 100%, P > 0.05). The CART or reverse CART technique was used in 15 patients, and 14 patients (93.3%) were recanalized successfully. Collateral-related perforation occurred in three (18.8%) cases with the epicardial collateral as the first choice, compared with none in the septal collateral group. The retrograde approach is an effective technique to recanalize CTO lesions; the septal

  18. A refined atomic scale model of the Saccharomyces cerevisiae K+-translocation protein Trk1p combined with experimental evidence confirms the role of selectivity filter glycines and other key residues

    Czech Academy of Sciences Publication Activity Database

    Zayats, Vasilina; Stockner, T.; Pandey, Saurabh Kumar; Woerz, K.; Ettrich, Rüdiger; Ludwig, J.

    2015-01-01

    Roč. 1848, č. 5 (2015), s. 1183-1195 ISSN 0005-2736 R&D Projects: GA ČR(CZ) GA13-21053S Institutional support: RVO:67179843 Keywords : molecular-dynamics simulations * potassium-transport * vibrio-alginolyticus * high-affinity * ion-channel * system * ktrab * prediction * symporters * currents * K+-translocation * Eukaryotic Trk * Saccharomyces cerevisiae * Homology modeling * Molecular dynamics * Selectivity filter Subject RIV: CE - Biochemistry Impact factor: 3.687, year: 2015

  19. Self-oriented nanoparticles for site-selective immunoglobulin G recognition via epitope imprinting approach.

    Science.gov (United States)

    Çorman, Mehmet Emin; Armutcu, Canan; Uzun, Lokman; Say, Rıdvan; Denizli, Adil

    2014-11-01

    Molecular imprinting is a polymerization technique that provides synthetic analogs for template molecules. Molecularly imprinted polymers (MIPs) have gained much attention due to their unique properties, such as selectivity and specificity for target molecules. In this study, we focused on the development of polymeric materials with molecular recognition ability; molecular imprinting was therefore combined with miniemulsion polymerization to synthesize self-orienting nanoparticles through an epitope imprinting approach. Thus, L-lysine imprinted nanoparticles (LMIP) were synthesized via the miniemulsion polymerization technique. Immunoglobulin G (IgG) was then bound to the cavities specifically formed for L-lysine molecules, which are typically found at the C-terminus of the Fc region of antibody molecules. The resulting nanoparticles make it possible to minimize nonspecific interactions between monomer and template molecules. In addition, the orientation of the entire IgG molecule was controlled, and random imprinting of the IgG was prevented. The optimum conditions for IgG recognition using the imprinted nanoparticles were determined. The selectivity of the nanoparticles against IgG molecules was also evaluated using albumin and hemoglobin as competitor molecules. To demonstrate the self-orientation capability of the imprinted nanoparticles, human serum albumin (HSA) adsorption was carried out both on the plain nanoparticles and on nanoparticles on which anti-human serum albumin antibody (anti-HSA antibody) had been immobilized. Owing to the anti-HSA antibody immobilization on the imprinted nanoparticles, the adsorption capacity of the nanoparticles for HSA molecules was markedly enhanced, showing that oriented immobilization of the antibodies was successfully achieved. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Cost Analysis of Selected Patient Categories within a Dermatology Department Using an ABC Approach

    Science.gov (United States)

    Papadaki, Šárka; Popesko, Boris

    2016-01-01

    Background: Present trends in hospital management are facilitating the utilization of more accurate costing methods, which potentially results in superior cost-related information and improved managerial decision-making. However, the Activity-Based Costing method (ABC), which was designed for cost allocation purposes in the 1980s, is not widely used by healthcare organizations. This study analyzes costs related to selected categories of patients, those suffering from psoriasis, varicose ulcers, eczema and other conditions, within a dermatology department at a Czech regional hospital. Methods: The study was conducted in a hospital department where both inpatient and outpatient care are offered. Firstly, the diseases treated at the department were identified. Further costs were determined for each activity using ABC. The study utilized data from managerial and financial accounting, as well as data obtained through interviews with departmental staff. Using a defined cost-allocation procedure makes it possible to determine the cost of an individual patient with a given disease more accurately than via traditional costing procedures. Results: The cost analysis focused on the differences between the costs related to individual patients within the selected diagnoses, variations between inpatient and outpatient treatments and the costs of activities performed by the dermatology department. Furthermore, comparing the costs identified through this approach and the revenue stemming from the health insurance system is an option. Conclusions: Activity-Based Costing is more accurate and relevant than the traditional costing method. The outputs of ABC provide an abundance of additional information for managers. The benefits of this research lie in its practically-tested outputs, resulting from calculating the costs of hospitalization, which could prove invaluable to persons involved in hospital management and decision-making. The study also defines the managerial implications of
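
As a rough illustration of the ABC logic summarized above, costs are pooled per activity, converted to a rate per activity driver, and charged to a patient episode according to the activities it consumes; all activities, amounts and volumes in this sketch are invented:

```python
# Hypothetical activity cost pools (CZK per year) and driver volumes.
activity_cost = {"admission": 120_000, "dressing_change": 300_000,
                 "phototherapy": 180_000, "discharge": 60_000}
driver_volume = {"admission": 400, "dressing_change": 2_500,
                 "phototherapy": 900, "discharge": 400}

# Step 1: cost per unit of activity driver (e.g. per dressing change).
rate = {a: activity_cost[a] / driver_volume[a] for a in activity_cost}

# Step 2: charge one hypothetical psoriasis inpatient episode for the
# activities it actually consumed.
consumed = {"admission": 1, "phototherapy": 12, "discharge": 1}
episode_cost = sum(rate[a] * n for a, n in consumed.items())
print(f"episode cost: {episode_cost:.0f} CZK")
```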

  1. Cost Analysis of Selected Patient Categories Within A Dermatology Department Using an ABC Approach.

    Science.gov (United States)

    Papadaki, Šárka; Popesko, Boris

    2015-11-17

    Present trends in hospital management are facilitating the utilization of more accurate costing methods, which potentially results in superior cost-related information and improved managerial decision-making. However, the Activity-Based Costing method (ABC), which was designed for cost allocation purposes in the 1980s, is not widely used by healthcare organizations. This study analyzes costs related to selected categories of patients, those suffering from psoriasis, varicose ulcers, eczema and other conditions, within a dermatology department at a Czech regional hospital. The study was conducted in a hospital department where both inpatient and outpatient care are offered. Firstly, the diseases treated at the department were identified. Further costs were determined for each activity using ABC. The study utilized data from managerial and financial accounting, as well as data obtained through interviews with departmental staff. Using a defined cost-allocation procedure makes it possible to determine the cost of an individual patient with a given disease more accurately than via traditional costing procedures. The cost analysis focused on the differences between the costs related to individual patients within the selected diagnoses, variations between inpatient and outpatient treatments and the costs of activities performed by the dermatology department. Furthermore, comparing the costs identified through this approach and the revenue stemming from the health insurance system is an option. Activity-Based Costing is more accurate and relevant than the traditional costing method. The outputs of ABC provide an abundance of additional information for managers. The benefits of this research lie in its practically-tested outputs, resulting from calculating the costs of hospitalization, which could prove invaluable to persons involved in hospital management and decision-making. The study also defines the managerial implications of the performed cost analysis for the

  2. ANALYTICAL VIEW OF THE PERCEPTION OF SELECTED INNOVATIVE APPROACHES IN MARKETING COMMUNICATIONS

    Directory of Open Access Journals (Sweden)

    Peter Dorčák

    2015-07-01

    Full Text Available Purpose: The purpose of this paper is to present, by means of a thorough analysis of the selected market, the perception of innovative approaches in marketing communication, both from the perspective of the companies on the supply side and from the perspective of their potential customers on the demand side. As regards the companies, it concerns in particular their perception of the relative benefits of using electronic marketing instruments, given the degree of effort and resources spent on the establishment and maintenance of an e-marketing infrastructure. As regards the customers, it concerns, in turn, their perception of particular aspects of the companies' use of virtual social networks for branding purposes or directly for promotion. Methodology/Approach: The objects of the research were real companies active on the analysed market. The subjects of the research were their actual and potential customers, represented by the users of one of the local Internet portals long active on the analysed market. Both groups were presented with questions by means of electronic questionnaires. The data were subsequently processed and interpreted through cluster analyses. Findings: Based on the confirmed aspects, we came to the conclusion that the perceived benefit of the Internet for business is closely related to the companies' on-line activity and to the number of instruments a company uses to promote its business. Last but not least, we investigated whether the economic status of the users (real or potential customers) has an impact on their perception of the companies' use of social networks for promotional purposes. The similarity of behaviour of particular groups allows for more accurate targeting of on-line activity with respect to selected target markets. Research Limitation/implication: The biggest limitation is the territorial nature of the research, which was targeted primarily at the Central European market. Originality/Value of paper: Presented

  3. Selective leaching of Zn from spent alkaline batteries using environmentally friendly approaches.

    Science.gov (United States)

    Maryam Sadeghi, S; Vanpeteghem, Guillaumme; Neto, Isabel F F; Soares, Helena M V M

    2017-02-01

    The main aim of this work was to evaluate the possibility of using microwave or ultrasound to assist the efficient and selective leaching of Zn from spent alkaline batteries and to compare the results with those obtained using the conventional method. Two different strategies were applied: acid leaching of a washed residue and alkaline leaching of the original residue. In both (acid and alkaline) approaches, the use of microwave- or ultrasound-assisted leaching increased the extraction of Zn compared with the best results obtained using conventional leaching [acid leaching (1.5 mol/L H2SO4, 3 h, 80 °C), 90% of Zn extracted; alkaline leaching (6 mol/L NaOH, 3 h, 80 °C), 42% of Zn extracted]. With acid leaching, 94% of the Zn was extracted using microwave-assisted leaching (1 cycle, 30 s, 1 mol/L H2SO4), and 92% of the Zn was extracted using ultrasound-assisted leaching (2 min, 0.1 p, 20% amplitude, 1 mol/L H2SO4). Ultrasound-assisted leaching resulted in a more selective Zn extraction (Zn/Mn ratio of 5.1) than microwave-assisted leaching (Zn/Mn ratio of 3.5); both processes generated a concentrated Zn solution (≥18.7 g/L) with a purity (83.3% and 77.7%, respectively) suitable for electrowinning. With alkaline leaching, microwave- (1 cycle, 3 min, 4 mol/L NaOH) and ultrasound-assisted (14 min, 0.1 p, 20% amplitude, 4 mol/L NaOH) leaching extracted about 80% of the Zn and less than 0.01% of the Mn, which resulted in less concentrated Zn solutions (approximately 16.5 g/L) but with a high purity (>99.5%) suitable for the recovery of Zn by precipitation. The microwave- and ultrasound-assisted leaching strategies used in this work proved to be efficient and environmentally friendly approaches for the extraction of Zn from spent alkaline residues, since a concentrated Zn solution with adequate purity for subsequent Zn recovery was obtained using significantly decreased leaching times and concentrations of chemicals. Copyright © 2017 Elsevier Ltd. All rights

  4. Efficacious and safe tissue-selective controlled gene therapy approaches for the cornea.

    Directory of Open Access Journals (Sweden)

    Rajiv R Mohan

    2011-04-01

    Full Text Available Untargeted and uncontrolled gene delivery is a major cause of gene therapy failure. This study aimed to define efficient and safe tissue-selective targeted gene therapy approaches for delivering genes into keratocytes of the cornea in vivo using a normal or diseased rabbit model. New Zealand White rabbits, adeno-associated virus serotype 5 (AAV5), and a minimally invasive hair-dryer based vector-delivery technique were used. Fifty microliters of AAV5 titer (6.5×10^12 vg/ml) expressing the green fluorescent protein gene (GFP) was topically applied onto normal or diseased (fibrotic or neovascularized) rabbit corneas for 2 minutes with a custom vector-delivery technique. Corneal fibrosis and neovascularization in rabbit eyes were induced with photorefractive keratectomy using an excimer laser and with VEGF (630 ng) using a micropocket assay, respectively. Slit-lamp biomicroscopy and immunocytochemistry were used to confirm fibrosis and neovascularization in rabbit corneas. The levels, location and duration of delivered-GFP gene expression in the rabbit stroma were measured with immunocytochemistry and/or western blotting. Slot-blot measured delivered-GFP gene copy number. Confocal microscopy performed on whole-mounts of cornea and thick corneal sections determined the geometric and spatial localization of delivered-GFP in three-dimensional arrangement. AAV5 toxicity and safety were evaluated with clinical eye exam, stereomicroscopy, slit-lamp biomicroscopy, and H&E staining. A single 2-minute AAV5 topical application via the custom delivery technique efficiently and selectively transduced keratocytes in the anterior stroma of normal and diseased rabbit corneas, as evident from immunocytochemistry and confocal microscopy. Transgene expression was first detected at day 3, peaked at day 7, and was maintained up to 16 weeks (the longest tested time point). Clinical and slit-lamp eye examination in live rabbits and H&E staining did not reveal any significant changes between AAV5

  5. Iterative Refinement Methods for Time-Domain Equalizer Design

    Directory of Open Access Journals (Sweden)

    Evans Brian L

    2006-01-01

    Full Text Available Commonly used time domain equalizer (TEQ) design methods have recently been unified as an optimization problem involving an objective function in the form of a Rayleigh quotient. The direct generalized eigenvalue solution relies on matrix decompositions. To reduce implementation complexity, we propose an iterative refinement approach in which the TEQ length starts at two taps and increases by one tap at each iteration. Each iteration involves matrix-vector multiplications and vector additions with small matrices and two-element vectors. At each iteration, the objective function either improves or the approach terminates. The iterative refinement approach provides a range of communication performance versus implementation complexity tradeoffs for any TEQ method that fits the Rayleigh quotient framework. We apply the proposed approach to three such TEQ design methods: maximum shortening signal-to-noise ratio, minimum intersymbol interference, and minimum delay spread.
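
The method frames TEQ design as maximizing a Rayleigh quotient J(w) = (w^T A w)/(w^T B w) and sidesteps a full generalized eigendecomposition by growing the equalizer one tap per iteration. A schematic sketch of that idea, with plain gradient ascent standing in for the paper's exact low-cost per-iteration update and random symmetric positive definite matrices standing in for A and B:

```python
import numpy as np

def rayleigh(w, A, B):
    """Objective J(w) = (w^T A w) / (w^T B w), common to the TEQ designs."""
    return (w @ A @ w) / (w @ B @ w)

def refine_taps(w, A, B, steps=300, lr=0.02):
    """Cheap local ascent of the Rayleigh quotient at a fixed TEQ length."""
    for _ in range(steps):
        r = rayleigh(w, A, B)
        grad = 2.0 * (A @ w - r * (B @ w)) / (w @ B @ w)
        w = w + lr * grad
        w = w / np.linalg.norm(w)        # J is scale-invariant
    return w

rng = np.random.default_rng(1)
N = 8                                    # maximum TEQ length in taps
M = rng.normal(size=(N, N)); A = M @ M.T + np.eye(N)
M = rng.normal(size=(N, N)); B = M @ M.T + np.eye(N)

w = np.array([1.0, 0.1])                 # start with a two-tap TEQ
best_w, best_J = w, -np.inf
for taps in range(2, N + 1):
    w = refine_taps(w, A[:taps, :taps], B[:taps, :taps])
    J = rayleigh(w, A[:taps, :taps], B[:taps, :taps])
    if J <= best_J:                      # no improvement: terminate early
        break
    best_w, best_J = w, J
    w = np.append(w, 0.0)                # grow the TEQ by one tap
print(f"{best_w.size}-tap TEQ, Rayleigh quotient {best_J:.4f}")
```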

  6. A System Dynamics Approach for the Selection of Contaminated Land Management Options

    Science.gov (United States)

    McKnight, U. S.; Kuebert, M.; Finkel, M.; Bieg, M.

    2006-12-01

    Large-scale contamination of land and groundwater is a widespread problem that can severely impact human health, the environment and the economy at many urban sites all over the world. Usually a considerable number of potential management solutions exist at each of these sites. A detailed investigation of all these options, however, is not economically feasible, which makes streamlining of the planning and decision process a mandatory requirement. Decisions should be taken as early as possible in order to reduce expenditures on site investigation. Therefore, a tiered decision-making procedure is required, including (i) identification and prioritization of focal areas of risk, (ii) feasibility screening of remediation targets and available management alternatives to narrow the range of possible options, and (iii) subsequent detailed investigation of only a select group of preferable options. For each of these elements, tailored decision and investigation concepts are required. These concepts and the applied methods should be specifically adapted to the type and scale of the particular decision to be taken: more target-oriented, cost-efficient investigation programs, as well as model-based assessment methods, are needed (Ruegner et al. 2006). A gap exists within this framework with respect to preliminary assessment methodologies representing the first decision level. To fill this gap, a new system dynamics approach has been developed that represents the system of source-pathway-receptor sequences by means of a mass flux model. The dynamics are governed by the effects of possible remedial actions, which are described as mass flux change over time (Serapiglia et al. 2005). This approach has been implemented in the preliminary evaluation tool CARO-plus (Cost-efficiency Assessment of Remediation Options), which models the effects of potential remedial actions, including tackling the contaminant source and managing the groundwater plume. The model represents the causal
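
A toy version of such a source-pathway-receptor mass-flux model, with first-order source release, natural attenuation and a source-removal remedy, can illustrate the dynamics; all rates here are invented and this is not the CARO-plus implementation:

```python
# Hypothetical first-order rates (1/yr): source release into the plume,
# natural attenuation along the pathway, and a source-removal remedy.
k_release, k_atten, k_remed = 0.08, 0.30, 0.25

dt, T = 0.1, 40.0
source, plume = 1000.0, 0.0                  # contaminant mass (kg)
for _ in range(int(T / dt)):                 # explicit Euler in time
    flux = k_release * source                # mass flux source -> plume
    source += dt * (-flux - k_remed * source)
    plume += dt * (flux - k_atten * plume)
print(f"after {T:.0f} yr: source {source:.1f} kg, plume {plume:.2f} kg")
```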

  7. Pacific Basin Heavy Oil Refining Capacity

    Directory of Open Access Journals (Sweden)

    David Hackett

    2013-02-01

    Full Text Available The United States today is Canada’s largest customer for oil and refined oil products. However, this relationship may be strained due to physical, economic and political influences. Pipeline capacity is approaching its limits; Canadian oil is selling at substantial discounts to world market prices; and U.S. demand for crude oil and finished products (such as gasoline) has begun to flatten significantly relative to historical rates. Lower demand, combined with increased shale oil production, means U.S. demand for Canadian oil is expected to continue to decline. Under these circumstances, gaining access to new markets such as those in the Asia-Pacific region is becoming more and more important for the Canadian economy. However, expanding pipeline capacity to the Pacific via the proposed Northern Gateway pipeline and the planned Trans Mountain pipeline expansion is only feasible when there is sufficient demand and processing capacity to support Canadian crude blends. Canadian heavy oil requires more refining and produces less valuable end products than other lighter and sweeter blends. Canadian producers must compete with lighter, sweeter oils from the Middle East, and elsewhere, for a place in the Pacific Basin refineries built to handle heavy crude blends. Canadian oil sands producers are currently expanding production capacity. Once complete, the Northern Gateway pipeline and the Trans Mountain expansion are expected to deliver an additional 500,000 to 1.1 million barrels a day to tankers on the Pacific coast. Through this survey of the capacity of Pacific Basin refineries, including existing and proposed facilities, we have concluded that there is sufficient technical capacity in the Pacific Basin to refine the additional Canadian volume; however, there may be some modifications required to certain refineries to allow them to process Western Canadian crude. Any additional capacity for Canadian oil would require refinery modifications or

  8. A Regionalization Approach to select the final watershed parameter set among the Pareto solutions

    Science.gov (United States)

    Park, G. H.; Micheletty, P. D.; Carney, S.; Quebbeman, J.; Day, G. N.

    2017-12-01

    The calibration of hydrological models often results in model parameters that are inconsistent with those from neighboring basins. Considering that physical similarity exists among neighboring basins, some of the physically related parameters should be consistent among them. Traditional manual calibration techniques require an iterative process to make the parameters consistent, which takes additional effort in model calibration. We developed a multi-objective optimization procedure to calibrate the National Weather Service (NWS) Research Distributed Hydrologic Model (RDHM), using the Non-dominated Sorting Genetic Algorithm (NSGA-II) with expert knowledge of the model parameter interrelationships incorporated as one objective function. The multi-objective algorithm enables us to obtain diverse parameter sets that are equally acceptable with respect to the objective functions and to choose one from the pool of parameter sets during a subsequent regionalization step. Although all Pareto solutions are non-inferior, we exclude parameter sets that show extreme values for any of the objective functions, to expedite the selection process. We use an a priori model parameter set derived from the physical properties of the watershed (Koren et al., 2000) to assess the similarity of a given parameter across basins. Each parameter is assigned a weight based on its assumed similarity, such that parameters that are similar across basins are given higher weights. The parameter weights are used to compute a closeness measure between Pareto sets of nearby basins. The regionalization approach chooses the Pareto parameter set that minimizes the closeness measure of the basin being regionalized. The presentation will describe the results of applying the regionalization approach to a set of pilot basins in the Upper Colorado basin as part of a NASA-funded project.
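
The regionalization step, choosing among a basin's Pareto-optimal parameter sets the one closest to a neighboring basin under similarity-based weights, might look like the following sketch; parameters, weights and values are hypothetical:

```python
import numpy as np

# Rows: Pareto-optimal parameter sets for the basin being regionalized.
pareto = np.array([[0.9, 55.0, 1.4],
                   [1.2, 48.0, 1.1],
                   [1.0, 60.0, 1.6]])
neighbor = np.array([1.1, 50.0, 1.2])   # parameters of a calibrated neighbor

# Higher weight = parameter assumed more physically transferable.
w = np.array([1.0, 0.2, 0.8])

scale = np.abs(pareto).max(axis=0)      # normalize parameter scales
closeness = np.sqrt((w * ((pareto - neighbor) / scale) ** 2).sum(axis=1))
print("selected parameter set:", pareto[np.argmin(closeness)])
```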

  9. Materials refining on the Moon

    Science.gov (United States)

    Landis, Geoffrey A.

    2007-05-01

    Oxygen, metals, silicon, and glass are raw materials that will be required for long-term habitation and for the production of structural materials and solar arrays on the Moon. A process sequence is proposed for refining these materials from lunar regolith, consisting of separating the required materials from lunar rock with fluorine. The fluorine is brought to the Moon in the form of potassium fluoride and is liberated from the salt by electrolysis in a eutectic salt melt. Tetrafluorosilane produced by this process is reduced to silicon in a plasma reduction stage; the fluorine salts are reduced to metals by reaction with metallic potassium. Fluorine is recovered from residual MgF2 and CaF2 by reaction with K2O.

  10. Application of discriminative models for interactive query refinement in video retrieval

    Science.gov (United States)

    Srivastava, Amit; Khanwalkar, Saurabh; Kumar, Anoop

    2013-12-01

    The ability to quickly search large volumes of video for specific actions or events can provide a dramatic new capability to intelligence agencies. Example-based queries from video are a form of content-based information retrieval (CBIR) where the objective is to retrieve clips from a video corpus, or stream, using a representative query sample to find "more like this". Often, the accuracy of video retrieval is largely limited by the gap between the available video descriptors and the underlying query concept, and such exemplar queries return many irrelevant results along with the relevant ones. In this paper, we present an Interactive Query Refinement (IQR) system which acts as a powerful tool to leverage human feedback and allow intelligence analysts to iteratively refine search queries for improved precision in the retrieved results. In our approach to IQR, we leverage discriminative models that operate on high dimensional features derived from low-level video descriptors in an iterative framework. Our IQR model solicits relevance feedback on examples selected from the region of uncertainty and updates the discriminating boundary to produce a relevance-ranked results list. We achieved a 358% relative improvement in Mean Average Precision (MAP) over the initial retrieval list at a rank cutoff of 100 over 4 iterations. We compare our discriminative IQR model approach to a naïve IQR and show that our model-based approach yields a 49% relative improvement over the no-model naïve system.
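
A compact sketch of such an uncertainty-driven relevance-feedback loop, using a linear SVM on synthetic features in place of the paper's unspecified discriminative model and video descriptors:

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 16))                     # clip feature vectors
y_true = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # hidden relevance

# Seed labels play the role of the analyst's initial exemplar feedback.
labeled = list(np.flatnonzero(y_true == 1)[:5]) + \
          list(np.flatnonzero(y_true == 0)[:5])

for _ in range(4):                                  # feedback iterations
    clf = LinearSVC(C=1.0).fit(X[labeled], y_true[labeled])
    margin = np.abs(clf.decision_function(X))       # 0 = most uncertain
    # Solicit feedback on clips in the region of uncertainty.
    queries = [i for i in np.argsort(margin) if i not in labeled][:10]
    labeled += queries                              # "analyst" labels them

ranking = np.argsort(-clf.decision_function(X))     # relevance-ranked list
print("precision@100:", y_true[ranking[:100]].mean())
```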

  11. Favoring the unfavored: Selective electrochemical nitrogen fixation using a reticular chemistry approach.

    Science.gov (United States)

    Lee, Hiang Kwee; Koh, Charlynn Sher Lin; Lee, Yih Hong; Liu, Chong; Phang, In Yee; Han, Xuemei; Tsung, Chia-Kuang; Ling, Xing Yi

    2018-03-01

    Electrochemical nitrogen-to-ammonia fixation is emerging as a sustainable strategy to replace the hydrogen- and energy-intensive operations of the Haber-Bosch process for ammonia production. However, current electrochemical nitrogen reduction reaction (NRR) progress is impeded by overwhelming competition from the hydrogen evolution reaction (HER) across all traditional NRR catalysts and by the requirement for elevated temperature/pressure. We achieve both excellent NRR selectivity (~90%) and a significant boost to Faradaic efficiency, by 10 percentage points, even under ambient operation, by coating a superhydrophobic metal-organic framework (MOF) layer over the NRR electrocatalyst. Our reticular chemistry approach exploits the MOF's water-repelling and molecule-concentrating effects to overcome HER-imposed bottlenecks, uncovering unprecedented electrochemical features of NRR critical for future theoretical studies. By favoring the originally unfavored NRR, we envisage our electrocatalytic design as a starting point for high-performance nitrogen-to-ammonia electroconversion directly from water-vapor-abundant air to address the increasing global demand for ammonia in (bio)chemical and energy industries.

  12. A new approach for an efficient human resource appraisal and selection

    Directory of Open Access Journals (Sweden)

    Hachicha Raoudha

    2012-12-01

    Full Text Available The aim of this paper is to provide a decision-making tool for solving a multi-criteria selection problem that can accommodate the qualitative details of task requirements and candidates’ competences. Our inquiry emphasizes the use of the 2-tuple linguistic representation model as the most suitable tool to overcome uncertain and subjective assessments. It is adapted to aggregate linguistic assessments of acquired and required competence resources generated by a group of appraisers. The resulting aggregated objective evaluations are then used as inputs to an extended version of the TOPSIS method. After certain customization, a ranking of candidates based on the degree of similarity between required and acquired competence levels is provided. The quality and efficiency of the proposed approach were confirmed through a real-life application in a university context. It ensures better management of the available candidates and, moreover, helps in facing absenteeism, identifying training needs, and so on.
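
The TOPSIS stage can be illustrated numerically; this generic sketch omits the 2-tuple linguistic aggregation and uses hypothetical scores and weights:

```python
import numpy as np

# Candidate-by-criterion matrix of aggregated scores (benefit criteria).
D = np.array([[7.0, 8.0, 6.0],
              [9.0, 6.5, 7.0],
              [6.0, 7.5, 8.5]])
w = np.array([0.5, 0.3, 0.2])                # criteria weights, sum to 1

R = D / np.linalg.norm(D, axis=0)            # vector-normalize each column
V = R * w                                    # weighted normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)   # ideal / anti-ideal solutions

d_pos = np.linalg.norm(V - ideal, axis=1)    # distance to the ideal
d_neg = np.linalg.norm(V - anti, axis=1)     # distance to the anti-ideal
closeness = d_neg / (d_pos + d_neg)          # relative closeness in [0, 1]
print("candidate ranking (best first):", np.argsort(-closeness))
```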

  13. Environmental effect and fate of selected phenols in aquatic ecosystems using microcosm approaches

    International Nuclear Information System (INIS)

    Portier, R.J.; Chen, H.M.; Meyers, S.P.

    1983-01-01

    Microbiological studies, together with physicochemical analyses of selected industrial source phenols of environmental significance, were conducted in continuous flow and carbon metabolism microcosms to determine the behavior of these priority pollutants in soil and sediment-water systems typical of coastal wetlands. Phenols used included 4-nitrophenol, 2,4,6-trichlorophenol, 2-chlorophenol, and phenol. The organophosphate, 14 C-UL-Methyl Parathion, was used as a benchmark toxicant control while 14 C-Ring-Phenol was employed for all phenolic compound additions. Microbial diversity, ATP, and specific enzyme systems (i.e., phosphatase, dehydrogenase) were continuously monitored along with 14 CO 2 expiration and 14 C assimilation by the cellular component. Residual analysis of all microcosm tests employed procedures using combined gas chromatography/high-performance liquid chromatography. Statistical analyses were conducted of variations of testing criteria, along with a ranking profile of relative biotransformation and biodegradation potential. Data presented confirm the validity of microcosm approaches and related correlation analysis in toxic substance fate investigations. 17 references, 6 figures, 1 table

  14. Selected remedy at the Queen City Farms superfund site: A risk management approach

    International Nuclear Information System (INIS)

    Weber, E.F.; Wilson, J.; Kirk, M.; Tochko, S.

    1994-01-01

    A risk management approach at a former industrial waste disposal site in western Washington resulted in a selected remedy that is cost-effective and that meets the CERCLA threshold criterion of protecting human health and the environment. The proposed remedy, which addresses contamination in soil and groundwater, does not require an ARARs waiver and received state and community acceptance. By analyzing the current and potential risk at the site, a proposed remedy was chosen that would control the source and naturally attenuate the groundwater plume. Source control will include removal and treatment of some light nonaqueous phase liquid (LNAPL) and some soil, followed by isolation of the remaining soil and LNAPL within a slurry wall and beneath a multilayer cap. A contingent groundwater extraction and treatment system was included to address uncertainty in the risk characterization. Implementing source control is predicted to result in a steady decline in volatile organic compound levels in the drinking water aquifer through adsorption, degradation, and dispersion. Exposure to groundwater during the period of natural attenuation will be controlled by monitoring, institutional controls, and a thorough characterization of the plume and receptors. 7 figs., 1 tab

  15. A PROBABILITY BASED APPROACH FOR THE ALLOCATION OF PLAYER DRAFT SELECTIONS IN AUSTRALIAN RULES FOOTBALL

    Directory of Open Access Journals (Sweden)

    Anthony Bedford

    2006-12-01

    Full Text Available Australian Rules Football, governed by the Australian Football League (AFL) is the most popular winter sport played in Australia. Like North American team based leagues such as the NFL, NBA and NHL, the AFL uses a draft system for rookie players to join a team's list. The existing method of allocating draft selections in the AFL is simply based on the reverse order of each team's finishing position for that season, with teams winning less than or equal to 5 regular season matches obtaining an additional early round priority draft pick. Much criticism has been levelled at the existing system since it rewards losing teams and does not encourage poorly performing teams to win matches once their season is effectively over. We propose a probability-based system that allocates a score based on teams that win 'unimportant' matches (akin to Carl Morris' definition of importance). We base the calculation of 'unimportance' on the likelihood of a team making the final eight following each round of the season. We then investigate a variety of approaches based on the 'unimportance' measure to derive a score for 'unimportant' and unlikely wins. We explore derivatives of this system, compare past draft picks with those obtained under our system, and discuss the attractiveness of teams knowing the draft reward for winning each match in a season.
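
A stripped-down sketch of the proposed scoring rule: a match's importance is approximated by how much it moves the team's estimated probability of making the final eight, and wins in unimportant matches by teams with little at stake earn draft points. The probabilities and the exact scoring formula below are placeholders, not the paper's model:

```python
# One team's season: (P(final eight) before the match,
#                     P(final eight) after the match, match won?).
season = [(0.30, 0.25, False), (0.25, 0.24, True), (0.10, 0.02, False),
          (0.02, 0.02, True), (0.02, 0.02, True)]

draft_score = 0.0
for p_before, p_after, won in season:
    importance = abs(p_after - p_before)    # Morris-style match importance
    if won:
        # Reward wins that matter little to the team's finals chances.
        draft_score += (1.0 - importance) * (1.0 - p_before)
print(f"draft allocation score: {draft_score:.2f}")
```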

  16. A MCDM approach for project finance selection: An application in the renewable energy sector

    Directory of Open Access Journals (Sweden)

    García-Bernabeu, Ana

    2015-05-01

    Full Text Available Renewable energy (RE) is emerging as a solution to replace fossil fuels and become the primary source of energy consumption. Investments in the RE sector involve huge amounts of capital but also many risks. The public sector plays an important role in promoting RE projects, but due to the need to reduce public expenditure, the private sector becomes essential in financing these types of projects. Project finance is widely used in RE projects and is especially attractive to the private sector because it can fund major projects off balance sheet. The objective of this paper is to present a decision-making tool for helping the private sector in the selection process of RE projects to be funded. The problem can be considered a multiple criteria decision-making problem where both financial and non-financial criteria have to be taken into account. Objective aggregation weights for those criteria are obtained using the Moderate Pessimism Decision Making approach, and a final ranking of the projects is obtained.

  17. Model selection approach suggests causal association between 25-hydroxyvitamin D and colorectal cancer.

    Directory of Open Access Journals (Sweden)

    Lina Zgaga

    Full Text Available Vitamin D deficiency has been associated with increased risk of colorectal cancer (CRC), but a causal relationship has not yet been confirmed. We investigate the direction of causation between vitamin D and CRC by extending the conventional approaches to allow pleiotropic relationships and by explicitly modelling unmeasured confounders. Plasma 25-hydroxyvitamin D (25-OHD), genetic variants associated with 25-OHD and CRC, and other relevant information was available for 2645 individuals (1057 CRC cases and 1588 controls) and included in the model. We investigate whether 25-OHD is likely to be causally associated with CRC, or vice versa, by selecting the best modelling hypothesis according to Bayesian predictive scores. We examine consistency over a range of prior assumptions. Model comparison showed preference for the causal association between low 25-OHD and CRC over the reverse causal hypothesis. This was confirmed by the posterior mean deviances obtained for both models (11.5 natural log units in favour of the causal model), and also by the deviance information criteria (DIC) computed for a range of prior distributions. Overall, models ignoring hidden confounding or pleiotropy had significantly poorer DIC scores. The results suggest a causal association between 25-OHD and colorectal cancer, and support the need for randomised clinical trials for further confirmation.
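
The deviance information criterion used for the model comparison is computed from posterior samples as DIC = D̄ + pD, where D̄ is the posterior mean deviance and pD = D̄ − D(θ̄). A generic reminder on a toy normal model with a standard normal prior (not the paper's genetic model):

```python
import numpy as np

rng = np.random.default_rng(3)
data = rng.normal(loc=1.0, scale=1.0, size=50)
n = data.size

def deviance(mu):
    """-2 log-likelihood of a Normal(mu, 1) model for the data."""
    return ((data - mu) ** 2).sum() + n * np.log(2.0 * np.pi)

# Posterior for mu under a standard normal prior (conjugate, known var 1).
post_mean = data.mean() * n / (n + 1)
post_sd = np.sqrt(1.0 / (n + 1))
theta = rng.normal(post_mean, post_sd, size=5000)   # posterior samples

D_bar = np.mean([deviance(t) for t in theta])       # posterior mean deviance
p_D = D_bar - deviance(theta.mean())                # effective no. of params
DIC = D_bar + p_D                                   # lower is better
print(f"Dbar = {D_bar:.1f}, pD = {p_D:.2f}, DIC = {DIC:.1f}")
```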

  18. Resolving public conflict in site selection process - a risk communication approach

    International Nuclear Information System (INIS)

    Ishizaka, Kaoru; Tanaka, Masaru

    2003-01-01

    In Japan, conflicts regarding the siting of waste disposal facilities occur frequently. In particular, siting of incinerators and landfills has become increasingly difficult because the public is highly concerned about dioxin issues. Inefficient siting of waste disposal facilities causes several social problems, such as a shortage of waste treatment and disposal facilities, rising waste management costs and an increase in the consumption of resources. While dealing with a similar situation, the Chemical Society of Japan successfully adopted a risk communication technique. Hence, the pragmatic use of a risk communication technique is proposed to avoid conflicts and to enable a smooth exchange of information in seeking cooperation in waste management. To achieve this, a study was conducted to resolve conflicts between residents and the municipality over the selection of a site for a solid waste treatment and disposal facility. This study aims to discuss the subject of risk communication for the waste disposal system in Japan. The study was performed through personal interviews and a questionnaire covering opposing parties in the town. As a result of the survey, a risk communication approach for a waste treatment and disposal system is presented in the paper, addressing issues such as the building of social trust, pragmatic use of the communication process, installation of credible information sources, and environmental education and awareness

  19. A VECM approach to detangling growth, exports, imports and FDI knot in selected CEE countries

    Directory of Open Access Journals (Sweden)

    Saša Žiković

    2014-12-01

    Full Text Available The authors analyze the relationship between GDP, the imports-coverage ratio (NEX), FDI and gross fixed capital formation (GFC) in selected CEE countries by using an error correction model. The empirical results confirm a positive long-run influence of the imports-coverage ratio, FDI and GFC on GDP growth for all of the countries except Croatia. In the case of Croatia, there is a significant negative feedback between FDI and GDP growth in the long run and a positive one in the short run. Drawing on B. Horvat’s research on this subject, a logical explanation of this seemingly paradoxical behavior is suggested. The second uncommon result is the long-run positive relationship between GDP and the imports-coverage ratio. This result speaks in favor of a conservative approach to running a national economy, in which the current account and the imports-coverage ratio are taken into account and economic growth is achieved through slower but stable, internally driven growth.
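
For readers wanting to run this kind of analysis, statsmodels provides a VECM implementation; below is a skeletal example on synthetic data, where the variable names mirror the abstract but the lag order, cointegration rank and deterministic terms are placeholders that may differ from the paper's specification:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

rng = np.random.default_rng(5)
T = 120                                   # quarterly observations
common = np.cumsum(rng.normal(size=T))    # shared stochastic trend
data = pd.DataFrame({
    "GDP": common + rng.normal(scale=0.5, size=T),
    "NEX": 0.8 * common + rng.normal(scale=0.5, size=T),
    "FDI": 0.5 * common + rng.normal(scale=0.5, size=T),
    "GFC": 0.6 * common + rng.normal(scale=0.5, size=T),
})

model = VECM(data, k_ar_diff=2, coint_rank=1, deterministic="ci")
res = model.fit()
print(res.beta)      # cointegrating (long-run) relationship
print(res.alpha)     # loading (adjustment) coefficients
```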

  20. Evaluation and selection of indicators for land degradation and desertification monitoring: methodological approach.

    Science.gov (United States)

    Kosmas, C; Kairis, Or; Karavitis, Ch; Ritsema, C; Salvati, L; Acikalin, S; Alcala, M; Alfama, P; Atlhopheng, J; Barrera, J; Belgacem, A; Solé-Benet, A; Brito, J; Chaker, M; Chanda, R; Coelho, C; Darkoh, M; Diamantis, I; Ermolaeva, O; Fassouli, V; Fei, W; Feng, J; Fernandez, F; Ferreira, A; Gokceoglu, C; Gonzalez, D; Gungor, H; Hessel, R; Juying, J; Khatteli, H; Khitrov, N; Kounalaki, A; Laouina, A; Lollino, P; Lopes, M; Magole, L; Medina, L; Mendoza, M; Morais, P; Mulale, K; Ocakoglu, F; Ouessar, M; Ovalle, C; Perez, C; Perkins, J; Pliakas, F; Polemio, M; Pozo, A; Prat, C; Qinke, Y; Ramos, A; Ramos, J; Riquelme, J; Romanenkov, V; Rui, L; Santaloia, F; Sebego, R; Sghaier, M; Silva, N; Sizemskaya, M; Soares, J; Sonmez, H; Taamallah, H; Tezcan, L; Torri, D; Ungaro, F; Valente, S; de Vente, J; Zagal, E; Zeiliguer, A; Zhonging, W; Ziogas, A

    2014-11-01

    An approach to derive relationships for defining land degradation and desertification risk and developing appropriate tools for assessing the effectiveness of the various land management practices using indicators is presented in the present paper. In order to investigate which indicators are most effective in assessing the level of desertification risk, a total of 70 candidate indicators was selected providing information for the biophysical environment, socio-economic conditions, and land management characteristics. The indicators were defined in 1,672 field sites located in 17 study areas in the Mediterranean region, Eastern Europe, Latin America, Africa, and Asia. Based on an existing geo-referenced database, classes were designated for each indicator and a sensitivity score to desertification was assigned to each class based on existing research. The obtained data were analyzed for the various processes of land degradation at farm level. The derived methodology was assessed using independent indicators, such as the measured soil erosion rate, and the organic matter content of the soil. Based on regression analyses, the collected indicator set can be reduced to a number of effective indicators ranging from 8 to 17 in the various processes of land degradation. Among the most important indicators identified as affecting land degradation and desertification risk were rain seasonality, slope gradient, plant cover, rate of land abandonment, land-use intensity, and the level of policy implementation.
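
The reported reduction from 70 candidate indicators to a small effective subset via regression can be illustrated generically with an L1-penalized regression, which zeroes out uninformative indicators; the data below are synthetic, not the study's field-site database:

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n_sites, n_indicators = 400, 70
X = rng.normal(size=(n_sites, n_indicators))     # candidate indicators
coef = np.zeros(n_indicators)                    # only a few truly matter
coef[[0, 3, 7, 12]] = [2.0, -1.5, 1.0, 0.8]
y = X @ coef + rng.normal(scale=0.5, size=n_sites)  # e.g. erosion rate

model = Lasso(alpha=0.1).fit(StandardScaler().fit_transform(X), y)
effective = np.flatnonzero(np.abs(model.coef_) > 1e-6)
print(f"{effective.size} effective indicators:", effective)
```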

  1. Refinement in Z and Object-Z foundations and advanced applications

    CERN Document Server

    Derrick, John

    2013-01-01

    Refinement is one of the cornerstones of the formal approach to software engineering, and its use in various domains has led to research on new applications and generalisation. This book brings together this important research in one volume, with the addition of examples drawn from different application areas. It covers four main themes: data refinement and its application to Z; generalisations of refinement that change the interface and atomicity of operations; refinement in Object-Z; and modelling state and behaviour by combining Object-Z with CSP. Refinement in Z and Object-Z: Foundations and Advanced A

  2. AHP approach for supplier evaluation and selection in a steel manufacturing company

    OpenAIRE

    Tahriri, Farzad; Osman, M. Rasid; Ali, Aidy; Yusuff, Rosnah Mohd; Esfandiary, Alireza

    2008-01-01

    Supplier selection is one of the most critical activities of purchasing management in a supply chain. It is a complex problem involving qualitative and quantitative multi-criteria, and a trade-off between these tangible and intangible factors is essential in selecting the best supplier. The work incorporates AHP in choosing the best suppliers. The results suggest that the AHP process makes it possible to introduce the optimum order quantities among the selected suppliers so that the Tot...
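
The core AHP computation reduces to extracting the principal eigenvector of a pairwise-comparison matrix as the priority weights and checking the consistency ratio; a minimal sketch with an invented three-criteria comparison matrix:

```python
import numpy as np

# Saaty-scale pairwise comparisons: cost vs. quality vs. delivery.
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                     # priority vector

n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)         # consistency index
RI = 0.58                                    # Saaty's random index for n = 3
print("weights:", weights.round(3), "consistency ratio:", round(CI / RI, 3))
```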

  3. Multilevel local refinement and multigrid methods for 3-D turbulent flow

    Energy Technology Data Exchange (ETDEWEB)

    Liao, C.; Liu, C. [UCD, Denver, CO (United States); Sung, C.H.; Huang, T.T. [David Taylor Model Basin, Bethesda, MD (United States)

    1996-12-31

    A numerical approach based on multigrid, multilevel local refinement, and preconditioning methods for solving the incompressible Reynolds-averaged Navier-Stokes equations is presented. 3-D turbulent flow around an underwater vehicle is computed. 3 multigrid levels and 2 local refinement grid levels are used. The global grid is 24 x 8 x 12; the first patch is 40 x 16 x 20 and the second patch is 72 x 32 x 36. 4th-order artificial dissipation is used for numerical stability. The conservative artificial compressibility method is used for further improvement of convergence. To improve the accuracy at the coarse/fine grid interface of the local refinement, a flux interpolation method for the refined grid boundary is used. The numerical results are in good agreement with experimental data. The local refinement can improve the prediction accuracy significantly. The flux interpolation method for local refinement keeps conservation on a composite grid, therefore further improving the prediction accuracy.
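
The solver described in this record is problem-specific, but the underlying two-grid correction idea (smooth, restrict the residual, solve a coarse error equation, prolongate and correct) can be shown on a 1D Poisson model problem; this generic sketch is not the RANS code above:

```python
import numpy as np

def jacobi(u, f, h, sweeps, omega=2 / 3):
    """Weighted-Jacobi smoothing for -u'' = f with zero Dirichlet ends."""
    for _ in range(sweeps):
        u[1:-1] = (1 - omega) * u[1:-1] + omega * 0.5 * (
            u[:-2] + u[2:] + h * h * f[1:-1])
    return u

def two_grid(u, f, h):
    u = jacobi(u, f, h, sweeps=3)                 # pre-smooth
    r = np.zeros_like(u)                          # residual r = f + u''
    r[1:-1] = f[1:-1] + (u[:-2] - 2 * u[1:-1] + u[2:]) / (h * h)
    rc = r[::2].copy()                            # restrict by injection
    ec = jacobi(np.zeros_like(rc), rc, 2 * h, sweeps=50)  # coarse error eq.
    fine_x = np.arange(u.size)
    e = np.interp(fine_x, fine_x[::2], ec)        # linear prolongation
    u += e                                        # coarse-grid correction
    return jacobi(u, f, h, sweeps=3)              # post-smooth

n = 65                                            # fine grid: 2**6 + 1 points
h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
f = np.pi ** 2 * np.sin(np.pi * x)                # exact solution: sin(pi x)
u = np.zeros(n)
for _ in range(10):
    u = two_grid(u, f, h)
print("max error:", np.abs(u - np.sin(np.pi * x)).max())
```

Replacing the approximate coarse solve with a recursive call gives a full V-cycle; local refinement then adds finer patches only where accuracy demands it.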

  4. Effects of grain refinement on the rheological behaviors of semisolid hypoeutectic Al-Si alloys

    International Nuclear Information System (INIS)

    Yan, M.; Luo, W.

    2007-01-01

    The paper experimentally investigated the effects of grain refinement on the rheological response of Al and hypoeutectic Al-Si alloys. Selected refiners included K 2 TiF 6 , K 2 TiF 6 plus graphite, and Al-5Ti-B. The apparent viscosity of the semisolid Al alloys was measured during solidification. Samples at different solid fractions were quenched to observe the microstructure. It was found that grain refinement drastically lowered the apparent viscosity of the Al-Si alloys. Among the selected refiners, the effect of Al-5Ti-B was the best, and the effect of K 2 TiF 6 plus graphite was better than that of K 2 TiF 6 alone. The silicon content of the Al alloys affected the apparent viscosity: with increasing silicon content the apparent viscosity decreased, resulting from silicon promoting the refining effects of both titanium and boron

  5. Refined geometric transition and qq-characters

    Science.gov (United States)

    Kimura, Taro; Mori, Hironori; Sugimoto, Yuji

    2018-01-01

    We show the refinement of the prescription for the geometric transition in refined topological string theory and, as an application, discuss the possibility of describing qq-characters from the string theory point of view. Although the suggested way to operate the refined geometric transition has passed several checks, we additionally find in this paper that the presence of the preferred direction brings a nontrivial effect. We provide a modified formula that takes this point into account. We then apply our prescription for the refined geometric transition to propose a stringy description of doubly quantized Seiberg-Witten curves, called qq-characters, in certain cases.

  6. Diffusion-assisted selective dynamical recoupling: A new approach to measure background gradients in magnetic resonance

    Science.gov (United States)

    Álvarez, Gonzalo A.; Shemesh, Noam; Frydman, Lucio

    2014-02-01

    Dynamical decoupling, a generalization of the original NMR spin-echo sequence, is becoming increasingly relevant as a tool for reducing decoherence in quantum systems. Such sequences apply non-equidistant refocusing pulses for optimizing the coupling between systems, and environmental fluctuations characterized by a given noise spectrum. One such sequence, dubbed Selective Dynamical Recoupling (SDR) [P. E. S. Smith, G. Bensky, G. A. Álvarez, G. Kurizki, and L. Frydman, Proc. Natl. Acad. Sci. 109, 5958 (2012)], allows one to coherently reintroduce diffusion decoherence effects driven by fluctuations arising from restricted molecular diffusion [G. A. Álvarez, N. Shemesh, and L. Frydman, Phys. Rev. Lett. 111, 080404 (2013)]. The fully-refocused, constant-time, and constant-number-of-pulses nature of SDR also allows one to filter out "intrinsic" T1 and T2 weightings, as well as pulse errors acting as additional sources of decoherence. This article explores such features when the fluctuations are now driven by unrestricted molecular diffusion. In particular, we show that diffusion-driven SDR can be exploited to investigate the decoherence arising from the frequency fluctuations imposed by internal gradients. As a result, SDR presents a unique way of probing and characterizing these internal magnetic fields, given an a priori known free diffusion coefficient. This has important implications in studies of structured systems, including porous media and live tissues, where the internal gradients may serve as fingerprints for the system's composition or structure. The principles of this method, along with full analytical solutions for the unrestricted diffusion-driven modulation of the SDR signal, are presented. The potential of this approach is demonstrated with the generation of a novel source of MRI contrast, based on the background gradients active in an ex vivo mouse brain. Additional features and limitations of this new method are discussed.

  7. Diffusion-assisted selective dynamical recoupling: A new approach to measure background gradients in magnetic resonance

    International Nuclear Information System (INIS)

    Álvarez, Gonzalo A.; Shemesh, Noam; Frydman, Lucio

    2014-01-01

    Dynamical decoupling, a generalization of the original NMR spin-echo sequence, is becoming increasingly relevant as a tool for reducing decoherence in quantum systems. Such sequences apply non-equidistant refocusing pulses for optimizing the coupling between systems, and environmental fluctuations characterized by a given noise spectrum. One such sequence, dubbed Selective Dynamical Recoupling (SDR) [P. E. S. Smith, G. Bensky, G. A. Álvarez, G. Kurizki, and L. Frydman, Proc. Natl. Acad. Sci. 109, 5958 (2012)], allows one to coherently reintroduce diffusion decoherence effects driven by fluctuations arising from restricted molecular diffusion [G. A. Álvarez, N. Shemesh, and L. Frydman, Phys. Rev. Lett. 111, 080404 (2013)]. The fully-refocused, constant-time, and constant-number-of-pulses nature of SDR also allows one to filter out “intrinsic” T 1 and T 2 weightings, as well as pulse errors acting as additional sources of decoherence. This article explores such features when the fluctuations are now driven by unrestricted molecular diffusion. In particular, we show that diffusion-driven SDR can be exploited to investigate the decoherence arising from the frequency fluctuations imposed by internal gradients. As a result, SDR presents a unique way of probing and characterizing these internal magnetic fields, given an a priori known free diffusion coefficient. This has important implications in studies of structured systems, including porous media and live tissues, where the internal gradients may serve as fingerprints for the system's composition or structure. The principles of this method, along with full analytical solutions for the unrestricted diffusion-driven modulation of the SDR signal, are presented. The potential of this approach is demonstrated with the generation of a novel source of MRI contrast, based on the background gradients active in an ex vivo mouse brain. Additional features and limitations of this new method are discussed

  8. Analytical Modeling Approach to Study Harmonic Mitigation in AC Grids with Active Impedance at Selective Frequencies

    Directory of Open Access Journals (Sweden)

    Gonzalo Abad

    2018-05-01

    Full Text Available This paper presents an analytical model oriented to the study of harmonic mitigation in AC grids. As is well known, the presence of undesired harmonics in AC grids can be mitigated in several ways. In this paper, however, a power-electronic-based active impedance at selective frequencies (ACISEF) is used, due to its proven flexibility and adaptability to the changing characteristics of AC grids. Hence, the proposed analytical model is specially conceived to jointly consider the model of the AC grid itself, with its equivalent electric impedances, and the power-electronic-based ACISEF, including its control loops. In addition, the proposed analytical model has practical and useful properties: it is simple to understand and to use, it has low computational cost, it adapts simply to different AC grid scenarios, and it provides an accurate enough representation of reality. The benefits of using the proposed analytical model are shown through several examples of its usefulness, including an analysis of stability and the identification of sources of instability for a robust design, an analysis of effectiveness in harmonic mitigation, an analysis to assist in choosing the most suitable active impedance for a given state of the AC grid, an analysis of the interaction between different compensators, and so on. To conclude, experimental validation of a 2.15 kA ACISEF in a real 33 kV AC grid is provided, in which real users (household and industry loads) and crucial elements such as wind parks and HVDC systems are interconnected nearby.

  9. A volatolomic approach for studying plant variability: the case of selected Helichrysum species (Asteraceae).

    Science.gov (United States)

    Giuliani, Claudia; Lazzaro, Lorenzo; Calamassi, Roberto; Calamai, Luca; Romoli, Riccardo; Fico, Gelsomina; Foggi, Bruno; Mariotti Lippi, Marta

    2016-10-01

    The species of Helichrysum sect. Stoechadina (Asteraceae) are well-known for their secondary metabolite content and characteristic aromatic bouquets. In the wild, populations exhibit a wide phenotypic plasticity, which makes the circumscription of species and infraspecific ranks difficult. Previous investigations on the Helichrysum italicum complex focused on a possible phytochemical typification based on hydrodistilled essential oils. The aims of this paper are three-fold: (i) characterizing the volatile profiles of different populations, (ii) testing how these profiles vary across populations, and (iii) assessing how phytochemical diversity may contribute to solving taxonomic problems. Nine selected Helichrysum populations, included within the H. italicum complex, Helichrysum litoreum and Helichrysum stoechas, were investigated. H. stoechas was chosen as an outgroup for validating the method. After collection in the wild, plants were cultivated under standard growing conditions for over one year. Annual leafy shoots were screened in the post-blooming period for emissions of volatile organic compounds (VOCs) by means of headspace solid-phase microextraction coupled with gas chromatography and mass spectrometry (HS-SPME-GC/MS). The VOC composition analysis revealed the production of 386 different compounds overall, with terpenes being the most represented compound class. Statistical data processing allowed the identification of the indicator compounds that differentiate the single populations, revealing the influence of the geographical provenance area in determining the volatile profiles. These results suggested the potential use of VOCs as valuable diacritical characters for discriminating the Helichrysum populations. In addition, the cross-validation analysis hinted at the potential of this volatolomic study for discriminating the Helichrysum species and subspecies, highlighting a general congruence with the current taxonomic treatment of the genus. The consistency

  10. A Precision Microbiome Approach Using Sucrose for Selective Augmentation of Staphylococcus epidermidis Fermentation against Propionibacterium acnes

    Directory of Open Access Journals (Sweden)

    Yanhan Wang

    2016-11-01

    Full Text Available Acne dysbiosis occurs when there is a microbial imbalance involving the over-growth of Propionibacterium acnes (P. acnes) in the acne microbiome. In our previous study, we demonstrated that Staphylococcus epidermidis (S. epidermidis), a probiotic skin bacterium, can exploit glycerol fermentation to produce short-chain fatty acids (SCFAs), which have antimicrobial activities that suppress the growth of P. acnes. Unlike glycerol, sucrose is chosen here as a selective fermentation initiator (SFI) that can specifically intensify the fermentation activity of S. epidermidis, but not of P. acnes. A co-culture of P. acnes and fermenting S. epidermidis in the presence of sucrose led to a significant reduction in the growth of P. acnes. The reduction was abolished when P. acnes was co-cultured with non-fermenting S. epidermidis. Results from nuclear magnetic resonance (NMR) analysis revealed that four SCFAs (acetic acid, butyric acid, lactic acid, and succinic acid) were detectable in the media of S. epidermidis sucrose fermentation. To validate the interference of S. epidermidis sucrose fermentation with P. acnes, mouse ears were injected with both P. acnes and S. epidermidis plus sucrose or phosphate buffered saline (PBS). The level of macrophage-inflammatory protein-2 (MIP-2) and the number of P. acnes in ears injected with the two bacteria plus sucrose were considerably lower than those in ears injected with the two bacteria plus PBS. Our results demonstrate a precision microbiome approach using sucrose as an SFI for S. epidermidis, holding future potential as a novel modality to equilibrate dysbiotic acne.

  11. Automated knowledge-base refinement

    Science.gov (United States)

    Mooney, Raymond J.

    1994-01-01

    Over the last several years, we have developed several systems for automatically refining incomplete and incorrect knowledge bases. These systems are given an imperfect rule base and a set of training examples and minimally modify the knowledge base to make it consistent with the examples. One of our most recent systems, FORTE, revises first-order Horn-clause knowledge bases. This system can be viewed as automatically debugging Prolog programs based on examples of correct and incorrect I/O pairs. In fact, we have already used the system to debug simple Prolog programs written by students in a programming language course. FORTE has also been used to automatically induce and revise qualitative models of several continuous dynamic devices from qualitative behavior traces. For example, it has been used to induce and revise a qualitative model of a portion of the Reaction Control System (RCS) of the NASA Space Shuttle. By fitting a correct model of this portion of the RCS to simulated qualitative data from a faulty system, FORTE was also able to correctly diagnose simple faults in this system.

  12. A method for refining oil

    Energy Technology Data Exchange (ETDEWEB)

    Bruskin, Yu.A.; Gorokhov, V.V.; Kotler, L.D.; Kovalenko, N.F.; Spasskiy, Yu.B.; Titov, A.M.; Vlasenko, V.Ye.; Vytnov, V.A.

    1983-01-01

    In this method for refining oil by distillation, directly distilled (straight-run) gases and a benzine fraction (BS) are isolated, and the benzine fraction is subjected to pyrolysis. In order to increase the yield of the lower olefins and to reduce energy expenditures, the distillation is conducted so that 10 to 40 percent of the benzine fraction's potential content is isolated together with the straight-run gases. The resulting mixture is absorbed by the remaining part of the benzine fraction at a pressure of 1.5 to 6 atmospheres, the saturated absorbent obtained is fed to pyrolysis, and the resulting pyrolysis gas is subsequently mixed with the unabsorbed product for their joint gas separation. Compared with the known method, the proposed method makes it possible to reduce energy expenditures, which is achieved through a reduction in the volume of reflux in the tower, and to increase the yield of olefins through processing of the vapor-gas mixture of the benzine and the straight-run gases.

  13. Protein structure modeling and refinement by global optimization in CASP12.

    Science.gov (United States)

    Hong, Seung Hwan; Joung, InSuk; Flores-Canales, Jose C; Manavalan, Balachandran; Cheng, Qianyi; Heo, Seungryong; Kim, Jong Yun; Lee, Sun Young; Nam, Mikyung; Joo, Keehyoung; Lee, In-Ho; Lee, Sung Jong; Lee, Jooyoung

    2018-03-01

    For protein structure modeling in the CASP12 experiment, we have developed a new protocol based on our previous CASP11 approach. The global optimization method of conformational space annealing (CSA) was applied to 3 stages of modeling: multiple sequence-structure alignment, three-dimensional (3D) chain building, and side-chain re-modeling. For better template selection and model selection, we updated our model quality assessment (QA) method with the newly developed SVMQA (support vector machine for quality assessment). For 3D chain building, we updated our energy function by including restraints generated from predicted residue-residue contacts. New energy terms for the predicted secondary structure and predicted solvent accessible surface area were also introduced. For difficult targets, we proposed a new method, LEEab, where the template term played a less significant role than it did in LEE, complemented by increased contributions from other terms such as the predicted contact term. For TBM (template-based modeling) targets, LEE performed better than LEEab, but for FM targets, LEEab was better. For model refinement, we modified our CASP11 molecular dynamics (MD) based protocol by using explicit solvents and tuning down restraint weights. Refinement results from MD simulations that used a new augmented statistical energy term in the force field were quite promising. Finally, when using inaccurate information (such as the predicted contacts), it was important to use the Lorentzian function for which the maximal penalty arising from wrong information is always bounded. © 2017 Wiley Periodicals, Inc.
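
    The closing remark about the Lorentzian function is easy to see numerically: a harmonic restraint penalty grows without bound, so a single wrong predicted contact can dominate the energy, whereas a Lorentzian-type penalty saturates. The exact functional form and parameters used by the authors are not given in the abstract; the sketch below uses one common bounded form:

```python
import numpy as np

def harmonic(d, d0, k=1.0):
    # unbounded: a single bad restraint can dominate the total energy
    return k * (d - d0) ** 2

def lorentzian(d, d0, k=1.0, gamma=1.0):
    # bounded: the penalty never exceeds k, however wrong the restraint
    r2 = ((d - d0) / gamma) ** 2
    return k * r2 / (1.0 + r2)

d = np.linspace(0.0, 20.0, 5)
print(harmonic(d, 8.0))    # grows quadratically away from d0
print(lorentzian(d, 8.0))  # approaches, but never exceeds, k = 1.0
```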

  14. Refining the ischemic penumbra with topography.

    Science.gov (United States)

    Thirugnanachandran, Tharani; Ma, Henry; Singhal, Shaloo; Slater, Lee-Anne; Davis, Stephen M; Donnan, Geoffrey A; Phan, Thanh

    2018-04-01

    It has been 40 years since the ischemic penumbra was first conceptualized through work on animal models. The topography of penumbra has been portrayed as an infarcted core surrounded by penumbral tissue and an extreme rim of oligemic tissue. This picture has been used in many review articles and textbooks before the advent of modern imaging. In this paper, we review our understanding of the topography of the ischemic penumbra from the initial experimental animal models to current developments with neuroimaging which have helped to further define the temporal and spatial evolution of the penumbra and refine our knowledge. The concept of the penumbra has been successfully applied in clinical trials of endovascular therapies with a time window as long as 24 h from onset. Further, there are reports of "good" outcome even in patients with a large ischemic core. This latter observation of good outcome despite having a large core requires an understanding of the topography of the penumbra and the function of the infarcted regions. It is proposed that future research in this area takes departure from a time-dependent approach to a more individualized tissue and location-based approach.

  15. FPGA Congestion-Driven Placement Refinement

    Energy Technology Data Exchange (ETDEWEB)

    Vicente de, J.

    2005-07-01

    Routing congestion usually limits the full exploitation of the FPGA logic resources. A key question can be formulated regarding the benefits of estimating congestion at the placement stage. In recent years, the idea of a detailed placement that takes congestion into account has been gaining acceptance. In this paper, we resort to the Thermodynamic Simulated Annealing (TSA) algorithm to perform a congestion-driven placement refinement on top of the common Bounding-Box pre-optimized solution. The adaptive properties of TSA allow the search to preserve the quality of the pre-optimized solution while improving other fine-grain objectives. Regarding the cost function, two approaches have been considered. In the first one, Expected Occupation (EO), a detailed probabilistic model accounting for channel congestion is evaluated. We show that in spite of the minute detail of EO, the inherent uncertainty of this probabilistic model prevents it from relieving congestion beyond what the sole application of the Bounding-Box cost function achieves. In the second approach we resort to the fast Rectilinear Steiner Regions algorithm to perform not an estimation but a measurement of the global routing congestion. This second strategy allows us to successfully reduce the required channel width for a set of benchmark circuits with respect to the widespread Versatile Place and Route (VPR) tool. (Author) 31 refs.
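
    For readers unfamiliar with annealing-based placement refinement, the sketch below shows the basic move-and-accept loop with a half-perimeter wirelength cost. It is plain simulated annealing, not the thermodynamic variant (TSA) used in the paper, and a congestion-driven version would add a congestion term to the cost; the netlist format is hypothetical:

```python
import math, random

def hpwl(nets, pos):
    """Total half-perimeter wirelength; `nets` is a list of cell lists,
    `pos` maps each cell to an (x, y) grid location."""
    total = 0
    for net in nets:
        xs = [pos[c][0] for c in net]
        ys = [pos[c][1] for c in net]
        total += (max(xs) - min(xs)) + (max(ys) - min(ys))
    return total

def anneal(nets, pos, t=5.0, cooling=0.995, steps=20000):
    cells = list(pos)
    cost = hpwl(nets, pos)
    for _ in range(steps):
        a, b = random.sample(cells, 2)
        pos[a], pos[b] = pos[b], pos[a]          # propose a cell swap
        new = hpwl(nets, pos)
        if new > cost and random.random() >= math.exp((cost - new) / t):
            pos[a], pos[b] = pos[b], pos[a]      # reject: undo the swap
        else:
            cost = new
        t *= cooling                             # geometric cooling schedule
    return pos, cost
```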

  16. A Reinforcement Learning Approach to Improve the Argument Selection Effectiveness in Argumentation-based Negotiation

    OpenAIRE

    Amandi, Analia Adriana; Monteserin, Ariel José

    2016-01-01

    Argument selection is considered the essence of the strategy in argumentation-based negotiation. An agent, which is arguing during a negotiation, must decide what arguments are the best to persuade the opponent. In fact, in each negotiation step, the agent must select an argument from a set of candidate arguments by applying some selection policy. Following this policy, the agent observes some factors of the negotiation context, for instance: trust in the opponent and expected utility of the...

  17. Training self-assessment and task-selection skills: A cognitive approach to improving self-regulated learning

    NARCIS (Netherlands)

    Kostons, Danny; Van Gog, Tamara; Paas, Fred

    2012-01-01

    Kostons, D., Van Gog, T., & Paas, F. (2012). Training self-assessment and task-selection skills: A cognitive approach to improving self-regulated learning. Learning and Instruction, 22(2), 121-132. doi:10.1016/j.learninstruc.2011.08.004

  18. The Generation of AlmFe in Dilute Aluminium Alloys with Different Grain Refining Additions

    Science.gov (United States)

    Meredith, M. W.; Greer, A. L.; Evans, P. V.; Hamerton, R. G.

    Al₁₃Fe₄, Al₆Fe and AlₘFe are common intermetallics in commercial AA1XXX series Al alloys. Grain-refining additions (based on either Al-Ti-B or Al-Ti-C) are usually added to such alloys during solidification processing to aid the grain structure development. They also influence the favoured intermetallic and, hence, can affect the materials' properties. This work simulates commercial casting practices in an attempt to determine the mechanisms by which one intermetallic phase is favoured over another by the introduction of grain-refining additions. Directional solidification experiments on Al-0.3wt.%Fe-0.15wt.%Si with and without grain refiner are conducted using Bridgman apparatus. The type, amount and effectiveness of the grain-refining additions are altered and the resulting intermetallic phase selection followed. The materials are characterised using optical microscopy, scanning electron microscopy and X-ray diffraction. AlₘFe is seen to form when Al-Ti-B grain refiner is introduced but only when the refinement is successful; reducing the effectiveness of the refiner led to Al₆Fe forming under all conditions. Al-Ti-C refiners are seen to promote AlₘFe at lower solidification velocities than when Al-Ti-B was used, even though the grain structure was not as refined. These trends can be explained within existing eutectic theory, by considering growth undercooling.

  19. AHP approach for supplier evaluation and selection in a steel manufacturing company

    Directory of Open Access Journals (Sweden)

    Farzad Tahriri

    2008-12-01

    Full Text Available Supplier selection is one of the most critical activities of purchasing management in the supply chain. Supplier selection is a complex problem involving qualitative and quantitative multi-criteria. A trade-off between these tangible and intangible factors is essential in selecting the best supplier. The work incorporates AHP in choosing the best suppliers. The results suggest that the AHP process makes it possible to introduce optimum order quantities among the selected suppliers so that the Total Value of Purchasing (TVP) is maximized. In this work, an AHP-based supplier selection model is formulated and then applied to a real case study for a steel manufacturing company in Malaysia. The use of the proposed model indicates that it can be applied to improve and assist decision-making in resolving the supplier selection problem and choosing the optimal supplier combination. The work presents a systematic identification of the important criteria for the supplier selection process. In addition, the results demonstrate the development and application of a multi-criteria decision model for the evaluation and selection of suppliers; by scoring supplier performance, the proposed AHP model is able to reduce the time taken to select a vendor.
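
    A minimal sketch of the numerical core of AHP follows: criterion weights are taken from the principal eigenvector of the pairwise comparison matrix, with a consistency check on the judgments. The matrix entries are hypothetical values on Saaty's 1-9 scale, not data from the study:

```python
import numpy as np

# pairwise comparisons of three criteria (e.g. cost vs quality vs delivery)
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)              # principal eigenvalue lambda_max
w = np.abs(vecs[:, k].real)
w /= w.sum()                          # priority weights of the criteria

n = A.shape[0]
CI = (vals.real[k] - n) / (n - 1)     # consistency index
CR = CI / 0.58                        # random index RI = 0.58 for n = 3
print(w, CR)                          # CR < 0.1 is conventionally acceptable
```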

  20. Refinement Checking on Parametric Modal Transition Systems

    DEFF Research Database (Denmark)

    Benes, Nikola; Kretínsky, Jan; Larsen, Kim Guldstrand

    2015-01-01

    Modal transition systems (MTS) is a well-studied specification formalism of reactive systems supporting a step-wise refinement methodology. Despite its many advantages, the formalism as well as its currently known extensions are incapable of expressing some practically needed aspects in the refinement…

  1. Comparing Syntactic and Semantics Action Refinement

    NARCIS (Netherlands)

    Goltz, Ursula; Gorrieri, Roberto; Rensink, Arend

    The semantic definition of action refinement on labelled configuration structures is compared with the notion of syntactic substitution, which can be used as another notion of action refinement in a process algebraic setting. The comparison is done by studying a process algebra equipped with

  2. On Syntactic and Semantic Action Refinement

    NARCIS (Netherlands)

    Hagiya, M.; Goltz, U.; Mitchell, J.C.; Gorrieri, R.; Rensink, Arend

    1994-01-01

    The semantic definition of action refinement on labelled event structures is compared with the notion of syntactic substitution, which can be used as another notion of action refinement in a process algebraic setting. This is done by studying a process algebra equipped with the ACP sequential

  3. Anomalies in the refinement of isoleucine

    Energy Technology Data Exchange (ETDEWEB)

    Berntsen, Karen R. M.; Vriend, Gert, E-mail: gerrit.vriend@radboudumc.nl [Radboud University Medical Center, Geert Grooteplein 26-28, 6525 GA Nijmegen (Netherlands)

    2014-04-01

    The side-chain torsion angles of isoleucines in X-ray protein structures are a function of resolution, secondary structure and refinement software. Detailing the standard torsion angles used in refinement software can improve protein structure refinement. A study of isoleucines in protein structures solved using X-ray crystallography revealed a series of systematic trends for the two side-chain torsion angles χ₁ and χ₂ dependent on the resolution, secondary structure and refinement software used. The average torsion angles for the nine rotamers were similar in high-resolution structures solved using either the REFMAC, CNS or PHENIX software. However, at low resolution these programs often refine towards somewhat different χ₁ and χ₂ values. Small systematic differences can be observed between refinement software that uses molecular dynamics-type energy terms (for example CNS) and software that does not use these terms (for example REFMAC). Detailing the standard torsion angles used in refinement software can improve the refinement of protein structures. The target values in the molecular dynamics-type energy functions can also be improved.

  4. Refined large N duality for knots

    DEFF Research Database (Denmark)

    Kameyama, Masaya; Nawata, Satoshi

    We formulate large N duality of U(N) refined Chern-Simons theory with a torus knot/link in S³. By studying refined BPS states in M-theory, we provide the explicit form of low-energy effective actions of Type IIA string theory with D4-branes on the Ω-background. This form enables us to relate...

  5. Anomalies in the refinement of isoleucine

    International Nuclear Information System (INIS)

    Berntsen, Karen R. M.; Vriend, Gert

    2014-01-01

    The side-chain torsion angles of isoleucines in X-ray protein structures are a function of resolution, secondary structure and refinement software. Detailing the standard torsion angles used in refinement software can improve protein structure refinement. A study of isoleucines in protein structures solved using X-ray crystallography revealed a series of systematic trends for the two side-chain torsion angles χ₁ and χ₂ dependent on the resolution, secondary structure and refinement software used. The average torsion angles for the nine rotamers were similar in high-resolution structures solved using either the REFMAC, CNS or PHENIX software. However, at low resolution these programs often refine towards somewhat different χ₁ and χ₂ values. Small systematic differences can be observed between refinement software that uses molecular dynamics-type energy terms (for example CNS) and software that does not use these terms (for example REFMAC). Detailing the standard torsion angles used in refinement software can improve the refinement of protein structures. The target values in the molecular dynamics-type energy functions can also be improved.

  6. Feature selection in wind speed prediction systems based on a hybrid coral reefs optimization – Extreme learning machine approach

    International Nuclear Information System (INIS)

    Salcedo-Sanz, S.; Pastor-Sánchez, A.; Prieto, L.; Blanco-Aguilera, A.; García-Herrera, R.

    2014-01-01

    Highlights: • A novel approach for short-term wind speed prediction is presented. • The system is formed by a coral reefs optimization algorithm and an extreme learning machine. • Feature selection is carried out with the CRO to improve the ELM performance. • The method is tested in real wind farm data in USA, for the period 2007–2008. - Abstract: This paper presents a novel approach for short-term wind speed prediction based on a Coral Reefs Optimization algorithm (CRO) and an Extreme Learning Machine (ELM), using meteorological predictive variables from a physical model (the Weather Research and Forecast model, WRF). The approach is based on a Feature Selection Problem (FSP) carried out with the CRO, that must obtain a reduced number of predictive variables out of the total available from the WRF. This set of features will be the input of an ELM, that finally provides the wind speed prediction. The CRO is a novel bio-inspired approach, based on the simulation of reef formation and coral reproduction, able to obtain excellent results in optimization problems. On the other hand, the ELM is a new paradigm in neural networks’ training, that provides a robust and extremely fast training of the network. Together, these algorithms are able to successfully solve this problem of feature selection in short-term wind speed prediction. Experiments in a real wind farm in the USA show the excellent performance of the CRO–ELM approach in this FSP wind speed prediction problem
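
    The speed of the approach comes from the ELM: its hidden layer is random and fixed, so training reduces to a single least-squares solve, which makes it cheap to evaluate the many feature subsets the CRO proposes. A bare-bones sketch follows (the actual network size, activation and CRO operators in the paper may differ):

```python
import numpy as np

def elm_fit(X, y, n_hidden=50, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random, fixed weights
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # only these are trained
    return W, b, beta

def elm_predict(X, model):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

def mask_fitness(mask, X_tr, y_tr, X_va, y_va):
    """Objective a feature-selection search would minimize: validation
    RMSE of an ELM trained on the selected feature columns only."""
    cols = np.flatnonzero(mask)
    model = elm_fit(X_tr[:, cols], y_tr)
    err = elm_predict(X_va[:, cols], model) - y_va
    return float(np.sqrt(np.mean(err ** 2)))
```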

  7. Refined Phenotyping of Modic Changes

    Science.gov (United States)

    Määttä, Juhani H.; Karppinen, Jaro; Paananen, Markus; Bow, Cora; Luk, Keith D.K.; Cheung, Kenneth M.C.; Samartzis, Dino

    2016-01-01

    Abstract Low back pain (LBP) is the world's most disabling condition. Modic changes (MC) are vertebral bone marrow changes adjacent to the endplates as noted on magnetic resonance imaging. The associations of specific MC types and patterns with prolonged, severe LBP and disability remain speculative. This study assessed the relationship of prolonged, severe LBP and back-related disability with the presence and morphology of lumbar MC in a large cross-sectional population-based study of Southern Chinese. We addressed the topographical and morphological dimensions of MC along with other magnetic resonance imaging phenotypes (e.g., disc degeneration and displacement) on the basis of axial T1 and sagittal T2-weighted imaging of L1-S1. Prolonged severe LBP was defined as LBP lasting ≥30 days during the past year with a visual analog scale severest pain intensity of at least 6/10. An Oswestry Disability Index score of ≥15% was regarded as significant disability. We also assessed subject demographics, occupation, and lifestyle factors. In total, 1142 subjects (63% females, mean age 53 years) were assessed. Of these, 282 (24.7%) had MC (7.1% type I, 17.6% type II). MC subjects were older (P = 0.003) and had more frequent disc displacements; MC were also independently associated with prolonged severe LBP and back-related disability. The strength of the associations increased with the number of MC. This large-scale study is the first to definitively note MC types and specific morphologies to be independently associated with prolonged severe LBP and back-related disability. This proposed refined MC phenotype may have direct implications in clinical decision-making as to the development and management of LBP. Understanding of these imaging biomarkers can lead to new preventative and personalized therapeutics related to LBP. PMID:27258491

  8. North Dakota Refining Capacity Study

    Energy Technology Data Exchange (ETDEWEB)

    Dennis Hill; Kurt Swenson; Carl Tuura; Jim Simon; Robert Vermette; Gilberto Marcha; Steve Kelly; David Wells; Ed Palmer; Kuo Yu; Tram Nguyen; Juliam Migliavacca

    2011-01-05

    According to a 2008 report issued by the United States Geological Survey, North Dakota and Montana have an estimated 3.0 to 4.3 billion barrels of undiscovered, technically recoverable oil in an area known as the Bakken Formation. With the size and remoteness of the discovery, the question became 'can a business case be made for increasing refining capacity in North Dakota?' And, if so, what is the impact on existing players in the region? To answer the question, a study committee composed of leaders in the region's petroleum industry was brought together to define the scope of the study, hire a consulting firm and oversee the study. The study committee met frequently to provide input on the findings and modify the course of the study, as needed. The study concluded that the Petroleum Administration for Defense District II (PADD II) has an oversupply of gasoline. With that in mind, a niche market, naphtha, was identified. Naphtha is used as a diluent for pipelining bitumen (heavy crude) from Canada to crude markets. The study predicted that demand for naphtha will continue to increase through 2030. The study estimated the optimal configuration for the refinery at 34,000 barrels per day (BPD), producing 15,000 BPD of naphtha and a 52 percent refinery charge for jet and diesel yield. The financial modeling assumed the sponsor of a refinery would invest its own capital to pay for construction costs. With this assumption, the internal rate of return is 9.2 percent, which is not sufficient to attract traditional investment given the risk factor of the project. With that in mind, those interested in pursuing this niche market will need to identify incentives to improve the rate of return.

  9. Uranium refining by solvent extraction

    International Nuclear Information System (INIS)

    Kraikaew, J.; Srinuttrakul, W.

    2014-01-01

    The solvent extraction process to produce higher purity uranium from yellowcake was studied at laboratory scale. Yellowcake, in which the uranium purity is around 70% and the main impurity is thorium, was obtained from the monazite processing pilot plant of the Rare Earth Research and Development Center in Thailand. For the uranium re-extraction process, the chosen extractant was tributyl phosphate (TBP) in kerosene. It was found that the optimum concentration of TBP was 10% in kerosene and the optimum nitric acid concentration in the uranyl nitrate feed solution was 4 N. An increase in the concentrations of uranium and thorium in the feed solution resulted in a decrease in the distribution of both components into the extractant. However, the distribution of uranium into the extractant was found to be greater than that of thorium. The equilibrium of the extraction system UO₂(NO₃)₂/4 N HNO₃ – 10% TBP/kerosene was also investigated. Two extraction stages were calculated graphically for a feed solution containing 100,000 ppm uranium, with 90% extraction efficiency and an aqueous-to-organic flow ratio of 1.0. For the thorium impurity scrubbing process, 10% TBP in kerosene was loaded with uranium and minor thorium from a uranyl nitrate solution prepared from yellowcake and was scrubbed with nitric acid of various low concentrations. The results showed that when the nitric acid concentration was lower than 1 N, uranium distributed well into the aqueous phase. In conclusion, the optimum nitric acid concentration for the scrubbing process should not be less than 1 N, and dilute nitric acid or de-ionized water should be used to strip uranium from the organic phase in the final refining step. (author)
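
    The two-stage result can be reproduced approximately with a back-of-the-envelope calculation. The sketch below assumes a constant distribution coefficient and cross-current contact with equal phase volumes per stage, a simplification of the graphical construction used in the study; the distribution coefficient is assumed, not taken from the paper:

```python
# extraction factor per stage: E = D * (organic/aqueous flow ratio)
D = 2.5                       # assumed distribution coefficient (org/aq)
ratio = 1.0                   # flow ratio, as in the study
E = D * ratio
target_recovery = 0.90        # 90% extraction efficiency

frac_left, stages = 1.0, 0
while frac_left > 1.0 - target_recovery:
    frac_left /= (1.0 + E)    # uranium left in the aqueous phase per contact
    stages += 1
print(stages, frac_left)      # 2 stages, ~92% recovered with these numbers
```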

  10. Selective Mutism: A Three-Tiered Approach to Prevention and Intervention

    Science.gov (United States)

    Busse, R. T.; Downey, Jenna

    2011-01-01

    Selective mutism is a rare anxiety disorder that prevents a child from speaking at school or other community settings, and can be detrimental to a child's social development. School psychologists can play an important role in the prevention and treatment of selective mutism. As an advocate for students, school psychologists can work with teachers,…

  11. Pan endoscopic approach "hysterolaparoscopy" as an initial procedure in selected infertile women.

    Science.gov (United States)

    Vaid, Keya; Mehra, Sheila; Verma, Mita; Jain, Sandhya; Sharma, Abha; Bhaskaran, Sruti

    2014-02-01

    …normal uterine cavity. When these 112 women (58.03%) with a normal HSG report were further subjected to hysterolaparoscopy, only 35/193 (18.13%) of them actually had normal tubes and uterus; the remaining 77 women (39.89%) benefited from the one-step procedure of hysterolaparoscopic evaluation and intervention, with further treatment carried out. The hysterolaparoscopic (pan-endoscopic) approach is better than HSG and should be encouraged as a first and final procedure in selected infertile women.

  12. Current research progress in grain refinement of cast magnesium alloys: A review article

    International Nuclear Information System (INIS)

    Ali, Yahia; Qiu, Dong; Jiang, Bin; Pan, Fusheng; Zhang, Ming-Xing

    2015-01-01

    Grain refinement of cast magnesium alloys, particularly in magnesium–aluminium (Mg–Al) based alloys, has been an active research topic in the past two decades, because it is considered one of the most effective approaches to simultaneously increase the strength, ductility and formability. The development of new grain refiners has normally been based on theories/models established through comprehensive studies of grain refinement in cast Al alloys. Generally, grain refinement in cast Al can be achieved through inoculation treatment (the addition, or in situ formation, of foreign particles to promote the heterogeneous nucleation rate), through restriction of grain growth by controlling the constitutional supercooling, or both. However, the precise grain refinement mechanism in cast metals is still not fully understood, and a number of controversies remain. As a result, most of the newly developed grain refiners for Mg–Al based alloys are not as efficient as the commercially available ones, such as zirconium in Al-free Mg alloys. To facilitate research in grain refinement of cast magnesium alloys, this review starts by highlighting the theoretical aspects of grain refinement in cast metals, followed by a review of the latest research progress in grain refinement of magnesium alloys in terms of the solute effect and potent nucleants.

  13. Current research progress in grain refinement of cast magnesium alloys: A review article

    Energy Technology Data Exchange (ETDEWEB)

    Ali, Yahia; Qiu, Dong [School of Mechanical and Mining Engineering, University of Queensland, St Lucia, QLD 4072 (Australia); Jiang, Bin; Pan, Fusheng [College of Materials Science and Engineering, Chongqing University, Chongqing 400030 (China); Zhang, Ming-Xing, E-mail: Mingxing.Zhang@uq.edu.au [School of Mechanical and Mining Engineering, University of Queensland, St Lucia, QLD 4072 (Australia)

    2015-01-15

    Grain refinement of cast magnesium alloys, particularly in magnesium–aluminium (Mg–Al) based alloys, has been an active research topic in the past two decades, because it is considered one of the most effective approaches to simultaneously increase the strength, ductility and formability. The development of new grain refiners has normally been based on theories/models established through comprehensive studies of grain refinement in cast Al alloys. Generally, grain refinement in cast Al can be achieved through inoculation treatment (the addition, or in situ formation, of foreign particles to promote the heterogeneous nucleation rate), through restriction of grain growth by controlling the constitutional supercooling, or both. However, the precise grain refinement mechanism in cast metals is still not fully understood, and a number of controversies remain. As a result, most of the newly developed grain refiners for Mg–Al based alloys are not as efficient as the commercially available ones, such as zirconium in Al-free Mg alloys. To facilitate research in grain refinement of cast magnesium alloys, this review starts by highlighting the theoretical aspects of grain refinement in cast metals, followed by a review of the latest research progress in grain refinement of magnesium alloys in terms of the solute effect and potent nucleants.

  14. Refining search terms for nanotechnology

    International Nuclear Information System (INIS)

    Porter, Alan L.; Youtie, Jan; Shapira, Philip; Schoeneck, David J.

    2008-01-01

    The ability to delineate the boundaries of an emerging technology is central to obtaining an understanding of the technology's research paths and commercialization prospects. Nowhere is this more relevant than in the case of nanotechnology (hereafter identified as 'nano') given its current rapid growth and multidisciplinary nature. (Under the rubric of nanotechnology, we also include nanoscience and nanoengineering.) Past efforts have utilized several strategies, including simple term search for the prefix nano, complex lexical and citation-based approaches, and bootstrapping techniques. This research introduces a modularized Boolean approach to defining nanotechnology which has been applied to several research and patenting databases. We explain our approach to downloading and cleaning data, and report initial results. Comparisons of this approach with other nanotechnology search formulations are presented. Implications for search strategy development and profiling of the nanotechnology field are discussed
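
    The shape of a modularized Boolean query can be sketched as an inclusion module plus an exclusion module for known false positives of the "nano" prefix. The term lists below are illustrative only, not the authors' actual query modules:

```python
import re

INCLUDE = [r"\bnano\w+", r"\bquantum dot", r"\bself[- ]assembl\w+"]
EXCLUDE = [r"\bnanosecond\b", r"\bnanoplankton\b"]  # false-positive filters

def is_nano_record(text):
    t = text.lower()
    hits = [m.group() for p in INCLUDE for m in re.finditer(p, t)]
    noise = {m.group() for p in EXCLUDE for m in re.finditer(p, t)}
    return any(h not in noise for h in hits)

print(is_nano_record("Synthesis of nanowire arrays"))  # True
print(is_nano_record("a 5 nanosecond laser pulse"))    # False
```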

  15. Refining search terms for nanotechnology

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Alan L. [Georgia Institute of Technology (United States); Youtie, Jan [Georgia Institute of Technology, Enterprise Innovation Institute (United States)], E-mail: jan.youtie@innovate.gatech.edu; Shapira, Philip [Georgia Institute of Technology (United States); Schoeneck, David J. [Search Technology, Inc. (United States)

    2008-05-15

    The ability to delineate the boundaries of an emerging technology is central to obtaining an understanding of the technology's research paths and commercialization prospects. Nowhere is this more relevant than in the case of nanotechnology (hereafter identified as 'nano') given its current rapid growth and multidisciplinary nature. (Under the rubric of nanotechnology, we also include nanoscience and nanoengineering.) Past efforts have utilized several strategies, including simple term search for the prefix nano, complex lexical and citation-based approaches, and bootstrapping techniques. This research introduces a modularized Boolean approach to defining nanotechnology which has been applied to several research and patenting databases. We explain our approach to downloading and cleaning data, and report initial results. Comparisons of this approach with other nanotechnology search formulations are presented. Implications for search strategy development and profiling of the nanotechnology field are discussed.

  16. Procurement planning in oil refining industries considering blending operations

    DEFF Research Database (Denmark)

    Oddsdottir, Thordis Anna; Grunow, Martin; Akkerman, Renzo

    2013-01-01

    This paper addresses procurement planning in oil refining, which has until now only had limited attention in the literature. We introduce a mixed integer nonlinear programming (MINLP) model and develop a novel two-stage solution approach, which aims at computational efficiency while addressing...... parameters than in previous literature. The developed approach is tested using historical data from Statoil A/S as well as through a comprehensive numerical analysis. The approach generates a feasible procurement plan within acceptable computation time, is able to quickly adjust an existing plan to take...
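
    To fix ideas, a heavily simplified linear relaxation of the procurement decision is sketched below: choose crude purchase volumes to cover product demands at minimum cost, given fixed yields per crude. The paper's actual model is an MINLP because blending quality constraints and discrete cargo decisions are nonlinear; all numbers here are illustrative:

```python
import numpy as np
from scipy.optimize import linprog

cost = np.array([70.0, 74.0, 68.0])        # $/bbl of three candidate crudes
yields = np.array([[0.25, 0.30, 0.20],     # gasoline yield per crude
                   [0.35, 0.30, 0.30],     # diesel yield per crude
                   [0.30, 0.30, 0.40]])    # fuel-oil yield per crude
demand = np.array([20.0, 30.0, 25.0])      # product demands (kbbl)

res = linprog(cost,
              A_ub=-yields, b_ub=-demand,  # produce at least the demand
              bounds=[(0, 60)] * 3)        # availability limit per crude
print(res.x, res.fun)                      # purchase plan and total cost
```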

  17. Financial optimisation and risk management in refining activities

    International Nuclear Information System (INIS)

    Fiorenzani, S.

    2006-01-01

    The real options approach has become a benchmark in real assets evaluation and optimal management problems, especially in liberalised and competitive markets such as the oil and hydrocarbon markets. This paper describes how the same approach can be a useful tool for both risk management decisions and the financial optimisation problem. Refineries are black boxes, which can be used for the transformation of crude oil into more refined hydrocarbon products. These black boxes are characterised by operational flexibilities and constraints, which should be optimally managed in order to maximise the refiner's economic goals. Stochastic dynamic programming represents the right mathematical instrument employed to solve the decision-making problem in such an economic environment. (author)
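
    The stochastic dynamic programming idea can be made concrete with a toy run/idle problem: the refining margin follows a two-state Markov chain, and each week the refiner decides whether to run the unit. All numbers are illustrative, not from the paper:

```python
import numpy as np

P = np.array([[0.7, 0.3],            # margin-state transitions from 'low'
              [0.4, 0.6]])           # and from 'high'
run_margin = np.array([-2.0, 5.0])   # $/bbl if running, per state
idle_margin = -0.5                   # standby cost
beta = 0.999                         # weekly discount factor

V = np.zeros(2)                      # terminal value
for _ in range(52):                  # backward induction over one year
    cont = beta * (P @ V)
    run, idle = run_margin + cont, idle_margin + cont
    policy = run >= idle             # True where running is optimal
    V = np.maximum(run, idle)
print(policy, V)
```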

  18. Approaches of selecting options for upgrading of safety of near surface facilities

    International Nuclear Information System (INIS)

    Goldammer, W.

    2003-01-01

    General principles of optimization using cost-benefit analysis and multi-attribute utility analysis, taking radiological and ecological risks into account, are discussed. Alternative scenarios (unplanned events) are also considered. A comparison of options for an interim storage facility is made using an example from uranium mining. The conclusions from the example are: quantitative optimisation is necessary in order to arrive at a conclusion (higher financial expenditures yield lower risks); only the inclusion of failure scenarios reveals that the passive safety of the wet option is not satisfactory; probabilistic simulation allows for keeping track of uncertainties and assessing their consequences within the decision-making process; the optimisation analysis can be refined in the course of further reclamation planning to allow questions of detailed design to be addressed. A pragmatic (non-quantitative) assessment cannot reveal how safe is safe enough. Different options for waste storage in an interim storage facility are analysed. The methodology includes: estimation of cost components; discounting of long-term costs; estimation of risks; assessment of qualitative factors; definition of weighting factors; application of multi-attribute utility analysis; deterministic sensitivity analysis for important parameters. The examples show the importance of incorporating qualitative factors, weighting of parameters and sensitivity analysis in the decision-making.

  19. Selection Methodology Approach to Preferable and Alternative Sites for the First NPP Project in Yemen

    Energy Technology Data Exchange (ETDEWEB)

    Kassim, Moath [Kyunghe Univ., Yongin (Korea, Republic of); Kessel, David S. [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-05-15

    The purpose of this paper is to briefly present the methodology and results of the first siting study for the first nuclear power plant (NPP) in Yemen. In this study it has been demonstrated that there are suitable sites for a nuclear power plant with a unit power of 1000 MWt (about 300 MWe). To perform the site selection, a systematic selection method was developed. The method uses site-specific data gathered by literature review and expert judgement to identify the most important site selection criteria. A two-step site selection process was used. Candidate sites were chosen that meet a subset of the selection criteria that form the most important system constraints. These candidate sites were then evaluated against the full set of selection criteria using the Analytical Hierarchy Process (AHP) method. The candidate sites were then screened against a set of more specific siting criteria, weighted by expert judgment, to select preferable and alternative sites, again using the AHP method. Expert judgment was used to rank and weight the importance of each criterion; the AHP method was then used to evaluate and weight the relations between criteria and to weight all criteria against the global weight. Finally, logical decision software was used to rank the sites by their weighted scores.

  20. Selection Methodology Approach to Preferable and Alternative Sites for the First NPP Project in Yemen

    International Nuclear Information System (INIS)

    Kassim, Moath; Kessel, David S.

    2015-01-01

    The purpose of this paper is to briefly present the methodology and results of the first siting study for the first nuclear power plant (NPP) in Yemen. In this study it has been demonstrated that there are suitable sites for a nuclear power plant with a unit power of 1000 MWt (about 300 MWe). To perform the site selection, a systematic selection method was developed. The method uses site-specific data gathered by literature review and expert judgement to identify the most important site selection criteria. A two-step site selection process was used. Candidate sites were chosen that meet a subset of the selection criteria that form the most important system constraints. These candidate sites were then evaluated against the full set of selection criteria using the Analytical Hierarchy Process (AHP) method. The candidate sites were then screened against a set of more specific siting criteria, weighted by expert judgment, to select preferable and alternative sites, again using the AHP method. Expert judgment was used to rank and weight the importance of each criterion; the AHP method was then used to evaluate and weight the relations between criteria and to weight all criteria against the global weight. Finally, logical decision software was used to rank the sites by their weighted scores.

  1. A model-based approach for identifying signatures of ancient balancing selection in genetic data.

    Science.gov (United States)

    DeGiorgio, Michael; Lohmueller, Kirk E; Nielsen, Rasmus

    2014-08-01

    While much effort has focused on detecting positive and negative directional selection in the human genome, relatively little work has been devoted to balancing selection. This lack of attention is likely due to the paucity of sophisticated methods for identifying sites under balancing selection. Here we develop two composite likelihood ratio tests for detecting balancing selection. Using simulations, we show that these methods outperform competing methods under a variety of assumptions and demographic models. We apply the new methods to whole-genome human data, and find a number of previously-identified loci with strong evidence of balancing selection, including several HLA genes. Additionally, we find evidence for many novel candidates, the strongest of which is FANK1, an imprinted gene that suppresses apoptosis, is expressed during meiosis in males, and displays marginal signs of segregation distortion. We hypothesize that balancing selection acts on this locus to stabilize the segregation distortion and negative fitness effects of the distorter allele. Thus, our methods are able to reproduce many previously-hypothesized signals of balancing selection, as well as discover novel interesting candidates.

  2. A model-based approach for identifying signatures of ancient balancing selection in genetic data.

    Directory of Open Access Journals (Sweden)

    Michael DeGiorgio

    2014-08-01

    Full Text Available While much effort has focused on detecting positive and negative directional selection in the human genome, relatively little work has been devoted to balancing selection. This lack of attention is likely due to the paucity of sophisticated methods for identifying sites under balancing selection. Here we develop two composite likelihood ratio tests for detecting balancing selection. Using simulations, we show that these methods outperform competing methods under a variety of assumptions and demographic models. We apply the new methods to whole-genome human data, and find a number of previously-identified loci with strong evidence of balancing selection, including several HLA genes. Additionally, we find evidence for many novel candidates, the strongest of which is FANK1, an imprinted gene that suppresses apoptosis, is expressed during meiosis in males, and displays marginal signs of segregation distortion. We hypothesize that balancing selection acts on this locus to stabilize the segregation distortion and negative fitness effects of the distorter allele. Thus, our methods are able to reproduce many previously-hypothesized signals of balancing selection, as well as discover novel interesting candidates.

  3. Macromolecular refinement by model morphing using non-atomic parameterizations.

    Science.gov (United States)

    Cowtan, Kevin; Agirre, Jon

    2018-02-01

    Refinement is a critical step in the determination of a model which explains the crystallographic observations and thus best accounts for the missing phase components. The scattering density is usually described in terms of atomic parameters; however, in macromolecular crystallography the resolution of the data is generally insufficient to determine the values of these parameters for individual atoms. Stereochemical and geometric restraints are used to provide additional information, but produce interrelationships between parameters which slow convergence, resulting in longer refinement times. An alternative approach is proposed in which parameters are not attached to atoms, but to regions of the electron-density map. These parameters can move the density or change the local temperature factor to better explain the structure factors. Varying the size of the region which determines the parameters at a particular position in the map allows the method to be applied at different resolutions without the use of restraints. Potential applications include initial refinement of molecular-replacement models with domain motions, and potentially the use of electron density from other sources such as electron cryo-microscopy (cryo-EM) as the refinement model.

  4. A Refinement Calculus for Circus - Mini-thesis

    OpenAIRE

    Oliveira, Marcel V. M.

    2004-01-01

    Most software developments do not use any of the existing theories and formalisms. This leads to a loss of precision and correctness in the resulting software. Two different approaches to formal techniques have arisen in the past decades: one focuses on data aspects, and the other on the behavioural aspects of the system. Some combined languages have already been proposed to bring these two schools together. However, as far as we know, none of them has a related refinement calculus....

  5. An Integrated DEMATEL-VIKOR Method-Based Approach for Cotton Fibre Selection and Evaluation

    Science.gov (United States)

    Chakraborty, Shankar; Chatterjee, Prasenjit; Prasad, Kanika

    2018-01-01

    Selection of the most appropriate cotton fibre type for yarn manufacturing is often treated as a multi-criteria decision-making (MCDM) problem as the optimal selection decision needs to be taken in presence of several conflicting fibre properties. In this paper, two popular MCDM methods in the form of decision making trial and evaluation laboratory (DEMATEL) and VIse Kriterijumska Optimizacija kompromisno Resenje (VIKOR) are integrated to aid the cotton fibre selection decision. DEMATEL method addresses the interrelationships between various physical properties of cotton fibres while segregating them into cause and effect groups, whereas, VIKOR method helps in ranking all the considered 17 cotton fibres from the best to the worst. The derived ranking of cotton fibre alternatives closely matches with that obtained by the past researchers. This model can assist the spinning industry personnel in the blending process while making accurate fibre selection decision when cotton fibre properties are numerous and interrelated.
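
    The VIKOR half of the method is compact enough to sketch directly: alternatives are scored by group utility S, individual regret R, and the compromise index Q. The fibre property matrix below is hypothetical, with all criteria treated as benefit-type for simplicity:

```python
import numpy as np

def vikor(F, w, v=0.5):
    """F: alternatives x criteria (benefit criteria), w: criterion weights,
    v: weight of the 'majority rule' strategy. Lower Q is better."""
    f_best, f_worst = F.max(0), F.min(0)
    span = np.where(f_best > f_worst, f_best - f_worst, 1.0)
    norm = (f_best - F) / span
    S = (w * norm).sum(1)                      # group utility
    R = (w * norm).max(1)                      # individual regret
    Q = v * (S - S.min()) / (S.max() - S.min() + 1e-12) \
        + (1 - v) * (R - R.min()) / (R.max() - R.min() + 1e-12)
    return np.argsort(Q)                       # ranking, best first

F = np.array([[30.0, 29.0, 4.2],               # hypothetical fibre data
              [28.5, 31.0, 4.0],
              [31.0, 27.5, 4.5]])
w = np.array([0.5, 0.3, 0.2])                  # e.g. from the DEMATEL stage
print(vikor(F, w))
```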

  6. A multi-criteria decision making approach for the selection of a flexible packaging equipment

    Directory of Open Access Journals (Sweden)

    Cristea Ciprian

    2017-01-01

    Full Text Available Flexible packaging is one of the fastest growing segments of the packaging industry, combining paper, plastic film and aluminum foil to deliver a broad array of products for the food and beverage, personal care, and pharmaceutical industries. In order to preserve the quality and safety of the products contained in them, a variety of flexible packaging equipment options are currently available. However, their relative costs and performance differ. This study applies a ranking methodology to the selection of the most suitable flexible packaging equipment. The relevant criteria for equipment selection have been identified, and the considered options are assessed in light of the decision maker’s preferences and existing constraints. The options are ranked in terms of their suitability using the Electre III method. The results obtained from the simulation experiment highlight the effectiveness of the model in outranking different options in the process of equipment selection.

  7. Identifying footprints of selection in stocked brown trout populations: a spatio-temporal approach

    DEFF Research Database (Denmark)

    Hansen, Michael Møller; Meier, Kristian; Mensberg, Karen-Lise Dons

    2010-01-01

    Studies of interactions between farmed and wild salmonid fishes have suggested reduced fitness of farmed strains in the wild, but evidence for selection at the genic level is lacking. We studied three brown trout populations in Denmark which have been significantly admixed with stocked hatchery...... trout (19–64%), along with two hatchery strains used for stocking. The wild populations were represented by contemporary samples (2000–2006) and two of them by historical samples (1943–1956). We analysed 61 microsatellite loci, nine of which showed putative functional relationships [expressed sequence...... trout. In the most strongly admixed population, however, there was no evidence for selection, possibly because of immigration by stocked trout overcoming selection against hatchery-derived alleles or supportive breeding practices allowing hatchery strain trout to escape natural selection. To our...

  8. A Double Selection Approach to Achieve Specific Expression of Toxin Genes for Ovarian Cancer Gene Therapy

    National Research Council Canada - National Science Library

    Curiel, David T; Siegal, Gene; Wang, Minghui

    2007-01-01

    ...) to achieve efficient and selective gene transfer to target tumor cells. Proposed herein is a strategy to modify one candidate vector, recombinant adenovirus, such that it embodies the requisite properties of efficacy and specificity...

  9. A multi-criteria decision making approach for supplier selection in the flexible packaging industry

    Directory of Open Access Journals (Sweden)

    Cristea Ciprian

    2017-01-01

    Full Text Available The supplier selection problem represents one of the most important components of supply chain management. This article presents a multiple criteria decision-making analysis contributing to the selection of the most convenient supplier in the flexible packaging industry. Because in today's supply chain management the performance of potential suppliers is evaluated against multiple criteria rather than the cost factor alone, the appropriate criteria for supplier selection have been identified, and the considered variants are assessed in light of the decision maker’s preferences and existing constraints. The variants are ranked in terms of their suitability with the use of the Electre III method. The results obtained from the simulation experiment suggest that this methodology is a feasible decision support model.

  10. A Combinatory Approach for Selecting Prognostic Genes in Microarray Studies of Tumour Survivals

    Directory of Open Access Journals (Sweden)

    Qihua Tan

    2009-01-01

    Full Text Available Different from significant gene expression analysis, which looks for genes that are differentially regulated, feature selection in microarray-based prognostic gene expression analysis aims at finding a subset of marker genes that are not only differentially expressed but also informative for prediction. Unfortunately, feature selection in the microarray literature is predominated by the simple heuristic univariate gene-filter paradigm that selects differentially expressed genes according to their statistical significance. We introduce a combinatory feature selection strategy that integrates differential gene expression analysis with the Gram-Schmidt process to identify prognostic genes that are both statistically significant and highly informative for predicting tumour survival outcomes. Empirical application to leukemia and ovarian cancer survival data through within- and cross-study validations shows that the feature space can be largely reduced while achieving improved testing performances.
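
    The Gram-Schmidt step admits a short sketch: at each round the feature most correlated with the outcome is selected, and the remaining features (and the residual outcome) are orthogonalized against it, so the next pick adds genuinely new information. This is a generic orthogonal forward-selection sketch, not the authors' exact procedure:

```python
import numpy as np

def gram_schmidt_select(X, y, k):
    """Greedily select k informative feature columns of X for target y."""
    Xr = X.astype(float).copy()
    yr = y.astype(float).copy()
    chosen = []
    for _ in range(k):
        scores = np.abs(Xr.T @ yr) / (np.linalg.norm(Xr, axis=0) + 1e-12)
        scores[chosen] = -np.inf               # never re-select a feature
        j = int(np.argmax(scores))
        chosen.append(j)
        q = Xr[:, j] / (np.linalg.norm(Xr[:, j]) + 1e-12)
        Xr -= np.outer(q, q @ Xr)              # orthogonalize all features
        yr -= q * (q @ yr)                     # and the residual target
    return chosen
```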

  11. The M-OLAP Cube Selection Problem: A Hyper-polymorphic Algorithm Approach

    Science.gov (United States)

    Loureiro, Jorge; Belo, Orlando

    OLAP systems depend heavily on the materialization of multidimensional structures to speed up queries, whose appropriate selection constitutes the cube selection problem. However, the recently proposed distribution of OLAP structures answers new globalization requirements while capturing the known advantages of distributed databases. But this hardens the search for solutions, especially due to the inherent heterogeneity, which imposes an extra requirement on the algorithm to be used: adaptability. Here the emerging concept known as hyper-heuristic can be a solution. In fact, an algorithm in which several (meta-)heuristics may be selected under the control of a higher-level heuristic has intrinsically adaptive behavior. This paper presents a hyper-heuristic polymorphic algorithm used to solve the extended cube selection and allocation problem generated in M-OLAP architectures.

  12. The Application of a Decision-making Approach based on Fuzzy ANP and TOPSIS for Selecting a Strategic Supplier

    Directory of Open Access Journals (Sweden)

    Rajesri Govindaraju

    2015-09-01

    Full Text Available Supplier selection becomes very important when used in the context of strategic partnerships because of the long-term orientation of the relationship. This paper describes the application of a decision-making approach for selecting a strategic partner (supplier. The approach starts with defining a set of criteria that fits the company’s condition. In the next steps, a combination of fuzzy-ANP and TOPSIS methods is used to determine the weight for each criterion and rank all the alternatives. The application of the approach in an Indonesian manufacturing company showed that the three factors that got the highest weight were “geographical location”, “current operating performance”, and “reliability”. Geographical location got the highest weight because it affects many other factors such as reaction to changes in demand, after-sales service, and delivery lead-time. Application of the approach helps decision-makers to gain effectiveness and efficiency in the decision-making process because it facilitates them to express their group’s collective preferences while also providing opportunities for members to express their individual preferences. Future research can be directed at combining qualitative and quantitative criteria to develop the best criteria and methods for the selection of the best suppliers based on fuzzy ANP and TOPSIS.
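
    For reference, the crisp TOPSIS ranking step is sketched below; in the paper the criterion weights come from the fuzzy-ANP stage and fuzzy rather than crisp scores are used. The data shapes are illustrative:

```python
import numpy as np

def topsis(F, w, benefit):
    """F: alternatives x criteria, w: weights summing to 1,
    benefit: boolean array, True where larger values are better."""
    Z = F / np.linalg.norm(F, axis=0)          # vector-normalize columns
    V = Z * w
    ideal = np.where(benefit, V.max(0), V.min(0))
    anti = np.where(benefit, V.min(0), V.max(0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)             # closeness: higher is better

F = np.array([[0.8, 60.0, 7.0],                # hypothetical supplier scores
              [0.6, 45.0, 9.0],
              [0.9, 70.0, 5.0]])
w = np.array([0.5, 0.2, 0.3])                  # e.g. from the fuzzy-ANP stage
print(topsis(F, w, np.array([True, False, True])))
```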

  13. A Selection Approach for Optimized Problem-Solving Process by Grey Relational Utility Model and Multicriteria Decision Analysis

    Directory of Open Access Journals (Sweden)

    Chih-Kun Ke

    2012-01-01

    Full Text Available In business enterprises, especially the manufacturing industry, various problem situations may occur during the production process. A situation denotes an evaluation point to determine the status of a production process. A problem may occur if there is a discrepancy between the actual situation and the desired one. Thus, a problem-solving process is often initiated to achieve the desired situation. In this process, determining which action should be taken to resolve the situation becomes an important issue. Therefore, this work uses a selection approach for an optimized problem-solving process to assist workers in taking a reasonable action. A grey relational utility model and a multicriteria decision analysis are used to determine the optimal selection order of candidate actions. The selection order is presented to the worker as an adaptive recommended solution. The worker chooses a reasonable problem-solving action based on the selection order. This work uses a high-tech company’s knowledge base log as the analysis data. Experimental results demonstrate that the proposed selection approach is effective.
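
    The grey relational part of the scoring can be sketched as follows: each candidate action's attribute vector is compared with a reference (ideal) vector, and the mean grey relational coefficient gives the grade used for ranking. The distinguishing coefficient rho = 0.5 is the conventional default; the data are hypothetical:

```python
import numpy as np

def grey_relational_grade(ref, cands, rho=0.5):
    """ref: ideal attribute vector; cands: candidate actions x attributes
    (both assumed pre-normalized to [0, 1])."""
    delta = np.abs(cands - ref)
    dmin, dmax = delta.min(), delta.max()
    coef = (dmin + rho * dmax) / (delta + rho * dmax)
    return coef.mean(axis=1)        # equal-weight grade per candidate

ref = np.array([1.0, 1.0, 1.0])
cands = np.array([[0.9, 0.7, 0.8],
                  [0.6, 0.9, 0.5]])
print(grey_relational_grade(ref, cands))  # higher grade = better action
```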

  14. Application Of Decision Tree Approach To Student Selection Model- A Case Study

    Science.gov (United States)

    Harwati; Sudiya, Amby

    2016-01-01

    The main purpose of the institution is to provide quality education to the students and to improve the quality of managerial decisions. One of the ways to improve the quality of students is to make the selection of new students more selective. This research takes as its case the selection of new students at the Islamic University of Indonesia, Yogyakarta, Indonesia. One of the university's admission routes is administrative filtering based on prospective students' high school records, without a written test. Currently, that kind of selection does not yet have a standard model or criteria. Selection is done only by comparing candidates' application files, so subjective assessment is likely to occur because of the lack of standard criteria that can differentiate the quality of students from one another. By applying data-mining classification techniques, a selection model for new students can be built that includes criteria with defined standards, such as the area of origin, the status of the school, the average grade, and so on. These criteria are determined using rules that emerge from the classification of the academic achievement (GPA) of students in previous years who entered the university through the same route. The decision tree method with the C4.5 algorithm is used here. The results show that priority for admission is given to students who meet the following criteria: coming from the island of Java, from a public school, majoring in science, with an average grade above 75, and with at least one achievement during their study in high school.
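
    A minimal version of such a model is easy to set up; the sketch below uses scikit-learn's entropy criterion, which is close in spirit to C4.5's information gain (strictly, C4.5 uses gain ratio, while scikit-learn implements CART). The applicant records are hypothetical, mirroring the criteria named in the study:

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import OrdinalEncoder
from sklearn.tree import DecisionTreeClassifier, export_text

df = pd.DataFrame({
    "region":    ["Java", "Java", "Sumatra", "Java", "Sumatra"],
    "school":    ["public", "private", "public", "public", "private"],
    "major":     ["science", "social", "science", "science", "social"],
    "avg_score": [82, 71, 78, 90, 69],
    "good_gpa":  [1, 0, 0, 1, 0],      # label from past cohorts' GPA
})
cat = OrdinalEncoder().fit_transform(df[["region", "school", "major"]])
X = np.column_stack([cat, df["avg_score"]])

clf = DecisionTreeClassifier(criterion="entropy").fit(X, df["good_gpa"])
print(export_text(clf, feature_names=["region", "school", "major", "avg_score"]))
```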

  15. Assessing the Job Selection Criteria of Accounting Students: a Normative Approach

    OpenAIRE

    zubairu, umaru; Ismail, Suhaiza; Abdul Hamid, Fatima

    2017-01-01

    This research assessed to what extent final-year Muslim accounting students in Malaysia considered Islamic principles when choosing a job after graduation. 356 final-year Muslim accounting students in four Malaysian universities were surveyed using an open-ended job selection scenario. The result shows that reality does not live up to the ideal. Only 16% of the respondents apply Islamic principles in making a job selection decision. The remaining 84% are more concerned with other criteria suc...

  16. Evolutionary Feature Selection for Big Data Classification: A MapReduce Approach

    Directory of Open Access Journals (Sweden)

    Daniel Peralta

    2015-01-01

    Full Text Available Nowadays, many disciplines have to deal with big datasets that additionally involve a high number of features. Feature selection methods aim at eliminating noisy, redundant, or irrelevant features that may deteriorate the classification performance. However, traditional methods lack enough scalability to cope with datasets of millions of instances and extract successful results in a limited time. This paper presents a feature selection algorithm based on evolutionary computation that uses the MapReduce paradigm to obtain subsets of features from big datasets. The algorithm decomposes the original dataset in blocks of instances to learn from them in the map phase; then, the reduce phase merges the obtained partial results into a final vector of feature weights, which allows a flexible application of the feature selection procedure using a threshold to determine the selected subset of features. The feature selection method is evaluated by using three well-known classifiers (SVM, Logistic Regression, and Naive Bayes) implemented within the Spark framework to address big data problems. In the experiments, datasets of up to 67 million instances and up to 2000 attributes have been managed, showing that this is a suitable framework to perform evolutionary feature selection, improving both the classification accuracy and its runtime when dealing with big data problems.
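
    The shape of the scheme is simple to sketch: each map task learns feature weights on its own block of instances, and the reduce step merges the partial vectors before thresholding. The per-block evolutionary search is replaced here by a plain correlation score to keep the sketch short:

```python
import numpy as np
from functools import reduce

def map_block(Xb, yb):
    # stand-in for the per-block GA: score each feature by |correlation|
    w = np.abs(np.corrcoef(Xb.T, yb)[:-1, -1])
    return np.nan_to_num(w)

def select_features(X, y, n_blocks=4, threshold=0.1):
    blocks = np.array_split(np.arange(len(y)), n_blocks)
    partial = [map_block(X[b], y[b]) for b in blocks]   # map phase
    weights = reduce(np.add, partial) / n_blocks        # reduce phase
    return np.flatnonzero(weights >= threshold)         # selected subset
```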

  17. Approaches to Addressing Service Selection Ties in Ad Hoc Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Ayotuyi Tosin Akinola

    2018-01-01

    Full Text Available The ad hoc mobile cloud (AMC) allows mobile devices to connect together through a wireless connection or any other means and send requests for web services from one to another within the mobile cloud. However, one of the major challenges in the AMC is user dissatisfaction. This arises because there are many services with similar functionalities but varying nonfunctional properties. Moreover, another cause of user dissatisfaction, coupled with runtime redundancy, is the attainment of identical quality computations during service selection, often referred to as “service selection ties.” In an attempt to address this challenge, service selection mechanisms for the AMC were developed in this work. These include the use of selected quality-of-service properties coupled with user feedback data to determine the most suitable service. The mechanisms were evaluated using the experimental method. The evaluation mainly focused on metrics that quantitatively assess the satisfaction of users' interests. The experiments affirmed that the use of the shortest distance can help to break selection ties between potential servicing nodes. Also, continuous use of an updated and unrestricted range of user assessments enhances optimal service selection.
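
    The tie-breaking rule the experiments support is a one-liner in spirit: rank candidate nodes by QoS score and, among equally scored nodes, prefer the nearest one. A minimal sketch with hypothetical candidate tuples:

```python
def select_service(candidates):
    """candidates: list of (node, qos_score, distance) tuples.
    Highest QoS wins; ties are broken by shortest distance."""
    best = max(score for _, score, _ in candidates)
    tied = [c for c in candidates if c[1] == best]
    return min(tied, key=lambda c: c[2])

print(select_service([("n1", 0.9, 12.0), ("n2", 0.9, 4.0), ("n3", 0.7, 1.0)]))
# ('n2', 0.9, 4.0): same score as n1, but closer
```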

  18. A two-step approach to estimating selectivity and fishing power of research gill nets used in Greenland waters

    DEFF Research Database (Denmark)

    Hovgård, Holger

    1996-01-01

    Catches of Atlantic cod (Gadus morhua) from Greenland gill-net surveys were analyzed by a two-step approach. In the initial step the form of the selection curve was identified as binormal, which was caused by fish being gilled or caught by the maxillae. Both capture processes could be described by normal distributions and could be related to mesh size in accordance with the principle of geometrical similarity. In the second step the selection parameters were estimated by a nonlinear least squares fit. The model also estimated the relative efficiency of the two capture processes and the fishing…
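
    The second estimation step can be sketched with a standard nonlinear least-squares fit: a two-mode selection curve whose modal lengths and spreads scale with mesh size (geometric similarity), with one parameter for the relative efficiency of the second capture process. The parameterization and starting values are illustrative, not the paper's:

```python
import numpy as np
from scipy.optimize import curve_fit

def binormal(xdata, k1, s1, k2, s2, c):
    L, m = xdata                       # fish length and mesh size
    g1 = np.exp(-((L - k1 * m) ** 2) / (2 * (s1 * m) ** 2))  # gilled
    g2 = np.exp(-((L - k2 * m) ** 2) / (2 * (s2 * m) ** 2))  # maxillae
    return g1 + c * g2                 # c: relative efficiency of mode 2

# with survey arrays `lengths`, `meshes` and relative catches `rel_catch`:
# popt, _ = curve_fit(binormal, (lengths, meshes), rel_catch,
#                     p0=[0.9, 0.08, 1.1, 0.12, 0.5])
```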

  19. Optimizing refiner operation with statistical modelling

    Energy Technology Data Exchange (ETDEWEB)

    Broderick, G [Noranda Research Centre, Pointe Claire, PQ (Canada)]

    1997-02-01

    The impact of refining conditions on the energy efficiency of the process and on the handsheet quality of a chemi-mechanical pulp was studied as part of a series of pilot scale refining trials. Statistical models of refiner performance were constructed from these results and non-linear optimization of process conditions was conducted. Optimization results indicated that increasing the ratio of specific energy applied in the first stage led to a reduction of some 15 per cent in the total energy requirement. The strategy can also be used to obtain significant increases in pulp quality for a given energy input. 20 refs., 6 tabs.
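    The fit-then-optimize strategy can be sketched as follows; the numbers are invented placeholders and the quadratic model is an assumption, chosen only to show how a fitted statistical model can be minimized over an operating variable.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Placeholder pilot-trial data: fraction of specific energy applied in
# the first refining stage vs. total specific energy demand (kWh/t).
ratio  = np.array([0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8])
energy = np.array([2450, 2300, 2180, 2120, 2100, 2150, 2260])

# Fit a quadratic statistical model, then minimize it over the ratio.
model = np.poly1d(np.polyfit(ratio, energy, 2))
res = minimize_scalar(model, bounds=(0.2, 0.8), method="bounded")
print(f"optimal first-stage ratio ~ {res.x:.2f}, "
      f"predicted energy ~ {model(res.x):.0f} kWh/t")
```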

  20. Refinement Types for TypeScript

    OpenAIRE

    Vekris, Panagiotis; Cosman, Benjamin; Jhala, Ranjit

    2016-01-01

    We present Refined TypeScript (RSC), a lightweight refinement type system for TypeScript, that enables static verification of higher-order, imperative programs. We develop a formal core of RSC that delineates the interaction between refinement types and mutability. Next, we extend the core to account for the imperative and dynamic features of TypeScript. Finally, we evaluate RSC on a set of real world benchmarks, including parts of the Octane benchmarks, D3, Transducers, and the TypeScript co...

  1. Price implications for Russia's oil refining

    International Nuclear Information System (INIS)

    Khartukov, Eugene M.

    1998-01-01

    Over the past several years, Russia's oil industry has undergone a radical transformation from a wholly state-run and generously subsidized oil distribution system toward a substantially privatized, cash-strapped, and quasi-market ''petropreneurship''. This fully applies to the industry's downstream sector. Still, unlike more dynamic E and C operations, the country's refining has turned out to be better fenced off from competitive market forces and is less capable of responding to market imperatives. Consequently, jammed between depressed product prices and persistent feedstock costs, Russian refiners were badly hit by the world oil glut - which has made a radical modernization of the obsolete refining sector clearly a must. (author)

  2. An ERP Selection Framework in Constructor Companies using Fuzzy AHP Approach

    Directory of Open Access Journals (Sweden)

    mohammad ali Shahhosseini

    2013-07-01

    Full Text Available Success in ERP implementation depends heavily on selecting an appropriate system, one aligned with the enterprise's culture, infrastructure and requirements, which is why ERP selection, its process and its influential criteria have received increasing attention in recent years. Constructor companies are strongly affected by ERP systems: a successful implementation will improve their productivity and promote their performance considerably. However, it is a challenge for decision-makers to identify the real needs, define the criteria, select an acceptable vendor and purchase the most appropriate system. This study presents a Fuzzy AHP-based framework for selecting ERP systems in constructor companies. The influential criteria were collected by reviewing previous studies, and a questionnaire was used to assess and define the priority of the criteria and sub-criteria. Afterward, another questionnaire was used to compare the alternatives with respect to each criterion. Eventually, the Fuzzy Analytic Hierarchy Process was used to select the system most aligned with the organization's requirements and strategies.
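    A compact way to turn such fuzzy pairwise judgments into crisp criteria weights is Buckley's geometric-mean variant of fuzzy AHP, sketched below. The triangular judgments and the three criteria are invented for illustration; the paper's actual criteria set and aggregation may differ.

```python
import numpy as np

# Triangular fuzzy pairwise comparisons (l, m, u) for three illustrative
# ERP criteria, e.g. strategic alignment, cost, vendor support.
M = [
    [(1, 1, 1),         (2, 3, 4),       (4, 5, 6)],
    [(1/4, 1/3, 1/2),   (1, 1, 1),       (1, 2, 3)],
    [(1/6, 1/5, 1/4),   (1/3, 1/2, 1),   (1, 1, 1)],
]

def fuzzy_ahp_weights(M):
    """Fuzzy geometric mean of each row, defuzzified by the centroid
    and normalized to crisp weights (Buckley's method)."""
    n = len(M)
    geo = [tuple(np.prod([c[i] for c in row]) ** (1 / n) for i in range(3))
           for row in M]
    crisp = np.array([(l + m + u) / 3 for l, m, u in geo])
    return crisp / crisp.sum()

print(fuzzy_ahp_weights(M))  # one weight per criterion, summing to 1
```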

  3. Selection of remedial alternatives for mine sites: a multicriteria decision analysis approach.

    Science.gov (United States)

    Betrie, Getnet D; Sadiq, Rehan; Morin, Kevin A; Tesfamariam, Solomon

    2013-04-15

    The selection of remedial alternatives for mine sites is a complex task because it involves multiple criteria, often with conflicting objectives. However, the existing framework used to select remedial alternatives lacks multicriteria decision analysis (MCDA) aids and does not consider uncertainty in the selection of alternatives. The objective of this paper is to improve the existing framework by introducing deterministic and probabilistic MCDA methods. The Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE) methods have been implemented in this study. The MCDA analysis involves preparing the inputs to the PROMETHEE methods, namely identifying the alternatives, defining the criteria, defining the criteria weights using the analytical hierarchy process (AHP), defining the probability distribution of criteria weights, and conducting Monte Carlo simulation (MCS); running the PROMETHEE methods on these inputs; and conducting a sensitivity analysis. A case study was presented to demonstrate the improved framework at a mine site. The results showed that the improved framework provides a reliable way of selecting remedial alternatives as well as quantifying the impact of different criteria on selecting alternatives. Copyright © 2013 Elsevier Ltd. All rights reserved.
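    The deterministic core of the method, PROMETHEE II with the simplest ("usual") preference function, fits in a short sketch. The alternatives, criteria, weights and directions below are invented placeholders; the paper additionally draws the weights from probability distributions in a Monte Carlo loop.

```python
import numpy as np

def promethee_ii(scores, weights, maximize):
    """PROMETHEE II with the 'usual' preference function: a is strictly
    preferred to b on a criterion when its direction-adjusted score is
    higher. Returns the net outranking flow of each alternative."""
    X = np.where(maximize, scores, -scores)   # harmonize criterion directions
    n = len(X)
    phi = np.zeros(n)
    for a in range(n):
        for b in range(n):
            if a != b:
                pref_ab = weights @ (X[a] > X[b])
                pref_ba = weights @ (X[b] > X[a])
                phi[a] += (pref_ab - pref_ba) / (n - 1)
    return phi

# Illustrative remedial alternatives x criteria (cost, effectiveness, risk).
scores = np.array([[3.0, 0.8, 0.2],
                   [2.0, 0.6, 0.3],
                   [4.0, 0.9, 0.1]])
weights = np.array([0.3, 0.5, 0.2])           # e.g. obtained from AHP
maximize = np.array([False, True, False])     # cost and risk are minimized
print(promethee_ii(scores, weights, maximize))  # higher net flow = better
```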

  4. A multi-objective optimization approach for the selection of working fluids of geothermal facilities: Economic, environmental and social aspects.

    Science.gov (United States)

    Martínez-Gomez, Juan; Peña-Lamas, Javier; Martín, Mariano; Ponce-Ortega, José María

    2017-12-01

    The selection of the working fluid for Organic Rankine Cycles has traditionally been addressed with systematic heuristic methods, which characterize and pre-select fluids against mainly one objective, thus avoiding a selection that simultaneously considers the objectives related to sustainability and safety. The objective of this work is to propose a methodology for the optimal selection of the working fluid for Organic Rankine Cycles. The model is presented as a multi-objective approach, which simultaneously considers the economic, environmental and safety aspects. The economic objective function considers the profit obtained by selling the energy produced. Safety was evaluated in terms of individual risk for each of the components of the Organic Rankine Cycle, formulated as a function of the operating conditions and hazardous properties of each working fluid. The environmental function is based on carbon dioxide emissions, considering carbon dioxide mitigation, emissions due to the use of cooling water, as well as emissions due to material release. The methodology was applied to the case of geothermal facilities to select the optimal working fluid, although it can be extended to waste heat recovery. The results show that the hydrocarbons represent better solutions; thus, among a list of 24 working fluids, toluene is selected as the best fluid. Copyright © 2017 Elsevier Ltd. All rights reserved.
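    A basic building block of any such multi-objective screening is the non-dominated (Pareto) filter, sketched below. The three fluids and their objective values are invented; in the paper the economic, environmental and safety objectives come from the optimization model itself.

```python
import numpy as np

def pareto_front(objectives):
    """Indices of non-dominated alternatives, with all objectives
    expressed so that smaller is better."""
    n = len(objectives)
    return [i for i in range(n)
            if not any(np.all(objectives[j] <= objectives[i]) and
                       np.any(objectives[j] < objectives[i])
                       for j in range(n) if j != i)]

# Illustrative (negative profit, CO2 emissions, individual risk) per fluid.
fluids = ["toluene", "pentane", "R245fa"]
objs = np.array([[-5.0, 2.0, 0.3],
                 [-4.0, 2.5, 0.6],
                 [-4.5, 3.0, 0.2]])
print([fluids[i] for i in pareto_front(objs)])  # the non-dominated fluids
```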

  5. Reliable Portfolio Selection Problem in Fuzzy Environment: An mλ Measure Based Approach

    Directory of Open Access Journals (Sweden)

    Yuan Feng

    2017-04-01

    Full Text Available This paper investigates a fuzzy portfolio selection problem with guaranteed reliability, in which fuzzy variables are used to capture the uncertain returns of different securities. To handle the fuzziness in a mathematically effective way, a new expected value operator and variance of fuzzy variables are defined based on the mλ measure, a linear combination of the possibility measure and necessity measure that balances pessimism and optimism in the decision-making process. To formulate the reliable portfolio selection problem, the expected total return and the standard variance of the total return are adopted to evaluate the reliability of the investment strategies, producing three risk-guaranteed reliable portfolio selection models. To solve the proposed models, an effective genetic algorithm is designed to generate an approximate optimal solution to the considered problem. Finally, numerical examples are given to show the performance of the proposed models and algorithm.
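    For a triangular fuzzy return the mλ measure and the induced expected value can be computed directly, as the sketch below shows. The triangular shape, the numeric return and the integration grid are assumptions; with λ = 0.5 the expected value reduces to the familiar credibilistic value (a + 2b + c)/4.

```python
import numpy as np

def m_lambda_geq(r, tri, lam):
    """m_lambda{xi >= r} = lam*Pos{xi >= r} + (1-lam)*Nec{xi >= r}
    for a triangular fuzzy variable tri = (a, b, c)."""
    a, b, c = tri
    pos = 1.0 if r <= b else (0.0 if r >= c else (c - r) / (c - b))
    nec = 1.0 if r <= a else (0.0 if r >= b else (b - r) / (b - a))
    return lam * pos + (1 - lam) * nec

def expected_value(tri, lam, grid=10_000):
    """Choquet-style expected value of a nonnegative fuzzy return:
    integrate m_lambda{xi >= r} over r with the trapezoid rule."""
    rs = np.linspace(0.0, tri[2], grid)
    vals = np.array([m_lambda_geq(r, tri, lam) for r in rs])
    dr = rs[1] - rs[0]
    return float(dr * (vals.sum() - 0.5 * (vals[0] + vals[-1])))

ret = (0.02, 0.05, 0.10)        # illustrative fuzzy return of one security
for lam in (0.0, 0.5, 1.0):     # pure necessity, balanced, pure possibility
    print(lam, round(expected_value(ret, lam), 4))
```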

  6. Statistical approach for selection of regression model during validation of bioanalytical method

    Directory of Open Access Journals (Sweden)

    Natalija Nakov

    2014-06-01

    Full Text Available The selection of an adequate regression model is the basis for obtaining accurate and reproducible results during bioanalytical method validation. Given the wide concentration ranges frequently present in bioanalytical assays, heteroscedasticity of the data may be expected. Several weighted linear and quadratic regression models were evaluated during the selection of the adequate curve fit using nonparametric statistical tests: the one-sample rank test and the Wilcoxon signed rank test for two independent groups of samples. The results obtained with the one-sample rank test could not give statistical justification for the selection of linear vs. quadratic regression models because only slight differences in the error (presented through the relative residuals) were obtained. Estimation of the significance of the differences in the relative residuals was achieved using the Wilcoxon signed rank test, where the linear and quadratic regression models were treated as two independent groups. The application of this simple non-parametric statistical test provides statistical confirmation of the choice of an adequate regression model.
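    One way to realize this comparison in code is sketched below: fit weighted linear and quadratic calibration models, compute relative residuals for each, and test the difference with scipy's Wilcoxon signed rank test. The synthetic calibration data, the 1/x² weighting and the residual definition are assumptions for illustration.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(7)
conc = np.tile([1, 5, 10, 50, 100, 500, 1000.0], 3)   # calibration levels
resp = 0.8 * conc + 1e-4 * conc**2 + rng.normal(0, 0.05 * conc)  # heteroscedastic

w = 1 / conc              # polyfit weights apply to unsquared residuals,
                          # so 1/x realizes the usual 1/x^2 weighting
lin = np.polyfit(conc, resp, 1, w=w)     # weighted linear model
quad = np.polyfit(conc, resp, 2, w=w)    # weighted quadratic model

rr_lin = np.abs(resp - np.polyval(lin, conc)) / conc   # relative residuals
rr_quad = np.abs(resp - np.polyval(quad, conc)) / conc
stat, p = wilcoxon(rr_lin, rr_quad)
print(f"Wilcoxon p = {p:.4f}")  # a small p favors one model over the other
```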

  7. Split and Splice Approach for Highly Selective Targeting of Human NSCLC Tumors

    Science.gov (United States)

    2014-10-01

    development and implementation of the "split-and-splice" approach required optimization of many independent parameters, which were addressed in parallel... verify the feasibility of the "split and splice" approach for targeting human NSCLC tumor cell lines in culture and prepare the optimized toxins for... for cultured cells (months 2-8). 2B. To test the efficiency of cell targeting by the toxin variants reconstituted in vitro (months 3-6). 2C. To

  8. Tax Havens: Toward An Optimal Selection Approach Based On Multicriteria Analysis

    OpenAIRE

    Tov Assogbavi; Sébastien Azondékon

    2011-01-01

    The purpose of this paper is to demystify the concept of tax havens. After defining tax havens in a tax-planning framework, the paper introduces a tax haven selection methodology based on a variant of Gibson and Black multicriteria analysis to identify the most suitable tax haven for a given entity. The study shows the importance of subjective variables and how to incorporate them into a tax haven selection process. While tax advantages remain the key factor when searching for a tax haven sol...

  9. Assessing the Job Selection Criteria of Accounting Students: A Normative Approach

    Directory of Open Access Journals (Sweden)

    Umaru Zubairu

    2017-08-01

    Full Text Available This research assessed to what extent final-year Muslim accounting students in Malaysia considered Islamic principles when choosing a job after graduation. 356 final-year Muslim accounting students in four Malaysian universities were surveyed using an open-ended job selection scenario. The result shows that reality does not live up to the ideal. Only 16% of the respondents apply Islamic principles in making a job selection decision. The remaining 84% are more concerned with other criteria such as personal interests, salary considerations, and company reputation.

  10. Feature Selection as a Time and Cost-Saving Approach for Land Suitability Classification (Case Study of Shavur Plain, Iran

    Directory of Open Access Journals (Sweden)

    Saeid Hamzeh

    2016-10-01

    Full Text Available Land suitability classification is important in planning and managing sustainable land use. Most approaches to land suitability analysis combine a large number of land and soil parameters, and are time-consuming and costly. In this study, a potentially useful technique (combined feature selection and fuzzy-AHP method) to increase the efficiency of land suitability analysis was presented. To this end, three different feature selection algorithms (random search, best search and genetic methods) were used to determine the most effective parameters for land suitability classification for the cultivation of barley in the Shavur Plain, southwest Iran. Next, land suitability classes were calculated for all methods by using the fuzzy-AHP approach. Salinity (electrical conductivity, EC), alkalinity (exchangeable sodium percentage, ESP), wetness and soil texture were selected using the random search method. Gypsum, EC, ESP and soil texture were selected using both the best search and genetic methods. The results show a strong agreement between the standard fuzzy-AHP method and the methods presented in this study: the Kappa coefficients were 0.82, 0.79 and 0.79 for the random search, best search and genetic methods, respectively, compared with the standard fuzzy-AHP method. Our results indicate that EC, ESP, soil texture and wetness are the most effective features for evaluating land suitability classification for the cultivation of barley in the study area, and the use of these parameters, together with their appropriate weights as obtained from fuzzy-AHP, can produce good results for land suitability classification. Thus, the presented combined feature selection and fuzzy-AHP approach has the potential to save time and money in land suitability classification.
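    The agreement figures above are Cohen's Kappa values; a minimal check of this kind, using scikit-learn and invented suitability classes for the same map cells, might look like this.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Illustrative suitability classes (S1..N) for the same cells, from the
# standard fuzzy-AHP run and from a reduced-feature run.
standard = np.array(["S1", "S1", "S2", "S3", "N", "S2", "S1", "S3"])
reduced  = np.array(["S1", "S2", "S2", "S3", "N", "S2", "S1", "S3"])
print(cohen_kappa_score(standard, reduced))  # agreement beyond chance
```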

  11. Validating the Kinematic Wave Approach for Rapid Soil Erosion Assessment and Improved BMP Site Selection to Enhance Training Land Sustainability

    Science.gov (United States)

    2014-02-01

    installation based on a Euclidean distance allocation and assigned that installation's threshold values. The second approach used a thin-plate spline... installation critical nLS+ thresholds involved spatial interpolation. A thin-plate spline radial basis function (RBF) was selected as the... the interpolation of installation results using a thin-plate spline radial basis function technique. 6.5 OBJECTIVE #5: DEVELOP AND
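    Thin-plate spline RBF interpolation of scattered installation results is directly available in SciPy; the sketch below uses invented coordinates and threshold values purely to show the mechanics.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(3)
sites = rng.uniform(0, 100, size=(25, 2))     # installation coordinates (x, y)
thresholds = np.sin(sites[:, 0] / 20) + sites[:, 1] / 100  # invented critical values

# Thin-plate spline RBF surface through the sampled installation results.
tps = RBFInterpolator(sites, thresholds, kernel="thin_plate_spline")

query = np.array([[50.0, 50.0], [10.0, 80.0]])  # unsampled locations
print(tps(query))                               # interpolated threshold values
```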

  12. [Lengthening temporalis myoplasty: Technical refinements].

    Science.gov (United States)

    Guerreschi, P; Labbé, D

    2015-10-01

    First described by Labbé in 1997, the lengthening temporalis myoplasty (LTM) ensures the transfer of the entire temporal muscle from the coronoid process to the upper half of the lip without interposition of aponeurotic tissue. Thanks to brain plasticity, the temporal muscle is able to change its function because it is entirely mobilized towards another effector: the labial commissure. After 6 months of speech rehabilitation, the muscle loses its chewing function and acquires its new smiling function. We describe as far as possible all the technical points to guide surgeons who would like to perform this powerful surgical procedure. We show how the coronoid process is approached both through an upper temporal fossa approach and through a lower nasolabial fold approach. Rehabilitation starts 3 weeks after surgery, following a standardized protocol that moves in 3 steps from a mandibular smile to a voluntary smile, and then to a spontaneous smile. The LTM is the main part of a one-stage global treatment of the paralyzed face. It constitutes a dynamic palliative treatment usually started at the sequelae stage, 18 months after the onset of a peripheral facial paralysis. This one-stage procedure is a reproducible and relevant surgical technique in the difficult treatment of peripheral facial paralysis. An active muscle is transferred to reanimate the labial commissure and to recreate a mobile nasolabial fold. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  13. A combination of modified transnasal endoscopic maxillectomy via transnasal prelacrimal recess approach with or without radiotherapy for selected sinonasal malignancies.

    Science.gov (United States)

    He, Shuangba; Bakst, Richard L; Guo, Tao; Sun, Jingwu

    2015-10-01

    An external approach for resection of sinonasal tumors is associated with increased morbidity. Therefore, we employed a modified transnasal endoscopic maxillectomy combined with pre- and/or postoperative radiotherapy for early stage maxillary carcinomas. This study aims to evaluate our early experience with endoscopic resection of selected malignant sinonasal tumors. The medical and radiology records of patients who underwent endonasal endoscopic resection of malignant sinonasal tumors between 2008 and 2012 were retrospectively reviewed. Ten selected malignant tumors were resected by modified transnasal endoscopic maxillectomy. All the patients were without evidence of disease at a mean follow-up of 26.8 months. No major complications were recorded. The mean hospitalization stay was 6.6 days. In very carefully selected cases of malignant tumors, modified transnasal endoscopic maxillectomy is acceptable. The postoperative complication rate is low, the cosmetic outcome is excellent, and patients do not require a long hospitalization.

  14. Are voluntary wheel running and open-field behavior correlated in mice? Different answers from comparative and artificial selection approaches.

    Science.gov (United States)

    Careau, Vincent; Bininda-Emonds, Olaf R P; Ordonez, Genesis; Garland, Theodore

    2012-09-01

    Voluntary wheel running and open-field behavior are probably the two most widely used measures of locomotion in laboratory rodents. We tested whether these two behaviors are correlated in mice using two approaches: the phylogenetic comparative method using inbred strains of mice and an ongoing artificial selection experiment on voluntary wheel running. After taking into account the measurement error and phylogenetic relationships among inbred strains, we obtained a significant positive correlation between distance run on wheels and distance moved in the open-field for both sexes. Thigmotaxis was negatively correlated with distance run on wheels in females but not in males. By contrast, mice from four replicate lines bred for high wheel running did not differ in either distance covered or thigmotaxis in the open field as compared with mice from four non-selected control lines. Overall, results obtained in the selection experiment were generally opposite to those observed among inbred strains. Possible reasons for this discrepancy are discussed.

  15. A multicriteria decision making approach based on fuzzy theory and credibility mechanism for logistics center location selection.

    Science.gov (United States)

    Wang, Bowen; Xiong, Haitao; Jiang, Chengrui

    2014-01-01

    As a hot topic in supply chain management, fuzzy methods have been widely used in logistics center location selection to improve the reliability and suitability of the selection with respect to the impacts of both qualitative and quantitative factors. However, such methods do not consider the consistency and the historical assessment accuracy of experts in previous decisions. This paper therefore proposes a multicriteria decision making model based on the credibility of decision makers, introducing a consistency and historical assessment accuracy mechanism into the fuzzy multicriteria decision making approach. In this way, only decision makers who pass the credibility check are qualified to perform the further assessment. Finally, a practical example is analyzed to illustrate how to use the model. The result shows that the fuzzy multicriteria decision making model based on the credibility mechanism can improve the reliability and suitability of site selection for the logistics center.
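    The credibility gate can be sketched as a simple filter in front of the fuzzy aggregation. The field names, the accuracy threshold and the triangular ratings below are illustrative assumptions, not the paper's exact formulation.

```python
def credible_assessment(assessments, accuracy_threshold=0.7):
    """Keep only decision makers whose historical assessment accuracy
    passes the credibility check, then average their triangular fuzzy
    ratings (l, m, u) component-wise."""
    qualified = [a for a in assessments if a["accuracy"] >= accuracy_threshold]
    if not qualified:
        raise ValueError("no decision maker passed the credibility check")
    n = len(qualified)
    return tuple(sum(a["rating"][i] for a in qualified) / n for i in range(3))

panel = [
    {"expert": "DM1", "accuracy": 0.9, "rating": (5, 7, 9)},
    {"expert": "DM2", "accuracy": 0.5, "rating": (1, 3, 5)},  # filtered out
    {"expert": "DM3", "accuracy": 0.8, "rating": (3, 5, 7)},
]
print(credible_assessment(panel))  # aggregated fuzzy rating of one site
```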

  16. Refinement for Transition Systems with Responses

    Directory of Open Access Journals (Sweden)

    Marco Carbone

    2012-07-01

    Full Text Available Motivated by the response pattern for property specifications and applications within flexible workflow management systems, we report upon an initial study of modal and mixed transition systems in which the must transitions are interpreted as must eventually, and in which implementations can contain may behaviors that are resolved at run-time. We propose Transition Systems with Responses (TSRs) as a suitable model for this study. We prove that TSRs correspond to a restricted class of mixed transition systems, which we refer to as the action-deterministic mixed transition systems. We show that TSRs allow for a natural definition of deadlocked and accepting states. We then transfer the standard definition of refinement for mixed transition systems to TSRs and prove that refinement does not preserve deadlock freedom. This leads to the proposal of safe refinements, which are those that preserve deadlock freedom. We exemplify the use of TSRs and (safe) refinements on a small medication workflow.

  17. Taiwan: refined need for consuming population

    International Nuclear Information System (INIS)

    Hayes, David.

    1995-01-01

    A brief discussion is given of the oil and gas industry in Taiwan. Topics covered include the possibility of privatization, refineries and refining contracts overseas, plans for a new petrochemical complex and an offshore submarine transmission pipeline. (UK)

  18. 1991 worldwide refining and gas processing directory

    International Nuclear Information System (INIS)

    Anon.

    1990-01-01

    This book is an authority for immediate information on the industry. You can use it to find new business, analyze market trends, and stay in touch with existing contacts while making new ones. The possibilities for business applications are numerous. Arranged by country, all listings in the directory include address, phone, fax and telex numbers, a description of the company's activities, names of key personnel and their titles, corporate headquarters, branch offices and plant sites. This newly revised edition lists more than 2000 companies and nearly 3000 branch offices and plant locations. This easy-to-use reference also includes several of the most vital and informative surveys of the industry, including the U.S. Refining Survey, the Worldwide Construction Survey in Refining, Sulfur, Gas Processing and Related Fuels, the Worldwide Refining and Gas Processing Survey, the Worldwide Catalyst Report, and the U.S. and Canadian Lube and Wax Capacities Report from the National Petroleum Refiners Association.

  19. Development of a Refined Staff Group Trainer

    National Research Council Canada - National Science Library

    Quensel, Susan

    1999-01-01

    .... As a follow-on effort to the previous SGT project, the goal was to refine a brigade-level staff training program to more effectively and efficiently coordinate the activities within and between the...

  20. Did we choose the best one? A new site selection approach based on exposure and uptake potential for waste incineration.

    Science.gov (United States)

    Demirarslan, K Onur; Korucu, M Kemal; Karademir, Aykan

    2016-08-01

    Ecological problems arising after the construction and operation of a waste incineration plant generally originate from incorrect decisions made during the selection of the location of the plant. The main objective of this study is to investigate how the selection method for the location of a new municipal waste incineration plant can be improved by using a dispersion modelling approach supported by geographical information systems and multi-criteria decision analysis. Considering this aim, the appropriateness of the current location of an existing plant was assessed by applying a pollution dispersion model. Using this procedure, a site ranking for a total of 90 candidate locations and the site of the existing incinerator was determined by a new location selection practice, and the current location of the plant was evaluated with ANOVA and Tukey tests. This ranking, initially made without the use of modelling approaches, was re-evaluated by modelling various variables with CALPUFF, including pollutant concentrations, population and population density, demography, temporality of meteorological data, pollutant type and risk formation type, and by re-ranking the results. The findings clearly indicate the impropriety of the location of the current plant, as the pollution dispersion model showed that its location was the fourth-worst choice among 91 possibilities. It was concluded that location selection procedures for waste incinerators should benefit from the improvements obtained by combining pollution dispersion studies with population density data to obtain the most suitable location. © The Author(s) 2016.
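    The statistical comparison of the candidate locations can be illustrated with a one-way ANOVA followed by Tukey's HSD test; the exposure scores below are invented stand-ins for the CALPUFF model outputs.

```python
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(11)
# Invented modelled exposure scores for two candidate sites and the
# current plant location, e.g. over repeated meteorological scenarios.
scores = {
    "site_A":  rng.normal(10, 2, 30),
    "site_B":  rng.normal(12, 2, 30),
    "current": rng.normal(18, 2, 30),   # the existing, poorly placed plant
}
print(f_oneway(*scores.values()))       # any difference among locations?

values = np.concatenate(list(scores.values()))
groups = np.repeat(list(scores.keys()), 30)
print(pairwise_tukeyhsd(values, groups))  # which locations differ pairwise
```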