WorldWideScience

Sample records for envelope approach comparison

  1. Oriented stochastic data envelopment models: ranking comparison to stochastic frontier approach

    Czech Academy of Sciences Publication Activity Database

    Brázdik, František

    -, No. 271 (2005), pp. 1-46 ISSN 1211-3298 Institutional research plan: CEZ:AV0Z70850503 Keywords: stochastic data envelopment analysis * linear programming * rice farm Subject RIV: AH - Economics http://www.cerge-ei.cz/pdf/wp/Wp271.pdf

  2. Failure envelope approach for consolidated undrained capacity of shallow foundations

    OpenAIRE

    Vulpe, Cristina; Gourvenec, Susan; Leman, Billy; Fung, Kah Ngii

    2016-01-01

    A generalized framework is applied to predict consolidated undrained VHM failure envelopes for surface circular and strip foundations. The failure envelopes for consolidated undrained conditions are shown to be scaled from those for unconsolidated undrained conditions by the uniaxial consolidated undrained capacities, which are predicted through a theoretical framework based on fundamental critical state soil mechanics. The framework is applied to results from small-strain finite-element anal...

  3. The Hellmann–Feynman theorem, the comparison theorem, and the envelope theory

    Directory of Open Access Journals (Sweden)

    Claude Semay

    2015-01-01

    Full Text Available The envelope theory is a convenient method to compute approximate solutions for bound state equations in quantum mechanics. It is shown that these approximate solutions obey a kind of Hellmann–Feynman theorem, and that the comparison theorem can be applied to these approximate solutions for two ordered Hamiltonians.

  4. An Adaptive Nonlinear Aircraft Maneuvering Envelope Estimation Approach for Online Applications

    Science.gov (United States)

    Schuet, Stefan R.; Lombaerts, Thomas Jan; Acosta, Diana; Wheeler, Kevin; Kaneshige, John

    2014-01-01

    A nonlinear aircraft model is presented and used to develop an overall unified robust and adaptive approach to passive trim and maneuverability envelope estimation with uncertainty quantification. The concept of time scale separation makes this method suitable for the online characterization of altered safe maneuvering limitations after impairment. The results can be used to provide pilot feedback and/or be combined with flight planning, trajectory generation, and guidance algorithms to help maintain safe aircraft operations in both nominal and off-nominal scenarios.

  5. Comparison of climate envelope models developed using expert-selected variables versus statistical selection

    Science.gov (United States)

    Brandt, Laura A.; Benscoter, Allison; Harvey, Rebecca G.; Speroterra, Carolina; Bucklin, David N.; Romañach, Stephanie; Watling, James I.; Mazzotti, Frank J.

    2017-01-01

    Climate envelope models are widely used to describe the potential future distribution of species under different climate change scenarios. It is broadly recognized that there are both strengths and limitations to using climate envelope models and that outcomes are sensitive to initial assumptions, inputs, and modeling methods. Selection of predictor variables, a central step in modeling, is one of the areas where different techniques can yield varying results. Selection of climate variables to use as predictors is often done using statistical approaches that develop correlations between occurrences and climate data. These approaches have received criticism in that they rely on the statistical properties of the data rather than directly incorporating biological information about species responses to temperature and precipitation. We evaluated and compared models and prediction maps for 15 threatened or endangered species in Florida based on two variable selection techniques: expert opinion and a statistical method. We compared model performance between these two approaches for contemporary predictions, and the spatial correlation, spatial overlap and area predicted for contemporary and future climate predictions. In general, experts identified more variables as being important than the statistical method and there was low overlap in the variable sets. Despite these differences, model performance was similar between the two approaches (>0.9 for area under the curve (AUC) and >0.7 for true skill statistic (TSS)). Spatial overlap, which compares the spatial configuration between maps constructed using the different variable selection techniques, was only moderate overall (about 60%), with a great deal of variability across species. Difference in spatial overlap was even greater under future climate projections, indicating additional divergence of model outputs from different variable selection techniques. Our work is in agreement with other studies which have found that for broad-scale species distribution modeling, using statistical methods of variable
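
    Though the study's own code is not reproduced here, the spatial-overlap comparison it describes can be illustrated with a toy sketch: treat the two contemporary prediction maps (expert-selected vs. statistically selected variables) as boolean rasters and report the shared fraction of predicted presence. All values below are invented.

```python
# Hypothetical illustration of comparing two binary climate-envelope prediction
# maps (expert-selected vs. statistically selected variables); not the study's code.
import numpy as np

rng = np.random.default_rng(1)
expert_map = rng.random((200, 200)) > 0.6        # True = predicted suitable habitat
statistical_map = rng.random((200, 200)) > 0.6

both = np.logical_and(expert_map, statistical_map).sum()
either = np.logical_or(expert_map, statistical_map).sum()

print(f"area predicted (expert):      {expert_map.sum()} cells")
print(f"area predicted (statistical): {statistical_map.sum()} cells")
print(f"spatial overlap (intersection / union): {both / either:.2f}")
```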

  6. Measuring the Capacity Utilization of Public District Hospitals in Tunisia: Using Dual Data Envelopment Analysis Approach

    Directory of Open Access Journals (Sweden)

    Chokri Arfa

    2017-01-01

    Full Text Available Background Public district hospitals (PDHs) in Tunisia are not operating at full plant capacity and underutilize their operating budget. Methods Individual PDHs' capacity utilization (CU) is measured for 2000 and 2010 using a dual data envelopment analysis (DEA) approach with shadow-price input and output restrictions. The CU is estimated for 101 of 105 PDHs in 2000 and 94 of 105 PDHs in 2010. Results On average, unused capacity is estimated at 18% in 2010 vs. 13% in 2000. Some 26% of PDHs underutilized their operating budget in 2010 vs. 21% in 2000. Conclusion Inadequate supply, health quality and the lack of operating budget should be tackled to reduce unmet users' needs and the bypassing of the PDHs, and thus to increase their CU. Social health insurance should be turned into a direct purchaser of curative and preventive care for the PDHs.

  7. Efficiency of the Slovak forestry in comparison to other European countries: An application of Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Kovalčík Miroslav

    2018-03-01

    Full Text Available Efficiency improvement is important for increasing the competitiveness of any sector, and the same holds for the forestry sector. A non-parametric approach – Data Envelopment Analysis (DEA) – was used for the assessment of forestry efficiency. The paper presents the results of the efficiency evaluation of forestry in European countries using DEA. One basic and two modified models (labour and wood sale) were proposed, based on the available input and output data from the Integrated Environmental and Economic Accounts for Forests and on the specific conditions of forestry. The sample size was 22 countries and data for 2005–2008 were processed. The results show an average efficiency in the range of 69–90% (depending on the model). Based on the results of the analysis, the following can be concluded: Slovak forestry achieved below-average efficiency in comparison to other European countries; there were great differences in efficiency among individual countries; and neither the state of the economy (advanced countries vs. countries with economies in transition) nor the region had a statistically significant influence on efficiency.

  8. Early Site Permit Demonstration Program, plant parameters envelopes: Comparison with ranges of values for four hypothetical sites

    International Nuclear Information System (INIS)

    1992-09-01

    The purpose of this volume is to report the results of the comparison of the ALWR plant parameters envelope with values of site characteristics developed for four hypothetical sites that generally represent conditions encountered within the United States. This effort is not intended to identify or address the suitability of any existing site, site area, or region in the United States. Also included in this volume is Appendix F, SERCH Summaries Regarding Siting.

  9. Energy optimization and prediction of complex petrochemical industries using an improved artificial neural network approach integrating data envelopment analysis

    International Nuclear Information System (INIS)

    Han, Yong-Ming; Geng, Zhi-Qiang; Zhu, Qun-Xiong

    2016-01-01

    Graphical abstract: This paper proposed an energy optimization and prediction of complex petrochemical industries based on a DEA-integrated ANN approach (DEA-ANN). The proposed approach utilizes the DEA model with slack variables for sensitivity analysis to determine the effective decision making units (DMUs) and indicate the optimized direction of the ineffective DMUs. Compared with the traditional ANN approach, the DEA-ANN prediction model is effectively verified by executing a linear comparison between all DMUs and the effective DMUs through the standard data source from the UCI (University of California at Irvine) repository. Finally, the proposed model is validated through an application in a complex ethylene production system of China petrochemical industry. Meanwhile, the optimization result and the prediction value are obtained to reduce energy consumption of the ethylene production system, guide ethylene production and improve energy efficiency. - Highlights: • The DEA-integrated ANN approach is proposed. • The DEA-ANN prediction model is effectively verified through the standard data source from the UCI repository. • The energy optimization and prediction framework of complex petrochemical industries based on the proposed method is obtained. • The proposed method is valid and efficient in improvement of energy efficiency in complex petrochemical plants. - Abstract: Since the complex petrochemical data have characteristics of multi-dimension, uncertainty and noise, it is difficult to accurately optimize and predict the energy usage of complex petrochemical systems. Therefore, this paper proposes a data envelopment analysis (DEA) integrated artificial neural network (ANN) approach (DEA-ANN). The proposed approach utilizes the DEA model with slack variables for sensitivity analysis to determine the effective decision making units (DMUs) and indicate the optimized direction of the ineffective DMUs. Compared with the traditional ANN approach, the DEA

  10. IMPLEMENTATION OF THE LEAN-KAIZEN APPROACH IN FASTENER INDUSTRIES USING THE DATA ENVELOPMENT ANALYSIS

    Directory of Open Access Journals (Sweden)

    Sunil Kumar

    2017-04-01

    Full Text Available This research paper is an attempt to improve the quality system of ten small-scale fastener manufacturing industries through the implementation of the Lean-Kaizen approach using the Data Envelopment Analysis (DEA) Charnes, Cooper & Rhodes (CCR) model with constant returns to scale (CRS). Output maximization is taken as the objective function to identify the percentage scope for improvements. The data were collected through personal visits to the selected industries for three inputs (manpower, maintenance, and training of employees) and two outputs (quality, on-time delivery) of their quality system. The DEA CCR model is applied to identify efficiency scores of the quality system, taking the most efficient industry as a benchmark for the rest of the organizations. The Lean-Kaizen approach is applied to identify waste/non-value-added activities in the outputs of the selected industries. Four Kaizen events are proposed to eliminate waste/non-value-added activities in their quality system. The data collected after the Kaizen events are further analyzed with the DEA CCR model. The improvements in efficiency scores of the selected industries are presented as findings in this research paper. Two fastener industries became 100% efficient, while the rest of the organizations reported 8% to 49% improvements in the efficiency scores of their quality system. It is concluded that the Lean-Kaizen approach using DEA is an effective approach to improve the quality system of fastener industries. This study will be beneficial for researchers, practitioners and academicians tackling inefficiencies in organizations.
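
    For readers unfamiliar with the DEA CCR model referred to above, the sketch below is a minimal, hypothetical illustration (not the authors' implementation) of an output-oriented CCR model under constant returns to scale, solved as one linear program per decision-making unit; the three inputs and two outputs mirror those named in the abstract, but the figures are invented.

```python
# Minimal output-oriented CCR DEA sketch (hypothetical data, not the study's code).
import numpy as np
from scipy.optimize import linprog

# rows = DMUs (fastener plants); columns = inputs / outputs
X = np.array([[8., 12., 4.], [6., 10., 5.], [9., 15., 3.], [7., 11., 6.]])  # manpower, maintenance, training
Y = np.array([[90., 85.], [80., 88.], [70., 75.], [95., 92.]])              # quality, on-time delivery

def ccr_output_oriented(X, Y, o):
    """Maximise phi s.t. sum_j lam_j*x_j <= x_o and sum_j lam_j*y_j >= phi*y_o, lam >= 0."""
    n = X.shape[0]
    c = np.r_[-1.0, np.zeros(n)]                          # minimise -phi over [phi, lam_1..lam_n]
    A_in = np.hstack([np.zeros((X.shape[1], 1)), X.T])    # input rows:  lam'X_i <= x_io
    A_out = np.hstack([Y[o][:, None], -Y.T])              # output rows: phi*y_ro - lam'Y_r <= 0
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[X[o], np.zeros(Y.shape[1])],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]                                       # phi >= 1; DMU is efficient when phi == 1

for o in range(X.shape[0]):
    phi = ccr_output_oriented(X, Y, o)
    print(f"DMU {o}: phi = {phi:.3f}, CCR efficiency = {1.0 / phi:.3f}")
```

    Each unit's CCR efficiency is the reciprocal of the optimal expansion factor phi, so a value of 1 marks a benchmark unit against which the Kaizen improvement targets of the others can be set.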

  11. Improving energy productivity of sunflower production using data envelopment analysis (DEA) approach.

    Science.gov (United States)

    Avval, Seyed Hashem Mousavi; Rafiee, Shahin; Jafari, Ali; Mohammadi, Ali

    2011-08-15

    Efficient use of energy in agriculture is one of the conditions for sustainable agricultural production. This study applies the data envelopment analysis (DEA) approach to the data of 95 randomly selected farms to investigate the technical and scale efficiencies of farmers with respect to energy use for sunflower production in Golestan province, Iran. The study also helps to identify the wasteful usage and the optimum level of energy from different inputs. According to the results of DEA models, about 36% of farmers were found to be technically efficient and the mean efficiency of sunflower producers was found to be 0.87 and 0.96 under the constant and variable returns to scale assumptions respectively. The optimum energy requirement was calculated as 8448.3 MJ ha⁻¹; accordingly, a potential reduction of 10.8% (1020.3 MJ ha⁻¹) in total energy input could be achieved by raising the performance of farmers to the highest level. Applying a better machinery management technique and conservation tillage methods, application of fertilisers by performance monitoring and utilisation of alternative sources of energy such as compost and chopped residues may be the pathways for improving energy productivity and reducing the environmental footprint. Copyright © 2011 Society of Chemical Industry.

  12. Comparison of Fe and Ni opacity calculations for a better understanding of pulsating stellar envelopes

    International Nuclear Information System (INIS)

    Gilles, D.; Turck-Chieze, S.; Loisel, G.; Piau, L.; Ducret, J.E.; Poirier, M.; Blenski, T.; Thais, F.; Blancard, C.; Cosse, P.; Faussurier, G.; Gilleron, F.; Pain, J.C.; Porcherot, Q.; Guzik, J.A.; Kilcrease, D.P.; Magee, N.H.; Harris, J.; Busquet, M.; Delahaye, F.; Zeippen, C.J.; Bastiani-Ceccotti, S.

    2011-01-01

    Opacity is an important ingredient of the evolution of stars. The calculation of opacity coefficients is complicated by the fact that the plasma contains partially ionized heavy ions that contribute to opacity dominated by H and He. Up to now, the astrophysical community has greatly benefited from the contributions of Los Alamos, Livermore and the Opacity Project (OP). However, unexplained differences of up to 50% in the radiative forces and Rosseland mean values for Fe have been noticed for conditions corresponding to stellar envelopes. Such uncertainty has a real impact on the understanding of pulsating stellar envelopes, on the excitation of modes, and on the identification of the mode frequencies. Temperature and density conditions equivalent to those found in stars can now be produced in laboratory experiments for various atomic species. Recently the photo-absorption spectra of nickel and iron plasmas have been measured during the LULI 2010 campaign, for temperatures between 15 and 40 eV and densities of about 3 mg/cm³. A large theoretical collaboration, the 'OPAC', has been formed to prepare these experiments. We present here the set of opacity calculations performed by eight different groups for conditions relevant to the LULI 2010 experiment and to astrophysical stellar envelope conditions. (authors)

  13. A non-parametric Data Envelopment Analysis approach for improving energy efficiency of grape production

    International Nuclear Information System (INIS)

    Khoshroo, Alireza; Mulwa, Richard; Emrouznejad, Ali; Arabi, Behrouz

    2013-01-01

    Grape is one of the world's largest fruit crops, with approximately 67.5 million tonnes produced each year, and energy is an important element in modern grape production, which depends heavily on fossil and other energy resources. Efficient use of these energies is a necessary step toward reducing environmental hazards, preventing destruction of natural resources and ensuring agricultural sustainability. Hence, identifying excessive use of energy as well as reducing energy resources is the main focus of this paper to optimize energy consumption in grape production. In this study we use a two-stage methodology to find the association of energy efficiency and performance explained by farmers' specific characteristics. In the first stage a non-parametric Data Envelopment Analysis is used to model efficiencies as an explicit function of human labor, machinery, chemicals, FYM (farmyard manure), diesel fuel, electricity and water-for-irrigation energies. In the second step, farm-specific variables such as farmers' age, gender, level of education and agricultural experience are used in a Tobit regression framework to explain how these factors influence the efficiency of grape farming. The result of the first stage shows substantial inefficiency between the grape producers in the studied area, while the second stage shows that the main difference between efficient and inefficient farmers was in the use of chemicals, diesel fuel and water for irrigation. Efficient farmers' use of chemicals such as insecticides, herbicides and fungicides was considerably lower than that of inefficient ones. The results revealed that the more educated farmers are more energy efficient in comparison with their less educated counterparts. - Highlights: • The focus of this paper is to identify excessive use of energy and optimize energy consumption in grape production. • We measure the efficiency as a function of labor/machinery/chemicals/farmyard manure/diesel-fuel/electricity/water. • Data were obtained from 41 grape
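
    The two-stage idea summarized above (DEA efficiency scores regressed on farmer characteristics with a Tobit model, since scores are censored at 1) can be illustrated as follows; the data, covariates and coefficients are purely hypothetical, and the likelihood is the standard upper-censored Tobit, not the authors' code.

```python
# Hypothetical two-stage sketch: efficiency scores censored at 1, regressed on
# farmer characteristics via a Tobit model estimated by maximum likelihood.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 120
age = rng.uniform(25, 65, n)
education = rng.integers(0, 16, n).astype(float)      # years of schooling
experience = rng.uniform(1, 40, n)
X = np.column_stack([np.ones(n), age, education, experience])

# hypothetical first-stage efficiency scores, censored at 1 (fully efficient farms)
latent = 0.55 + 0.002 * age + 0.02 * education + 0.003 * experience + rng.normal(0, 0.08, n)
score = np.minimum(latent, 1.0)
censored = score >= 1.0

def negloglik(params):
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    xb = X @ beta
    ll_unc = norm.logpdf((score - xb) / sigma) - np.log(sigma)   # uncensored observations
    ll_cen = norm.logcdf((xb - 1.0) / sigma)                     # P(latent score >= 1)
    return -(np.where(censored, ll_cen, ll_unc)).sum()

start = np.r_[np.zeros(X.shape[1]), np.log(score.std())]
fit = minimize(negloglik, start, method="BFGS")
print("Tobit coefficients (const, age, education, experience):", fit.x[:-1].round(4))
```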

  14. A comparison of two Shuttle launch and entry suits - Reach envelope, isokinetic strength, and treadmill tests

    Science.gov (United States)

    Schafer, Lauren E.; Rajulu, Sudhakar L.; Klute, Glenn K.

    1992-01-01

    A quantification has been conducted of any existing differences between the performance, in operational conditions, of the Space Shuttle crew Launch Entry Suit (LES) and the new Advanced Crew Escape Suit (ACES). While LES is a partial-pressure suit, the ACES system which is being considered as a replacement for LES is a full-pressure suit. Three tests have been conducted with six subjects to ascertain the suits' reach envelope, strength, and treadmill performance. No significant operational differences were found between the two suit designs.

  15. Carbon footprint of a reflective foil and comparison with other solutions for thermal insulation in building envelope

    International Nuclear Information System (INIS)

    Proietti, Stefania; Desideri, Umberto; Sdringola, Paolo; Zepparelli, Francesco

    2013-01-01

    Highlights: ► Environmental and energy assessment of thermal insulating materials in the building envelope. ► Carbon footprint of a reflective foil, conceived and produced by an Italian company. ► Study conducted according to principles of LCA – Life Cycle Assessment. ► Identification of main impacting processes and measures for reducing emissions. ► Comparison with traditional insulating materials (EPS and rockwool). - Abstract: The present study aims at assessing the environmental and energy compatibility of different solutions for thermal insulation in the building envelope. In fact, good insulation results in a reduction of heating/cooling energy consumption; on the other hand, construction materials undergo production, transformation and transport processes, whose energy and resource consumption may lead to a significant decrease of the environmental benefits. The paper presents a detailed carbon footprint of a product (CFP, defined as the sum of greenhouse gas emissions and removals of a product system, expressed in CO2 equivalents), which is a reflective foil conceived and produced by an Italian company. CFP can be seen as a Life Cycle Assessment with climate change as the single impact category; it does not assess other potential social, economic and environmental impacts arising from the provision of products. The analysis considers all stages of the life cycle, from the extraction of raw materials to the product’s disposal, i.e. “from cradle to grave”; it was carried out according to UNI EN ISO 14040 and 14044, and LCA modelling was performed using the SimaPro software tool. On the basis of the obtained results, different measures have been proposed in order to reduce emissions in the life cycle and neutralize the residual carbon footprint. The results allowed an important comparison to be made concerning the environmental performance of the reflective foil with respect to other types of insulating materials

  16. A proxy approach to dealing with the infeasibility problem in super-efficiency data envelopment analysis

    OpenAIRE

    Cheng, Gang; Zervopoulos, Panagiotis

    2012-01-01

    Super-efficiency data envelopment analysis (SE-DEA) models are expressions of the traditional DEA models featuring the exclusion of the unit under evaluation from the reference set. The SE-DEA models have been applied in various cases such as sensitivity and stability analysis, measurement of productivity changes, outliers’ identification, and classification and ranking of decision making units (DMUs). A major deficiency in the SE-DEA models is their infeasibility in determining super-efficienc...

  17. Design of the Building Envelope: A Novel Multi-Objective Approach for the Optimization of Energy Performance and Thermal Comfort

    Directory of Open Access Journals (Sweden)

    Fabrizio Ascione

    2015-08-01

    Full Text Available Given the increasing worldwide attention to energy and the environmental performance of the building sector, building energy demand should be minimized by considering all energy uses. In this regard, the development of building components characterized by proper values of thermal transmittance, thermal capacity, and radiative properties is a key strategy to reduce the annual energy need for microclimatic control. However, the design of the thermal characteristics of the building envelope is an arduous task, especially in temperate climates where the energy demands for space heating and cooling are balanced. This study presents a novel methodology for optimizing the thermo-physical properties of the building envelope and its coatings, in terms of thermal resistance, capacity, and radiative characteristics of exposed surfaces. A multi-objective approach is adopted in order to optimize energy performance and thermal comfort. The optimization problem is solved by means of a Genetic Algorithm implemented in MATLAB®, which is coupled with EnergyPlus for performing dynamic energy simulations. For demonstration, the methodology is applied to a residential building for two different Mediterranean climates: Naples and Istanbul. The results show that for Naples, because of the higher incidence of cooling demand, cool external coatings imply significant energy savings, whereas the insulation of walls should be high but not excessive (no more than 13–14 cm). The importance of a high-reflective coating is clear also in colder Mediterranean climates, like Istanbul, although the optimal thicknesses of thermal insulation are higher (around 16–18 cm). In both climates, the thermal envelope should have a significant mass, obtainable by adopting dense and/or thick masonry layers. Globally, a careful design of the thermal envelope is always necessary in order to achieve high-efficiency buildings.
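
    As a rough, hypothetical illustration of the optimization loop described above (the paper couples a Genetic Algorithm in MATLAB® with EnergyPlus; neither is used here), the sketch below evolves three envelope parameters against two stub objectives that stand in for the dynamic building simulation and reports the resulting Pareto front.

```python
# Toy multi-objective genetic algorithm for envelope design (illustrative only).
import numpy as np

rng = np.random.default_rng(2)
LOW = np.array([2.0, 0.2, 50.0])      # insulation [cm], solar reflectance [-], areal capacity [kJ/m2K]
HIGH = np.array([20.0, 0.9, 400.0])

def simulate(x):
    """Stub standing in for an EnergyPlus run: returns (energy demand, discomfort) for design x."""
    ins, refl, cap = x
    energy = 80.0 / (1.0 + 0.3 * ins) + 25.0 * (1.0 - refl) + 2000.0 / cap
    discomfort = 40.0 * refl / (1.0 + 0.05 * cap) + 0.8 * ins
    return np.array([energy, discomfort])

def dominates(a, b):
    return np.all(a <= b) and np.any(a < b)

pop = rng.uniform(LOW, HIGH, size=(40, 3))
for generation in range(60):
    objs = np.array([simulate(x) for x in pop])
    # parent selection: binary tournaments on Pareto dominance
    parents = np.array([pop[i] if dominates(objs[i], objs[j]) else pop[j]
                        for i, j in rng.integers(len(pop), size=(len(pop), 2))])
    # uniform crossover + Gaussian mutation, clipped to the variable bounds
    mates = parents[rng.permutation(len(parents))]
    mask = rng.random(parents.shape) < 0.5
    children = np.where(mask, parents, mates) + rng.normal(0.0, 0.05, parents.shape) * (HIGH - LOW)
    pop = np.clip(children, LOW, HIGH)

objs = np.array([simulate(x) for x in pop])
front = [k for k in range(len(pop))
         if not any(dominates(objs[j], objs[k]) for j in range(len(pop)) if j != k)]
for k in front[:5]:
    print(f"design {pop[k].round(2)} -> energy {objs[k][0]:.1f}, discomfort {objs[k][1]:.1f}")
```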

  18. A multi-criteria model for the comparison of building envelope energy retrofits

    Science.gov (United States)

    Donnarumma, Giuseppe; Fiore, Pierfrancesco

    2017-02-01

    In light of the current EU guidelines in the energy field, improving building envelope performance cannot be separated from the context of satisfying the environmental sustainability requirements, reducing the costs associated with the life cycle of the building as well as economic and financial feasibility. Therefore, identifying the "optimal" energy retrofit solutions requires the simultaneous assessment of several factors and thus becomes a problem of choice between several possible alternatives. To facilitate the work of the decision-makers, public or private, adequate decision support tools are of great importance. Starting from this need, a model based on the multi-criteria analysis "AHP" technique is proposed, along with the definition of three synthetic indices associated with the three requirements of "Energy Performance", "Sustainability Performance" and "Cost". From the weighted aggregation of the three indices, a global index of preference is obtained that allows one to "quantify" the satisfaction level of the i-th alternative from the point of view of a particular group of decision-makers. The model is then applied, by way of example, to the case study of the energy retrofit of a former factory, assuming its functional conversion. Twenty possible alternative interventions on the opaque vertical closures, resulting from the combination of three thermal insulator families (synthetic, natural and mineral) with four energy retrofitting techniques, are compared and the results obtained are critically discussed by considering the points of view of three different groups of decision-makers.
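
    A hypothetical sketch of the AHP aggregation step described above: criterion weights for the three indices are derived from a pairwise comparison matrix by the principal-eigenvector method and then used to compute the global preference index for a few alternatives. The judgements and index values are invented, not taken from the paper.

```python
# Illustrative AHP weighting and weighted aggregation (hypothetical judgements).
import numpy as np

# Pairwise comparisons of Energy Performance, Sustainability Performance and Cost
# on the Saaty scale (A[i, j] = importance of criterion i relative to criterion j).
A = np.array([[1.0, 3.0, 2.0],
              [1/3., 1.0, 1/2.],
              [1/2., 2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.real(eigvals).argmax()
w = np.real(eigvecs[:, k])
w = w / w.sum()                                       # criterion weights (principal eigenvector)
CI = (np.real(eigvals[k]) - len(A)) / (len(A) - 1)    # consistency index of the judgements
print("weights:", w.round(3), "| consistency index:", round(CI, 4))

# Normalised synthetic indices for three alternative retrofit packages (rows),
# higher = better for every criterion (the Cost index is assumed already inverted).
indices = np.array([[0.80, 0.60, 0.40],
                    [0.55, 0.85, 0.70],
                    [0.65, 0.50, 0.90]])
global_index = indices @ w
print("global preference index per alternative:", global_index.round(3))
```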

  19. A Universal Approach to Optimize the Folding and Stability of Prefusion-Closed HIV-1 Envelope Trimers

    Directory of Open Access Journals (Sweden)

    Lucy Rutten

    2018-04-01

    Full Text Available Summary: The heavily glycosylated native-like envelope (Env trimer of HIV-1 is expected to have low immunogenicity, whereas misfolded forms are often highly immunogenic. High-quality correctly folded Envs may therefore be critical for developing a vaccine that induces broadly neutralizing antibodies. Moreover, the high variability of Env may require immunizations with multiple Envs. Here, we report a universal strategy that provides for correctly folded Env trimers of high quality and yield through a repair-and-stabilize approach. In the repair stage, we utilized a consensus strategy that substituted rare strain-specific residues with more prevalent ones. The stabilization stage involved structure-based design and experimental assessment confirmed by crystallographic feedback. Regions important for the refolding of Env were targeted for stabilization. Notably, the α9-helix and an intersubunit β sheet proved to be critical for trimer stability. Our approach provides a means to produce prefusion-closed Env trimers from diverse HIV-1 strains, a substantial advance for vaccine development. : Rutten et al. describe a universal repair and stabilize approach that corrects rare mutations and stabilizes refolding regions to obtain high-quality HIV Envs with high yields. The crystal structure shows how the optimization of the trimer interface between α9, α6, and the intersubunit β-sheet stabilizes the membrane-proximal base. Keywords: envelope protein, chronic, ConC_base, HIV, SOSIP, stabilization, transmitted/founder, vaccine, X-ray structure, hybrid sheet

  20. A Directed Molecular Evolution Approach to Improved Immunogenicity of the HIV-1 Envelope Glycoprotein

    Science.gov (United States)

    Du, Sean X.; Xu, Li; Zhang, Wenge; Tang, Susan; Boenig, Rebecca I.; Chen, Helen; Mariano, Ellaine B.; Zwick, Michael B.; Parren, Paul W. H. I.; Burton, Dennis R.; Wrin, Terri; Petropoulos, Christos J.; Ballantyne, John A.; Chambers, Michael; Whalen, Robert G.

    2011-01-01

    A prophylactic vaccine is needed to slow the spread of HIV-1 infection. Optimization of the wild-type envelope glycoproteins to create immunogens that can elicit effective neutralizing antibodies is a high priority. Starting with ten genes encoding subtype B HIV-1 gp120 envelope glycoproteins and using in vitro homologous DNA recombination, we created chimeric gp120 variants that were screened for their ability to bind neutralizing monoclonal antibodies. Hundreds of variants were identified with novel antigenic phenotypes that exhibit considerable sequence diversity. Immunization of rabbits with these gp120 variants demonstrated that the majority can induce neutralizing antibodies to HIV-1. One novel variant, called ST-008, induced significantly improved neutralizing antibody responses when assayed against a large panel of primary HIV-1 isolates. Further study of various deletion constructs of ST-008 showed that the enhanced immunogenicity results from a combination of effective DNA priming, an enhanced V3-based response, and an improved response to the constant backbone sequences. PMID:21738594

  1. Evaluating the Total Factor Productivity Growth in Manufacturing Industries of Iran (Data Envelopment Analysis Approach

    Directory of Open Access Journals (Sweden)

    Vahideh Ahmadi

    2014-01-01

    Full Text Available This paper examines the total factor productivity changes for 23 main manufacturing industries (2-digit ISIC groups) and the country's provinces using data envelopment analysis during 2005 to 2007. The results show a 2.3% increase in the productivity of the whole sector (on average over the studied period), while the productivity of the country's provinces decreases by 7.3% in the same period. We find that Food and Beverage products and Khuzestan province have the highest productivity growth. Non-optimal allocation of resources and the use of old equipment are the most important drawbacks to productivity growth for the 23 main ISIC groups and the provinces. Finally, estimation of the regression models by the panel data method reveals that privatization and increases in the capital available per worker have a significant effect on productivity growth.

  2. Building envelope

    CSIR Research Space (South Africa)

    Gibberd, Jeremy T

    2009-01-01

    Full Text Available for use in the building. This is done through photovoltaic and solar water heating panels and wind turbines. Ideally these are integrated in the design of the building envelope to improve the aesthetic quality of the building and minimise material... are naturally ventilated. Renewable energy The building envelope includes renewable energy generation such as photovoltaics, wind turbines and solar water heaters and 10% of the building’s energy requirements are generated from these sources. Views All...

  3. Biomimetic Envelopes

    OpenAIRE

    Ilaria Mazzoleni

    2010-01-01

    How to translate the lessons learned from the analysis and observation of the animal world is the design learning experience presented in this article. Skin is a complex and incredibly sophisticated organ that performs various functions, including protection, sensation and heat and water regulation. In a similar way building envelopes serve multiple roles, as they are the interface between the building inhabitants and environmental elements. The resulting architectural building envelopes prot...

  4. Comparison of intensity discrimination, increment detection, and comodulation masking release in the envelope and audio-frequency domains

    DEFF Research Database (Denmark)

    Nelson, Paul C.; Ewert, Stephan; Carney, Laurel H.

    In the audio-frequency domain, the envelope apparently plays an important role in detection of intensity increments and in comodulation masking release (CMR). The current study addressed the question whether the second-order envelope ("venelope") contributes similarly for comparable experiments i...

  5. A new approach for product cost estimation using data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Adil Salam

    2012-10-01

    Full Text Available Cost estimation of new products has always been difficult, as only a few design, manufacturing and operational features will be known. In these situations, parametric or non-parametric methods are commonly used to estimate the cost of a product given the corresponding cost drivers. Parametric models use an a priori determined cost function whose parameters are evaluated from historical data. Non-parametric methods, on the other hand, attempt to fit curves to the historical data without a predetermined function. In both methods, it is assumed that the historical data used in the analysis are a true representation of the relation between the cost drivers and the corresponding costs. However, because of efficiency variations of the manufacturers and suppliers, changes in supplier selection, market fluctuations, and several other reasons, certain costs in the historical data may be too high whereas other costs may represent better deals for their corresponding cost drivers. Thus, it may be important to rank the historical data, identify benchmarks, and estimate the target costs of the product based on these benchmarks. In this paper, a novel adaptation of cost drivers and cost data is introduced in order to use data envelopment analysis for ranking cost data and identifying benchmarks, and then estimating the target costs of a new product based on these benchmarks. An illustrative case study is presented for the cost estimation of the landing gear of an aircraft manufactured by an aerospace company located in Montreal, Canada.

  6. Benchmarking the energy performance of office buildings: A data envelopment analysis approach

    Directory of Open Access Journals (Sweden)

    Molinos-Senante, María

    2016-12-01

    Full Text Available The achievement of energy efficiency in buildings is an important challenge facing both developed and developing countries. Very few papers have assessed the energy efficiency of office buildings using real data. To overcome this limitation, this paper proposes an energy efficiency index for buildings having a large window-to-wall ratio, and uses this index to identify the main architectural factors affecting energy performance. This paper assesses, for the first time, the energy performances of 34 office buildings in Santiago, Chile, by using data envelopment analysis. Overall energy efficiency is decomposed into two indices: the architectural energy efficiency index, and the management energy efficiency index. This decomposition is an essential step in identifying the main drivers of energy inefficiency and designing measures for improvement. Office buildings examined here have significant room for improving their energy efficiencies, saving operational costs and reducing greenhouse gas emissions. The methodology and results of this study will be of great interest to building managers and policymakers seeking to increase the sustainability of cities.

  7. Multi-attribute Reverse Auction Design Based on Fuzzy Data Envelopment Analysis Approach

    Directory of Open Access Journals (Sweden)

    Deyan Chen

    2017-08-01

    Full Text Available Multi-attribute reverse auctions are widely used for the procurements of enterprises or governments. To overcome the difficulty of identifying the buyer's bidding attribute weights and score function, multi-round auction and bidding models with multiple winners are established based on fuzzy data envelopment analysis. The buyer's winner determination model considers the integrated input-output efficiency of the k winners. The seller's bidding strategy is divided into two parts: the first estimates the weights of the ideal supplier, which are taken to represent the buyer's preference; the second calculates the weights of the test supplier, which reflect the change trend of the current weights and the seller's weaknesses. The final predicted weight is the weighted sum of both. On the basis of the known weights, the test supplier can improve his efficiency to increase the chance of winning in the next auction round. Our models comprise crisp numbers and fuzzy numbers. Finally, a numerical example verifies the validity of the proposed models.

  8. A data envelopment analysis based model for proposing safety improvements: a FMEA approach

    International Nuclear Information System (INIS)

    Garcia, Pauli A. de A.; Barbosa Junior, Gilberto V.; Melo, P.F. Frutuoso e

    2005-01-01

    When performing a probabilistic safety assessment, one important step is the identification of the critical or weak points of all systems to be considered. By properly ranking these critical points, improvement recommendations may be proposed in order to reduce the associated risks. Many tools are available for the identification of critical points, like Failure Mode and Effect Analysis (FMEA) and Hazard and Operability Studies (HAZOP). Once the failure modes or deviations are identified, indices associated with the occurrence probabilities, detection potential, and effect severity are assigned to them, and the ranking of failure modes or deviations is then performed. It is common practice to assign risk priority numbers for this purpose. These numbers are obtained by multiplying the three aforementioned indices, which typically vary from 1 to 10 (natural numbers). Here, the greater the index, the worse the situation. In this paper, a data envelopment analysis (DEA) based model is used to identify the most critical failure modes or deviations and, by means of their respective distances to the boundary, to assess the improvement percentage for each index of each failure mode or deviation. Starting from this identification procedure, the decision maker can more efficiently propose improvement actions, like reliability allocation, detection design, protective barriers, etc. (author)
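
    The conventional risk priority number described above can be illustrated with a trivial, hypothetical example (the paper's point is precisely that a DEA-based ranking can refine this baseline):

```python
# Hypothetical failure modes with (occurrence, detection, severity) indices, each 1-10.
failure_modes = {
    "pump seal leak":     (7, 4, 8),
    "sensor drift":       (5, 6, 4),
    "valve stuck closed": (3, 5, 9),
}
# RPN = occurrence x detection x severity; higher RPN = more critical failure mode.
rpn = {name: o * d * s for name, (o, d, s) in failure_modes.items()}
for name, value in sorted(rpn.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:>20}: RPN = {value}")
```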

  9. How efficient are referral hospitals in Uganda? A data envelopment analysis and tobit regression approach.

    Science.gov (United States)

    Mujasi, Paschal N; Asbu, Eyob Z; Puig-Junoy, Jaume

    2016-07-08

    Hospitals represent a significant proportion of health expenditures in Uganda, accounting for about 26% of total health expenditure. Improving the technical efficiency of hospitals in Uganda can result in large savings which can be devoted to expanding access to services and improving quality of care. This paper explores the technical efficiency of referral hospitals in Uganda during the 2012/2013 financial year. This was a cross-sectional study using secondary data. Input and output data were obtained from the Uganda Ministry of Health annual health sector performance report for the period July 1, 2012 to June 30, 2013 for the 14 public sector regional referral and 4 large private not-for-profit hospitals. We assumed an output-oriented model with variable returns to scale to estimate the efficiency score for each hospital using Data Envelopment Analysis (DEA) with STATA13. Using a Tobit model, the DEA efficiency scores were regressed against selected institutional and contextual/environmental factors to estimate their impacts on efficiency. The average variable returns to scale (pure) technical efficiency score was 91.4%, the average scale efficiency score was 87.1%, and the average constant returns to scale technical efficiency score was 79.4%. Technically inefficient hospitals could have become more efficient by increasing outpatient department visits by 45,943 and inpatient days by 31,425 without changing the total number of inputs. Alternatively, they would achieve efficiency by, for example, transferring the excess 216 medical staff and 454 beds to other levels of the health system without changing the total number of outputs. Tobit regression indicates that significant factors in explaining hospital efficiency are: hospital size (p Uganda.

  10. Energy efficiency and policy in Swedish pulp and paper mills: A data envelopment analysis approach

    International Nuclear Information System (INIS)

    Blomberg, Jerry; Henriksson, Eva; Lundmark, Robert

    2012-01-01

    The paper provides an empirical assessment of the electricity efficiency improvement potential in the Swedish pulp and paper industry by employing data envelopment analysis (DEA) and mill-specific input and output data for the years 1995, 2000 and 2005. The empirical results are discussed in relation to the reported outcomes of the Swedish voluntary energy efficiency programme PFE. The estimated electricity efficiency gap is relatively stable over the time period; it equals roughly 1 TWh per year for the sample mills and this is three times higher than the corresponding self-reported electricity savings in PFE. This result is largely a reflection of the fact that in the pulp and paper industry electricity efficiency improvements are typically embodied in the diffusion of new capital equipment, and there is a risk that some of the reported measures in PFE simply constitute an inefficient speed-up of capital turnover. The above does not preclude, though, that many other measures in PFE may have addressed some relevant market failures and barriers in the energy efficiency market. Overall the analysis suggests that future energy efficiency programs could plausibly be better targeted at explicitly promoting technological progress as well as at addressing the most important information and behaviour-related failures. - Highlights: ► We provide an empirical assessment of the electricity efficiency improvement potential in the Swedish pulp and paper industry. ► The empirical results are discussed in relation to the reported outcomes of the Swedish voluntary energy efficiency programme PFE. ► The estimated electricity efficiency gap is relatively stable over the time period and equals roughly 1 TWh for the sample mills (three times higher than the corresponding self-reported electricity savings in PFE). ► The results suggest that future energy efficiency programs could be better targeted at explicitly promoting technological progress as well as at addressing the

  11. A thermal-optical analysis comparison between symmetric tubular absorber compound parabolic concentrating solar collector with and without envelope

    International Nuclear Information System (INIS)

    Tchinda, R.

    2005-11-01

    Equations describing the heat transfer in symmetric compound parabolic concentrating solar collectors (CPCs) with and without an envelope have been established. The model takes into account the nonlinear behavior of these two systems. A theoretical numerical model has been developed to outline the effect of the envelope on the thermal and optical performance of CPCs. The effects of the flow rate, the plate length, the selective coating, etc. are studied. The overall thermal loss coefficient and the enclosure absorption factor for both types are defined. It is found that the more efficient configuration is the one with an envelope. Theoretically computed values are in good agreement with the experimental values published in the literature. (author)

  12. Review of life-cycle approaches coupled with data envelopment analysis: launching the CFP + DEA method for energy policy making.

    Science.gov (United States)

    Vázquez-Rowe, Ian; Iribarren, Diego

    2015-01-01

    Life-cycle (LC) approaches play a significant role in energy policy making to determine the environmental impacts associated with the choice of energy source. Data envelopment analysis (DEA) can be combined with LC approaches to provide quantitative benchmarks that orientate the performance of energy systems towards environmental sustainability, with different implications depending on the selected LC + DEA method. The present paper examines currently available LC + DEA methods and develops a novel method combining carbon footprinting (CFP) and DEA. Thus, the CFP + DEA method is proposed, a five-step structure including data collection for multiple homogeneous entities, calculation of target operating points, evaluation of current and target carbon footprints, and result interpretation. As the current context for energy policy implies an anthropocentric perspective with focus on the global warming impact of energy systems, the CFP + DEA method is foreseen to be the most consistent LC + DEA approach to provide benchmarks for energy policy making. The fact that this method relies on the definition of operating points with optimised resource intensity helps to moderate the concerns about the omission of other environmental impacts. Moreover, the CFP + DEA method benefits from CFP specifications in terms of flexibility, understanding, and reporting.

  13. Measuring and Benchmarking Technical Efficiency of Public Hospitals in Tianjin, China: A Bootstrap-Data Envelopment Analysis Approach.

    Science.gov (United States)

    Li, Hao; Dong, Siping

    2015-01-01

    China has long been stuck in applying traditional data envelopment analysis (DEA) models to measure the technical efficiency of public hospitals without bias correction of efficiency scores. In this article, we have introduced the Bootstrap-DEA approach from the international literature to analyze the technical efficiency of public hospitals in Tianjin (China) and tried to improve the application of this method for benchmarking and inter-organizational learning. It is found that the bias-corrected efficiency scores of Bootstrap-DEA differ significantly from those of the traditional Banker, Charnes, and Cooper (BCC) model, which means that Chinese researchers need to update their DEA models for more scientific calculation of hospital efficiency scores. Our research has helped shorten the gap between China and the international community in the relative efficiency measurement and improvement of hospitals. It is suggested that Bootstrap-DEA be widely applied in future research to measure the relative efficiency and productivity of Chinese hospitals so as to better serve efficiency improvement and related decision making. © The Author(s) 2015.
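
    The bias-correction step at the heart of Bootstrap-DEA can be sketched generically as follows; the scores and the (very few) bootstrap replicates below are invented, and a real application would generate thousands of replicates by re-running DEA on resampled data, e.g. with the Simar-Wilson smoothed bootstrap.

```python
# Illustrative bootstrap bias correction of DEA scores (hypothetical numbers).
import numpy as np

theta_hat = np.array([0.92, 1.00, 0.85, 0.78])   # original DEA scores for 4 hospitals (invented)
theta_star = np.array([                           # stand-in bootstrap replicates (B = 3 here;
    [0.95, 1.00, 0.88, 0.80],                     # in practice B ~ 2000 and each row comes from
    [0.96, 0.99, 0.90, 0.83],                     # re-running DEA on a resampled reference set)
    [0.94, 1.00, 0.87, 0.81],
])

bias = theta_star.mean(axis=0) - theta_hat        # estimated bias of the DEA estimator
theta_bc = theta_hat - bias                       # bias-corrected scores = 2*theta_hat - mean(theta_star)
print("estimated bias:       ", bias.round(3))
print("bias-corrected scores:", theta_bc.round(3))
```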

  14. Interaction and inhibition of dengue envelope glycoprotein with mammalian receptor DC-sign, an in-silico approach.

    Directory of Open Access Journals (Sweden)

    Masaud Shah

    Full Text Available Membrane fusion is the central molecular event during the entry of enveloped viruses into cells. The critical agents of this process are viral surface proteins, primed to facilitate cell bilayer fusion. The important role of dendritic-cell-specific ICAM3-grabbing non-integrin (DC-SIGN) in dengue virus transmission makes it an attractive target for interfering with dengue virus propagation. Receptor-mediated endocytosis allows the entry of virions owing to the presence of endosomal membranes and low-pH-induced fusion of the virus. DC-SIGN is the best characterized molecule among the candidate protein receptors and is able to mediate infection with the four serotypes of dengue virus (DENV). Unrestrained pairwise docking was used to study the interaction of the dengue envelope protein with DC-SIGN and the monoclonal antibody 2G12. The PDB coordinates of the dengue envelope glycoprotein and the other candidate proteins were pre-processed and energy minimized with the AMBER99 force field distributed in the MOE software. The protein-protein interaction server ZDOCK was used to find molecular interactions among the candidate proteins. Based on these interactions, it was found that the antibody successfully blocks the glycosylation site ASN 67 and other conserved residues present at the DC-SIGN-Den-E complex interface. In order to know the exact location of the antibody on the envelope protein for certain, co-crystallization of the envelope protein with these compounds is needed so that their exact docking locations can be identified with respect to our results.

  15. Pleistocene climate, phylogeny, and climate envelope models: an integrative approach to better understand species' response to climate change.

    Directory of Open Access Journals (Sweden)

    A Michelle Lawing

    Full Text Available Mean annual temperature is projected by the Intergovernmental Panel on Climate Change to increase by at least 1.1°C, and by as much as 6.4°C, over the next 90 years. In context, a change in climate of 6°C is approximately the difference between the mean annual temperature of the Last Glacial Maximum (LGM) and our current warm interglacial. Species have been responding to changing climate throughout Earth's history, and their previous biological responses can inform our expectations for future climate change. Here we synthesize geological evidence in the form of stable oxygen isotopes, general circulation paleoclimate models, species' evolutionary relatedness, and species' geographic distributions. We use the stable oxygen isotope record to develop a series of temporally high-resolution paleoclimate reconstructions spanning the Middle Pleistocene to Recent, which we use to map ancestral climatic envelope reconstructions for North American rattlesnakes. A simple linear interpolation between current climate and a general circulation paleoclimate model of the LGM, using stable oxygen isotope ratios, provides good estimates of paleoclimate at other time periods. We use geologically informed rates of change derived from these reconstructions to predict magnitudes and rates of change in species' suitable habitat over the next century. Our approach to modeling the past suitable habitat of species is general and can be adopted by others. We use multiple lines of evidence of past climate (isotopes and climate models), phylogenetic topology (to correct the models for long-term changes in the suitable habitat of a species), and the fossil record, however sparse, to cross-check the models. Our models indicate the annual rate of displacement in a clade of rattlesnakes over the next century will be 2 to 3 orders of magnitude greater (430-2,420 m/yr) than it has been on average for the past 320 ky (2.3 m/yr).
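
    A hedged sketch of the interpolation idea described above, with invented values rather than the study's data: the oxygen-isotope record is rescaled so that the present maps to 0 and the LGM to 1, and each past time slice's climate grid is a linear blend of the modern and LGM general-circulation-model grids.

```python
# Illustrative isotope-scaled interpolation of paleoclimate grids (hypothetical values).
import numpy as np

# hypothetical benthic delta-18O values (per mil): higher = colder / more ice volume
d18o_present, d18o_lgm = 3.2, 5.0
d18o_record = {0: 3.2, 21: 5.0, 120: 3.3, 140: 4.9, 320: 4.6}   # key = kyr before present

# hypothetical 2x2 mean-annual-temperature grids (deg C)
temp_present = np.array([[22.0, 18.0], [14.0, 9.0]])
temp_lgm     = np.array([[16.0, 12.0], [7.0, 2.0]])

def paleo_temperature(kyr):
    w = (d18o_record[kyr] - d18o_present) / (d18o_lgm - d18o_present)   # 0 = modern, 1 = LGM
    return temp_present + w * (temp_lgm - temp_present)

for kyr in sorted(d18o_record):
    print(f"{kyr:>3} ka:\n{paleo_temperature(kyr).round(1)}")
```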

  16. Efficiency of international cooperation schemata in African countries: A comparative analysis using a data envelopment analysis approach

    Directory of Open Access Journals (Sweden)

    Victor Martin-Perez

    2017-02-01

    Full Text Available Background: Efficiency measurement by means of data envelopment analysis (DEA) in the non-profit sector has focused on the so-called Stage I of non-profit organisations, namely, fundraising efforts (which are the most influential determinants of raising funds in order to increase the amount of contributions). However, for the so-called Stage II of non-profit organisations, namely, spending the achieved resources on programme service delivery, DEA studies are very scarce. In attempting to address this research gap, and to the best of our knowledge, this investigation is the first study that applies DEA to the assessment of international cooperation schemata. Consequently, we offer a significant contribution to the literature by overcoming the limitations of other techniques used to assess efficiency and by providing new insight into the efficiency of different targeted international cooperation schemata (ICS) in international development cooperation projects. Aim: The purpose of this study is to evaluate and compare the efficiency of the ICS of development projects funded by the Spanish Agency for International Cooperation for Development. Setting: Our setting is composed of different international cooperation projects funded with different schemata by the Spanish Agency for International Cooperation for Development between 2002 and 2006 in two African countries that are top-priority targets of Spanish international aid: Morocco and Mozambique. Methods: Using a sample of 48 international cooperation projects carried out in two African countries considered priorities in the Spanish Cooperation Master Plan, we analyse project efficiency using DEA. Results: The findings suggest that some schemata are more efficient than others when applied to international cooperation projects. Specifically, we find that permanent open-call subsidies are more efficient than non-governmental development organisation subsidies. Conclusion: Measures for evaluating

  17. Radiative properties of stellar envelopes: Comparison of asteroseismic results to opacity calculations and measurements for iron and nickel

    International Nuclear Information System (INIS)

    Turck-Chieze, S.; Gilles, D.; Le Pennec, M.; Blenski, T.; Thais, F.; Bastiani-Ceccotti, S.; Blancard, C.; Caillaud, T.; Cosse, P.; Faussurier, G.; Gilleron, F.; Pain, J.C.; Reverdin, C.; Silvert, V.; Villette, B.; Busquet, M.; Colgan, J.; Guzik, J.; Kilcrease, D.P.; Magee, N.H.; Delahaye, F.; Zeippen, C.J.; Ducreta, J.E.; Fontes, C.J.; Harris, J.W.; Loisel, G.

    2013-01-01

    The international OPAC consortium consists of astrophysicists, plasma physicists and experimentalists who examine opacity calculations used in stellar physics that appear questionable and perform new calculations and laser experiments to understand the differences and improve the calculations. We report on iron and nickel opacities for envelopes of stars from 2 to 14 solar masses and deliver our first conclusions concerning the reliability of the calculations used, illustrating the importance of configuration interaction and of the completeness of the calculations for temperatures around 15-27 eV. (authors)

  18. A note on “A new approach for the selection of advanced manufacturing technologies: Data envelopment analysis with double frontiers”

    Directory of Open Access Journals (Sweden)

    Hossein Azizi

    2015-08-01

    Full Text Available Recently, using the data envelopment analysis (DEA) with double frontiers approach, Wang and Chin (2009) proposed a new approach for the selection of advanced manufacturing technologies (AMTs), together with a new measure for selecting the best AMT. In this note, we show that their proposed overall performance measure for the selection of the best AMT carries an additional computational burden. Moreover, we propose a new measure for developing a complete ranking of AMTs. Numerical examples are examined using the proposed measure to show its simplicity and usefulness in AMT selection and justification.

  19. A high-throughput approach to identify compounds that impair envelope integrity in Escherichia coli

    DEFF Research Database (Denmark)

    Baker, Kristin Renee; Jana, Bimal; Franzyk, Henrik

    2016-01-01

    - to 125-fold) the MICs of erythromycin, fusidic acid, novobiocin and rifampin and displayed synergy (fractional inhibitory concentration index, antibiotics by checkerboard assays in two genetically distinct E. coli strains, including the high-risk multidrug-resistant, CTX-M-15-producing...... the discovery of antimicrobial helper drug candidates and targets that enhance the delivery of existing antibiotics by impairing envelope integrity in Gram-negative bacteria....

  20. H2-O2 fuel cell and advanced battery power systems for autonomous underwater vehicles: performance envelope comparisons

    International Nuclear Information System (INIS)

    Schubak, G.E.; Scott, D.S.

    1993-01-01

    Autonomous underwater vehicles have traditionally been powered by low energy density lead-acid batteries. Recently, advanced battery technologies and H 2 -O 2 fuel cells have become available, offering significant improvements in performance. This paper compares the solid polymer fuel cell to the lithium-thionyl chloride primary battery, sodium-sulfur battery, and lead acid battery for a variety of missions. The power system performance is simulated using computer modelling techniques. Performance envelopes are constructed, indicating domains of preference for competing power system technologies. For most mission scenarios, the solid polymer fuel cell using liquid reactant storage is the preferred system. Nevertheless, the advanced battery systems are competitive with the fuel cell systems using gaseous hydrogen storage, and they illustrate preferred performance for missions requiring high power density. 11 figs., 4 tabs., 15 refs

  1. Prospective national and regional environmental performance: Boundary estimations using a combined data envelopment - stochastic frontier analysis approach

    International Nuclear Information System (INIS)

    Vaninsky, Alexander

    2010-01-01

    The environmental performance of regions and of the largest economies of the world - actually, the efficiency of their energy sectors - is estimated for the period 2010-2030 using forecasted values of the main economic indicators. Two essentially different methodologies, data envelopment analysis and stochastic frontier analysis, are used to obtain upper and lower boundaries of the environmental efficiency index. Greenhouse gas emission per unit of area is used as the resulting indicator, with GDP, energy consumption, and population forming the background for comparable estimations. The dynamics of the upper and lower boundaries and their average are analyzed. Regions and national economies with low levels or negative dynamics of environmental efficiency are identified.

  2. The use of time-of-flight camera for navigating robots in computer-aided surgery: monitoring the soft tissue envelope of minimally invasive hip approach in a cadaver study.

    Science.gov (United States)

    Putzer, David; Klug, Sebastian; Moctezuma, Jose Luis; Nogler, Michael

    2014-12-01

    Time-of-flight (TOF) cameras can guide surgical robots or provide soft tissue information for augmented reality in the medical field. In this study, a method to automatically track the soft tissue envelope of a minimally invasive hip approach in a cadaver study is described. An algorithm for the TOF camera was developed and 30 measurements on 8 surgical situs (direct anterior approach) were carried out. The results were compared to a manual measurement of the soft tissue envelope. The TOF camera showed an overall recognition rate of the soft tissue envelope of 75%. On comparing the results from the algorithm with the manual measurements, a significant difference was found (P > .005). In this preliminary study, we have presented a method for automatically recognizing the soft tissue envelope of the surgical field in a real-time application. Further improvements could result in a robotic navigation device for minimally invasive hip surgery. © The Author(s) 2014.

  3. Robot Trajectories Comparison: A Statistical Approach

    Directory of Open Access Journals (Sweden)

    A. Ansuategui

    2014-01-01

    Full Text Available The task of planning a collision-free trajectory from a start to a goal position is fundamental for an autonomous mobile robot. Although path planning has been extensively investigated since the beginning of robotics, there is no agreement on how to measure the performance of a motion algorithm. This paper presents a new approach to perform robot trajectories comparison that could be applied to any kind of trajectories and in both simulated and real environments. Given an initial set of features, it automatically selects the most significant ones and performs a statistical comparison using them. Additionally, a graphical data visualization named polygraph which helps to better understand the obtained results is provided. The proposed method has been applied, as an example, to compare two different motion planners, FM2 and WaveFront, using different environments, robots, and local planners.

  4. Robot Trajectories Comparison: A Statistical Approach

    Science.gov (United States)

    Ansuategui, A.; Arruti, A.; Susperregi, L.; Yurramendi, Y.; Jauregi, E.; Lazkano, E.; Sierra, B.

    2014-01-01

    The task of planning a collision-free trajectory from a start to a goal position is fundamental for an autonomous mobile robot. Although path planning has been extensively investigated since the beginning of robotics, there is no agreement on how to measure the performance of a motion algorithm. This paper presents a new approach to perform robot trajectories comparison that could be applied to any kind of trajectories and in both simulated and real environments. Given an initial set of features, it automatically selects the most significant ones and performs a statistical comparison using them. Additionally, a graphical data visualization named polygraph which helps to better understand the obtained results is provided. The proposed method has been applied, as an example, to compare two different motion planners, FM2 and WaveFront, using different environments, robots, and local planners. PMID:25525618
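
    As a simplified illustration of a feature-based statistical comparison of planners (not the authors' feature-selection pipeline), the sketch below extracts a single trajectory feature, path length, for two hypothetical planners and compares the samples with a Mann-Whitney U test; the data, the feature and the choice of test are assumptions for illustration.

```python
# Simplified trajectory-comparison sketch: compare one feature (path length)
# between two planners with a nonparametric test. Data and test are illustrative.
import numpy as np
from scipy.stats import mannwhitneyu

def path_length(traj):
    """Total Euclidean length of a polyline given as an (N, 2) array."""
    return np.sum(np.linalg.norm(np.diff(traj, axis=0), axis=1))

rng = np.random.default_rng(0)
# Hypothetical recorded trajectories for two planners (e.g. FM2 vs WaveFront).
planner_a = [np.cumsum(rng.normal(1.0, 0.1, size=(50, 2)), axis=0) for _ in range(20)]
planner_b = [np.cumsum(rng.normal(1.1, 0.2, size=(50, 2)), axis=0) for _ in range(20)]

lengths_a = [path_length(t) for t in planner_a]
lengths_b = [path_length(t) for t in planner_b]
stat, p = mannwhitneyu(lengths_a, lengths_b, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.4f}")
```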

  5. Uncertain data envelopment analysis

    CERN Document Server

    Wen, Meilin

    2014-01-01

    This book is intended to present the milestones in the progression of uncertain data envelopment analysis (DEA). Chapter 1 gives a basic introduction to uncertain theories, including probability theory, credibility theory, uncertainty theory and chance theory. Chapter 2 presents a comprehensive review and discussion of basic DEA models. The stochastic DEA is introduced in Chapter 3, in which the inputs and outputs are assumed to be random variables. To obtain the probability distribution of a random variable, many samples are needed to apply the statistical inference approach. Chapter 4

  6. Stochastic Frontier Approach and Data Envelopment Analysis to Total Factor Productivity and Efficiency Measurement of Bangladeshi Rice

    Science.gov (United States)

    Hossain, Md. Kamrul; Kamil, Anton Abdulbasah; Baten, Md. Azizul; Mustafa, Adli

    2012-01-01

    The objective of this paper is to apply the Translog Stochastic Frontier production model (SFA) and Data Envelopment Analysis (DEA) to estimate efficiencies over time and the Total Factor Productivity (TFP) growth rate for Bangladeshi rice crops (Aus, Aman and Boro) throughout the most recent data available comprising the period 1989–2008. Results indicate that technical efficiency was observed as higher for Boro among the three types of rice, but the overall technical efficiency of rice production was found around 50%. Although positive changes exist in TFP for the sample analyzed, the average growth rate of TFP for rice production was estimated at almost the same levels for both Translog SFA with half normal distribution and DEA. Estimated TFP from SFA is forecasted with ARIMA (2, 0, 0) model. ARIMA (1, 0, 0) model is used to forecast TFP of Aman from DEA estimation. PMID:23077500
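
    As a brief sketch of the forecasting step mentioned above, the snippet fits an ARIMA(2, 0, 0) model, the order reported for the SFA-based TFP series, to a synthetic stand-in series using statsmodels; the data are invented and only the model order follows the abstract.

```python
# Forecasting a hypothetical TFP series with ARIMA(2, 0, 0), as reported in the
# abstract; the series itself is synthetic, not the Bangladeshi rice data.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
tfp = 1.0 + np.cumsum(rng.normal(0.005, 0.02, size=20))  # 1989-2008 stand-in

model = ARIMA(tfp, order=(2, 0, 0))
fitted = model.fit()
print(fitted.forecast(steps=5))   # five-step-ahead TFP forecast
```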

  7. Safeguards Envelope Progress FY08

    International Nuclear Information System (INIS)

    Bean, Robert; Metcalf, Richard; Bevill, Aaron

    2008-01-01

    The Safeguards Envelope Project met its milestones by creating a rudimentary safeguards envelope, proving the value of the approach on a small scale, and determining the most appropriate path forward. The Idaho Chemical Processing Plant's large cache of reprocessing process monitoring data, dubbed UBER Data, was recovered and used in the analysis. A probabilistic Z test was used on a Markov Monte Carlo simulation of expected diversion data when compared with normal operating data. The data regarding a fully transient event in a tank was used to create a simple requirement, representative of a safeguards envelope, whose impact was a decrease in operating efficiency by 1.3% but an increase in material balance period of 26%. This approach is operator, state, and international safeguards friendly and should be applied to future reprocessing plants. Future requirements include tank-to-tank correlations in reprocessing facilities, detailed operations impact studies, simulation inclusion, automated optimization, advanced statistics analysis, and multi-attribute utility analysis
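
    The report describes a probabilistic Z test comparing simulated diversion data with normal operating data. A generic two-sample Z test on synthetic process-monitoring readings is sketched below as an illustration of that kind of comparison; the tank-level values and effect size are hypothetical and this is not the project's own analysis code.

```python
# Generic two-sample Z test sketch on synthetic process-monitoring data,
# illustrating the kind of comparison described (not the project's own code).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
normal_ops = rng.normal(100.0, 2.0, size=500)   # hypothetical tank-level readings
diversion = rng.normal(99.2, 2.0, size=500)     # hypothetical diversion scenario

z = (normal_ops.mean() - diversion.mean()) / np.sqrt(
    normal_ops.var(ddof=1) / normal_ops.size + diversion.var(ddof=1) / diversion.size)
p_value = 2 * norm.sf(abs(z))
print(f"z = {z:.2f}, p = {p_value:.3g}")
```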

  8. Financial performance monitoring of the technical efficiency of critical access hospitals: a data envelopment analysis and logistic regression modeling approach.

    Science.gov (United States)

    Wilson, Asa B; Kerr, Bernard J; Bastian, Nathaniel D; Fulton, Lawrence V

    2012-01-01

    From 1980 to 1999, rural designated hospitals closed at a disproportionally high rate. In response to this emergent threat to healthcare access in rural settings, the Balanced Budget Act of 1997 made provisions for the creation of a new rural hospital--the critical access hospital (CAH). The conversion to CAH and the associated cost-based reimbursement scheme significantly slowed the closure rate of rural hospitals. This work investigates which methods can ensure the long-term viability of small hospitals. This article uses a two-step design to focus on a hypothesized relationship between technical efficiency of CAHs and a recently developed set of financial monitors for these entities. The goal is to identify the financial performance measures associated with efficiency. The first step uses data envelopment analysis (DEA) to differentiate efficient from inefficient facilities within a data set of 183 CAHs. Determining DEA efficiency is an a priori categorization of hospitals in the data set as efficient or inefficient. In the second step, DEA efficiency is the categorical dependent variable (efficient = 0, inefficient = 1) in the subsequent binary logistic regression (LR) model. A set of six financial monitors selected from the array of 20 measures were the LR independent variables. We use a binary LR to test the null hypothesis that recently developed CAH financial indicators had no predictive value for categorizing a CAH as efficient or inefficient, (i.e., there is no relationship between DEA efficiency and fiscal performance).
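
    The second step of the two-step design described above regresses the DEA efficiency category on financial monitors. A minimal sketch of that step with scikit-learn is shown below, using synthetic data in place of the 183 CAHs and six hypothetical indicator columns; the DEA scores themselves are assumed to have been computed already.

```python
# Step 2 of the two-step design: binary logistic regression of DEA efficiency
# labels (efficient = 0, inefficient = 1) on financial monitors.
# Synthetic data; the six real CAH indicators are not reproduced here.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 183
financial_monitors = rng.normal(size=(n, 6))            # hypothetical indicators
dea_inefficient = (financial_monitors[:, 0] + rng.normal(scale=0.5, size=n) < 0).astype(int)

clf = LogisticRegression(max_iter=1000).fit(financial_monitors, dea_inefficient)
print("coefficients:", np.round(clf.coef_, 2))
print("accuracy:", clf.score(financial_monitors, dea_inefficient))
```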

  9. Efficiency assessment of wastewater treatment plants: A data envelopment analysis approach integrating technical, economic, and environmental issues.

    Science.gov (United States)

    Castellet, Lledó; Molinos-Senante, María

    2016-02-01

    The assessment of the efficiency of wastewater treatment plants (WWTPs) is essential to compare their performance and consequently to identify the best operational practices that can contribute to the reduction of operational costs. Previous studies have evaluated the efficiency of WWTPs using conventional data envelopment analysis (DEA) models. Most of these studies have considered the operational costs of the WWTPs as inputs, while the pollutants removed from wastewater are treated as outputs. However, they have ignored the fact that each pollutant removed by a WWTP involves a different environmental impact. To overcome this limitation, this paper evaluates for the first time the efficiency of a sample of WWTPs by applying the weighted slacks-based measure model. It is a non-radial DEA model which allows assigning weights to the inputs and outputs according to their importance. Thus, the assessment carried out integrates environmental issues with the traditional "techno-economic" efficiency assessment of WWTPs. Moreover, the potential economic savings for each cost item have been quantified at a plant level. It is illustrated that the WWTPs analyzed have significant room to save staff and energy costs. Several managerial implications to help WWTPs' operators make informed decisions were drawn from the methodology and empirical application carried out. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Enveloping Aerodynamic Decelerator

    Science.gov (United States)

    Nock, Kerry T. (Inventor); Aaron, Kim M. (Inventor); McRonald, Angus D. (Inventor); Gates, Kristin L. (Inventor)

    2018-01-01

    An inflatable aerodynamic deceleration method and system is provided for use with an atmospheric entry payload. The inflatable aerodynamic decelerator includes an inflatable envelope and an inflatant, wherein the inflatant is configured to fill the inflatable envelope to an inflated state such that the inflatable envelope surrounds the atmospheric entry payload, causing aerodynamic forces to decelerate the atmospheric entry payload.

  11. Nucleotide and deduced amino acid sequence of the envelope gene of the Vasilchenko strain of TBE virus; comparison with other flaviviruses.

    Science.gov (United States)

    Gritsun, T S; Frolova, T V; Pogodina, V V; Lashkevich, V A; Venugopal, K; Gould, E A

    1993-02-01

    A strain of tick-borne encephalitis virus known as Vasilchenko (Vs) exhibits relatively low virulence characteristics in monkeys, Syrian hamsters and humans. The gene encoding the envelope glycoprotein of this virus was cloned and sequenced. Alignment of the sequence with those of other known tick-borne flaviviruses and identification of the recognised amino acid genetic marker EHLPTA confirmed its identity as a member of the TBE complex. However, Vs virus was distinguishable from eastern and western tick-borne serotypes by the presence of the sequence AQQ at amino acid positions 232-234 and also by the presence of other specific amino acid substitutions which may be genetic markers for these viruses and could determine their pathogenetic characteristics. When compared with other tick-borne flaviviruses, Vs virus had 12 unique amino acid substitutions including an additional potential glycosylation site at position (315-317). The Vs virus strain shared closest nucleotide and amino acid homology (84.5% and 95.5% respectively) with western and far eastern strains of tick-borne encephalitis virus. Comparison with the far eastern serotype of tick-borne encephalitis virus, by cross-immunoelectrophoresis of Vs virions and PAGE analysis of the extracted virion proteins, revealed differences in surface charge and virus stability that may account for the different virulence characteristics of Vs virus. These results support and enlarge upon previous data obtained from molecular and serological analysis.
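
    The homology percentages quoted above come from pairwise sequence comparison. As a minimal, generic illustration (not the alignment pipeline used in the study), the function below computes percent identity between two pre-aligned sequences of equal length; the toy strings are illustrative only.

```python
# Percent identity between two pre-aligned sequences of equal length.
# Toy sequences only -- not the Vs envelope gene.
def percent_identity(seq1, seq2):
    if len(seq1) != len(seq2):
        raise ValueError("sequences must be aligned to the same length")
    matches = sum(a == b for a, b in zip(seq1, seq2))
    return 100.0 * matches / len(seq1)

print(percent_identity("EHLPTAAQQ", "EHLPTASQE"))  # ~77.8
```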

  12. The use of resource allocation approach for hospitals based on the initial efficiency by using data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Nahid Yazdian Hossein Abadi

    2017-10-01

    Full Text Available Introduction: Resource allocation is very important in today's highly competitive environment to enhance quality and reduce costs due to limited resources and unlimited needs of the society. The aim of this study was to implement resource allocation in order to improve the efficiency of hospitals. Method: This is a mixed method study. The data used in this paper are secondary data related to 30 large acute and general hospitals in the US. Beds, service mix, full-time equivalents (FTE), and operational expenses are input indicators in hospitals, and adjusted admissions and outpatient visits are output indicators. Using a goal programming (GP) model and a data envelopment analysis (DEA) model with common weights, we suggest three scenarios for resource allocation and budget allocation: "resource allocation based on efficiency", "budget allocation based on efficiency" and "two-stage allocation of budget". The first scenario was used for allocating resources and the second and third ones for allocating budget to decision making units (DMUs). The data were analyzed by LINGO software. Results: Before the allocation, four hospitals were efficient and the efficiency of six hospitals was less than 50%, but after allocation, 14 hospitals were efficient in the first case of the first scenario, 11 hospitals in the second case of the first scenario, 24 hospitals in the second scenario and 17 hospitals in the third scenario; it is an important point that after the allocation the efficiency of all hospitals increased. Conclusion: This study can be useful for hospital administrators; it can help them to allocate their resources and budget and increase the efficiency of their hospitals.

  13. Applying data envelopment analysis approach to improve energy efficiency and reduce GHG (greenhouse gas) emission of wheat production

    International Nuclear Information System (INIS)

    Khoshnevisan, Benyamin; Rafiee, Shahin; Omid, Mahmoud; Mousazadeh, Hossein

    2013-01-01

    In this study, DEA (data envelopment analysis) was applied to analyze the energy efficiency of wheat farms in order to separate efficient and inefficient growers and to calculate the wasteful uses of energy. Additionally, the degrees of TE (technical efficiency), PTE (pure technical efficiency) and SE (scale efficiency) were determined. Furthermore, the effect of energy optimization on GHG (greenhouse gas) emission was investigated and the total amount of GHG emission of efficient farms was compared with inefficient ones. Based on the results it was revealed that 18% of producers were technically efficient and the average of TE was calculated as 0.82. Based on the BCC (Banker–Charnes–Cooper) model 154 growers (59%) were identified efficient and the mean PTE of these farmers was found to be 0.99. Also, it was concluded that 2075.8 MJ ha^-1 of energy inputs can be saved if the performance of inefficient farms rises to a high level. Additionally, it was observed that the total GHG emission from efficient and inefficient producers was 2713.3 and 2740.8 kg CO2eq ha^-1, respectively. By energy optimization the total GHG emission can be reduced to the value of 2684.29 kg CO2eq ha^-1. - Highlights: • 18% of producers were technically efficient and the average of TE was 0.82. • An average 2075.8 MJ ha^-1 from energy input could be saved without reducing the yield. • GHG emission of efficient and inefficient producers was 2713.3 and 2740.8 kg CO2eq ha^-1. • Total GHG emission can be reduced to the value of 2684.29 kg CO2eq ha^-1
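
    The scale efficiency reported above is conventionally the ratio of the constant-returns (CCR) technical efficiency to the variable-returns (BCC) pure technical efficiency. A short sketch of that relation with hypothetical scores, not the wheat-farm results, follows.

```python
# Scale efficiency as the ratio of CRS technical efficiency to VRS (pure)
# technical efficiency; scores below are hypothetical, not the wheat-farm data.
te_crs = [0.82, 0.95, 0.70]   # CCR (constant returns to scale) scores
pte_vrs = [0.99, 0.97, 0.88]  # BCC (variable returns to scale) scores

scale_eff = [crs / vrs for crs, vrs in zip(te_crs, pte_vrs)]
print([round(se, 3) for se in scale_eff])
```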

  14. Measurement of Low Carbon Economy Efficiency with a Three-Stage Data Envelopment Analysis: A Comparison of the Largest Twenty CO2 Emitting Countries

    Directory of Open Access Journals (Sweden)

    Xiang Liu

    2016-11-01

    Full Text Available This paper employs a three-stage approach to estimate low carbon economy efficiency in the largest twenty CO2 emitting countries from 2000 to 2012. The approach includes the following three stages: (1) use of a data envelopment analysis (DEA) model with undesirable output to estimate the low carbon economy efficiency and calculate the input and output slacks; (2) use of a stochastic frontier approach to eliminate the impacts of external environment variables on these slacks; (3) re-estimation of the efficiency with adjusted inputs and outputs to reflect the capacity of the government to develop a low carbon economy. The results indicate that the low carbon economy efficiency performances in these countries had worsened during the studied period. The performances in the third stage are larger than that in the first stage. Moreover, in general, low carbon economy efficiency in Annex I countries of the United Nations Framework Convention on Climate Change (UNFCCC) is better than that in Non-Annex I countries. However, the gap of the average efficiency score between Annex I and Non-Annex I countries in the first stage is smaller than that in the third stage. It implies that the external environment variables show greater influence on Non-Annex I countries than that on Annex I countries. These external environment variables should be taken into account in the transnational negotiation of the responsibility of promoting CO2 reductions. Most importantly, the developed countries (mostly in Annex I) should help the developing countries (mostly in Non-Annex I) to reduce carbon emission by opening or expanding the trade, such as encouraging the import and export of the energy-saving and sharing emission reduction technology.

  15. Measurement of Low Carbon Economy Efficiency with a Three-Stage Data Envelopment Analysis: A Comparison of the Largest Twenty CO2 Emitting Countries

    Science.gov (United States)

    Liu, Xiang; Liu, Jia

    2016-01-01

    This paper employs a three-stage approach to estimate low carbon economy efficiency in the largest twenty CO2 emitting countries from 2000 to 2012. The approach includes the following three stages: (1) use of a data envelopment analysis (DEA) model with undesirable output to estimate the low carbon economy efficiency and calculate the input and output slacks; (2) use of a stochastic frontier approach to eliminate the impacts of external environment variables on these slacks; (3) re-estimation of the efficiency with adjusted inputs and outputs to reflect the capacity of the government to develop a low carbon economy. The results indicate that the low carbon economy efficiency performances in these countries had worsened during the studied period. The performances in the third stage are larger than that in the first stage. Moreover, in general, low carbon economy efficiency in Annex I countries of the United Nations Framework Convention on Climate Change (UNFCCC) is better than that in Non-Annex I countries. However, the gap of the average efficiency score between Annex I and Non-Annex I countries in the first stage is smaller than that in the third stage. It implies that the external environment variables show greater influence on Non-Annex I countries than that on Annex I countries. These external environment variables should be taken into account in the transnational negotiation of the responsibility of promoting CO2 reductions. Most importantly, the developed countries (mostly in Annex I) should help the developing countries (mostly in Non-Annex I) to reduce carbon emission by opening or expanding the trade, such as encouraging the import and export of the energy-saving and sharing emission reduction technology. PMID:27834890

  16. Measurement of Low Carbon Economy Efficiency with a Three-Stage Data Envelopment Analysis: A Comparison of the Largest Twenty CO₂ Emitting Countries.

    Science.gov (United States)

    Liu, Xiang; Liu, Jia

    2016-11-09

    This paper employs a three-stage approach to estimate low carbon economy efficiency in the largest twenty CO₂ emitting countries from 2000 to 2012. The approach includes the following three stages: (1) use of a data envelopment analysis (DEA) model with undesirable output to estimate the low carbon economy efficiency and calculate the input and output slacks; (2) use of a stochastic frontier approach to eliminate the impacts of external environment variables on these slacks; (3) re-estimation of the efficiency with adjusted inputs and outputs to reflect the capacity of the government to develop a low carbon economy. The results indicate that the low carbon economy efficiency performances in these countries had worsened during the studied period. The performances in the third stage are larger than that in the first stage. Moreover, in general, low carbon economy efficiency in Annex I countries of the United Nations Framework Convention on Climate Change (UNFCCC) is better than that in Non-Annex I countries. However, the gap of the average efficiency score between Annex I and Non-Annex I countries in the first stage is smaller than that in the third stage. It implies that the external environment variables show greater influence on Non-Annex I countries than that on Annex I countries. These external environment variables should be taken into account in the transnational negotiation of the responsibility of promoting CO₂ reductions. Most importantly, the developed countries (mostly in Annex I) should help the developing countries (mostly in Non-Annex I) to reduce carbon emission by opening or expanding the trade, such as encouraging the import and export of the energy-saving and sharing emission reduction technology.

  17. The Hyper-Envelope Modeling Interface (HEMI): A Novel Approach Illustrated Through Predicting Tamarisk (Tamarix spp.) Habitat in the Western USA

    Science.gov (United States)

    Graham, Jim; Young, Nick; Jarnevich, Catherine S.; Newman, Greg; Evangelista, Paul; Stohlgren, Thomas J.

    2013-01-01

    Habitat suitability maps are commonly created by modeling a species’ environmental niche from occurrences and environmental characteristics. Here, we introduce the hyper-envelope modeling interface (HEMI), providing a new method for creating habitat suitability models using Bezier surfaces to model a species niche in environmental space. HEMI allows modeled surfaces to be visualized and edited in environmental space based on expert knowledge and does not require absence points for model development. The modeled surfaces require relatively few parameters compared to similar modeling approaches and may produce models that better match ecological niche theory. As a case study, we modeled the invasive species tamarisk (Tamarix spp.) in the western USA. We compare results from HEMI with those from existing similar modeling approaches (including BioClim, BioMapper, and Maxent). We used synthetic surfaces to create visualizations of the various models in environmental space and used modified area under the curve (AUC) statistic and akaike information criterion (AIC) as measures of model performance. We show that HEMI produced slightly better AUC values, except for Maxent and better AIC values overall. HEMI created a model with only ten parameters while Maxent produced a model with over 100 and BioClim used only eight. Additionally, HEMI allowed visualization and editing of the model in environmental space to develop alternative potential habitat scenarios. The use of Bezier surfaces can provide simple models that match our expectations of biological niche models and, at least in some cases, out-perform more complex approaches.

  18. The Hyper-Envelope Modeling Interface (HEMI): A Novel Approach Illustrated Through Predicting Tamarisk ( Tamarix spp.) Habitat in the Western USA

    Science.gov (United States)

    Graham, Jim; Young, Nick; Jarnevich, Catherine S.; Newman, Greg; Evangelista, Paul; Stohlgren, Thomas J.

    2013-10-01

    Habitat suitability maps are commonly created by modeling a species' environmental niche from occurrences and environmental characteristics. Here, we introduce the hyper-envelope modeling interface (HEMI), providing a new method for creating habitat suitability models using Bezier surfaces to model a species niche in environmental space. HEMI allows modeled surfaces to be visualized and edited in environmental space based on expert knowledge and does not require absence points for model development. The modeled surfaces require relatively few parameters compared to similar modeling approaches and may produce models that better match ecological niche theory. As a case study, we modeled the invasive species tamarisk ( Tamarix spp.) in the western USA. We compare results from HEMI with those from existing similar modeling approaches (including BioClim, BioMapper, and Maxent). We used synthetic surfaces to create visualizations of the various models in environmental space and used modified area under the curve (AUC) statistic and akaike information criterion (AIC) as measures of model performance. We show that HEMI produced slightly better AUC values, except for Maxent and better AIC values overall. HEMI created a model with only ten parameters while Maxent produced a model with over 100 and BioClim used only eight. Additionally, HEMI allowed visualization and editing of the model in environmental space to develop alternative potential habitat scenarios. The use of Bezier surfaces can provide simple models that match our expectations of biological niche models and, at least in some cases, out-perform more complex approaches.
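
    HEMI's niche surfaces are Bezier surfaces defined over environmental axes. As a minimal sketch of how such a surface is evaluated (standard Bernstein-polynomial evaluation, not HEMI's own implementation), the snippet below maps two normalized environmental variables to a suitability value from a small grid of hypothetical control values.

```python
# Evaluating a Bezier surface over two normalized environmental axes.
# Control values are hypothetical suitabilities; this is not HEMI's own code.
import numpy as np
from math import comb

def bernstein(n, i, t):
    return comb(n, i) * t**i * (1 - t)**(n - i)

def bezier_surface(ctrl, u, v):
    """ctrl is an (n+1, m+1) grid of control values; u, v in [0, 1]."""
    n, m = ctrl.shape[0] - 1, ctrl.shape[1] - 1
    return sum(bernstein(n, i, u) * bernstein(m, j, v) * ctrl[i, j]
               for i in range(n + 1) for j in range(m + 1))

ctrl = np.array([[0.0, 0.2, 0.1],
                 [0.3, 0.9, 0.4],
                 [0.1, 0.3, 0.0]])   # 3x3 grid -> biquadratic surface
print(bezier_surface(ctrl, u=0.5, v=0.5))  # suitability near the grid centre
```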

  19. A neuro-data envelopment analysis approach for optimization of uncorrelated multiple response problems with smaller the better type controllable factors

    Science.gov (United States)

    Bashiri, Mahdi; Farshbaf-Geranmayeh, Amir; Mogouie, Hamed

    2013-11-01

    In this paper, a new method is proposed to optimize a multi-response optimization problem based on the Taguchi method for the processes where controllable factors are the smaller-the-better (STB)-type variables and the analyzer desires to find an optimal solution with smaller amount of controllable factors. In such processes, the overall output quality of the product should be maximized while the usage of the process inputs, the controllable factors, should be minimized. Since all possible combinations of factors' levels, are not considered in the Taguchi method, the response values of the possible unpracticed treatments are estimated using the artificial neural network (ANN). The neural network is tuned by the central composite design (CCD) and the genetic algorithm (GA). Then data envelopment analysis (DEA) is applied for determining the efficiency of each treatment. Although the important issue for implementation of DEA is its philosophy, which is maximization of outputs versus minimization of inputs, this important issue has been neglected in previous similar studies in multi-response problems. Finally, the most efficient treatment is determined using the maximin weight model approach. The performance of the proposed method is verified in a plastic molding process. Moreover a sensitivity analysis has been done by an efficiency estimator neural network. The results show efficiency of the proposed approach.

  20. Solitons, envelope solitons in collisonless plasmas

    International Nuclear Information System (INIS)

    Ichikawa, Y.H.; Watanabe, S.

    1977-08-01

    A review is given of the extensive development of theoretical, computational and experimental studies of nonlinear wave propagation in collisionless plasmas. Firstly, the historical experiment of Ikezi et al. is discussed in comparison with theoretical analysis based on the Korteweg-de Vries equation. The systematic discrepancy between the observation and the theoretical prediction suggests that it is necessary to examine effects such as higher-order mode coupling and the contribution of trapped particles. Secondly, the effect of nonlinear Landau damping on the envelope solution of the ion plasma wave is discussed on the basis of the theoretical study of Ichikawa-Taniuti, the experimental observation of Watanabe and the numerical analysis of Yajima et al. Finally, a new type of evolution equation derived for the Alfven wave is examined in some detail. The rigorous solution obtained for this mode represents a new kind of envelope solution, in which both the phase and the amplitude are subject to modulation of comparable spatial extension. In conclusion, emphasis is placed on the fact that much more intensive experimental research is expected, since powerful methods to disentangle various nonlinear evolution equations are now available for the theoretical approach. (auth.)

  1. First and second law analysis applied to building envelope: A theoretical approach on the potentiality of Bejan’s theory

    Directory of Open Access Journals (Sweden)

    Cesare Biserni

    2015-11-01

    Full Text Available Especially in the last decade, efforts have been made to develop sustainable building assessment tools, which are usually based on the fundamentals of the First Law of Thermodynamics. However, this approach does not provide a faithful thermodynamic evaluation of the overall energy conversion processes that occur in buildings, and a more robust approach should be followed. The relevance of Second Law analysis is highlighted here: in addition to the calculation of energy balances, the concept of exergy is used to evaluate the quality of energy sources, resulting in a higher flexibility of strategies to optimize a building design. Reviews of the progress being made with the constructal law show that diverse phenomena can be considered manifestations of the tendency towards optimization captured by the constructal law. Studies based on the First and Second Principles of Thermodynamics are affected by the extreme generality of the two laws, a consequence of the fact that in thermodynamics "any system" is a black box with no information about design, organization and evolution. In this context, an exploratory analysis of the potential of constructal theory, which can be considered a law of thermodynamics, is finally outlined in order to assess the energy performance in building design.
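
    Second Law (exergy) analysis weights heat by its distance from the reference environment. As a one-line illustration of that idea (standard Carnot-factor weighting, not a method specific to the paper), the sketch computes the exergy content of a heat flow at several source temperatures; the quantities are illustrative.

```python
# Exergy content of a heat flow: Ex = Q * (1 - T0 / T), the Carnot factor
# weighting used in Second Law (exergy) analysis. Values are illustrative.
def heat_exergy(q_joules, t_source_k, t_ref_k=293.15):
    return q_joules * (1.0 - t_ref_k / t_source_k)

q = 1.0e6  # 1 MJ of heat
for t in (333.15, 353.15, 1273.15):  # 60 C radiator, 80 C boiler water, flame
    print(f"T = {t:.0f} K -> exergy = {heat_exergy(q, t)/1e3:.1f} kJ")
```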

  2. Comparison of gimbal approaches to decrease drag force and radar cross sectional area in missile application

    Science.gov (United States)

    Sakarya, Doǧan Uǧur

    2017-05-01

    Drag force is an important aspect of range performance in missile applications, especially for long flight times. However, old-fashioned gimbal approaches force an increase in missile diameter. This increase has the negative effect of raising both drag force and radar cross-sectional area. A new gimbal approach was proposed recently. It uses a beam-steering optical arrangement and therefore needs a smaller volume envelope for the same field of regard and the same optomechanical assembly than the old-fashioned gimbal approaches. In addition to the longer range achieved with the same fuel, the new gimbal approach provides a smaller cross-sectional area, which is less visible to enemy radar. In this paper, the two gimbal approaches - the old-fashioned one and the new one - are compared in order to decrease drag force and radar cross-sectional area in missile applications. In this study, missile parameters are assumed in order to generate gimbal and optical design parameters. Optical design is performed according to these missile criteria. Two gimbal configurations are designed with respect to the modeled missile parameters. Analyses are also performed to show the decreased drag force and radar cross-sectional area of the new approach for comparison.
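
    The drag penalty of a larger missile diameter follows from the standard drag equation F_d = 0.5 rho v^2 C_d A, where the frontal area A grows with the square of the diameter. The sketch below compares two candidate diameters; the diameters, speed and drag coefficient are illustrative assumptions, not values from the paper.

```python
# Drag force F_d = 0.5 * rho * v^2 * C_d * A for two hypothetical missile
# diameters, illustrating why a larger gimbal envelope costs range.
import math

def drag_force(diameter_m, v_ms, rho=1.225, c_d=0.30):
    area = math.pi * (diameter_m / 2.0) ** 2
    return 0.5 * rho * v_ms ** 2 * c_d * area

for d in (0.18, 0.22):   # two candidate diameters (illustrative)
    print(f"D = {d:.2f} m -> drag = {drag_force(d, v_ms=300.0):.0f} N")
```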

  3. Storage envelopes or sleeves

    International Nuclear Information System (INIS)

    Freshwater, J.R.; Wagman, P.I.

    1980-01-01

    A storage envelope or sleeve particularly for processed X-ray films is described. It consists of front and back panels joined together at a hinge line and connected along the intermediate sides by connecting flaps. An inner pocket is formed from a third flap which is folded to lie against the inner face of the back panel. The panels may have additional score lines parallel to the closed sides of the envelope and the inner pocket so that the envelope and the inner pocket can accommodate bulky contents. The free edge of the pocket is inset from the open side of the envelope, and finger cut-outs may be provided to facilitate access to the contents of the envelope and the pocket. (author)

  4. Protective plasma envelope

    International Nuclear Information System (INIS)

    Bocharov, V.N.; Konstantinov, S.G.; Kudryavtsev, A.M.; Myskin, O.K.; Panasyuk, V.M.; Tsel'nik, F.A.

    1984-06-01

    A method of creating an annular plasma envelope used to protect the hot plasma from flows of impurities and gases from the walls of the vacuum chamber is described. The diameter of the envelope is 30 cm, the thickness of the wall is 1.5 cm, the length is 2.5 m, and its density is from 10^13 to 10^14 cm^-3. The envelope attenuates the incident (from outside) flow of helium 10-fold and the flow of hydrogen 20-fold

  5. Safeguards Envelope Progress FY08

    Energy Technology Data Exchange (ETDEWEB)

    Robert Bean; Richard Metcalf; Aaron Bevill

    2008-09-01

    The Safeguards Envelope Project met its milestones by creating a rudimentary safeguards envelope, proving the value of the approach on a small scale, and determining the most appropriate path forward. The Idaho Chemical Processing Plant’s large cache of reprocessing process monitoring data, dubbed UBER Data, was recovered and used in the analysis. A probabilistic Z test was used on a Markov Monte Carlo simulation of expected diversion data when compared with normal operating data. The data regarding a fully transient event in a tank was used to create a simple requirement, representative of a safeguards envelope, whose impact was a decrease in operating efficiency by 1.3% but an increase in material balance period of 26%. This approach is operator, state, and international safeguards friendly and should be applied to future reprocessing plants. Future requirements include tank-to-tank correlations in reprocessing facilities, detailed operations impact studies, simulation inclusion, automated optimization, advanced statistics analysis, and multi-attribute utility analysis.

  6. Safe operating envelope

    Energy Technology Data Exchange (ETDEWEB)

    Oliva, N [Ontario Hydro, Toronto, ON (Canada)

    1997-12-01

    Safe Operating Envelope is described representing: The outer bound of plant conditions within which day-to-day plant operation must be maintained in order to comply with regulatory requirements, associated safety design criteria and corporate nuclear safety goals. Figs.

  7. Safe operating envelope

    International Nuclear Information System (INIS)

    Oliva, N.

    1997-01-01

    Safe Operating Envelope is described representing: The outer bound of plant conditions within which day-to-day plant operation must be maintained in order to comply with regulatory requirements, associated safety design criteria and corporate nuclear safety goals. Figs

  8. Comparison between Fisherian and Bayesian approach to ...

    African Journals Online (AJOL)

    ... of its simplicity and optimality properties is normally used for two group cases. However, Bayesian approach is found to be better than Fisher's approach because of its low misclassification error rate. Keywords: variance-covariance matrices, centroids, prior probability, mahalanobis distance, probability of misclassification ...

  9. Environmental impact efficiency of natural gas combined cycle power plants: A combined life cycle assessment and dynamic data envelopment analysis approach.

    Science.gov (United States)

    Martín-Gamboa, Mario; Iribarren, Diego; Dufour, Javier

    2018-02-15

    The energy sector is still dominated by the use of fossil resources. In particular, natural gas represents the third most consumed resource, being a significant source of electricity in many countries. Since electricity production in natural gas combined cycle (NGCC) plants provides some benefits with respect to other non-renewable technologies, it is often seen as a transitional solution towards a future low‑carbon power generation system. However, given the environmental profile and operational variability of NGCC power plants, their eco-efficiency assessment is required. In this respect, this article uses a novel combined Life Cycle Assessment (LCA) and dynamic Data Envelopment Analysis (DEA) approach in order to estimate -over the period 2010-2015- the environmental impact efficiencies of 20 NGCC power plants located in Spain. A three-step LCA+DEA method is applied, which involves data acquisition, calculation of environmental impacts through LCA, and the novel estimation of environmental impact efficiency (overall- and term-efficiency scores) through dynamic DEA. Although only 1 out of 20 NGCC power plants is found to be environmentally efficient, all plants show a relatively good environmental performance with overall eco-efficiency scores above 60%. Regarding individual periods, 2011 was -on average- the year with the highest environmental impact efficiency (95%), accounting for 5 efficient NGCC plants. In this respect, a link between high number of operating hours and high environmental impact efficiency is observed. Finally, preliminary environmental benchmarks are presented as an additional outcome in order to further support decision-makers in the path towards eco-efficiency in NGCC power plants. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. A Hybrid Fuzzy Analytic Network Process (FANP and Data Envelopment Analysis (DEA Approach for Supplier Evaluation and Selection in the Rice Supply Chain

    Directory of Open Access Journals (Sweden)

    Chia–Nan Wang

    2018-06-01

    Full Text Available In the market economy, competition typically arises from the difficulty of selecting the most suitable supplier, one that is capable of helping a business reach the highest profit threshold and of meeting sustainable development requirements. In addition, this research discusses a wide range of consequences of choosing an effective supplier, including reduced production cost, improved product quality, on-time delivery, and a flexible response to customer requirements. These activities increase an enterprise's competitiveness. Selecting a supplier is complex in that decision-makers must understand the qualitative and quantitative features used to assess the symmetrical impact of the criteria in order to reach the most accurate result. In this research, a multi-criteria group decision-making (MCGDM) approach was proposed to solve supplier selection problems. The authors collected data from 25 potential suppliers; four main criteria containing 15 sub-criteria were used to define the most effective supplier, considering factors including financial efficiency, quality of materials, ability to deliver on time, and environmental responsiveness, in order to improve the efficiency of the industry supply chain. Initially, the fuzzy analytic network process (FANP) is used to evaluate and rank these criteria, clarifying the important criteria that directly affect the profitability of the business. Subsequently, data envelopment analysis (DEA) models, including the Charnes-Cooper-Rhodes (CCR) model, the Banker-Charnes-Cooper (BCC) model, and the slacks-based measure (SBM) model, were proposed to rank the suppliers. The result of the model identified 7 of the 25 suppliers as responding best to the enterprises' supply requirements.
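
    On the fuzzy ANP side of the method described above, criterion weights are often expressed as triangular fuzzy numbers and reduced to crisp values before ranking. A minimal sketch using centroid defuzzification is shown below; the criterion names follow the abstract loosely, the fuzzy weights are invented, and the paper may defuzzify differently.

```python
# Centroid defuzzification of triangular fuzzy criterion weights (l, m, u),
# a common step in fuzzy ANP/AHP; the numbers below are illustrative only.
weights_fuzzy = {
    "financial efficiency":   (0.20, 0.30, 0.45),
    "material quality":       (0.25, 0.35, 0.50),
    "on-time delivery":       (0.10, 0.20, 0.30),
    "environmental response": (0.05, 0.15, 0.25),
}

crisp = {k: sum(tfn) / 3.0 for k, tfn in weights_fuzzy.items()}
total = sum(crisp.values())
normalized = {k: v / total for k, v in crisp.items()}
for k, v in sorted(normalized.items(), key=lambda kv: -kv[1]):
    print(f"{k}: {v:.3f}")
```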

  11. Comparison of Traditional and Constructivist Teaching Approaches ...

    African Journals Online (AJOL)

    The second section of students had 47 students and was taught using traditional teaching approach. Learning strategy inventory questionnaire which was adapted from strategy inventory for language learning (SILL) L2 students of English, (Oxford, 1990) was employed before and after students were taught using two ...

  12. A fuzzy analytic hierarchy/data envelopment analysis approach for measuring the relative efficiency of hydrogen R and D programs in the sector of developing hydrogen energy technologies

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seongkon; Kim, Jongwook [Korea Institute of Energy Research (Korea, Republic of). Energy Policy Research Center; Mogi, Gento [Tokyo Univ. (Japan). Graduate School of Engineering; Hui, K.S. [Hong Kong City Univ. (China). Manufacturing Engineering and Engineering Management

    2010-07-01

    list of evaluation criteria for assessing and prioritizing hydrogen energy technologies in the sector of the hydrogen ETRM with finite resources and R and D funds. The criteria are composed of economic impact, commercial potential, inner capacity, and technical spin-off. The hydrogen ETRM supplies primary energy technologies to be developed with a long-term view for low carbon green growth. We suggest Korea's long-term direction and strategy for developing hydrogen energy technologies in the sector of the hydrogen ETRM within the hydrogen economy. The main purpose of this research is to assess the priority of hydrogen energy technologies in the sector of the hydrogen ETRM so that R and D budgets can be allocated and invested strategically, as an extension of earlier research [1]. In this paper, we focus on assessing hydrogen energy technologies econometrically by using an integrated 2-stage approach combining the fuzzy analytic hierarchy process (Fuzzy AHP) and data envelopment analysis (DEA) in the sector of hydrogen energy technologies. The research results suggest that the most efficient hydrogen energy technology is selected by the multi-criteria decision making approach. In addition, it also provides Korean hydrogen energy technology policymakers and decision makers with the right hydrogen energy technologies econometrically as they implement a strategic R and D plan. This extended abstract is composed as follows: Section 2 presents the fuzzy sets and numbers, Section 3 includes the Fuzzy AHP concepts, Section 4 presents the DEA approach, Section 5 shows the numerical examples, and finally, Section 6 presents the conclusions. (orig.)

  13. Comparison of the convergent receptor utilization of a retargeted feline leukemia virus envelope with a naturally-occurring porcine endogenous retrovirus A.

    Science.gov (United States)

    Mazari, Peter M; Argaw, Takele; Valdivieso, Leonardo; Zhang, Xia; Marcucci, Katherine T; Salomon, Daniel R; Wilson, Carolyn A; Roth, Monica J

    2012-06-05

    In vitro screening of randomized FeLV Envelope libraries identified the CP isolate, which enters cells through HuPAR-1, one of two human receptors utilized by porcine endogenous retrovirus-A (PERV-A), a distantly related gammaretrovirus. The CP and PERV-A Envs however, share little amino acid homology. Their receptor utilization was examined to define the common receptor usage of these disparate viral Envs. We demonstrate that the receptor usage of CP extends to HuPAR-2 but not to the porcine receptor PoPAR, the cognate receptor for PERV-A. Reciprocal interference between virus expressing CP and PERV-A Envs was observed on human cells. Amino acid residues localized to within the putative second extracellular loop (ECL-2) of PAR-1 and PAR-2 are found to be critical for CP envelope function. Through a panel of receptor chimeras and point mutations, this area was also found to be responsible for the differential usage of the PoPAR receptor between CP and PERV-A. Copyright © 2012 Elsevier Inc. All rights reserved.

  14. HIV-1 envelope glycoprotein

    Science.gov (United States)

    Caulfield, Michael; Cupo, Albert; Dean, Hansi; Hoffenberg, Simon; King, C. Richter; Klasse, P. J.; Marozsan, Andre; Moore, John P.; Sanders, Rogier W.; Ward, Andrew; Wilson, Ian; Julien, Jean-Philippe

    2017-08-22

    The present application relates to novel HIV-1 envelope glycoproteins, which may be utilized as HIV-1 vaccine immunogens, and antigens for crystallization, electron microscopy and other biophysical, biochemical and immunological studies for the identification of broad neutralizing antibodies. The present invention encompasses the preparation and purification of immunogenic compositions, which are formulated into the vaccines of the present invention.

  15. Common envelope evolution

    NARCIS (Netherlands)

    Taam, Ronald E.; Ricker, Paul M.

    2010-01-01

    The common envelope phase of binary star evolution plays a central role in many evolutionary pathways leading to the formation of compact objects in short period systems. Using three dimensional hydrodynamical computations, we review the major features of this evolutionary phase, focusing on the

  16. Comparison of Routable Control System Security Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Edgar, Thomas W.; Hadley, Mark D.; Carroll, Thomas E.; Manz, David O.; Winn, Jennifer D.

    2011-06-01

    This document is a supplement to the 'Secure and Efficient Routable Control Systems.' It addresses security in routable control system communication. The control system environment that monitors and manages the power grid has historically utilized serial communication mechanisms. Leased-line serial communication environments operating at 1200 to 9600 baud rates are common. However, recent trends show that communication media such as fiber, optical carrier 3 (OC-3) speeds, mesh-based high-speed wireless, and the Internet are becoming the media of choice. In addition, a dichotomy has developed between the electrical transmission and distribution environments, with more modern communication infrastructures deployed by transmission utilities. The preceding diagram represents a typical control system. The Communication Links cloud supports all of the communication mechanisms a utility might deploy between the control center and devices in the field. Current methodologies used for security implementations are primarily led by single vendors or standards bodies. However, these entities tend to focus on individual protocols. The result is an environment that contains a mixture of security solutions that may only address some communication protocols, at an increasing operational burden for the utility. A single approach is needed that meets operational requirements, is simple to operate, and provides the necessary level of security for all control system communication. The solution should be application independent (e.g., Distributed Network Protocol/Internet Protocol [DNP/IP], International Electrotechnical Commission [IEC] C37.118, Object Linking and Embedding for Process Control [OPC], etc.) and focus on the transport layer. In an ideal setting, a well-designed suite of standards for control system communication will be used for vendor implementation and compliance testing. An expected outcome of this effort is an international standard.

  17. (Quasi-)Poisson enveloping algebras

    OpenAIRE

    Yang, Yan-Hong; Yao, Yuan; Ye, Yu

    2010-01-01

    We introduce the quasi-Poisson enveloping algebra and Poisson enveloping algebra for a non-commutative Poisson algebra. We prove that for a non-commutative Poisson algebra, the category of quasi-Poisson modules is equivalent to the category of left modules over its quasi-Poisson enveloping algebra, and the category of Poisson modules is equivalent to the category of left modules over its Poisson enveloping algebra.

  18. A Comparison of MOOC Development and Delivery Approaches

    Science.gov (United States)

    Smith, Neil; Caldwell, Helen; Richards, Mike; Bandara, Arosha

    2017-01-01

    Purpose: The purpose of this paper is to present a comparison of two ways of developing and delivering massive open online courses (MOOCs). One was developed by The Open University in collaboration with FutureLearn; the other was developed independently by a small team at the Northampton University. Design/methodology/approach: The different…

  19. Do projections from bioclimatic envelope models and climate change metrics match?

    DEFF Research Database (Denmark)

    Garcia, Raquel A.; Cabeza, Mar; Altwegg, Res

    2016-01-01

    as indicators of the exposure of species to climate change. Here, we investigate whether these two approaches provide qualitatively similar indications about where biodiversity is potentially most exposed to climate change. Location: Sub-Saharan Africa. Methods: We compared a range of climate change metrics...... for sub-Saharan Africa with ensembles of bioclimatic envelope models for 2723 species of amphibians, snakes, mammals and birds. For each taxonomic group, we performed three comparisons between the two approaches: (1) is projected change in local climatic suitability (models) greater in grid cells...... between the two approaches was found for all taxonomic groups, although it was stronger for species with a narrower climatic envelope breadth. Main conclusions: For sub-Saharan African vertebrates, projected patterns of exposure to climate change given by climate change metrics alone were qualitatively...

  20. The performance of energy efficient residential building envelope systems

    Energy Technology Data Exchange (ETDEWEB)

    Proskiw, G.

    1996-08-01

    The adequacy and durability of residential building envelope systems under actual field conditions were evaluated. A building envelope offers protection from cold, heat, moisture, wind and noise. However, they are exposed to thermal, structural, and moisture stresses and their performance can degrade over time. Envelope performance was evaluated at 20 energy efficient and four conventional, detached modern homes in Winnipeg, Canada. The three complementary measurement tools were wood moisture content (WMC) of framing members, thermographic examinations, and airtightness tests. As expected, energy efficient building envelope systems performed better than the conventional systems. No evidence of envelope degradation was found in any of the energy efficient houses. The building envelopes using polyethylene air barriers performed slightly better than those which used the airtight drywall approach, although both were considered satisfactory. WMC levels were a bit lower in the polyethylene-clad house. 1 ref., 1 tab.

  1. Thermal Activated Envelope

    DEFF Research Database (Denmark)

    Foged, Isak Worre; Pasold, Anke

    2015-01-01

    The research studies the making of a responsive architectural envelope based on bi-materials. The bi-materials are organized according to a method that combines different isotropic metals and plastic into an active composite structure that reacts to temperature variations. Through an evolutionary......, environmental dynamics and occupancy dynamics. Lastly, a physical prototype is created, which illustrates the physical expression of the bi-materials and the problems related to manufacturing of these composite structures.

  2. Semiparametric Power Envelopes for Tests of the Unit Root Hypothesis

    DEFF Research Database (Denmark)

    Jansson, Michael

    This paper derives asymptotic power envelopes for tests of the unit root hypothesis in a zero-mean AR(1) model. The power envelopes are derived using the limits of experiments approach and are semiparametric in the sense that the underlying error distribution is treated as an unknown...

  3. Validating predictions from climate envelope models.

    Directory of Open Access Journals (Sweden)

    James I Watling

    Full Text Available Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species' distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967-1971 (t1) and evaluated using occurrence data from 1998-2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on

  4. Validating predictions from climate envelope models

    Science.gov (United States)

    Watling, J.; Bucklin, D.; Speroterra, C.; Brandt, L.; Cabal, C.; Romañach, Stephanie S.; Mazzotti, Frank J.

    2013-01-01

    Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species’ distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967–1971 (t1) and evaluated using occurrence data from 1998–2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species.
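
    Sensitivity and specificity, the two evaluation measures used above, come straight from the confusion matrix of predicted versus observed presences and absences. A minimal sketch follows; the observation and prediction vectors are made up for illustration.

```python
# Sensitivity and specificity from presence/absence predictions, the two model
# evaluation measures discussed above; the observation vectors are made up.
import numpy as np

observed  = np.array([1, 1, 1, 0, 0, 0, 1, 0, 1, 0])  # t2 presences/absences
predicted = np.array([1, 1, 0, 0, 1, 0, 1, 0, 1, 0])  # model forecast for t2

tp = np.sum((observed == 1) & (predicted == 1))
tn = np.sum((observed == 0) & (predicted == 0))
fp = np.sum((observed == 0) & (predicted == 1))
fn = np.sum((observed == 1) & (predicted == 0))

sensitivity = tp / (tp + fn)   # correctly classified presences
specificity = tn / (tn + fp)   # correctly classified absences
print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```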

  5. Data Analysis A Model Comparison Approach, Second Edition

    CERN Document Server

    Judd, Charles M; Ryan, Carey S

    2008-01-01

    This completely rewritten classic text features many new examples, insights and topics including mediational, categorical, and multilevel models. Substantially reorganized, this edition provides a briefer, more streamlined examination of data analysis. Noted for its model-comparison approach and unified framework based on the general linear model, the book provides readers with a greater understanding of a variety of statistical procedures. This consistent framework, including consistent vocabulary and notation, is used throughout to develop fewer but more powerful model building techniques. T

  6. A Comparison of Microeconomic and Macroeconomic Approaches to Deforestation Analysis

    Directory of Open Access Journals (Sweden)

    Jeff Felardo

    2016-01-01

    Full Text Available The economics of deforestation has been explored in detail. Generally, the frame of analysis takes either a microeconomics or macroeconomics approach. The microeconomics approach assumes that individual decision makers are responsible for deforestation as a result of utility-maximizing behavior and imperfect property-right regimes. The macroeconomics approach explores nationwide trends thought to be associated with forest conversion. This paper investigates the relationship between these two approaches by empirically testing the determinants of deforestation using the same data set from Thailand. The theories behind both the microeconomics-based and macroeconomics-based approaches are developed and then tested statistically. The models were constructed using established theoretical frames developed in the literature. The results from both models show statistical significance consistent with prior results in the tropical deforestation literature. A comparison of the two approaches demonstrates that the macro approach is useful in identifying relevant aggregate trends in the deforestation process, while the micro approach provides the opportunity to isolate the factors behind those trends, which is necessary for effective policy decisions.

  7. Uma abordagem via análise envoltória de dados para o estabelecimento de melhorias em segurança baseadas na FMEA A data envelopment analysis approach to safety improvements based on FMEA

    Directory of Open Access Journals (Sweden)

    Pauli Adriano de Almada Garcia

    2013-03-01

    Full Text Available This study presents a data envelopment analysis (DEA) based approach to establish improvement guidelines for the failure modes identified by a failure mode and effect analysis (FMEA). The traditional FMEA approach is based on a risk priority number (RPN), which has been the target of criticism in numerous scientific articles. In the present study, the RPN is based on DEA and considers the concept of the efficiency frontier. Based on this concept and on the risk indexes established by the FMEA team, improvement directives for the failure modes can be derived. The results obtained in two practical applications demonstrate the effectiveness of the proposed approach in dealing with this class of problem.

  8. Comparison of Different Approaches for Measuring Tibial Cartilage Thickness

    Directory of Open Access Journals (Sweden)

    Maier Jennifer

    2017-07-01

    Full Text Available Osteoarthritis is a degenerative disease affecting bones and cartilage, especially in the human knee. In this context, cartilage thickness is an indicator for knee cartilage health. Thickness measurements are performed on medical images acquired in-vivo. Currently, there is no standard method agreed upon that defines a distance measure in articular cartilage. In this work, we present a comparison of different methods commonly used in the literature. These methods are based on nearest neighbors, surface normal vectors, local thickness and potential field lines. All approaches were applied to manual segmentations of tibia and lateral and medial tibial cartilage performed by experienced raters. The underlying data were contrast agent-enhanced cone-beam C-arm CT reconstructions of one healthy subject’s knee. The subject was scanned three times, once in supine position and two times in a standing weight-bearing position. A comparison of the resulting thickness maps shows similar distributions and correlation coefficients above 0.90 between the approaches. The nearest neighbor method results on average in the lowest cartilage thickness values, while the local thickness approach assigns the highest values. We showed that the different methods agree in their thickness distribution. The results will be used for a future evaluation of cartilage change under weight-bearing conditions.

  9. Benchmarking energy performance of residential buildings using two-stage multifactor data envelopment analysis with degree-day based simple-normalization approach

    International Nuclear Information System (INIS)

    Wang, Endong; Shen, Zhigang; Alp, Neslihan; Barry, Nate

    2015-01-01

    Highlights: • Two-stage DEA model is developed to benchmark building energy efficiency. • Degree-day based simple normalization is used to neutralize the climatic noise. • Results of a real case study validated the benefits of this new model. - Abstract: Being able to identify detailed meta factors of energy performance is essential for creating effective residential energy-retrofitting strategies. Compared to other benchmarking methods, nonparametric multifactor DEA (data envelopment analysis) is capable of discriminating scale factors from management factors to reveal more details to better guide retrofitting practices. A two-stage DEA energy benchmarking method is proposed in this paper. This method includes (1) first-stage meta DEA which integrates the common degree day metrics for neutralizing noise energy effects of exogenous climatic variables; and (2) second-stage Tobit regression for further detailed efficiency analysis. A case study involving 3-year longitudinal panel data of 189 residential buildings indicated the proposed method has advantages over existing methods in terms of its efficiency in data processing and results interpretation. The results of the case study also demonstrated high consistency with existing linear regression based DEA.
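
    The record above describes a two-stage benchmarking workflow (climate-normalized DEA followed by Tobit regression). The sketch below is not the authors' model; it only illustrates, on hypothetical data, the degree-day normalization step combined with a plain input-oriented CCR DEA efficiency score computed by linear programming.

```python
# Minimal sketch: degree-day-normalized energy use as a DEA input (hypothetical data).
# This is a plain input-oriented CCR model, not the paper's two-stage SBM/Tobit variant.
import numpy as np
from scipy.optimize import linprog

# Hypothetical buildings: annual energy use (kWh), heating+cooling degree-days, outputs served.
energy = np.array([120e3, 95e3, 150e3, 80e3])
degree_days = np.array([3200.0, 2900.0, 3500.0, 2700.0])
floor_area = np.array([5000.0, 4200.0, 5500.0, 3900.0])       # m2 served
occupants = np.array([40.0, 35.0, 55.0, 30.0])

inputs = (energy / degree_days).reshape(-1, 1)                 # climate-normalized energy (single input)
outputs = np.column_stack([floor_area, occupants])

def ccr_efficiency(inputs, outputs, o):
    """Input-oriented CCR efficiency of DMU o: minimize theta such that a composite
    of peers uses at most theta times DMU o's inputs and produces at least its outputs."""
    n, m = inputs.shape
    s = outputs.shape[1]
    c = np.r_[1.0, np.zeros(n)]                                # decision vars: [theta, lambda_1..lambda_n]
    A_ub, b_ub = [], []
    for i in range(m):                                         # sum_j lambda_j x_ij - theta * x_io <= 0
        A_ub.append(np.r_[-inputs[o, i], inputs[:, i]])
        b_ub.append(0.0)
    for r in range(s):                                         # -sum_j lambda_j y_rj <= -y_ro
        A_ub.append(np.r_[0.0, -outputs[:, r]])
        b_ub.append(-outputs[o, r])
    bounds = [(0, None)] * (n + 1)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=bounds, method="highs")
    return res.fun                                             # efficiency score in (0, 1]

for o in range(len(energy)):
    print(f"building {o}: efficiency = {ccr_efficiency(inputs, outputs, o):.3f}")
```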

  10. Impact of the Local Public Hospital Reform on the Efficiency of Medium-Sized Hospitals in Japan: An Improved Slacks-Based Measure Data Envelopment Analysis Approach.

    Science.gov (United States)

    Zhang, Xing; Tone, Kaoru; Lu, Yingzhe

    2018-04-01

    The objective was to assess the change in efficiency and total factor productivity (TFP) of local public hospitals in Japan after the local public hospital reform launched in late 2007, which was aimed at improving the financial capability and operational efficiency of hospitals. Secondary data were collected from the Ministry of Internal Affairs and Communications on 213 eligible medium-sized hospitals, each operating 100-400 beds, from FY2006 to FY2011. The improved slacks-based measure nonoriented data envelopment analysis models (Quasi-Max SBM nonoriented DEA models) were used to estimate dynamic efficiency scores and the Malmquist Index. The dynamic efficiency measure indicated an efficiency gain in the first several years of the reform, followed by a decrease. Malmquist Index analysis showed a significant decline in the TFP between 2006 and 2011. The financial improvement of medium-sized hospitals was not associated with enhancement of efficiency. Hospital efficiency was not significantly different among ownership structure and law-application system groups, but it was significantly affected by hospital location. The results indicate a need for region-tailored health care policies and for a more comprehensive reform to overcome the systemic constraints that might contribute to the decline of the TFP. © Health Research and Educational Trust.

  11. Full waveform inversion using envelope-based global correlation norm

    Science.gov (United States)

    Oh, Ju-Won; Alkhalifah, Tariq

    2018-05-01

    To increase the feasibility of full waveform inversion on real data, we suggest a new objective function, which is defined as the global correlation of the envelopes of modelled and observed data. The envelope-based global correlation norm retains the advantage of envelope inversion, which generates artificial low-frequency information and thus makes it possible to recover long-wavelength structure at an early stage. In addition, the envelope-based global correlation norm maintains the advantage of the global correlation norm, which reduces the sensitivity of the misfit to amplitude errors, so that the performance of inversion on real data can be enhanced when the exact source wavelet is not available and more complex physics are ignored. In a synthetic example on the 2-D SEG/EAGE overthrust model with an inaccurate source wavelet, we compare the performance of four different approaches: least-squares waveform inversion, least-squares envelope inversion, the global correlation norm and the envelope-based global correlation norm. Finally, we apply the envelope-based global correlation norm to the 3-D Ocean Bottom Cable (OBC) data from the North Sea. The envelope-based global correlation norm captures the strong reflections from the high-velocity caprock and generates artificial low-frequency reflection energy that helps us recover long-wavelength structure of the model domain in the early stages. From this long-wavelength model, the conventional global correlation norm is sequentially applied to invert for higher-resolution features of the model.
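
    The misfit described above is essentially a zero-lag correlation between normalized data envelopes. A minimal sketch follows, assuming simple 1-D synthetic traces and a Hilbert-transform envelope; it is not the authors' full 3-D FWI implementation.

```python
# Minimal sketch of an envelope-based global correlation misfit between an observed
# and a modelled trace (synthetic 1-D signals; not a full FWI workflow).
import numpy as np
from scipy.signal import hilbert

def envelope(trace):
    """Instantaneous amplitude (envelope) of a trace via the analytic signal."""
    return np.abs(hilbert(trace))

def envelope_global_correlation_misfit(d_obs, d_mod):
    """Negative zero-lag correlation between unit-normalized envelopes.
    Minimizing this misfit maximizes the envelope correlation; a constant
    amplitude scaling of either trace does not change the value."""
    e_obs = envelope(d_obs)
    e_mod = envelope(d_mod)
    e_obs = e_obs / np.linalg.norm(e_obs)
    e_mod = e_mod / np.linalg.norm(e_mod)
    return -np.dot(e_obs, e_mod)

# Toy example: a Ricker-like wavelet and a delayed, rescaled copy.
t = np.linspace(-1.0, 1.0, 1001)
ricker = lambda t, f: (1 - 2 * (np.pi * f * t) ** 2) * np.exp(-((np.pi * f * t) ** 2))
observed = ricker(t, 5.0)
modelled = 0.3 * ricker(t - 0.1, 5.0)   # wrong amplitude and a small time shift
print(envelope_global_correlation_misfit(observed, modelled))
```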

  12. A risk based approach for SSTO/TSTO comparisons

    Science.gov (United States)

    Greenberg, Joel S.

    1996-03-01

    An approach has been developed for performing early comparisons of transportation architectures explicitly taking into account quantitative measures of uncertainty and resulting risk. Risk considerations are necessary since the transportation systems are likely to have significantly different levels of risk, both because of differing degrees of freedom in achieving desired performance levels and their different states of development and utilization. The approach considers the uncertainty of achievement of technology goals, effect that the achieved technology level will have on transportation system performance and the relationship between system performance/capability and the ability to accommodate variations in payload mass. The consequences of system performance are developed in terms of nonrecurring, recurring, and the present value of transportation system life cycle costs.

  13. A Comparison of Machine Learning Approaches for Corn Yield Estimation

    Science.gov (United States)

    Kim, N.; Lee, Y. W.

    2017-12-01

    Machine learning is an efficient empirical method for classification and prediction, and it is another approach to crop yield estimation. The objective of this study is to estimate corn yield in the Midwestern United States by employing machine learning approaches such as the support vector machine (SVM), random forest (RF), and deep neural networks (DNN), and to perform a comprehensive comparison of their results. We constructed the database using satellite images from MODIS, climate data from the PRISM climate group, and GLDAS soil moisture data. In addition, to examine the seasonal sensitivities of corn yields, two period groups were set up: May to September (MJJAS) and July and August (JA). Overall, the DNN showed the highest accuracy in terms of the correlation coefficient for both period groups. The differences between our predictions and USDA yield statistics were about 10-11 %.
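
    A minimal sketch of the kind of model comparison described above, using scikit-learn stand-ins for SVM, RF and DNN on purely synthetic "yield" data; the features, data and hyperparameters are hypothetical, not those of the study.

```python
# Minimal sketch comparing SVM, random forest and a neural-network regressor on
# synthetic "yield" data, scored by correlation coefficient (all data hypothetical).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))                     # stand-ins for NDVI, climate, soil moisture, ...
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] ** 2 + rng.normal(scale=0.5, size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "SVM": make_pipeline(StandardScaler(), SVR(C=10.0)),
    "RF": RandomForestRegressor(n_estimators=300, random_state=0),
    "DNN": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    r = np.corrcoef(y_te, model.predict(X_te))[0, 1]
    print(f"{name}: correlation coefficient = {r:.3f}")
```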

  14. Comparison of Human Exploration Architecture and Campaign Approaches

    Science.gov (United States)

    Goodliff, Kandyce; Cirillo, William; Mattfeld, Bryan; Stromgren, Chel; Shyface, Hilary

    2015-01-01

    As part of an overall focus on space exploration, the National Aeronautics and Space Administration (NASA) continues to evaluate potential approaches for sending humans beyond low Earth orbit (LEO). In addition, various external organizations are studying options for beyond-LEO exploration. Recent studies include NASA's Evolvable Mars Campaign and Design Reference Architecture (DRA) 5.0; JPL's Minimal Mars Architecture; the Inspiration Mars mission; the Mars One campaign; and the Global Exploration Roadmap (GER). Each of these potential exploration constructs applies unique methods, architectures, and philosophies for human exploration. It is beneficial to compare potential approaches in order to better understand the range of options available for exploration. Since most of these studies were conducted independently, the approaches, ground rules, and assumptions used to conduct the analysis differ. In addition, the outputs and metrics presented for each construct differ substantially. This paper will describe the results of an effort to compare and contrast the results of these different studies under a common set of metrics. The paper will first present a summary of each of the proposed constructs, including a description of the overall approach and philosophy for exploration. Utilizing a common set of metrics for comparison, the paper will present the results of an evaluation of the potential benefits, critical challenges, and uncertainties associated with each construct. The analysis framework will include a detailed evaluation of key characteristics of each construct. These will include but are not limited to: a description of the technology and capability developments required to enable the construct and the uncertainties associated with these developments; an analysis of significant operational and programmatic risks associated with that construct; and an evaluation of the extent to which exploration is enabled by the construct, including the destinations

  15. A Comparison of Routing Protocol for WSNs: Redundancy Based Approach

    Directory of Open Access Journals (Sweden)

    Anand Prakash

    2014-03-01

    Full Text Available Wireless Sensor Networks (WSNs) with their dynamic applications have gained tremendous attention from researchers. Constant monitoring of critical situations has attracted researchers to utilize WSNs on a wide range of platforms. The main focus in WSNs is to enhance network localization as much as possible, for efficient and optimal utilization of resources. Different approaches based upon redundancy have been proposed for optimum functionality. Localization is always related to the redundancy of sensor nodes deployed in remote areas for constant and fault-tolerant monitoring. In this work, we propose a comparison of classic flooding and the gossip protocol for homogeneous networks, which enhances stability and throughput quite significantly.
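
    A minimal sketch of the two dissemination strategies compared above (classic flooding versus probabilistic gossip) on a random geometric graph; the node count, radius and forwarding probability are hypothetical.

```python
# Minimal sketch comparing classic flooding with probabilistic gossip on a random
# geometric graph, counting node coverage versus radio transmissions (toy parameters).
import random
import networkx as nx

def disseminate(graph, source, forward_prob):
    """Flood (forward_prob=1.0) or gossip (forward_prob<1.0) a message from source.
    Returns (#nodes reached, #transmissions)."""
    informed = {source}
    frontier = [source]
    transmissions = 0
    while frontier:
        next_frontier = []
        for node in frontier:
            if node == source or random.random() < forward_prob:   # gossip: forward probabilistically
                transmissions += 1
                for neighbour in graph.neighbors(node):
                    if neighbour not in informed:
                        informed.add(neighbour)
                        next_frontier.append(neighbour)
        frontier = next_frontier
    return len(informed), transmissions

random.seed(42)
wsn = nx.random_geometric_graph(200, radius=0.12, seed=42)          # 200 sensor nodes
for name, p in [("flooding", 1.0), ("gossip p=0.7", 0.7)]:
    reached, tx = disseminate(wsn, source=0, forward_prob=p)
    print(f"{name}: reached {reached}/200 nodes with {tx} transmissions")
```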

  16. Comparison of two Minkowski-space approaches to heavy quarkonia

    Energy Technology Data Exchange (ETDEWEB)

    Leitao, Sofia; Biernat, Elmar P. [Universidade de Lisboa, CFTP, Instituto Superior Tecnico, Lisbon (Portugal); Li, Yang [Iowa State University, Department of Physics and Astronomy, Ames, IA (United States); College of William and Mary, Department of Physics, Williamsburg, VA (United States); Maris, Pieter; Vary, James P. [Iowa State University, Department of Physics and Astronomy, Ames, IA (United States); Pena, M.T. [Universidade de Lisboa, CFTP, Instituto Superior Tecnico, Lisbon (Portugal); Universidade de Lisboa, Departamento de Fisica, Instituto Superior Tecnico, Lisbon (Portugal); Stadler, Alfred [Universidade de Lisboa, CFTP, Instituto Superior Tecnico, Lisbon (Portugal); Universidade de Evora, Departamento de Fisica, Evora (Portugal)

    2017-10-15

    In this work we compare mass spectra and decay constants obtained from two recent, independent, and fully relativistic approaches to the quarkonium bound-state problem: the Basis Light-Front Quantization approach, where light-front wave functions are naturally formulated; and, the Covariant Spectator Theory (CST), based on a reorganization of the Bethe-Salpeter equation. Even though conceptually different, both solutions are obtained in Minkowski space. Comparisons of decay constants for more than ten states of charmonium and bottomonium show favorable agreement between the two approaches as well as with experiment where available. We also apply the Brodsky-Huang-Lepage prescription to convert the CST amplitudes into functions of light-front variables. This provides an ideal opportunity to investigate the similarities and differences at the level of the wave functions. Several qualitative features are observed in remarkable agreement between the two approaches even for the rarely addressed excited states. Leading-twist distribution amplitudes as well as parton distribution functions of heavy quarkonia are also analyzed. (orig.)

  17. Snell Envelope with Small Probability Criteria

    Energy Technology Data Exchange (ETDEWEB)

    Del Moral, Pierre, E-mail: Pierre.Del-Moral@inria.fr; Hu, Peng, E-mail: Peng.Hu@inria.fr [Universite de Bordeaux I, Centre INRIA Bordeaux et Sud-Ouest and Institut de Mathematiques de Bordeaux (France); Oudjane, Nadia, E-mail: Nadia.Oudjane@edf.fr [EDF R and D Clamart (France)

    2012-12-15

    We present a new algorithm to compute the Snell envelope in the specific case where the criterion to optimize is associated with a small probability or a rare event. This new approach combines the Stochastic Mesh approach of Broadie and Glasserman with a particle approximation scheme based on a specific change of measure designed to concentrate the computational effort in regions pointed out by the criterion. The theoretical analysis of this new algorithm provides non-asymptotic convergence estimates. Finally, the numerical tests confirm the practical interest of this approach.

  18. Urban pavement surface temperature. Comparison of numerical and statistical approach

    Science.gov (United States)

    Marchetti, Mario; Khalifa, Abderrahmen; Bues, Michel; Bouilloud, Ludovic; Martin, Eric; Chancibaut, Katia

    2015-04-01

    The forecast of pavement surface temperature is very specific in the context of urban winter maintenance, where it is used to manage snow plowing and salting of roads. Such forecasts mainly rely on numerical models based on a description of the energy balance between the atmosphere, the buildings and the pavement, with a canyon configuration. Nevertheless, there is a specific need for a physical description and numerical implementation of traffic in the energy flux balance. This traffic was originally considered as a constant. Many changes were made to a numerical model to describe as accurately as possible the traffic effects on this urban energy balance, such as tire friction, the pavement-air exchange coefficient, and the net infrared flux balance. Experiments based on infrared thermography and radiometry were then conducted to quantify the effect of traffic on urban pavement surfaces. Based on meteorological data, the corresponding pavement temperature forecasts were calculated and compared with field measurements. Results indicated good agreement between these measurements and the forecasts from the numerical model based on this energy balance approach. A complementary forecast approach based on principal component analysis (PCA) and partial least-squares regression (PLS) was also developed, with data from thermal mapping using infrared radiometry. The forecast of pavement surface temperature from air temperature was obtained for the specific case of the urban configuration, with traffic included in the measurements used for the statistical analysis. A comparison between results from the numerical model based on the energy balance and from PCA/PLS was then conducted, indicating the advantages and limits of each approach.
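
    A minimal sketch of the statistical branch described above: a partial least-squares regression of pavement surface temperature on meteorological and traffic predictors. All data and variable choices are hypothetical stand-ins for the thermal-mapping measurements.

```python
# Minimal sketch of the statistical branch only: partial least-squares regression of
# pavement surface temperature on meteorological predictors (all data hypothetical).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 400
air_temp = rng.normal(10.0, 8.0, n)
wind = rng.uniform(0.0, 10.0, n)
humidity = rng.uniform(30.0, 100.0, n)
traffic = rng.poisson(800, n).astype(float)       # vehicles per hour, stand-in for traffic effects
X = np.column_stack([air_temp, wind, humidity, traffic])
# Synthetic "surface temperature": mostly air temperature, plus a traffic warming term and noise.
y = 1.1 * air_temp + 0.002 * traffic - 0.05 * wind + rng.normal(0.0, 1.0, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y.reshape(-1, 1), test_size=0.25, random_state=1)
pls = PLSRegression(n_components=3)
pls.fit(X_tr, y_tr)
pred = pls.predict(X_te).ravel()
rmse = np.sqrt(np.mean((pred - y_te.ravel()) ** 2))
print(f"PLS forecast RMSE: {rmse:.2f} degC")
```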

  19. Energetics of Brazilian ethanol: Comparison between assessment approaches

    International Nuclear Information System (INIS)

    Ramirez Triana, Carlos Ariel

    2011-01-01

    As with any other bioenergy product, bioethanol production requires fossil fuel inputs; hence the alleged benefits of energy security and carbon mitigation depend on the extent to which these inputs are capable of drawing a substantive bioenergetic yield. Brazilian ethanol, made from sugarcane, has been reported as the most efficient gasoline substitute that is commercially available nowadays. For that reason it has been the object of several analyses of its energetics, i.e. energy balances. These studies vary surprisingly widely according to the scholarly approach and are not fully comparable among themselves due to divergences in the assessment method. This paper standardises the results of the four most prominent authors in the field, establishing a point of comparison and shedding some light on energetics studies of biofuels. The main result homogenises the outcomes of the referenced studies in terms of the unit of assessment in the energy input analysis. Subsequently, this information is also charted, explaining the source of divergence among the authors. This work ends with a short reference and comparison to some energy balance studies carried out on feedstocks of diverse nature, highlighting the potential that sugarcane-based bioethanol represents nowadays. - Highlights: → Distribution stage could reduce the energy ratio but its contribution is not significant. → In Pimentel and Patzek there is an evident impact of the industrial stage. → A coincidence across the studies was the major impact of the agricultural stage. → Brazilian technology to produce ethanol was proved the most energy efficient one.

  20. Comparison of different approaches of modelling in a masonry building

    Science.gov (United States)

    Saba, M.; Meloni, D.

    2017-12-01

    The present work models a simple masonry building using two different modelling methods, in order to assess their validity in evaluating static stresses. Two of the most widely used commercial software packages for this kind of problem were chosen: 3Muri of S.T.A. Data S.r.l. and Sismicad12 of Concrete S.r.l. While the 3Muri software adopts the Frame by Macro Elements Method (FME), which should be more schematic and more efficient, the Sismicad12 software uses the Finite Element Method (FEM), which guarantees accurate results, with a greater computational burden. Remarkable differences in the static stresses between the two approaches have been found for such a simple structure, and an interesting comparison and analysis of the reasons is proposed.

  1. A comparison of approaches in fitting continuum SEDs

    International Nuclear Information System (INIS)

    Liu Yao; Wang Hong-Chi; Madlener David; Wolf Sebastian

    2013-01-01

    We present a detailed comparison of two approaches, the use of a pre-calculated database and simulated annealing (SA), for fitting the continuum spectral energy distribution (SED) of astrophysical objects whose appearance is dominated by surrounding dust. While pre-calculated databases are commonly used to model SED data, only a few studies to date have employed SA due to its unclear accuracy and convergence time for this specific problem. From a methodological point of view, different approaches lead to different fitting quality, demands on computational resources and calculation times. We compare the fitting quality and computational costs of these two approaches for the task of SED fitting to provide a guide to the practitioner to find a compromise between desired accuracy and available resources. To reduce uncertainties inherent to real datasets, we introduce a reference model resembling a typical circumstellar system with 10 free parameters. We derive the SED of the reference model with our code MC3D at 78 logarithmically distributed wavelengths in the range [0.3 μm, 1.3 mm] and use this setup to simulate SEDs for the database and SA. Our result directly demonstrates the applicability of SA in the field of SED modeling, since the algorithm regularly finds better solutions to the optimization problem than a pre-calculated database. As both methods have advantages and shortcomings, a hybrid approach is preferable. While the database provides an approximate fit and overall probability distributions for all parameters deduced using Bayesian analysis, SA can be used to improve upon the results returned by the model grid.
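
    A minimal sketch of simulated annealing driving a chi-square SED fit. The toy power-law forward model is a placeholder for a radiative-transfer code such as MC3D, and the cooling schedule and step size are illustrative assumptions.

```python
# Minimal sketch of simulated annealing on a chi-square misfit. The toy power-law
# "forward model" stands in for a full radiative-transfer SED code.
import numpy as np

rng = np.random.default_rng(2)
wavelengths = np.logspace(-0.5, 3.1, 78)                      # ~0.3 micron to ~1.3 mm

def forward_model(params, lam):
    amplitude, slope = params                                 # toy stand-in with 2 parameters only
    return amplitude * lam ** slope

true_params = np.array([5.0, -1.2])
observed = forward_model(true_params, wavelengths) * (1 + rng.normal(0, 0.05, wavelengths.size))
sigma = 0.05 * observed

def chi2(params):
    return np.sum(((forward_model(params, wavelengths) - observed) / sigma) ** 2)

def simulated_annealing(x0, n_steps=20000, t0=100.0, step=0.1):
    """Metropolis-style annealing with a linear cooling schedule."""
    x, fx = np.array(x0, float), chi2(x0)
    best_x, best_f = x.copy(), fx
    for k in range(n_steps):
        temperature = t0 * (1.0 - k / n_steps) + 1e-6
        candidate = x + rng.normal(scale=step, size=x.size)
        fc = chi2(candidate)
        if fc < fx or rng.random() < np.exp(-(fc - fx) / temperature):
            x, fx = candidate, fc
            if fx < best_f:
                best_x, best_f = x.copy(), fx
    return best_x, best_f

best_params, best_chi2 = simulated_annealing([1.0, 0.0])
print("recovered parameters:", best_params, "chi2:", round(best_chi2, 1))
```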

  2. The LHC on an envelope

    CERN Multimedia

    2007-01-01

    The series of envelopes featuring CERN issued this summer was a huge success. The French postal services of the Pays de Gex will shortly be launching the second set of pre-paid envelopes issued in collaboration with the Laboratory this year, this time highlighting the LHC. Five thousand envelopes describing the accelerator’s capabilities will go on sale on 12 November, and some of the packs will even contain a small sample of the cables from the heart of the LHC magnets. The sets of ten pre-paid envelopes will tell you everything about CERN’s flagship accelerator, from its astounding technical capabilities to its spin-offs in the fields of technology and human resources. Each envelope will feature a different attribute or spin-off of the LHC. People will be invited to consult CERN’s public website for more detailed explanations if they want to know more. The new envelopes will be available from five post offices in the Pays ...

  3. The LHC in an envelope

    CERN Multimedia

    2007-01-01

    The series of envelopes featuring CERN issued this summer was a huge success. The French postal services of the Pays de Gex will shortly be launching the second set of pre-paid envelopes issued in collaboration with the Laboratory this year, this time highlighting the LHC. Five thousand envelopes describing the accelerator’s capabilities will go on sale on 12 November, and some of the packs will even contain a small sample of the cables from the heart of the LHC magnets. The sets of ten pre-paid envelopes will tell you everything about CERN’s flagship accelerator, from its astounding technical capabilities to its spin-offs in the fields of technology and human resources. Each envelope will feature a different attribute or spin-off of the LHC. People will be invited to consult CERN’s public website for more detailed explanations if they want to know more. The new envelopes will be available from five post offices in the Pays de Gex (Ferney-Voltaire, Prévessin...

  4. PSA data base, comparison of the German and French approach

    International Nuclear Information System (INIS)

    Kreuser, A.; Tirira, J.

    2001-01-01

    The results of probabilistic safety assessments (PSA) of nuclear power plants strongly depend on the reliability data used. This report coarsely describes the general process used to generate reliability data for components and summarises the differences between the German and French approaches. As has been shown in former studies comparing international PSA data, PSA data are closely related to the model definitions of the PSA. Therefore individual PSA data cannot be compared directly without regard, e.g., to the corresponding fault trees. These findings are confirmed by this study. The comparison of German and French methods shows many differences concerning various details of the data generation process. Some differences between individual reliability data should be eliminated when taking into account the complete fault tree analysis. But there are some other differences which have a direct impact on the obtained results of a PSA. In view of all the differences between both approaches concerning the definition of data and the data collection process, it is not possible to compare German and French PSA data directly. However, the database differences give no indication of their influence on the PSA results. Therefore, there is a need to perform a common IPSN/GRS assessment of how the different databases impact the PSA results. (orig.)

  5. [An improved algorithm for electrohysterogram envelope extraction].

    Science.gov (United States)

    Lu, Yaosheng; Pan, Jie; Chen, Zhaoxia; Chen, Zhaoxia

    2017-02-01

    Extraction of the uterine contraction signal from the abdominal uterine electromyogram (EMG) signal is considered the most promising method to replace the traditional tocodynamometer (TOCO) for detecting uterine contraction activity. The traditional root mean square (RMS) algorithm has only limited value in cancelling impulsive noise. In our study, an improved algorithm for uterine EMG envelope extraction was proposed to overcome this problem. Firstly, in our experiment, a zero-crossing detection method was used to separate the bursts of uterine electrical activity from the raw uterine EMG signal. After processing the separated signals with two filtering windows of different widths, we used the traditional RMS algorithm to extract the uterine EMG envelope. To assess the performance of the algorithm, the improved algorithm was compared with two existing intensity of uterine electromyogram (IEMG) extraction algorithms. The results showed that the improved algorithm was better than the traditional ones in eliminating the impulsive noise present in the uterine EMG signal. The measurement sensitivity and positive predictive value (PPV) of the improved algorithm were 0.952 and 0.922, respectively, which were not only significantly higher than the corresponding values (0.859 and 0.847) of the first comparison algorithm, but also higher than the values (0.928 and 0.877) of the second comparison algorithm. Thus the new method is reliable and effective.
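
    A minimal sketch of moving-RMS envelope extraction with two window lengths, in the spirit of the method described above; it is a generic illustration on synthetic data, not the paper's exact zero-crossing-based algorithm.

```python
# Minimal sketch of moving-RMS envelope extraction with two window lengths
# (a generic illustration, not the paper's exact zero-crossing-based algorithm).
import numpy as np

def moving_rms(signal, window_samples):
    """Root-mean-square of the signal over a sliding rectangular window."""
    kernel = np.ones(window_samples) / window_samples
    return np.sqrt(np.convolve(signal ** 2, kernel, mode="same"))

fs = 200.0                                        # sampling rate in Hz (hypothetical)
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(3)
burst = (np.sin(2 * np.pi * 0.05 * t) > 0.5).astype(float)     # simulated contraction bursts
emg = burst * rng.normal(0, 1.0, t.size) + rng.normal(0, 0.1, t.size)
emg[5000] += 20.0                                 # an impulsive noise spike

short_env = moving_rms(emg, int(0.5 * fs))        # short window: follows bursts closely
long_env = moving_rms(emg, int(5.0 * fs))         # long window: suppresses the impulsive spike
print(short_env.max(), long_env.max())
```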

  6. Use of response envelopes for seismic margin assessment of reinforced concrete walls and slabs

    Energy Technology Data Exchange (ETDEWEB)

    Ile, Nicolas; Frau, Alberto, E-mail: alberto.frau@cea.fr

    2017-04-01

    Highlights: • Proposal of a method for application of the elliptical envelope to RC shell elements. • Proposal of new algorithms for the seismic margin evaluation for RC shell elements. • Verification of a RC wall 3D structure, using the proposed assessment approach. - Abstract: Seismic safety evaluations of existing nuclear facilities are usually based on the assumption of structural linearity. For the design basis earthquake (DBE), it is reasonable to apply a conventional evaluation of the seismic safety of building structures and carry out a linear elastic analysis to assess the load effects on structural elements. Estimating the seismic capacity of a structural element requires an estimation of the critical combination of responses acting in this structural element and compare this combination with the capacity of the element. By exploiting the response-spectrum-based procedure for predicting the response envelopes in linear structures formulated by Menun and Der Kiureghian (2000a), algorithms are developed for the seismic margin assessment of reinforced concrete shell finite elements. These algorithms facilitate the comparison of the response-spectrum-based envelopes to prescribed capacity surfaces for the purpose of assessing the safety margin of this kind of structures. The practical application of elliptical response envelopes in case of shell finite elements is based on the use of layer models such as those developed by Marti (1990), which transfer the generalized stress field to three layers under the assumption that the two outer layers carry membrane forces and the internal layer carries only the out-of-plane shears. The utility of the assessment approach is discussed with reference to a case study of a 3D structure made of reinforced concrete walls.

  7. Use of response envelopes for seismic margin assessment of reinforced concrete walls and slabs

    International Nuclear Information System (INIS)

    Ile, Nicolas; Frau, Alberto

    2017-01-01

    Highlights: • Proposal of a method for application of the elliptical envelope to RC shell elements. • Proposal of new algorithms for the seismic margin evaluation for RC shell elements. • Verification of a RC wall 3D structure, using the proposed assessment approach. - Abstract: Seismic safety evaluations of existing nuclear facilities are usually based on the assumption of structural linearity. For the design basis earthquake (DBE), it is reasonable to apply a conventional evaluation of the seismic safety of building structures and carry out a linear elastic analysis to assess the load effects on structural elements. Estimating the seismic capacity of a structural element requires an estimation of the critical combination of responses acting in this structural element and compare this combination with the capacity of the element. By exploiting the response-spectrum-based procedure for predicting the response envelopes in linear structures formulated by Menun and Der Kiureghian (2000a), algorithms are developed for the seismic margin assessment of reinforced concrete shell finite elements. These algorithms facilitate the comparison of the response-spectrum-based envelopes to prescribed capacity surfaces for the purpose of assessing the safety margin of this kind of structures. The practical application of elliptical response envelopes in case of shell finite elements is based on the use of layer models such as those developed by Marti (1990), which transfer the generalized stress field to three layers under the assumption that the two outer layers carry membrane forces and the internal layer carries only the out-of-plane shears. The utility of the assessment approach is discussed with reference to a case study of a 3D structure made of reinforced concrete walls.

  8. Comparison of ductile-to-brittle transition curve fitting approaches

    International Nuclear Information System (INIS)

    Cao, L.W.; Wu, S.J.; Flewitt, P.E.J.

    2012-01-01

    Ductile-to-brittle transition (DBT) curve fitting approaches are compared over the transition temperature range for reactor pressure vessel steels with different kinds of data, including Charpy V-notch impact energy data and fracture toughness data. Three DBT curve fitting methods have been frequently used in the past: the Burr, S-Weibull and tanh distributions. In general there is greater scatter associated with test data obtained within the transition region. Therefore these methods give results with different accuracies, especially when fitting to small quantities of data. The comparison shows that the Burr and tanh distributions can fit large, well-distributed data sets extending across the test temperature range, including the upper and lower shelves, almost equally well. The S-Weibull distribution fit is poor for the lower shelf of the DBT curve. Overall, for both large and small quantities of measured data, the Burr distribution provides the best description. - Highlights: ► Burr distribution offers a better fit than the S-Weibull and tanh fits. ► Burr and tanh methods show similar fitting ability for a large data set. ► Burr method can fit sparse data well distributed across the test temperature range. ► S-Weibull method cannot fit the lower shelf well and shows poor fitting quality.
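
    A minimal sketch of fitting the tanh transition model to synthetic Charpy data with scipy; the Burr and S-Weibull fits would follow the same pattern with a different model function. All parameter values are illustrative.

```python
# Minimal sketch: fitting the tanh transition model to synthetic Charpy impact data.
# The Burr or S-Weibull fits follow the same pattern with a different model function.
import numpy as np
from scipy.optimize import curve_fit

def tanh_model(T, A, B, T0, C):
    """Classic hyperbolic-tangent DBT curve: A is the mid-shelf level, B the half
    shelf-to-shelf span, T0 the transition temperature and C the transition width."""
    return A + B * np.tanh((T - T0) / C)

rng = np.random.default_rng(4)
temperature = np.linspace(-150, 100, 40)                       # test temperatures in degC
true = tanh_model(temperature, 100.0, 80.0, -50.0, 30.0)       # synthetic "true" curve in J
energy = true + rng.normal(0, 8.0, temperature.size)           # scatter typical of the transition

popt, pcov = curve_fit(tanh_model, temperature, energy, p0=[80.0, 60.0, -40.0, 20.0])
A, B, T0, C = popt
print(f"upper shelf ~ {A + B:.0f} J, lower shelf ~ {A - B:.0f} J, DBTT ~ {T0:.0f} degC")
```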

  9. Robustness Envelopes of Networks

    NARCIS (Netherlands)

    Trajanovski, S.; Martín-Hernández, J.; Winterbach, W.; Van Mieghem, P.

    2013-01-01

    We study the robustness of networks under node removal, considering random node failure, as well as targeted node attacks based on network centrality measures. Whilst both of these have been studied in the literature, existing approaches tend to study random failure in terms of average-case

  10. Moisture dynamics in building envelopes

    Energy Technology Data Exchange (ETDEWEB)

    Peuhkuri, R.

    2003-07-01

    The overall scope of this Thesis 'Moisture dynamics in building envelopes' has been to characterise how the various porous insulation materials investigated performed hygrothermally under conditions similar to those in a typical building envelope. As a result of the changing temperature and moisture conditions in the exterior weather and indoor climate, the materials dynamically absorb and release moisture. The complexity of the impact of these conditions on the resulting moisture transport and content of the materials has been studied in this Thesis with controlled laboratory tests. (au)

  11. Moisture Dynamics in Building Envelopes

    DEFF Research Database (Denmark)

    Peuhkuri, Ruut Hannele

    2003-01-01

    The overall scope of this Thesis "Moisture dynamics in building envelopes" has been to characterise how the various porous insulation materials investigated performed hygrothermally under conditions similar to those in a typical building envelope. As a result of the changing temperature...... part of the Thesis consists of a theory and literature review on the moisture storage and transport processes (Chapter 2), on the non-Fickian moisture transport (Chapter 3)and on the methods for determining the moisture properties (Chapter 4). In the second part, the conducted experimental work...

  12. Nature of 'unseen' galactic envelopes

    International Nuclear Information System (INIS)

    McCrea, W.H.

    1983-01-01

    In this paper, it is suggested that unseen matter in a galactic envelope or in a group of galaxies may consist of substellar bodies originating as the first permanent 'stars' in the formation of a very massive galaxy according to a model for galaxy-formation on the basis of simple big-bang cosmology. (Auth.)

  13. Handbook on data envelopment analysis

    CERN Document Server

    Cooper, William W; Zhu, Joe

    2011-01-01

    Focusing on extensively used Data Envelopment Analysis topics, this volume aims to both describe the state of the field and extend the frontier of DEA research. New chapters include DEA models for DMUs, network DEA, models for supply chain operations and applications, and new developments.

  14. Sensitivity of Technical Efficiency Estimates to Estimation Methods: An Empirical Comparison of Parametric and Non-Parametric Approaches

    OpenAIRE

    de-Graft Acquah, Henry

    2014-01-01

    This paper highlights the sensitivity of technical efficiency estimates to the estimation approach using empirical data. Firm-specific technical efficiency and mean technical efficiency are estimated using the non-parametric Data Envelopment Analysis (DEA) and the parametric Corrected Ordinary Least Squares (COLS) and Stochastic Frontier Analysis (SFA) approaches. Mean technical efficiency is found to be sensitive to the choice of estimation technique. Analysis of variance and Tukey’s test sugge...

  15. Conservation of the egg envelope digestion mechanism of hatching enzyme in euteleostean fishes.

    Science.gov (United States)

    Kawaguchi, Mari; Yasumasu, Shigeki; Shimizu, Akio; Sano, Kaori; Iuchi, Ichiro; Nishida, Mutsumi

    2010-12-01

    We purified two hatching enzymes, namely high choriolytic enzyme (HCE; EC 3.4.24.67) and low choriolytic enzyme (LCE; EC 3.4.24.66), from the hatching liquid of Fundulus heteroclitus, which were named Fundulus HCE (FHCE) and Fundulus LCE (FLCE). FHCE swelled the inner layer of egg envelope, and FLCE completely digested the FHCE-swollen envelope. In addition, we cloned three Fundulus cDNAs orthologous to cDNAs for the medaka precursors of egg envelope subunit proteins (i.e. choriogenins H, H minor and L) from the female liver. Cleavage sites of FHCE and FLCE on egg envelope subunit proteins were determined by comparing the N-terminal amino acid sequences of digests with the sequences deduced from the cDNAs for egg envelope subunit proteins. FHCE and FLCE cleaved different sites of the subunit proteins. FHCE efficiently cleaved the Pro-X-Y repeat regions into tripeptides to dodecapeptides to swell the envelope, whereas FLCE cleaved the inside of the zona pellucida domain, the core structure of egg envelope subunit protein, to completely digest the FHCE-swollen envelope. A comparison showed that the positions of hatching enzyme cleavage sites on egg envelope subunit proteins were strictly conserved between Fundulus and medaka. Finally, we extended such a comparison to three other euteleosts (i.e. three-spined stickleback, spotted halibut and rainbow trout) and found that the egg envelope digestion mechanism was well conserved among them. During evolution, the egg envelope digestion by HCE and LCE orthologs was established in the lineage of euteleosts, and the mechanism is suggested to be conserved. © 2010 The Authors Journal compilation © 2010 FEBS.

  16. Comparison of two novel approaches to model fibre reinforced concrete

    NARCIS (Netherlands)

    Radtke, F.K.F.; Simone, A.; Sluys, L.J.

    2009-01-01

    We present two approaches to model fibre reinforced concrete. In both approaches, discrete fibre distributions and the behaviour of the fibre-matrix interface are explicitly considered. One approach employs the reaction forces from fibre to matrix while the other is based on the partition of unity

  17. Transparent ceramic lamp envelope materials

    Energy Technology Data Exchange (ETDEWEB)

    Wei, G C [OSRAM SYLVANIA, 71 Cherry Hill Drive, Beverly, MA 01915 (United States)

    2005-09-07

    Transparent ceramic materials with optical qualities comparable to single crystals of similar compositions have been developed in recent years, as a result of the improved understanding of powder-processing-fabrication- sintering-property inter-relationships. These high-temperature materials with a range of thermal and mechanical properties are candidate envelopes for focused-beam, short-arc lamps containing various fills operating at temperatures higher than quartz. This paper reviews the composition, structure and properties of transparent ceramic lamp envelope materials including sapphire, small-grained polycrystalline alumina, aluminium oxynitride, yttrium aluminate garnet, magnesium aluminate spinel and yttria-lanthana. A satisfactory thermal shock resistance is required for the ceramic tube to withstand the rapid heating and cooling cycles encountered in lamps. Thermophysical properties, along with the geometry, size and thickness of a transparent ceramic tube, are important parameters in the assessment of its resistance to fracture arising from thermal stresses in lamps during service. The corrosive nature of lamp-fill liquid and vapour at high temperatures requires that all lamp components be carefully chosen to meet the target life. The wide range of new transparent ceramics represents flexibility in pushing the limit of envelope materials for improved beamer lamps.

  18. Development of a pneumatic roof envelope for industrial greenhouses

    NARCIS (Netherlands)

    Lindner, G.; Vos, de G.J.

    2008-01-01

    The Eindhoven University of Technology (TU/e) was approached by Van Diemen BV, a turn-key greenhouse builder looking for a new and better insulated design for their greenhouse envelope. They had developed a new system of climate control which rendered windows for ventilation purposes unnecessary

  19. A Fiducial Approach to Extremes and Multiple Comparisons

    Science.gov (United States)

    Wandler, Damian V.

    2010-01-01

    Generalized fiducial inference is a powerful tool for many difficult problems. Based on an extension of R. A. Fisher's work, we used generalized fiducial inference for two extreme value problems and a multiple comparison procedure. The first extreme value problem deals with the generalized Pareto distribution. The generalized Pareto…

  20. Safety analysis to support a safe operating envelope for fuel

    International Nuclear Information System (INIS)

    Gibb, R.A.; Reid, P.J.

    1998-01-01

    This paper presents an approach for defining a safe operating envelope for fuel. 'Safe operating envelope' is defined as an envelope of fuel parameters defined for application in safety analysis that can be related to, or used to define, the acceptable range of fuel conditions due to operational transients or deviations in fuel manufacturing processes. The paper describes the motivation for developing such a methodology. The methodology involved four steps: the update of fission product inventories, the review of sheath failure criteria, a review of input parameters to be used in fuel modelling codes, and the development of an improved fission product release code. This paper discusses the aspects of fuel sheath failure criteria that pertain to operating or manufacturing conditions and to the evaluation and selection of modelling input data. The other steps are not addressed in this paper since they have been presented elsewhere. (author)

  1. Data Envelopment Analysis of different climate policy scenarios

    International Nuclear Information System (INIS)

    Bosetti, Valentina; Buchner, Barbara

    2009-01-01

    Recent developments in the political, scientific and economic debate on climate change suggest that it is of critical importance to develop new approaches able to compare policy scenarios for their environmental effectiveness, their distributive effects, their enforceability, their costs and many other dimensions. This paper discusses a quantitative methodology to assess the relative performance of different climate policy scenarios when accounting for their long-term economic, social and environmental impacts. The proposed procedure is based on Data Envelopment Analysis, here employed in evaluating the relative efficiency of eleven global climate policy scenarios. The methodology provides a promising comparison framework; it can be seen as a way of setting some basic guidelines to frame further debates and negotiations and can be flexibly adopted and modified by decision makers to obtain relevant information for policy design. Three major findings emerge from this analysis: (1) stringent climate policies can outperform less ambitious proposals if all sustainability dimensions are taken into account; (2) a carefully chosen burden-sharing rule is able to bring together climate stabilisation and equity considerations; and (3) the most inefficient strategy results from the failure to negotiate a post-2012 global climate agreement. (author)

  2. A Quantitative Comparison of Semantic Web Page Segmentation Approaches

    NARCIS (Netherlands)

    Kreuzer, Robert; Hage, J.; Feelders, A.J.

    2015-01-01

    We compare three known semantic web page segmentation algorithms, each serving as an example of a particular approach to the problem, and one self-developed algorithm, WebTerrain, that combines two of the approaches. We compare the performance of the four algorithms for a large benchmark of modern

  3. Hong Kong Students' Approaches to Learning: Cross-Cultural Comparisons

    Science.gov (United States)

    Dasari, Bhoomiah

    2009-01-01

    Anecdotal evidence abounds in Hong Kong to the effect that students entering tertiary education are predisposed to a "rote" learning approach. With the internationalisation of higher education in many countries, there is still insufficient understanding of how Chinese students approach their learning. Except few studies were conducted…

  4. Comparison of effective Hough transform-based fingerprint alignment approaches

    CSIR Research Space (South Africa)

    Mlambo, CS

    2014-08-01

    Full Text Available points set with larger rotation and small number of points. The DRBA approach was found to perform better with minutiae points with large amount of translation, and the computational time was less than that of LMBA approach. However, the memory usage...

  5. Investigating design: A comparison of manifest and latent approaches

    DEFF Research Database (Denmark)

    Cash, Philip; Snider, Chris

    2014-01-01

    This paper contributes to the on-going focus on improving design research methods by exploring and synthesising two key interrelated research approaches: manifest and latent. These approaches are widely used individually in design research; however, this paper represents the first work bringing

  6. A Comparison of Five Alternative Approaches to Information Systems Development

    Directory of Open Access Journals (Sweden)

    Rudy Hirschheim

    1997-11-01

    Full Text Available The field of information systems (IS has grown dramatically over the past three decades. Recent trends have transformed the IS landscape. These trends include: the evolution of implementation technology from centralized mainframe environments towards distributed client-server architectures, embracing the internet and intranets; changes in user interface technology from character-based to graphical user interfaces, multimedia, and the World Wide Web; changes in applications from transaction processing systems towards systems supporting collaborative work; and the use of information technology as an enabler of business process reengineering and redesign. These technology changes coupled with changes in organizations and their operating environment, such as the growth of the network and virtual organization, internationalization and globalization of many organizations, intensified global competition, changes in values such as customer orientation (service quality and Quality of Working Life, have imposed new demands on the development of information systems. These changes have led to an increasing discussion about information systems development (ISO, and in particular, the various methods, tools, methodologies, and approaches for ISD. We believe such discussion has opened the door for new, alternative IS development approaches and methodologies. Our paper takes up this theme by describing five alternative ISD approaches, namely the Interactionist approach, the Speech Act-based approach, Soft Systems Methodology, the Trade Unionist approach, and the Professional Work Practices approach. Despite the fact that most of these approaches have a history of over 15 years, their relevance to IS development is not well recognized in the mainstream of IS practice and research, nor is their institutional status comparable to traditional approaches such as structured analysis and design methods. Therefore we characterize the five approaches as 'alternative' in

  7. Knowledge-based biomedical word sense disambiguation: comparison of approaches

    Directory of Open Access Journals (Sweden)

    Aronson Alan R

    2010-11-01

    Full Text Available Background: Word sense disambiguation (WSD) algorithms attempt to select the proper sense of ambiguous terms in text. Resources like the UMLS provide a reference thesaurus to be used to annotate the biomedical literature. Statistical learning approaches have produced good results, but the size of the UMLS makes the production of training data infeasible to cover the whole domain. Methods: We present research on existing WSD approaches based on knowledge bases, which complement the studies performed on statistical learning. We compare four approaches which rely on the UMLS Metathesaurus as the source of knowledge. The first approach compares the overlap of the context of the ambiguous word with the candidate senses, based on a representation built out of the definitions, synonyms and related terms. The second approach collects training data for each of the candidate senses by building queries from monosemous synonyms and related terms; these queries are used to retrieve MEDLINE citations, and a machine learning approach is then trained on this corpus. The third approach is a graph-based method which exploits the structure of the Metathesaurus network of relations to perform unsupervised WSD. This approach ranks nodes in the graph according to their relative structural importance. The last approach uses the semantic types assigned to the concepts in the Metathesaurus to perform WSD. The context of the ambiguous word and the semantic types of the candidate concepts are mapped to Journal Descriptors, and these mappings are compared to decide among the candidate concepts. Results: Results are provided estimating the accuracy of the different methods on the WSD test collection available from the NLM. Conclusions: We have found that the last approach achieves better results compared to the other methods. The graph-based approach, using the structure of the Metathesaurus network to estimate the relevance of the Metathesaurus concepts, does not perform well
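
    A minimal sketch of the first, overlap-based approach: score each candidate sense by the token overlap between the ambiguous word's context and a bag-of-words profile built from definitions, synonyms and related terms. The two senses and profiles below are toy stand-ins, not UMLS content.

```python
# Minimal sketch of overlap-based word sense disambiguation: pick the candidate sense
# whose definition/synonym profile shares the most tokens with the context (toy data).
import re

sense_profiles = {
    "cold_temperature": "low temperature chilly freezing weather thermal sensation",
    "common_cold": "viral infection rhinitis cough sneezing sore throat upper respiratory",
}

def tokens(text):
    return set(re.findall(r"[a-z]+", text.lower()))

def disambiguate(context, profiles):
    """Return the sense whose profile shares the most tokens with the context."""
    context_tokens = tokens(context)
    scores = {sense: len(context_tokens & tokens(profile))
              for sense, profile in profiles.items()}
    return max(scores, key=scores.get), scores

sentence = "The patient presented with cough, sneezing and a sore throat after catching a cold."
best, scores = disambiguate(sentence, sense_profiles)
print(best, scores)   # expected: common_cold scores highest
```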

  8. Green Infrastructure and German Landscape Planning: A Comparison of Approaches

    Directory of Open Access Journals (Sweden)

    Catalina VIEIRA MEJÍA

    2015-11-01

    Full Text Available A variety of similarities between green infrastructure and the German landscape planning can be found in comparing the approaches of the two planning instruments. Principles of green infrastructure such as multifunctionality, the multi-scale approach and connectivity show correspondences with landscape planning elements. However, some differences are apparent. The objective of this paper is to determine whether the main aims of these two frameworks overlap. It also seeks to deduce what benefits from ecosystem services could be provided by integrating the green infrastructure approach into the German landscape planning system. The results show that the green infrastructure concept is not well-known in German planning practice, although its principles are generally implemented through traditional landscape planning. Nevertheless, green infrastructure could act as a supplementary approach to current landscape planning practices by improving public acceptance and strengthening the social focus of the current landscape planning system.

  9. Comparisons on International Approaches of Business and Project Risk Management

    OpenAIRE

    Nadia Carmen ENE

    2005-01-01

    In this article we intend to present a comparative approach between three recognized international methodologies for risk management: RISKMAN, Project Management Institute Methodology-PMBoK and Project Risk Analysis and Management Guide (produced by Association for Project Management).

  10. Comparison of topic extraction approaches and their results

    NARCIS (Netherlands)

    Velden, Theresa; Boyack, Kevin W.; Gläser, Jochen; Koopman, Rob; Scharnhorst, Andrea; Wang, Shenghui

    2017-01-01

    This is the last paper in the Synthesis section of this special issue on ‘Same Data, Different Results’. We first provide a framework of how to describe and distinguish approaches to topic extraction

  11. Cortical processing of dynamic sound envelope transitions.

    Science.gov (United States)

    Zhou, Yi; Wang, Xiaoqin

    2010-12-08

    Slow envelope fluctuations in the range of 2-20 Hz provide important segmental cues for processing communication sounds. For a successful segmentation, a neural processor must capture envelope features associated with the rise and fall of signal energy, a process that is often challenged by the interference of background noise. This study investigated the neural representations of slowly varying envelopes in quiet and in background noise in the primary auditory cortex (A1) of awake marmoset monkeys. We characterized envelope features based on the local average and rate of change of sound level in envelope waveforms and identified envelope features to which neurons were selective by reverse correlation. Our results showed that envelope feature selectivity of A1 neurons was correlated with the degree of nonmonotonicity in their static rate-level functions. Nonmonotonic neurons exhibited greater feature selectivity than monotonic neurons in quiet and in background noise. The diverse envelope feature selectivity decreased spike-timing correlation among A1 neurons in response to the same envelope waveforms. As a result, the variability, but not the average, of the ensemble responses of A1 neurons represented more faithfully the dynamic transitions in low-frequency sound envelopes both in quiet and in background noise.

  12. Spectral Envelope Transformation in Singing Voice for Advanced Pitch Shifting

    Directory of Open Access Journals (Sweden)

    José L. Santacruz

    2016-11-01

    Full Text Available The aim of the present work is to take a step towards more natural pitch shifting techniques for singing voice, for application in music production and entertainment systems. In this paper, we present an advanced method to achieve natural modifications when applying a pitch shifting process to singing voice by modifying the spectral envelope of the audio excerpt. To this end, an all-pole model has been selected to model the spectral envelope, which is estimated using a constrained non-linear optimization. The analysis of the global variations of the spectral envelope was carried out by identifying changes in the model parameters along with changes in pitch. With the obtained spectral envelope transformation functions, we applied our pitch shifting scheme to some sustained vowels in order to compare results with the same transformation made using the Flex Pitch plugin of Logic Pro X and the pitch-synchronous overlap-and-add (PSOLA) technique. This comparison has been carried out by means of both an objective and a subjective evaluation. The latter was done with a survey open to volunteers on our website.
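
    The abstract refers to an all-pole model of the spectral envelope fitted by constrained non-linear optimization; that exact procedure is not given. The sketch below instead uses the classical autocorrelation (Levinson-Durbin) linear-prediction estimate of an all-pole envelope as an illustrative stand-in; frame length, model order, FFT size and the approximate gain normalization are assumptions.

        import numpy as np

        def lpc_envelope(frame, order=20, n_fft=1024):
            """All-pole spectral envelope of a windowed frame via Levinson-Durbin."""
            x = frame * np.hanning(len(frame))
            r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order]
            a = np.zeros(order + 1)
            a[0] = 1.0
            err = r[0]
            for i in range(1, order + 1):            # Levinson-Durbin recursion
                acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
                k = -acc / err
                a_prev = a.copy()
                a[1:i] = a_prev[1:i] + k * a_prev[i - 1:0:-1]
                a[i] = k
                err *= (1.0 - k * k)
            # Magnitude envelope: gain / |A(e^{jw})| on an n_fft-point grid
            return np.sqrt(err) / np.abs(np.fft.rfft(a, n_fft))

        # Example: envelope of a synthetic vowel-like frame sampled at 16 kHz
        sr = 16000
        t = np.arange(1024) / sr
        rng = np.random.default_rng(0)
        frame = sum(np.sin(2 * np.pi * f * t) for f in (220.0, 440.0, 660.0))
        frame += 0.05 * rng.normal(size=frame.size)  # small noise to keep r well conditioned
        env = lpc_envelope(frame)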

  13. CONTROL OF INDOOR ENVIRONMENTS VIA THE REGULATION OF BUILDING ENVELOPES

    Directory of Open Access Journals (Sweden)

    Mitja Košir

    2011-01-01

    Full Text Available The design of comfortable, healthy and stimulating indoor environments in buildings has a direct impact on the users and on energy consumption, as well as on the wider socio-economic environment of society. The indoor environment of buildings is defined with the formulation of the building envelope, which functions as an interface between the internal and external environments and its users. A properly designed, flexible and adequately controlled building envelope is a starting point in the formulation of a high-quality indoor environment. The systematic treatment of the indoor environment and building envelope from a user’s point of view represents an engineering approach that enables the holistic treatment of buildings, as well as integrated components and systems. The presented division of indoor environment in terms of visual, thermal, olfactory, acoustic and ergonomic sub-environments enables the classification and selection of crucial factors influencing design. This selection and classification can be implemented in the design, as well as in control applications of the building envelope. The implementation of the approach described is demonstrated with an example of an automated control system for the internal environment of an office in the building of the Faculty of Civil and Geodetic Engineering.

  14. A Multiple Cross-Cultural Comparison of Approaches to Learning

    Science.gov (United States)

    Bowden, Mark P.; Abhayawansa, Subhash; Manzin, Gregoria

    2015-01-01

    This study compares learning approaches of local English-speaking students and students from Asian countries studying at an Australian metropolitan university. The sample consists of students across 13 different countries. Unlike previous studies, students from Asian countries are subdivided into two categories: students from Confucian Heritage…

  15. Comparison of educational facilitation approaches for Grade R ...

    African Journals Online (AJOL)

    The Early Childhood Development Manager in Mpumalanga is faced with the problem of providing evidence-based guidance of the best facilitation approach in the Grade R context. An investigation on the effect of facilitation, i.e. play-based or formal instruction, on Grade R performance scores in English Additional ...

  16. A Comparison of HPT and Traditional Training Approaches.

    Science.gov (United States)

    Kretz, Richard

    2002-01-01

    Focuses on the comparative use of training from human performance technology (HPT) and traditional training perspectives, based on taxonomy. Concludes that the primary difference is a holistic systems performance improvement approach by eliminating barriers with HPT versus reaction or response to a set of business objectives in traditional…

  17. Comparison of Calculation Approaches for Monopiles for Offshore Wind Turbines

    DEFF Research Database (Denmark)

    Augustesen, Anders Hust; Sørensen, Søren Peder Hyldal; Ibsen, Lars Bo

    2010-01-01

    Large-diameter (4 to 6m) monopiles are often used as foundations for offshore wind turbines. The monopiles are subjected to large horizontal forces and overturning moments and they are traditionally designed based on the p-y curve method (Winkler type approach). The p-y curves recommended in offs...

  18. Safeguards Envelope Progress FY10

    International Nuclear Information System (INIS)

    Metcalf, Richard

    2010-01-01

    The Safeguards Envelope is a strategy to determine a set of specific operating parameters within which nuclear facilities may operate to maximize safeguards effectiveness without sacrificing safety or plant efficiency. This paper details the additions to the advanced operating techniques that will be applied to real plant process monitoring (PM) data from the Idaho Chemical Processing Plant (ICPP). Research this year focused on combining disparate pieces of data together to maximize operating time with minimal downtime due to safeguards. A chi-square test and Crosier's cumulative sum were both included in the new analysis. Because of a major issue with the original data, the implementation of the two new tests did not add to the existing set of tests, though limited one-variable optimization produced a small increase in detection probability. Additional analysis was performed to determine whether the prior analysis would have caused a major security or safety operating envelope issue. It was determined that a safety issue would have resulted from the prior research, but that security may have been increased under certain conditions.
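
    Crosier's scheme mentioned above is a multivariate cumulative-sum test, and the report does not give its implementation. As a minimal illustration of the cumulative-sum idea applied to process-monitoring data, the sketch below runs a textbook one-sided univariate CUSUM on a standardized residual sequence (for example, standardized material-balance differences); the allowance and threshold values are assumptions.

        import numpy as np

        def one_sided_cusum(z, k=0.5, h=5.0):
            """One-sided upper CUSUM on a standardized sequence z.

            k : allowance (in standard deviations), h : decision threshold.
            Returns the CUSUM statistic and the first alarm index (or None).
            Textbook univariate scheme, shown only to illustrate the idea;
            Crosier's method is a multivariate variant.
            """
            s = np.zeros(len(z))
            alarm = None
            for i, zi in enumerate(z):
                s[i] = max(0.0, (s[i - 1] if i else 0.0) + zi - k)
                if alarm is None and s[i] > h:
                    alarm = i
            return s, alarm

        # Example: a small sustained shift appears halfway through the sequence
        rng = np.random.default_rng(0)
        z = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(0.8, 1.0, 200)])
        s, alarm = one_sided_cusum(z)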

  19. Comparison of attitude determination approaches using multiple Global Positioning System (GPS antennas

    Directory of Open Access Journals (Sweden)

    Wang Bing

    2013-02-01

    Full Text Available GPS-based attitude determination is an important research field, since it is a valuable technique for determining the attitude of platforms. There exist two classes of approaches to attitude determination using GPS: one determines attitude via baseline estimates in two frames, while the other solves for attitude by incorporating the attitude parameters directly into the GPS measurements. However, comparisons between these two classes of approaches have been unexplored. First, two algorithms representative of the two kinds of approaches are introduced in detail. Then we present numerical simulations demonstrating the performance of the algorithms and provide a comparative evaluation.
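
    For the first class of approaches (attitude from baseline vectors expressed in two frames), a standard way to recover the rotation is the SVD solution of Wahba's problem, sketched below in Python. This is a generic solution under an assumed least-squares weighting, not necessarily the specific algorithm compared in the paper.

        import numpy as np

        def attitude_from_baselines(body_vecs, ref_vecs, weights=None):
            """Rotation R minimizing sum w_i ||b_i - R r_i||^2 (Wahba's problem).

            body_vecs, ref_vecs : (n, 3) baseline vectors in the body and
            reference frames; weights are optional per-baseline weights.
            """
            b = np.asarray(body_vecs, float)
            r = np.asarray(ref_vecs, float)
            w = np.ones(len(b)) if weights is None else np.asarray(weights, float)
            B = (w[:, None] * b).T @ r                    # attitude profile matrix
            U, _, Vt = np.linalg.svd(B)
            d = np.sign(np.linalg.det(U) * np.linalg.det(Vt))
            return U @ np.diag([1.0, 1.0, d]) @ Vt        # proper rotation, det = +1

        # Example: recover a known 30-degree yaw rotation from two baselines
        yaw = np.radians(30.0)
        R_true = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                           [np.sin(yaw),  np.cos(yaw), 0.0],
                           [0.0,          0.0,         1.0]])
        ref = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
        body = ref @ R_true.T                             # b_i = R_true r_i
        R_est = attitude_from_baselines(body, ref)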

  20. Evolution of envelope solitons of ionization waves

    International Nuclear Information System (INIS)

    Ohe, K.; Hashimoto, M.

    1985-01-01

    The time evolution of a particle-like envelope soliton of ionization waves in plasma was investigated theoretically. The hydrodynamic equations in one spatial dimension were solved and the nonlinear dispersion relation was derived. The nonlinear Schroedinger equation was derived for the wave amplitude, and its soliton solution was interpreted as the envelope soliton found experimentally. The damping rate of the envelope soliton was estimated. (D.Gy.)
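
    The abstract does not reproduce the equation; in normalized units a commonly used form of the cubic nonlinear Schroedinger equation for the complex wave amplitude a(x,t), together with its bright envelope-soliton solution, is (signs and coefficients depend on the normalization chosen):

        i\,\frac{\partial a}{\partial t} + \frac{1}{2}\,\frac{\partial^2 a}{\partial x^2} + |a|^2 a = 0,
        \qquad
        a(x,t) = a_0\,\operatorname{sech}\!\bigl(a_0 (x - v t)\bigr)\,
                 \exp\!\Bigl(i\bigl[v x + \tfrac{1}{2}(a_0^2 - v^2)\,t\bigr]\Bigr).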

  1. Mutational library analysis of selected amino acids in the receptor binding domain of envelope of Akv murine leukemia virus by conditionally replication competent bicistronic vectors

    DEFF Research Database (Denmark)

    Bahrami, Shervin; Jespersen, Thomas; Pedersen, Finn Skou

    2003-01-01

    The envelope protein of retroviruses is responsible for viral entry into host cells. Here, we describe a mutational library approach to dissect functional domains of the envelope protein involving a retroviral vector, which expresses both the envelope protein of Akv murine leukemia virus (MLV) an...

  2. Comparison of Plant Life Management Approaches for Long Term Operations

    International Nuclear Information System (INIS)

    Kang, Kisig

    2012-01-01

    Plant life management can be defined as the integration of ageing and economic planning to maintain a high level of safety and optimize operations. Many Member States have given high priority to long term operation of nuclear power plants beyond the time frame originally anticipated (e.g. 30 or 40 years). Out of a total of 445 (369 GWe) operating nuclear power plants, 349 units (297 GWe) have been in operation for more than 20 years (as of November 2011). The need for engineering support to operation, maintenance, safety review and life management for long term operation as well as education and training in the field is increasingly evident. In addition the Fukushima accident has rendered all stakeholders even more attentive to safety concerns and to the provision of beyond safety measures in the preparation and scrutiny of applications for operational design life extensions. In many countries, the safety performance of NPPs is periodically followed and characterized via the periodic safety review (PSR) approach. The regulatory review and acceptance of the PSR gives the licensee the permission to operate the plant for up to the end of the next PSR cycle (usually 10 years). In the USA and other countries operating US designed plants, the license renewal application is based on the five pre-requisite requirements and an ageing management programme for passive, long-life systems, structures and components (SSCs), while active systems are adequately addressed by the maintenance rule (MR) requirements and other established regulatory processes. Other Member States have adopted a combined approach that incorporates elements of both PSR and additional LRA specific requirements primarily focused on time limited ageing analysis. Taking into account this variety of approaches, the International Atomic Energy Agency (IAEA) initiated work for collecting and sharing information among Member States about good practices on plant life management for long term operation in

  3. Comparison of Different Approaches to the Cutting Plan Scheduling

    Directory of Open Access Journals (Sweden)

    Peter Bober

    2011-10-01

    Full Text Available Allocation of specific cutting plans and their scheduling on individual cutting machines presents a combinatorial optimization problem. Various approaches and methods are used to arrive at a viable solution. The paper reports three approaches represented by three discrete optimization methods. The first one is a back-tracking algorithm, which serves as a reference to verify the functionality of the other two. The second method is optimization using genetic algorithms, and the third is a heuristic approach to optimization based on anticipated properties of an optimal solution. Research results indicate that genetic algorithms are computationally demanding, though not dependent on the selected objective function. The heuristic algorithm is fast but depends on the anticipated properties of the optimal solution; hence, it has to be changed whenever the objective function changes. When scheduling by genetic algorithms can be solved in a sufficiently short time, it is more appropriate from a practical point of view than the heuristic algorithm. The back-tracking algorithm usually does not provide a result in a feasible period of time.
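
    The genetic-algorithm alternative discussed above can be illustrated with a deliberately small sketch: cutting plans are assigned to machines by an integer chromosome, and the makespan (latest machine completion time) serves as the objective. The problem data, operators and parameter values are hypothetical; the paper's actual encoding and objective function are not specified in the abstract.

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical data: processing time of each cutting plan on each machine
        n_plans, n_machines = 12, 3
        times = rng.uniform(1.0, 5.0, size=(n_plans, n_machines))

        def makespan(assignment):
            """Completion time of the busiest machine for a plan->machine assignment."""
            loads = np.zeros(n_machines)
            for plan, machine in enumerate(assignment):
                loads[machine] += times[plan, machine]
            return loads.max()

        def genetic_search(pop_size=40, generations=200, p_mut=0.1):
            """Tiny GA: elitism, tournament selection, uniform crossover, mutation."""
            pop = rng.integers(0, n_machines, size=(pop_size, n_plans))
            for _ in range(generations):
                fitness = np.array([makespan(ind) for ind in pop])
                new_pop = [pop[fitness.argmin()].copy()]          # keep the best
                while len(new_pop) < pop_size:
                    i, j = rng.integers(0, pop_size, size=2)      # tournament 1
                    a = pop[i] if fitness[i] < fitness[j] else pop[j]
                    i, j = rng.integers(0, pop_size, size=2)      # tournament 2
                    b = pop[i] if fitness[i] < fitness[j] else pop[j]
                    mask = rng.random(n_plans) < 0.5              # uniform crossover
                    child = np.where(mask, a, b)
                    mutate = rng.random(n_plans) < p_mut
                    child[mutate] = rng.integers(0, n_machines, size=mutate.sum())
                    new_pop.append(child)
                pop = np.array(new_pop)
            fitness = np.array([makespan(ind) for ind in pop])
            return pop[fitness.argmin()], fitness.min()

        best_assignment, best_makespan = genetic_search()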

  4. Envelope correlation in (N, N) MIMO antenna array from scattering parameters

    DEFF Research Database (Denmark)

    Thaysen, Jesper; Jakobsen, Kaj Bjarne

    2006-01-01

    A simple closed-form equation to calculate the envelope correlation between any two receiver or transmitter antennas in a multi-input multi-output (MIMO) system of an arbitrary number of elements is derived. The equation uses the scattering parameters obtained at the antenna feed point to calculate the envelope correlation coefficient. This approach has the advantage that it does not require knowledge of the antenna radiation pattern. Numerical data that include conductor and permittivity loss are shown to validate the approach.
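
    For reference, the widely quoted two-antenna special case of this kind of relation (valid for lossless antennas in a uniform multipath environment) expresses the envelope correlation directly in terms of the scattering parameters; the paper generalizes expressions of this type to an (N, N) array:

        \rho_e \;=\; \frac{\bigl|\,S_{11}^{*}S_{12} + S_{21}^{*}S_{22}\,\bigr|^{2}}
        {\bigl(1 - |S_{11}|^{2} - |S_{21}|^{2}\bigr)\bigl(1 - |S_{22}|^{2} - |S_{12}|^{2}\bigr)}.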

  5. An ANOVA approach for statistical comparisons of brain networks.

    Science.gov (United States)

    Fraiman, Daniel; Fraiman, Ricardo

    2018-03-16

    The study of brain networks has developed extensively over the last couple of decades. By contrast, techniques for the statistical analysis of these networks are less developed. In this paper, we focus on the statistical comparison of brain networks in a nonparametric framework and discuss the associated detection and identification problems. We tested network differences between groups with an analysis of variance (ANOVA) test we developed specifically for networks. We also propose and analyse the behaviour of a new statistical procedure designed to identify different subnetworks. As an example, we show the application of this tool in resting-state fMRI data obtained from the Human Connectome Project. We identify, among other variables, that the amount of sleep in the days before the scan is a relevant variable that must be controlled. Finally, we discuss the potential bias in neuroimaging findings that is generated by some behavioural and brain structure variables. Our method can also be applied to other kinds of networks such as protein interaction networks, gene networks or social networks.

  6. Comparison of different homogenization approaches for elastic–viscoplastic materials

    International Nuclear Information System (INIS)

    Mercier, S; Molinari, A; Berbenni, S; Berveiller, M

    2012-01-01

    Homogenization of linear viscoelastic and non-linear viscoplastic composite materials is considered in this paper. First, we compare two homogenization schemes based on the Mori–Tanaka method coupled with the additive interaction (AI) law proposed by Molinari et al (1997 Mech. Mater. 26 43–62) or coupled with a concentration law based on translated fields (TF) originally proposed for the self-consistent scheme by Paquin et al (1999 Arch. Appl. Mech. 69 14–35). These methods are also evaluated against (i) full-field calculations of the literature based on the finite element method and on fast Fourier transform, (ii) available analytical exact solutions obtained in linear viscoelasticity and (iii) homogenization methods based on variational approaches. Developments of the AI model are obtained for linear and non-linear material responses while results for the TF method are shown for the linear case. Various configurations are considered: spherical inclusions, aligned fibers, hard and soft inclusions, large material contrasts between phases, volume-preserving versus dilatant anelastic flow, non-monotonic loading. The agreement between the AI and TF methods is excellent and the correlation with full field calculations is in general of quite good quality (with some exceptions for non-linear composites with a large volume fraction of very soft inclusions for which a discrepancy of about 15% was found for macroscopic stress). Description of the material behavior with internal variables can be accounted for with the AI and TF approaches and therefore complex loadings can be easily handled in contrast with most hereditary approaches. (paper)

  7. A comparison of the Bayesian and frequentist approaches to estimation

    CERN Document Server

    Samaniego, Francisco J

    2010-01-01

    This monograph contributes to the area of comparative statistical inference. Attention is restricted to the important subfield of statistical estimation. The book is intended for an audience having a solid grounding in probability and statistics at the level of the year-long undergraduate course taken by statistics and mathematics majors. The necessary background on Decision Theory and the frequentist and Bayesian approaches to estimation is presented and carefully discussed in Chapters 1-3. The 'threshold problem' - identifying the boundary between Bayes estimators which tend to outperform st

  8. A comprehensive comparison of comparative RNA structure prediction approaches

    DEFF Research Database (Denmark)

    Gardner, P. P.; Giegerich, R.

    2004-01-01

    Background An increasing number of researchers have released novel RNA structure analysis and prediction algorithms for comparative approaches to structure prediction. Yet, independent benchmarking of these algorithms is rarely performed as is now common practice for protein-folding, gene-finding and multiple-sequence-alignment algorithms. Results Here we evaluate a number of RNA folding algorithms using reliable RNA data-sets and compare their relative performance. Conclusions We conclude that comparative data can enhance structure prediction but structure-prediction-algorithms vary widely in terms...

  9. Mosaic HIV envelope immunogenic polypeptides

    Science.gov (United States)

    Korber, Bette T. M.; Gnanakaran, S.; Perkins, Simon; Sodroski, Joseph; Haynes, Barton

    2018-01-02

    Disclosed herein are mosaic HIV envelope (Env) polypeptides that can elicit an immune response to HIV (such as cytotoxic T cell (CTL), helper T cell, and/or humoral responses). Also disclosed are sets of the disclosed mosaic Env polypeptides, which include two or more (for example, three) of the polypeptides. Also disclosed herein are methods for treating or inhibiting HIV in a subject including administering one or more of the disclosed immunogenic polypeptides or compositions to a subject infected with HIV or at risk of HIV infection. In some embodiments, the methods include inducing an immune response to HIV in a subject comprising administering to the subject at least one (such as two, three, or more) of the immunogenic polypeptides or at least one (such as two, three, or more) nucleic acids encoding at least one of the immunogenic polypeptides disclosed herein.

  10. Solar envelope zoning: application to the city planning process. Los Angeles case study

    Energy Technology Data Exchange (ETDEWEB)

    1980-06-01

    Solar envelope zoning represents a promising approach to solar access protection. A solar envelope defines the volume within which a building will not shade adjacent lots or buildings. Other solar access protection techniques, such as privately negotiated easements, continue to be tested and implemented but none offer the degree of comprehensiveness evident in this approach. Here, the City of Los Angeles, through the Mayor's Energy Office, the City Planning Department, and the City Attorney's Office, examined the feasibility of translating the concept of solar envelopes into zoning techniques. They concluded that envelope zoning is a fair and consistent method of guaranteeing solar access, but problems of complexity and uncertainty may limit its usefulness. Envelope zoning may be inappropriate for the development of high density centers and for more restrictive community plans. Aids or tools to administer envelope zoning need to be developed. Finally, some combination of approaches, including publicly recorded easements, subdivision approval and envelope zoning, need to be adopted to encourage solar use in cities. (MHR)

  11. Comparison of interbody fusion approaches for disabling low back pain.

    Science.gov (United States)

    Hacker, R J

    1997-03-15

    This is a study comparing two groups of patients surgically treated for disabling low back pain. One group was treated with lumbar anteroposterior fusion (360 degrees fusion), the other with posterior lumbar interbody fusion and an interbody fixation device. To determine which approach provided the best and most cost-effective outcome using similar patient selection criteria. Others have shown that certain patients with disabling low back pain benefit from lumbar fusion. Although rarely reported, the costs of different surgical treatments appear to vary significantly, whereas the patient outcome may vary little. Since 1991, 75 patients have been treated. Starting in 1993, posterior lumbar interbody fusion with the BAK device was offered to patients as an alternative to 360 degrees fusion. The treating surgeon reviewed the cases. The interbody fixation device used (BAK; Spine-Tech, Inc., Minneapolis, MN) was part of a Food and Drug Administration study. Patient selection criteria included examination, response to conservative therapy, imaging, psychological profile, and discography. North American Spine Society outcome questionnaires, BAK investigation data, radiographs, chart entries, billing records and patient interviews were the basis for assessment. Age, sex, compensable injury history and history of previous surgery were similar. Operative time, blood loss, hospitalization time, and total costs were significantly different. There was a quicker return to work and closure of workers' compensation claims for the posterior lumbar interbody fusion-BAK group. Patient satisfaction was comparable at last follow-up. Posterior lumbar interbody fusion-BAK achieves equal patient satisfaction but fiscally surpasses the 360 degrees fusion approach. Today's environment of regulated medical practice requires the surgeon to consider cost effectiveness when performing fusion for low back pain.

  12. Comparison of Computational Approaches for Rapid Aerodynamic Assessment of Small UAVs

    Science.gov (United States)

    Shafer, Theresa C.; Lynch, C. Eric; Viken, Sally A.; Favaregh, Noah; Zeune, Cale; Williams, Nathan; Dansie, Jonathan

    2014-01-01

    Computational Fluid Dynamic (CFD) methods were used to determine the basic aerodynamic, performance, and stability and control characteristics of the unmanned air vehicle (UAV), Kahu. Accurate and timely prediction of the aerodynamic characteristics of small UAVs is an essential part of military system acquisition and air-worthiness evaluations. The forces and moments of the UAV were predicted using a variety of analytical methods for a range of configurations and conditions. The methods included Navier Stokes (N-S) flow solvers (USM3D, Kestrel and Cobalt) that take days to set up and hours to converge on a single solution; potential flow methods (PMARC, LSAERO, and XFLR5) that take hours to set up and minutes to compute; empirical methods (Datcom) that involve table lookups and produce a solution quickly; and handbook calculations. A preliminary aerodynamic database can be developed very efficiently by using a combination of computational tools. The database can be generated with low-order and empirical methods in linear regions, then replacing or adjusting the data as predictions from higher order methods are obtained. A comparison of results from all the data sources as well as experimental data obtained from a wind-tunnel test will be shown and the methods will be evaluated on their utility during each portion of the flight envelope.

  13. Low-cost phase change material as an energy storage medium in building envelopes: Experimental and numerical analyses

    International Nuclear Information System (INIS)

    Biswas, Kaushik; Abhari, Ramin

    2014-01-01

    Highlights: • Testing of a low-cost bio-PCM in an exterior wall under varying weather conditions. • Numerical model validation and annual simulations of PCM-enhanced cellulose insulation. • Reduced wall-generated cooling electricity consumption due to the application of PCM. • PCM performance was sensitive to its location and distribution within the wall. - Abstract: A promising approach to increasing the energy efficiency of buildings is the implementation of a phase change material (PCM) in the building envelope. Numerous studies over the last two decades have reported the energy saving potential of PCMs in building envelopes, but their wide application has been inhibited, in part, by their high cost. This article describes a novel PCM made of naturally occurring fatty acids/glycerides trapped into high density polyethylene (HDPE) pellets and its performance in a building envelope application. The PCM–HDPE pellets were mixed with cellulose insulation and then added to an exterior wall of a test building in a hot and humid climate, and tested over a period of several months. To demonstrate the efficacy of the PCM-enhanced cellulose insulation in reducing the building envelope heat gains and losses, a side-by-side comparison was performed with another wall section filled with cellulose-only insulation. Further, numerical modeling of the test wall was performed to determine the actual impact of the PCM–HDPE pellets on wall-generated heating and cooling loads and the associated electricity consumption. The model was first validated using experimental data and then used for annual simulations using typical meteorological year (TMY3) weather data. This article presents the experimental data and numerical analyses showing the energy-saving potential of the new PCM

  14. Implementation of an Improved Safe Operating Envelope

    International Nuclear Information System (INIS)

    Prime, Robyn; McIntyre, Mark; Reeves, David

    2008-01-01

    This paper is a continuation of the paper presented at IYNC 2004 on 'The Definition of a Safe Operating Envelope'. The current paper concentrates on the implementation process of the Safe Operating Envelope employed at the Point Lepreau Generating Station. (authors)

  15. Physical properties of the red giant envelopes

    Energy Technology Data Exchange (ETDEWEB)

    Maciel, W J [Instituto de Astronomia e Geofisico da Universidade de Sao Paulo (Brazil)

    1978-12-01

    In this work, several model envelopes are calculated for cool giant stars with mass loss due to the action of stellar radiation pressure on molecules and grains. Molecular profiles as well as average values of some physical parameters of the envelopes are obtained.

  16. Physical properties of the red giant envelopes

    International Nuclear Information System (INIS)

    Maciel, W.J.

    1978-01-01

    In this work, several model envelopes are calculated for cool giant stars with mass loss due to the action of stellar radiation pressure on molecules and grains. Molecular profiles as well as average values of some physical parameters of the envelopes are obtained [pt

  17. Implementation of an Improved Safe Operating Envelope

    Energy Technology Data Exchange (ETDEWEB)

    Prime, Robyn; McIntyre, Mark [NB Power Nuclear, P.O. Box 600, Lepreau, NB (Canada); Reeves, David [Atlantic Nuclear Services Ltd., PO Box 1268 Fredericton, NB (Canada)

    2008-07-01

    This paper is a continuation of the paper presented at IYNC 2004 on 'The Definition of a Safe Operating Envelope'. The current paper concentrates on the implementation process of the Safe Operating Envelope employed at the Point Lepreau Generating Station. (authors)

  18. Comparison of four approaches to a rock facies classification problem

    Science.gov (United States)

    Dubois, M.K.; Bohling, Geoffrey C.; Chakrabarti, S.

    2007-01-01

    In this study, seven classifiers based on four different approaches were tested in a rock facies classification problem: classical parametric methods using Bayes' rule, and non-parametric methods using fuzzy logic, k-nearest neighbor, and feed forward-back propagating artificial neural network. Determining the most effective classifier for geologic facies prediction in wells without cores in the Panoma gas field, in Southwest Kansas, was the objective. Study data include 3600 samples with known rock facies class (from core) with each sample having either four or five measured properties (wire-line log curves), and two derived geologic properties (geologic constraining variables). The sample set was divided into two subsets, one for training and one for testing the ability of the trained classifier to correctly assign classes. Artificial neural networks clearly outperformed all other classifiers and are effective tools for this particular classification problem. Classical parametric models were inadequate due to the nature of the predictor variables (high dimensional and not linearly correlated), and feature space of the classes (overlapping). The other non-parametric methods tested, k-nearest neighbor and fuzzy logic, would need considerable improvement to match the neural network effectiveness, but further work, possibly combining certain aspects of the three non-parametric methods, may be justified. © 2006 Elsevier Ltd. All rights reserved.
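
    A generic skeleton for this kind of classifier comparison (here only k-nearest neighbor and a small feed-forward neural network, on synthetic stand-in data rather than the Panoma well logs) might look as follows with scikit-learn; all names, sizes and hyperparameters are illustrative assumptions.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # Synthetic stand-in for log-curve predictors and facies labels; the study
        # used ~3600 cored samples with 6-7 predictors and its own train/test split.
        rng = np.random.default_rng(1)
        X = rng.normal(size=(600, 7))
        y = rng.integers(0, 9, size=600)          # 9 facies classes, random here

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
        models = {
            "k-nearest neighbor": make_pipeline(StandardScaler(),
                                                KNeighborsClassifier(n_neighbors=5)),
            "neural network": make_pipeline(StandardScaler(),
                                            MLPClassifier(hidden_layer_sizes=(30,),
                                                          max_iter=2000, random_state=1)),
        }
        for name, model in models.items():
            model.fit(X_tr, y_tr)
            print(name, "test accuracy:", model.score(X_te, y_te))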

  19. Comparison of Resource Platform Selection Approaches for Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Ramakrishnan, Lavanya

    2010-03-05

    Cloud computing is increasingly considered as an additional computational resource platform for scientific workflows. The cloud offers opportunity to scale-out applications from desktops and local cluster resources. At the same time, it can eliminate the challenges of restricted software environments and queue delays in shared high performance computing environments. Choosing from these diverse resource platforms for a workflow execution poses a challenge for many scientists. Scientists are often faced with deciding resource platform selection trade-offs with limited information on the actual workflows. While many workflow planning methods have explored task scheduling onto different resources, these methods often require fine-scale characterization of the workflow that is onerous for a scientist. In this position paper, we describe our early exploratory work into using blackbox characteristics to do a cost-benefit analysis of using cloud platforms. We use only very limited high-level information on the workflow length, width, and data sizes. The length and width are indicative of the workflow duration and parallelism. The data size characterizes the IO requirements. We compare the effectiveness of this approach to other resource selection models using two exemplar scientific workflows scheduled on desktops, local clusters, HPC centers, and clouds. Early results suggest that the blackbox model often makes the same resource selections as a more fine-grained whitebox model. We believe the simplicity of the blackbox model can help inform a scientist on the applicability of cloud computing resources even before porting an existing workflow.

  20. A Comparison of the Approaches to Customer Experience Analysis

    Directory of Open Access Journals (Sweden)

    Havíř David

    2017-08-01

    Full Text Available Nowadays, customer experience is receiving much attention in the scientific and managerial communities. Scholars and practitioners state that customer experience is the next area of competition. For a long time, there has been a call for a uniform, accurate definition, a definition of its components, and the development of customer experience frameworks. As this topic is new, there has been considerable fragmentation. The question is whether the fragmentation is still present and how it can be addressed. The aim of this paper is to summarize research on customer experience analysis and to explore and compare the dimensions describing customer experience listed in seven conceptual models with findings from 17 research projects on customer experience conducted after the year 2010. The purpose of this is to summarize recent knowledge, get the most comprehensive view on customer experience and its possible decomposition, and to reveal possible relationships between the dimensions. Based on a review of the available literature, the paper juxtaposes several approaches to customer experience analysis and compares their results to find similarities and differences among them. In the first step, the dimensions and factors of the customer experience were extracted from the seven models to analyze customer experience and they were compared with each other. This resulted in a set of dimensions and factors. In the next step, customer experience factors and dimensions were extracted from 17 practical research papers on customer experience. Finally, based on their descriptions and found similarities, the dimensions and factors were put together into several groups, as this grouping and creation of the new universal set of dimensions might solve the fragmentation issue.

  1. Comparison of wind mill cluster performance: A multicriteria approach

    Energy Technology Data Exchange (ETDEWEB)

    Rajakumar, D.G.; Nagesha, N. [Visvesvaraya Technological Univ., Karnataka (India)

    2012-07-01

    Energy is a crucial input for the economic and social development of any nation. Both renewable and non-renewable energy contribute in meeting the total requirement of the economy. As an affordable and clean energy source, wind energy is amongst the world's fastest growing renewable energy forms. Though there are several wind-mill clusters producing energy in different geographical locations, evaluating their performance is a complex task and not much literature is available in this area. Against this backdrop, an attempt is made in the current paper to estimate the performance of a wind-mill cluster through an index called the Cluster Performance Index (CPI), adopting a multi-criteria approach. The proposed CPI comprises four criteria, viz., Technical Performance Indicators (TePI), Economic Performance Indicators (EcPI), Environmental Performance Indicators (EnPI), and Sociological Performance Indicators (SoPI). Under each performance criterion a total of ten parameters are considered, with five subjective and five objective responses. The methodology is implemented by collecting empirical data from three wind-mill clusters located at Chitradurga, Davangere, and Gadag in the southern Indian state of Karnataka. A total of fifteen different stakeholders were consulted through a set of structured, researcher-administered questionnaires to collect the relevant data in each wind farm. Stakeholders included engineers working in the wind farms, wind farm developers, government officials from the energy department and a few selected residents near the wind farms. The results of the study revealed that the Chitradurga wind farm performed much better, with a CPI of 45.267, as compared to the Gadag (CPI of 28.362) and Davangere (CPI of 19.040) wind farms. (Author)

  2. Use of an excess variance approach for the certification of reference materials by interlaboratory comparison

    International Nuclear Information System (INIS)

    Crozet, M.; Rigaux, C.; Roudil, D.; Tuffery, B.; Ruas, A.; Desenfant, M.

    2014-01-01

    In the nuclear field, the accuracy and comparability of analytical results are crucial to ensure correct accountancy, good process control and safe operational conditions. All of these require reliable measurements based on reference materials whose certified values must be obtained by robust metrological approaches according to the requirements of ISO guides 34 and 35. The data processing of the characterization step is one of the key steps of a reference material production process. Among several methods, the use of interlaboratory comparison results for reference material certification is very common. The DerSimonian and Laird excess variance approach, described and implemented in this paper, is a simple and efficient method for the data processing of interlaboratory comparison results for reference material certification. By taking into account not only the laboratory uncertainties but also the spread of the individual results in the calculation of the weighted mean, this approach minimizes the risk of obtaining biased certified values in the case where one or several laboratories either underestimate their measurement uncertainties or do not identify all measurement biases. This statistical method has been applied to a new CETAMA plutonium reference material certified by interlaboratory comparison and has been compared to the classical weighted mean approach described in ISO Guide 35. This paper shows the benefits of using an 'excess variance' approach for the certification of reference material by interlaboratory comparison. (authors)
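
    The DerSimonian and Laird estimator itself is compact: an excess (between-laboratory) variance is estimated from Cochran's Q and added to each laboratory's variance before re-weighting. The sketch below shows the textbook form in Python; the variable names and the three-laboratory example are illustrative, not CETAMA data.

        import numpy as np

        def dersimonian_laird(y, u):
            """DerSimonian-Laird 'excess variance' consensus value.

            y : laboratory mean values, u : their standard uncertainties.
            Returns the weighted mean, its standard uncertainty and the
            between-laboratory (excess) variance tau^2.
            """
            y, v = np.asarray(y, float), np.asarray(u, float) ** 2
            w = 1.0 / v
            ybar = np.sum(w * y) / np.sum(w)
            Q = np.sum(w * (y - ybar) ** 2)                    # Cochran's Q
            k = len(y)
            tau2 = max(0.0, (Q - (k - 1)) / (w.sum() - (w ** 2).sum() / w.sum()))
            w_star = 1.0 / (v + tau2)                          # inflated weights
            mu = np.sum(w_star * y) / np.sum(w_star)
            u_mu = 1.0 / np.sqrt(np.sum(w_star))
            return mu, u_mu, tau2

        # Example with three hypothetical laboratory results
        mu, u_mu, tau2 = dersimonian_laird([10.2, 10.5, 9.9], [0.1, 0.2, 0.15])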

  3. On the equivalence of generalized least-squares approaches to the evaluation of measurement comparisons

    Science.gov (United States)

    Koo, A.; Clare, J. F.

    2012-06-01

    Analysis of CIPM international comparisons is increasingly being carried out using a model-based approach that leads naturally to a generalized least-squares (GLS) solution. While this method offers the advantages of being easier to audit and having general applicability to any form of comparison protocol, there is a lack of consensus over aspects of its implementation. Two significant results are presented that show the equivalence of three differing approaches discussed by or applied in comparisons run by Consultative Committees of the CIPM. Both results depend on a mathematical condition equivalent to the requirement that any two artefacts in the comparison are linked through a sequence of measurements of overlapping pairs of artefacts. The first result is that a GLS estimator excluding all sources of error common to all measurements of a participant is equal to the GLS estimator incorporating all sources of error, including those associated with any bias in the standards or procedures of the measuring laboratory. The second result identifies the component of uncertainty in the estimate of bias that arises from possible systematic effects in the participants' measurement standards and procedures. The expression so obtained is a generalization of an expression previously published for a one-artefact comparison with no inter-participant correlations, to one for a comparison comprising any number of repeat measurements of multiple artefacts and allowing for inter-laboratory correlations.
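
    For orientation, the model-based approaches discussed all reduce to the standard generalized least-squares estimator for a linear comparison model y = A x + e with error covariance V; the abstract's results concern which systematic effects are included in V and A. When A has full column rank (in comparison analyses a constraint, for example fixing a reference value, is typically added to remove rank deficiency), the estimator and its covariance are:

        \hat{x} = \bigl(A^{\mathsf T} V^{-1} A\bigr)^{-1} A^{\mathsf T} V^{-1} y,
        \qquad
        \operatorname{Cov}(\hat{x}) = \bigl(A^{\mathsf T} V^{-1} A\bigr)^{-1}.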

  4. Next-Generation Mitogenomics: A Comparison of Approaches Applied to Caecilian Amphibian Phylogeny

    OpenAIRE

    Maddock, Simon T.; Briscoe, Andrew G.; Wilkinson, Mark; Waeschenbach, Andrea; San Mauro, Diego; Day, Julia J.; Littlewood, D. Tim J.; Foster, Peter G.; Nussbaum, Ronald A.; Gower, David J.

    2016-01-01

    Mitochondrial genome (mitogenome) sequences are being generated with increasing speed due to the advances of next-generation sequencing (NGS) technology and associated analytical tools. However, detailed comparisons to explore the utility of alternative NGS approaches applied to the same taxa have not been undertaken. We compared a ‘traditional’ Sanger sequencing method with two NGS approaches (shotgun sequencing and non-indexed, multiplex amplicon sequencing) on four different sequencing pla...

  5. Fluoroscopically-Guided Posterior Approach for Shoulder Magnetic Resonance Arthrography: Comparison with Conventional Anterior Approach

    International Nuclear Information System (INIS)

    Yoo, Koun J.; Ha, Doo Hoe; Lee, Sang Min

    2011-01-01

    To prospectively evaluate the usefulness of the fluoroscopically-guided posterior approach compared with the anterior approach for shoulder magnetic resonance (MR) arthrography. Institutional review board approval and informed consent were obtained. Among 60 shoulder MR arthrographies performed on 59 patients with symptomatic shoulders, an intra-articular injection was performed (30 cases using the anterior approach and 30 using the posterior approach). Procedure-related pain was assessed by using a 5-score visual analogue scale (VAS). Depth of the puncture and standardized depth of puncture by body mass index (BMI) were recorded. The contrast leakage along the course of the puncture was evaluated by reviewing the MR. The statistical analyses included the Mann-Whitney U and Kruskal-Wallis test. There was no significant difference in VAS scores between the anterior and posterior groups (1.77 ± 1.10 vs. 1.80 ± 0.96). Depth of puncture and standardized depth of puncture by BMI were significantly shorter in the posterior group than those in the anterior group (4.4 ± 0.8 cm and 1.8 ± 0.3 cm vs. 6.6 ± 0.9 cm and 2.8 ± 0.4 cm, p < 0.001), respectively. The incidence of contrast leakage was more frequent in the posterior group (p = 0.003). The posterior approach will be useful in shoulder MR arthrography with a suspected anterior pathology, a postoperative follow-up study or in obese patients.

  6. Injection envelope matching in storage rings

    International Nuclear Information System (INIS)

    Minty, M.G.; Spence, W.L.

    1995-05-01

    The shape and size of the transverse phase space injected into a storage ring can be deduced from turn-by-turn measurements of the transient behavior of the beam envelope in the ring. Envelope oscillations at twice the betatron frequency indicate the presence of a β-function mismatch, while envelope oscillations at the betatron frequency are the signature of a dispersion-function mismatch. Experiments in injection optimization using synchrotron radiation imaging of the beam and a fast-gated camera at the SLC damping rings are reported.

  7. MHTGR thermal performance envelopes: Reliability by design

    International Nuclear Information System (INIS)

    Etzel, K.T.; Howard, W.W.; Zgliczynski, J.B.

    1992-05-01

    This document discusses thermal performance envelopes which are used to specify steady-state design requirements for the systems of the Modular High Temperature Gas-Cooled Reactor to maximize plant performance reliability with optimized design. The thermal performance envelopes are constructed around the expected operating point accounting for uncertainties in actual plant as-built parameters and plant operation. The components are then designed to perform successfully at all points within the envelope. As a result, plant reliability is maximized by accounting for component thermal performance variation in the design. The design is optimized by providing a means to determine required margins in a disciplined and visible fashion

  8. Assessing the Accuracy of Generalized Inferences From Comparison Group Studies Using a Within-Study Comparison Approach: The Methodology.

    Science.gov (United States)

    Jaciw, Andrew P

    2016-06-01

    Various studies have examined bias in impact estimates from comparison group studies (CGSs) of job training programs, and in education, where results are benchmarked against experimental results. Such within-study comparison (WSC) approaches investigate levels of bias in CGS-based impact estimates, as well as the success of various design and analytic strategies for reducing bias. This article reviews past literature and summarizes conditions under which CGSs replicate experimental benchmark results. It extends the framework to, and develops the methodology for, situations where results from CGSs are generalized to untreated inference populations. Past research is summarized; methods are developed to examine bias in program impact estimates based on cross-site comparisons in a multisite trial that are evaluated against site-specific experimental benchmarks. Students in Grades K-3 in 79 schools in Tennessee; students in Grades 4-8 in 82 schools in Alabama. Grades K-3 Stanford Achievement Test (SAT) in reading and math scores; Grades 4-8 SAT10 reading scores. Past studies show that bias in CGS-based estimates can be limited through strong design, with local matching, and appropriate analysis involving pretest covariates and variables that represent selection processes. Extension of the methodology to investigate accuracy of generalized estimates from CGSs shows bias from confounders and effect moderators. CGS results, when extrapolated to untreated inference populations, may be biased due to variation in outcomes and impact. Accounting for effects of confounders or moderators may reduce bias. © The Author(s) 2016.

  9. AM Envelope. The potential of Additive Manufacturing for facade constructions

    Directory of Open Access Journals (Sweden)

    Holger Strauss

    2017-11-01

    Full Text Available This dissertation shows the potential of Additive Manufacturing (AM for the development of building envelopes: AM will change the way of designing facades, how we engineer and produce them. To achieve today’s demands from those future envelopes, we have to find new solutions. New technologies offer one possible way to do so. They open new approaches in designing, producing and processing building construction and facades. Finding the one capable of having big impact is difficult – Additive Manufacturing is one possible answer. The term ‘AM Envelope’ (Additive Manufacturing Envelope describes the transfer of this technology to the building envelope. Additive Fabrication is a building block that aids in developing the building envelope from a mere space enclosure to a dynamic building envelope. First beginnings of AM facade construction show up when dealing with relevant aspects like material consumption, mounting or part’s performance. From those starting points several parts of an existing post-and-beam façade system were optimized, aiming toward the implementation of AM into the production chain. Enhancements on all different levels of production were achieved: storing, producing, mounting and performance. AM offers the opportunity to manufacture facades ‘just in time’. It is no longer necessary to store or produce large numbers of parts in advance. Initial investment for tooling can be avoided, as design improvements can be realized within the dataset of the AM part. AM is based on ‘tool-less’ production, all parts can be further developed with every new generation. Producing tool-less also allows for new shapes and functional parts in small batch sizes – down to batch size one. The parts performance can be re-interpreted based on the demands within the system, not based on the limitations of conventional manufacturing. AM offers new ways of materializing the physical part around its function. It leads toward customized

  10. Automatic fitting of conical envelopes to free-form surfaces for flank CNC machining

    OpenAIRE

    Bo P.; Bartoň M.; Pottmann H.

    2017-01-01

    We propose a new algorithm to detect patches of free-form surfaces that can be well approximated by envelopes of a rotational cone under a rigid body motion. These conical envelopes are a preferable choice from the manufacturing point of view as they are, by-definition, manufacturable by computer numerically controlled (CNC) machining using the efficient flank (peripheral) method with standard conical tools. Our geometric approach exploits multi-valued vector fields that consist of vectors in...

  11. Constructing canonical bases of quantized enveloping algebras

    OpenAIRE

    Graaf, W.A. de

    2001-01-01

    An algorithm for computing the elements of a given weight of the canonical basis of a quantized enveloping algebra is described. Subsequently, a similar algorithm is presented for computing the canonical basis of a finite-dimensional module.

  12. Creating a Lunar EVA Work Envelope

    Science.gov (United States)

    Griffin, Brand N.; Howard, Robert; Rajulu, Sudhakar; Smitherman, David

    2009-01-01

    A work envelope has been defined for weightless Extravehicular Activity (EVA) based on the Space Shuttle Extravehicular Mobility Unit (EMU), but there is no equivalent for planetary operations. The weightless work envelope is essential for planning all EVA tasks because it determines the location of removable parts, making sure they are within reach and visibility of the suited crew member. In addition, using the envelope positions the structural hard points for foot restraints that allow placing both hands on the job and provides a load path for reacting forces. EVA operations are always constrained by time. Tasks are carefully planned to ensure the crew has enough breathing oxygen, cooling water, and battery power. Planning first involves computers using a virtual work envelope to model tasks, next suited crew members in a simulated environment refine the tasks. For weightless operations, this process is well developed, but planetary EVA is different and no work envelope has been defined. The primary difference between weightless and planetary work envelopes is gravity. It influences anthropometry, horizontal and vertical mobility, and reaction load paths and introduces effort into doing "overhead" work. Additionally, the use of spacesuits other than the EMU, and their impacts on range of motion, must be taken into account. This paper presents the analysis leading to a concept for a planetary EVA work envelope with emphasis on lunar operations. There is some urgency in creating this concept because NASA has begun building and testing development hardware for the lunar surface, including rovers, habitats and cargo off-loading equipment. Just as with microgravity operations, a lunar EVA work envelope is needed to guide designers in the formative stages of the program with the objective of avoiding difficult and costly rework.

  13. The Arabidopsis Nuclear Pore and Nuclear Envelope

    OpenAIRE

    Meier, Iris; Brkljacic, Jelena

    2010-01-01

    The nuclear envelope is a double membrane structure that separates the eukaryotic cytoplasm from the nucleoplasm. The nuclear pores embedded in the nuclear envelope are the sole gateways for macromolecular trafficking in and out of the nucleus. The nuclear pore complexes assembled at the nuclear pores are large protein conglomerates composed of multiple units of about 30 different nucleoporins. Proteins and RNAs traffic through the nuclear pore complexes, enabled by the interacting activities...

  14. All the Universe in an envelope

    CERN Multimedia

    2007-01-01

    Do you know which force is hidden in an envelope or how many billions of years old are the atoms it contains? You will find the answers to these (curious) questions in a post office in the Pays de Gex. The French postal services of the Pays de Gex are again issuing pre-paid envelopes in collaboration with CERN (see Bulletin No. 24/2006). The new series presents some of the concepts of modern physics in an amazing way by showing what you can learn about the Universe with a single envelope. Packets of ten pre-stamped envelopes, each carrying a statement on fundamental physics, will be on sale from 7 July onwards. To learn more about the physics issues presented on the envelopes, people are invited to go to the CERN Web site where they will find the explanations. Five thousand envelopes will be put on sale in July and five thousand more during the French "Fête de la science" in October. They will be available from five post offices in the Pays de Gex (F...

  15. Genetic Diversity of Koala Retroviral Envelopes

    Directory of Open Access Journals (Sweden)

    Wenqin Xu

    2015-03-01

    Full Text Available Genetic diversity, attributable to the low fidelity of reverse transcription, recombination and mutation, is an important feature of infectious retroviruses. Under selective pressure, such as that imposed by superinfection interference, gammaretroviruses commonly adapt their envelope proteins to use alternative receptors to overcome this entry block. The first characterized koala retroviruses, KoRV subgroup A (KoRV-A), were remarkable in their absence of envelope genetic variability. Once it was determined that KoRV-A was present in all koalas in US zoos, regardless of their disease status, we sought to isolate a KoRV variant whose presence correlated with neoplastic malignancies. More than a decade after the identification of KoRV-A, we isolated a second subgroup of KoRV, KoRV-B, from koalas with lymphomas. The envelope proteins of KoRV-A and KoRV-B are sufficiently divergent to confer the ability to bind and employ distinct receptors for infection. We have now obtained a number of additional KoRV envelope variants. In the present studies we report these variants, and show that they differ from KoRV-A and KoRV-B envelopes in their host range and superinfection interference properties. Thus, there appears to be considerable variation among KoRV envelope genes, suggesting genetic diversity is a factor following the KoRV-A infection process.

  16. COMPARISON OF THE TRADITIONAL STRENGTH OF MATERIALS APPROACH TO DESIGN WITH THE FRACTURE MECHANICS APPROACH

    International Nuclear Information System (INIS)

    Z. Ceylan

    2002-01-01

    The objective of this activity is to show that the use of the traditional strength of materials approach to the drip shield and the waste package (WP) designs is bounding and appropriate when compared to the fracture mechanics approach. The scope of this activity is limited to determining the failure assessment diagrams for the two materials at issue: Ti-7 and Alloy 22. This calculation is intended for use in support of the license application design of the drip shield and the WP. This activity is associated with the drip shield and the WP designs. The activity evaluation for work package number P32 12234F2, included in ''Technical Work Plan for: Waste Package Design Description for LA'' (Ref. 1, p. A-6), has determined that the development of this document is subject to ''Quality Assurance Requirements and Description'' requirements. The control of the electronic management of data is accomplished in accordance with the methods specified in Reference 1, Section 10. AP-3.124, ''Design Calculations and Analysis'' (Ref. 2), is used to develop and document the calculation

  17. Solitary Alfven wave envelopes and the modulational instability

    International Nuclear Information System (INIS)

    Kennel, C.F.

    1987-06-01

    The derivative nonlinear Schroedinger equation describes the modulational instability of circularly polarized dispersive Alfven wave envelopes. It also may be used to determine the properties of finite amplitude localized stationary wave envelopes. Such envelope solitons exist only in conditions of modulational stability. This leaves open the question of whether, and if so, how, the modulational instability produces envelope solitons. 12 refs

  18. Multiple Score Comparison: a network meta-analysis approach to comparison and external validation of prognostic scores

    Directory of Open Access Journals (Sweden)

    Sarah R. Haile

    2017-12-01

    Full Text Available Abstract Background Prediction models and prognostic scores have been increasingly popular in both clinical practice and clinical research settings, for example to aid in risk-based decision making or control for confounding. In many medical fields, a large number of prognostic scores are available, but practitioners may find it difficult to choose between them due to lack of external validation as well as lack of comparisons between them. Methods Borrowing methodology from network meta-analysis, we describe an approach to Multiple Score Comparison meta-analysis (MSC which permits concurrent external validation and comparisons of prognostic scores using individual patient data (IPD arising from a large-scale international collaboration. We describe the challenges in adapting network meta-analysis to the MSC setting, for instance the need to explicitly include correlations between the scores on a cohort level, and how to deal with many multi-score studies. We propose first using IPD to make cohort-level aggregate discrimination or calibration scores, comparing all to a common comparator. Then, standard network meta-analysis techniques can be applied, taking care to consider correlation structures in cohorts with multiple scores. Transitivity, consistency and heterogeneity are also examined. Results We provide a clinical application, comparing prognostic scores for 3-year mortality in patients with chronic obstructive pulmonary disease using data from a large-scale collaborative initiative. We focus on the discriminative properties of the prognostic scores. Our results show clear differences in performance, with ADO and eBODE showing higher discrimination with respect to mortality than other considered scores. The assumptions of transitivity and local and global consistency were not violated. Heterogeneity was small. Conclusions We applied a network meta-analytic methodology to externally validate and concurrently compare the prognostic properties

  19. Measurement and Comparison of Variance in the Performance of Algerian Universities using models of Returns to Scale Approach

    Directory of Open Access Journals (Sweden)

    Imane Bebba

    2017-08-01

    Full Text Available This study aimed to measure and compare the performance of forty-seven Algerian universities using the returns-to-scale approach, which is based primarily on the Data Envelopment Analysis method. In order to achieve the objective of the study, a set of variables was chosen to represent the dimension of teaching. Three input variables were used: the total number of undergraduate students, the number of postgraduate students, and the number of permanent professors. The output variable was the total number of students holding degrees at the two levels. Four basic data envelopment analysis models were applied: input-oriented and output-oriented constant returns to scale, and input-oriented and output-oriented variable returns to scale. After the analysis of the data, the results revealed that eight universities achieved full efficiency according to constant returns to scale in both the input and output orientations, seventeen universities achieved full efficiency according to the input-oriented variable returns to scale model, and sixteen universities achieved full efficiency according to the output-oriented variable returns to scale model. Therefore, during performance measurement, the size of the university, competition, financial and infrastructure constraints, and the process of resource allocation within the university should be taken into consideration. Also, multiple input and output variables reflecting the dimensions of teaching, research, and community service should be included when measuring and assessing the performance of Algerian universities, rather than using two variables which do not reflect the actual performance of these universities. Keywords: Performance of Algerian Universities, Data envelopment analysis method, Constant returns to scale, Variable returns to scale, Input-orientation, Output-orientation.
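
    As an illustration of the computation underlying such studies, the sketch below solves the input-oriented envelopment problem for a single decision-making unit (DMU) with SciPy's linear-programming routine; the constant-returns (CCR) and variable-returns (BCC) variants differ only in the convexity constraint. The data arrays are hypothetical placeholders, not the Algerian university data used in the paper.

      import numpy as np
      from scipy.optimize import linprog

      def dea_input_oriented(X, Y, k, variable_returns=False):
          """Efficiency of DMU k. X: (m inputs x n DMUs), Y: (s outputs x n DMUs).
          Decision variables are [theta, lambda_1, ..., lambda_n]."""
          m, n = X.shape
          s = Y.shape[0]
          c = np.zeros(n + 1)
          c[0] = 1.0                                    # minimize theta
          A_in = np.hstack([-X[:, [k]], X])             # sum_j l_j x_ij <= theta * x_ik
          A_out = np.hstack([np.zeros((s, 1)), -Y])     # sum_j l_j y_rj >= y_rk
          A_ub = np.vstack([A_in, A_out])
          b_ub = np.concatenate([np.zeros(m), -Y[:, k]])
          A_eq = b_eq = None
          if variable_returns:                          # BCC convexity constraint
              A_eq = np.hstack([[0.0], np.ones(n)]).reshape(1, -1)
              b_eq = np.array([1.0])
          bounds = [(None, None)] + [(0.0, None)] * n   # theta free, lambdas >= 0
          res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                        bounds=bounds, method="highs")
          return res.x[0]                               # efficiency score theta*

      # Hypothetical example: 2 inputs, 1 output, 4 DMUs.
      X = np.array([[20.0, 30.0, 40.0, 20.0],
                    [10.0, 15.0, 25.0, 30.0]])
      Y = np.array([[100.0, 120.0, 150.0, 90.0]])
      for k in range(X.shape[1]):
          print(k, round(dea_input_oriented(X, Y, k, variable_returns=True), 3))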

  20. Biotic interactions in the face of climate change: a comparison of three modelling approaches.

    Directory of Open Access Journals (Sweden)

    Anja Jaeschke

    Full Text Available Climate change is expected to alter biotic interactions, and may lead to temporal and spatial mismatches of interacting species. Although the importance of interactions for climate change risk assessments is increasingly acknowledged in observational and experimental studies, biotic interactions are still rarely incorporated in species distribution models. We assessed the potential impacts of climate change on the obligate interaction between Aeshna viridis and its egg-laying plant Stratiotes aloides in Europe, based on an ensemble modelling technique. We compared three different approaches for incorporating biotic interactions in distribution models: (1) We separately modelled each species based on climatic information, and intersected the future range overlap ('overlap approach'). (2) We modelled the potential future distribution of A. viridis with the projected occurrence probability of S. aloides as further predictor in addition to climate ('explanatory variable approach'). (3) We calibrated the model of A. viridis in the current range of S. aloides and multiplied the future occurrence probabilities of both species ('reference area approach'). Subsequently, all approaches were compared to a single species model of A. viridis without interactions. All approaches projected a range expansion for A. viridis. Model performance on test data and amount of range gain differed depending on the biotic interaction approach. All interaction approaches yielded lower range gains (up to 667% lower) than the model without interaction. Regarding the contribution of algorithm and approach to the overall uncertainty, the main part of explained variation stems from the modelling algorithm, and only a small part is attributed to the modelling approach. The comparison of the no-interaction model with the three interaction approaches emphasizes the importance of including obligate biotic interactions in projective species distribution modelling. We recommend the use of

  1. Comparison of two approaches for establishing performance criteria related to Maintenance Rule

    International Nuclear Information System (INIS)

    Jerng, Dong-Wook; Kim, Man Cheol

    2015-01-01

    Probabilistic safety assessment (PSA) serves as a tool for systemically analyzing the safety of nuclear power plants. This paper explains and compares two approaches for the establishment of performance criteria related to the Maintenance Rule: (1) the individual reliability-based approach, and (2) the PSA importance measure-based approach. Different characteristics of the two approaches were compared in a qualitative manner, while a quantitative comparison was performed through application of the two approaches to a nuclear power plant. It was observed that the individual reliability-based approach resulted in more conservative performance criteria, compared to the PSA importance measure-based approach. It is thus expected that the PSA importance measure-based approach will allow for more flexible maintenance policy under conditions of limited resources, while providing for a macroscopic view of overall plant safety. Based on insights derived through this analysis, we emphasize the importance of a balance between reliability and safety significance, and propose a balance measure accordingly. The conclusions of this analysis are likely to be applicable to other types of nuclear power plants. (author)
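
    The abstract does not name the specific importance measures involved; for orientation, the two PSA importance measures most commonly used to set such performance criteria are the Fussell-Vesely importance and the risk achievement worth (RAW), sketched below with R denoting the risk metric (e.g., core damage frequency) and q_i the unavailability of basic event i.

      % Fussell-Vesely importance and risk achievement worth for basic event i.
      % R_base: nominal risk;  R(q_i = 0): risk with event i never failing;
      % R(q_i = 1): risk with event i assumed failed.
      \mathrm{FV}_i = \frac{R_\mathrm{base} - R(q_i = 0)}{R_\mathrm{base}}, \qquad
      \mathrm{RAW}_i = \frac{R(q_i = 1)}{R_\mathrm{base}}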

  2. Inhibition of enveloped viruses infectivity by curcumin.

    Directory of Open Access Journals (Sweden)

    Tzu-Yen Chen

    Full Text Available Curcumin, a natural compound and ingredient in curry, has antiinflammatory, antioxidant, and anticarcinogenic properties. Previously, we reported that curcumin abrogated influenza virus infectivity by inhibiting hemagglutination (HA) activity. This study demonstrates a novel mechanism by which curcumin inhibits the infectivity of enveloped viruses. In all analyzed enveloped viruses, including the influenza virus, curcumin inhibited plaque formation. In contrast, the nonenveloped enterovirus 71 remained unaffected by curcumin treatment. We evaluated the effects of curcumin on the membrane structure using fluorescent dye (sulforhodamine B; SRB)-containing liposomes that mimic the viral envelope. Curcumin treatment induced the leakage of SRB from these liposomes and the addition of the influenza virus reduced the leakage, indicating that curcumin disrupts the integrity of the membranes of viral envelopes and of liposomes. When testing liposomes of various diameters, we detected higher levels of SRB leakage from the smaller-sized liposomes than from the larger liposomes. Interestingly, the curcumin concentration required to reduce plaque formation was lower for the influenza virus (approximately 100 nm in diameter) than for the pseudorabies virus (approximately 180 nm) and the vaccinia virus (roughly 335 × 200 × 200 nm). These data provide insights on the molecular antiviral mechanisms of curcumin and its potential use as an antiviral agent for enveloped viruses.

  3. Featured Image: Orbiting Stars Share an Envelope

    Science.gov (United States)

    Kohler, Susanna

    2016-03-01

    This beautiful series of snapshots from a simulation (click for a better look!) shows what happens when two stars in a binary system become enclosed in the same stellar envelope. In this binary system, one of the stars has exhausted its hydrogen fuel and become a red giant, complete with an expanding stellar envelope composed of hydrogen and helium. Eventually, the envelope expands so much that the companion star falls into it, where it releases gravitational potential energy into the common envelope. A team led by Sebastian Ohlmann (Heidelberg Institute for Theoretical Studies and University of Würzburg) recently performed hydrodynamic simulations of this process. Ohlmann and collaborators discovered that the energy release eventually triggers large-scale flow instabilities, which lead to turbulence within the envelope. This process has important consequences for how these systems next evolve (for instance, determining whether or not a supernova occurs!). You can check out the authors' video of their simulated stellar inspiral below, or see their paper for more images and results from their study. Citation: Sebastian T. Ohlmann et al 2016 ApJ 816 L9. doi:10.3847/2041-8205/816/1/L9

  4. Inhibition of Enveloped Viruses Infectivity by Curcumin

    Science.gov (United States)

    Wen, Hsiao-Wei; Ou, Jun-Lin; Chiou, Shyan-Song; Chen, Jo-Mei; Wong, Min-Liang; Hsu, Wei-Li

    2013-01-01

    Curcumin, a natural compound and ingredient in curry, has antiinflammatory, antioxidant, and anticarcinogenic properties. Previously, we reported that curcumin abrogated influenza virus infectivity by inhibiting hemagglutination (HA) activity. This study demonstrates a novel mechanism by which curcumin inhibits the infectivity of enveloped viruses. In all analyzed enveloped viruses, including the influenza virus, curcumin inhibited plaque formation. In contrast, the nonenveloped enterovirus 71 remained unaffected by curcumin treatment. We evaluated the effects of curcumin on the membrane structure using fluorescent dye (sulforhodamine B; SRB)-containing liposomes that mimic the viral envelope. Curcumin treatment induced the leakage of SRB from these liposomes and the addition of the influenza virus reduced the leakage, indicating that curcumin disrupts the integrity of the membranes of viral envelopes and of liposomes. When testing liposomes of various diameters, we detected higher levels of SRB leakage from the smaller-sized liposomes than from the larger liposomes. Interestingly, the curcumin concentration required to reduce plaque formation was lower for the influenza virus (approximately 100 nm in diameter) than for the pseudorabies virus (approximately 180 nm) and the vaccinia virus (roughly 335 × 200 × 200 nm). These data provide insights on the molecular antiviral mechanisms of curcumin and its potential use as an antiviral agent for enveloped viruses. PMID:23658730

  5. Computation of Phase Equilibrium and Phase Envelopes

    DEFF Research Database (Denmark)

    Ritschel, Tobias Kasper Skovborg; Jørgensen, John Bagterp

    In this technical report, we describe the computation of phase equilibrium and phase envelopes based on expressions for the fugacity coefficients. We derive those expressions from the residual Gibbs energy. We consider 1) ideal gases and liquids modeled with correlations from the DIPPR database and 2) nonideal gases and liquids modeled with cubic equations of state. Next, we derive the equilibrium conditions for an isothermal-isobaric (constant temperature, constant pressure) vapor-liquid equilibrium process (PT flash), and we present a method for the computation of phase envelopes. We formulate the involved equations in terms of the fugacity coefficients. We present expressions for the first-order derivatives. Such derivatives are necessary in computationally efficient gradient-based methods for solving the vapor-liquid equilibrium equations and for computing phase envelopes. Finally, we ...
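
    A minimal sketch of the innermost step of such a PT-flash calculation is given below: solving the Rachford-Rice equation for the vapor fraction at fixed K-values. In the nonideal case described in the report, the K-values would themselves be updated from the fugacity coefficients in an outer loop; here they are simply assumed constant for illustration.

      import numpy as np
      from scipy.optimize import brentq

      def rachford_rice(z, K):
          """Vapor fraction beta and phase compositions for feed z and K-values K,
          assuming a two-phase solution exists (sum z_i*K_i > 1 and sum z_i/K_i > 1)."""
          z, K = np.asarray(z, float), np.asarray(K, float)

          def g(beta):  # Rachford-Rice objective
              return np.sum(z * (K - 1.0) / (1.0 + beta * (K - 1.0)))

          # The root lies between the asymptotes set by the largest and smallest K-value.
          eps = 1e-9
          lo = 1.0 / (1.0 - K.max()) + eps
          hi = 1.0 / (1.0 - K.min()) - eps
          beta = brentq(g, lo, hi)
          x = z / (1.0 + beta * (K - 1.0))   # liquid-phase composition
          y = K * x                          # vapor-phase composition
          return beta, x, y

      # Hypothetical three-component feed with assumed constant K-values.
      beta, x, y = rachford_rice(z=[0.5, 0.3, 0.2], K=[2.5, 1.2, 0.3])
      print(round(beta, 4), x.round(4), y.round(4))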

  6. Boundaries, injective envelopes, and reduced crossed products

    DEFF Research Database (Denmark)

    Bryder, Rasmus Sylvester

    In this dissertation, we study boundary actions, equivariant injective envelopes, as well as the ideal structure of reduced crossed products. These topics have recently been linked to the study of C*-simple groups, that is, groups with simple reduced group C*-algebras. In joint work with Matthew Kennedy ..., we consider reduced twisted crossed products over C*-simple groups. For any twisted C*-dynamical system over a C*-simple group, we prove that there is a one-to-one correspondence between maximal invariant ideals in the underlying C*-algebra and maximal ideals in the reduced crossed product. When ... C*-algebras, and relate the intersection property for group actions on unital C*-algebras to the intersection property for the equivariant injective envelope. Moreover, we also prove that the equivariant injective envelope of the centre of the injective envelope of a unital C*-algebra can be regarded as a C...

  7. Self-Regulatory Behaviors and Approaches to Learning of Arts Students: A Comparison between Professional Training and English Learning

    Science.gov (United States)

    Tseng, Min-chen; Chen, Chia-cheng

    2017-01-01

    This study investigated the self-regulatory behaviors of arts students, namely memory strategy, goal-setting, self-evaluation, seeking assistance, environmental structuring, learning responsibility, and planning and organizing. We also explored approaches to learning, including deep approach (DA) and surface approach (SA), in a comparison between…

  8. Statistical comparison of a hybrid approach with approximate and exact inference models for Fusion 2+

    Science.gov (United States)

    Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew

    2007-04-01

    One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian Networks, and Possibility Theory, in the form of Fuzzy Logic systems, has recently been introduced to provide a rigorous framework for high-level inference. In previous research, the theoretical basis and benefits of the hybrid approach have been developed. However, lacking is a concrete experimental comparison of the hybrid framework with traditional fusion methods, to demonstrate and quantify this benefit. The goal of this research, therefore, is to provide a statistical analysis on the comparison of the accuracy and performance of hybrid network theory, with pure Bayesian and Fuzzy systems and an inexact Bayesian system approximated using Particle Filtering. To accomplish this task, domain specific models will be developed under these different theoretical approaches and then evaluated, via Monte Carlo Simulation, in comparison to situational ground truth to measure accuracy and fidelity. Following this, a rigorous statistical analysis of the performance results will be performed, to quantify the benefit of hybrid inference to other fusion tools.

  9. A study of some Be star envelopes

    International Nuclear Information System (INIS)

    Kitchen, C.R.

    1976-01-01

    The envelope model and emission region radius of six Be stars have been determined from 36 lines on 15 spectra taken with the Isaac Newton telescope. The results have been compared with earlier determinations to search for changes with time. No definite evidence for such changes has been found, although there may be an indication of a change in phi Per. A re-determination of the errors involved in the method of analysis shows that these are smaller than previously estimated and range from about 9% to 35% for both envelope model and emission region radius. (Auth.)

  10. Asymmetry of the SN 1987A envelope

    International Nuclear Information System (INIS)

    Chugaj, N.N.

    1991-01-01

    The origin of the peculiar structure in the profiles of the emission lines observed in the spectrum of SN 1987A, namely, (1) redshift of maxima, and (2) fine structure of hydrogen lines, is considered. Among the three proposed hypotheses for the redshift, at least two (electron scattering in the spherically-symmetric envelope, and geometrical effects in the fragmented envelope) have serious drawbacks. More favorable is the third hypothesis, which invokes an asymmetric distribution of ⁵⁶Ni and of the iron-peak elements.

  11. Radio Imaging of Envelopes of Evolved Stars

    Science.gov (United States)

    Cotton, Bill

    2018-04-01

    This talk will cover imaging of stellar envelopes using radio VLBI techniques; special attention will be paid to the technical differences between radio and optical/IR interferometry. Radio heterodyne receivers allow a straightforward way to derive spectral cubes and full polarization observations. Milliarcsecond resolution of very bright, i.e., non-thermal, emission of molecular masers in the envelopes of evolved stars can be achieved using VLBI techniques with baselines of thousands of km. Emission from SiO, H2O and OH masers is commonly seen at increasing distance from the photosphere. The very narrow maser lines allow accurate measurements of the velocity field within the emitting region.

  12. Global Envelope Tests for Spatial Processes

    DEFF Research Database (Denmark)

    Myllymäki, Mari; Mrkvička, Tomáš; Grabarnik, Pavel

    2017-01-01

    Envelope tests are a popular tool in spatial statistics, where they are used in goodness-of-fit testing. These tests graphically compare an empirical function T(r) with its simulated counterparts from the null model. However, the type I error probability α is conventionally controlled for a fixed d......) the construction of envelopes for a deviation test. These new tests allow the a priori selection of the global α and they yield p-values. We illustrate these tests using simulated and real point pattern data....
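
    The following numpy sketch illustrates the basic idea of a global (simultaneous) envelope built from simulations of the null model: the k-th pointwise minima and maxima are narrowed until the fraction of simulated curves that leave the band, i.e. the achieved global type I error, would exceed α. It is a simplified stand-in for the authors' rank-envelope and deviation-test constructions and does not produce their p-values; the data are synthetic.

      import numpy as np

      def global_envelope(sims, alpha=0.05):
          """sims: (n_sim, n_r) array of summary functions T(r) simulated under
          the null model. Returns (lower, upper) bounds of a global envelope whose
          empirical global type I error does not exceed alpha."""
          n_sim = sims.shape[0]
          ordered = np.sort(sims, axis=0)          # pointwise order statistics
          lower, upper = ordered[0], ordered[-1]   # k = 1: pointwise min/max
          for k in range(2, n_sim // 2 + 1):
              lo, up = ordered[k - 1], ordered[n_sim - k]
              outside = np.any((sims < lo) | (sims > up), axis=1)
              if outside.mean() > alpha:           # band would become too narrow
                  break
              lower, upper = lo, up
          return lower, upper

      def global_test(obs, lower, upper):
          """Reject the null model if the observed curve exits the envelope anywhere."""
          return bool(np.any((obs < lower) | (obs > upper)))

      # Synthetic example with 999 simulated curves evaluated at 50 distances.
      rng = np.random.default_rng(0)
      sims = np.cumsum(rng.normal(size=(999, 50)), axis=1)
      obs = np.cumsum(rng.normal(loc=0.15, size=50))
      lo, up = global_envelope(sims, alpha=0.05)
      print(global_test(obs, lo, up))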

  13. Global envelope tests for spatial processes

    DEFF Research Database (Denmark)

    Myllymäki, Mari; Mrkvička, Tomáš; Grabarnik, Pavel

    Envelope tests are a popular tool in spatial statistics, where they are used in goodness-of-fit testing. These tests graphically compare an empirical function T(r) with its simulated counterparts from the null model. However, the type I error probability α is conventionally controlled for a fixed......) the construction of envelopes for a deviation test. These new tests allow the a priori selection of the global α and they yield p-values. We illustrate these tests using simulated and real point pattern data....

  14. Computer-aided detection of masses in digital tomosynthesis mammography: Comparison of three approaches

    International Nuclear Information System (INIS)

    Chan Heangping; Wei Jun; Zhang Yiheng; Helvie, Mark A.; Moore, Richard H.; Sahiner, Berkman; Hadjiiski, Lubomir; Kopans, Daniel B.

    2008-01-01

    The authors are developing a computer-aided detection (CAD) system for masses on digital breast tomosynthesis mammograms (DBT). Three approaches were evaluated in this study. In the first approach, mass candidate identification and feature analysis are performed in the reconstructed three-dimensional (3D) DBT volume. A mass likelihood score is estimated for each mass candidate using a linear discriminant analysis (LDA) classifier. Mass detection is determined by a decision threshold applied to the mass likelihood score. A free response receiver operating characteristic (FROC) curve that describes the detection sensitivity as a function of the number of false positives (FPs) per breast is generated by varying the decision threshold over a range. In the second approach, prescreening of mass candidate and feature analysis are first performed on the individual two-dimensional (2D) projection view (PV) images. A mass likelihood score is estimated for each mass candidate using an LDA classifier trained for the 2D features. The mass likelihood images derived from the PVs are backprojected to the breast volume to estimate the 3D spatial distribution of the mass likelihood scores. The FROC curve for mass detection can again be generated by varying the decision threshold on the 3D mass likelihood scores merged by backprojection. In the third approach, the mass likelihood scores estimated by the 3D and 2D approaches, described above, at the corresponding 3D location are combined and evaluated using FROC analysis. A data set of 100 DBT cases acquired with a GE prototype system at the Breast Imaging Laboratory in the Massachusetts General Hospital was used for comparison of the three approaches. The LDA classifiers with stepwise feature selection were designed with leave-one-case-out resampling. In FROC analysis, the CAD system for detection in the DBT volume alone achieved test sensitivities of 80% and 90% at average FP rates of 1.94 and 3.40 per breast, respectively. With the

  15. Technical note: Comparison of methane ebullition modelling approaches used in terrestrial wetland models

    Science.gov (United States)

    Peltola, Olli; Raivonen, Maarit; Li, Xuefei; Vesala, Timo

    2018-02-01

    Emission via bubbling, i.e. ebullition, is one of the main methane (CH4) emission pathways from wetlands to the atmosphere. Direct measurement of gas bubble formation, growth and release in the peat-water matrix is challenging and in consequence these processes are relatively unknown and are coarsely represented in current wetland CH4 emission models. In this study we aimed to evaluate three ebullition modelling approaches and their effect on model performance. This was achieved by implementing the three approaches in one process-based CH4 emission model. All the approaches were based on some kind of threshold: either on CH4 pore water concentration (ECT), pressure (EPT) or free-phase gas volume (EBG) threshold. The model was run using 4 years of data from a boreal sedge fen and the results were compared with eddy covariance measurements of CH4 fluxes.Modelled annual CH4 emissions were largely unaffected by the different ebullition modelling approaches; however, temporal variability in CH4 emissions varied an order of magnitude between the approaches. Hence the ebullition modelling approach drives the temporal variability in modelled CH4 emissions and therefore significantly impacts, for instance, high-frequency (daily scale) model comparison and calibration against measurements. The modelling approach based on the most recent knowledge of the ebullition process (volume threshold, EBG) agreed the best with the measured fluxes (R2 = 0.63) and hence produced the most reasonable results, although there was a scale mismatch between the measurements (ecosystem scale with heterogeneous ebullition locations) and model results (single horizontally homogeneous peat column). The approach should be favoured over the two other more widely used ebullition modelling approaches and researchers are encouraged to implement it into their CH4 emission models.
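
    A minimal sketch of the volume-threshold (EBG-type) idea is given below: dissolved CH4 in excess of the local solubility limit is moved into a free gas phase, and the accumulated gas is released as a bubble once its volume fraction in the peat layer exceeds a fixed threshold. The constants, the threshold value, and the treatment of solubility are placeholders, not the parameterization used in the paper.

      R_GAS = 8.314  # universal gas constant, J mol-1 K-1

      def ebullition_step(dissolved, gas_store, c_sat, layer_volume,
                          pressure, temperature, v_threshold=0.10):
          """One ebullition check for a single peat layer (EBG-style sketch).
          dissolved: CH4 in pore water (mol m-3); gas_store: free-phase CH4 (mol);
          c_sat: solubility limit (mol m-3); layer_volume: pore volume (m3);
          pressure (Pa); temperature (K); v_threshold: gas volume fraction that
          triggers bubble release. Returns (released mol, dissolved, gas_store)."""
          excess = max(dissolved - c_sat, 0.0) * layer_volume
          gas_store += excess                      # excess CH4 moves to the gas phase
          dissolved = min(dissolved, c_sat)
          v_gas = gas_store * R_GAS * temperature / pressure   # ideal-gas volume, m3
          released = 0.0
          if v_gas / layer_volume >= v_threshold:  # bubble escapes the layer
              released, gas_store = gas_store, 0.0
          return released, dissolved, gas_store

      out = ebullition_step(dissolved=1.8, gas_store=0.25, c_sat=1.2,
                            layer_volume=0.05, pressure=110_000.0,
                            temperature=278.15)
      print(out)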

  16. Custo/benefício de aeronaves: uma abordagem pela Análise Envoltória de Dados Cost-benefit of aircrafts: an approach through Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Rafael Iglesias Reinas

    2011-01-01

    Full Text Available The objective of this paper is to apply the operational research technique of Data Envelopment Analysis (DEA), through the composite index of the classical and inverted frontiers, to evaluate the cost-benefit of civil transport aircraft. To this end, (a) market price and (b) operating costs were used as inputs; and (a) payload, (b) cruise speed, and (c) maximum rate of climb with a single engine were used as outputs. To ensure the homogeneity of the DMUs, the aircraft were divided according to the propulsion system and into the following categories: regional, narrow-body and wide-body; they were also grouped into different ranges, so that the aircraft with the best cost-benefit in each group could be identified. The results allow for the following conclusion: the aircraft with the best cost-benefit are those whose production has already been discontinued, but which are not yet very technologically outdated.

  17. Primary total hip arthroplasty: a comparison of the lateral Hardinge approach to an anterior mini-invasive approach

    Directory of Open Access Journals (Sweden)

    Nathan Wayne

    2009-11-01

    Full Text Available The anterior mini-invasive (MI) approach to performing total hip arthroplasty (THA) is associated with less soft tissue damage and shorter postoperative recovery than other methods. Our hospital recently abandoned the traditional lateral Hardinge (LH) approach in favour of this new method. We compared the first 100 patients operated after the changeover to the new method (MI group) to the last 100 patients operated using the traditional method (LH group). Clinical and radiological parameters and complications were recorded pre- and postoperatively and the collected data of the two groups were statistically compared. There were no statistically significant differences between either group with regard to patient demographics or procedural data, placement of the femur component, postoperative leg discrepancy, prosthesis dislocation, blood transfusion, or postoperative dislocation of the components. The MI group had a significantly longer operating time, more bleeding, higher rate of nerve damage, and a higher percentage of acetabular component malposition whilst having a significantly shorter hospital stay and significantly fewer infections of the operative site in comparison to the LH group. Additionally, and perhaps most worrying was the clinically significant increase in intraoperative femur fractures in the MI group. The changeover to the anterior mini-invasive approach, which was the surgeons' initial experience with the MI technique, resulted in a drastic increase in the number of overall complications accompanied by less soft tissue damage and a shorter period of rehabilitation. Our results suggest that further analysis of this surgical MI technique will be needed before it can be recommended for widespread adoption.

  18. Integrated Energy Design of the Building Envelope

    DEFF Research Database (Denmark)

    Nielsen, Martin Vraa

    This thesis describes the outcome of the PhD project Integrated energy design of the building envelope, carried out through a combination of scientific dissemination reported in peer-reviewed journals and involvement in a wide range of affiliated projects at an architectural firm. The research

  19. SAFEGUARDS ENVELOPE: PREVIOUS WORK AND EXAMPLES

    International Nuclear Information System (INIS)

    Metcalf, Richard; Bevill, Aaron; Charlton, William; Bean, Robert

    2008-01-01

    The future expansion of nuclear power will require not just electricity production but fuel cycle facilities such as fuel fabrication and reprocessing plants. As large reprocessing facilities are built in various states, they must be built and operated in a manner to minimize the risk of nuclear proliferation. Process monitoring has returned to the spotlight as an added measure that can increase confidence in the safeguards of special nuclear material (SNM). Process monitoring can be demonstrated to lengthen the allowable inventory period by reducing accountancy requirements, and to reduce the false positive indications. The next logical step is the creation of a Safeguards Envelope, a set of operational parameters and models to maximize anomaly detection and inventory period by process monitoring while minimizing operator impact and false positive rates. A brief example of a rudimentary Safeguards Envelope is presented, and shown to detect synthetic diversions overlaying a measured processing plant data set. This demonstration Safeguards Envelope is shown to increase the confidence that no SNM has been diverted with minimal operator impact, even though it is based on an information sparse environment. While the foundation on which a full Safeguards Envelope can be built has been presented in historical demonstrations of process monitoring, several requirements remain yet unfulfilled. Future work will require reprocessing plant transient models, inclusion of 'non-traditional' operating data, and exploration of new methods of identifying subtle events in transient processes

  20. Multi-layered breathing architectural envelope

    DEFF Research Database (Denmark)

    Lund Larsen, Andreas; Foged, Isak Worre; Jensen, Rasmus Lund

    2014-01-01

    A multi layered breathing envelope is developed as a method of natural ventilation. The two main layers consist of mineral wool and air permeable concrete. The mineral wool works as a dynamic insulation and the permeable concrete as a heat recovery system with a high thermal mass for heat storage...

  1. Cost Allocation and Convex Data Envelopment

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Tind, Jørgen

    such as Data Envelopment Analysis (DEA). The convexity constraint of the BCC model introduces a non-zero slack in the objective function of the multiplier problem and we show that the cost allocation rules discussed in this paper can be used as candidates to allocate this slack value on to the input (or output...

  2. Moisture accumulation in a building envelope

    Energy Technology Data Exchange (ETDEWEB)

    Forest, T.W.; Checkwitch, K.

    1988-09-01

    In a large number of cases, the failure of a building envelope can be traced to the accumulation of moisture. In a cold winter climate, characteristic of the Canadian prairies, moisture is deposited in the structure by the movement of warm, moist air through the envelope. Tests on the moisture accumulation in a building envelope were initiated in a test house at an Alberta research facility during the 1987/88 heating season. The indoor moisture generation rate was measured and compared with the value inferred from the measured air infiltration rate. With the flue open, the moisture generation rate was approximately 5.5 kg/d of which 0.7 kg/d entered the building envelope; the remainder was exhausted through the flue. With the flue blocked, the moisture generation rate decreased to 3.4 kg/d, while the amount of moisture migrating through the envelope increased to 4.0 kg/d. The moisture accumulation in wall panels located on the north and south face of the test house was also monitored. Moisture was allowed to enter the wall cavity via a hole in the drywall. The fiberglass insulation remained dry throughout the test period. The moisture content of the exterior sheathing of the north panel increased to a maximum of 18% wt in the vicinity of the hole, but quickly dried when the ambient temperatures increased towards the end of the season. The south panel showed very little moisture accumulation due to the effects of solar radiation. 14 refs., 9 figs.

  3. Sequence comparison alignment-free approach based on suffix tree and L-words frequency.

    Science.gov (United States)

    Soares, Inês; Goios, Ana; Amorim, António

    2012-01-01

    The vast majority of methods available for sequence comparison rely on a first sequence alignment step, which requires a number of assumptions on evolutionary history and is sometimes very difficult or impossible to perform due to the abundance of gaps (insertions/deletions). In such cases, an alternative alignment-free method would prove valuable. Our method starts by a computation of a generalized suffix tree of all sequences, which is completed in linear time. Using this tree, the frequency of all possible words with a preset length L (L-words) in each sequence is rapidly calculated. Based on the L-words frequency profile of each sequence, a pairwise standard Euclidean distance is then computed producing a symmetric genetic distance matrix, which can be used to generate a neighbor joining dendrogram or a multidimensional scaling graph. We present an improvement to word counting alignment-free approaches for sequence comparison, by determining a single optimal word length and combining suffix tree structures to the word counting tasks. Our approach is, thus, a fast and simple application that proved to be efficient and powerful when applied to mitochondrial genomes. The algorithm was implemented in Python language and is freely available on the web.
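
    The word-counting core of this approach is straightforward to reproduce. The sketch below counts all overlapping words of length L with a plain dictionary (the generalized suffix tree in the paper serves the same purpose in linear time) and builds the pairwise Euclidean distance matrix from the frequency profiles; the sequences are toy placeholders.

      import itertools
      import numpy as np

      def l_word_profile(seq, L, alphabet="ACGT"):
          """Frequency profile of all overlapping L-words in seq (zeros included)."""
          words = ["".join(w) for w in itertools.product(alphabet, repeat=L)]
          counts = dict.fromkeys(words, 0)
          for i in range(len(seq) - L + 1):
              word = seq[i:i + L]
              if word in counts:          # skip words containing ambiguous symbols
                  counts[word] += 1
          total = max(sum(counts.values()), 1)
          return np.array([counts[w] / total for w in words])

      def distance_matrix(seqs, L):
          """Pairwise Euclidean distances between L-word frequency profiles."""
          profiles = np.array([l_word_profile(s, L) for s in seqs])
          diff = profiles[:, None, :] - profiles[None, :, :]
          return np.sqrt((diff ** 2).sum(axis=2))

      # Toy example with three short sequences and L = 2.
      seqs = ["ACGTACGTAC", "ACGTTTGTAC", "TTTTCCCCGG"]
      print(distance_matrix(seqs, L=2).round(3))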

  4. Sequence Comparison Alignment-Free Approach Based on Suffix Tree and L-Words Frequency

    Directory of Open Access Journals (Sweden)

    Inês Soares

    2012-01-01

    Full Text Available The vast majority of methods available for sequence comparison rely on a first sequence alignment step, which requires a number of assumptions on evolutionary history and is sometimes very difficult or impossible to perform due to the abundance of gaps (insertions/deletions). In such cases, an alternative alignment-free method would prove valuable. Our method starts by a computation of a generalized suffix tree of all sequences, which is completed in linear time. Using this tree, the frequency of all possible words with a preset length L—L-words—in each sequence is rapidly calculated. Based on the L-words frequency profile of each sequence, a pairwise standard Euclidean distance is then computed producing a symmetric genetic distance matrix, which can be used to generate a neighbor joining dendrogram or a multidimensional scaling graph. We present an improvement to word counting alignment-free approaches for sequence comparison, by determining a single optimal word length and combining suffix tree structures to the word counting tasks. Our approach is, thus, a fast and simple application that proved to be efficient and powerful when applied to mitochondrial genomes. The algorithm was implemented in Python language and is freely available on the web.

  5. Novel Real-Time Flight Envelope Monitoring System, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed innovation is an aircraft flight envelope monitoring system that will provide real-time in-cockpit estimations of aircraft flight envelope boundaries....

  6. Moisture condensation on building envelopes in differential ventilated spaces in the tropics: quantitative assessment of influencing factors

    Directory of Open Access Journals (Sweden)

    Ali Maisarah

    2016-01-01

    Full Text Available Ventilation systems play a significant role in maintaining the indoor thermal and hygric balance. Nevertheless, these systems have been implicated in many problems. In the tropical climate, especially for energy efficiency purposes, building spaces are operated with differential ventilation: some spaces operate on a 24-hr basis, some on an 8-hr basis, while others are either naturally ventilated or served by mechanical supply-exhaust fan systems with non-conditioned outdoor air. This practice has been found to result in condensation problems. This study is a quantitative appraisal of the effect of operative conditions and the hygrothermal quality of building envelopes on condensation risk. An in-situ experiment is combined with an analytical approach to assess the hygrothermal quality of building envelopes in a tropical-climate building under differential ventilation between adjacent spaces. The case-study building has a known history of condensation and associated damage, including mould growth. The microclimate measurements and the hygrothermal performance of the wall and floor against condensation and mould growth risks have been reported previously elsewhere. As a step further, the present study evaluates the effects of various envelope insulation types and configurations, together with the HVAC cooling set-points, on envelope hygrothermal performance. The results revealed that overcooling the air-conditioned side increases condensation risk on the non-air-conditioned side of the envelopes. The envelopes failed the criteria for surface condensation at the existing operative conditions irrespective of envelope hygrothermal quality improvements. However, the envelopes performed well at improved cooling operative conditions even at the existing envelope hygrothermal quality. It is, therefore, important to ascertain the envelope hygrothermal quality as well as the cooling operative conditions when embarking on energy efficiency operations in mechanical

  7. An application of a grey data envelopment analysis model to the risk comparative analysis among power generation technologies

    International Nuclear Information System (INIS)

    Garcia, Pauli A.A.; Melo, P.F. Frutuoso e

    2005-01-01

    The comparative risk analysis is a technique for which one seeks equilibrium among benefits, costs, and risks associated with common-purpose activities performed. In light of the ever-growing world demand for a sustainable power supply, we present in this paper a comparison among different power generation technologies. The data for the comparative analyses has been taken from the literature. A hybrid approach is proposed for performing the comparisons, in which the Grey System Theory and the Data Envelopment Analysis (DEA) are combined. The purpose of this combination is to take into account different features that influence the risk analysis, when one aims to compare different power generation technologies. The generation technologies considered here are: solar, biomass, wind, hydroelectric, oil, natural gas, coal, and nuclear. The criteria considered in the analysis are: contribution to the life expectancy reduction (in years); contribution to the life expectancy growth (in years); used area (in km²); tons of released CO₂ per GWh generated. The results obtained by using the aforementioned approach are promising and demonstrate the advantages of the Grey-DEA approach for the problem at hand. The results show that investments in the nuclear and solar power generation technologies are the options that present the best relative efficiencies, that is, among all considered options, they presented the best cost-benefit-risk relationships. (author)
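
    The DEA side of such a hybrid can be set up much like the earlier university example; the grey side is often a grey relational analysis of the normalized criteria. The sketch below computes grey relational grades against an ideal reference option, handling benefit and cost criteria separately; it is a generic illustration of grey relational analysis, not the authors' specific Grey-DEA formulation, and the data are placeholders.

      import numpy as np

      def grey_relational_grades(data, benefit, rho=0.5):
          """data: (n options x m criteria); benefit: boolean mask per criterion
          (True = larger is better); rho: distinguishing coefficient."""
          data = np.asarray(data, float)
          span = data.max(axis=0) - data.min(axis=0)
          span = np.where(span == 0.0, 1.0, span)            # guard constant criteria
          norm = np.where(benefit,
                          (data - data.min(axis=0)) / span,
                          (data.max(axis=0) - data) / span)
          delta = np.abs(1.0 - norm)                          # deviation from the ideal (=1)
          d_min, d_max = delta.min(), delta.max()
          xi = (d_min + rho * d_max) / (delta + rho * d_max)  # relational coefficients
          return xi.mean(axis=1)                              # relational grades

      # Hypothetical options x criteria: [life-expectancy gain (yr), area (km2), CO2 per GWh (t)]
      data = [[8.0, 50.0, 20.0],
              [5.0, 900.0, 400.0],
              [6.0, 300.0, 900.0]]
      benefit = np.array([True, False, False])
      print(grey_relational_grades(data, benefit).round(3))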

  8. The laboratory investigation of surface envelope solitons: reflection from a vertical wall and collisions of solitons

    Science.gov (United States)

    Slunyaev, Alexey; Klein, Marco; Clauss, Günther F.

    2016-04-01

    Envelope soliton solutions are key elements governing the nonlinear wave dynamics within a simplified theory for unidirectional weakly modulated weakly nonlinear wave groups on the water surface. Within integrable models the solitons preserve their structure in collisions with other waves; they do not disperse and can carry energy indefinitely long. Steep and short soliton-like wave groups have been shown to exist in laboratory tests [1] and, even earlier, in numerical simulations [2, 3]. Thus, long-living wave groups may play an important role in the dynamics of intense sea waves and wave-structure interactions. The solitary wave groups may change the wave statistics and can be taken into account when developing approaches for the deterministic forecasting of dangerous waves, including so-called rogue waves. An experimental campaign has been conducted in the wave basin of the Technical University of Berlin on simulations of intense solitary wave groups. The first successful experimental observation of intense envelope solitons took place in this facility [1]. The new experiments aimed at the following main goals: 1) to reproduce intense envelope solitons with different carrier wavelengths; 2) to estimate the rate of envelope soliton dissipation; 3) to consider the reflection of envelope solitons on a vertical wall; 4) to consider head-on collisions of envelope solitons, and 5) to consider overtaking interactions of envelope solitons. Up to 9 wave gauges were used in each experimental run, which enabled registration of the surface movement at different distances from the wavemaker, at different locations across the wave flume and near the wall. Besides surface displacements, the group envelope shapes were directly recorded, with use of phase shifts applied to the modulated waves generated by the wavemaker. [1] A. Slunyaev, G.F. Clauss, M. Klein, M. Onorato, Simulations and experiments of short intense envelope solitons of surface water waves. Phys. Fluids 25, 067105 (2013).

  9. Inversion of Auditory Spectrograms, Traditional Spectrograms, and Other Envelope Representations

    DEFF Research Database (Denmark)

    Decorsière, Remi Julien Blaise; Søndergaard, Peter Lempel; MacDonald, Ewen

    2015-01-01

    Envelope representations such as the auditory or traditional spectrogram can be defined by the set of envelopes from the outputs of a filterbank. Common envelope extraction methods discard information regarding the fast fluctuations, or phase, of the signal. Thus, it is difficult to invert, or re...... to the framework is proposed, which leads to a more accurate inversion of traditional spectrograms...

  10. 200 Area Deactivation Project Facilities Authorization Envelope Document

    International Nuclear Information System (INIS)

    DODD, E.N.

    2000-01-01

    Project facilities as required by HNF-PRO-2701, Authorization Envelope and Authorization Agreement. The Authorization Agreements (AA's) do not identify the specific set of environmental safety and health requirements that are applicable to the facility. Therefore, the facility Authorization Envelopes are defined here to identify the applicable requirements. This document identifies the authorization envelopes for the 200 Area Deactivation

  11. 14 CFR 27.87 - Height-speed envelope.

    Science.gov (United States)

    2010-01-01

    14 CFR Part 27, Airworthiness Standards: Normal Category Rotorcraft, Flight Performance, § 27.87 Height-speed envelope: (a) If there is any ... applicable power failure condition in paragraph (b) of this section, a limiting height-speed envelope must be ...

  12. 14 CFR 29.87 - Height-velocity envelope.

    Science.gov (United States)

    2010-01-01

    14 CFR Part 29, Airworthiness Standards: Transport Category Rotorcraft, Flight Performance, § 29.87 Height-velocity envelope: (a) ... Category A engine isolation requirements, the height-velocity envelope for complete power failure must be ...

  13. Analysis of Building Envelope Construction in 2003 CBECS

    Energy Technology Data Exchange (ETDEWEB)

    Winiarski, David W.; Halverson, Mark A.; Jiang, Wei

    2007-06-01

    The purpose of this analysis is to determine "typical" building envelope characteristics for buildings built after 1980. We address three envelope components in this paper - roofs, walls, and window area. These typical building envelope characteristics were used in the development of DOE’s Reference Buildings.

  14. Comparison of posterior retroperitoneal and transabdominal lateral approaches in robotic adrenalectomy: an analysis of 200 cases.

    Science.gov (United States)

    Kahramangil, Bora; Berber, Eren

    2018-04-01

    Although numerous studies have been published on robotic adrenalectomy (RA) in the literature, none has done a comparison of posterior retroperitoneal (PR) and transabdominal lateral (TL) approaches. The aim of this study was to compare the outcomes of robotic PR and TL adrenalectomy. This is a retrospective analysis of a prospectively maintained database. Between September 2008 and January 2017, perioperative outcomes of patients undergoing RA through PR and TL approaches were recorded into an IRB-approved database. Clinical and perioperative parameters were compared using Student's t test, Wilcoxon rank-sum test, and χ² test. Multivariate regression analysis was performed to determine factors associated with total operative time. 188 patients underwent 200 RAs. 110 patients were operated through TL and 78 patients through PR approach. Overall, conversion rate to open was 2.5% and 90-day morbidity 4.8%. The perioperative outcomes of TL and PR approaches were similar regarding estimated blood loss, rate of conversion to open, length of hospital stay, and 90-day morbidity. PR approach resulted in a shorter mean ± SD total operative time (136.3 ± 38.7 vs. 154.6 ± 48.4 min; p = 0.005) and lower visual analog scale pain score on postoperative day #1 (4.3 ± 2.5 vs. 5.4 ± 2.4; p = 0.001). After excluding tumors larger than 6 cm operated through TL approach, the difference in operative times persisted (136.3 ± 38.7 vs. 153.7 ± 45.7 min; p = 0.009). On multivariate regression analysis, increasing BMI and TL approaches were associated with longer total operative time. This study shows that robotic PR and TL approaches are equally safe and efficacious. With experience, shorter operative time and less postoperative pain can be achieved with PR technique. This supports the preferential utilization of PR approach in high-volume centers with enough experience.

  15. Equivariant calculus in the differential envelope

    International Nuclear Information System (INIS)

    Kastler, D.

    1991-01-01

    The author shows how Z/2-graded cyclic cohomology is related to the equivariant calculus of S. Klimek, W. Kondracki, and A. Lesniewski (HUTMP 90/B247 (1990)). He uses the differential envelope of a complex unital differential algebra. After a presentation of fiber-preserved operators on equivariant functions valued in this algebra on a group he considers certain operators on this algebra. Finally he discusses explicitly the case G=Z/2. (HSI)

  16. Equivariant calculus in the differential envelope

    Energy Technology Data Exchange (ETDEWEB)

    Kastler, D. (Centre National de la Recherche Scientifique, 13 - Marseille (France). Centre de Physique Theorique)

    1991-01-01

    The author shows how Z/2-graded cyclic cohomology is related to the equivariant calculus of S. Klimek, W. Kondracki, and A. Lesniewski (HUTMP 90/B247 (1990)). He uses the differential envelope of a complex unital differential algebra. After a presentation of fiber-preserved operators on equivariant functions valued in this algebra on a group he considers certain operators on this algebra. Finally he discusses explicitly the case G=Z/2. (HSI).

  17. Digital image envelope: method and evaluation

    Science.gov (United States)

    Huang, H. K.; Cao, Fei; Zhou, Michael Z.; Mogel, Greg T.; Liu, Brent J.; Zhou, Xiaoqiang

    2003-05-01

    Health data security, characterized in terms of data privacy, authenticity, and integrity, is a vital issue when digital images and other patient information are transmitted through public networks in telehealth applications such as teleradiology. Mandates for ensuring health data security have been extensively discussed (for example, the Health Insurance Portability and Accountability Act, HIPAA), and health informatics guidelines that are beginning to focus on issues of data security (such as the DICOM standard) continue to be published by organizing bodies in healthcare; however, no systematic method has been developed to ensure data security in medical imaging. Because data privacy and authenticity are often managed primarily with firewall and password protection, we have focused our research and development on data integrity. We have developed a systematic method of ensuring medical image data integrity across public networks using the concept of the digital envelope. When a medical image is generated, regardless of the modality, three processes are performed: the image signature is obtained, the DICOM image header is encrypted, and a digital envelope is formed by combining the signature and the encrypted header. The envelope is encrypted and embedded in the original image. This assures the security of both the image and the patient ID. The embedded image is encrypted again and transmitted across the network. The reverse process is performed at the receiving site. The result is two digital signatures, one from the original image before transmission and a second from the image after transmission. If the signatures are identical, there has been no alteration of the image. This paper concentrates on the method and evaluation of the digital image envelope.
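
    The integrity check at the core of this scheme can be sketched in a few lines: hash the pixel data, bundle the hash with the header it protects into an 'envelope', and recompute and compare the hash at the receiving site. The snippet below uses SHA-256 from the Python standard library purely for illustration; the encryption and embedding steps of the actual method are not reproduced, and all names are hypothetical.

      import hashlib

      def make_envelope(pixel_bytes, header):
          """Build a digital envelope: an image signature plus the header it protects.
          (In the real scheme the header is encrypted and the envelope is itself
          encrypted and embedded in the image before transmission; both steps are
          omitted here.)"""
          signature = hashlib.sha256(pixel_bytes).hexdigest()
          return {"signature": signature, "header": header}

      def verify_envelope(received_pixel_bytes, envelope):
          """Recompute the signature at the receiving site and compare."""
          return hashlib.sha256(received_pixel_bytes).hexdigest() == envelope["signature"]

      # Hypothetical example: a fake pixel buffer and a minimal header.
      pixels = bytes(range(256)) * 16
      env = make_envelope(pixels, {"PatientID": "anon-0001", "Modality": "CR"})
      print(verify_envelope(pixels, env))                  # True: image intact
      print(verify_envelope(pixels[:-1] + b"\x00", env))   # False: image altered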

  18. ASSESSMENT OF REGIONAL EFFICIENCY IN CROATIA USING DATA ENVELOPMENT ANALYSIS

    Directory of Open Access Journals (Sweden)

    Danijela Rabar

    2013-02-01

    Full Text Available In this paper, regional efficiency of Croatian counties is measured in three-year period (2005-2007) using Data Envelopment Analysis (DEA). The set of inputs and outputs consists of seven socioeconomic indicators. Analysis is carried out using models with assumption of variable returns-to-scale. DEA identifies efficient counties as benchmark members and inefficient counties that are analyzed in detail to determine the sources and the amounts of their inefficiency in each source. To enable proper monitoring of development dynamics, window analysis is applied. Based on the results, guidelines for implementing necessary improvements to achieve efficiency are given. Analysis reveals great disparities among counties. In order to alleviate naturally, historically and politically conditioned unequal county positions over which economic policy makers do not have total control, categorical approach is introduced as an extension to the basic DEA models. This approach, combined with window analysis, changes relations among efficiency scores in favor of continental counties.

  19. A Comparison of Approach and Avoidance Sexual Goals in Couples With Vulvodynia and Community Controls.

    Science.gov (United States)

    Dubé, Justin P; Bergeron, Sophie; Muise, Amy; Impett, Emily A; Rosen, Natalie O

    2017-11-01

    couples coping with PVD. Dubé JP, Bergeron S, Muise A, et al. A Comparison of Approach and Avoidance Sexual Goals in Couples With Vulvodynia and Community Controls. J Sex Med 2017;14:1412-1420.

  20. Traction cytometry: regularization in the Fourier approach and comparisons with finite element method.

    Science.gov (United States)

    Kulkarni, Ankur H; Ghosh, Prasenjit; Seetharaman, Ashwin; Kondaiah, Paturu; Gundiah, Namrata

    2018-05-09

    Traction forces exerted by adherent cells are quantified using displacements of embedded markers on polyacrylamide substrates due to cell contractility. Fourier Transform Traction Cytometry (FTTC) is widely used to calculate tractions but has inherent limitations due to errors in the displacement fields; these are mitigated through a regularization parameter (γ) in the Reg-FTTC method. An alternate finite element (FE) approach computes tractions on a domain using known boundary conditions. Robust verification and recovery studies are lacking but essential in assessing the accuracy and noise sensitivity of the traction solutions from the different methods. We implemented the L2 regularization method and defined a maximum curvature point in the traction-versus-γ plot as the optimal regularization parameter (γ*) in the Reg-FTTC approach. Traction reconstructions using γ* yield accurate values of low and maximum tractions (Tmax) in the presence of up to 5% noise. Reg-FTTC is hence a clear improvement over the FTTC method but is inadequate to reconstruct low stresses such as those at nascent focal adhesions. FE, implemented using a node-by-node comparison, showed an intermediate reconstruction compared to Reg-FTTC. We performed experiments using mouse embryonic fibroblast (MEF) and compared results between these approaches. Tractions from FTTC and FE showed differences of ∼92% and 22% as compared to Reg-FTTC. Selection of an optimum value of γ for each cell reduced variability in the computed tractions as compared to using a single value of γ for all the MEF cells in this study.
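
    Choosing the regularization parameter at the maximum-curvature point of the traction-versus-γ curve can be done numerically as sketched below, given tractions already reconstructed over a grid of γ values; the curve here is a synthetic placeholder, and working on log-transformed axes is an assumption rather than the authors' stated choice.

      import numpy as np

      def optimal_gamma(gammas, tractions):
          """Return gamma* at the point of maximum curvature of the traction-vs-gamma
          curve. gammas: increasing 1-D array of regularization parameters;
          tractions: a scalar traction summary (e.g. the peak traction) for each gamma.
          Curvature is evaluated on log-transformed axes."""
          x = np.log10(np.asarray(gammas, float))
          y = np.log10(np.asarray(tractions, float))
          dy = np.gradient(y, x)
          d2y = np.gradient(dy, x)
          curvature = np.abs(d2y) / (1.0 + dy ** 2) ** 1.5
          return gammas[int(np.argmax(curvature))]

      # Synthetic traction summary that flattens out as gamma grows.
      gammas = np.logspace(-10, -4, 25)
      tractions = 800.0 / (1.0 + (gammas / 1e-7) ** 0.8) + 50.0
      print(optimal_gamma(gammas, tractions))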

  1. A Bayesian approach to PET reconstruction using image-modeling Gibbs priors: Implementation and comparison

    International Nuclear Information System (INIS)

    Chan, M.T.; Herman, G.T.; Levitan, E.

    1996-01-01

    We demonstrate that (i) classical methods of image reconstruction from projections can be improved upon by considering the output of such a method as a distorted version of the original image and applying a Bayesian approach to estimate from it the original image (based on a model of distortion and on a Gibbs distribution as the prior) and (ii) by selecting an "image-modeling" prior distribution (i.e., one which is such that it is likely that a random sample from it shares important characteristics of the images of the application area) one can improve over another Gibbs prior formulated using only pairwise interactions. We illustrate our approach using simulated Positron Emission Tomography (PET) data from realistic brain phantoms. Since algorithm performance ultimately depends on the diagnostic task being performed, we examine a number of different medically relevant figures of merit to give a fair comparison. Based on a training-and-testing evaluation strategy, we demonstrate that statistically significant improvements can be obtained using the proposed approach.
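
    For concreteness, the sketch below shows one-step-late (OSL) MAP-EM updates for emission tomography with a simple pairwise quadratic Gibbs prior, i.e. the kind of pairwise-interaction prior that the image-modeling priors of the paper are compared against; the system matrix, data, and prior weight are synthetic placeholders, and this is not the authors' reconstruction code.

      import numpy as np

      def osl_map_em(A, y, beta=0.1, n_iter=50):
          """One-step-late MAP-EM (Green-style) with a pairwise quadratic Gibbs prior
          on a 1-D image. A: (n_detectors x n_pixels) system matrix, y: measured counts."""
          n_pix = A.shape[1]
          lam = np.ones(n_pix)                        # initial emission estimate
          sens = A.sum(axis=0)                        # sensitivity image, sum_i a_ij
          for _ in range(n_iter):
              proj = A @ lam + 1e-12                  # forward projection
              back = A.T @ (y / proj)                 # backprojected data ratio
              # dU/dlam_j for U = 0.5 * sum over adjacent pairs (lam_j - lam_{j+1})^2,
              # evaluated at the current estimate (the "one-step-late" trick).
              grad_u = np.zeros_like(lam)
              grad_u[1:] += lam[1:] - lam[:-1]
              grad_u[:-1] += lam[:-1] - lam[1:]
              denom = np.maximum(sens + beta * grad_u, 1e-12)
              lam = lam * back / denom
          return lam

      # Synthetic 1-D example: noisy projections of a two-hot-spot phantom.
      rng = np.random.default_rng(1)
      truth = np.zeros(40)
      truth[8:12] = 4.0
      truth[25:32] = 2.0
      centers = np.arange(60) * 40.0 / 60.0
      A = np.maximum(0.0, 1.0 - np.abs(np.subtract.outer(centers, np.arange(40))) / 3.0)
      y = rng.poisson(A @ truth).astype(float)
      print(osl_map_em(A, y, beta=0.05).round(2))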

  2. The cell envelope glycoconjugates of Mycobacterium tuberculosis

    Science.gov (United States)

    Angala, Shiva Kumar; Belardinelli, Juan Manuel; Huc-Claustre, Emilie; Wheat, William H.; Jackson, Mary

    2015-01-01

    Tuberculosis (TB) remains the second most common cause of death due to a single infectious agent. The cell envelope of Mycobacterium tuberculosis (Mtb), the causative agent of the disease in humans, is a source of unique glycoconjugates and the most distinctive feature of the biology of this organism. It is the basis of much of Mtb pathogenesis and one of the major causes of its intrinsic resistance to chemotherapeutic agents. At the same time, the unique structures of Mtb cell envelope glycoconjugates, their antigenicity and essentiality for mycobacterial growth provide opportunities for drug, vaccine, diagnostic and biomarker development, as clearly illustrated by recent advances in all of these translational aspects. This review focuses on our current understanding of the structure and biogenesis of Mtb glycoconjugates with particular emphasis on one of most intriguing and least understood aspect of the physiology of mycobacteria: the translocation of these complex macromolecules across the different layers of the cell envelope. It further reviews the rather impressive progress made in the last ten years in the discovery and development of novel inhibitors targeting their biogenesis. PMID:24915502

  3. Spectral envelope sensitivity of musical instrument sounds.

    Science.gov (United States)

    Gunawan, David; Sen, D

    2008-01-01

    It is well known that the spectral envelope is a perceptually salient attribute in musical instrument timbre perception. While a number of studies have explored discrimination thresholds for changes to the spectral envelope, the question of how sensitivity varies as a function of center frequency and bandwidth for musical instruments has yet to be addressed. In this paper a two-alternative forced-choice experiment was conducted to observe perceptual sensitivity to modifications made on trumpet, clarinet and viola sounds. The experiment involved attenuating 14 frequency bands for each instrument in order to determine discrimination thresholds as a function of center frequency and bandwidth. The results indicate that perceptual sensitivity is governed by the first few harmonics and sensitivity does not improve when extending the bandwidth any higher. However, sensitivity was found to decrease if changes were made only to the higher frequencies and continued to decrease as the distorted bandwidth was widened. The results are analyzed and discussed with respect to two other spectral envelope discrimination studies in the literature as well as what is predicted from a psychoacoustic model.

  4. Concentrations versus amounts of biomarkers in urine: a comparison of approaches to assess pyrethroid exposure

    Directory of Open Access Journals (Sweden)

    Bouchard Michèle

    2008-11-01

    Background: Assessment of human exposure to non-persistent pesticides such as pyrethroids is often based on urinary biomarker measurements. Urinary metabolite levels of these pesticides are usually reported in volume-weighted concentrations or creatinine-adjusted concentrations measured in spot urine samples. It is known that these units are subject to intra- and inter-individual variations. This research aimed at studying the impact of these variations on the assessment of pyrethroid absorbed doses at individual and population levels. Methods: Using data obtained from various adult and infantile populations, the intra- and inter-individual variability in the urinary flow rate and creatinine excretion rate was first estimated. Individual absorbed doses were then calculated using volume-weighted or creatinine-adjusted concentrations according to published approaches and compared to those estimated from the amounts of biomarkers excreted in 15- or 24-h urine collections, the latter serving as a benchmark unit. The effect of the units of measurement (volume-weighted or creatinine-adjusted concentrations or 24-h amounts) on the results of the comparison of pyrethroid biomarker levels between two populations was also evaluated. Results: Estimation of daily absorbed doses of permethrin from volume-weighted or creatinine-adjusted concentrations of biomarkers was found to potentially lead to substantial under- or overestimation when compared to doses reconstructed directly from amounts excreted in urine during a given period of time (-70 to +573% and -83 to +167%, respectively). It was also shown that the variability in creatinine excretion rate and urinary flow rate may introduce a bias in the case of between-population comparisons. Conclusion: The unit chosen to express biomonitoring data may influence the validity of the estimated individual absorbed dose as well as the outcome of between-population comparisons.
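
    The arithmetic at stake can be illustrated with a toy calculation (all values and the urinary excretion fraction fue below are invented, not taken from the study): the same measurement leads to different dose estimates depending on whether a 24-h amount, a volume-weighted concentration or a creatinine-adjusted concentration is used.

    ```python
    # Illustrative arithmetic only (placeholder numbers): three common ways of
    # turning a urinary metabolite measurement into a daily absorbed dose.
    MW_RATIO = 1.0          # metabolite-to-parent molar adjustment (assumed 1 here)

    def dose_from_24h_amount(amount_ug_per_day, body_weight_kg, fue=0.5):
        """Benchmark: amount excreted over 24 h, scaled by the assumed molar
        urinary excretion fraction (fue) and body weight."""
        return amount_ug_per_day * MW_RATIO / fue / body_weight_kg

    def dose_from_concentration(conc_ug_per_L, urine_L_per_day, body_weight_kg, fue=0.5):
        """Volume-weighted concentration scaled by an assumed daily urine output."""
        return conc_ug_per_L * urine_L_per_day * MW_RATIO / fue / body_weight_kg

    def dose_from_creatinine(conc_ug_per_g_cr, cr_g_per_day, body_weight_kg, fue=0.5):
        """Creatinine-adjusted concentration scaled by an assumed daily
        creatinine excretion rate."""
        return conc_ug_per_g_cr * cr_g_per_day * MW_RATIO / fue / body_weight_kg

    print(dose_from_24h_amount(12.0, 70))            # µg/kg/day, benchmark
    print(dose_from_concentration(8.0, 1.6, 70))     # depends on assumed 1.6 L/day
    print(dose_from_creatinine(10.0, 1.4, 70))       # depends on assumed 1.4 g creatinine/day
    ```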

  5. Modeling a Decision Support Tool for Buildable and Sustainable Building Envelope Designs

    Directory of Open Access Journals (Sweden)

    Natee Singhaputtangkul

    2015-05-01

    Sustainability and buildability requirements in building envelope design have gained significantly more importance nowadays, yet there is a lack of an appropriate decision support system (DSS) that can help a building design team to incorporate these requirements and manage their tradeoffs at once. The main objective of this study is to build such a tool to facilitate a building design team in taking sustainability and buildability criteria into account when assessing building envelopes of high-rise residential buildings in Singapore. Literature reviews were conducted to investigate a comprehensive set of sustainability and buildability criteria. This also included development of the tool using a Quality Functional Deployment (QFD) approach combined with fuzzy set theory. A building design team was engaged to test the tool with the aim of evaluating its usefulness in managing the tradeoffs among the sustainability and buildability criteria. The results from a qualitative data analysis suggested that the tool allowed the design team to effectively find a balance between the tradeoffs among the criteria when assessing multiple building envelope design alternatives. The main contributions of the tool are a more efficient assessment of building envelopes and a more sustainable and buildable building envelope design.

  6. Testing Process Predictions of Models of Risky Choice: A Quantitative Model Comparison Approach

    Directory of Open Access Journals (Sweden)

    Thorsten Pachur

    2013-09-01

    This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or nonlinear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter, Gigerenzer, & Hertwig, 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called similarity. In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies.

  7. Testing process predictions of models of risky choice: a quantitative model comparison approach

    Science.gov (United States)

    Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard

    2013-01-01

    This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472
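
    For readers unfamiliar with the heuristic model under test, the sketch below implements a simplified version of the priority heuristic for two gain gambles, following Brandstätter et al. (2006); the published rule additionally rounds the aspiration level to a prominent number, which is omitted here, and the gambles are invented examples.

    ```python
    # Simplified priority heuristic for two gain gambles.
    # A gamble is a list of (outcome, probability) pairs.
    def priority_heuristic(gamble_a, gamble_b):
        def min_out(g):  return min(o for o, p in g)
        def max_out(g):  return max(o for o, p in g)
        def p_min(g):    return [p for o, p in g if o == min_out(g)][0]

        aspiration = 0.1 * max(max_out(gamble_a), max_out(gamble_b))

        # Reason 1: minimum gains
        if abs(min_out(gamble_a) - min_out(gamble_b)) >= aspiration:
            return "A" if min_out(gamble_a) > min_out(gamble_b) else "B"
        # Reason 2: probabilities of the minimum gains
        if abs(p_min(gamble_a) - p_min(gamble_b)) >= 0.10:
            return "A" if p_min(gamble_a) < p_min(gamble_b) else "B"
        # Reason 3: maximum gains
        return "A" if max_out(gamble_a) > max_out(gamble_b) else "B"

    # Example: A = (4000, .80; 0, .20) vs. B = (3000 for sure) -> chooses B
    print(priority_heuristic([(4000, 0.8), (0, 0.2)], [(3000, 1.0)]))
    ```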

  8. Formation of polycyclic aromatic hydrocarbons in circumstellar envelopes

    International Nuclear Information System (INIS)

    Frenklach, M.; Feigelson, E.D.

    1989-01-01

    Production of polycyclic aromatic hydrocarbons in carbon-rich circumstellar envelopes was investigated using a kinetic approach. A detailed chemical reaction mechanism of gas-phase PAH formation and growth, containing approximately 100 reactions of 40 species, was numerically solved under the physical conditions expected in cool stellar winds. The chemistry is based on studies of soot production in hydrocarbon pyrolysis and combustion. Several first-ring and second-ring cyclization processes were considered. A linear lumping algorithm was used to describe PAH growth beyond the second aromatic ring. PAH production using this mechanism was examined with respect to a grid of idealized constant velocity stellar winds as well as several published astrophysical models. The basic result is that the onset of PAH production in the circumstellar envelopes is predicted to occur within the temperature interval of 1100 to 900 K. The absolute amounts of the PAHs formed, however, are very sensitive to a number of parameters, both chemical and astrophysical, whose values are not accurately known. Astrophysically meaningful quantities of PAHs require particularly dense and slow stellar winds and high initial acetylene abundance. It is suggested that most of the PAHs may be produced in a relatively small fraction of carbon-rich red giants. 87 refs

  9. Focal Targeting of the Bacterial Envelope by Antimicrobial Peptides

    Directory of Open Access Journals (Sweden)

    Rafi Rashid

    2016-06-01

    Antimicrobial peptides (AMPs) are utilized by both eukaryotic and prokaryotic organisms. AMPs such as the human beta defensins, human neutrophil peptides, human cathelicidin, and many bacterial bacteriocins are cationic and capable of binding to anionic regions of the bacterial surface. Cationic AMPs (CAMPs) target anionic lipids (e.g. phosphatidylglycerol (PG) and cardiolipins (CL)) in the cell membrane and anionic components (e.g. lipopolysaccharide (LPS) and lipoteichoic acid (LTA)) of the cell envelope. Bacteria have evolved mechanisms to modify these same targets in order to resist CAMP killing, e.g. lysinylation of PG to yield cationic lysyl-PG and alanylation of LTA. Since CAMPs offer a promising therapeutic alternative to conventional antibiotics, which are becoming less effective due to rapidly emerging antibiotic resistance, there is a strong need to improve our understanding about the AMP mechanism of action. Recent literature suggests that AMPs often interact with the bacterial cell envelope at discrete foci. Here we review recent AMP literature, with an emphasis on focal interactions with bacteria, including (1) CAMP disruption mechanisms, (2) delocalization of membrane proteins and lipids by CAMPs, and (3) CAMP sensing systems and resistance mechanisms. We conclude with new approaches for studying the bacterial membrane, e.g., lipidomics, high resolution imaging and non-detergent-based membrane domain extraction.

  10. Hospitals Productivity Measurement Using Data Envelopment Analysis Technique.

    Science.gov (United States)

    Torabipour, Amin; Najarzadeh, Maryam; Arab, Mohammad; Farzianpour, Freshteh; Ghasemzadeh, Roya

    2014-11-01

    This study aimed to measure hospital productivity using the data envelopment analysis (DEA) technique and Malmquist indices. This is a cross-sectional study in which panel data were used over a 4-year period from 2007 to 2010. The research was implemented in 12 teaching and non-teaching hospitals of Ahvaz County. The data envelopment analysis technique and the Malmquist indices with an input-orientation approach were used to analyze the data and estimate productivity. Data were analyzed using the SPSS.18 and DEAP.2 software. Six hospitals (50%) had a value lower than 1, which represents an increase in total productivity, and the other hospitals were non-productive. The average total productivity factor (TPF) was 1.024 for all hospitals, which represents a decrease in efficiency of 2.4% from 2007 to 2010. The average technical, technologic, scale and managerial efficiency change was 0.989, 1.008, 1.028, and 0.996 respectively. There was not a significant difference in mean productivity changes between teaching and non-teaching hospitals (P>0.05) (except in 2009). The productivity rate of hospitals generally had an increasing trend. However, the total average productivity decreased in hospitals. Besides, among the several components of total productivity, variation in technological efficiency had the highest impact on the reduction of the total average productivity.
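
    Both the DEA efficiency scores and the Malmquist index are built from linear programs of the following kind; the sketch shows a standard input-oriented CCR efficiency calculation (not the study's code), with invented hospital inputs and outputs.

    ```python
    # Input-oriented CCR DEA efficiency for one decision making unit (DMU).
    import numpy as np
    from scipy.optimize import linprog

    def ccr_efficiency(X, Y, o):
        """X: inputs (m x n), Y: outputs (s x n), o: index of the evaluated DMU.
        Returns the efficiency score theta in (0, 1]."""
        m, n = X.shape
        s, _ = Y.shape
        c = np.zeros(n + 1); c[0] = 1.0                 # minimise theta
        A_ub, b_ub = [], []
        for i in range(m):                              # sum_j lam_j x_ij <= theta * x_io
            A_ub.append(np.concatenate(([-X[i, o]], X[i, :]))); b_ub.append(0.0)
        for r in range(s):                              # sum_j lam_j y_rj >= y_ro
            A_ub.append(np.concatenate(([0.0], -Y[r, :]))); b_ub.append(-Y[r, o])
        bounds = [(None, None)] + [(0, None)] * n       # theta free, lambdas >= 0
        res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      bounds=bounds, method="highs")
        return res.x[0]

    X = np.array([[120.0, 90.0, 150.0],       # e.g. staffed beds per hospital (fake)
                  [300.0, 210.0, 400.0]])     # e.g. staff (fake)
    Y = np.array([[5000.0, 4200.0, 6100.0]])  # e.g. discharges (fake)
    print([round(ccr_efficiency(X, Y, j), 3) for j in range(3)])
    ```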

  11. Comparison of different modeling approaches to simulate contaminant transport in a fractured limestone aquifer

    DEFF Research Database (Denmark)

    Mosthaf, Klaus; Rosenberg, L.; Balbarini, Nicola

    . Given available field data and model purpose, this paper therefore aims to develop, examine and compare modeling approaches for transport of contaminants in fractured limestone aquifers. The model comparison was conducted for a contaminated site in Denmark, where a plume of a dissolved contaminant (PCE...... was combined with an analysis of heterogeneities and fractures from a nearby excavation (analog site). Methods for translating the geological information and fracture mapping into each of the model concepts were examined. Each model was compared with available field data, considering both model fit...... of field data is the determination of relevant hydraulic properties and interpretation of aqueous and solid phase contaminant concentration sampling data. Traditional water sampling has a bias towards fracture sampling, however concentrations in the limestone matrix are needed for assessing contaminant...

  12. A Comparison of Proliferation Resistance Measures of Misuse Scenarios Using a Markov Approach

    International Nuclear Information System (INIS)

    Yue, M.; Cheng, L.-Y.; Bari, R.

    2008-01-01

    Misuse of declared nuclear facilities is one of the important proliferation threats. The robustness of a facility against these threats is characterized by a number of proliferation resistance (PR) measures. This paper evaluates and compares PR measures for several misuse scenarios using a Markov model approach to implement the pathway analysis methodology being developed by the PR and PP (Proliferation Resistance and Physical Protection) Expert Group. Different misuse strategies can be adopted by a proliferator and each strategy is expected to have different impacts on the proliferator's success. Selected as the probabilistic measure to represent proliferation resistance, the probabilities of the proliferator's success of misusing a hypothetical ESFR (Example Sodium Fast Reactor) facility system are calculated using the Markov model based on the pathways constructed for individual misuse scenarios. Insights from a comparison of strategies that are likely to be adopted by the proliferator are discussed in this paper.
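
    The probabilistic measure referred to above can be illustrated with a generic absorbing Markov chain: the probability that a pathway ends in a "success" state follows from the fundamental matrix. The two-stage chain and its transition probabilities below are purely illustrative, not the ESFR study's values.

    ```python
    # Absorption probabilities of a small absorbing Markov chain.
    import numpy as np

    # Transient stages: 0 = divert material, 1 = process material (illustrative)
    # Absorbing states: "success" and "detected"
    Q = np.array([[0.0, 0.7],     # transient-to-transient transition probabilities
                  [0.0, 0.0]])
    R = np.array([[0.1, 0.2],     # transient -> (success, detected)
                  [0.6, 0.4]])

    N = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix
    B = N @ R                          # absorption probabilities per start state
    print("P(success | start at stage 0) =", B[0, 0])   # 0.1 + 0.7*0.6 = 0.52
    ```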

  13. The Innovative Approaches to Packaging – Comparison Analysis of Intelligent and Active Packaging Perceptions in Slovakia

    Directory of Open Access Journals (Sweden)

    Loucanova Erika

    2017-06-01

    Packaging has always served a practical function - to hold goods together and protect them when moving toward the customer through the distribution channel. Today packaging is also a container for promoting the product and making it easier and safer to use. The importance of packaging functions is still growing, and consequently companies have an interest in approaching packaging more innovatively and creatively. The paper deals with innovative approaches to packaging resulting in the creation of packaging with interactive active features in the form of active and intelligent packaging. Using comparative analysis, we monitored the perception of active packaging functions in comparison to intelligent packaging functions among different age categories. We identified the age categories which are most interested in these functions.

  14. Rigorous approach to the comparison between experiment and theory in Casimir force measurements

    International Nuclear Information System (INIS)

    Klimchitskaya, G L; Chen, F; Decca, R S; Fischbach, E; Krause, D E; Lopez, D; Mohideen, U; Mostepanenko, V M

    2006-01-01

    In most experiments on the Casimir force the comparison between measurement data and theory was done using the concept of the root-mean-square deviation, a procedure that has been criticized in the literature. Here we propose a special statistical analysis which should be performed separately for the experimental data and for the results of the theoretical computations. In so doing, the random, systematic and total experimental errors are found as functions of separation, taking into account the distribution laws for each error at 95% confidence. Independently, all theoretical errors are combined to obtain the total theoretical error at the same confidence. Finally, the confidence interval for the differences between theoretical and experimental values is obtained as a function of separation. This rigorous approach is applied to two recent experiments on the Casimir effect

  15. Support vector methods for survival analysis: a comparison between ranking and regression approaches.

    Science.gov (United States)

    Van Belle, Vanya; Pelckmans, Kristiaan; Van Huffel, Sabine; Suykens, Johan A K

    2011-10-01

    To compare and evaluate ranking, regression and combined machine learning approaches for the analysis of survival data. The literature describes two approaches based on support vector machines to deal with censored observations. In the first approach the key idea is to rephrase the task as a ranking problem via the concordance index, a problem which can be solved efficiently in a context of structural risk minimization and convex optimization techniques. In a second approach, one uses a regression approach, dealing with censoring by means of inequality constraints. The goal of this paper is then twofold: (i) introducing a new model combining the ranking and regression strategy, which retains the link with existing survival models such as the proportional hazards model via transformation models; and (ii) comparison of the three techniques on 6 clinical and 3 high-dimensional datasets and discussing the relevance of these techniques over classical approaches for survival data. We compare SVM-based survival models based on ranking constraints, based on regression constraints and models based on both ranking and regression constraints. The performance of the models is compared by means of three different measures: (i) the concordance index, measuring the model's discriminating ability; (ii) the logrank test statistic, indicating whether patients with a prognostic index lower than the median prognostic index have a significantly different survival than patients with a prognostic index higher than the median; and (iii) the hazard ratio after normalization to restrict the prognostic index between 0 and 1. Our results indicate a significantly better performance for models including regression constraints over models only based on ranking constraints. This work gives empirical evidence that SVM-based models using regression constraints perform significantly better than SVM-based models based on ranking constraints. Our experiments show a comparable performance for methods
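
    The concordance index used as the first performance measure can be computed as in the sketch below (a Harrell-type c-index for censored data); the survival times, censoring indicators and risk scores are invented.

    ```python
    # Concordance index: fraction of comparable pairs ordered correctly by risk.
    import numpy as np

    def concordance_index(time, event, risk_score):
        """A pair (i, j) is comparable if the earlier time is an observed event;
        it is concordant if the earlier failure has the higher risk score."""
        n_concordant, n_comparable = 0.0, 0
        for i in range(len(time)):
            for j in range(len(time)):
                if time[i] < time[j] and event[i] == 1:
                    n_comparable += 1
                    if risk_score[i] > risk_score[j]:
                        n_concordant += 1
                    elif risk_score[i] == risk_score[j]:
                        n_concordant += 0.5
        return n_concordant / n_comparable

    time  = np.array([5.0, 8.0, 12.0, 3.0])
    event = np.array([1,   0,   1,    1])     # 0 = censored
    risk  = np.array([2.1, 1.0, 0.3,  3.5])   # higher = higher predicted risk
    print(round(concordance_index(time, event, risk), 3))
    ```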

  16. Derringer desirability and kinetic plot LC-column comparison approach for MS-compatible lipopeptide analysis.

    Science.gov (United States)

    D'Hondt, Matthias; Verbeke, Frederick; Stalmans, Sofie; Gevaert, Bert; Wynendaele, Evelien; De Spiegeleer, Bart

    2014-06-01

    Lipopeptides are currently re-emerging as an interesting subgroup in the peptide research field, having historical applications as antibacterial and antifungal agents and new potential applications as antiviral, antitumor, immune-modulating and cell-penetrating compounds. However, due to their specific structure, chromatographic analysis often requires special buffer systems or the use of trifluoroacetic acid, limiting mass spectrometry detection. Therefore, we used a traditional aqueous/acetonitrile based gradient system, containing 0.1% (m/v) formic acid, to separate four pharmaceutically relevant lipopeptides (polymyxin B1, caspofungin, daptomycin and gramicidin A1), which were selected based upon hierarchical cluster analysis (HCA) and principal component analysis (PCA). In total, the performance of four different C18 columns, including one UPLC column, was evaluated using two parallel approaches. First, a Derringer desirability function was used, whereby six single and multiple chromatographic response values were rescaled into one overall D-value per column. Using this approach, the YMC Pack Pro C18 column was ranked as the best column for general MS-compatible lipopeptide separation. Secondly, the kinetic plot approach was used to compare the different columns at different flow rate ranges. As the optimal kinetic column performance is obtained at its maximal pressure, the length elongation factor λ (Pmax/Pexp) was used to transform the obtained experimental data (retention times and peak capacities) and construct kinetic performance limit (KPL) curves, allowing a direct visual and unbiased comparison of the selected columns, whereby the YMC Triart C18 UPLC and ACE C18 columns performed best. Finally, differences in column performance and the (dis)advantages of both approaches are discussed.
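
    A minimal sketch of the Derringer logic, assuming a linear "larger is better" transform: each response is rescaled to a desirability between 0 and 1 and the overall D-value is their geometric mean. The response names and ranges below are placeholders, not the paper's six responses.

    ```python
    # Derringer desirability: per-response rescaling plus geometric-mean D-value.
    import numpy as np

    def desirability_larger_is_better(y, y_min, y_max, s=1.0):
        """Linear (s = 1) Derringer transform for a response to be maximised."""
        d = (y - y_min) / (y_max - y_min)
        return float(np.clip(d, 0.0, 1.0)) ** s

    def overall_D(desirabilities):
        return float(np.prod(desirabilities) ** (1.0 / len(desirabilities)))

    # e.g. peak capacity, peak symmetry score and S/N for one column (invented)
    d_values = [desirability_larger_is_better(110, 60, 140),
                desirability_larger_is_better(0.92, 0.5, 1.0),
                desirability_larger_is_better(450, 100, 800)]
    print(round(overall_D(d_values), 3))
    ```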

  17. Comparison of Different Approaches to Predict the Performance of Pumps As Turbines (PATs

    Directory of Open Access Journals (Sweden)

    Mauro Venturini

    2018-04-01

    This paper deals with the comparison of different methods which can be used for the prediction of the performance curves of pumps as turbines (PATs). The considered approaches are four, i.e., one physics-based simulation model (“white box” model), two “gray box” models, which integrate theory on turbomachines with specific data correlations, and one “black box” model. More in detail, the modeling approaches are: (1) a physics-based simulation model developed by the same authors, which includes the equations for estimating head, power, and efficiency and uses loss coefficients and specific parameters; (2) a model developed by Derakhshan and Nourbakhsh, which first predicts the best efficiency point of a PAT and then reconstructs their complete characteristic curves by means of two ad hoc equations; (3) the prediction model developed by Singh and Nestmann, which predicts the complete turbine characteristics based on pump shape and size; (4) an Evolutionary Polynomial Regression model, which represents a data-driven hybrid scheme which can be used for identifying the explicit mathematical relationship between PAT and pump curves. All approaches are applied to literature data, relying on both pump and PAT performance curves of head, power, and efficiency over the entire range of operation. The experimental data were provided by Derakhshan and Nourbakhsh for four different turbomachines, working in both pump and PAT mode with specific speed values in the range 1.53–5.82. This paper provides a quantitative assessment of the predictions made by means of the considered approaches and also analyzes consistency from a physical point of view. Advantages and drawbacks of each method are also analyzed and discussed.

  18. LEGAL CERTAINTY OF INDUSTRIAL DESIGN REVENUE IN INDONESIA BASED ON INTELLECTUAL PROPERTY APPROACH AND LEGAL COMPARISON

    Directory of Open Access Journals (Sweden)

    Ranti Fauza Mayana

    2018-03-01

    Protection of Industrial Designs, like other intellectual property, is based on human creativity, taste and intention. According to Article 25 paragraph (1) of the TRIPs Agreement, a protected Industrial Design is a new or original Industrial Design; this provision holds that novelty is established when a design differs from previous designs. Novelty here encompasses both newness and originality, the principal basis for granting Industrial Design rights, whereas this principle is not fully adopted in the Indonesian Industrial Design provisions. The Industrial Design Decree in Indonesia only requires novelty without clarifying how the novelty requirement is to be interpreted, so that a large number of Industrial Design Rights are obtained based on the Minor Change approach, where slight differences in form and configuration are taken to demonstrate novelty. The minor change approach is considered to exclude the aspect of originality and is less able to provide legal certainty to the holder of registered Industrial Design Rights. This paper aims to explore the minor change approach as the basis for evaluating the novelty of Industrial Designs from the perspective of comparative law in several countries, namely the United States, Japan, the European Union and Australia, as a study and reference material in an effort to establish protection of Industrial Design Rights in Indonesia that can provide legal certainty. Keywords: Industrial Design Revenue, Comparative Law.

  19. Comparative Proteomics of Human Monkeypox and Vaccinia Intracellular Mature and Extracellular Enveloped Virions

    Energy Technology Data Exchange (ETDEWEB)

    Manes, Nathan P.; Estep, Ryan D.; Mottaz, Heather M.; Moore, Ronald J.; Clauss, Therese RW; Monroe, Matthew E.; Du, Xiuxia; Adkins, Joshua N.; Wong, Scott; Smith, Richard D.

    2008-03-07

    Orthopoxviruses are the largest and most complex of the animal viruses. In response to the recent emergence of monkeypox in Africa and the threat of smallpox bioterrorism, virulent (monkeypox virus) and benign (vaccinia virus) orthopoxviruses were proteomically compared with the goal of identifying proteins required for pathogenesis. Orthopoxviruses were grown in HeLa cells to two different viral forms (intracellular mature virus and extracellular enveloped virus), purified by sucrose gradient ultracentrifugation, denatured using RapiGest™ surfactant, and digested with trypsin. Unfractionated samples and strong cation exchange HPLC fractions were analyzed by reversed-phase LC-MS/MS, and analyses of the MS/MS spectra using SEQUEST® and X! Tandem resulted in the identification of hundreds of monkeypox, vaccinia, and copurified host proteins. The unfractionated samples were additionally analyzed by LC-MS on an LTQ-Orbitrap™, and the accurate mass and elution time tag approach was used to perform quantitative comparisons. Possible pathophysiological roles of differentially expressed orthopoxvirus genes are discussed.

  20. Expanded breadth of the T-cell response to mosaic HIV-1 envelope DNA vaccination

    Energy Technology Data Exchange (ETDEWEB)

    Korber, Bette [Los Alamos National Laboratory; Fischer, William [Los Alamos National Laboratory; Wallstrom, Timothy [Los Alamos National Laboratory

    2009-01-01

    An effective AIDS vaccine must control highly diverse circulating strains of HIV-1. Among HIV-1 gene products, the envelope (Env) protein contains variable as well as conserved regions. In this report, an informatic approach to the design of T-cell vaccines directed to HIV-1 Env M group global sequences was tested. Synthetic Env antigens were designed to express mosaics that maximize the inclusion of common potential T-cell epitope (PTE) 9-mers and minimize the inclusion of rare epitopes likely to elicit strain-specific responses. DNA vaccines were evaluated using intracellular cytokine staining (ICS) in inbred mice with a standardized panel of highly conserved 15-mer PTE peptides. 1, 2 and 3 mosaic sets were developed that increased theoretical epitope coverage. The breadth and magnitude of T-cell immunity stimulated by these vaccines were compared to natural strain Envs; additional comparisons were performed on mutant Envs, including gp160 or gp145 with or without V regions and gp41 deletions. Among them, the 2 or 3 mosaic Env sets elicited the optimal CD4 and CD8 responses. These responses were most evident in CD8 T cells; the 3 mosaic set elicited responses to an average of 8 peptide pools compared to 2 pools for a set of 3 natural Envs. Synthetic mosaic HIV-1 antigens can therefore induce T-cell responses with expanded breadth and may facilitate the development of effective T-cell-based HIV-1 vaccines.

  1. Mutations That Alter the Bacterial Cell Envelope Increase Lipid Production

    Energy Technology Data Exchange (ETDEWEB)

    Lemmer, Kimberly C.; Zhang, Weiping; Langer, Samantha J.; Dohnalkova, Alice; Hu, Dehong; Lemke, Rachelle A.; Piotrowski, Jeff S.; Orr, Galya; Noguera, Daniel R.; Donohue, Timothy J.

    2017-05-23

    Lipids from microbes offer a promising source of renewable alternatives to petroleum-derived compounds. In particular, oleaginous microbes are of interest because they accumulate a large fraction of their biomass as lipids. In this study, we analyzed genetic changes that alter lipid accumulation in Rhodobacter sphaeroides. By screening an R. sphaeroides Tn5 mutant library for insertions that increased fatty acid content, we identified 10 high-lipid (HL) mutants for further characterization. These HL mutants exhibited increased sensitivity to drugs that target the bacterial cell envelope and changes in shape, and some had the ability to secrete lipids, with two HL mutants accumulating ~60% of their total lipids extracellularly. When one of the highest-lipid-secreting strains was grown in a fed-batch bioreactor, its lipid content was comparable to that of oleaginous microbes, with the majority of the lipids secreted into the medium. Based on the properties of these HL mutants, we conclude that alterations of the cell envelope are a previously unreported approach to increase microbial lipid production. We also propose that this approach may be combined with knowledge about biosynthetic pathways, in this or other microbes, to increase production of lipids and other chemicals.

    IMPORTANCE: This paper reports on experiments to understand how to increase microbial lipid production. Microbial lipids are often cited as one renewable replacement for petroleum-based fuels and chemicals, but strategies to increase the yield of these compounds are needed to achieve this goal. While lipid biosynthesis is often well understood, increasing yields of these compounds to industrially relevant levels is a challenge, especially since genetic, synthetic biology, or engineering approaches are not feasible in many microbes. We show that altering the bacterial cell envelope can be used to increase

  2. African Swine Fever Virus Undergoes Outer Envelope Disruption, Capsid Disassembly and Inner Envelope Fusion before Core Release from Multivesicular Endosomes.

    Directory of Open Access Journals (Sweden)

    Bruno Hernáez

    2016-04-01

    African swine fever virus (ASFV) is a nucleocytoplasmic large DNA virus (NCLDV) that causes a highly lethal disease in domestic pigs. As other NCLDVs, the extracellular form of ASFV possesses a multilayered structure consisting of a genome-containing nucleoid successively wrapped by a thick protein core shell, an inner lipid membrane, an icosahedral protein capsid and an outer lipid envelope. This structural complexity suggests an intricate mechanism of internalization in order to deliver the virus genome into the cytoplasm. By using flow cytometry in combination with pharmacological entry inhibitors, as well as fluorescence and electron microscopy approaches, we have dissected the entry and uncoating pathway used by ASFV to infect the macrophage, its natural host cell. We found that purified extracellular ASFV is internalized by both constitutive macropinocytosis and clathrin-mediated endocytosis. Once inside the cell, ASFV particles move from early endosomes or macropinosomes to late, multivesicular endosomes where they become uncoated. Virus uncoating requires acidic pH and involves the disruption of the outer membrane as well as of the protein capsid. As a consequence, the inner viral membrane becomes exposed and fuses with the limiting endosomal membrane to release the viral core into the cytosol. Interestingly, virus fusion is dependent on virus protein pE248R, a transmembrane polypeptide of the inner envelope that shares sequence similarity with some members of the poxviral entry/fusion complex. Collective evidence supports an entry model for ASFV that might also explain the uncoating of other multienveloped icosahedral NCLDVs.

  3. Next-Generation Mitogenomics: A Comparison of Approaches Applied to Caecilian Amphibian Phylogeny.

    Directory of Open Access Journals (Sweden)

    Simon T Maddock

    Mitochondrial genome (mitogenome) sequences are being generated with increasing speed due to the advances of next-generation sequencing (NGS) technology and associated analytical tools. However, detailed comparisons to explore the utility of alternative NGS approaches applied to the same taxa have not been undertaken. We compared a 'traditional' Sanger sequencing method with two NGS approaches (shotgun sequencing and non-indexed, multiplex amplicon sequencing) on four different sequencing platforms (Illumina's HiSeq and MiSeq, Roche's 454 GS FLX, and Life Technologies' Ion Torrent) to produce seven (near-) complete mitogenomes from six species that form a small radiation of caecilian amphibians from the Seychelles. The fastest, most accurate method of obtaining mitogenome sequences that we tested was direct sequencing of genomic DNA (shotgun sequencing) using the MiSeq platform. Bayesian inference and maximum likelihood analyses using seven different partitioning strategies were unable to resolve compellingly all phylogenetic relationships among the Seychelles caecilian species, indicating the need for additional data in this case.

  4. Next-Generation Mitogenomics: A Comparison of Approaches Applied to Caecilian Amphibian Phylogeny.

    Science.gov (United States)

    Maddock, Simon T; Briscoe, Andrew G; Wilkinson, Mark; Waeschenbach, Andrea; San Mauro, Diego; Day, Julia J; Littlewood, D Tim J; Foster, Peter G; Nussbaum, Ronald A; Gower, David J

    2016-01-01

    Mitochondrial genome (mitogenome) sequences are being generated with increasing speed due to the advances of next-generation sequencing (NGS) technology and associated analytical tools. However, detailed comparisons to explore the utility of alternative NGS approaches applied to the same taxa have not been undertaken. We compared a 'traditional' Sanger sequencing method with two NGS approaches (shotgun sequencing and non-indexed, multiplex amplicon sequencing) on four different sequencing platforms (Illumina's HiSeq and MiSeq, Roche's 454 GS FLX, and Life Technologies' Ion Torrent) to produce seven (near-) complete mitogenomes from six species that form a small radiation of caecilian amphibians from the Seychelles. The fastest, most accurate method of obtaining mitogenome sequences that we tested was direct sequencing of genomic DNA (shotgun sequencing) using the MiSeq platform. Bayesian inference and maximum likelihood analyses using seven different partitioning strategies were unable to resolve compellingly all phylogenetic relationships among the Seychelles caecilian species, indicating the need for additional data in this case.

  5. A direct comparison of remote sensing approaches for high-throughput phenotyping in plant breeding

    Directory of Open Access Journals (Sweden)

    Maria Tattaris

    2016-08-01

    Remote sensing (RS) of plant canopies permits non-intrusive, high-throughput monitoring of plant physiological characteristics. This study compared three RS approaches using a low flying UAV (unmanned aerial vehicle), with that of proximal sensing, and satellite-based imagery. Two physiological traits were considered, canopy temperature (CT) and a vegetation index (NDVI), to determine the most viable approaches for large scale crop genetic improvement. The UAV-based platform achieves plot-level resolution while measuring several hundred plots in one mission via high-resolution thermal and multispectral imagery measured at altitudes of 30-100 m. The satellite measures multispectral imagery from an altitude of 770 km. Information was compared with proximal measurements using IR thermometers and an NDVI sensor at a distance of 0.5-1 m above plots. For robust comparisons, CT and NDVI were assessed on panels of elite cultivars under irrigated and drought conditions, in different thermal regimes, and on un-adapted genetic resources under water deficit. Correlations between airborne data and yield/biomass at maturity were generally higher than equivalent proximal correlations. NDVI was derived from high-resolution satellite imagery for only larger sized plots (8.5 x 2.4 m) due to restricted pixel density. Results support use of UAV-based RS techniques for high-throughput phenotyping for both precision and efficiency.
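
    The vegetation index mentioned above is the standard normalised difference of near-infrared and red reflectance; the sketch applies the textbook formula to placeholder reflectance arrays (it is not the study's processing chain).

    ```python
    # NDVI per pixel from near-infrared and red reflectance bands.
    import numpy as np

    def ndvi(nir, red, eps=1e-9):
        """Normalised Difference Vegetation Index, (NIR - Red) / (NIR + Red)."""
        nir = np.asarray(nir, dtype=float)
        red = np.asarray(red, dtype=float)
        return (nir - red) / (nir + red + eps)

    nir = np.array([[0.45, 0.50], [0.40, 0.48]])   # placeholder reflectances
    red = np.array([[0.10, 0.08], [0.12, 0.09]])
    print(ndvi(nir, red))
    ```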

  6. Applicability of the cost-effectiveness approach for comparison of waste management options

    International Nuclear Information System (INIS)

    Vuori, S.; Peltonen, E.; Vieno, T.; Vira, J.

    1984-01-01

    There is an obvious need to consider the achievable level of safety of waste management in view of the costs involved. The feasibility of the cost-effectiveness approach for this purpose is discussed in the framework of practical case studies. The analysis indicates that such an approach has clear benefits, but it also reveals several issues and ambiguities in its application. The waste management alternatives considered include various concepts for the disposal of low- and intermediate-level reactor wastes as well as of the unreprocessed spent fuel. The employed impact indicators describe both the individual and collective risks. In addition, indicators simultaneously giving a perspective into other risks in the society and a means to make a rank ordering of the alternative options are proposed. The cost-effectiveness ratios for collective risks vary in the range of ten to hundreds of millions of US $ per man·Sv. The examples considered also indicate that increased costs do not necessarily improve safety. Furthermore, the comparison of the safety of different options requires more sophisticated and realistic models than those employed in the present analyses, because an unbalanced degree of conservatism could result in misleading conclusions. (author)

  7. Evaluation of users' satisfaction on pedestrian facilities using pair-wise comparison approach

    Science.gov (United States)

    Zainol, R.; Ahmad, F.; Nordin, N. A.; Aripin, A. W. M.

    2014-02-01

    Global climate change issues demand that people around the world change the way they live today. Thus, current cities need to be redeveloped towards less use of carbon in their day-to-day operations. A pedestrianized environment is one of the approaches used in reducing the carbon footprint of cities. Heritage cities are the first to be looked into since they were built in an era in which motorized vehicles were minimal. Therefore, this research explores users' satisfaction with the physical attributes of pedestrianization in Melaka Historical City, a UNESCO World Heritage Site. It aims to examine users' satisfaction with the pedestrian facilities provided within the study area using a pair-wise questionnaire comparison approach. A survey of 200 respondents using random sampling was conducted in six different sites, namely Jonker Street, Church Street, Kota Street, Goldsmith Street, Merdeka Street to Taming Sari Tower and Merdeka Street to the River Cruise terminal. The survey consists of an assessment tool based on a nine-point scale of users' satisfaction with pathway properties, zebra pedestrian crossings, street furniture, personal safety, adjacency to traffic flow, aesthetics and amenities. The analytical hierarchy process (AHP) was used to avoid any bias in analyzing the data collected. Findings show Merdeka Street to Taming Sari Tower to be the street with the highest satisfaction level, fulfilling all the required needs of a pedestrianized environment. Similar assessment elements can be used to evaluate existing streets in other cities, and these criteria should also be used in planning for future cities.
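
    A minimal sketch of the AHP step referred to above: priority weights are taken from the principal eigenvector of a pairwise comparison matrix and checked with a consistency ratio. The 3x3 matrix is illustrative only, not the survey data.

    ```python
    # AHP priority weights (principal eigenvector) and consistency ratio.
    import numpy as np

    A = np.array([[1.0, 3.0, 5.0],     # illustrative pairwise judgements
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                   # priority weights

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]            # Saaty's random index
    print("weights:", np.round(w, 3), "CR:", round(ci / ri, 3))
    ```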

  8. German nuclear codes revised: comparison with approaches used in other countries

    International Nuclear Information System (INIS)

    Raetzke, C.; Micklinghoff, M.

    2005-01-01

    The article deals with the plan of the German Federal Ministry for the Environment (BMU) to revise the German set of nuclear codes, and draws a comparison with approaches pursued in other countries in formulating and implementing new requirements imposed upon existing plants. A striking feature of the BMU project is the intention to have the codes reflect the state of the art in an entirely abstract way irrespective of existing plants. This implies new requirements imposed on plant design, among other things. However, the state authorities, which establish the licensing conditions for individual plants in concrete terms, will not be able to apply these new codes for legal reasons (protection of vested rights) to the extent in which they incorporate changes in safety philosophy. Also the procedure adopted has raised considerable concern. The processing time of two years is inordinately short, and participation of the public and of industry does not go beyond the strictly formal framework of general public participation. In the light of this absence of quality assurance, it would be surprising if this new set of codes did not suffer from considerable deficits in its contents. Other countries show that the BMU is embarking on an isolated approach in every respect. Elsewhere, backfitting requirements are developed carefully and over long periods of time; they are discussed in detail with the operators; costs and benefits are weighted, and the consequences are evaluated. These elements are in common to procedures in all countries, irrespective of very different steps in detail. (orig.)

  9. Comparison of Two Probabilistic Fatigue Damage Assessment Approaches Using Prognostic Performance Metrics

    Directory of Open Access Journals (Sweden)

    Xuefei Guan

    2011-01-01

    In this paper, two probabilistic prognosis updating schemes are compared. One is based on the classical Bayesian approach and the other is based on the newly developed maximum relative entropy (MRE) approach. The algorithm performance of the two models is evaluated using a set of recently developed prognostics-based metrics. Various uncertainties from measurements, modeling, and parameter estimations are integrated into the prognosis framework as random input variables for fatigue damage of materials. Measures of response variables are then used to update the statistical distributions of random variables and the prognosis results are updated using posterior distributions. The Markov Chain Monte Carlo (MCMC) technique is employed to provide the posterior samples for model updating in the framework. Experimental data are used to demonstrate the operation of the proposed probabilistic prognosis methodology. A set of prognostics-based metrics are employed to quantitatively evaluate the prognosis performance and compare the proposed entropy method with the classical Bayesian updating algorithm. In particular, model accuracy, precision, robustness and convergence are rigorously evaluated in addition to the qualitative visual comparison. Following this, potential development and improvement for the prognostics-based metrics are discussed in detail.
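
    The classical Bayesian updating step can be sketched with a basic Metropolis-Hastings sampler as below; the crack-growth "model", prior and observations are invented placeholders, and the MRE variant discussed in the paper is not reproduced.

    ```python
    # Toy Metropolis-Hastings update of a single fatigue-model parameter.
    import numpy as np

    rng = np.random.default_rng(0)
    obs = np.array([1.1, 1.3, 1.25, 1.4])          # measured crack sizes (fake)
    sigma = 0.1                                    # assumed measurement noise

    def log_post(theta):
        prior = -0.5 * ((theta - 1.0) / 0.5) ** 2              # N(1.0, 0.5) prior
        pred = theta * np.arange(1, len(obs) + 1) ** 0.25       # toy growth model
        like = -0.5 * np.sum(((obs - pred) / sigma) ** 2)
        return prior + like

    theta, samples = 1.0, []
    for _ in range(5000):
        prop = theta + 0.05 * rng.standard_normal()
        if np.log(rng.random()) < log_post(prop) - log_post(theta):
            theta = prop
        samples.append(theta)
    print("posterior mean:", round(np.mean(samples[1000:]), 3))
    ```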

  10. Integrated energy design of the building envelope

    Energy Technology Data Exchange (ETDEWEB)

    Vraa Nielsen, M.

    2012-07-01

    This thesis describes the outcome of the PhD project Integrated energy design of the building envelope, carried out through a combination of scientific dissemination reported through peer-reviewed journals and a wide range of affiliated projects undertaken at an architectural firm. The research project analysed how the implementation of technical knowledge early in the building design process can quantify the effect of a building's facades on its energy efficiency and indoor climate and thereby facilitate a more qualified design development. The project was structured in the following way: 1) the importance of integrating knowledge in the early stages of design, and how it can be done; 2) understanding the facade's typology; and 3) the complex notion of comfort. The project touched not only on the technical capabilities and requirements governing facade design, but also the process by which it takes place. This was done by applying the methodology of Integrated Energy Design (IED) and analysing its applicability in the design of facades. A major part of the project was an actual engagement in the architectural process to test out incorporating a consciousness about energy and comfort as part of a more holistic performance evaluation. The research project illustrates the great potential in taking passive properties into account through a geometrical optimisation inherent in the development of the architectural concept. It demonstrates that integration of technical knowledge at the early stages of design not only can qualify the geometrical processing, but also facilitate the design development of the facade. Thereby a more holistic performance optimisation can be obtained through parameters such as overall facade geometry and orientation, functional organisation, room height and depth, facade layout, window geometry and transparency, design of the window aperture, etc. Through the wide range of affiliated projects undertaken at the architectural firm over

  11. The comparison of various approach to evaluation erosion risks and design control erosion measures

    Science.gov (United States)

    Kapicka, Jiri

    2015-04-01

    At present, there is one methodology in the Czech Republic for computing and comparing erosion risks, and it also contains a method for designing erosion control measures. It is based on the Universal Soil Loss Equation (USLE) and its result, the long-term average annual rate of soil erosion (G), and is used by landscape planners. Data and statistics from the database of erosion events in the Czech Republic show that many problems and damages arise from local erosion episodes. The extent of these events and their impact depend on local precipitation events, the current plant phase and soil conditions. Such erosion events can cause trouble and damage on agricultural land, to municipal property and to hydraulic structures even where a location is in good condition from the point of view of the long-term average annual rate of erosion. Another way to compute and compare erosion risks is an episode-based approach. This paper presents a comparison of various approaches to computing erosion risks. The comparison was carried out for a locality from the database of erosion events on agricultural land in the Czech Republic where two erosion events have been recorded. The study area is a simple agricultural field without barriers that could strongly influence water flow and soil sediment transport. The computation of erosion risks (for all methodologies) was based on laboratory analysis of soil samples taken in the study area. Results of the USLE and MUSLE methodologies and results from the mathematical model Erosion 3D were compared. Variances in the spatial distribution of the places with the highest soil erosion were compared and discussed. A further part presents variances in designed erosion control measures when their design is based on different methodologies. The results show the variance of erosion risks computed by different methodologies. These variances can open a discussion about different approaches to computing and evaluating erosion risks in areas
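
    The long-term methodology referred to above rests on the USLE factor product A = R · K · LS · C · P; the sketch below simply evaluates that product with placeholder factor values (not the study site's parameters).

    ```python
    # USLE: long-term average annual soil loss as a product of five factors.
    def usle_soil_loss(R, K, LS, C, P):
        """A [t/ha/yr] = rainfall erosivity R * soil erodibility K *
        slope length-steepness LS * cover-management C * support practice P."""
        return R * K * LS * C * P

    print(usle_soil_loss(R=45.0, K=0.30, LS=1.8, C=0.25, P=1.0))  # ~6.1 t/ha/yr
    ```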

  12. Analysis of Pseudomonas aeruginosa cell envelope proteome by capture of surface-exposed proteins on activated magnetic nanoparticles.

    Directory of Open Access Journals (Sweden)

    Davide Vecchietti

    We report on specific magneto-capturing followed by Multidimensional Protein Identification Technology (MudPIT) for the analysis of surface-exposed proteins of intact cells of the bacterial opportunistic pathogen Pseudomonas aeruginosa. The magneto-separation of cell envelope fragments from the soluble cytoplasmic fraction allowed the MudPIT identification of the captured and neighboring proteins. Remarkably, we identified 63 proteins captured directly by nanoparticles and 67 proteins embedded in the cell envelope fragments. For a high number of proteins, our analysis strongly indicates either surface exposure or localization in an envelope district. The localization of most identified proteins was only predicted or totally unknown. This novel approach greatly improves the sensitivity and specificity of the previous methods, such as surface shaving with proteases that was also tested on P. aeruginosa. The magneto-capture procedure is simple, safe, and rapid, and appears to be well-suited for envelope studies in highly pathogenic bacteria.

  13. Analysis of Pseudomonas aeruginosa Cell Envelope Proteome by Capture of Surface-Exposed Proteins on Activated Magnetic Nanoparticles

    Science.gov (United States)

    Vecchietti, Davide; Di Silvestre, Dario; Miriani, Matteo; Bonomi, Francesco; Marengo, Mauro; Bragonzi, Alessandra; Cova, Lara; Franceschi, Eleonora; Mauri, Pierluigi; Bertoni, Giovanni

    2012-01-01

    We report on specific magneto-capturing followed by Multidimensional Protein Identification Technology (MudPIT) for the analysis of surface-exposed proteins of intact cells of the bacterial opportunistic pathogen Pseudomonas aeruginosa. The magneto-separation of cell envelope fragments from the soluble cytoplasmic fraction allowed the MudPIT identification of the captured and neighboring proteins. Remarkably, we identified 63 proteins captured directly by nanoparticles and 67 proteins embedded in the cell envelope fragments. For a high number of proteins, our analysis strongly indicates either surface exposure or localization in an envelope district. The localization of most identified proteins was only predicted or totally unknown. This novel approach greatly improves the sensitivity and specificity of the previous methods, such as surface shaving with proteases that was also tested on P. aeruginosa. The magneto-capture procedure is simple, safe, and rapid, and appears to be well-suited for envelope studies in highly pathogenic bacteria. PMID:23226459

  14. Analysis of beam envelope by transverse space charge effect

    International Nuclear Information System (INIS)

    Toyama, Shin'ichi

    1997-09-01

    It is important for high current accelerators to estimate the contribution of the space charge effect in order to keep the beam from breaking up. The application of an envelope equation was examined in a previous report in which the beam was a coasting (non-accelerating) beam. An analysis of the space charge effect is necessary for comparison in the coming accelerator test at PNC. In order to evaluate the beam behavior at high current, the beam dynamics and the beam parameters that are input to the equation were prepared, making it possible to estimate the transverse beam dynamics due to space charge. The estimate needs to have enough accuracy for comparison with advanced code calculations. After preparation of the analytic expression for the transverse motion, the non-linear differential equation of beam dynamics is solved by a numerical method on a personal computer. The beam envelope from the equation is estimated by means of the beam emittance, current and energy. The result of the analysis shows that the transverse beam broadening is negligibly small around the beam current of the PNC design. The contribution to the beam broadening of the PNC linac comes from its beam emittance. The beam broadening in the 100 MeV case is almost negligible from the viewpoint of the transverse space charge effect. Therefore, the electron beam is stable up to the order of 10 A in the PNC linac design. Of course, the RF supply problem is not considered here. It is important to estimate other longitudinal effects, such as the beam bunching effect, which remain unevaluated. (author)
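
    A round-beam envelope equation with a space-charge term, R'' + k0 R - K/R - eps^2/R^3 = 0, is one common way to carry out this kind of estimate; the integration sketch below uses illustrative parameters, not the PNC linac design values.

    ```python
    # Semi-implicit Euler integration of a round-beam envelope equation.
    def integrate_envelope(R0, Rp0, k0, K, eps, L=2.0, dz=1e-4):
        """R0: initial radius [m], Rp0: initial slope, k0: external focusing,
        K: space-charge (perveance) term, eps: unnormalised emittance [m rad].
        Returns the envelope radius after a drift of length L [m]."""
        R, Rp, z = R0, Rp0, 0.0
        while z < L:
            Rpp = -k0 * R + K / R + eps**2 / R**3   # envelope "acceleration"
            Rp += Rpp * dz
            R  += Rp * dz
            z  += dz
        return R

    # weak focusing, modest perveance, small emittance (all illustrative)
    print(integrate_envelope(R0=5e-3, Rp0=0.0, k0=4.0, K=1e-6, eps=5e-6))
    ```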

  15. Comparison of Australian and US Cost-Benefit Approaches to MEPS

    Energy Technology Data Exchange (ETDEWEB)

    McMahon, James E.

    2004-03-12

    The Australian Greenhouse Office contracted with the Collaborative Labeling and Appliance Standards Program (CLASP) for LBNL to compare US and Australian approaches to analyzing costs and benefits of minimum energy performance standards (MEPS). This report compares the approaches for three types of products: household refrigerators and freezers, small electric storage water heaters, and commercial/industrial air conditioners. This report presents the findings of similarities and differences between the approaches of the two countries and suggests changes to consider in the approach taken in Australia. The purpose of the Australian program is to reduce greenhouse gas emissions, while the US program is intended to increase energy efficiency; each program is thus subject to specific constraints. The market and policy contexts are different, with the USA producing most of its own products and conducting pioneering engineering-economic studies to identify maximum energy efficiency levels that are technologically feasible and economically justified. In contrast, Australia imports a large share of its products and adopts MEPS already in place elsewhere. With these differences in circumstances, Australia's analysis approach could be expected to have less analytical detail and still result in MEPS levels that are appropriate for their policy and market context. In practice, the analysis required to meet these different objectives is quite similar. To date, Australia's cost-benefit analysis has served the goals and philosophies of the program well and been highly effective in successfully identifying MEPS that are significantly reducing greenhouse gas emissions while providing economic benefits to consumers. In some cases, however, the experience of the USA--using more extensive data sets and more detailed analysis--suggests possible improvements to Australia's cost-benefit analysis. The principal findings of the comparison are: (1) The Technology and Market

  16. [Comparison of ablation of left-sided accessory pathway by atrial septal and retrograde arterial approach].

    Science.gov (United States)

    Zhu, J G; Bao, Z Y; Gu, X

    2017-03-07

    Objective: To compare the advantages and disadvantages of radiofrequency ablation of left-sided accessory pathways via the atrial septal approach versus the retrograde aortic approach. Methods: A total of 184 patients with left-sided accessory pathways were treated in Taizhou People's Hospital and the Subei People's Hospital from March 2012 to August 2015. In all, 103 cases were treated by the retrograde aortic approach (transaortic group) and 81 cases were treated by atrial septal puncture to the left atrium for mapping and ablation (transseptal group). Ablation procedure time, instant success and relapse rates overall and by pathway location (subgroups), safety (serious complications), and other intraoperative and postoperative complications were compared. Results: The transaortic and transseptal groups showed no significant difference (P>0.05) in ablation procedure time ((25±18) vs (22±15) min), instant success (98.1% vs 97.5%), relapse rates (1.0% vs 1.2%) or safety (1 vs 0 cases). There was no statistical difference in the septal subgroups in ablation procedure time ((22±18) vs (25±19) min), instant success (91.7% vs 89.9%) or relapse rates (0 vs 11.1%) (all P>0.05); the posterior wall subgroup had no statistical difference in ablation procedure time ((18±15) vs (16±12) min), instant success (100% vs 100%) or relapse rates (0 vs 0) (all P>0.05); the side wall subgroup had no statistical difference in ablation procedure time ((29±20) vs (21±18) min), instant success (98.3% vs 98.1%) or relapse rates (1.7% vs 0) (all P>0.05). Conclusion: Ablation of left-sided accessory pathways by the transseptal approach and the transaortic approach shows no statistical difference in procedure time, instant success and relapse rates, or safety. In particular cases, there is a certain complementarity between the two methods.

  17. Allocating the Fixed Resources and Setting Targets in Integer Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Kobra Gholami

    2013-11-01

    Data envelopment analysis (DEA) is a non-parametric approach to evaluate a set of decision making units (DMUs) consuming multiple inputs to produce multiple outputs. Formally, DEA is used to estimate the efficiency score relative to the empirical efficient frontier. DEA can also be used to allocate resources and set targets for future forecasts. Data are continuous in the standard DEA model, whereas in many real-life problems the data must be integer, such as the number of employees, machines, experts and so on. Thus, in this paper we propose, for the first time, an approach to allocate fixed resources and set fixed targets under a selective integer assumption that is based on an integer data envelopment analysis (IDEA) approach. The major aim of this approach is preserving the efficiency scores of the DMUs. We use the concept of benchmarking to reach this aim. A numerical example illustrates the applicability of the proposed method.

  18. Dispersion - does it degrade a pulse envelope

    International Nuclear Information System (INIS)

    Deighton, M.O.

    1985-01-01

    In hostile environments, transmitting information as ultrasonic Lamb wave pulses has advantages, since the stainless steel strip serving as a waveguide is very durable. Besides attenuation, velocity dispersion (inherent in Lamb waves) can be important even in fairly short guides. Theory shows that unlimited propagation of a pulsed r.f. envelope is possible, even with dispersion present. The constant group velocity needed would favour a₀-mode pulses over other modes, provided ordinary attenuation is small. An approximate formula indicates the useful range of a pulse, when group velocity does vary. (author)

  19. Shape Control of Responsive Building Envelopes

    DEFF Research Database (Denmark)

    Foged, Isak Worre; Kirkegaard, Poul Henning; Christensen, Jesper Thøger

    2010-01-01

    The present paper considers shape control of adaptive architectural structures for improvement of structural performance by recognizing changes in their environments and loads, adapting to meet goals, and using past events to improve future performance or maintain serviceability. The general scop...... environmental system to a primary structural system joint into a collective behavioral system equipment with an actuator system is presented....... alternatives. The adaptive structure is a proposal for a responsive building envelope which is an idea of a first level operational framework for present and future investigations towards performance based responsive architectures through a set of responsive typologies. A mock-up concept of a secondary...

  20. Infrared spectrophotometry and radiative transfer in optically thick circumstellar dust envelopes

    International Nuclear Information System (INIS)

    Merrill, K.M.

    1976-01-01

    The Two-Micron Sky Survey of Neugebauer and Leighton and, more recently, the AFCRL Infrared Sky Survey of Walker and Price have detected numerous compact, isolated, bright infrared sources which are not identified with previously cataloged stars. Observations of many such objects suggest that extensive circumstellar dust envelopes modify the flux from a central source. The present investigations employ broad bandpass photometry at λλ 1.65-12.5 μm and narrow bandpass spectrophotometry (Δλ/λ ≈ 0.015) at λλ 2-4 μm and λλ 8-13 μm to determine the properties of a large sample of such infrared sources. Infrared spectrophotometry can clearly differentiate between normal stars of spectral types M ("oxygen-rich") and C ("carbon-rich") on the basis of characteristic absorption bands arising in cool stellar atmospheres. Most of the 2 μm Sky Survey and many of the AFCRL Sky Survey sources appear to be stars of spectral types M and C which are differentiated from normal cool comparison stars only by the presence of extensive circumstellar dust envelopes. Due to the large optical depth of the envelopes, the flux from the star and from the dust cannot be simply separated. Hence solutions of radiative transfer through spherically symmetric envelopes of arbitrary optical depth were generated by a generalized computer code which employed opacities of real dust

  1. The role of short-time intensity and envelope power for speech intelligibility and psychoacoustic masking.

    Science.gov (United States)

    Biberger, Thomas; Ewert, Stephan D

    2017-08-01

    The generalized power spectrum model [GPSM; Biberger and Ewert (2016). J. Acoust. Soc. Am. 140, 1023-1038], combining the "classical" concept of the power-spectrum model (PSM) and the envelope power spectrum-model (EPSM), was demonstrated to account for several psychoacoustic and speech intelligibility (SI) experiments. The PSM path of the model uses long-time power signal-to-noise ratios (SNRs), while the EPSM path uses short-time envelope power SNRs. A systematic comparison of existing SI models for several spectro-temporal manipulations of speech maskers and gender combinations of target and masker speakers [Schubotz et al. (2016). J. Acoust. Soc. Am. 140, 524-540] showed the importance of short-time power features. Conversely, Jørgensen et al. [(2013). J. Acoust. Soc. Am. 134, 436-446] demonstrated a higher predictive power of short-time envelope power SNRs than power SNRs using reverberation and spectral subtraction. Here the GPSM was extended to utilize short-time power SNRs and was shown to account for all psychoacoustic and SI data of the three mentioned studies. The best processing strategy was to exclusively use either power or envelope-power SNRs, depending on the experimental task. By analyzing both domains, the suggested model might provide a useful tool for clarifying the contribution of amplitude modulation masking and energetic masking.

  2. An Experimental Study of Cavitation Detection in a Centrifugal Pump Using Envelope Analysis

    Science.gov (United States)

    Tan, Chek Zin; Leong, M. Salman

    Cavitation represents one of the most common faults in pumps and can potentially lead to a series of failures in the mechanical seal, impeller, bearing, shaft, motor, etc. In this work, an experimental rig was set up to investigate cavitation detection using the vibration envelope analysis method; sound, pressure and flow rate were also measured to assess the feasibility of cavitation detection. The experimental testing covered 3 operating points of the centrifugal pump (B.E.P., 90% of B.E.P. and 80% of B.E.P.). The suction pressure of the centrifugal pump was decreased gradually until the inception point of cavitation. Vibration measurements were undertaken at various locations including the casing, bearing, and suction and discharge flanges of the centrifugal pump. Comparisons of envelope spectrums under cavitating and non-cavitating conditions are presented. Envelope analysis proved useful in detecting cavitation over the 3 testing conditions. During normal operating conditions, the vibration peak synchronous to rotational speed was more pronounced; under cavitation, however, the half-order sub-harmonic vibration component was clearly evident in the envelope spectrums at all measurement locations except the pump bearing. A possible explanation for the strong sub-harmonic (½ of BPF) during cavitation in the centrifugal pump is insufficient time for the bubbles to collapse completely before the end of a single cycle.
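
    As an illustration of the envelope analysis method referred to above, the sketch below demodulates a synthetic amplitude-modulated vibration signal with the Hilbert transform and inspects its envelope spectrum. The sampling rate, carrier and modulation frequencies are assumed values, not measurements from the test rig.

```python
# Minimal envelope-analysis sketch: a synthetic carrier amplitude-modulated at
# the shaft rate is demodulated with the Hilbert transform and its envelope
# spectrum is inspected. All values are assumed, not measured pump data.
import numpy as np
from numpy.fft import rfft, rfftfreq
from scipy.signal import hilbert

fs = 10_000                                   # sampling rate, Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)
f_rot = 25.0                                  # shaft rotational frequency, Hz (assumed)
carrier = np.sin(2 * np.pi * 3000.0 * t)      # structural resonance excited by impacts
signal = (1 + 0.5 * np.sin(2 * np.pi * f_rot * t)) * carrier
signal += 0.1 * np.random.default_rng(0).normal(size=t.size)

envelope = np.abs(hilbert(signal))            # amplitude envelope
envelope -= envelope.mean()                   # remove the DC component before the FFT
spectrum = np.abs(rfft(envelope)) / envelope.size
freqs = rfftfreq(envelope.size, 1 / fs)

peak = freqs[1:][np.argmax(spectrum[1:])]
print(f"dominant envelope-spectrum component: {peak:.1f} Hz")
```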

  3. Construction Project Success ranking through the Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Mazyar Zahedi-Seresht

    2014-09-01

    Full Text Available The purpose of this paper is to rank construction projects' success in the post-delivery phase. To attain this objective, a data envelopment analysis (DEA) approach is used. The model's output is a project success index calculated from five project success criteria. These criteria, determined by a two-round Delphi questionnaire survey, are time performance, cost performance, quality, HSE, and customer satisfaction. The input factors which affect the output measures are Organizational Sponsorship, Project Manager Competency, Customer Organization, Project Operational Environment and Organizational Experience. A questionnaire was used to determine these factors. The model is applied to 9 projects with different weightings of the output and input factors, and reasonable results are achieved for ranking these projects.

  4. Complex envelope control of pulsed accelerating fields in superconducting cavities

    CERN Document Server

    Czarski, T

    2010-01-01

    A digital control system for superconducting cavities of a linear accelerator is presented in this work. FPGA (Field Programmable Gate Arrays) based controller, managed by MATLAB, was developed to investigate a novel firmware implementation. The LLRF - Low Level Radio Frequency system for FLASH project in DESY is introduced. Essential modeling of a cavity resonator with signal and power analysis is considered as a key approach to the control methods. An electrical model is represented by the non-stationary state space equation for the complex envelope of the cavity voltage driven by the current generator and the beam loading. The electromechanical model of the superconducting cavity resonator including the Lorentz force detuning has been developed for a simulation purpose. The digital signal processing is proposed for the field vector detection. The field vector sum control is considered for multiple cavities driven by one klystron. An algebraic, complex domain model is proposed for the system analysis. The c...

  5. A preliminary comparison of hydrodynamic approaches for flood inundation modeling of urban areas in Jakarta Ciliwung river basin

    Science.gov (United States)

    Rojali, Aditia; Budiaji, Abdul Somat; Pribadi, Yudhistira Satya; Fatria, Dita; Hadi, Tri Wahyu

    2017-07-01

    This paper addresses numerical modeling approaches for flood inundation in urban areas. A decisive strategy for choosing between 1D, 2D or hybrid 1D-2D models is important for optimizing flood inundation analyses. Finding a cost-effective yet robust and accurate model has been our priority and motivation in the absence of available High Performance Computing facilities. The application of 1D, 1D/2D and full 2D modeling approaches to a river flood study in the Jakarta Ciliwung river basin, and a comparison of the approaches benchmarked for the inundation study, are presented. This study demonstrates the successful use of 1D/2D and 2D systems to model the Jakarta Ciliwung river basin in terms of inundation results and computational cost. The findings provide an interesting comparison between modeling approaches (HEC-RAS 1D, 1D-2D, 2D, and ANUGA) when benchmarked against the Manggarai water level measurement.

  6. Comparison of Exposure in the Kaplan Versus the Kocher Approach in the Treatment of Radial Head Fractures.

    Science.gov (United States)

    Barnes, Leslie Fink; Lombardi, Joseph; Gardner, Thomas R; Strauch, Robert J; Rosenwasser, Melvin P

    2018-01-01

    The aim of this study was to compare the complete visible surface area of the radial head, neck, and coronoid in the Kaplan and Kocher approaches to the lateral elbow. The hypothesis was that the Kaplan approach would afford greater visibility due to the differential anatomy of the intermuscular planes. Ten cadavers were dissected with the Kaplan and Kocher approaches, and the visible surface area was measured in situ using a 3-dimensional digitizer. Six measurements were taken for each approach by 2 surgeons, and the means of these measurements were analyzed. The mean surface area visible with the lateral collateral ligament (LCL) preserved was 616.6 mm² in the Kaplan approach, compared with 136.2 mm² in the Kocher approach. Using a 2-way analysis of variance, the difference between these 2 approaches was statistically significant. When the LCL complex was incised in the Kocher approach, the average visible surface area was 456.1 mm², still statistically less than with the Kaplan approach. The average surface area of the coronoid visible using a proximally extended Kaplan approach was 197.8 mm². The Kaplan approach affords significantly greater visible surface area of the proximal radius than the Kocher approach.

  7. Envelope as Climate Negotiator: Evaluating adaptive building envelope's capacity to moderate indoor climate and energy

    Science.gov (United States)

    Erickson, James

    Through manipulation of adaptable opportunities available within a given environment, individuals become active participants in managing personal comfort requirements, by exercising control over their comfort without the assistance of mechanical heating and cooling systems. Similarly, continuous manipulation of a building skin's form, insulation, porosity, and transmissivity qualities exerts control over the energy exchanged between indoor and outdoor environments. This research uses four adaptive response variables in a modified software algorithm to explore an adaptive building skin's potential in reacting to environmental stimuli with the purpose of minimizing energy use without sacrificing occupant comfort. Results illustrate that significant energy savings can be realized with adaptive envelopes over static building envelopes even under extreme summer and winter climate conditions; that the magnitude of these savings are dependent on climate and orientation; and that occupant thermal comfort can be improved consistently over comfort levels achieved by optimized static building envelopes. The resulting adaptive envelope's unique climate-specific behavior could inform designers in creating an intelligent kinetic aesthetic that helps facilitate adaptability and resiliency in architecture.

  8. Effect of air turbulence on gas transport in soil; comparison of approaches

    Science.gov (United States)

    Pourbakhtiar, Alireza; Papadikis, Konstantinos; Poulsen, Tjalfe; Bridge, Jonathan; Wilkinson, Stephen

    2017-04-01

    Greenhouse gases play a key role in global warming, and soil is a source of greenhouse gases such as methane (CH4). Radon (Rn), a radioactive gas, can be emitted from the subsurface into the atmosphere and leads to health concerns in urban areas. Temperature, humidity, air pressure and vegetation can affect gas emissions inside soil (Oertel et al., 2016). Wind-induced pressure fluctuations have been shown in many cases to be an important factor in gas transport through soil and other porous media; one example is landfill gas emissions (Poulsen et al., 2001). We applied an experimental setup for measuring the effect of controlled air turbulence on gas transport in soil in relation to sample depth. Two approaches for measuring the effect of wind turbulence on gas transport were applied and compared. Experiments were carried out with diffusion of CO2 and air as tracer gases at average vertical wind speeds of 0 to 0.83 m s⁻¹. In approach A, six different sample thicknesses from 5 to 30 cm were used and a total of 4 different wind conditions with different speeds and fluctuations were applied. In approach B, a sample with constant depth was used and five oxygen sensors were placed inside the sample at different depths. A total of 111 experiments were carried out. Gas transport is described by the advection-dispersion equation and quantified as a dispersion coefficient. Oxygen breakthrough curves as a function of distance to the sample surface exposed to wind were derived numerically with an explicit forward-time, central-space finite-difference model to evaluate gas transport. We showed that wind turbulence-induced fluctuations are an important factor that can increase gas transport by, on average, 45 times more than molecular diffusion under zero-wind conditions. Comparison of the two experimental strategies indicated that constant-depth samples (approach B) are more reliable for measuring gas transport under the influence of wind
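
    The sketch below illustrates the kind of explicit forward-time, central-space (FTCS) finite-difference scheme mentioned above for the 1-D advection-dispersion equation, producing a concentration profile from which a breakthrough curve could be read. The dispersion coefficient, velocity and sample depth are assumed values, not fitted results from the experiments.

```python
# Sketch of an explicit FTCS scheme for the 1-D advection-dispersion equation.
# D, v and the sample depth are assumed for illustration only.
import numpy as np

L, nx = 0.30, 61                   # sample depth (m) and number of grid points
dx = L / (nx - 1)
D, v = 5e-6, 1e-5                  # dispersion coefficient (m^2/s), velocity (m/s)
dt = 0.4 * dx**2 / D               # time step satisfying the diffusive stability limit

c = np.zeros(nx)                   # normalized oxygen concentration, initially zero
c[0] = 1.0                         # surface exposed to wind: fixed concentration

t, t_end = 0.0, 3600.0             # simulate one hour
while t < t_end:
    cn = c.copy()
    # dc/dt = D d2c/dx2 - v dc/dx, central differences in space
    c[1:-1] = (cn[1:-1]
               + D * dt / dx**2 * (cn[2:] - 2 * cn[1:-1] + cn[:-2])
               - v * dt / (2 * dx) * (cn[2:] - cn[:-2]))
    c[-1] = c[-2]                  # zero-gradient condition at the bottom
    t += dt

print("concentration profile after 1 h:", np.round(c[::10], 3))
```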

  9. Surgically induced astigmatism after phacoemulsification by temporal clear corneal and superior clear corneal approach: a comparison

    Directory of Open Access Journals (Sweden)

    Nikose AS

    2018-01-01

    Full Text Available Archana Sunil Nikose, Dhrubojyoti Saha, Pradnya Mukesh Laddha, Mayuri Patil Department of Ophthalmology, N.K.P. Salve Institute and LMH, Nagpur, Maharashtra, India Introduction: Cataract surgery has undergone various advances since it evolved from ancient couching to modern phacoemulsification. Surgically induced astigmatism (SIA) remains one of the most common complications. The sutureless clear corneal incision has gained increasing popularity worldwide because it offers several advantages over the traditional sutured limbal incision and scleral tunnel. A clear corneal incision has the benefit of being bloodless and having an easy approach, but SIA is still a concern. Purpose: In this study, we evaluated the SIA in clear corneal incisions with temporal approach and superior approach phacoemulsification. Comparisons between the two incisions were made using keratometric readings of preoperative and postoperative refractive status. Methodology: This was a hospital-based prospective interventional comparative randomized controlled trial of 261 patients conducted in a rural tertiary care center from September 2012 to August 2014. Visual acuity and detailed anterior and posterior segment examinations were done, and the cataract was graded according to the Lens Opacification Classification System II. Patients were divided into two groups for phacoemulsification, group A and group B, who underwent the temporal and superior clear corneal approach, respectively. The patients were followed up on days 1, 7, 30, and 90 postoperatively. The parameters recorded were uncorrected visual acuity, best-corrected visual acuity, slit lamp examination, and keratometry. The mean difference in SIA between the 30th and 90th day was statistically evaluated using a paired t-test, and all analyses were performed using SPSS 18.0 (SPSS Inc.) software. Results: The mean postoperative SIA in group A was 0.998 D on the 30th day, which

  10. Grain formation in cool stellar envelopes

    International Nuclear Information System (INIS)

    Deguchi, S.

    1980-01-01

    The nucleation and growth of dust grains in the stellar envelope are investigated for the case of oxygen-rich stars, where the mass loss occurs as a result of the radiation pressure on the dust grains. The number density of grains, the final grain sizes, and the final amount of metals remaining in gaseous states are calculated based on the grain-nucleation theory proposed by Yamamoto and Hasegawa and Draine and Salpeter. It is shown that, even if we base our calculations on the Lothe-Pound nucleation rate equation instead of the classical, homogeneous nucleation rate equation, the proposed theory gives a number density of grains quite similar to that based on the classical rate equation. The approximate solution of the flow in this paper brings physical insight to the problem of how the formation of grains couples to the flow passing the sonic point. The metals in the outer envelope remain in gaseous state at a level of 1-10% of the initial content for a mass-loss rate of 10⁻⁵ M☉ yr⁻¹, and at less than 1% for mass-loss rates less than 3 × 10⁻⁶ M☉ yr⁻¹. Species of metals condensed onto the grains are also discussed

  11. Bellanca building, Yellowknife : building envelope retrofit project

    Energy Technology Data Exchange (ETDEWEB)

    Rajewski, G. [A.D. Williams Engineering Inc., Edmonton, AB (Canada)

    2008-07-01

    The Bellanca building is a ten-story, commercial office building, located in Yellowknife, Northwest Territories. The owner was concerned about annual fuel consumption, relative to other buildings of similar size. Tenants reported cold drafts and some ice build-up had been reported in the past, on the exterior of the cladding. In addition, some water penetration had occurred during rainfall. This presentation provided background information on the Bellanca building and discussed a building envelope retrofit project. A.D. Williams was hired in late 2006 in order to provide an opinion on the present condition of the building envelope. This presentation described the site investigation and presented an interior and exterior review of the building. It also presented a thermographic survey in order to map thermal anomalies and establish trends. Following acceptance of the report on findings, one of five options was selected for further development. This included removal of existing cladding, exterior gypsum wallboard, fiberglass insulation and application of BASF Walltite CT foam, sheathing, rigid insulation, drainage plane and new cladding. The preliminary design was then presented. This paper also described the tender and award of the contract; construction phase; and substantial completion of the project. tabs, figs.

  12. Chemistry of Protostellar Envelopes and Disks

    Science.gov (United States)

    Flores Rivera, Lizxandra; Terebey, Susan; Willacy, Karen

    2018-06-01

    Molecule formation is dynamic during the protostellar collapse phase, driven by changes in temperature, density, and UV radiation as gas and dust flow from the envelope onto the forming protoplanetary disk. In this work, we compare physical models based on two different collapse solutions. We modeled the chemistry (created by Karen Willacy) for C18O to see how its abundance changes over time, using as primary input parameters the temperature and density profiles produced by the dust radiative transfer (MCRT) model HOCHUNK3D from Whitney (2003). Given this model, we produce synthetic line emission maps of the Class 0/I protostar L1527 IRS using the RADMC3D code and compare them with previous ALMA observations. High concentrations of gas-phase C18O are found within 20 AU in regions of the envelope close to the disk surface. In the outermost part of the disk surface, C18O freezes out beyond 400 AU, showing a much reduced abundance where the temperature profile drops below 25 K. In cold regions, the radiation field plays an important role in the chemistry.

  13. Solution of K-V envelope equations

    International Nuclear Information System (INIS)

    Anderson, O.A.

    1995-04-01

    The envelope equations for a KV beam with space charge have been analyzed systematically by an ε expansion followed by integrations. The focusing profile as a function of axial length is assumed to be symmetric but otherwise arbitrary. Given the beam current, emittance, and peak focusing field, we find the envelopes a(s) and b(s) and obtain ⟨a⟩, a_max, σ, and σ_0. Explicit results are presented for various truncations of the expansion. The zeroth order results correspond to those from the well-known smooth approximation; the same convenient format is retained for the higher order cases. The first order results, involving single correction terms, give 3-10 times better accuracy and are good to ∼1% at σ_0 = 70°. Third order gives a factor of 10-30 improvement over the smooth approximation and derived quantities accurate to ∼1% at σ_0 = 112°. The first order expressions are convenient design tools. They lend themselves to variable energy problems and have been applied to the design, construction, and testing of ESQ accelerators at LBL
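
    For comparison with the analytic expansion described above, the K-V envelope equations can also be integrated numerically; a minimal sketch using SciPy is given below. The focusing profile, perveance and emittance are illustrative values, not the parameters of the ESQ accelerators mentioned.

```python
# Numerical integration of the K-V envelope equations as a cross-check of the
# analytic expansion. Perveance K, emittance eps and the piecewise-constant
# focusing profile kappa(s) are illustrative assumptions only.
import numpy as np
from scipy.integrate import solve_ivp

K   = 1e-4      # generalized perveance (assumed)
eps = 1e-5      # unnormalized emittance, m*rad (assumed)
S   = 0.5       # focusing period, m (assumed)

def kappa(s):
    """Symmetric FODO-like focusing: +k0, drift, -k0, drift (assumed profile)."""
    k0, fill = 20.0, 0.25
    phase = (s % S) / S
    if phase < fill:
        return k0
    if 0.5 <= phase < 0.5 + fill:
        return -k0
    return 0.0

def rhs(s, y):
    a, ap, b, bp = y
    # a'' + kappa(s) a - 2K/(a+b) - eps^2/a^3 = 0, and the mirror equation for b
    app = -kappa(s) * a + 2 * K / (a + b) + eps**2 / a**3
    bpp = +kappa(s) * b + 2 * K / (a + b) + eps**2 / b**3
    return [ap, app, bp, bpp]

sol = solve_ivp(rhs, [0.0, 10 * S], [2e-3, 0.0, 2e-3, 0.0], max_step=1e-3)
a, b = sol.y[0], sol.y[2]
print(f"a_max = {a.max() * 1e3:.3f} mm, b_max = {b.max() * 1e3:.3f} mm")
```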

  14. Evaluating the effect of corridors and landscape heterogeneity on dispersal probability: a comparison of three spatially explicit modelling approaches

    DEFF Research Database (Denmark)

    Jepsen, J. U.; Baveco, J. M.; Topping, C. J.

    2004-01-01

    preferences of the modeller, rather than by a critical evaluation of model performance. We present a comparison of three common spatial simulation approaches (patch-based incidence-function model (IFM), individual-based movement model (IBMM), individual-based population model including detailed behaviour...

  15. Identifying functional reorganization of spelling networks: an individual peak probability comparison approach

    Science.gov (United States)

    Purcell, Jeremy J.; Rapp, Brenda

    2013-01-01

    Previous research has shown that damage to the neural substrates of orthographic processing can lead to functional reorganization during reading (Tsapkini et al., 2011); in this research we ask if the same is true for spelling. To examine the functional reorganization of spelling networks we present a novel three-stage Individual Peak Probability Comparison (IPPC) analysis approach for comparing the activation patterns obtained during fMRI of spelling in a single brain-damaged individual with dysgraphia to those obtained in a set of non-impaired control participants. The first analysis stage characterizes the convergence in activations across non-impaired control participants by applying a technique typically used for characterizing activations across studies: Activation Likelihood Estimate (ALE) (Turkeltaub et al., 2002). This method was used to identify locations that have a high likelihood of yielding activation peaks in the non-impaired participants. The second stage provides a characterization of the degree to which the brain-damaged individual's activations correspond to the group pattern identified in Stage 1. This involves performing a Mahalanobis distance statistics analysis (Tsapkini et al., 2011) that compares each of a control group's peak activation locations to the nearest peak generated by the brain-damaged individual. The third stage evaluates the extent to which the brain-damaged individual's peaks are atypical relative to the range of individual variation among the control participants. This IPPC analysis allows for a quantifiable, statistically sound method for comparing an individual's activation pattern to the patterns observed in a control group and, thus, provides a valuable tool for identifying functional reorganization in a brain-damaged individual with impaired spelling. Furthermore, this approach can be applied more generally to compare any individual's activation pattern with that of a set of other individuals. PMID:24399981

  16. Evaluation of users' satisfaction on pedestrian facilities using pair-wise comparison approach

    International Nuclear Information System (INIS)

    Zainol, R; Ahmad, F; Nordin, N A; Aripin, A W M

    2014-01-01

    Global climate change issues demand that people around the world change the way they live, and current cities need to be redeveloped towards lower carbon use in their day-to-day operations. A pedestrianized environment is one of the approaches used to reduce the carbon footprint of cities. Heritage cities are the first to be examined, since they were built in an era when motorized vehicles were minimal. This research therefore explores users' satisfaction with the physical attributes of pedestrianization in Melaka Historical City, a UNESCO World Heritage Site. It aims to examine users' satisfaction with the pedestrian facilities provided within the study area using a pair-wise questionnaire comparison approach. A survey of 200 respondents using random sampling was conducted at six sites, namely Jonker Street, Church Street, Kota Street, Goldsmith Street, Merdeka Street to Taming Sari Tower and Merdeka Street to the River Cruise terminal. The survey consists of an assessment tool based on a nine-point scale of users' satisfaction with pathway properties, zebra pedestrian crossings, street furniture, personal safety, adjacency to traffic flow, aesthetics and amenities. The analytic hierarchy process (AHP) was used to avoid bias in analyzing the collected data. Findings show that Merdeka Street to Taming Sari Tower scores the highest satisfaction level and fulfils all the required needs of a pedestrianized environment. Similar assessment elements can be used to evaluate existing streets in other cities, and these criteria should also be used in planning future cities
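
    A minimal sketch of the AHP step described above follows: priority weights are derived from a pair-wise comparison matrix via its principal eigenvector, and a consistency ratio is checked. The judgement matrix is hypothetical, not the survey data from the study.

```python
# Illustrative AHP calculation: priority weights from a pair-wise comparison
# matrix via the principal eigenvector, with a consistency check.
import numpy as np

# pair-wise judgements on Saaty's 1-9 scale for three criteria (hypothetical)
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                 # normalized priority weights

n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)         # consistency index
RI = 0.58                                    # Saaty's random index for n = 3
print("weights:", np.round(w, 3), " consistency ratio:", round(CI / RI, 3))
```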

  17. A comparison and assessment of approaches for modelling flow over in-line tube banks

    International Nuclear Information System (INIS)

    Iacovides, Hector; Launder, Brian; West, Alastair

    2014-01-01

    Highlights: • We present wall-resolved LES and URANS simulations of periodic flow in heated in-line tube banks. • Simulations of flow in a confined in-line tube-bank are compared with experimental data. • When pitch-to-diameter (P/D) ratio becomes less than 1.6, the periodic flow becomes skewed. • URANS tested here unable to mimic the periodic flow at P/D = 1.6. • In confined tube banks URANS suggest alternate, in the axial direction, flow deflection. - Abstract: The paper reports experiences from applying alternative strategies for modelling turbulent flow and local heat-transfer coefficients around in-line tube banks. The motivation is the simulation of conditions in the closely packed cross-flow heat exchangers used in advanced gas-cooled nuclear reactors (AGRs). The main objective is the flow simulation in large-scale tube banks with confining walls. The suitability and accuracy of wall-resolved large-eddy simulation (LES) and Unsteady Reynolds-Averaged Navier–Stokes (URANS) approaches are examined for generic, square, in-line tube banks, where experimental data are limited but available. Within the latter approach, both eddy-viscosity and Reynolds-stress-transport models have been tested. The assumption of flow periodicity in all three directions is investigated by varying the domain size. It is found that the path taken by the fluid through the tube-bank configuration differs according to the treatment of turbulence and whether the flow is treated as two- or three-dimensional. Finally, the important effect of confining walls has been examined by making direct comparison with the experiments of the complete test rig of Aiba et al. (1982)

  18. Metal Mixture Modeling Evaluation project: 2. Comparison of four modeling approaches

    Science.gov (United States)

    Farley, Kevin J.; Meyer, Joe; Balistrieri, Laurie S.; DeSchamphelaere, Karl; Iwasaki, Yuichi; Janssen, Colin; Kamo, Masashi; Lofts, Steve; Mebane, Christopher A.; Naito, Wataru; Ryan, Adam C.; Santore, Robert C.; Tipping, Edward

    2015-01-01

    As part of the Metal Mixture Modeling Evaluation (MMME) project, models were developed by the National Institute of Advanced Industrial Science and Technology (Japan), the U.S. Geological Survey (USA), HDR|HydroQual, Inc. (USA), and the Centre for Ecology and Hydrology (UK) to address the effects of metal mixtures on biological responses of aquatic organisms. A comparison of the 4 models, as they were presented at the MMME Workshop in Brussels, Belgium (May 2012), is provided herein. Overall, the models were found to be similar in structure (free ion activities computed by WHAM; specific or non-specific binding of metals/cations in or on the organism; specification of metal potency factors and/or toxicity response functions to relate metal accumulation to biological response). Major differences in modeling approaches are attributed to various modeling assumptions (e.g., single versus multiple types of binding site on the organism) and specific calibration strategies that affected the selection of model parameters. The models provided a reasonable description of additive (or nearly additive) toxicity for a number of individual toxicity test results. Less-than-additive toxicity was more difficult to describe with the available models. Because of limitations in the available datasets and the strong inter-relationships among the model parameters (log K_M values, potency factors, toxicity response parameters), further evaluation of specific model assumptions and calibration strategies is needed.

  19. Perfluoroalkyl substances in aquatic environment-comparison of fish and passive sampling approaches.

    Science.gov (United States)

    Cerveny, Daniel; Grabic, Roman; Fedorova, Ganna; Grabicova, Katerina; Turek, Jan; Kodes, Vit; Golovko, Oksana; Zlabek, Vladimir; Randak, Tomas

    2016-01-01

    The concentrations of seven perfluoroalkyl substances (PFASs) were investigated in 36 European chub (Squalius cephalus) individuals from six localities in the Czech Republic. Chub muscle and liver tissue were analysed at all sampling sites. In addition, analyses of 16 target PFASs were performed in Polar Organic Chemical Integrative Samplers (POCISs) deployed in the water at the same sampling sites. We evaluated the possibility of using passive samplers as a standardized method for monitoring PFAS contamination in aquatic environments and the mutual relationships between determined concentrations. Only perfluorooctane sulphonate was above the LOQ in fish muscle samples and 52% of the analysed fish individuals exceeded the Environmental Quality Standard for water biota. Fish muscle concentration is also particularly important for risk assessment of fish consumers. The comparison of fish tissue results with published data showed the similarity of the Czech results with those found in Germany and France. However, fish liver analysis and the passive sampling approach resulted in different fish exposure scenarios. The total concentration of PFASs in fish liver tissue was strongly correlated with POCIS data, but pollutant patterns differed between these two matrices. The differences could be attributed to the metabolic activity of the living organism. In addition to providing a different view regarding the real PFAS cocktail to which the fish are exposed, POCISs fulfil the Three Rs strategy (replacement, reduction, and refinement) in animal testing. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. On Thermally Interacting Multiple Boreholes with Variable Heating Strength: Comparison between Analytical and Numerical Approaches

    Directory of Open Access Journals (Sweden)

    Marc A. Rosen

    2012-08-01

    Full Text Available The temperature response in the soil surrounding multiple boreholes is evaluated analytically and numerically. The assumption of constant heat flux along the borehole wall is examined by coupling the problem to the heat transfer problem inside the borehole and presenting a model with variable heat flux along the borehole length. In the analytical approach, a line source of heat with a finite length is used to model the conduction of heat in the soil surrounding the boreholes. In the numerical method, a finite volume method in a three dimensional meshed domain is used. In order to determine the heat flux boundary condition, the analytical quasi-three-dimensional solution to the heat transfer problem of the U-tube configuration inside the borehole is used. This solution takes into account the variation in heating strength along the borehole length due to the temperature variation of the fluid running in the U-tube. Thus, critical depths at which thermal interaction occurs can be determined. Finally, in order to examine the validity of the numerical method, a comparison is made with the results of line source method.
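
    The analytical approach described above is commonly implemented as a finite line source with a mirror image across the ground surface; the sketch below evaluates that integral with SciPy for one borehole. Soil properties, heating rate and borehole length are assumed values, not those of the paper.

```python
# Finite-line-source sketch: temperature rise at radius r and depth z around
# one borehole of length H with constant heating rate per unit length, using a
# mirror image across the ground surface. All parameter values are assumed.
import numpy as np
from scipy.integrate import quad
from scipy.special import erfc

q_l   = 50.0      # heat injection per unit borehole length, W/m (assumed)
lam   = 2.0       # soil thermal conductivity, W/(m K) (assumed)
alpha = 1e-6      # soil thermal diffusivity, m^2/s (assumed)
H     = 100.0     # borehole length, m (assumed)

def delta_T(r, z, t):
    """Temperature rise at (r, z) after time t for a constant line heating rate."""
    s = 2.0 * np.sqrt(alpha * t)
    def integrand(h):
        d1 = np.hypot(r, z - h)   # distance to the real source element
        d2 = np.hypot(r, z + h)   # distance to the image source element
        return erfc(d1 / s) / d1 - erfc(d2 / s) / d2
    val, _ = quad(integrand, 0.0, H, limit=200)
    return q_l / (4.0 * np.pi * lam) * val

one_year = 365.0 * 24.0 * 3600.0
print(f"temperature rise at r = 3 m, z = 50 m after one year: {delta_T(3.0, 50.0, one_year):.2f} K")
```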

  1. Sales Comparison Approach Indicating Heterogeneity of Particular Type of Real Estate and Corresponding Valuation Accuracy

    Directory of Open Access Journals (Sweden)

    Martin Cupal

    2017-01-01

    Full Text Available The article focuses on the heterogeneity of goods, namely real estate, and consequently deals with market valuation accuracy. The heterogeneity of real estate property means, in particular, that every unit is unique in terms of its construction, condition, financing and, above all, location, so assessing its value is necessarily difficult. The research also indicates the relative efficiency of markets across property types based on their level of variability. The research is based on two databases consisting of various types of real estate with specific market parameters; these parameters determine the differences across the types and reveal heterogeneity. The first database was built from valuations by the sales comparison approach and the second from data on real properties offered on the market. The methodology is based on univariate and multivariate statistics of key variables of those databases. The multivariate analysis is performed with a Hotelling T² control chart and statistics with appropriate numerical characteristics. The results of both databases were combined using weights with regard to the dependence criterion of the variables. The final results indicate potential valuation accuracy across the types. The main contribution of the research is that the evaluation is not only derived from price deviation or distribution, but also draws on the causes of real property heterogeneity as a whole.

  2. Mechanism of protein import across the chloroplast envelope.

    Science.gov (United States)

    Chen, K; Chen, X; Schnell, D J

    2000-01-01

    The development and maintenance of chloroplasts relies on the contribution of protein subunits from both plastid and nuclear genomes. Most chloroplast proteins are encoded by nuclear genes and are post-translationally imported into the organelle across the double membrane of the chloroplast envelope. Protein import into the chloroplast consists of two essential elements: the specific recognition of the targeting signals (transit sequences) of cytoplasmic preproteins by receptors at the outer envelope membrane and the subsequent translocation of preproteins simultaneously across the double membrane of the envelope. These processes are mediated via the co-ordinate action of protein translocon complexes in the outer (Toc apparatus) and inner (Tic apparatus) envelope membranes.

  3. Adaptive Flight Envelope Estimation and Protection, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Impact Technologies, in collaboration with the Georgia Institute of Technology, proposes to develop and demonstrate an innovative flight envelope estimation and...

  4. Analyzing Information Systems Development: A Comparison and Analysis of Eight IS Development Approaches.

    Science.gov (United States)

    Iivari, Juhani; Hirschheim, Rudy

    1996-01-01

    Analyzes and compares eight information systems (IS) development approaches: Information Modelling, Decision Support Systems, the Socio-Technical approach, the Infological approach, the Interactionist approach, the Speech Act-based approach, Soft Systems Methodology, and the Scandinavian Trade Unionist approach. Discusses the organizational roles…

  5. Radiative transfer in spherical circumstellar dust envelopes. III. Dust envelope models of some well known infrared stars

    International Nuclear Information System (INIS)

    Apruzese, J.P.

    1975-01-01

    The radiative transfer techniques described elsewhere by the author have been employed to construct dust envelope models of several well known infrared stars. The resulting calculations indicate that the infrared emissivity of circumstellar grains generally must be higher than that which many calculations of small nonsilicate grains yield. This conclusion is dependent to some degree on the (unknown) size of the stellar envelopes considered, but is quite firm in the case of the spatially resolved envelope of IRC+10216. Further observations of the spatial distribution of the infrared radiation from stellar envelopes will be invaluable in deciphering the properties of the circumstellar grains

  6. Evolution of building envelope construction techniques in coastal British Columbia

    Energy Technology Data Exchange (ETDEWEB)

    Mattock, C.; Ito, K.; Oshikawa, T. [International Eco-House Inc., (Canada)

    1999-11-01

    Significant evolutionary developments over the past 3 years in building envelope construction for multi-storey wood frame housing in British Columbia are described. The urban areas of this region are characterized by a maritime climate which features a high frequency of wind driven rain and little accumulation of snow. Buildings are exposed to high wetting with little drying potential, and moderate temperatures allow for fungal growth even in the winter. While, as in the rest of Canada, wetting is often due to condensation of moisture contained in indoor air as it leaks out of the building, in British Columbia wind driven rain is a much larger source of moisture. Given this, the following principles of moisture control have been promoted to the B.C. building industry in order of priority: 1) deflection - using parts and elements of the building such as overhangs and flashings that reduce the exposure of the exterior walls to rain, 2) drainage - using envelope assemblies that redirect liquid water to the outside, 3) drying - employing elements that promote drying through diffusion such as highly permeable wall sheathings, and 4) durable materials - using materials that resist rot such as treated lumber, stainless steel fastenings, etc. Because of recent building code requirements for enhanced airtightness and air barrier durability, combined with the use of rain screen construction, a variety of air barrier systems other than the conventional sealed polyethylene approach have been employed, including airtight drywall, an exterior permeable membrane, and an exterior impermeable membrane.

  7. Psychoeducation for hypochondriasis : A comparison of a cognitive-behavioural approach and a problem-solving approach

    NARCIS (Netherlands)

    Buwalda, Femke M.; Bouman, Theo. K.; van Duijn, Marijtje A. J.; Van der Duin, M.

    In this study, two 6-week psychoeducational courses for hypochondriasis are compared, one based on the cognitive-behavioural approach, and the other on the problem-solving approach. Effects of both courses on hypochondriacal complaints, depression, trait anxiety, and number of problems encountered

  8. Identification of the dynamic operating envelope of HCCI engines using class imbalance learning.

    Science.gov (United States)

    Janakiraman, Vijay Manikandan; Nguyen, XuanLong; Sterniak, Jeff; Assanis, Dennis

    2015-01-01

    Homogeneous charge compression ignition (HCCI) is a futuristic automotive engine technology that can significantly improve fuel economy and reduce emissions. HCCI engine operation is constrained by combustion instabilities, such as knock, ringing, misfires, high-variability combustion, and so on, and it becomes important to identify the operating envelope defined by these constraints for use in engine diagnostics and controller design. HCCI combustion is dominated by complex nonlinear dynamics, and a first-principle-based dynamic modeling of the operating envelope becomes intractable. In this paper, a machine learning approach is presented to identify the stable operating envelope of HCCI combustion, by learning directly from the experimental data. Stability is defined using thresholds on combustion features obtained from engine in-cylinder pressure measurements. This paper considers instabilities arising from engine misfire and high-variability combustion. A gasoline HCCI engine is used for generating stable and unstable data observations. Owing to an imbalance in class proportions in the data set, the models are developed both based on resampling the data set (by undersampling and oversampling) and based on a cost-sensitive learning method (by overweighting the minority class relative to the majority class observations). Support vector machines (SVMs) and recently developed extreme learning machines (ELM) are utilized for developing dynamic classifiers. The results compared against linear classification methods show that cost-sensitive nonlinear ELM and SVM classification algorithms are well suited for the problem. However, the SVM envelope model requires about 80% more parameters for an accuracy improvement of 3% compared with the ELM envelope model indicating that ELM models may be computationally suitable for the engine application. The proposed modeling approach shows that HCCI engine misfires and high-variability combustion can be predicted ahead of time
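
    A minimal sketch of the cost-sensitive classification idea described above is given below, using scikit-learn's class weighting to overweight the minority (unstable) class. It is not the authors' ELM/SVM implementation, and the data are randomly generated rather than engine measurements.

```python
# Cost-sensitive SVM sketch: the minority class is overweighted via class_weight
# instead of resampling the data set. Data are synthetic, not HCCI measurements.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# imbalanced two-class problem: roughly 10% of samples in the "unstable" class
X, y = make_classification(n_samples=2000, n_features=8, weights=[0.9, 0.1],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# overweight the minority class relative to the majority class
clf = SVC(kernel="rbf", gamma="scale", class_weight={0: 1.0, 1: 9.0})
clf.fit(X_tr, y_tr)

print(classification_report(y_te, clf.predict(X_te), digits=3))
```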

  9. Comparison of student's learning achievement through realistic mathematics education (RME) approach and problem solving approach on grade VII

    Science.gov (United States)

    Ilyas, Muhammad; Salwah

    2017-02-01

    This research was experimental. The purpose of this study was to determine the difference in, and the quality of, students' learning achievement between students taught through the Realistic Mathematics Education (RME) approach and students taught through the problem solving approach. This was a quasi-experimental study with a non-equivalent experiment group design. The population was all grade VII students in one junior high school in Palopo in the second semester of the 2015/2016 academic year. Two classes were selected purposively as the research sample: class VII-5 (28 students) as experiment group I and class VII-6 (23 students) as experiment group II. The treatment in experiment group I was learning through the RME approach, whereas experiment group II learned through the problem solving approach. Data were collected by giving students a pretest and a posttest. The analysis used descriptive statistics and inferential statistics using a t-test. Based on the descriptive statistics, the average mathematics score of students taught using the problem solving approach was similar to that of students taught using the RME approach, both being in the high category. It can also be concluded that (1) there was no difference in mathematics learning outcomes between students taught using the RME approach and students taught using the problem solving approach, and (2) the quality of learning achievement of students who received the RME approach and the problem solving approach was the same, being in the high category.

  10. Polarimetry and physics of Be star envelopes

    International Nuclear Information System (INIS)

    Coyne, G.V.; McLean, I.S.

    1982-01-01

    A review of the most recent developments in polarization studies of Be stars is presented. New polarization techniques for high-resolution spectropolarimetry and for near infrared polarimetry are described and a wide range of new observations are discussed. These include broad-band, intermediate-band and multichannel observations of the continuum polarization of Be stars in the wavelength interval 0.3-2.2 microns, high resolution (0.5 A) line profile polarimetry of a few stars and surveys of many stars for the purposes of statistical analyses. The physical significance of the observational material is discussed in the light of recent theoretical models. Emphasis is placed on the physical and geometrical parameters of Be star envelopes which polarimetry helps to determine. (Auth.)

  11. Enveloping branes and brane-world singularities

    Energy Technology Data Exchange (ETDEWEB)

    Antoniadis, Ignatios; Cotsakis, Spiros [CERN-Theory Division, Department of Physics, Geneva 23 (Switzerland); Klaoudatou, Ifigeneia [University of the Aegean, Research Group of Geometry, Dynamical Systems and Cosmology, Department of Information and Communication Systems Engineering, Samos (Greece)

    2014-12-01

    The existence of envelopes is studied for systems of differential equations in connection with the method of asymptotic splittings which allows one to determine the singularity structure of the solutions. The result is applied to brane-worlds consisting of a 3-brane in a five-dimensional bulk, in the presence of an analog of a bulk perfect fluid parameterizing a generic class of bulk matter. We find that all flat brane solutions suffer from a finite-distance singularity contrary to previous claims. We then study the possibility of avoiding finite-distance singularities by cutting the bulk and gluing regular solutions at the position of the brane. Further imposing physical conditions such as finite Planck mass on the brane and positive energy conditions on the bulk fluid, excludes, however, this possibility as well. (orig.)

  12. Performance measurement with fuzzy data envelopment analysis

    CERN Document Server

    Tavana, Madjid

    2014-01-01

    The intensity of global competition and ever-increasing economic uncertainties has led organizations to search for more efficient and effective ways to manage their business operations.  Data envelopment analysis (DEA) has been widely used as a conceptually simple yet powerful tool for evaluating organizational productivity and performance. Fuzzy DEA (FDEA) is a promising extension of the conventional DEA proposed for dealing with imprecise and ambiguous data in performance measurement problems. This book is the first volume in the literature to present the state-of-the-art developments and applications of FDEA. It is designed for students, educators, researchers, consultants and practicing managers in business, industry, and government with a basic understanding of the DEA and fuzzy logic concepts.

  13. Pushing the Envelope of Extreme Space Weather

    Science.gov (United States)

    Pesnell, W. D.

    2014-12-01

    Extreme Space Weather events are large solar flares or geomagnetic storms, which can cost billions of dollars to recover from. We have few examples of such events; the Carrington Event (the solar superstorm) is one of the few that had superlatives in three categories: size of solar flare, drop in Dst, and amplitude of aa. Kepler observations show that stars similar to the Sun can have flares releasing millions of times more energy than an X-class flare. These flares and the accompanying coronal mass ejections could strongly affect the atmosphere surrounding a planet. What level of solar activity would be necessary to strongly affect the atmosphere of the Earth? Can we map out the envelope of space weather along the evolution of the Sun? What would space weather look like if the Sun stopped producing a magnetic field? To what extreme should Space Weather go? These are the extremes of Space Weather explored in this talk.

  14. Data envelopment analysis of randomized ranks

    Directory of Open Access Journals (Sweden)

    Sant'Anna Annibal P.

    2002-01-01

    Full Text Available Probabilities and odds, derived from vectors of ranks, are here compared as measures of efficiency of decision-making units (DMUs. These measures are computed with the goal of providing preliminary information before starting a Data Envelopment Analysis (DEA or the application of any other evaluation or composition of preferences methodology. Preferences, quality and productivity evaluations are usually measured with errors or subject to influence of other random disturbances. Reducing evaluations to ranks and treating the ranks as estimates of location parameters of random variables, we are able to compute the probability of each DMU being classified as the best according to the consumption of each input and the production of each output. Employing the probabilities of being the best as efficiency measures, we stretch distances between the most efficient units. We combine these partial probabilities in a global efficiency score determined in terms of proximity to the efficiency frontier.
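
    One way to read the idea above is to treat each observed rank as the location of a random variable and estimate, by Monte Carlo, the probability that each DMU is the best on a criterion; the sketch below does this with Gaussian disturbances. The ranks and the noise model are assumptions for illustration, not the paper's exact procedure.

```python
# Rank-randomization sketch: ranks are treated as locations of random
# disturbances and the probability of each DMU being best is estimated by
# Monte Carlo. The ranks and Gaussian noise model are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
ranks = np.array([1, 2, 3, 5, 4])        # observed ranks of 5 DMUs on one criterion (1 = best)
n_draws = 100_000

samples = ranks + rng.normal(0.0, 1.0, size=(n_draws, ranks.size))
best = samples.argmin(axis=1)            # lowest perturbed rank wins each draw
prob_best = np.bincount(best, minlength=ranks.size) / n_draws
print("estimated probability of being the best:", np.round(prob_best, 3))
```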

  15. Enhanced conformational sampling using enveloping distribution sampling.

    Science.gov (United States)

    Lin, Zhixiong; van Gunsteren, Wilfred F

    2013-10-14

    To lessen the problem of insufficient conformational sampling in biomolecular simulations is still a major challenge in computational biochemistry. In this article, an application of the method of enveloping distribution sampling (EDS) is proposed that addresses this challenge and its sampling efficiency is demonstrated in simulations of a hexa-β-peptide whose conformational equilibrium encompasses two different helical folds, i.e., a right-handed 2.7(10∕12)-helix and a left-handed 3(14)-helix, separated by a high energy barrier. Standard MD simulations of this peptide using the GROMOS 53A6 force field did not reach convergence of the free enthalpy difference between the two helices even after 500 ns of simulation time. The use of soft-core non-bonded interactions in the centre of the peptide did enhance the number of transitions between the helices, but at the same time led to neglect of relevant helical configurations. In the simulations of a two-state EDS reference Hamiltonian that envelops both the physical peptide and the soft-core peptide, sampling of the conformational space of the physical peptide ensures that physically relevant conformations can be visited, and sampling of the conformational space of the soft-core peptide helps to enhance the transitions between the two helices. The EDS simulations sampled many more transitions between the two helices and showed much faster convergence of the relative free enthalpy of the two helices compared with the standard MD simulations with only a slightly larger computational effort to determine optimized EDS parameters. Combined with various methods to smoothen the potential energy surface, the proposed EDS application will be a powerful technique to enhance the sampling efficiency in biomolecular simulations.

  16. The influence of ventilated façade on sound insulation properties of envelope walls

    Directory of Open Access Journals (Sweden)

    Fišarová Zuzana

    2017-01-01

    Full Text Available The presented article deals with the sound insulation properties of the envelope walls of timber structures. In particular, the influence of a heavy-board ventilated façade on laboratory airborne sound insulation R and Rw in dB was studied. The installation method and the gaps between façade boards can lead to building defects when the contribution of the ventilated cladding to the acoustic performance of the envelope wall is overrated. Real constructions were built for the experimental measurements: one with gaps between the boards and one with the gaps simply eliminated, for mutual comparison. The results obtained were processed into tables and graphs and used to derive recommendations for the design of this type of construction, including the general installation method of the façade boards. Detailed results are presented in the conclusions.

  17. Advanced Envelope Research for Factory Built Housing, Phase 3 -- Whole-House Prototyping

    Energy Technology Data Exchange (ETDEWEB)

    Levy, E.; Mullens, M.; Rath, P.

    2014-04-01

    The Advanced Envelope Research effort will provide factory homebuilders with high performance, cost-effective envelope designs that can be effectively integrated into the plant production process while meeting the thermal requirements of the 2012 IECC standards. Given the affordable nature of manufactured homes, impact on first cost is a major consideration in developing new envelope technologies. This work is part of a multi-phase effort. Phase 1 identified seven envelope technologies and provided a preliminary assessment of three methods for building high performance walls. Phase 2 focused on developing viable product designs, manufacturing strategies, addressing code and structural issues, and cost analysis of the three selected options. An industry advisory committee helped narrow the research focus to perfecting a stud wall design with exterior continuous insulation (CI). Phase 3, completed in two stages, continued the design development effort, exploring and evaluating a range of methods for applying CI to factory built homes. The scope also included material selection, manufacturing and cost analysis, and prototyping and testing. During this phase, a home was built with CI, evaluated, and placed in service. The experience of building a mock-up wall section with CI and then constructing a prototype home on line resolved important concerns about how to integrate the material into the production process. First steps were taken toward finding the least expensive approaches for incorporating CI in standard factory building practices, and a preliminary assessment suggested that even at this early stage the technology is attractive when viewed from a life cycle cost perspective.

  18. Statistical approaches in published ophthalmic clinical science papers: a comparison to statistical practice two decades ago.

    Science.gov (United States)

    Zhang, Harrison G; Ying, Gui-Shuang

    2018-02-09

    The aim of this study is to evaluate the current practice of statistical analysis of eye data in clinical science papers published in the British Journal of Ophthalmology (BJO) and to determine whether the practice of statistical analysis has improved over the past two decades. All clinical science papers (n=125) published in BJO in January-June 2017 were reviewed for the statistical approaches used to analyse the primary ocular measure. We compared our findings to the results from a previous paper that reviewed BJO papers in 1995. Of 112 papers eligible for analysis, half of the studies analysed the data at an individual level because of the nature of the observation, 16 (14%) studies analysed data from one eye only, 36 (32%) studies analysed data from both eyes at the ocular level, one study (1%) analysed the overall summary of the ocular finding per individual and three (3%) studies used a paired comparison. Among studies with data available from both eyes, 50 (89%) of 56 papers in 2017 did not analyse data from both eyes or ignored the intereye correlation, compared with 60 (90%) of 67 papers in 1995 (P=0.96). Among studies that analysed data from both eyes at the ocular level, 33 (92%) of 36 studies completely ignored the intereye correlation in 2017, compared with 16 (89%) of 18 studies in 1995 (P=0.40). A majority of studies did not analyse the data properly when data from both eyes were available, and the practice of statistical analysis has not improved over the past two decades. Collaborative efforts should be made in the vision research community to improve the practice of statistical analysis for ocular data. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
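
    One standard way to respect the intereye correlation discussed above is a linear mixed model with a random intercept per patient; the sketch below fits such a model with statsmodels on simulated two-eyes-per-patient data (not BJO study data).

```python
# Mixed-model sketch: a random intercept per patient prevents the two eyes of
# one person from being treated as independent observations. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_patients = 100
patient = np.repeat(np.arange(n_patients), 2)                    # two eyes per patient
patient_effect = np.repeat(rng.normal(0.0, 2.0, n_patients), 2)  # shared within a patient
eye_noise = rng.normal(0.0, 1.0, 2 * n_patients)                 # eye-level noise
treated = rng.integers(0, 2, 2 * n_patients)                     # hypothetical exposure
outcome = 15.0 + 1.5 * treated + patient_effect + eye_noise      # hypothetical ocular measure

df = pd.DataFrame({"outcome": outcome, "treated": treated, "patient": patient})
# random intercept per patient accounts for the intereye correlation
model = smf.mixedlm("outcome ~ treated", df, groups=df["patient"])
print(model.fit().summary())
```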

  19. An empirical comparison of different approaches for combining multimodal neuroimaging data with Support Vector Machine

    Directory of Open Access Journals (Sweden)

    William ePettersson-Yeo

    2014-07-01

    Full Text Available In the pursuit of clinical utility, neuroimaging researchers of psychiatric and neurological illness are increasingly using analyses, such as support vector machine (SVM), that allow inference at the single-subject level. Recent studies employing single-modality data, however, suggest that classification accuracies must be improved for such utility to be realised. One possible solution is to integrate different data types to provide a single combined output classification; either by generating a single decision function based on an integrated kernel matrix, or, by creating an ensemble of multiple single modality classifiers and integrating their predictions. Here, we describe four integrative approaches: (1) an un-weighted sum of kernels, (2) multi-kernel learning, (3) prediction averaging, and (4) majority voting, and compare their ability to enhance classification accuracy relative to the best single-modality classification accuracy. We achieve this by integrating structural, functional and diffusion tensor magnetic resonance imaging data, in order to compare ultra-high risk (UHR; n=19), first episode psychosis (FEP; n=19) and healthy control subjects (HCs; n=19). Our results show that (i) whilst integration can enhance classification accuracy by up to 13%, the frequency of such instances may be limited, (ii) where classification can be enhanced, simple methods may yield greater increases relative to more computationally complex alternatives, and, (iii) the potential for classification enhancement is highly influenced by the specific diagnostic comparison under consideration. In conclusion, our findings suggest that for moderately sized clinical neuroimaging datasets, combining different imaging modalities in a data-driven manner is no magic bullet for increasing classification accuracy.
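
    To make the integration schemes named in the abstract above concrete, the sketch below shows two of them on synthetic data: an un-weighted sum of per-modality kernels fed to a single SVM, and majority voting over single-modality SVMs. The data, feature dimensions and group labels are invented; this is not the authors' pipeline.

```python
# Minimal sketch (synthetic features, assumed shapes) of two of the
# integration schemes named above: an un-weighted sum of kernels and
# majority voting across single-modality classifiers.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 60
y = rng.integers(0, 2, n)                       # two diagnostic groups
modalities = [rng.normal(size=(n, 50)) + 0.4 * y[:, None] for _ in range(3)]

train, test = np.arange(0, 40), np.arange(40, n)

# (1) Un-weighted sum of kernels: one SVM on the combined kernel matrix.
K = sum(rbf_kernel(X, X) for X in modalities)
svm_sum = SVC(kernel="precomputed").fit(K[np.ix_(train, train)], y[train])
pred_sum = svm_sum.predict(K[np.ix_(test, train)])

# (4) Majority voting: one SVM per modality, combine the hard predictions.
votes = np.array([
    SVC(kernel="rbf").fit(X[train], y[train]).predict(X[test])
    for X in modalities
])
pred_vote = (votes.mean(axis=0) > 0.5).astype(int)

print("sum-of-kernels accuracy:", (pred_sum == y[test]).mean())
print("majority-vote accuracy: ", (pred_vote == y[test]).mean())
```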

  20. Social Comparison and Body Image in Adolescence: A Grounded Theory Approach

    Science.gov (United States)

    Krayer, A.; Ingledew, D. K.; Iphofen, R.

    2008-01-01

    This study explored the use of social comparison appraisals in adolescents' lives with particular reference to enhancement appraisals which can be used to counter threats to the self. Social comparison theory has been increasingly used in quantitative research to understand the processes through which societal messages about appearance influence…

  1. The Pennsylvania Phosphorus Index and TopoSWAT: A comparison of transport components and approaches

    Science.gov (United States)

    The regional Chesapeake Bay Conservation Innovation Grant Initiative includes comparison of TopoSWAT results and Phosphorus Index (P Index) evaluations of eight study watersheds throughout the Chesapeake Bay watershed. While similarities exist between the P Index and TopoSWAT, further comparison of ...

  2. Calculation of CWKB envelope in boson and fermion productions

    Indian Academy of Sciences (India)

    Abstract. We present the calculation of the envelope of boson production and of both low- and high-mass fermion production at the end of inflation, when the coherently oscillating inflatons decay into bosons and fermions. We consider three different models of inflation and use the CWKB technique to calculate the envelope to understand the ...

  3. 14 CFR 29.1517 - Limiting height-speed envelope.

    Science.gov (United States)

    2010-01-01

    14 CFR 29.1517, Operating Limitations - Limiting height-speed envelope. For Category A rotorcraft, if a range of... following power failure, the range of heights and its variation with forward speed must be established...

  4. Beam envelope profile of non-centrosymmetric polygonal phase space

    International Nuclear Information System (INIS)

    Chen Yinbao; Xie Xi

    1984-01-01

    The general theory of beam envelope profile of non-centrosymmetric polygonal phase space is developed. By means of this theory the beam envelope profile of non-centrosymmetric polygonal phase space can be calculated directly. An example is carried out in detail to show the practical application of the theory

  5. Distinguishing between forensic science and forensic pseudoscience: testing of validity and reliability, and approaches to forensic voice comparison.

    Science.gov (United States)

    Morrison, Geoffrey Stewart

    2014-05-01

    In this paper it is argued that one should not attempt to directly assess whether a forensic analysis technique is scientifically acceptable. Rather one should first specify what one considers to be appropriate principles governing acceptable practice, then consider any particular approach in light of those principles. This paper focuses on one principle: the validity and reliability of an approach should be empirically tested under conditions reflecting those of the case under investigation using test data drawn from the relevant population. Versions of this principle have been key elements in several reports on forensic science, including forensic voice comparison, published over the last four-and-a-half decades. The aural-spectrographic approach to forensic voice comparison (also known as "voiceprint" or "voicegram" examination) and the currently widely practiced auditory-acoustic-phonetic approach are considered in light of this principle (these two approaches do not appear to be mutually exclusive). Approaches based on data, quantitative measurements, and statistical models are also considered in light of this principle. © 2013.

  6. A prospective randomized peri- and post-operative comparison of the minimally invasive anterolateral approach versus the lateral approach

    OpenAIRE

    Stefan Landgraeber; Henning Quitmann; Sebastian Güth; Marcel Haversath; Wojciech Kowalczyk; Andrés Kecskeméthy; Hansjörg Heep; Marcus Jäger

    2013-01-01

    There is still controversy as to whether minimally invasive total hip arthroplasty enhances the postoperative outcome. The aim of this study was to compare the outcome of patients who underwent total hip replacement through an anterolateral minimally invasive (MIS) or a conventional lateral approach (CON). We performed a randomized, prospective study of 75 patients with primary hip arthritis, who underwent hip replacement through the MIS (n=36) or CON (n=39) approach. The Western Ontario and ...

  7. Towards the description of the phase behavior of electrolyte solutions in slit-like pores. Density functional approach for the restricted primitive model

    Directory of Open Access Journals (Sweden)

    O.Pizio

    2004-01-01

    Full Text Available We develop a density functional approach for the phase behavior of the restricted primitive model for electrolyte solutions confined to slit-like pores. The theory permits evaluation of the effects of confinement on the ionic vapor - ionic liquid coexistence envelope. We have shown that, due to confinement in pores with uncharged walls, the critical temperature of the model decreases compared to the bulk. Also, the coexistence envelope of the transition is narrower in comparison to the bulk model. The transition between the dense and dilute phases represents capillary evaporation. We have analyzed changes of the density profiles of ions during the transition. Possible extensions of this study are discussed.

  8. A Spectral Algorithm for Envelope Reduction of Sparse Matrices

    Science.gov (United States)

    Barnard, Stephen T.; Pothen, Alex; Simon, Horst D.

    1993-01-01

    The problem of reordering a sparse symmetric matrix to reduce its envelope size is considered. A new spectral algorithm for computing an envelope-reducing reordering is obtained by associating a Laplacian matrix with the given matrix and then sorting the components of a specified eigenvector of the Laplacian. This Laplacian eigenvector solves a continuous relaxation of a discrete problem related to envelope minimization called the minimum 2-sum problem. The permutation vector computed by the spectral algorithm is a closest permutation vector to the specified Laplacian eigenvector. Numerical results show that the new reordering algorithm usually computes smaller envelope sizes than those obtained from the current standard algorithms such as Gibbs-Poole-Stockmeyer (GPS) or SPARSPAK reverse Cuthill-McKee (RCM), in some cases reducing the envelope by more than a factor of two.
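
    The reordering idea summarised above can be sketched in a few lines: build the graph Laplacian of the sparsity pattern, take the eigenvector of the second-smallest eigenvalue (the Fiedler vector), and sort its components to obtain the permutation. The example below is a toy illustration on a random symmetric matrix and uses SciPy's reverse Cuthill-McKee for comparison; it is not the authors' implementation.

```python
# Sketch of the spectral reordering idea described above: associate a graph
# Laplacian with the sparsity pattern, take the eigenvector of the second-
# smallest eigenvalue (the Fiedler vector), and sort its entries to obtain
# a permutation.  Compared here against SciPy's reverse Cuthill-McKee.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import reverse_cuthill_mckee

def envelope_size(A):
    """Sum of row profiles of a symmetric sparse matrix (lower envelope)."""
    A = sp.csr_matrix(A)
    total = 0
    for i in range(A.shape[0]):
        cols = A.indices[A.indptr[i]:A.indptr[i + 1]]
        cols = cols[cols <= i]
        if cols.size:
            total += i - cols.min()
    return total

def spectral_order(A):
    """Permutation from sorting the Fiedler vector of the pattern Laplacian."""
    pattern = (sp.csr_matrix(A) != 0).astype(float)
    pattern.setdiag(0)
    degrees = np.asarray(pattern.sum(axis=1)).ravel()
    laplacian = sp.diags(degrees) - pattern
    eigvals, eigvecs = np.linalg.eigh(laplacian.toarray())  # dense: demo only
    fiedler = eigvecs[:, 1]                                 # 2nd-smallest eigenvalue
    return np.argsort(fiedler)

# Random symmetric test matrix with a nonzero diagonal.
A = sp.random(200, 200, density=0.02, random_state=1)
A = ((A + A.T) != 0).astype(float) + sp.eye(200)

for name, perm in [("original", np.arange(200)),
                   ("RCM", reverse_cuthill_mckee(sp.csr_matrix(A), symmetric_mode=True)),
                   ("spectral", spectral_order(A))]:
    P = A.tocsr()[perm, :][:, perm]
    print(f"{name:9s} envelope = {envelope_size(P)}")
```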

  9. Ultraviolet extinction in M-supergiant circumstellar envelopes

    International Nuclear Information System (INIS)

    Buss, R.H. Jr.; Snow, T.P. Jr.

    1986-01-01

    Using International Ultraviolet Explorer (IUE) archival low-dispersion spectra, ultraviolet spectral extinctions were derived for the circumstellar envelopes of two M supergiants: HD 60414 and HD 213310. The observed stellar systems belong to a class of widely-separated spectroscopic binaries that are called VV Cephei stars. The total extinction was calculated by dividing the reddened fluxes with unreddened comparison fluxes of similar stars (g B2.5 for HD 213310 and a normalized s+B3 for HD 60414) from the reference atlas. After subtracting the interstellar extinctions, which were estimated from the E(B-V) reddening of nearby stars, the resultant circumstellar extinctions were normalized at about 3.5 inverse microns. Not only is the 2175 A extinction bump absent in the circumstellar extinctions, but the far-ultraviolet extinction rise is also absent. The rather flat, ultraviolet extinction curves were interpreted as signatures of a population of noncarbonaceous, oxygen-rich grains with diameters larger than the longest observed wavelength.

  10. Self-regulatory Behaviors and Approaches to Learning of Arts Students: A Comparison Between Professional Training and English Learning.

    Science.gov (United States)

    Tseng, Min-Chen; Chen, Chia-Cheng

    2017-06-01

    This study investigated the self-regulatory behaviors of arts students, namely memory strategy, goal-setting, self-evaluation, seeking assistance, environmental structuring, learning responsibility, and planning and organizing. We also explored approaches to learning, including deep approach (DA) and surface approach (SA), in a comparison between students' professional training and English learning. The participants consisted of 344 arts majors. The Academic Self-Regulation Questionnaire and the Revised Learning Process Questionnaire were adopted to examine students' self-regulatory behaviors and their approaches to learning. The results show that a positive and significant correlation was found in students' self-regulatory behaviors between professional training and English learning. The results indicated that increases in using self-regulatory behaviors in professional training were associated with increases in applying self-regulatory behaviors in learning English. Seeking assistance, self-evaluation, and planning and organizing were significant predictors for learning English. In addition, arts students used the deep approach more often than the surface approach in both their professional training and English learning. A positive correlation was found in DA, whereas a negative correlation was shown in SA between students' self-regulatory behaviors and their approaches to learning. Students with high self-regulation adopted a deep approach, and they applied the surface approach less in professional training and English learning. In addition, a SEM model confirmed that DA had a positive influence; however, SA had a negative influence on self-regulatory behaviors.

  11. Post-closure biosphere assessment modelling: comparison of complex and more stylised approaches

    Energy Technology Data Exchange (ETDEWEB)

    Walke, Russell C. [Quintessa Limited, The Hub, 14 Station Road, Henley-on-Thames (United Kingdom); Kirchner, Gerald [University of Hamburg, ZNF, Beim Schlump 83, 20144 Hamburg (Germany); Xu, Shulan; Dverstorp, Bjoern [Swedish Radiation Safety Authority, SE-171 16 Stockholm (Sweden)

    2014-07-01

    to the biosphere. Some radionuclides do not reach equilibrium within the time frame that the biosphere evolves at the Forsmark site, making associated dose factors sensitive to time scales assumed for biosphere evolution. Comparison of the results generated by both types of model demonstrates that, for areas that evolve from marine, through lakes and mires to terrestrial systems with organic soils, the approach adopted in SKB's model is conservative. However, higher dose factors are possible when potential for long-term irrigation with shallow groundwater is considered. Surveys of groundwater wells in the Forsmark area today show that some shallow groundwater is used to water plants, which demonstrates that small scale irrigation from such sources cannot be ruled out for present-day or warmer climate states. Complex models use more of the available site-specific information and contribute to an understanding of complex process interactions and effects of system heterogeneity. The study shows, however, that simple 'reference' biosphere models enable processes that control potential radionuclide impacts to be identified, taking into account climate variability. They help to build understanding and confidence in more complex modelling approaches, quantify the conservatisms involved and remain a valuable tool for nuclear waste disposal licensing procedures. (authors)

  12. Envelope enhancement increases cortical sensitivity to interaural envelope delays with acoustic and electric hearing.

    Directory of Open Access Journals (Sweden)

    Douglas E H Hartley

    Full Text Available Evidence from human psychophysical and animal electrophysiological studies suggests that sensitivity to interaural time delay (ITD) in the modulating envelope of a high-frequency carrier can be enhanced using half-wave rectified stimuli. Recent evidence has shown potential benefits of equivalent electrical stimuli to deaf individuals with bilateral cochlear implants (CIs). In the current study we assessed the effects of envelope shape on ITD sensitivity in the primary auditory cortex of normal-hearing ferrets, and profoundly-deaf animals with bilateral CIs. In normal-hearing animals, cortical sensitivity to ITDs (±1 ms in 0.1-ms steps) was assessed in response to dichotically-presented (i) sinusoidal amplitude-modulated (SAM) and (ii) half-wave rectified (HWR) tones (100-ms duration; 70 dB SPL) presented at the best-frequency of the unit over a range of modulation frequencies. In separate experiments, adult ferrets were deafened with neomycin administration and bilaterally-implanted with intra-cochlear electrode arrays. Electrically-evoked auditory brainstem responses (EABRs) were recorded in response to bipolar electrical stimulation of the apical pair of electrodes with single biphasic current pulses (40 µs per phase) over a range of current levels to measure hearing thresholds. Subsequently, we recorded cortical sensitivity to ITDs (±800 µs in 80-µs steps) within the envelope of SAM and HWR biphasic-pulse trains (40 µs per phase; 6000 pulses per second, 100-ms duration) over a range of modulation frequencies. In normal-hearing animals, nearly a third of cortical neurons were sensitive to envelope-ITDs in response to SAM tones. In deaf animals with bilateral CI, the proportion of ITD-sensitive cortical neurons was approximately a fifth in response to SAM pulse trains. In normal-hearing and deaf animals with bilateral CI the proportion of ITD-sensitive units and neural sensitivity to ITDs increased in response to HWR, compared with SAM stimuli.

  13. Within-culture variations of uniqueness: towards an integrative approach based on social status, gender, life contexts, and interpersonal comparison.

    Science.gov (United States)

    Causse, Elsa; Félonneau, Marie-Line

    2014-01-01

    Research on uniqueness is widely focused on cross-cultural comparisons and tends to postulate a certain form of within-culture homogeneity. Taking the opposite course of this classic posture, we aimed at testing an integrative approach enabling the study of within-culture variations of uniqueness. This approach considered different sources of variation: social status, gender, life contexts, and interpersonal comparison. Four hundred seventy-nine participants completed a measure based on descriptions of "self" and "other." Results showed important variations of uniqueness. An interaction between social status and life contexts revealed the expression of uniqueness in the low-status group. This study highlights the complexity of uniqueness that appears to be related to both cultural ideology and social hierarchy.

  14. Time-History Seismic Analysis of Masonry Buildings: A Comparison between Two Non-Linear Modelling Approaches

    Directory of Open Access Journals (Sweden)

    Michele Betti

    2015-05-01

    Full Text Available The paper presents a comparison between two numerical modelling approaches employed to investigate the seismic behavior of unreinforced masonry buildings with flexible diaphragms. The comparison is performed analyzing a two-story prototype tested on a shaking table at the CNR-ENEA research center of Casaccia (Italy). The first numerical model was built by using the finite element (FE) technique, while the second one was built by a simplified macro-element (ME) approach. Both models were employed to perform non-linear dynamic analyses, integrating the equations of motion by step-by-step procedures. The shaking table tests were simulated to analyze the behavior of the prototype from the initial elastic state until the development of extensive damage. The main results of the analyses are discussed and critically compared in terms of engineering parameters, such as accelerations, displacements and base shears. The effectiveness of both models within the investigated typology of buildings is then evaluated in depth.

  15. On construction of two-dimensional Riemannian manifolds embedded into enveloping Euclidean (pseudo-Euclidean) space

    International Nuclear Information System (INIS)

    Saveliev, M.V.

    1983-01-01

    In the framework of the algebraic approach a construction of exactly integrable two-dimensional Riemannian manifolds embedded into enveloping Euclidean (pseudo-Euclidean) space Rsub(N) of an arbitrary dimension is presented. The construction is based on a reformulation of the Gauss, Peterson-Codazzi and Ricci equations in the form of a Lax-type representation in two-dimensional space. Here the Lax pair operators take the values in algebra SO(N)

  16. Marketing through Social Media : Case: Comparison of Social Media Marketing Approaches of B2C Companies for Company X

    OpenAIRE

    Rantapelkonen Ahlberg, Jaana

    2010-01-01

    Rantapelkonen Ahlberg, Jaana. 2010. Social Media Marketing. Case: Comparison of Social Media Marketing Approaches of B2C Companies for Company X. Master’s Thesis. Kemi-Tornio University of Applied Sciences. Business and Culture. Pages 42 (74). The objective of this thesis is to provide insights on how Company X can use Social Media as a marketing and branding tool in consumer marketing in the Swedish market. More specifically, this study attempts to define what kinds of social media are u...

  17. Which percentile-based approach should be preferred for calculating normalized citation impact values? An empirical comparison of five approaches including a newly developed citation-rank approach (P100)

    NARCIS (Netherlands)

    Bornmann, L.; Leydesdorff, L.; Wang, J.

    2013-01-01

    For comparisons of citation impacts across fields and over time, bibliometricians normalize the observed citation counts with reference to an expected citation value. Percentile-based approaches have been proposed as a non-parametric alternative to parametric central-tendency statistics. Percentiles

  18. Approaches towards airport economic performance measurement

    Directory of Open Access Journals (Sweden)

    Ivana STRYČEKOVÁ

    2011-01-01

    Full Text Available The paper aims to assess how economic benchmarking is used by airports as a means of performance measurement and comparison among major international airports. The study focuses on current benchmarking practices and methods, taking into account the factors according to which it is efficient to benchmark airport performance. The methods considered are mainly data envelopment analysis and stochastic frontier analysis; other approaches used by airports for economic benchmarking are also discussed. The main objective of this article is to evaluate the efficiency of the airports and to answer some open questions involving economic benchmarking of airports.

  19. MAGNETIZATION OF CLOUD CORES AND ENVELOPES AND OTHER OBSERVATIONAL CONSEQUENCES OF RECONNECTION DIFFUSION

    International Nuclear Information System (INIS)

    Lazarian, A.; Esquivel, A.; Crutcher, R.

    2012-01-01

    Recent observational results for magnetic fields in molecular clouds reviewed by Crutcher seem to be inconsistent with the predictions of the ambipolar diffusion theory of star formation. These include the measured decrease in mass to flux ratio between envelopes and cores, the failure to detect any self-gravitating magnetically subcritical clouds, the determination of the flat probability distribution function (PDF) of the total magnetic field strengths implying that there are many clouds with very weak magnetic fields, and the observed scaling B ∝ ρ^(2/3) that implies gravitational contraction with weak magnetic fields. We consider the problem of magnetic field evolution in turbulent molecular clouds and discuss the process of magnetic field diffusion mediated by magnetic reconnection. For this process that we termed 'reconnection diffusion', we provide a simple physical model and explain that this process is inevitable in view of the present-day understanding of MHD turbulence. We address the issue of the expected magnetization of cores and envelopes in the process of star formation and show that reconnection diffusion provides an efficient removal of magnetic flux that depends only on the properties of MHD turbulence in the core and the envelope. We show that as the amplitude of turbulence as well as the scale of turbulent motions decrease from the envelope to the core of the cloud, the diffusion of the magnetic field is faster in the envelope. As a result, the magnetic flux trapped during the collapse in the envelope is being released faster than the flux trapped in the core, resulting in much weaker fields in envelopes than in cores, as observed. We provide simple semi-analytical model calculations which support this conclusion and qualitatively agree with the observational results. Magnetic reconnection is also consistent with the lack of subcritical self-gravitating clouds, with the observed flat PDF of field strengths, and with the scaling of field strength

  20. MAGNETIZATION OF CLOUD CORES AND ENVELOPES AND OTHER OBSERVATIONAL CONSEQUENCES OF RECONNECTION DIFFUSION

    Energy Technology Data Exchange (ETDEWEB)

    Lazarian, A. [Astronomy Department, University of Wisconsin, Madison, WI 53706 (United States); Esquivel, A. [Instituto de Ciencias Nucleares, Universidad Nacional Autonoma de Mexico, Apartado Postal 70-543, 04510 Mexico D.F. (Mexico); Crutcher, R. [Department of Astronomy, University of Illinois at Urbana-Champaign, 1002 W. Green Street, Urbana, IL 61801 (United States)

    2012-10-01

    Recent observational results for magnetic fields in molecular clouds reviewed by Crutcher seem to be inconsistent with the predictions of the ambipolar diffusion theory of star formation. These include the measured decrease in mass to flux ratio between envelopes and cores, the failure to detect any self-gravitating magnetically subcritical clouds, the determination of the flat probability distribution function (PDF) of the total magnetic field strengths implying that there are many clouds with very weak magnetic fields, and the observed scaling B ∝ ρ^(2/3) that implies gravitational contraction with weak magnetic fields. We consider the problem of magnetic field evolution in turbulent molecular clouds and discuss the process of magnetic field diffusion mediated by magnetic reconnection. For this process that we termed 'reconnection diffusion', we provide a simple physical model and explain that this process is inevitable in view of the present-day understanding of MHD turbulence. We address the issue of the expected magnetization of cores and envelopes in the process of star formation and show that reconnection diffusion provides an efficient removal of magnetic flux that depends only on the properties of MHD turbulence in the core and the envelope. We show that as the amplitude of turbulence as well as the scale of turbulent motions decrease from the envelope to the core of the cloud, the diffusion of the magnetic field is faster in the envelope. As a result, the magnetic flux trapped during the collapse in the envelope is being released faster than the flux trapped in the core, resulting in much weaker fields in envelopes than in cores, as observed. We provide simple semi-analytical model calculations which support this conclusion and qualitatively agree with the observational results. Magnetic reconnection is also consistent with the lack of subcritical self-gravitating clouds, with the observed flat PDF of field strengths, and

  1. Comparison of different statistical modelling approaches for deriving spatial air temperature patterns in an urban environment

    Science.gov (United States)

    Straub, Annette; Beck, Christoph; Breitner, Susanne; Cyrys, Josef; Geruschkat, Uta; Jacobeit, Jucundus; Kühlbach, Benjamin; Kusch, Thomas; Richter, Katja; Schneider, Alexandra; Umminger, Robin; Wolf, Kathrin

    2017-04-01

    Frequently, spatial variations of air temperature of considerable magnitude occur within urban areas. They correspond to varying land use/land cover characteristics and vary with season, time of day and synoptic conditions. These temperature differences have an impact on human health and comfort directly by inducing thermal stress as well as indirectly by means of affecting air quality. Therefore, knowledge of the spatial patterns of air temperature in cities and the factors causing them is of great importance, e.g. for urban planners. A multitude of studies have shown statistical modelling to be a suitable tool for generating spatial air temperature patterns. This contribution presents a comparison of different statistical modelling approaches for deriving spatial air temperature patterns in the urban environment of Augsburg, Southern Germany. In Augsburg there exists a measurement network for air temperature and humidity currently comprising 48 stations in the city and its rural surroundings (corporately operated by the Institute of Epidemiology II, Helmholtz Zentrum München, German Research Center for Environmental Health and the Institute of Geography, University of Augsburg). Using different datasets for land surface characteristics (Open Street Map, Urban Atlas), area percentages of different types of land cover were calculated for quadratic buffer zones of different size (25, 50, 100, 250, 500 m) around the stations as well as for source regions of advective air flow and used as predictors together with additional variables such as sky view factor, ground level and distance from the city centre. Multiple Linear Regression and Random Forest models for different situations taking into account season, time of day and weather condition were applied utilizing selected subsets of these predictors in order to model spatial distributions of mean hourly and daily air temperature deviations from a rural reference station. Furthermore, the different model setups were
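
    A minimal sketch of the kind of model comparison described above is given below, with synthetic stand-ins for the predictors (land-cover fractions, sky view factor, distance from the city centre) and for the target (air temperature deviation from a rural reference). The variable names and the data-generating rule are invented for illustration.

```python
# Hedged sketch of a comparison between multiple linear regression and a
# random forest for station-wise temperature deviations.  All data below
# are synthetic; the predictors only mimic the kinds of variables named
# in the abstract above.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_stations = 48
X = np.column_stack([
    rng.uniform(0, 1, n_stations),      # sealed-surface fraction in buffer
    rng.uniform(0, 1, n_stations),      # vegetation fraction in buffer
    rng.uniform(0.3, 1.0, n_stations),  # sky view factor
    rng.uniform(0, 10, n_stations),     # distance from city centre (km)
])
# Synthetic nocturnal temperature deviation (K) from a rural reference.
y = (2.5 * X[:, 0] - 1.2 * X[:, 1] - 0.8 * X[:, 2] - 0.1 * X[:, 3]
     + rng.normal(0, 0.3, n_stations))

for name, model in [("multiple linear regression", LinearRegression()),
                    ("random forest", RandomForestRegressor(n_estimators=200,
                                                            random_state=0))]:
    rmse = -cross_val_score(model, X, y, cv=5,
                            scoring="neg_root_mean_squared_error")
    print(f"{name:28s} mean CV RMSE = {rmse.mean():.2f} K")
```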

  2. An empirical comparison of different approaches for combining multimodal neuroimaging data with support vector machine.

    Science.gov (United States)

    Pettersson-Yeo, William; Benetti, Stefania; Marquand, Andre F; Joules, Richard; Catani, Marco; Williams, Steve C R; Allen, Paul; McGuire, Philip; Mechelli, Andrea

    2014-01-01

    In the pursuit of clinical utility, neuroimaging researchers of psychiatric and neurological illness are increasingly using analyses, such as support vector machine, that allow inference at the single-subject level. Recent studies employing single-modality data, however, suggest that classification accuracies must be improved for such utility to be realized. One possible solution is to integrate different data types to provide a single combined output classification; either by generating a single decision function based on an integrated kernel matrix, or, by creating an ensemble of multiple single modality classifiers and integrating their predictions. Here, we describe four integrative approaches: (1) an un-weighted sum of kernels, (2) multi-kernel learning, (3) prediction averaging, and (4) majority voting, and compare their ability to enhance classification accuracy relative to the best single-modality classification accuracy. We achieve this by integrating structural, functional, and diffusion tensor magnetic resonance imaging data, in order to compare ultra-high risk (n = 19), first episode psychosis (n = 19) and healthy control subjects (n = 23). Our results show that (i) whilst integration can enhance classification accuracy by up to 13%, the frequency of such instances may be limited, (ii) where classification can be enhanced, simple methods may yield greater increases relative to more computationally complex alternatives, and, (iii) the potential for classification enhancement is highly influenced by the specific diagnostic comparison under consideration. In conclusion, our findings suggest that for moderately sized clinical neuroimaging datasets, combining different imaging modalities in a data-driven manner is no "magic bullet" for increasing classification accuracy. However, it remains possible that this conclusion is dependent on the use of neuroimaging modalities that had little, or no, complementary information to offer one another, and that the

  3. Selecting Energy Efficient Building Envelope Retrofits to Existing Department of Defense Building Using Value Focused Thinking

    National Research Council Canada - National Science Library

    Pratt, David M

    2006-01-01

    ... these facilities that have the greatest potential for energy efficient building envelope retrofits. There are hundreds of various new building envelope technologies available to retrofit an existing building envelope, including window, roof, and wall technologies...

  4. The limited role of recombination energy in common envelope removal

    Science.gov (United States)

    Grichener, Aldana; Sabach, Efrat; Soker, Noam

    2018-05-01

    We calculate the outward energy transport time by convection and photon diffusion in an inflated common envelope and find this time to be shorter than the envelope expansion time. We conclude therefore that most of the hydrogen recombination energy ends in radiation rather than in kinetic energy of the outflowing envelope. We use the stellar evolution code MESA and inject energy inside the envelope of an asymptotic giant branch star to mimic energy deposition by a spiraling-in stellar companion. During 1.7 years the envelope expands by a factor of more than 2. Along the entire evolution the convection can carry the energy very efficiently outwards, to the radius where radiative transfer becomes more efficient. The total energy transport time stays within several months, shorter than the dynamical time of the envelope. Had we included rapid mass loss, as is expected in the common envelope evolution, the energy transport time would have been even shorter. It seems that calculations that assume that most of the recombination energy ends in the outflowing gas might be inaccurate.

  5. Critical point analysis of phase envelope diagram

    Energy Technology Data Exchange (ETDEWEB)

    Soetikno, Darmadi; Siagian, Ucok W. R. [Department of Petroleum Engineering, Institut Teknologi Bandung, Jl. Ganesha 10, Bandung 40132 (Indonesia); Kusdiantara, Rudy, E-mail: rkusdiantara@s.itb.ac.id; Puspita, Dila, E-mail: rkusdiantara@s.itb.ac.id; Sidarto, Kuntjoro A., E-mail: rkusdiantara@s.itb.ac.id; Soewono, Edy; Gunawan, Agus Y. [Department of Mathematics, Institut Teknologi Bandung, Jl. Ganesha 10, Bandung 40132 (Indonesia)

    2014-03-24

    A phase diagram or phase envelope is a relation between temperature and pressure that shows the conditions of equilibrium between the different phases of chemical compounds, mixtures of compounds, and solutions. The phase diagram is an important issue in chemical thermodynamics and hydrocarbon reservoir engineering. It is very useful for process simulation, hydrocarbon reactor design, and petroleum engineering studies. It is constructed from the bubble line, dew line, and critical point. The bubble line and dew line are composed of bubble points and dew points, respectively. The bubble point is the first point at which gas is formed when a liquid is heated. Conversely, the dew point is the first point at which liquid is formed when a gas is cooled. The critical point is the point where all of the properties of the gas and liquid phases are equal, such as temperature, pressure, amount of substance, and others. The critical point is very useful in fuel processing and the dissolution of certain chemicals. In this paper, we derive the critical point analytically. The result is then compared with numerical calculations of the Peng-Robinson equation using the Newton-Raphson method. As case studies, several hydrocarbon mixtures are simulated using Matlab.
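
    The numerical side of the abstract above can be illustrated with a much simplified example: applying the Newton-Raphson method to the Peng-Robinson equation of state written as a cubic in the compressibility factor Z, here for a single pure component rather than a mixture. Constructing the full bubble and dew lines would additionally require mixing rules and fugacity matching; the constants and conditions below are illustrative only.

```python
# Simplified sketch of the kind of Newton-Raphson calculation mentioned
# above: solving the Peng-Robinson equation of state, written as a cubic in
# the compressibility factor Z, for a pure component (methane) at given T, P.
# A real phase-envelope construction would use mixtures, mixing rules and
# fugacity equality along the bubble and dew lines.
import numpy as np

R = 8.314  # J/(mol K)

def pr_cubic_coefficients(T, P, Tc, Pc, omega):
    """Coefficients of Z^3 + c2*Z^2 + c1*Z + c0 = 0 for the PR EOS."""
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1.0 + kappa * (1.0 - np.sqrt(T / Tc)))**2
    a = 0.45724 * (R * Tc)**2 / Pc * alpha
    b = 0.07780 * R * Tc / Pc
    A = a * P / (R * T)**2
    B = b * P / (R * T)
    return (-(1.0 - B), A - 3.0 * B**2 - 2.0 * B, -(A * B - B**2 - B**3))

def newton_raphson(f, df, z0, tol=1e-10, max_iter=50):
    """Basic Newton-Raphson iteration for a scalar root."""
    z = z0
    for _ in range(max_iter):
        step = f(z) / df(z)
        z -= step
        if abs(step) < tol:
            break
    return z

# Methane at 200 K and 3 MPa (critical constants from standard tables).
Tc, Pc, omega = 190.6, 4.599e6, 0.012
c2, c1, c0 = pr_cubic_coefficients(200.0, 3.0e6, Tc, Pc, omega)
f = lambda Z: Z**3 + c2 * Z**2 + c1 * Z + c0
df = lambda Z: 3 * Z**2 + 2 * c2 * Z + c1

Z_vapour = newton_raphson(f, df, z0=1.0)   # start near the ideal-gas value
print(f"vapour-like compressibility factor Z = {Z_vapour:.4f}")
```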

  6. Critical point analysis of phase envelope diagram

    International Nuclear Information System (INIS)

    Soetikno, Darmadi; Siagian, Ucok W. R.; Kusdiantara, Rudy; Puspita, Dila; Sidarto, Kuntjoro A.; Soewono, Edy; Gunawan, Agus Y.

    2014-01-01

    A phase diagram or phase envelope is a relation between temperature and pressure that shows the conditions of equilibrium between the different phases of chemical compounds, mixtures of compounds, and solutions. The phase diagram is an important issue in chemical thermodynamics and hydrocarbon reservoir engineering. It is very useful for process simulation, hydrocarbon reactor design, and petroleum engineering studies. It is constructed from the bubble line, dew line, and critical point. The bubble line and dew line are composed of bubble points and dew points, respectively. The bubble point is the first point at which gas is formed when a liquid is heated. Conversely, the dew point is the first point at which liquid is formed when a gas is cooled. The critical point is the point where all of the properties of the gas and liquid phases are equal, such as temperature, pressure, amount of substance, and others. The critical point is very useful in fuel processing and the dissolution of certain chemicals. In this paper, we derive the critical point analytically. The result is then compared with numerical calculations of the Peng-Robinson equation using the Newton-Raphson method. As case studies, several hydrocarbon mixtures are simulated using Matlab.

  7. The eikonal equation, envelopes and contact transformations

    International Nuclear Information System (INIS)

    Frittelli, Simonetta; Kamran, Niky; Newman, Ezra T

    2003-01-01

    We begin with an arbitrary but given conformal Lorentzian metric on an open neighbourhood, U, of a four-dimensional manifold (spacetime) and study families of solutions of the eikonal equation. In particular, the families that are of interest to us are the complete solutions. Their level surfaces form a two-parameter (points of S^2) family of foliations of U. We show that, from such a complete solution, it is possible to derive a pair of second-order PDEs defined solely on the parameter space S^2, i.e., they have no reference to the spacetime points. We then show that if one uses the classical envelope method for the construction of new complete solutions from any given complete solution, then the new pair of PDEs (found from the new complete solution) is related to the old pair by contact transformations in the second jet bundle over S^2. Further, we demonstrate that the pair of second-order PDEs obtained in this manner from any complete solution lies in a subclass of all pairs of second-order PDEs defined by the vanishing of a certain function obtained from the pair and is referred to as the generalized-Wuenschmann invariant. For completeness we briefly discuss the analogous issues associated with the eikonal equation in three dimensions. Finally we point out that conformally invariant geometric structures from the Lorentzian manifold have natural counterparts in the second jet bundle over S^2 on which the pair of PDEs lives.

  8. Ecosystem functioning is enveloped by hydrometeorological variability.

    Science.gov (United States)

    Pappas, Christoforos; Mahecha, Miguel D; Frank, David C; Babst, Flurin; Koutsoyiannis, Demetris

    2017-09-01

    Terrestrial ecosystem processes, and the associated vegetation carbon dynamics, respond differently to hydrometeorological variability across timescales, and so does our scientific understanding of the underlying mechanisms. Long-term variability of the terrestrial carbon cycle is not yet well constrained and the resulting climate-biosphere feedbacks are highly uncertain. Here we present a comprehensive overview of hydrometeorological and ecosystem variability from hourly to decadal timescales integrating multiple in situ and remote-sensing datasets characterizing extra-tropical forest sites. We find that ecosystem variability at all sites is confined within a hydrometeorological envelope across sites and timescales. Furthermore, ecosystem variability demonstrates long-term persistence, highlighting ecological memory and slow ecosystem recovery rates after disturbances. However, simulation results with state-of-the-art process-based models do not reflect this long-term persistent behaviour in ecosystem functioning. Accordingly, we develop a cross-time-scale stochastic framework that captures hydrometeorological and ecosystem variability. Our analysis offers a perspective for terrestrial ecosystem modelling and paves the way for new model-data integration opportunities in Earth system sciences.

  9. Fullerenes and fulleranes in circumstellar envelopes

    International Nuclear Information System (INIS)

    Zhang, Yong; Kwok, Sun; Sadjadi, SeyedAbdolreza

    2016-01-01

    Three decades of search have recently led to convincing discoveries of cosmic fullerenes. The presence of C60 and C60+ in both circumstellar and interstellar environments suggests that these molecules and their derivatives can be efficiently formed in circumstellar envelopes and survive in harsh conditions. Detailed analysis of the infrared bands from fullerenes and their connections with the local properties can provide valuable information on the physical conditions and chemical processes that occurred in the late stages of stellar evolution. The identification of C60+ as the carrier of four diffuse interstellar bands (DIBs) suggests that fullerene-related compounds are abundant in interstellar space and are essential for resolving the DIB mystery. Experiments have revealed a high hydrogenation rate when C60 is exposed to atomic hydrogen, motivating the attempt to search for cosmic fulleranes. In this paper, we present a short review of current knowledge of cosmic fullerenes and fulleranes and briefly discuss the implications on circumstellar chemistry. (paper)

  10. A comparison of conventional local approach and the short crack approach to fatigue crack initiation at a notch

    Energy Technology Data Exchange (ETDEWEB)

    Ranganathan, Narayanaswami; Leroy, Rene; Tougui, Abdellah [Laboratoire de Mecanique et Rheologie, Universite Francois Rabelais de Tours, Polytech Tours, Departement Mecanique et Conception de Systemes, Tours (France)

    2009-09-15

    Methods to estimate fatigue crack initiation life at a notch tip are compared. The methods used determine the strain amplitudes at the notch tip using Neuber's or Glinka's approximation. In conventional approaches, equivalent-damage levels are determined, using appropriate strain-life relationships coupled with damage-summation models. In the short-crack approach, a crack-like defect is assumed to exist at the notch tip. It is shown that the short-crack concept can be successfully applied to predict crack-initiation behavior at a notch. Model predictions are compared with carefully designed experiments. It is shown that model predictions are very close to experimentally measured lives under an aircraft-wing loading spectrum. (Abstract Copyright [2009], Wiley Periodicals, Inc.)

  11. A Prospective Randomized Peri- and Post-Operative Comparison of the Minimally Invasive Anterolateral Approach Versus the Lateral Approach

    Science.gov (United States)

    Landgraeber, Stefan; Quitmann, Henning; Güth, Sebastian; Haversath, Marcel; Kowalczyk, Wojciech; Kecskeméthy, Andrés; Heep, Hansjörg; Jäger, Marcus

    2013-01-01

    There is still controversy as to whether minimally invasive total hip arthroplasty enhances the postoperative outcome. The aim of this study was to compare the outcome of patients who underwent total hip replacement through an anterolateral minimally invasive (MIS) or a conventional lateral approach (CON). We performed a randomized, prospective study of 75 patients with primary hip arthritis, who underwent hip replacement through the MIS (n=36) or CON (n=39) approach. The Western Ontario and McMaster Universities Osteoarthritis Index and Harris Hip score (HHS) were evaluated at frequent intervals during the early postoperative follow-up period and then after 3.5 years. Pain sensations were recorded. Serological and radiological analyses were performed. In the MIS group the patients had smaller skin incisions and there was a significantly lower rate of patients with a positive Trendelenburg sign after six weeks postoperatively. After six weeks the HHS was 6.85 points higher in the MIS group (P=0.045). However, when the mean difference between the baseline and six-week HHS was calculated, no significant difference was found. Blood loss was greater and the duration of surgery was longer in the MIS group. The other parameters, especially after the twelfth week, did not differ significantly. Radiographs showed the inclination of the acetabular component to be significantly higher in the MIS group, but on average it was within the same permitted tolerance range as in the CON group. Both approaches are adequate for hip replacement. Given the data, there appears to be no significant long-term advantage to the MIS approach, as described in this study. PMID:24191179

  12. A prospective randomized peri- and post-operative comparison of the minimally invasive anterolateral approach versus the lateral approach

    Directory of Open Access Journals (Sweden)

    Stefan Landgraeber

    2013-07-01

    Full Text Available There is still controversy as to whether minimally invasive total hip arthroplasty enhances the postoperative outcome. The aim of this study was to compare the outcome of patients who underwent total hip replacement through an anterolateral minimally invasive (MIS) or a conventional lateral approach (CON). We performed a randomized, prospective study of 75 patients with primary hip arthritis, who underwent hip replacement through the MIS (n=36) or CON (n=39) approach. The Western Ontario and McMaster Universities Osteoarthritis Index and Harris Hip score (HHS) were evaluated at frequent intervals during the early postoperative follow-up period and then after 3.5 years. Pain sensations were recorded. Serological and radiological analyses were performed. In the MIS group the patients had smaller skin incisions and there was a significantly lower rate of patients with a positive Trendelenburg sign after six weeks postoperatively. After six weeks the HHS was 6.85 points higher in the MIS group (P=0.045). However, when the mean difference between the baseline and six-week HHS was calculated, no significant difference was found. Blood loss was greater and the duration of surgery was longer in the MIS group. The other parameters, especially after the twelfth week, did not differ significantly. Radiographs showed the inclination of the acetabular component to be significantly higher in the MIS group, but on average it was within the same permitted tolerance range as in the CON group. Both approaches are adequate for hip replacement. Given the data, there appears to be no significant long-term advantage to the MIS approach, as described in this study.

  13. Simulation-based comparison of two approaches frequently used for dynamic contrast-enhanced MRI

    International Nuclear Information System (INIS)

    Zwick, Stefan; Brix, Gunnar; Tofts, Paul S.; Strecker, Ralph; Kopp-Schneider, Annette; Laue, Hendrik; Semmler, Wolfhard; Kiessling, Fabian

    2010-01-01

    The purpose was to compare two approaches for the acquisition and analysis of dynamic-contrast-enhanced MRI data with respect to differences in the modelling of the arterial input-function (AIF), the dependency of the model parameters on physiological parameters and their numerical stability. Eight hundred tissue concentration curves were simulated for different combinations of perfusion, permeability, interstitial volume and plasma volume based on two measured AIFs and analysed according to the two commonly used approaches. The transfer constants K^trans (Approach 1) and k_ep (Approach 2) were correlated with all tissue parameters. K^trans showed a stronger dependency on perfusion, and k_ep on permeability. The volume parameters v_e (Approach 1) and A (Approach 2) were mainly influenced by the interstitial and plasma volume. Both approaches allow only rough characterisation of tissue microcirculation and microvasculature. Approach 2 seems to be somewhat more robust than 1, mainly due to the different methods of CA administration. (orig.)

  14. DATA ENVELOPMENT ANALYSIS OF BANKING SECTOR IN BANGLADESH

    Directory of Open Access Journals (Sweden)

    Md. Rashedul Hoque

    2012-05-01

    Full Text Available The banking sector of Bangladesh is flourishing and contributing to its economy. In this respect, measuring efficiency is important. The Data Envelopment Analysis technique is used for this purpose. The data are collected from the annual reports of twenty-four different banks in Bangladesh. Data Envelopment Analysis is mainly of two types - constant returns to scale and variable returns to scale. Since this study attempts to maximize output, the output-oriented Data Envelopment Analysis is used. The most efficient bank is the one that obtains the highest efficiency score.
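
    An output-oriented, constant-returns-to-scale DEA model of the kind referred to above can be written as one linear programme per bank (decision-making unit). The sketch below uses invented input and output figures; a real study would take these from the banks' annual reports.

```python
# Minimal sketch of an output-oriented, constant-returns-to-scale DEA model
# solved as a linear programme for each bank (DMU).  The input/output data
# are invented; a real study would use figures taken from annual reports
# (e.g. deposits and staff as inputs, loans and income as outputs).
import numpy as np
from scipy.optimize import linprog

# rows = DMUs (banks); columns = inputs / outputs
inputs  = np.array([[20.0, 300], [35.0, 500], [15.0, 250], [40.0, 700]])
outputs = np.array([[18.0, 5.0], [30.0, 9.0], [14.0, 4.5], [28.0, 7.0]])
n, m = inputs.shape        # number of DMUs, number of inputs
s = outputs.shape[1]       # number of outputs

def output_efficiency(o):
    """Maximise phi such that a composite peer uses no more input than DMU o
    and produces at least phi times DMU o's outputs."""
    # decision variables: [phi, lambda_1, ..., lambda_n]
    c = np.zeros(n + 1)
    c[0] = -1.0                                   # linprog minimises, so -phi
    A_ub = np.zeros((m + s, n + 1))
    b_ub = np.zeros(m + s)
    for i in range(m):                            # sum_j lam_j * x_ij <= x_io
        A_ub[i, 1:] = inputs[:, i]
        b_ub[i] = inputs[o, i]
    for r in range(s):                            # phi*y_ro - sum_j lam_j*y_rj <= 0
        A_ub[m + r, 0] = outputs[o, r]
        A_ub[m + r, 1:] = -outputs[:, r]
    bounds = [(0, None)] * (n + 1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

for o in range(n):
    phi = output_efficiency(o)
    print(f"bank {o}: phi = {phi:.3f}  (efficiency score = {1 / phi:.3f})")
```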

  15. Pre-paid envelopes commemorating the 2013 Open Days

    CERN Multimedia

    2013-01-01

    The post office on CERN's Prévessin site is still selling pre-paid envelopes commemorating the 2013 Open Days. Hurry while stocks last!   The special envelopes, which are valid in France for non-priority letters weighing up to 20 grams, are ideal for your Christmas and New Year correspondence. A set of ten envelopes, each featuring a different image, costs € 8.70 or 10 CHF. The post office is located in Building 866 on the Prévessin site and is open Mondays to Thursdays from 9.30 a.m. to 12.30 p.m.

  16. A comparison of two sampling approaches for assessing the urban forest canopy cover from aerial photography.

    Science.gov (United States)

    Ucar Zennure; Pete Bettinger; Krista Merry; Jacek Siry; J.M. Bowker

    2016-01-01

    Two different sampling approaches for estimating urban tree canopy cover were applied to two medium-sized cities in the United States, in conjunction with two freely available remotely sensed imagery products. A random point-based sampling approach, which involved 1000 sample points, was compared against a plot/grid sampling (cluster sampling) approach that involved a...

  17. A Comparison of Jungian, Person-Centered, and Gestalt Approaches to Personal Growth Groups.

    Science.gov (United States)

    Day, Bryon; Matthes, William

    1992-01-01

    Compares Jungian approach to personal growth groups to Person-centered and Gestalt approaches. Notes similarities, though Jungian approach adds dimension of "cognitive map" not found in other two. Notes that cognitive map uses constructs from Jung's theory of individuation process, hypothesizing that integration of these constructs into…

  18. Comparison of mandibular stability after SSRO with surgery-first approach versus conventional ortho-first approach.

    Science.gov (United States)

    Akamatsu, Tadashi; Hanai, Ushio; Miyasaka, Muneo; Muramatsu, Hiroyuki; Yamamoto, Shou

    2016-01-01

    Postoperative mandibular stability in the surgery-first (SF) approach and ortho-first (OF) approach in orthognathic surgery was retrospectively assessed using the lateral cephalo X-P in 38 patients with skeletal Angle Class III malocclusion who underwent sagittal split ramus osteotomy (SSRO). The postoperative mandibular relapse of the two groups observed from T1 (2 weeks after the surgery) to T2 (for the OF group, a year after surgery; for the SF group, the day orthodontic treatment was completed) was compared. The mean (SD) horizontal relapse at pogonion was 0.86 (0.92) mm in the forward direction in the SF group and 0.90 (1.09) mm in the forward direction in the OF group. No significant difference was found in the amount of horizontal movement between the two groups. On the other hand, the mean (SD) vertical relapse at pogonion was 1.59 (2.91) mm in the downward direction in the SF group and 0.14 (1.30) mm in the upward direction in the OF group, showing a significant difference in the amount of movement between the two groups. The degree of completion of the occlusion at T2 in the SF group was compared with that in the OF group by measuring OB, OJ, L1-occlusal plane angle, and interincisal angle. No significant difference was found between the two groups and the post-treatment occlusion was clinically favourable. Although the SF approach has several advantages for patients, the method of operation and fixation should be selected carefully to maintain postoperative mandibular stability.

  19. A comparison of trans-cranial and trans-sphenoidal approaches for vision improvement due to pitutary adenomas

    Directory of Open Access Journals (Sweden)

    Fakhr Tabatabai SA

    1997-07-01

    Full Text Available To improve visual disturbance, optic nerve decompression can be performed via transcranial or trans-sphenoidal approaches. Although the surgical exposure in the transcranial approach is favourable, the optic nerve's presence in the field may make it vulnerable to damage. Of eighty patients with different types of pituitary adenomas, 35 cases with medium-sized (1-3 cm) tumors were studied in a randomized clinical trial during a three-year period to compare the applicability of these approaches. A shorter hospital stay with better visual outcome was observed in the fifteen trans-sphenoidal cases in comparison to the 20 trans-cranial cases, although the preoperative visual status and underlying disorders were similar in both groups. Decompressing the optic apparatus trans-sphenoidally seems beneficial where there are no contraindications for the procedure in medium-sized pituitary adenomas.

  20. Model-Assisted Estimation of Tropical Forest Biomass Change: A Comparison of Approaches

    Directory of Open Access Journals (Sweden)

    Nikolai Knapp

    2018-05-01

    Full Text Available Monitoring of changes in forest biomass requires accurate transfer functions between remote sensing-derived changes in canopy height (ΔH) and the actual changes in aboveground biomass (ΔAGB). Different approaches can be used to accomplish this task: direct approaches link ΔH directly to ΔAGB, while indirect approaches are based on deriving AGB stock estimates for two points in time and calculating the difference. In some studies, direct approaches led to more accurate estimations, while, in others, indirect approaches led to more accurate estimations. It is unknown how each approach performs under different conditions and over the full range of possible changes. Here, we used a forest model (FORMIND) to generate a large dataset (>28,000 ha) of natural and disturbed forest stands over time. Remote sensing of forest height was simulated on these stands to derive canopy height models for each time step. Three approaches for estimating ΔAGB were compared: (i) the direct approach; (ii) the indirect approach; and (iii) an enhanced direct approach (dir+tex), using ΔH in combination with canopy texture. Total prediction accuracies of the three approaches measured as root mean squared errors (RMSE) were RMSEdirect = 18.7 t ha−1, RMSEindirect = 12.6 t ha−1 and RMSEdir+tex = 12.4 t ha−1. Further analyses revealed height-dependent biases in the ΔAGB estimates of the direct approach, which did not occur with the other approaches. Finally, the three approaches were applied on radar-derived (TanDEM-X) canopy height changes on Barro Colorado Island (Panama). The study demonstrates the potential of forest modeling for improving the interpretation of changes observed in remote sensing data and for comparing different methodologies.
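
    The direct-versus-indirect distinction drawn above can be illustrated with a toy simulation. The stands, the height-to-biomass rule and the noise model below are invented and are not the FORMIND experiments, but they show how the two transfer strategies are built and compared by RMSE.

```python
# Toy illustration (entirely synthetic) of the direct versus indirect
# estimation compared above: the "stands" and the height-to-biomass rule
# AGB ~ a * H^b are assumptions made for this sketch.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
n = 2000
H1 = rng.uniform(5, 40, n)                          # canopy height, time 1 (m)
H2 = np.clip(H1 + rng.normal(1.0, 4.0, n), 2, 45)   # time 2: growth/disturbance
true_agb = lambda H: 8.0 * H**1.3                   # assumed transfer rule (t/ha)
agb1 = true_agb(H1) * rng.lognormal(0, 0.15, n)     # "observed" stocks with noise
agb2 = true_agb(H2) * rng.lognormal(0, 0.15, n)
d_agb = agb2 - agb1

# Direct approach: regress biomass change on height change.
direct = LinearRegression().fit((H2 - H1).reshape(-1, 1), d_agb)
pred_direct = direct.predict((H2 - H1).reshape(-1, 1))

# Indirect approach: fit a stock model AGB(H) on a log-log scale, apply it
# at both dates and difference the two stock estimates.
stock = LinearRegression().fit(np.log(np.r_[H1, H2]).reshape(-1, 1),
                               np.log(np.r_[agb1, agb2]))
f = lambda H: np.exp(stock.predict(np.log(H).reshape(-1, 1)))
pred_indirect = f(H2) - f(H1)

rmse = lambda p: np.sqrt(np.mean((p - d_agb) ** 2))
print(f"direct   RMSE = {rmse(pred_direct):6.1f} t/ha")
print(f"indirect RMSE = {rmse(pred_indirect):6.1f} t/ha")
```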

  1. Comparison between bottom-up and top-down approaches in the estimation of measurement uncertainty.

    Science.gov (United States)

    Lee, Jun Hyung; Choi, Jee-Hye; Youn, Jae Saeng; Cha, Young Joo; Song, Woonheung; Park, Ae Ja

    2015-06-01

    Measurement uncertainty is a metrological concept to quantify the variability of measurement results. There are two approaches to estimate measurement uncertainty. In this study, we sought to provide practical and detailed examples of the two approaches and compare the bottom-up and top-down approaches to estimating measurement uncertainty. We estimated measurement uncertainty of the concentration of glucose according to CLSI EP29-A guideline. Two different approaches were used. First, we performed a bottom-up approach. We identified the sources of uncertainty and made an uncertainty budget and assessed the measurement functions. We determined the uncertainties of each element and combined them. Second, we performed a top-down approach using internal quality control (IQC) data for 6 months. Then, we estimated and corrected systematic bias using certified reference material of glucose (NIST SRM 965b). The expanded uncertainties at the low glucose concentration (5.57 mmol/L) by the bottom-up approach and top-down approaches were ±0.18 mmol/L and ±0.17 mmol/L, respectively (all k=2). Those at the high glucose concentration (12.77 mmol/L) by the bottom-up and top-down approaches were ±0.34 mmol/L and ±0.36 mmol/L, respectively (all k=2). We presented practical and detailed examples for estimating measurement uncertainty by the two approaches. The uncertainties by the bottom-up approach were quite similar to those by the top-down approach. Thus, we demonstrated that the two approaches were approximately equivalent and interchangeable and concluded that clinical laboratories could determine measurement uncertainty by the simpler top-down approach.
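
    The top-down route described above can be reduced to a short numerical sketch: the IQC standard deviation supplies the imprecision component, replicate measurements of the certified reference material supply the bias (which is corrected) and the uncertainty of that correction, and the two components are combined and expanded with k=2. All figures below are invented for illustration.

```python
# Hedged numerical sketch of a top-down uncertainty estimate: imprecision
# from internal QC data, bias (corrected) and its uncertainty from replicate
# measurements of a certified reference material, combined and expanded
# with a coverage factor k = 2.  All numbers are invented.
import numpy as np

# Six months of internal QC results at one glucose level (mmol/L).
iqc = np.array([5.49, 5.62, 5.55, 5.47, 5.66, 5.58, 5.52, 5.61,
                5.45, 5.59, 5.63, 5.50, 5.57, 5.54, 5.60, 5.48])
u_imprecision = iqc.std(ddof=1)             # within-laboratory imprecision

# Replicate measurements of the certified reference material.
crm_certified, u_crm = 5.57, 0.02           # certified value and its standard uncertainty
crm_measured = np.array([5.61, 5.63, 5.60, 5.64, 5.62])
bias = crm_measured.mean() - crm_certified  # estimated, then corrected for
u_bias = np.sqrt((crm_measured.std(ddof=1) / np.sqrt(crm_measured.size))**2
                 + u_crm**2)                # uncertainty of the bias correction

u_combined = np.sqrt(u_imprecision**2 + u_bias**2)
U_expanded = 2.0 * u_combined               # coverage factor k = 2
print(f"bias = {bias:+.3f} mmol/L (corrected)")
print(f"expanded uncertainty U = ±{U_expanded:.2f} mmol/L (k=2)")
```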

  2. Anatomic comparison of the endonasal and transpetrosal approaches for interpeduncular fossa access.

    Science.gov (United States)

    Oyama, Kenichi; Prevedello, Daniel M; Ditzel Filho, Leo F S; Muto, Jun; Gun, Ramazan; Kerr, Edward E; Otto, Bradley A; Carrau, Ricardo L

    2014-01-01

    The interpeduncular cistern, including the retrochiasmatic area, is one of the most challenging regions to approach surgically. Various conventional approaches to this region have been described; however, only the endoscopic endonasal approach via the dorsum sellae and the transpetrosal approach provide ideal exposure with a caudal-cranial view. The authors compared these 2 approaches to clarify their limitations and intrinsic advantages for access to the interpeduncular cistern. Four fresh cadaver heads were studied. An endoscopic endonasal approach via the dorsum sellae with pituitary transposition was performed to expose the interpeduncular cistern. A transpetrosal approach was performed bilaterally, combining a retrolabyrinthine presigmoid and a subtemporal transtentorium approach. Water balloons were used to simulate space-occupying lesions. "Water balloon tumors" (WBTs), inflated to 2 different volumes (0.5 and 1.0 ml), were placed in the interpeduncular cistern to compare visualization using the 2 approaches. The distances between cranial nerve (CN) III and the posterior communicating artery (PCoA) and between CN III and the edge of the tentorium were measured through a transpetrosal approach to determine the width of surgical corridors using 0- to 6-ml WBTs in the interpeduncular cistern (n = 8). Both approaches provided adequate exposure of the interpeduncular cistern. The endoscopic endonasal approach yielded a good visualization of both CN III and the PCoA when a WBT was in the interpeduncular cistern. Visualization of the contralateral anatomical structures was impaired in the transpetrosal approach. The surgical corridor to the interpeduncular cistern via the transpetrosal approach was narrow when the WBT volume was small, but its width increased as the WBT volume increased. There was a statistically significant increase in the maximum distance between CN III and the PCoA (p = 0.047) and between CN III and the tentorium (p = 0.029) when the WBT volume

  3. Comparisons of watershed sulfur budgets in southeast Canada and northeast US: New approaches and implications

    Science.gov (United States)

    Mitchell, M.J.; Lovett, G.; Bailey, S.; Beall, F.; Burns, D.; Buso, D.; Clair, T.A.; Courchesne, F.; Duchesne, L.; Eimers, C.; Fernandez, I.; Houle, D.; Jeffries, D.S.; Likens, G.E.; Moran, M.D.; Rogers, C.; Schwede, D.; Shanley, J.; Weathers, K.C.; Vet, R.

    2011-01-01

    concentrations and deposition predictions with the predictions of two continental-scale air quality models, the Community Multiscale Air Quality (CMAQ) model and A Unified Regional Air-quality Modeling System (AURAMS) that utilize complete inventories of emissions and chemical budgets. The results of this comparison indicated that the predictive relationship provides an accurate representation of SO2 concentrations and S deposition for the region that is generally consistent with these models, and thus provides confidence that our approach could be used to develop accurate watershed S budgets for these 15 sites. Most watersheds showed large net losses of SO42- on an annual basis, and the watershed mass balances were grouped into five categories based on the relative value of mean annual net losses or net gains. The net annual fluxes of SO42- showed a strong relationship with hydrology; the largest net annual negative fluxes were associated with years of greatest precipitation amount and highest discharge. The important role of catchment hydrology on S budgets suggests implications for future predicted climate change as it affects patterns of precipitation and drought. The sensitivity of S budgets is likely to be greatest in watersheds with the greatest wetland area, which are particularly sensitive to drying and wetting cycles. A small number of the watersheds in this analysis were shown to have substantial S sources from mineral weathering, but most showed evidence of an internal source of SO42-, which is likely from the mineralization of organic S stored from decades of increased S deposition. Mobilization of this internal S appears to contribute about 1-6 kg S ha-1 year-1 to stream fluxes at these sites and is affecting the rate and extent of recovery from acidification as S deposition rates have declined in recent years. This internal S source should be considered when developing critical deposition loads that will promote ecosystem recovery from acidification and the depl

  4. A general soft-enveloping strategy in the templating synthesis of mesoporous metal nanostructures.

    Science.gov (United States)

    Fang, Jixiang; Zhang, Lingling; Li, Jiang; Lu, Lu; Ma, Chuansheng; Cheng, Shaodong; Li, Zhiyuan; Xiong, Qihua; You, Hongjun

    2018-02-06

    Metal species have a relatively high mobility inside mesoporous silica; thus, it is difficult to introduce the metal precursors into silica mesopores and suppress the migration of metal species during a reduction process. Therefore, until now, the controlled growth of metal nanocrystals in a confined space, i.e., mesoporous channels, has been very challenging. Here, by using a soft-enveloping reaction at the interfaces of the solid, liquid, and solution phases, we successfully control the growth of metallic nanocrystals inside a mesoporous silica template. Diverse monodispersed nanostructures with well-defined sizes and shapes, including Ag nanowires, 3D mesoporous Au, AuAg alloys, Pt networks, and Au nanoparticle superlattices are successfully obtained. The 3D mesoporous AuAg networks exhibit enhanced catalytic activities in an electrochemical methanol oxidation reaction. The current soft-enveloping synthetic strategy offers a robust approach to synthesize diverse mesoporous metal nanostructures that can be utilized in catalysis, optics, and biomedicine applications.

  5. Trade off study on different envelope detectors for B-mode imaging

    DEFF Research Database (Denmark)

    Schlaikjer, Malene; Bagge, J. P.; Jensen, Jørgen Arendt

    2003-01-01

    Generation of B-mode images involves envelope detection of the RF-signals. Various detection algorithms are available. A trade off between performance, price, and complexity determines the choice of algorithm in an ultrasound system. A Hilbert Transform (HT) and a subsequent computation of the magnitude give the ideal envelope, but the approach (IDE) is expensive and complex. A rectifier (REC) is a simple, low-cost solution, but the performance is severely degraded (especially in dynamic imaging). This study has investigated the possibility of providing a detector with a complexity and cost close... sum of the real and imaginary signals. The four detectors were evaluated on in-vivo data acquired with a B-K Medical 2102 scanner interfaced to the sampling system RASMINE. Three data sets were acquired with three different center frequencies. Hundred images were acquired as the transducer was moved...
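
    The trade-off described above can be illustrated by contrasting the two end points, an analytic-signal (Hilbert) envelope and a rectify-and-low-pass detector, on a synthetic RF pulse. This is a hedged sketch: the pulse parameters and the filter cut-off are assumptions, and the study's in-vivo data and intermediate detectors are not reproduced.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

fs = 40e6                                                # sampling rate, Hz (assumed)
t = np.arange(0, 20e-6, 1 / fs)
pulse = np.exp(-((t - 10e-6) ** 2) / (2 * (2e-6) ** 2))  # Gaussian amplitude
rf = pulse * np.sin(2 * np.pi * 5e6 * t)                 # toy 5 MHz RF line

# Ideal detector (IDE): magnitude of the analytic signal via the Hilbert transform.
env_ide = np.abs(hilbert(rf))

# Rectifier detector (REC): full-wave rectification followed by low-pass filtering.
b, a = butter(4, 2e6 / (fs / 2))                         # 2 MHz cut-off (assumed)
env_rec = filtfilt(b, a, np.abs(rf))

# Crude view of the performance gap the study quantifies on in-vivo data.
print(np.max(np.abs(env_ide - env_rec)))
```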

  6. A novel murmur-based heart sound feature extraction technique using envelope-morphological analysis

    Science.gov (United States)

    Yao, Hao-Dong; Ma, Jia-Li; Fu, Bin-Bin; Wang, Hai-Yang; Dong, Ming-Chui

    2015-07-01

    Auscultation of heart sound (HS) signals has served as an important primary approach to diagnosing cardiovascular diseases (CVDs) for centuries. Confronting the intrinsic drawbacks of traditional HS auscultation, computer-aided automatic HS auscultation based on feature extraction techniques has witnessed explosive development. Yet, most existing HS feature extraction methods adopt acoustic or time-frequency features which exhibit a poor relationship with diagnostic information, thus restricting the performance of further interpretation and analysis. Tackling such a bottleneck problem, this paper proposes a novel murmur-based HS feature extraction method, since murmurs contain massive pathological information and are regarded as the first indications of pathological occurrences of heart valves. Adapting the discrete wavelet transform (DWT) and the Shannon envelope, the envelope-morphological characteristics of murmurs are obtained and three features are extracted accordingly. Validated by discriminating normal HS and 5 various abnormal HS signals with the extracted features, the proposed method provides an attractive candidate for automatic HS auscultation.
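
    A minimal sketch of a Shannon energy envelope, one ingredient of the envelope-morphological analysis mentioned above, is given below. The frame length and the smoothing choice are assumptions; the DWT sub-band selection and the three extracted features from the paper are not reproduced.

```python
import numpy as np

def shannon_energy_envelope(x, frame=20):
    """Shannon energy envelope of a (band-limited) heart-sound segment.
    Sketch only: the paper combines DWT sub-bands with this envelope."""
    xn = x / (np.max(np.abs(x)) + 1e-12)          # normalise to [-1, 1]
    se = -xn ** 2 * np.log(xn ** 2 + 1e-12)       # sample-wise Shannon energy
    kernel = np.ones(frame) / frame
    return np.convolve(se, kernel, mode="same")   # short moving average smooths the envelope
```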

  7. Application of Data Envelopment Analysis to Measure Cost, Revenue and Profit Efficiency

    Directory of Open Access Journals (Sweden)

    Kristína Kočišová

    2014-09-01

    Full Text Available The literature analysing the efficiency of financial institutions has developed rapidly over recent years. Most studies have focused on the input side, analysing input technical and cost efficiency. Only a few studies have examined the output side, evaluating output technical and revenue efficiency. Both sides are relevant when evaluating the efficiency of financial institutions. Therefore the primary purpose of this paper is to review a number of approaches for efficiency measurement. In particular, the concepts of cost, revenue and profit functions are discussed. We apply Data Envelopment Analysis (DEA) to a sample of Slovak and Czech commercial banks during the years 2009–2013, comparing the efficiencies obtained by either minimizing cost or maximizing revenue and profit. The results showed that the level of average revenue efficiency was the highest and the average profit efficiency the lowest. The Czech banks were more cost, revenue and profit efficient than the Slovak ones during the whole analysed period.
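
    For readers unfamiliar with DEA, the sketch below solves the standard input-oriented CCR envelopment problem with a generic LP solver. It computes technical efficiency only; the cost, revenue and profit efficiency models used in the paper additionally require price data and are not reproduced. The toy data and the function name are assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, o):
    """Input-oriented CCR efficiency score of DMU `o` (envelopment form).
    X: inputs (m x n), Y: outputs (s x n); columns are DMUs."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                          # minimise theta
    A_ub, b_ub = [], []
    for i in range(m):                                   # sum_j lam_j * x_ij <= theta * x_io
        A_ub.append(np.r_[-X[i, o], X[i, :]])
        b_ub.append(0.0)
    for r in range(s):                                   # sum_j lam_j * y_rj >= y_ro
        A_ub.append(np.r_[0.0, -Y[r, :]])
        b_ub.append(-Y[r, o])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

# Toy example: 2 inputs, 1 output, 3 decision-making units (e.g. banks).
X = np.array([[2.0, 4.0, 8.0],
              [3.0, 1.0, 3.0]])
Y = np.array([[1.0, 1.0, 1.0]])
print([round(dea_ccr_input(X, Y, j), 3) for j in range(3)])
```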

  8. Genetic algorithm for building envelope calibration

    International Nuclear Information System (INIS)

    Ramos Ruiz, Germán; Fernández Bandera, Carlos; Gómez-Acebo Temes, Tomás; Sánchez-Ostiz Gutierrez, Ana

    2016-01-01

    Highlights: • Calibration methodology using Multi-Objective Genetic Algorithm (NSGA-II). • Uncertainty analysis formulas implemented directly in EnergyPlus. • The methodology captures the heat dynamics of the building with a high level of accuracy. • Reduction in the number of parameters involved due to sensitivity analysis. • Cost-effective methodology using temperature sensors only. - Abstract: Buildings today represent 40% of world primary energy consumption and 24% of greenhouse gas emissions. In our society there is growing interest in knowing precisely when and how energy consumption occurs. This means that consumption measurement and verification plans are well-advanced. International agencies such as the Efficiency Valuation Organization (EVO) and the International Performance Measurement and Verification Protocol (IPMVP) have developed methodologies to quantify savings. This paper presents a methodology to accurately perform automated envelope calibration under option D (calibrated simulation) of IPMVP – vol. 1. This option is frequently ignored because of its complexity, despite being more flexible and accurate in assessing the energy performance of a building. A detailed baseline energy model is used, and by means of a metaheuristic technique a highly reliable and accurate Building Energy Simulation (BES) model suitable for detailed analysis of saving strategies is achieved. In order to find this BES model a Genetic Algorithm (NSGA-II) is used, together with a highly efficient engine to simulate the objective, thus permitting rapid achievement of the goal. The result is a BES model that broadly captures the heat dynamic behaviour of the building. The model amply fulfils the parameters demanded by ASHRAE and EVO under option D.
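
    Calibrated-simulation work of this kind is usually judged against the NMBE and CV(RMSE) criteria of ASHRAE Guideline 14. The sketch below computes those two indices; the degrees-of-freedom adjustment p is a convention that varies between references, and the code is not taken from the paper.

```python
import numpy as np

def calibration_indices(measured, simulated, p=1):
    """NMBE and CV(RMSE), in percent, between measured and simulated series.
    p is the number of adjusted parameters (commonly taken as 1; conventions vary)."""
    m = np.asarray(measured, dtype=float)
    s = np.asarray(simulated, dtype=float)
    n = m.size
    nmbe = np.sum(m - s) / ((n - p) * m.mean()) * 100.0
    cvrmse = np.sqrt(np.sum((m - s) ** 2) / (n - p)) / m.mean() * 100.0
    return nmbe, cvrmse
```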

  9. Breast and prostate cancer productivity costs: a comparison of the human capital approach and the friction cost approach.

    Science.gov (United States)

    Hanly, Paul; Timmons, Aileen; Walsh, Paul M; Sharp, Linda

    2012-05-01

    Productivity costs constitute a substantial proportion of the total societal costs associated with cancer. We compared the results of applying two different analytical methods--the traditional human capital approach (HCA) and the emerging friction cost approach (FCA)--to estimate breast and prostate cancer productivity costs in Ireland in 2008. Data from a survey of breast and prostate cancer patients were combined with population-level survival estimates and a national wage data set to calculate costs of temporary disability (cancer-related work absence), permanent disability (workforce departure, reduced working hours), and premature mortality. For breast cancer, productivity costs per person using the HCA were € 193,425 and those per person using the FCA were € 8,103; for prostate cancer, the comparable estimates were € 109,154 and € 8,205, respectively. The HCA generated higher costs for younger patients (breast cancer) because of greater lifetime earning potential. In contrast, the FCA resulted in higher productivity costs for older male patients (prostate cancer) commensurate with higher earning capacity over a shorter time period. Reduced working hours postcancer was a key driver of total HCA productivity costs. HCA costs were sensitive to assumptions about discount and growth rates. FCA costs were sensitive to assumptions about the friction period. The magnitude of the estimates obtained in this study illustrates the importance of including productivity costs when considering the economic impact of illness. Vastly different results emerge from the application of the HCA and the FCA, and this finding emphasizes the importance of choosing the study perspective carefully and being explicit about assumptions that underpin the methods. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
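
    The contrast between the two costing methods can be sketched as follows. The 90-day friction period, the 0.8 elasticity and the earnings figures are illustrative conventions from the broader HCA/FCA literature, not the Irish data or assumptions used in this study.

```python
def hca_productivity_cost(annual_earnings, years_lost, discount=0.03, growth=0.02):
    """Human capital approach: present value of all earnings lost until expected
    retirement (illustrative; the study also values reduced working hours)."""
    return sum(annual_earnings * ((1 + growth) / (1 + discount)) ** t
               for t in range(int(years_lost)))

def fca_productivity_cost(annual_earnings, friction_days=90, elasticity=0.8):
    """Friction cost approach: production is only lost until the worker is
    replaced (friction period); the elasticity discounts for internal slack."""
    return annual_earnings * (friction_days / 365.0) * elasticity

# Illustrative figures only:
print(round(hca_productivity_cost(35000, years_lost=20)))
print(round(fca_productivity_cost(35000)))
```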

  10. Multi-country comparisons of energy performance: The index decomposition analysis approach

    International Nuclear Information System (INIS)

    Ang, B.W.; Xu, X.Y.; Su, Bin

    2015-01-01

    Index decomposition analysis (IDA) is a popular tool for studying changes in energy consumption over time in a country or region. This specific application of IDA, which may be called temporal decomposition analysis, has been extended by researchers and analysts to study variations in energy consumption or energy efficiency between countries or regions, i.e. spatial decomposition analysis. In spatial decomposition analysis, the main objective is often to understand the relative contributions of overall activity level, activity structure, and energy intensity in explaining differences in total energy consumption between two countries or regions. We review the literature of spatial decomposition analysis, investigate the methodological issues, and propose a spatial decomposition analysis framework for multi-region comparisons. A key feature of the proposed framework is that it passes the circularity test and provides consistent results for multi-region comparisons. A case study in which 30 regions in China are compared and ranked based on their performance in energy consumption is presented. - Highlights: • We conducted cross-regional comparisons of energy consumption using IDA. • We proposed two criteria for IDA method selection in spatial decomposition analysis. • We proposed a new model for regional comparison that passes the circularity test. • Features of the new model are illustrated using the data of 30 regions in China
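
    A common building block for such comparisons is the additive LMDI-I decomposition. The bilateral sketch below splits the difference in energy use between two regions into activity, structure and intensity effects; it is a generic form, not the authors' multi-region model that passes the circularity test.

```python
import math

def _L(a, b):
    """Logarithmic mean, the weight used in LMDI."""
    return a if a == b else (a - b) / (math.log(a) - math.log(b))

def lmdi_bilateral(Q_a, S_a, I_a, Q_b, S_b, I_b):
    """Additive LMDI-I decomposition of E_a - E_b into activity, structure and
    intensity effects, where E = Q * sum_i(S_i * I_i) in each region."""
    act = struc = inten = 0.0
    for s_a, i_a, s_b, i_b in zip(S_a, I_a, S_b, I_b):
        e_a, e_b = Q_a * s_a * i_a, Q_b * s_b * i_b
        w = _L(e_a, e_b)
        act += w * math.log(Q_a / Q_b)
        struc += w * math.log(s_a / s_b)
        inten += w * math.log(i_a / i_b)
    return act, struc, inten

# Two sectors; the three effects add up exactly to E_a - E_b.
effects = lmdi_bilateral(120, [0.6, 0.4], [2.0, 5.0], 100, [0.7, 0.3], [2.2, 4.5])
print(effects, sum(effects))
```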

  11. Are Noncovalent Interactions an Achilles Heel in Chemistry Education? A Comparison of Instructional Approaches

    Science.gov (United States)

    Williams, Leah C.; Underwood, Sonia M.; Klymkowsky, Michael W.; Cooper, Melanie M.

    2015-01-01

    Intermolecular forces (IMFs), or more broadly, noncovalent interactions either within or between molecules, are central to an understanding of a wide range of chemical and biological phenomena. In this study, we present a multiyear, multi-institutional, longitudinal comparison of how students enrolled in traditional general chemistry courses and…

  12. Multi-dimensional flood vulnerability assessment using data envelopment analysis

    Science.gov (United States)

    Zahid, Zalina; Saharizan, Nurul Syuhada; Hamzah, Paezah; Hussin, Siti Aida Sheikh; Khairi, Siti Shaliza Mohd

    2017-11-01

    Malaysia has been greatly impacted by floods during monsoon seasons. Even though flood-prone areas are well identified, assessment of vulnerability to the disaster is lacking. Assessment of flood vulnerability, defined as the potential for loss when a disaster occurs, is addressed in this paper. The focus is on the development of a flood vulnerability measurement for 11 states in Peninsular Malaysia using the non-parametric approach of Data Envelopment Analysis. Scores for three dimensions of flood vulnerability (Population Vulnerability, Social Vulnerability and Biophysical Vulnerability) were calculated using secondary data of selected input and output variables across an 11-year period from 2004 to 2014. The results showed that Johor and Pahang were the most vulnerable to flood in terms of Population Vulnerability, followed by Kelantan, the most vulnerable in terms of Social Vulnerability, while Kedah, Pahang and Terengganu were the most vulnerable in terms of Biophysical Vulnerability among the eleven states. The results also showed the states of Johor, Pahang and Kelantan to be the most vulnerable across the three dimensions. Flood vulnerability assessment is important as it provides invaluable information that will allow the authorities to identify and develop plans for flood mitigation and to reduce flood vulnerability in the affected regions.

  13. Flight envelope protection system for unmanned aerial vehicles

    KAUST Repository

    Claudel, Christian G.; Shaqura, Mohammad

    2016-01-01

    Systems and methods to protect the flight envelope in both manual flight and flight by a commercial autopilot are provided. A system can comprise: an inertial measurement unit (IMU); a computing device in data communication with the IMU

  14. Envelope Protection for In-Flight Ice Contamination

    Science.gov (United States)

    Gingras, David R.; Barnhart, Billy P.; Ranaudo, Richard J.; Ratvasky, Thomas P.; Morelli, Eugene A.

    2010-01-01

    Fatal loss-of-control (LOC) accidents have been directly related to in-flight airframe icing. The prototype system presented in this paper directly addresses the need for real-time onboard envelope protection in icing conditions. The combination of a priori information and real-time aerodynamic estimation is shown to provide sufficient input for determining safe limits of the flight envelope during in-flight icing encounters. The Icing Contamination Envelope Protection (ICEPro) system has been designed and implemented to identify degradations in airplane performance and flying qualities resulting from ice contamination and to provide safe flight-envelope cues to the pilot. Components of ICEPro are described and results from preliminary tests are presented.

  15. that Bind Specifically to Recombinant Envelope Protein of Dengue

    African Journals Online (AJOL)

    Tropical Journal of Pharmaceutical Research June 2015; 14 (6): 997-1003 ... Revised accepted: 30 April 2015. Abstract ... Results: The 45 KDa, 43 KDa and 30 KDa plasma membrane proteins were identified as viral envelope targets.

  16. Early Site Permit Demonstration Program: Plant parameters envelope report

    International Nuclear Information System (INIS)

    1993-03-01

    The Early Site Permit (ESP) Demonstration Program is the nuclear industry's initiative for piloting the early resolution of siting-related issues before the detailed design proceedings of the combined operating license review. The ESP Demonstration Program consists of three phases. The plant parameters envelopes task is part of Phase 1, which addresses the generic review of applicable federal regulations and develops criteria for safety and environmental assessment of potential sites. The plant parameters envelopes identify parameters that characterize the interface between an ALWR design and a potential site, and quantify the interface through values selected from the Utility Requirements Documents, vendor design information, or engineering assessments. When augmented with site-specific information, the plant parameters envelopes provide sufficient information to allow ESPs to be granted based on individual ALWR design information or enveloping design information for the evolutionary, passive, or generic ALWR plants. This document is expected to become a living document when used by future applicants

  17. Transport of Ions Across the Inner Envelope Membrane of Chloroplasts

    International Nuclear Information System (INIS)

    McCarty, R. E.

    2004-01-01

    The technical report outlines the results of nine years of research on how ions cross the inner envelope membrane of chloroplasts. The ions include protons, nitrite, calcium and ferrous iron. Bicarbonate transport was also studied

  18. Intelligent building envelopes. Architectural concept and applications for daylighting quality

    Energy Technology Data Exchange (ETDEWEB)

    Wyckmans, Annemie

    2005-11-15

    How does an intelligent building envelope manage the variable and sometimes conflicting occupant requirements that arise in a daylit indoor environment? This is the research question that provides the basis for this Ph.D. work. As it touches upon several fields of application, the research question is untangled into four steps, each of which corresponds to a chapter of the thesis. 1) What characterises intelligent behaviour for a building envelope? 2) What characterises indoor daylighting quality? 3) Which functions can an intelligent building envelope be expected to perform in the context of daylighting quality? 4) How are the materials, components and composition of an intelligent building envelope designed to influence this performance? The emphasis is on design, environmental aspects, energy conservation, functional analysis and physical applications.

  19. Re-configurable digital receiver for optically envelope detected half cycle BPSK and MSK radio-on-fiber signals

    DEFF Research Database (Denmark)

    Guerrero Gonzalez, Neil; Prince, Kamau; Zibar, Darko

    2011-01-01

    We present the first known integration of a digital receiver into optically envelope detected radio-on-fiber systems. We also present a re-configurable scheme for two different types of optically envelope detected wireless signals while keeping the complexity of the optical components used low. Our novel digital receiver consists of a digital signal processing unit integrating functions such as filtering, peak-power detection, symbol synchronization and signal demodulation for optically envelope detected half-cycle binary phase-shift-keying and minimum-shift-keying signals. Furthermore, radio-frequency signal down-conversion is not required in our proposed approach, simplifying the optical receiver front-end even more. We experimentally demonstrate error-free optical transmission (bit-error rate of 10−3, corresponding to FEC-compatible levels) for both 416.6 Mbit/s half-cycle binary phase...

  20. Three-Dimensional Reconstruction of Nuclear Envelope Architecture Using Dual-Color Metal-Induced Energy Transfer Imaging.

    Science.gov (United States)

    Chizhik, Anna M; Ruhlandt, Daja; Pfaff, Janine; Karedla, Narain; Chizhik, Alexey I; Gregor, Ingo; Kehlenbach, Ralph H; Enderlein, Jörg

    2017-12-26

    The nuclear envelope, comprising the inner and the outer nuclear membrane, separates the nucleus from the cytoplasm and plays a key role in cellular functions. Nuclear pore complexes (NPCs), which are embedded in the nuclear envelope, control transport of macromolecules between the two compartments. Here, using dual-color metal-induced energy transfer (MIET), we determine the axial distance between Lap2β and Nup358 as markers for the inner nuclear membrane and the cytoplasmic side of the NPC, respectively. Using MIET imaging, we reconstruct the 3D profile of the nuclear envelope over the whole basal area, with an axial resolution of a few nanometers. This result demonstrates that optical microscopy can achieve nanometer axial resolution in biological samples and without recourse to complex interferometric approaches.

  1. Torsin Mediates Primary Envelopment of Large Ribonucleoprotein Granules at the Nuclear Envelope

    Directory of Open Access Journals (Sweden)

    Vahbiz Jokhi

    2013-04-01

    Full Text Available A previously unrecognized mechanism through which large ribonucleoprotein (megaRNP) granules exit the nucleus is by budding through the nuclear envelope (NE). This mechanism is akin to the nuclear egress of herpes-type viruses and is essential for proper synapse development. However, the molecular machinery required to remodel the NE during this process is unknown. Here, we identify Torsin, an AAA-ATPase that in humans is linked to dystonia, as a major mediator of primary megaRNP envelopment during NE budding. In torsin mutants, megaRNPs accumulate within the perinuclear space, and the messenger RNAs contained within fail to reach synaptic sites, preventing normal synaptic protein synthesis and thus proper synaptic bouton development. These studies begin to establish the cellular machinery underlying the exit of megaRNPs via budding, offer an explanation for the “nuclear blebbing” phenotype found in dystonia models, and provide an important link between Torsin and the synaptic phenotypes observed in dystonia.

  2. An experimental detrending approach to attributing change of pan evaporation in comparison with the traditional partial differential method

    Science.gov (United States)

    Wang, Tingting; Sun, Fubao; Xia, Jun; Liu, Wenbin; Sang, Yanfang

    2017-04-01

    In predicting how droughts and hydrological cycles will change in a warming climate, the change in atmospheric evaporative demand, measured by pan evaporation (Epan), is one crucial element to understand. Over the last decade, the partial differential (PD) form of the PenPan equation has been the prevailing approach to attributing changes in Epan worldwide. However, the independence among climatic variables required by the PD approach cannot be met with long-term observations. Here we designed a series of numerical experiments to attribute changes in Epan over China by detrending each climatic variable (an experimental detrending approach) to address the inter-correlation among climate variables, and compared the results with the traditional PD method. The results show that the detrending approach is superior to the PD method both for a complex system with multiple variables and a mixed algorithm, such as the aerodynamic component (Ep,A) and Epan itself, and for a simple case such as the radiative component (Ep,R). The major reason is the strong and significant inter-correlation of the input meteorological forcing. Very similar attribution results were achieved with the detrending approach and the PD method once the inter-correlation of the inputs was eliminated through randomization. The contributions of Rh and Ta to net radiation, and thus to Ep,R, which are overlooked by the PD method but successfully detected by the detrending approach, provide some explanation for this comparison. We adopted the control run from the detrending approach to adjust the PD method, which markedly improved its attribution of changes in Epan. Hence, the detrending approach and the adjusted PD method are recommended for attributing changes in hydrological models to better understand and predict the water and energy cycle.

  3. Comparison of complications in transtrochanteric and anterolateral approaches in primary total hip arthroplasty.

    LENUS (Irish Health Repository)

    Cashman, James P

    2008-11-01

    Three surgical approaches to primary total hip arthroplasty (THA) have been in use since Charnley popularized the transtrochanteric approach. This study was designed to examine the difference in morbidity between the transtrochanteric approach and the anterolateral approach in primary THA. Information on 891 patients who underwent primary THA performed by a single surgeon was collected prospectively between 1998 and 2003 using a modified SF-36 form, preoperatively, intraoperatively, and at 3 months postoperatively. The transtrochanteric group had higher morbidity and more patients who were dissatisfied with their THA. There was a greater range of motion in the anterolateral group.

  4. Comparison of three different surgical approaches for treatment of thoracolumbar burst fracture

    Directory of Open Access Journals (Sweden)

    WU Han

    2013-02-01

    Full Text Available Objective: The main treatment method used for thoracolumbar fractures is open reduction and internal fixation. Commonly there are three surgical approaches: anterior, posterior and paraspinal. We attempted to compare the three approaches based on an analysis of our clinical data. Methods: A group of 94 patients with Denis type A or B thoracolumbar burst fracture between March 2008 and September 2010 were recruited in this study. These patients were treated by anterior-, posterior- or paraspinal-approach reduction with or without decompression. The fracture was fixed with titanium mesh and Z-plate via the anterior approach (24 patients), or with a screw and rod system via the posterior approach (38 patients) or the paraspinal approach (32 patients). Clinical evaluations included operation duration, blood loss, incision length, and preoperative and postoperative Oswestry disability index (ODI). Results: The average operation duration (94.1 min ± 13.7 min), blood loss (86.7 ml ± 20.0 ml), length of incision (9.3 mm ± 0.7 mm) and postoperative ODI (6 ± 0.5) were significantly lower (P < 0.05) in the paraspinal approach group than in the traditional posterior approach group (operation duration 94.1 min ± 13.7 min, blood loss 143.3 ml ± 28.3 ml, length of incision 15.4 cm ± 2.1 cm and ODI 12 ± 0.7) and the anterior approach group (operation duration 176.3 min ± 20.7 min, blood loss 255.1 ml ± 38.4 ml, length of incision 18.6 cm ± 2.4 cm and ODI 13 ± 2.4). There was no statistically significant difference in Cobb angle on radiographs among the three approaches. Conclusion: The anterior approach surgery is convenient for resection of the vertebrae and reconstruction of vertebral height, but it is more complicated and traumatic; hence it is mostly used for severe Denis type B fractures. The posterior approach is commonly applied to most thoracolumbar fractures and has fewer complications compared with the anterior approach, but it has some shortcomings as well. The paraspinal approach has great advantages

  5. Aspherical Dust Envelopes Around Oxygen-Rich AGB Stars

    Directory of Open Access Journals (Sweden)

    Kyung-Won Suh

    2006-12-01

    Full Text Available We model the aspherical dust envelopes around O-rich AGB stars. We perform the radiative transfer model calculations for axisymmetric dust distributions. We simulate what could be observed from the aspherical dust envelopes around O-rich AGB stars by presenting the model spectral energy distributions and images at various wavelengths for different optical depths and viewing angles. The model results are very different from the ones with spherically symmetric geometry.

  6. Preserving Envelope Efficiency in Performance Based Code Compliance

    Energy Technology Data Exchange (ETDEWEB)

    Thornton, Brian A. [Thornton Energy Consulting (United States); Sullivan, Greg P. [Efficiency Solutions (United States); Rosenberg, Michael I. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Baechler, Michael C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-06-20

    The City of Seattle 2012 Energy Code (Seattle 2014), one of the most progressive in the country, is under revision for its 2015 edition. Additionally, city personnel participate in the development of the next generation of the Washington State Energy Code and the International Energy Code. Seattle has pledged carbon neutrality by 2050 including buildings, transportation and other sectors. The United States Department of Energy (DOE), through Pacific Northwest National Laboratory (PNNL) provided technical assistance to Seattle in order to understand the implications of one potential direction for its code development, limiting trade-offs of long-lived building envelope components less stringent than the prescriptive code envelope requirements by using better-than-code but shorter-lived lighting and heating, ventilation, and air-conditioning (HVAC) components through the total building performance modeled energy compliance path. Weaker building envelopes can permanently limit building energy performance even as lighting and HVAC components are upgraded over time, because retrofitting the envelope is less likely and more expensive. Weaker building envelopes may also increase the required size, cost and complexity of HVAC systems and may adversely affect occupant comfort. This report presents the results of this technical assistance. The use of modeled energy code compliance to trade-off envelope components with shorter-lived building components is not unique to Seattle and the lessons and possible solutions described in this report have implications for other jurisdictions and energy codes.

  7. Rainfall Prediction of Indian Peninsula: Comparison of Time Series Based Approach and Predictor Based Approach using Machine Learning Techniques

    Science.gov (United States)

    Dash, Y.; Mishra, S. K.; Panigrahi, B. K.

    2017-12-01

    Prediction of northeast/post-monsoon rainfall, which occurs during October, November and December (OND) over the Indian peninsula, is a challenging task due to the dynamic nature of the uncertain, chaotic climate. It is imperative to elucidate this issue by examining the performance of different machine learning (ML) approaches. The prime objective of this research is to compare a) statistical prediction using historical rainfall observations and global atmosphere-ocean predictors like Sea Surface Temperature (SST) and Sea Level Pressure (SLP) with b) empirical prediction based on a time series analysis of past rainfall data without using any other predictors. Initially, ML techniques were applied to SST and SLP data (1948-2014) obtained from the NCEP/NCAR reanalysis monthly means provided by the NOAA ESRL PSD. Later, this study investigated the applicability of ML methods using the OND rainfall time series for 1948-2014 and forecasted up to 2018. The predicted values of the aforementioned methods were verified using observed time series data collected from the Indian Institute of Tropical Meteorology, and the results revealed good performance of the ML algorithms with minimal error scores. Thus, it is found that both statistical and empirical methods are useful for long-range climatic projections.

  8. The Comparison of a Thematic versus Regional Approach to Teaching a World Geography Course

    Science.gov (United States)

    Korson, Cadey; Kusek, Weronika

    2016-01-01

    The benefits of a regional or thematic approach to the study and presentation of world geography have long been debated. The goal here is not to reimagine these debates or to promote one approach over another; the aim is to explore how world geography courses are currently being taught in American universities. By polling and sharing information about…

  9. Technical Note: A comparison of two empirical approaches to estimate in-stream net nutrient uptake

    Science.gov (United States)

    von Schiller, D.; Bernal, S.; Martí, E.

    2011-04-01

    To establish the relevance of in-stream processes on nutrient export at catchment scale it is important to accurately estimate whole-reach net nutrient uptake rates that consider both uptake and release processes. Two empirical approaches have been used in the literature to estimate these rates: (a) the mass balance approach, which considers changes in ambient nutrient loads corrected by groundwater inputs between two stream locations separated by a certain distance, and (b) the spiralling approach, which is based on the patterns of longitudinal variation in ambient nutrient concentrations along a reach following the nutrient spiralling concept. In this study, we compared the estimates of in-stream net nutrient uptake rates of nitrate (NO3) and ammonium (NH4) and the associated uncertainty obtained with these two approaches at different ambient conditions using a data set of monthly samplings in two contrasting stream reaches during two hydrological years. Overall, the rates calculated with the mass balance approach tended to be higher than those calculated with the spiralling approach only at high ambient nitrogen (N) concentrations. Uncertainty associated with these estimates also differed between both approaches, especially for NH4 due to the general lack of significant longitudinal patterns in concentration. The advantages and disadvantages of each of the approaches are discussed.

  10. Multidimensional Poverty Indices and First Order Dominance Techniques: An Empirical Comparison of Different Approaches

    DEFF Research Database (Denmark)

    Hussain, M. Azhar; Permanyer, Iñaki

    2018-01-01

    techniques (FOD). Our empirical findings suggest that the FOD approach might be a reasonable cost-effective alternative to the United Nations Development Program (UNDP)’s flagship poverty indicator: the Multidimensional Poverty Index (MPI). To the extent that the FOD approach is able to uncover the socio...

  11. Practical Skills Training in Agricultural Education--A Comparison between Traditional and Blended Approaches

    Science.gov (United States)

    Deegan, Donna; Wims, Padraig; Pettit, Tony

    2016-01-01

    Purpose: In this article the use of blended learning multimedia materials as an education tool was compared with the traditional approach for skills training. Design/Methodology/Approach: This study was conducted in Ireland using a pre-test, post-test experimental design. All students were instructed on how to complete two skills using either a…

  12. Comparison of a rational vs. high throughput approach for rapid salt screening and selection.

    Science.gov (United States)

    Collman, Benjamin M; Miller, Jonathan M; Seadeek, Christopher; Stambek, Julie A; Blackburn, Anthony C

    2013-01-01

    In recent years, high throughput (HT) screening has become the most widely used approach for early phase salt screening and selection in a drug discovery/development setting. The purpose of this study was to compare a rational approach for salt screening and selection to those results previously generated using a HT approach. The rational approach involved a much smaller number of initial trials (one salt synthesis attempt per counterion) that were selected based on a few strategic solubility determinations of the free form combined with a theoretical analysis of the ideal solvent solubility conditions for salt formation. Salt screening results for sertraline, tamoxifen, and trazodone using the rational approach were compared to those previously generated by HT screening. The rational approach produced similar results to HT screening, including identification of the commercially chosen salt forms, but with a fraction of the crystallization attempts. Moreover, the rational approach provided enough solid from the very initial crystallization of a salt for more thorough and reliable solid-state characterization and thus rapid decision-making. The crystallization techniques used in the rational approach mimic larger-scale process crystallization, allowing smoother technical transfer of the selected salt to the process chemist.

  13. Comparison of two approaches to the surgical management of cochlear implantation

    NARCIS (Netherlands)

    Postelmans, Job T. F.; Grolman, Wilko; Tange, Rinze A.; Stokroos, Robert J.

    2009-01-01

    OBJECTIVES/HYPOTHESIS: Our study was designed to compare two surgical approaches that are currently employed in cochlear implantation. METHODS: There were 315 patients who were divided into two groups according to the surgical technique used for implantation. The suprameatal approach (SMA) was

  14. Comparison of qualitative and quantitative approach to prostate MR spectroscopy in peripheral zone cancer detection

    International Nuclear Information System (INIS)

    Klijn, Stijn; De Visschere, Pieter J.; De Meerleer, Gert O.; Villeirs, Geert M.

    2012-01-01

    Objective: To compare the diagnostic performance of a qualitative (pattern recognition) and a quantitative (numerical assessment) approach to magnetic resonance spectroscopy (MRS) in the diagnosis of peripheral zone prostate cancer. Methods: 185 patients (131 with histopathologically proven cancer, 54 normal/benign after at least 12 months follow-up) were prospectively evaluated with qualitative MRS using a 4-point scale between 3/2004 and 1/2008, and retrospectively reassessed using a prototype quantitative postprocessing software in April 2008. Based on pathology and follow-up data, diagnostic performance parameters were calculated. Results: The qualitative and quantitative approaches were concordant in 78.9% (146/185) of cases. The difference between the areas under the ROC curve (0.791 versus 0.772, respectively) was not statistically significant. The sensitivity, specificity and accuracy were 55.7%, 94.4% and 67.0% for the qualitative approach, and 55.0%, 83.3% and 63.2% for the quantitative approach. The sensitivity for high grade tumours (Gleason 4 + 3 or higher) was 85.2% (23/27) for both approaches. All cancers missed on either one approach separately (31/31) and 91% of cancers missed on both approaches together (23/27) were of lower grade (Gleason 3 + 4 or lower). Conclusions: Qualitative and quantitative approaches to MRS yield similar diagnostic results. Discordances in tumour detection only occurred in lower grade cancers.

  15. [Glucose-6-phosphatase from nuclear envelope in rat liver].

    Science.gov (United States)

    González-Mujica, Freddy

    2008-06-01

    Nuclear envelope (NE) and microsomal glucose-6-phosphatase (G-6-Pase) activities were compared. Intact microsomes were unable to hydrolyze mannose-6-phosphate (M-6-P); on the other hand, intact NE hydrolyzes this substrate. Galactose-6-phosphate proved to be a good substrate for both the NE and microsomal enzymes, with a latency similar to that obtained with M-6-P using microsomes. In consequence, this substrate was used to measure NE integrity. The kinetic parameters (Kii and Kis) of the intact NE G-6-Pase for phlorizin inhibition, using glucose-6-phosphate (G-6-P) and M-6-P as substrates, were very similar. The NE T1 transporter was more sensitive to amiloride than the microsomal T1. The microsomal system was more sensitive to N-ethylmaleimide (NEM) than the NE, and the latter was insensitive to the anion transport inhibitors DIDS and SITS, which strongly affect the microsomal enzyme. The above results allow us to postulate the presence of a hexose-6-phosphate transporter in the NE which is able to carry G-6-P and M-6-P, and perhaps other hexose-6-phosphates, and which could be different from that present in microsomes or, if it is the same, whose activity could be modified by the membrane system in which it is included. The higher PPi hydrolysis activity of the intact NE G-6-Pase in comparison to that of intact microsomes suggests differences between the Pi/PPi transport (T2) of the two systems. The lower sensitivity of the NE G-6-Pase to NEM suggests that the catalytic subunit of this system has some differences from the microsomal isoform.

  16. Three approaches to deal with inconsistent decision tables - Comparison of decision tree complexity

    KAUST Repository

    Azad, Mohammad; Chikalov, Igor; Moshkov, Mikhail

    2013-01-01

    In inconsistent decision tables, there are groups of rows with equal values of conditional attributes and different decisions (values of the decision attribute). We study three approaches to deal with such tables. Instead of a group of equal rows, we consider one row given by values of conditional attributes and we attach to this row: (i) the set of all decisions for rows from the group (many-valued decision approach); (ii) the most common decision for rows from the group (most common decision approach); and (iii) the unique code of the set of all decisions for rows from the group (generalized decision approach). We present experimental results and compare the depth, average depth and number of nodes of decision trees constructed by a greedy algorithm in the framework of each of the three approaches. © 2013 Springer-Verlag.
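
    The three transformations of an inconsistent group of rows can be illustrated on a toy table as follows; this is only a sketch of the data-preparation step, not the greedy decision-tree construction studied in the paper.

```python
from collections import Counter

# Rows: (conditional attribute values, decision). The first two rows conflict.
rows = [((1, 0), "a"), ((1, 0), "b"), ((0, 1), "a")]

def group(rows):
    g = {}
    for cond, dec in rows:
        g.setdefault(cond, []).append(dec)
    return g

def many_valued(rows):      # (i) keep the whole set of decisions
    return {c: set(d) for c, d in group(rows).items()}

def most_common(rows):      # (ii) keep the most frequent decision
    return {c: Counter(d).most_common(1)[0][0] for c, d in group(rows).items()}

def generalized(rows):      # (iii) encode each distinct decision set by a unique code
    codes, out = {}, {}
    for c, d in group(rows).items():
        out[c] = codes.setdefault(frozenset(d), len(codes))
    return out

print(many_valued(rows), most_common(rows), generalized(rows))
```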

  17. Comparison of candidate solar array maximum power utilization approaches. [for spacecraft propulsion

    Science.gov (United States)

    Costogue, E. N.; Lindena, S.

    1976-01-01

    A study was made of five potential approaches that can be utilized to detect the maximum power point of a solar array while sustaining operations at or near maximum power and without endangering stability or causing array voltage collapse. The approaches studied included: (1) dynamic impedance comparator, (2) reference array measurement, (3) onset of solar array voltage collapse detection, (4) parallel tracker, and (5) direct measurement. The study analyzed the feasibility and adaptability of these approaches to a future solar electric propulsion (SEP) mission, and, specifically, to a comet rendezvous mission. Such missions presented the most challenging requirements to a spacecraft power subsystem in terms of power management over large solar intensity ranges of 1.0 to 3.5 AU. The dynamic impedance approach was found to have the highest figure of merit, and the reference array approach followed closely behind. The results are applicable to terrestrial solar power systems as well as to other than SEP space missions.

  18. Comparison of methodological approaches to identify economic activity regularities in transition economy

    Directory of Open Access Journals (Sweden)

    Jitka Poměnková

    2011-01-01

    Full Text Available This paper considers and evaluates methodological approaches to analysing the cyclical structure of economic activity in a transition economy. As a starting point, time-domain analysis is applied, followed by a frequency-domain approach. Both approaches are viewed from a methodological as well as an application point of view, and their advantages and disadvantages are discussed. A time-frequency domain approach is then added and applied to real data. On the basis of the results obtained, a recommendation is formulated. All the methodological approaches discussed are also considered from the perspective of their capability to evaluate the behaviour of the business cycle at the time of the global economic crisis before/after the year 2008. The empirical part of the paper deals with gross domestic product data for the Czech Republic from 1996/Q1 to 2010/Q2.

  19. MR imaging of soft tissue alterations after total hip arthroplasty: comparison of classic surgical approaches

    Energy Technology Data Exchange (ETDEWEB)

    Agten, Christoph A.; Sutter, Reto; Pfirrmann, Christian W.A. [Balgrist University Hospital, Radiology, Zurich (Switzerland); University of Zurich, Faculty of Medicine, Zurich (Switzerland); Dora, Claudio [Balgrist University Hospital, Orthopedic Surgery, Zurich (Switzerland); University of Zurich, Faculty of Medicine, Zurich (Switzerland)

    2017-03-15

    To compare soft-tissue changes after total hip arthroplasty with posterior, direct-lateral, anterolateral, or anterior surgical approaches. MRI of 120 patients after primary total hip arthroplasty (30 per approach) were included. Each MRI was assessed by two readers regarding identification of surgical access, fatty muscle atrophy (Goutallier classification), tendon quality (0 = normal, 1 = tendinopathy, 2 = partial tear, 3 = avulsion), and fluid collections. Readers were blinded to the surgical approach. Surgical access was correctly identified in all cases. The direct lateral approach showed highest Goutallier grades and tendon damage for gluteus minimus muscle (2.07-2.67 and 2.00-2.77; p = 0.017 and p = 0.001 for readers 1 and 2, respectively) and tendon (2.30/1.67; p < 0.0005 for reader 1/2), and the lateral portion of the gluteus medius tendon (2.77/2.20; p < 0.0005 for reader 1/2). The posterior approach showed highest Goutallier grades and tendon damage for external rotator muscles (1.97-2.67 and 1.57-2.40; p < 0.0005-0.006 for reader 1/2) and tendons (1.41-2.45 and 1.93-2.76; p < 0.0005 for reader 1/2). The anterolateral and anterior approach showed less soft tissue damage. Fluid collections showed no differences between the approaches. MRI is well suited to identify surgical approaches after THA. The anterior and anterolateral approach showed less soft tissue damage compared to the posterior and direct lateral approach. (orig.)

  20. MR imaging of soft tissue alterations after total hip arthroplasty: comparison of classic surgical approaches

    International Nuclear Information System (INIS)

    Agten, Christoph A.; Sutter, Reto; Pfirrmann, Christian W.A.; Dora, Claudio

    2017-01-01

    To compare soft-tissue changes after total hip arthroplasty with posterior, direct-lateral, anterolateral, or anterior surgical approaches. MRI of 120 patients after primary total hip arthroplasty (30 per approach) were included. Each MRI was assessed by two readers regarding identification of surgical access, fatty muscle atrophy (Goutallier classification), tendon quality (0 = normal, 1 = tendinopathy, 2 = partial tear, 3 = avulsion), and fluid collections. Readers were blinded to the surgical approach. Surgical access was correctly identified in all cases. The direct lateral approach showed highest Goutallier grades and tendon damage for gluteus minimus muscle (2.07-2.67 and 2.00-2.77; p = 0.017 and p = 0.001 for readers 1 and 2, respectively) and tendon (2.30/1.67; p < 0.0005 for reader 1/2), and the lateral portion of the gluteus medius tendon (2.77/2.20; p < 0.0005 for reader 1/2). The posterior approach showed highest Goutallier grades and tendon damage for external rotator muscles (1.97-2.67 and 1.57-2.40; p < 0.0005-0.006 for reader 1/2) and tendons (1.41-2.45 and 1.93-2.76; p < 0.0005 for reader 1/2). The anterolateral and anterior approach showed less soft tissue damage. Fluid collections showed no differences between the approaches. MRI is well suited to identify surgical approaches after THA. The anterior and anterolateral approach showed less soft tissue damage compared to the posterior and direct lateral approach. (orig.)

  1. Comparison of approaches for mobile document image analysis using server supported smartphones

    Science.gov (United States)

    Ozarslan, Suleyman; Eren, P. Erhan

    2014-03-01

    With the recent advances in mobile technologies, new capabilities are emerging, such as mobile document image analysis. However, mobile phones are still less powerful than servers, and they have some resource limitations. One approach to overcoming these limitations is performing resource-intensive processes of the application on remote servers. In mobile document image analysis, the most resource-consuming process is the Optical Character Recognition (OCR) process, which is used to extract text from mobile phone captured images. In this study, our goal is to compare the in-phone and the remote-server processing approaches for mobile document image analysis in order to explore their trade-offs. For the in-phone approach, all processes required for mobile document image analysis run on the mobile phone. On the other hand, in the remote-server approach, the core OCR process runs on the remote server and other processes run on the mobile phone. Results of the experiments show that the remote-server approach is considerably faster than the in-phone approach in terms of OCR time, but adds extra delays such as network delay. Since compression and downscaling of images significantly reduce file sizes and extra delays, the remote-server approach overall outperforms the in-phone approach in terms of the selected speed and correct recognition metrics, if the gain in OCR time compensates for the extra delays. According to the results of the experiments, using the most preferable settings, the remote-server approach performs better than the in-phone approach in terms of speed and acceptable correct recognition metrics.

  2. Comparison of rule induction, decision trees and formal concept analysis approaches for classification

    Science.gov (United States)

    Kotelnikov, E. V.; Milov, V. R.

    2018-05-01

    Rule-based learning algorithms have higher transparency and are easier to interpret than neural networks and deep learning algorithms. These properties make it possible to use such algorithms effectively to solve descriptive data mining tasks. The choice of an algorithm also depends on its ability to solve predictive tasks. The article compares the quality of the solutions of binary and multiclass classification problems based on experiments with six datasets from the UCI Machine Learning Repository. The authors investigate three algorithms: Ripper (rule induction), C4.5 (decision trees), and In-Close (formal concept analysis). The results of the experiments show that In-Close demonstrates the best classification quality in comparison with Ripper and C4.5; however, the latter two generate more compact rule sets.

  3. Effects Comparison of Different Resilience Enhancing Strategies for Municipal Water Distribution Network: A Multidimensional Approach

    Directory of Open Access Journals (Sweden)

    Xudong Zhao

    2015-01-01

    Full Text Available A water distribution network (WDN) is critical to city services, economic rehabilitation, public health, and safety. Reconstructing the WDN to improve its resilience to seismic disasters is an important and ongoing issue. Although a considerable body of research has examined the effects of different reconstruction strategies on seismic resistance, it is still hard for decision-makers to choose the optimal resilience-enhancing strategy. Taking pipeline ductile retrofitting and meshed network expansion as a demonstration, we propose a feasible framework to contrast the resilience-enhancing effects of two reconstruction strategies (a units retrofitting strategy and a network optimization strategy) in the technical and organizational dimensions. We also developed a new performance response function (PRF) based on network equilibrium theory to compare the effects in an integrated technical and organizational dimension. Through a case study of the municipal WDN in Lianyungang, China, the comparison results are thoroughly shown and holistic decision-making support is provided.

  4. Comparison of two anoxia models in rainbow trout cells by a 2-DE and MS/MS-based proteome approach

    DEFF Research Database (Denmark)

    Wulff, Tune; Hoffmann, E.K.; Roepstorff, P.

    2008-01-01

    In the literature, a variety of ways have been used to obtain anoxia, and most often results are compared between studies without taking into consideration how anoxia has been obtained. Here, we provide a comprehensive study of two types of anoxia, using a proteomics approach to compare changes ... and protein synthesis. It was also revealed that the level of a number of keratins was down-regulated. This study therefore provides a valuable comparison of two different anoxia models and shows that great care should be taken when comparing the effects of anoxia in studies that have used different types...

  5. Optimization of approximate decision rules relative to number of misclassifications: Comparison of greedy and dynamic programming approaches

    KAUST Repository

    Amin, Talha

    2013-01-01

    In the paper, we present a comparison of dynamic programming and greedy approaches for construction and optimization of approximate decision rules relative to the number of misclassifications. We use an uncertainty measure that is a difference between the number of rows in a decision table T and the number of rows with the most common decision for T. For a nonnegative real number γ, we consider γ-decision rules that localize rows in subtables of T with uncertainty at most γ. Experimental results with decision tables from the UCI Machine Learning Repository are also presented. © 2013 Springer-Verlag.

  6. An international comparison of models and approaches for the estimation of the radiological exposure of non-human biota

    International Nuclear Information System (INIS)

    Beresford, Nicholas A.; Balonov, Mikhail; Beaugelin-Seiller, Karine; Brown, Justin; Copplestone, David; Hingston, Joanne L.; Horyna, Jan; Hosseini, Ali; Howard, Brenda J.; Kamboj, Sunita; Nedveckaite, Tatjana; Olyslaegers, Geert; Sazykina, Tatiana; Vives i Batlle, Jordi; Yankovich, Tamara L.; Yu, Charley

    2008-01-01

    Over the last decade a number of models and approaches have been developed for the estimation of the exposure of non-human biota to ionising radiations. In some countries these are now being used in regulatory assessments. However, to date there has been no attempt to compare the outputs of the different models used. This paper presents the work of the International Atomic Energy Agency's EMRAS Biota Working Group which compares the predictions of a number of such models in model-model and model-data inter-comparisons

  7. An estimation of the transformation value by means of the estimation function. Market Comparison Approach with abridged data chart

    Directory of Open Access Journals (Sweden)

    Maurizio d’Amato

    2015-06-01

    Full Text Available This essay suggests a re-elaboration of the Market Comparison Approach in order to set the value of properties subject to transformation. The essay focuses on identifying the property value following a certain transformation and is aimed at determining the land value by means of the extraction method. The outcome, based on trading data and a case study in the province of Bari, may also be applied to the valuation of properties under construction (investment property under construction) by means of the Future Value method.

  8. Space commerce in a global economy - Comparison of international approaches to commercial space

    Science.gov (United States)

    Stone, Barbara A.; Kleber, Peter

    1992-01-01

    A historical perspective, current status, and comparison of national government/commercial space industry relationships in the United States and Europe are presented. It is noted that space technology has been developed and used primarily to meet the needs of civil and military government initiatives. Two future trends of space technology development include new space enterprises, and the national drive to achieve a more competitive global economic position.

  9. A Study on Influencing Factors of Knowledge Management Systems Adoption: Models Comparison Approach

    OpenAIRE

    Mei-Chun Yeh; Ming-Shu Yuan

    2007-01-01

    Using the Linear Structural Relation model (LISREL) as the analysis method and the technology acceptance model and the decomposed theory of planned behavior as the research foundation, this study approaches the question mainly from the angle of behavioral intention to examine the factors influencing 421 employees' adoption of knowledge management systems, and at the same time to compare the two models mentioned above. According to the research, there is no, in comparison with technology acceptance model and deco...

  10. Isostatic lines’ study to optimize steel space grid envelope structures for tall buildings according to their solicitations

    OpenAIRE

    Señís López, Roger

    2013-01-01

    Based on a first study completed with wind tunnel tests, the aim of this paper is to define a second methodology for the optimization of steel space grid envelope structures for tall buildings according to their isostatic lines and their solicitations. It is by means of a comparison between the NatHaz online database and numerical simulation of the effects of wind flow on buildings, through Computational Fluid Dynamics (CFD), that we can understand and analyse the grid ...

  11. Comparing Diagnostic Accuracy of Cognitive Screening Instruments: A Weighted Comparison Approach

    Directory of Open Access Journals (Sweden)

    A.J. Larner

    2013-03-01

    Background/Aims: There are many cognitive screening instruments available to clinicians when assessing patients' cognitive function, but the best way to compare the diagnostic utility of these tests is uncertain. One method is to undertake a weighted comparison which takes into account the difference in sensitivity and specificity of two tests, the relative clinical misclassification costs of true- and false-positive diagnosis, and also disease prevalence. Methods: Data were examined from four pragmatic diagnostic accuracy studies from one clinic which compared the Mini-Mental State Examination (MMSE) with the Addenbrooke's Cognitive Examination-Revised (ACE-R), the Montreal Cognitive Assessment (MoCA), the Test Your Memory (TYM) test, and the Mini-Mental Parkinson (MMP), respectively. Results: Weighted comparison calculations suggested a net benefit for ACE-R, MoCA, and MMP compared to MMSE, but a net loss for the TYM test compared to MMSE. Conclusion: Routine incorporation of weighted comparison or other similar net benefit measures into diagnostic accuracy studies merits consideration to better inform clinicians of the relative value of cognitive screening instruments.

  12. Comparison between goal programming and cointegration approaches in enhanced index tracking

    Science.gov (United States)

    Lam, Weng Siew; Jamaan, Saiful Hafizah Hj.

    2013-04-01

    Index tracking is a popular form of passive fund management in the stock market. Passive management is a buy-and-hold strategy that aims to achieve a rate of return similar to the market return. The index tracking problem is the problem of reproducing the performance of a stock market index without purchasing all of the stocks that make up the index. This can be done by establishing an optimal portfolio that minimizes risk or tracking error. Improved index tracking (enhanced index tracking) is a dual-objective optimization problem, a trade-off between maximizing the mean return and minimizing the tracking error. Enhanced index tracking aims to generate excess return over the return achieved by the index. The objective of this study is to compare the portfolio compositions and performances obtained with two different approaches to the enhanced index tracking problem, goal programming and cointegration. The result of this study shows that the optimal portfolios from both approaches are able to outperform the Malaysian market index, the Kuala Lumpur Composite Index. The two approaches give different optimal portfolio compositions. Moreover, the cointegration approach outperforms the goal programming approach because it gives a higher mean return and a lower risk or tracking error. Therefore, the cointegration approach is more appropriate for investors in Malaysia.
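
    As a minimal illustration of the quantities being traded off in enhanced index tracking, the sketch below computes the tracking error and mean excess return of a candidate portfolio against an index; the return series and weights are synthetic, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
index_ret = rng.normal(0.006, 0.04, 120)                             # synthetic monthly index returns
asset_ret = index_ret[:, None] + rng.normal(0.001, 0.02, (120, 5))   # five index-correlated assets
weights = np.array([0.3, 0.25, 0.2, 0.15, 0.1])                      # candidate tracking portfolio

port_ret = asset_ret @ weights
active_ret = port_ret - index_ret
tracking_error = active_ret.std(ddof=1)    # volatility of active returns (to be minimized)
mean_excess = active_ret.mean()            # enhanced indexing target: positive excess return

print(f"tracking error: {tracking_error:.4f}, mean excess return: {mean_excess:.4f}")
```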

  13. A comparison between neural response telemetry via cochleostomy or the round window approach in cochlear implantation.

    Science.gov (United States)

    Hamerschmidt, Rogério; Schuch, Luiz Henrique; Rezende, Rodrigo Kopp; Wiemes, Gislaine Richter Minhoto; Oliveira, Adriana Kosma Pires de; Mocellin, Marcos

    2012-01-01

    There are two techniques for cochlear implant (CI) electrode placement: cochleostomy and the round window (RW) approach. This study aims to compare neural response telemetry (NRT) results immediately after surgery to check for possible differences on auditory nerve stimulation between these two techniques. This is a prospective cross-sectional study. Twenty-three patients were enrolled. Six patients underwent surgery by cochleostomy and 17 had it through the RW approach. Mean charge units (MCU) for high frequency sounds: patients submitted to the RW approach had a mean value of 190.4 (± 29.2) while cochleostomy patients averaged 187.8 (± 32.7); p = 0.71. MCU for mid frequency sounds: patients submitted to the RW approach had a mean value of 192.5 (± 22) while cochleostomy patients averaged 178.5 (± 18.5); p = 0.23. MCU for low frequency sounds: patients submitted to the RW approach had a mean value of 183.3 (± 25) while cochleostomy patients averaged 163.8 (± 19.3); p = 0.19. This study showed no differences in the action potential of the distal portion of the auditory nerve in patients with multichannel cochlear implants submitted to surgery by cochleostomy or through the RW approach, using the implant itself to generate stimuli and record responses. Both techniques equally stimulate the cochlear nerve. Therefore, the choice of approach can be made based on the surgeon's own preference and experience.

  14. Comparison of statistical approaches dealing with time-dependent confounding in drug effectiveness studies.

    Science.gov (United States)

    Karim, Mohammad Ehsanul; Petkau, John; Gustafson, Paul; Platt, Robert W; Tremlett, Helen

    2018-06-01

    In longitudinal studies, if the time-dependent covariates are affected by the past treatment, time-dependent confounding may be present. For a time-to-event response, marginal structural Cox models are frequently used to deal with such confounding. To avoid some of the problems of fitting marginal structural Cox model, the sequential Cox approach has been suggested as an alternative. Although the estimation mechanisms are different, both approaches claim to estimate the causal effect of treatment by appropriately adjusting for time-dependent confounding. We carry out simulation studies to assess the suitability of the sequential Cox approach for analyzing time-to-event data in the presence of a time-dependent covariate that may or may not be a time-dependent confounder. Results from these simulations revealed that the sequential Cox approach is not as effective as marginal structural Cox model in addressing the time-dependent confounding. The sequential Cox approach was also found to be inadequate in the presence of a time-dependent covariate. We propose a modified version of the sequential Cox approach that correctly estimates the treatment effect in both of the above scenarios. All approaches are applied to investigate the impact of beta-interferon treatment in delaying disability progression in the British Columbia Multiple Sclerosis cohort (1995-2008).

  15. Preliminary comparison of the endoscopic transnasal vs the sublabial transseptal approach for clinically nonfunctioning pituitary macroadenomas.

    Science.gov (United States)

    Sheehan, M T; Atkinson, J L; Kasperbauer, J L; Erickson, B J; Nippoldt, T B

    1999-07-01

    To assess the advantages and disadvantages of an endoscopic transnasal approach to pituitary surgery for a select group of clinically nonfunctioning macroadenomas and to compare results of this approach with the sublabial transseptal approach at a single institution. We retrospectively reviewed the records of 26 patients with clinically nonfunctioning pituitary macroadenomas approached endoscopically and 44 matched control patients with the same tumors approached sublabially between January 1, 1995, and October 31, 1997. At baseline, the groups were not significantly different for age, sex distribution, number of comorbid conditions, visual field defects, degree of anterior pituitary insufficiency, or preoperative assessment of tumor volume or invasiveness. Mean (SD) operative times were significantly reduced in the endoscopic group vs the sublabial group: 2.7 (0.7) hours vs 3.4 (0.9) hours (P working channel to the sella turcica. For these reasons, the endoscopic approach or its variation is an alternative to the sublabial approach but should be considered only by experienced pituitary neurosurgeons.

  16. A systematic comparison of different approaches of density functional theory for the study of electrical double layers

    International Nuclear Information System (INIS)

    Yang, Guomin; Liu, Longcheng

    2015-01-01

    Based on the best available knowledge of density functional theory (DFT), the reference-fluid perturbation method is here extended to yield different approaches that account well for the cross correlations between the Coulombic interaction and the hard-sphere exclusion in an inhomogeneous ionic hard-sphere fluid. In order to quantitatively evaluate the advantage and disadvantage of different approaches in describing the interfacial properties of electrical double layers, this study makes a systematic comparison against Monte Carlo simulations over a wide range of conditions. The results suggest that the accuracy of the DFT approaches is well correlated to a coupling parameter that describes the coupling strength of electrical double layers by accounting for the steric effect and that can be used to classify the systems into two regimes. In the weak-coupling regime, the approaches based on the bulk-fluid perturbation method are shown to be more accurate than the counterparts based on the reference-fluid perturbation method, whereas they exhibit the opposite behavior in the strong-coupling regime. More importantly, the analysis indicates that, with a suitable choice of the reference fluid, the weighted correlation approximation (WCA) to DFT gives the best account of the coupling effect of the electrostatic-excluded volume correlations. As a result, a piecewise WCA approach can be developed that is robust enough to describe the structural and thermodynamic properties of electrical double layers over both weak- and strong-coupling regimes.

  17. Comparison of spatial association approaches for landscape mapping of soil organic carbon stocks

    Science.gov (United States)

    Miller, B. A.; Koszinski, S.; Wehrhan, M.; Sommer, M.

    2015-03-01

    The distribution of soil organic carbon (SOC) can be variable at small analysis scales, but consideration of its role in regional and global issues demands the mapping of large extents. There are many different strategies for mapping SOC, among which is to model the variables needed to calculate the SOC stock indirectly or to model the SOC stock directly. The purpose of this research is to compare direct and indirect approaches to mapping SOC stocks from rule-based, multiple linear regression models applied at the landscape scale via spatial association. The final products for both strategies are high-resolution maps of SOC stocks (kg m-2), covering an area of 122 km2, with accompanying maps of estimated error. For the direct modelling approach, the estimated error map was based on the internal error estimations from the model rules. For the indirect approach, the estimated error map was produced by spatially combining the error estimates of component models via standard error propagation equations. We compared these two strategies for mapping SOC stocks on the basis of the qualities of the resulting maps as well as the magnitude and distribution of the estimated error. The direct approach produced a map with less spatial variation than the map produced by the indirect approach. The increased spatial variation represented by the indirect approach improved R2 values for the topsoil and subsoil stocks. Although the indirect approach had a lower mean estimated error for the topsoil stock, the mean estimated error for the total SOC stock (topsoil + subsoil) was lower for the direct approach. For these reasons, we recommend the direct approach to modelling SOC stocks be considered a more conservative estimate of the SOC stocks' spatial distribution.
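
    The indirect strategy mentioned above models the components of the stock separately and combines their errors. A minimal sketch of that idea (the component model and all numbers are hypothetical, and the paper's error propagation equations may differ in detail) is:

```python
import numpy as np

def soc_stock(bulk_density, soc_conc, depth):
    """SOC stock (kg m-2) = bulk density (kg m-3) * SOC concentration (kg kg-1) * depth (m)."""
    return bulk_density * soc_conc * depth

def propagated_se(values, ses):
    """Standard error of a product of independent estimates via first-order
    (delta-method) propagation: relative variances add."""
    values, ses = np.asarray(values, float), np.asarray(ses, float)
    product = np.prod(values)
    rel_var = np.sum((ses / values) ** 2)
    return abs(product) * np.sqrt(rel_var)

bd, bd_se = 1300.0, 80.0   # bulk density and its standard error (kg m-3)
c, c_se = 0.020, 0.003     # SOC concentration (kg kg-1)
d, d_se = 0.30, 0.02       # thickness of the layer (m)

stock = soc_stock(bd, c, d)
stock_se = propagated_se([bd, c, d], [bd_se, c_se, d_se])
print(f"indirect SOC stock estimate: {stock:.2f} +/- {stock_se:.2f} kg m-2")
```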

  18. Can intermuscular cleavage planes provide proper transverse screw angle? Comparison of two paraspinal approaches.

    Science.gov (United States)

    Cheng, Xiaofei; Ni, Bin; Liu, Qi; Chen, Jinshui; Guan, Huapeng

    2013-01-01

    The goal of this study was to determine which paraspinal approach provided a better transverse screw angle (TSA) for each vertebral level in lower lumbar surgery. Axial computed tomography (CT) images of 100 patients, from L3 to S1, were used to measure the angulation parameters, including transverse pedicle angle (TPA) and transverse cleavage plane angle (TCPA) of entry from the two approaches. The difference value between TCPA and TPA, defined as difference angle (DA), was calculated. Statistical differences of DA obtained by the two approaches and the angulation parameters between sexes, and the correlation between each angulation parameter and age or body mass index (BMI) were analyzed. TPA ranged from about 16° at L3 to 30° at S1. TCPA through the Wiltse's and Weaver's approach ranged from about -10° and 25° at L3 to 12° and 32° at S1, respectively. The absolute values of DA through the Weaver's approach were significantly lower than those through the Wiltse's approach at each level. The angulation parameters showed no significant difference with sex and no significant correlation with age or BMI. In the lower lumbar vertebrae (L3-L5) and S1, pedicle screw placement through the Weaver's approach may more easily yield the preferred TSA consistent with TPA than that through the Wiltse's approach. The reference values obtained in this paper may be applied regardless of sex, age or BMI and the descriptive statistical results may be used as references for applying the two paraspinal approaches.

  19. Comparisons of Multilevel Modeling and Structural Equation Modeling Approaches to Actor-Partner Interdependence Model.

    Science.gov (United States)

    Hong, Sehee; Kim, Soyoung

    2018-01-01

    There are basically two modeling approaches applicable to analyzing an actor-partner interdependence model: the multilevel modeling (hierarchical linear model) and the structural equation modeling. This article explains how to use these two models in analyzing an actor-partner interdependence model and how these two approaches work differently. As an empirical example, marital conflict data were used to analyze an actor-partner interdependence model. The multilevel modeling and the structural equation modeling produced virtually identical estimates for a basic model. However, the structural equation modeling approach allowed more realistic assumptions on measurement errors and factor loadings, rendering better model fit indices.

  20. Laying the Foundations for Democratic Behavior - A Comparison of Two Different Approaches to Democratic Education

    Directory of Open Access Journals (Sweden)

    Viola HUANG

    2014-07-01

    A democracy is a society in which everyone has equal rights and is able to participate in decision-making processes. Consequently, in a democratic society, democratic behavior is essential. This work investigates the question: In what ways and to what extent can alternative models of education support the development of democratic skills in children? To explore this question, the author analyzes and compares two different approaches to democratic education: the Sudbury approach and the democratic free school approach. The study is based on qualitative research: participant observation and open-ended interviews conducted at different Sudbury and democratic free schools in the US.

  1. Comparison of Two Probabilistic Fatigue Damage Assessment Approaches Using Prognostic Performance Metrics

    Data.gov (United States)

    National Aeronautics and Space Administration — A general framework for probabilistic prognosis using maximum entropy approach, MRE, is proposed in this paper to include all available information and uncertainties...

  2. A comparison of approaches for finding minimum identifying codes on graphs

    Science.gov (United States)

    Horan, Victoria; Adachi, Steve; Bak, Stanley

    2016-05-01

    In order to formulate mathematical conjectures likely to be true, a number of base cases must be determined. However, many combinatorial problems are NP-hard and the computational complexity makes this research approach difficult using a standard brute force approach on a typical computer. One sample problem explored is that of finding a minimum identifying code. To work around the computational issues, a variety of methods are explored and consist of a parallel computing approach using MATLAB, an adiabatic quantum optimization approach using a D-Wave quantum annealing processor, and lastly using satisfiability modulo theory (SMT) and corresponding SMT solvers. Each of these methods requires the problem to be formulated in a unique manner. In this paper, we address the challenges of computing solutions to this NP-hard problem with respect to each of these methods.
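
    As a minimal illustration of the underlying combinatorial problem (not one of the three solution methods used in the paper), the sketch below finds a minimum identifying code by brute force on a tiny graph; it is only practical for very small instances, which is exactly why parallel, quantum-annealing, and SMT approaches are explored:

```python
from itertools import combinations

def closed_neighborhoods(adj):
    """adj: dict vertex -> set of neighbours. Returns the closed neighbourhood N[v] of each vertex."""
    return {v: {v} | set(nbrs) for v, nbrs in adj.items()}

def min_identifying_code(adj):
    """Brute-force smallest identifying code: a vertex subset C such that
    N[v] & C is non-empty and distinct for every vertex v."""
    N = closed_neighborhoods(adj)
    vertices = sorted(adj)
    for size in range(1, len(vertices) + 1):
        for C in combinations(vertices, size):
            codes = [frozenset(N[v] & set(C)) for v in vertices]
            if all(codes) and len(set(codes)) == len(vertices):
                return set(C)
    return None  # graph has twin vertices (identical closed neighbourhoods): no code exists

# Path graph 0-1-2-3; the minimum identifying code has size 3.
adj = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(min_identifying_code(adj))
```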

  3. Automatic Diagnosis of Fetal Heart Rate: Comparison of Different Methodological Approaches

    National Research Council Canada - National Science Library

    Magenes, G

    2001-01-01

    .... A Multilayer Perceptron (MLP) neural network and an Adaptive Neuro-Fuzzy Inference System (ANFIS) were compared with classical statistical methods. Both the neural and neuro-fuzzy approaches seem to give better results than any tested statistical classifier.

  4. A Comparison of Three Holistic Approaches to Health: One Health, EcoHealth, and Planetary Health

    Directory of Open Access Journals (Sweden)

    Henrik Lerner

    2017-09-01

    Several holistic and interdisciplinary approaches exist to safeguard health. Three of the most influential concepts at the moment, One Health, EcoHealth, and Planetary Health, are analyzed in this paper, revealing similarities and differences at the theoretical conceptual level. These approaches may appear synonymous, as they all promote the underlying assumption of humans and other animals sharing the same planet and the same environmental challenges, infections and infectious agents as well as other aspects of physical—and possibly mental—health. However, we would like to illuminate the differences between these three concepts or approaches, and how the choice of terms may, deliberately or involuntary, signal the focus, and underlying values of the approaches. In this paper, we have chosen some proposed and well-known suggestions of definitions. In our theoretical analysis, we will focus on at least two areas. These are (1) the value of the potential scientific areas which could be included and (2) core values present within the approach. In the first area, our main concern is whether the approaches are interdisciplinary and whether the core scientific areas are assigned equal importance. For the second area, which is rather wide, we analyze core values such as biodiversity, health, and how one values humans, animals, and ecosystems. One Health has been described as either a narrow approach combining public health and veterinary medicine or as a wide approach as in the wide-spread “umbrella” depiction including both scientific fields, core concepts, and interdisciplinary research areas. In both cases, however, safeguarding the health of vertebrates is usually in focus although ecosystems are also included in the model. The EcoHealth approach seems to have more of a biodiversity focus, with an emphasis on all living creatures, implying that parasites, unicellular organisms, and possibly also viruses have a value and should be protected

  5. A Comparison of Three Holistic Approaches to Health: One Health, EcoHealth, and Planetary Health.

    Science.gov (United States)

    Lerner, Henrik; Berg, Charlotte

    2017-01-01

    Several holistic and interdisciplinary approaches exist to safeguard health. Three of the most influential concepts at the moment, One Health, EcoHealth, and Planetary Health, are analyzed in this paper, revealing similarities and differences at the theoretical conceptual level. These approaches may appear synonymous, as they all promote the underlying assumption of humans and other animals sharing the same planet and the same environmental challenges, infections and infectious agents as well as other aspects of physical-and possibly mental-health. However, we would like to illuminate the differences between these three concepts or approaches, and how the choice of terms may, deliberately or involuntary, signal the focus, and underlying values of the approaches. In this paper, we have chosen some proposed and well-known suggestions of definitions. In our theoretical analysis, we will focus on at least two areas. These are (1) the value of the potential scientific areas which could be included and (2) core values present within the approach. In the first area, our main concern is whether the approaches are interdisciplinary and whether the core scientific areas are assigned equal importance. For the second area, which is rather wide, we analyze core values such as biodiversity, health, and how one values humans, animals, and ecosystems. One Health has been described as either a narrow approach combining public health and veterinary medicine or as a wide approach as in the wide-spread "umbrella" depiction including both scientific fields, core concepts, and interdisciplinary research areas. In both cases, however, safeguarding the health of vertebrates is usually in focus although ecosystems are also included in the model. The EcoHealth approach seems to have more of a biodiversity focus, with an emphasis on all living creatures, implying that parasites, unicellular organisms, and possibly also viruses have a value and should be protected. Planetary Health, on the

  6. A comparison of image restoration approaches applied to three-dimensional confocal and wide-field fluorescence microscopy.

    Science.gov (United States)

    Verveer, P. J; Gemkow, M. J; Jovin, T. M

    1999-01-01

    We have compared different image restoration approaches for fluorescence microscopy. The most widely used algorithms were classified with a Bayesian theory according to the assumed noise model and the type of regularization imposed. We considered both Gaussian and Poisson models for the noise in combination with Tikhonov regularization, entropy regularization, Good's roughness and without regularization (maximum likelihood estimation). Simulations of fluorescence confocal imaging were used to examine the different noise models and regularization approaches using the mean squared error criterion. The assumption of a Gaussian noise model yielded only slightly higher errors than the Poisson model. Good's roughness was the best choice for the regularization. Furthermore, we compared simulated confocal and wide-field data. In general, restored confocal data are superior to restored wide-field data, but given sufficient higher signal level for the wide-field data the restoration result may rival confocal data in quality. Finally, a visual comparison of experimental confocal and wide-field data is presented.

  7. Laying the Foundations for Democratic Behavior - A Comparison of Two Different Approaches to Democratic Education

    OpenAIRE

    Viola HUANG

    2014-01-01

    A democracy is a society in which everyone has equal rights and is able to participate in decision-making processes. Consequently, in a democratic society, democratic behavior is essential. This work investigates the question: In what ways and to what extent can alternative models of education support the development of democratic skills in children? To explore this question, the author analyzes and compares two different approaches to democratic education: The Sudbury approach and the democr...

  8. Reconsidering Cluster Bias in Multilevel Data: A Monte Carlo Comparison of Free and Constrained Baseline Approaches.

    Science.gov (United States)

    Guenole, Nigel

    2018-01-01

    The test for item level cluster bias examines the improvement in model fit that results from freeing an item's between level residual variance from a baseline model with equal within and between level factor loadings and between level residual variances fixed at zero. A potential problem is that this approach may include a misspecified unrestricted model if any non-invariance is present, but the log-likelihood difference test requires that the unrestricted model is correctly specified. A free baseline approach where the unrestricted model includes only the restrictions needed for model identification should lead to better decision accuracy, but no studies have examined this yet. We ran a Monte Carlo study to investigate this issue. When the referent item is unbiased, compared to the free baseline approach, the constrained baseline approach led to similar true positive (power) rates but much higher false positive (Type I error) rates. The free baseline approach should be preferred when the referent indicator is unbiased. When the referent assumption is violated, the false positive rate was unacceptably high for both free and constrained baseline approaches, and the true positive rate was poor regardless of whether the free or constrained baseline approach was used. Neither the free or constrained baseline approach can be recommended when the referent indicator is biased. We recommend paying close attention to ensuring the referent indicator is unbiased in tests of cluster bias. All Mplus input and output files, R, and short Python scripts used to execute this simulation study are uploaded to an open access repository.
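
    The log-likelihood difference (chi-square difference) test that underlies both baseline strategies can be sketched as follows; the log-likelihood values are hypothetical, and boundary corrections for testing a variance parameter are ignored in this sketch:

```python
from scipy.stats import chi2

def loglik_difference_test(ll_unrestricted, ll_restricted, df_diff):
    """Chi-square difference test for nested models. The restricted (baseline)
    model must be a special case of the unrestricted one, and the unrestricted
    model is assumed to be correctly specified."""
    lr = 2.0 * (ll_unrestricted - ll_restricted)
    p = chi2.sf(lr, df_diff)
    return lr, p

# Hypothetical log-likelihoods: item's between-level residual variance freed
# (unrestricted) versus fixed at zero (restricted), a 1-parameter difference.
lr, p = loglik_difference_test(ll_unrestricted=-4821.3, ll_restricted=-4824.9, df_diff=1)
print(f"LR = {lr:.2f}, p = {p:.4f}")  # flag cluster bias if p < alpha
```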

  9. A Comparison of Approaches for Solving Hard Graph-Theoretic Problems

    Science.gov (United States)

    2015-04-29

    In order to formulate mathematical conjectures likely to be true, a number of base cases must be determined. However, many combinatorial problems are NP-hard and the computational complexity makes this research approach difficult using a standard brute force approach on a typical computer.

  10. International comparisons of the technical efficiency of the hospital sector: panel data analysis of OECD countries using parametric and non-parametric approaches.

    Science.gov (United States)

    Varabyova, Yauheniya; Schreyögg, Jonas

    2013-09-01

    There is a growing interest in the cross-country comparisons of the performance of national health care systems. The present work provides a comparison of the technical efficiency of the hospital sector using unbalanced panel data from OECD countries over the period 2000-2009. The estimation of the technical efficiency of the hospital sector is performed using nonparametric data envelopment analysis (DEA) and parametric stochastic frontier analysis (SFA). Internal and external validity of findings is assessed by estimating the Spearman rank correlations between the results obtained in different model specifications. The panel-data analyses using two-step DEA and one-stage SFA show that countries, which have higher health care expenditure per capita, tend to have a more technically efficient hospital sector. Whether the expenditure is financed through private or public sources is not related to the technical efficiency of the hospital sector. On the other hand, the hospital sector in countries with higher income inequality and longer average hospital length of stay is less technically efficient. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.
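
    As a rough illustration of the nonparametric side of such an analysis, the sketch below solves a generic input-oriented, constant-returns-to-scale DEA linear program for each decision-making unit; the data are synthetic, and the paper's two-step DEA specification is more elaborate than this:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, j0):
    """Input-oriented CCR efficiency of DMU j0.
    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs).
    min theta  s.t.  sum_j lam_j * x_j <= theta * x_j0,
                     sum_j lam_j * y_j >= y_j0,  lam >= 0."""
    n, m = X.shape
    _, s = Y.shape
    c = np.zeros(n + 1)            # decision variables: [theta, lam_1, ..., lam_n]
    c[0] = 1.0
    A_ub = np.zeros((m + s, n + 1))
    b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[j0]           # lam @ X - theta * x_j0 <= 0
    A_ub[:m, 1:] = X.T
    A_ub[m:, 1:] = -Y.T            # -lam @ Y <= -y_j0
    b_ub[m:] = -Y[j0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

# Synthetic hospital-sector data: inputs (beds, staff), output (discharges).
X = np.array([[100, 300], [120, 280], [90, 350], [150, 400]], dtype=float)
Y = np.array([[5000], [5200], [4800], [6000]], dtype=float)
scores = [dea_ccr_input(X, Y, j) for j in range(len(X))]
print(np.round(scores, 3))         # 1.0 marks a technically efficient unit
```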

  11. Comparison of various approaches for the treatment of fractures of the mandibular condylar process.

    Science.gov (United States)

    Handschel, Jörg; Rüggeberg, Tim; Depprich, Rita; Schwarz, Frank; Meyer, Ulrich; Kübler, Norbert R; Naujoks, Christian

    2012-12-01

    Fractures of the mandibular condyle process are the most common fractures of the lower jaw. Unfortunately, the type of treatment is still a matter of debate. The aim of this investigation was to compare the outcome of different treatment approaches regarding function and surgical side-effects. 111 fractures of the mandibular condyle representing all types according to the classification of Spiessl and Schroll were included. Both closed reduction (CR) and open reduction with internal fixation (ORIF) including the retromandibular/transparotid, submandibular, preauricular and intraoral approach were performed. The clinical examination included functional and aesthetic aspects at least 1 year after the fracture. The majority of fractures (45%) were classified into Type II and IV according to Spiessl and Schroll followed by fractures without any displacement or dislocation (29.7%). The submandibular approach showed the worst outcome regarding permanent palsy of the facial nerve and hypertrophic scarring. No significant differences between the various approaches were detected in the functional status in any diagnosis group. Inferior condylar neck fractures benefit from ORIF by an intraoral approach whereas in high condylar neck fractures the retromandibular/transparotid approach shows the best results. Fractures of the condylar head were almost all treated by CR and our results cannot contribute to the debate of CR vs. ORIF in this type of fracture. Copyright © 2012 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  12. COMPARISONS BETWEEN AND COMBINATIONS OF DIFFERENT APPROACHES TO ACCELERATE ENGINEERING PROJECTS

    Directory of Open Access Journals (Sweden)

    H. Steyn

    2012-01-01

    In this article, traditional project management methods such as PERT and CPM, as well as fast-tracking and systems approaches, viz. concurrent engineering and critical chain, are reviewed with specific reference to their contribution to reducing the duration of the execution phase of engineering projects. Each of these techniques has some role to play in the acceleration of project execution. Combinations of approaches are evaluated by considering the potential of sets consisting of two different approaches each. While the PERT and CPM approaches have been combined for many years in a technique called PERT/CPM, new combinations of approaches are discussed. Certain assumptions inherent to the PERT approach, which are often wrong, are not made by the critical chain approach.


  13. Advanced Envelope Research for Factory Built Housing, Phase 3. Whole-House Prototyping

    Energy Technology Data Exchange (ETDEWEB)

    Levy, E. [Advanced Residential Integrated Energy Solutions (ARIES), New York, NY (United States); Mullens, M. [Advanced Residential Integrated Energy Solutions (ARIES), New York, NY (United States); Rath, P. [Advanced Residential Integrated Energy Solutions (ARIES), New York, NY (United States)

    2014-04-01

    The Advanced Envelope Research effort will provide factory homebuilders with high performance, cost-effective envelope designs that can be effectively integrated into the plant production process while meeting the thermal requirements of the 2012 IECC standards. This work is part of a multiphase effort. Phase 1 identified seven envelope technologies and provided a preliminary assessment of three methods for building high performance walls. Phase 2 focused on developing viable product designs, manufacturing strategies, addressing code and structural issues, and cost analysis of the three selected options. An industry advisory committee helped narrow the research focus to perfecting a stud wall design with exterior continuous insulation (CI). This report describes Phase 3, which was completed in two stages and continued the design development effort, exploring and evaluating a range of methods for applying CI to factory built homes. The scope also included material selection, manufacturing and cost analysis, and prototyping and testing. During this phase, a home was built with CI, evaluated, and placed in service. The experience of building a mock-up wall section with CI and then constructing a prototype home on the production line resolved important concerns about how to integrate the material into the production process. First steps were taken toward finding the least expensive approaches for incorporating CI in standard factory building practices, and a preliminary assessment suggested that even at this early stage the technology is attractive when viewed from a life cycle cost perspective.

  14. Characterization and interactome study of white spot syndrome virus envelope protein VP11.

    Directory of Open Access Journals (Sweden)

    Wang-Jing Liu

    White spot syndrome virus (WSSV) is a large enveloped virus. The WSSV viral particle consists of three structural layers that surround its core DNA: an outer envelope, a tegument and a nucleocapsid. Here we characterize the WSSV structural protein VP11 (WSSV394, GenBank accession number AF440570), and use an interactome approach to analyze the possible associations between this protein and an array of other WSSV and host proteins. Temporal transcription analysis showed that vp11 is an early gene. Western blot hybridization of the intact viral particles and fractionation of the viral components, and immunoelectron microscopy showed that VP11 is an envelope protein. Membrane topology software predicted VP11 to be a type of transmembrane protein with a highly hydrophobic transmembrane domain at its N-terminus. Based on an immunofluorescence assay performed on VP11-transfected Sf9 cells and a trypsin digestion analysis of the virion, we conclude that, contrary to the topology software prediction, the C-terminus of this protein is in fact inside the virion. Yeast two-hybrid screening combined with co-immunoprecipitation assays found that VP11 directly interacted with at least 12 other WSSV structural proteins as well as itself. An oligomerization assay further showed that VP11 could form dimers. VP11 is also the first reported WSSV structural protein to interact with the major nucleocapsid protein VP664.

  15. Joint Processing of Envelope Alignment and Phase Compensation for ISAR Imaging

    Science.gov (United States)

    Chen, Tao; Jin, Guanghu; Dong, Zhen

    2018-04-01

    Range envelope alignment and phase compensation are split into two isolated steps in the classical methods of translational motion compensation in Inverse Synthetic Aperture Radar (ISAR) imaging. In the classical method of rotating-object imaging, the reference points used for envelope alignment and for Phase Difference (PD) estimation are probably not the same point, making it difficult to uncouple the coupling term when correcting Migration Through Resolution Cell (MTRC). In this paper, an improved joint-processing approach is proposed that chooses a certain scattering point as the sole reference point, using the Prominent Point Processing (PPP) method. To this end, we first obtain an initial image using classical methods, from which a suitable scattering point can be chosen. Envelope alignment and phase compensation are then conducted using the selected scattering point as the common reference point. The keystone transform can thus be applied smoothly to further improve imaging quality. Both simulation experiments and real data processing are provided to demonstrate the performance of the proposed method compared with the classical method.

  16. Comparison of Posteromedial Versus Posterolateral Approach for Posterior Malleolus Fixation in Trimalleolar Ankle Fractures.

    Science.gov (United States)

    Zhong, Sheng; Shen, Lin; Zhao, Jia-Guo; Chen, Jie; Xie, Jin-Feng; Shi, Qi; Wu, Ying-Hua; Zeng, Xian-Tie

    2017-02-01

    To compare clinical and radiographic outcomes of posterior malleolar fractures (PMF) treated with lag screws from anterior to posterior versus posterior to anterior approach. We retrospectively analyzed 48 patients with trimalleolar fractures who underwent open reduction and internal fixation (ORIF) with either posteromedial (PM) or posterolateral (PL) approaches between January 2012 and December 2014. Fixation of the posterior malleolus was made with anteroposterior screws in 20 patients using the PM approach and posteroanterior screws in 28 patients using the PL approach. The American Orthopedic Foot and Ankle Society (AOFAS) scores and range of motion (ROM) of the ankle were used as the main outcome measurements, and results were evaluated at the 6-month, 12-month and final follow-up. Postoperative radiographs and computed tomography scans were used to evaluate the residual gap/step-off. The degree of arthritis was evaluated on final follow-up using Bargon criteria. Other complications were also recorded to compare the clinical outcomes of the two approaches. The mean duration of follow-up regardless of the approaches was 21.1 months (range, 15-54 months). None of the patients developed delayed union or nonunion. Functional bone healing was obtained in all patients at 10.7 weeks (range, 8-16 weeks). The mean AOFAS scores of the PM group at the postoperative 6-month, 12-month, and final follow-up were 91.4 (range, 82-100), 92.5 (range, 84-100), and 92.9 (range, 86-100), respectively. In the PL group, the mean AOFAS scores were 89.9 (range, 72-100), 91.4 (range, 77-100), and 91.9 (range, 77-100), respectively. At the final follow-up, the median loss of range of motion (ROM) for dorsiflexion and plantarflexion were 0° (0°, 5°) and 0° (0°, 0°), respectively, in both groups. There were no significant differences between the two approaches in AOFAS scores and ROM of the ankle in each period postoperatively (P > 0.05). Two patients in the PL group and 1 in the PM

  17. Numerical simulation of phase change material composite wallboard in a multi-layered building envelope

    International Nuclear Information System (INIS)

    Zwanzig, Stephen D.; Lian, Yongsheng; Brehob, Ellen G.

    2013-01-01

    Highlights: ► A numerical method to study the heat transfer through a PCM composite wallboard is presented. ► PCM wallboard can reduce energy consumption and shift peak electricity load. ► There is an optimal location for the PCM wallboard in the building envelope. ► The PCM wallboard performance depends on weather conditions.
    Abstract: Phase change materials (PCMs) have the capability to store/release massive latent heat when undergoing phase change. When impregnated or encapsulated into wallboard or concrete systems, PCMs can greatly enhance their thermal energy storage capacity and effective thermal mass. When used in the building envelope, PCM wallboard has the potential to improve building operation by reducing the energy requirement for maintaining thermal comfort, downsizing the AC/heating equipment, and shifting the peak load from the electrical grid. In this work we numerically studied the energy-saving potential of PCM for residential homes. For that purpose we solved the one-dimensional, transient heat equation through the multi-layered building envelope using the Crank–Nicolson discretization scheme. A source term is incorporated to account for the thermal-physical properties of the composite PCM wallboard. Using this code we examined a PCM composite wallboard incorporated into the walls and roof of a typical residential building across various climate zones. The PCM performance was studied under all seasonal conditions using the latest typical meteorological year (TMY3) data for exterior boundary conditions. Our simulations show that PCM performance highly depends on the weather conditions, emphasizing the necessity to choose different PCMs for different climate zones. Comparisons were also made between different PCM wallboard locations. Our work shows that there exists an optimal location for PCM placement within the building envelope, dependent upon the resistance values between the PCM layer and the exterior boundary conditions. We further
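
    A minimal sketch of a Crank–Nicolson step for one-dimensional transient conduction is shown below; it assumes a single homogeneous layer with fixed-temperature boundaries and a simple explicit source term, whereas the wallboard model described in the record handles multiple layers, PCM latent heat, and weather-driven boundary conditions:

```python
import numpy as np

def crank_nicolson_step(T, alpha, dx, dt, source=None):
    """One Crank-Nicolson step of dT/dt = alpha * d2T/dx2 + source,
    with fixed-temperature (Dirichlet) boundaries at both ends.
    T: nodal temperatures; source: volumetric source already divided by rho*cp."""
    n = len(T)
    r = alpha * dt / (2.0 * dx ** 2)
    if source is None:
        source = np.zeros(n)
    A = np.zeros((n, n))
    B = np.zeros((n, n))
    A[0, 0] = A[-1, -1] = 1.0          # boundary rows: keep end temperatures fixed
    B[0, 0] = B[-1, -1] = 1.0
    for i in range(1, n - 1):
        A[i, i - 1:i + 2] = [-r, 1 + 2 * r, -r]
        B[i, i - 1:i + 2] = [r, 1 - 2 * r, r]
    rhs = B @ T + dt * source
    rhs[0], rhs[-1] = T[0], T[-1]
    return np.linalg.solve(A, rhs)

# Example: 10 cm layer initially at 20 C, exterior face stepped to 35 C.
x = np.linspace(0.0, 0.10, 51)
T = np.full_like(x, 20.0)
T[0] = 35.0
alpha = 2e-7                           # thermal diffusivity (m^2/s), gypsum-like
for _ in range(600):                   # advance 600 steps of 60 s = 10 h
    T = crank_nicolson_step(T, alpha, dx=x[1] - x[0], dt=60.0)
print(f"mid-plane temperature after 10 h: {T[len(T) // 2]:.1f} C")
```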

  18. Comparison of weighting approaches for genetic risk scores in gene-environment interaction studies.

    Science.gov (United States)

    Hüls, Anke; Krämer, Ursula; Carlsten, Christopher; Schikowski, Tamara; Ickstadt, Katja; Schwender, Holger

    2017-12-16

    Weighted genetic risk scores (GRS), defined as weighted sums of risk alleles of single nucleotide polymorphisms (SNPs), are statistically powerful for detecting gene-environment (GxE) interactions. To assign weights, the gold standard is to use external weights from an independent study. However, appropriate external weights are not always available. In such situations and in the presence of predominant marginal genetic effects, we have shown in a previous study that GRS with internal weights from marginal genetic effects ("GRS-marginal-internal") are a powerful and reliable alternative to single SNP approaches or the use of unweighted GRS. However, this approach might not be appropriate for detecting predominant interactions, i.e. interactions showing an effect stronger than the marginal genetic effect. In this paper, we present a weighting approach for such predominant interactions ("GRS-interaction-training") in which parts of the data are used to estimate the weights from the interaction terms and the remaining data are used to determine the GRS. We conducted a simulation study for the detection of GxE interactions in which we evaluated power, type I error and sign-misspecification. We compared this new weighting approach to the GRS-marginal-internal approach and to GRS with external weights. Our simulation study showed that in the absence of external weights and with predominant interaction effects, the highest power was reached with the GRS-interaction-training approach. If marginal genetic effects were predominant, the GRS-marginal-internal approach was more appropriate. Furthermore, the power to detect interactions reached by the GRS-interaction-training approach was only slightly lower than the power achieved by GRS with external weights. The power of the GRS-interaction-training approach was confirmed in a real data application to the Traffic, Asthma and Genetics (TAG) Study (N = 4465 observations). When appropriate external weights are unavailable, we
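
    A minimal sketch of the basic construction (a weighted GRS built from risk-allele counts, followed by a logistic regression with a GRS-by-environment interaction term) is given below; the data are simulated, and the specific weighting schemes compared in the paper are not reproduced:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, n_snps = 2000, 10
G = rng.binomial(2, 0.3, size=(n, n_snps)).astype(float)   # risk-allele counts (0/1/2)
w = rng.normal(0.1, 0.05, n_snps)                           # per-SNP weights (e.g. external)
E = rng.normal(size=n)                                      # environmental exposure
grs = G @ w                                                 # weighted genetic risk score

# Simulate a binary outcome that carries a GxE interaction effect.
logit = -1.0 + 0.3 * grs + 0.2 * E + 0.4 * grs * E
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([grs, E, grs * E]))
fit = sm.Logit(y, X).fit(disp=False)
print(fit.summary(xname=["const", "GRS", "E", "GRSxE"]))    # the GRSxE term tests the interaction
```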

  19. Comparison of the coracoid and retroclavicular approaches for ultrasound-guided infraclavicular brachial plexus block.

    Science.gov (United States)

    Kavrut Ozturk, Nilgun; Kavakli, Ali Sait

    2017-08-01

    This prospective randomized study compared the coracoid and retroclavicular approaches to ultrasound-guided infraclavicular brachial plexus block (IBPB) in terms of needle tip and shaft visibility and quality of block. We hypothesized that the retroclavicular approach would increase needle tip and shaft visibility and decrease the number of needle passes compared to the coracoid approach. A total of 100 adult patients who received IBPB block for upper limb surgery were randomized into two groups: a coracoid approach group (group C) and a retroclavicular approach group (group R). In group C, the needle was inserted 2 cm medial and 2 cm inferior to the coracoid process and directed from ventral to dorsal. In group R, the needle insertion point was posterior to the clavicle and the needle was advanced from cephalad to caudal. All ultrasound images were digitally stored for analysis. The primary aim of the present study was to compare needle tip and shaft visibility between the coracoid approach and retroclavicular approach in patients undergoing upper limb surgery. The secondary aim was to investigate differences between the two groups in the number of needle passes, sensory and motor block success rates, surgical success rate, block performance time, block performance-related pain, patient satisfaction, use of supplemental local anesthetic and analgesic, and complications. Needle tip visibility and needle shaft visibility were significantly better in group R (p = 0.040, p = 0.032, respectively). Block performance time and anesthesia-related time were significantly shorter in group R (p = 0.022, p = 0.038, respectively). Number of needle passes was significantly lower in group R (p = 0.044). Paresthesia during block performance was significantly higher in group C (p = 0.045). There were no statistically significant differences between the two groups in terms of sensory or motor block success, surgical success, block-related pain, and patient satisfaction

  20. Comparison of the different approaches to generate holograms from data acquired with a Kinect sensor

    Science.gov (United States)

    Kang, Ji-Hoon; Leportier, Thibault; Ju, Byeong-Kwon; Song, Jin Dong; Lee, Kwang-Hoon; Park, Min-Chul

    2017-05-01

    Data of real scenes acquired in real-time with a Kinect sensor can be processed with different approaches to generate a hologram. 3D models can be generated from a point cloud or a mesh representation. The advantage of the point cloud approach is that the computation process is well established, since it involves only diffraction and propagation of point sources between parallel planes. On the other hand, the mesh representation makes it possible to reduce the number of elements necessary to represent the object. Then, even though the computation time for the contribution of a single element increases compared to a simple point, the total computation time can be reduced significantly. However, the algorithm is more complex, since propagation of elemental polygons between non-parallel planes should be implemented. Finally, since a depth map of the scene is acquired at the same time as the intensity image, a depth layer approach can also be adopted. This technique is appropriate for fast computation, since propagation of an optical wavefront from one plane to another can be handled efficiently with the fast Fourier transform. Fast computation with the depth layer approach is convenient for real time applications, but the point cloud method is more appropriate when high resolution is needed. In this study, since the Kinect can be used to obtain both a point cloud and a depth map, we examine the different approaches that can be adopted for hologram computation and compare their performance.
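
    The FFT-based plane-to-plane propagation that makes the depth layer approach fast can be sketched with the angular spectrum method; the sampling parameters below are illustrative and are not those of the Kinect pipeline described in the record:

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex field sampled on an N x N grid (pixel pitch dx)
    over a distance z between parallel planes using the angular spectrum method."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength ** 2 - FX ** 2 - FY ** 2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    transfer = np.exp(1j * kz * z) * (arg > 0)     # evanescent components suppressed
    return np.fft.ifft2(np.fft.fft2(field) * transfer)

# One depth layer: a point-like source in an otherwise dark plane.
n, dx, wavelength = 512, 8e-6, 532e-9
layer = np.zeros((n, n), dtype=complex)
layer[n // 2, n // 2] = 1.0
hologram_plane = angular_spectrum_propagate(layer, wavelength, dx, z=0.05)
print(hologram_plane.shape, np.abs(hologram_plane).max())
```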

  1. A comparison of the stochastic and machine learning approaches in hydrologic time series forecasting

    Science.gov (United States)

    Kim, T.; Joo, K.; Seo, J.; Heo, J. H.

    2016-12-01

    Hydrologic time series forecasting is an essential task in water resources management, and it becomes more difficult due to the complexity of the runoff process. Traditional stochastic models such as the ARIMA family have been used as a standard approach in time series modeling and forecasting of hydrological variables. Due to the nonlinearity in hydrologic time series data, machine learning approaches have been studied, with the advantage of discovering relevant features in a nonlinear relation among variables. This study aims to compare the predictability of the traditional stochastic model and the machine learning approach. A seasonal ARIMA model was used as the traditional time series model, and a Random Forest model, an ensemble of decision trees using a multiple-predictor approach, was applied as the machine learning approach. In the application, monthly inflow data from 1986 to 2015 for the Chungju dam in South Korea were used for modeling and forecasting. In order to evaluate the performances of the models, one-step-ahead and multi-step-ahead forecasting was applied. The root mean squared error and mean absolute error of the two models were compared.
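
    A minimal sketch of this kind of comparison on synthetic monthly data is shown below (a seasonal ARIMA model via statsmodels versus a lag-feature Random Forest, scored by RMSE and MAE); it does not use the Chungju dam data or the exact model orders from the study:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
t = np.arange(360)                                                     # 30 years of monthly flows
y = 50 + 30 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 5, t.size)   # synthetic inflow series
train, test = y[:-24], y[-24:]

# Stochastic approach: seasonal ARIMA, multi-step forecast of the last 24 months.
sarima = SARIMAX(train, order=(1, 0, 0), seasonal_order=(1, 0, 0, 12)).fit(disp=False)
f_sarima = sarima.forecast(steps=24)

# Machine-learning approach: Random Forest on the previous 12 months, iterated forward.
def lagged(series, p=12):
    X = np.array([series[i - p:i] for i in range(p, len(series))])
    return X, series[p:]

X_tr, y_tr = lagged(train)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
history = list(train[-12:])
f_rf = []
for _ in range(24):                          # multi-step: feed predictions back in as lags
    pred = rf.predict(np.array(history[-12:]).reshape(1, -1))[0]
    f_rf.append(pred)
    history.append(pred)

for name, f in [("SARIMA", f_sarima), ("Random Forest", np.array(f_rf))]:
    rmse = np.sqrt(np.mean((test - f) ** 2))
    mae = np.mean(np.abs(test - f))
    print(f"{name:14s} RMSE={rmse:6.2f}  MAE={mae:6.2f}")
```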

  2. Delineating Individual Trees from Lidar Data: A Comparison of Vector- and Raster-based Segmentation Approaches

    Directory of Open Access Journals (Sweden)

    Maggi Kelly

    2013-08-01

    Light detection and ranging (lidar) data is increasingly being used for ecosystem monitoring across geographic scales. This work concentrates on delineating individual trees in topographically-complex, mixed conifer forest across California’s Sierra Nevada. We delineated individual trees using vector data and a 3D lidar point cloud segmentation algorithm, and using raster data with an object-based image analysis (OBIA) of a canopy height model (CHM). The two approaches are compared to each other and to ground reference data. We used high density (9 pulses/m2), discrete lidar data and WorldView-2 imagery to delineate individual trees, and to classify them by species or species types. We also identified a new method to correct artifacts in a high-resolution CHM. Our main focus was to determine the difference between the two types of approaches and to identify the one that produces more realistic results. We compared the delineations via tree detection, tree heights, and the shape of the generated polygons. The tree height agreement was high between the two approaches and the ground data (r2: 0.93–0.96). Tree detection rates increased for more dominant trees (8–100 percent). The two approaches delineated tree boundaries that differed in shape: the lidar approach produced fewer, more complex, and larger polygons that more closely resembled real forest structure.

  3. Comparison of a Traditional Probabilistic Risk Assessment Approach with Advanced Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L; Mandelli, Diego; Zhegang Ma

    2014-11-01

    As part of the Light Water Reactor Sustainability (LWRS) Program [1], the purpose of the Risk Informed Safety Margin Characterization (RISMC) [2] Pathway research and development (R&D) is to support plant decisions for risk-informed margin management, with the aim of improving economics and reliability and sustaining the safety of current NPPs. In this paper, we describe the RISMC analysis process, illustrating how mechanistic and probabilistic approaches are combined in order to estimate a safety margin. We use the scenario of a “station blackout” (SBO), wherein offsite power and onsite power are lost, thereby causing a challenge to plant safety systems. We describe the RISMC approach, illustrate the station blackout modeling, and contrast this with traditional risk analysis modeling for this type of accident scenario. We also describe the approach we are using to represent advanced flooding analysis.

  4. Comparison of governance approaches for the control of antimicrobial resistance: Analysis of three European countries

    Directory of Open Access Journals (Sweden)

    Gabriel Birgand

    2018-02-01

    Full Text Available Abstract Policy makers and governments are calling for coordination to address the crisis emerging from the ineffectiveness of current antibiotics and stagnated pipe-line of new ones – antimicrobial resistance (AMR. Wider contextual drivers and mechanisms are contributing to shifts in governance strategies in health care, but are national health system approaches aligned with strategies required to tackle antimicrobial resistance? This article provides an analysis of governance approaches within healthcare systems including: priority setting, performance monitoring and accountability for AMR prevention in three European countries: England, France and Germany. Advantages and unresolved issues from these different experiences are reported, concluding that mechanisms are needed to support partnerships between healthcare professionals and patients with democratized decision-making and accountability via collaboration. But along with this multi-stakeholder approach to governance, a balance between regulation and persuasion is needed.

  5. A comparison of two closely-related approaches to aerodynamic design optimization

    Science.gov (United States)

    Shubin, G. R.; Frank, P. D.

    1991-01-01

    Two related methods for aerodynamic design optimization are compared. The methods, called the implicit gradient approach and the variational (or optimal control) approach, both attempt to obtain gradients necessary for numerical optimization at a cost significantly less than that of the usual black-box approach that employs finite difference gradients. While the two methods are seemingly quite different, they are shown to differ (essentially) in that the order of discretizing the continuous problem, and of applying calculus, is interchanged. Under certain circumstances, the two methods turn out to be identical. We explore the relationship between these methods by applying them to a model problem for duct flow that has many features in common with transonic flow over an airfoil. We find that the gradients computed by the variational method can sometimes be sufficiently inaccurate to cause the optimization to fail.

  6. Comparison between the basic least squares and the Bayesian approach for elastic constants identification

    Science.gov (United States)

    Gogu, C.; Haftka, R.; LeRiche, R.; Molimard, J.; Vautrin, A.; Sankar, B.

    2008-11-01

    The basic formulation of the least squares method, based on the L2 norm of the misfit, is still widely used today for identifying elastic material properties from experimental data. An alternative statistical approach is the Bayesian method. We seek here situations with significant difference between the material properties found by the two methods. For a simple three bar truss example we illustrate three such situations in which the Bayesian approach leads to more accurate results: different magnitude of the measurements, different uncertainty in the measurements and correlation among measurements. When all three effects add up, the Bayesian approach can have a large advantage. We then compared the two methods for identification of elastic constants from plate vibration natural frequencies.
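
    For a single-parameter linear example, the mechanics of the two estimators can be sketched as follows; the bar geometry, noise level, and prior are hypothetical, and in this well-conditioned case the two estimates nearly coincide, whereas the paper focuses on situations (differing measurement magnitudes, uncertainties, and correlations) where they diverge:

```python
import numpy as np

# Measurements of a response that is linear in the compliance c = 1/E:
# u_i = F_i * L / (A * E) + noise.  We identify E from noisy displacements.
rng = np.random.default_rng(3)
E_true, A, L = 70e9, 1e-4, 1.0
F = np.array([1e3, 2e3, 3e3, 4e3, 5e3])
sigma_u = 2e-6                                    # measurement noise std (m)
u = F * L / (A * E_true) + rng.normal(0, sigma_u, F.size)

# Basic least squares (L2 norm of the misfit) on the linear model u = (F*L/A) * c.
X = F * L / A
c_ls = (X @ u) / (X @ X)
E_ls = 1.0 / c_ls

# Bayesian estimate: Gaussian prior on c, Gaussian likelihood -> Gaussian posterior.
c_prior_mean, c_prior_sd = 1.0 / 60e9, 0.5 / 60e9  # vague prior centred on 60 GPa
post_prec = 1.0 / c_prior_sd ** 2 + (X @ X) / sigma_u ** 2
c_post = (c_prior_mean / c_prior_sd ** 2 + (X @ u) / sigma_u ** 2) / post_prec
E_bayes = 1.0 / c_post

print(f"least squares: E = {E_ls / 1e9:.1f} GPa, Bayesian posterior mean: E = {E_bayes / 1e9:.1f} GPa")
```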

  7. A Monte Carlo Study on Multiple Output Stochastic Frontiers: Comparison of Two Approaches

    DEFF Research Database (Denmark)

    Henningsen, Geraldine; Henningsen, Arne; Jensen, Uwe

    In the estimation of multiple output technologies in a primal approach, the main question is how to handle the multiple outputs. Often an output distance function is used, where the classical approach is to exploit its homogeneity property by selecting one output quantity as the dependent variable, dividing all other output quantities by the selected output quantity, and using these ratios as regressors (OD). Another approach is the stochastic ray production frontier (SR), which transforms the output quantities into their Euclidean distance as the dependent variable and their polar coordinates ... of both specifications for the case of a Translog output distance function with respect to different common statistical problems as well as problems arising as a consequence of zero values in the output quantities. Although our results partly show clear reactions to statistical misspecifications ...

  8. Comparison of two approaches for differentiating full-field data in solid mechanics

    International Nuclear Information System (INIS)

    Avril, Stéphane; Feissel, Pierre; Villon, Pierre; Pierron, Fabrice

    2010-01-01

    In this study, the issue of reconstructing the gradients of noisy full-field data is addressed within the framework of solid mechanics. Two approaches are considered, a global one based on finite element approximation (FEA) and a local one based on diffuse approximation (DA). For both approaches, it is proposed to monitor the filtering effect locally in order to adapt the uncertainty to the local signal-to-noise ratio. Both approaches are applied to a case study which is commonly considered difficult in solid mechanics (an open-hole tensile test on a composite laminate). Both DA and FEA are successful in detecting local subsurface damage from the measured noisy displacement fields. Indications are also provided on the relative performance of DA and FEA. It is shown that DA is more robust, but the downside is that it is also more CPU-time consuming.
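
    The contrast between a global and a local reconstruction of gradients from noisy field data can be illustrated with a hedged one-dimensional sketch (not the FEA or DA implementations of the study): a single smooth basis is fitted over the whole field and differentiated, versus a moving local quadratic fit differentiated at its centre. The signal, window size and noise level below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# 1D analogue of a noisy measured displacement field u(x); we want the strain du/dx.
x = np.linspace(0.0, 1.0, 201)
u_true = 0.02 * np.sin(2 * np.pi * x)
u_meas = u_true + rng.normal(0.0, 5e-4, x.size)

# "Global" approach: fit one smooth basis over the whole field, then differentiate it.
coeffs = np.polynomial.chebyshev.chebfit(x, u_meas, deg=8)
strain_global = np.polynomial.chebyshev.chebval(x, np.polynomial.chebyshev.chebder(coeffs))

# "Local" approach: moving least-squares quadratic fit in a small window (diffuse-approximation flavour).
def local_derivative(x, u, half_width=10):
    dudx = np.empty_like(u)
    for i in range(x.size):
        lo, hi = max(0, i - half_width), min(x.size, i + half_width + 1)
        p = np.polyfit(x[lo:hi] - x[i], u[lo:hi], deg=2)  # local quadratic, centred at x[i]
        dudx[i] = p[1]                                    # derivative at the centre point
    return dudx

strain_local = local_derivative(x, u_meas)
strain_true = 0.02 * 2 * np.pi * np.cos(2 * np.pi * x)
for name, s in [("global", strain_global), ("local", strain_local)]:
    print(name, "RMS error:", np.sqrt(np.mean((s - strain_true) ** 2)))
```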

  9. Comparison of modeling approaches to prioritize chemicals based on estimates of exposure and exposure potential.

    Science.gov (United States)

    Mitchell, Jade; Arnot, Jon A; Jolliet, Olivier; Georgopoulos, Panos G; Isukapalli, Sastry; Dasgupta, Surajit; Pandian, Muhilan; Wambaugh, John; Egeghy, Peter; Cohen Hubal, Elaine A; Vallero, Daniel A

    2013-08-01

    While only limited data are available to characterize the potential toxicity of over 8 million commercially available chemical substances, there is even less information available on the exposure and use-scenarios that are required to link potential toxicity to human and ecological health outcomes. Recent improvements and advances such as high throughput data gathering, high performance computational capabilities, and predictive chemical inherency methodology make this an opportune time to develop an exposure-based prioritization approach that can systematically utilize and link the asymmetrical bodies of knowledge for hazard and exposure. In response to the US EPA's need to develop novel approaches and tools for rapidly prioritizing chemicals, a "Challenge" was issued to several exposure model developers to aid the understanding of current systems in a broader sense and to assist the US EPA's effort to develop an approach comparable to other international efforts. A common set of chemicals was prioritized under each current approach. The results are presented herein along with a comparative analysis of the rankings of the chemicals based on metrics of exposure potential or actual exposure estimates. The analysis illustrates the similarities and differences across the domains of information incorporated in each modeling approach. The overall findings indicate a need to reconcile exposures from diffuse, indirect sources (far-field) with exposures from directly applied chemicals in consumer products or resulting from the presence of a chemical in a microenvironment like a home or vehicle. Additionally, the exposure scenario, including the mode of entry into the environment (i.e. through air, water or sediment), appears to be an important determinant of the level of agreement between modeling approaches. Copyright © 2013 Elsevier B.V. All rights reserved.
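
    A comparative analysis of rankings of this kind typically reduces to rank-correlation statistics. The hedged sketch below compares two hypothetical prioritization scores for the same chemicals with Spearman and Kendall coefficients and flags the chemicals whose rank shifts most; the chemical names and scores are invented for illustration.

```python
import numpy as np
from scipy.stats import spearmanr, kendalltau

# Hypothetical exposure scores for the same chemicals from two prioritization models.
chemicals = ["chem_A", "chem_B", "chem_C", "chem_D", "chem_E", "chem_F"]
model_near_field = np.array([9.1, 7.4, 6.8, 3.2, 1.5, 0.4])   # e.g. consumer-product driven
model_far_field = np.array([2.0, 6.9, 7.5, 3.0, 4.8, 0.2])    # e.g. environmental-release driven

rho, p_rho = spearmanr(model_near_field, model_far_field)
tau, p_tau = kendalltau(model_near_field, model_far_field)
print(f"Spearman rho = {rho:.2f} (p = {p_rho:.2f}), Kendall tau = {tau:.2f} (p = {p_tau:.2f})")

# Chemicals whose rank shifts most between the two approaches are candidates for closer review.
rank_shift = np.argsort(np.argsort(-model_near_field)) - np.argsort(np.argsort(-model_far_field))
for name, shift in zip(chemicals, rank_shift):
    print(name, "rank shift:", shift)
```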

  11. Comparison of ensemble post-processing approaches, based on empirical and dynamical error modelisation of rainfall-runoff model forecasts

    Science.gov (United States)

    Chardon, J.; Mathevet, T.; Le Lay, M.; Gailhard, J.

    2012-04-01

    In the context of a national energy company (EDF: Electricité de France), hydro-meteorological forecasts are necessary to ensure the safety and security of installations, meet environmental standards and improve water resources management and decision making. Hydrological ensemble forecasts allow a better representation of meteorological and hydrological forecast uncertainties and improve the human expertise of hydrological forecasts, which is essential to synthesize available information coming from different meteorological and hydrological models and human experience. An operational hydrological ensemble forecasting chain has been developed at EDF since 2008 and has been used since 2010 on more than 30 watersheds in France. This ensemble forecasting chain is characterized by ensemble pre-processing (rainfall and temperature) and post-processing (streamflow), where substantial human expertise is solicited. The aim of this paper is to compare two hydrological ensemble post-processing methods developed at EDF in order to improve ensemble forecast reliability (similar to Montanari & Brath, 2004; Schaefli et al., 2007). The aim of the post-processing methods is to dress hydrological ensemble forecasts with hydrological model uncertainties, based on perfect forecasts. The first method (called the empirical approach) is based on a statistical modelling of the empirical error of perfect forecasts, using streamflow sub-samples stratified by quantile class and lead time. The second method (called the dynamical approach) is based on streamflow sub-samples stratified by quantile class, streamflow variation and lead time. On a set of 20 watersheds used for operational forecasts, results show that both approaches are necessary to ensure a good post-processing of the hydrological ensemble, allowing a clear improvement of the reliability, skill and sharpness of ensemble forecasts. The comparison of the empirical and dynamical approaches shows the limits of the empirical approach, which is not able to take into account hydrological
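
    The core of such a quantile-class dressing scheme can be sketched in a few lines. The example below is a hedged illustration only, not EDF's operational post-processor: archived errors of "perfect" simulations are stratified by flow quantile class, and a new forecast is dressed with errors drawn from its class. All distributions and values are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical archive of "perfect forecast" simulations and observed flows (m3/s).
sim_archive = rng.gamma(shape=2.0, scale=50.0, size=5000)
obs_archive = sim_archive * rng.lognormal(mean=0.0, sigma=0.15, size=5000)  # multiplicative model error

# Stratify archived errors by quantile class of the simulated flow.
edges = np.quantile(sim_archive, [0.0, 0.25, 0.5, 0.75, 1.0])
classes = np.clip(np.digitize(sim_archive, edges[1:-1]), 0, 3)
error_by_class = [obs_archive[classes == c] / sim_archive[classes == c] for c in range(4)]

def dress(forecast, n_members=50):
    """Dress a raw forecast value with empirical errors from its quantile class."""
    c = int(np.clip(np.digitize([forecast], edges[1:-1])[0], 0, 3))
    return forecast * rng.choice(error_by_class[c], size=n_members)

print(np.percentile(dress(120.0), [10, 50, 90]))  # dressed predictive interval for one forecast
```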

  12. A comparison of labeling and label-free mass spectrometry-based proteomics approaches.

    Science.gov (United States)

    Patel, Vibhuti J; Thalassinos, Konstantinos; Slade, Susan E; Connolly, Joanne B; Crombie, Andrew; Murrell, J Colin; Scrivens, James H

    2009-07-01

    The proteome of the recently discovered bacterium Methylocella silvestris has been characterized using three profiling and comparative proteomics approaches. The organism has been grown on two different substrates enabling variations in protein expression to be identified. The results obtained using the experimental approaches have been compared with respect to number of proteins identified, confidence in identification, sequence coverage and agreement of regulated proteins. The sample preparation, instrumental time and sample loading requirements of the differing experiments are compared and discussed. A preliminary screen of the protein regulation results for biological significance has also been performed.

  13. Avian fatalities at wind energy facilities in North America: A comparison of recent approaches

    Science.gov (United States)

    Johnson, Douglas H.; Loss, Scott R.; Smallwood, K. Shawn; Erickson, Wallace P.

    2016-01-01

    Three recent publications have estimated the number of birds killed each year by wind energy facilities at 2012 build-out levels in the United States. The 3 publications differ in scope, methodology, and resulting estimates. We compare and contrast characteristics of the approaches used in the publications. In addition, we describe decisions made in obtaining the estimates that were produced. Despite variation in the 3 approaches, resulting estimates were reasonably similar; about a quarter- to a half-million birds are killed per year by colliding with wind turbines.

  14. Performance comparisons between PCA-EA-LBG and PCA-LBG-EA approaches in VQ codebook generation for image compression

    Science.gov (United States)

    Tsai, Jinn-Tsong; Chou, Ping-Yi; Chou, Jyh-Horng

    2015-11-01

    The aim of this study is to generate vector quantisation (VQ) codebooks by integrating the principal component analysis (PCA) algorithm, the Linde-Buzo-Gray (LBG) algorithm, and evolutionary algorithms (EAs). The EAs include the genetic algorithm (GA), particle swarm optimisation (PSO), honey bee mating optimisation (HBMO), and the firefly algorithm (FF). The study provides performance comparisons between PCA-EA-LBG and PCA-LBG-EA approaches. The PCA-EA-LBG approaches contain PCA-GA-LBG, PCA-PSO-LBG, PCA-HBMO-LBG, and PCA-FF-LBG, while the PCA-LBG-EA approaches contain PCA-LBG, PCA-LBG-GA, PCA-LBG-PSO, PCA-LBG-HBMO, and PCA-LBG-FF. All training vectors of the test images are grouped according to PCA. The PCA-EA-LBG approaches use the vectors grouped by PCA as initial individuals, and the best solution gained by the EAs is given to LBG to discover a codebook. The PCA-LBG approach uses PCA to select vectors as initial individuals for LBG to find a codebook. The PCA-LBG-EA approaches use the final result of PCA-LBG as an initial individual for the EAs to find a codebook. The search scheme in PCA-EA-LBG first performs a global search and then applies a local search, while PCA-LBG-EA first performs a local search and then a global search. The results verify that PCA-EA-LBG indeed gains superior results compared to PCA-LBG-EA, because PCA-EA-LBG explores a global area to find a solution, and then exploits a better one from the local area of the solution. Furthermore, the proposed PCA-EA-LBG approaches to designing VQ codebooks outperform existing approaches shown in the literature.
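
    As a point of reference for the building blocks named above, the following hedged sketch implements only the plain LBG (generalized Lloyd) codebook design on random training vectors; the PCA grouping and the evolutionary search stages of the compared approaches are not reproduced, and all sizes and data are hypothetical.

```python
import numpy as np

def lbg_codebook(vectors, size, iters=20, eps=1e-3):
    """Minimal Linde-Buzo-Gray codebook design by iterative splitting and Lloyd updates."""
    codebook = vectors.mean(axis=0, keepdims=True)          # start from the global centroid
    while codebook.shape[0] < size:
        codebook = np.vstack([codebook * (1 + eps), codebook * (1 - eps)])  # split each codeword
        for _ in range(iters):
            # Assign each training vector to its nearest codeword (squared Euclidean distance).
            d = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
            nearest = d.argmin(axis=1)
            for k in range(codebook.shape[0]):               # Lloyd update of non-empty cells
                members = vectors[nearest == k]
                if members.size:
                    codebook[k] = members.mean(axis=0)
    return codebook

rng = np.random.default_rng(1)
training = rng.normal(size=(1000, 16))   # e.g. 4x4 image blocks flattened to 16-dim vectors
cb = lbg_codebook(training, size=8)
print(cb.shape)                          # (8, 16)
```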

  15. The South Carolina bridge-scour envelope curves

    Science.gov (United States)

    Benedict, Stephen T.; Feaster, Toby D.; Caldwell, Andral W.

    2016-09-30

    The U.S. Geological Survey, in cooperation with the South Carolina Department of Transportation, conducted a series of three field investigations to evaluate historical, riverine bridge scour in the Piedmont and Coastal Plain regions of South Carolina. These investigations included data collected at 231 riverine bridges, which led to the development of bridge-scour envelope curves for clear-water and live-bed components of scour. The application and limitations of the South Carolina bridge-scour envelope curves were documented in four reports, each report addressing selected components of bridge scour. The current investigation (2016) synthesizes the findings of these previous reports into a guidance manual providing an integrated procedure for applying the envelope curves. Additionally, the investigation provides limited verification for selected bridge-scour envelope curves by comparing them to field data collected outside of South Carolina from previously published sources. Although the bridge-scour envelope curves have limitations, they are useful supplementary tools for assessing the potential for scour at riverine bridges in South Carolina.
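
    In this context an envelope curve is simply an upper bound drawn over the scatter of observed scour against an explanatory variable. The hedged sketch below derives such a bound from hypothetical field data by taking the maximum observed scour within bins of flow velocity and enforcing monotonicity; it does not reproduce the published South Carolina curves or their explanatory variables.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical field data: approach-flow velocity (m/s) vs observed scour depth (m).
velocity = rng.uniform(0.2, 3.0, 200)
scour = 0.8 * velocity * rng.uniform(0.1, 1.0, 200)   # scatter below some physical upper bound

# Envelope curve: maximum observed scour within each velocity bin, made non-decreasing.
bins = np.linspace(0.2, 3.0, 15)
idx = np.digitize(velocity, bins) - 1
env = np.array([scour[idx == b].max() if np.any(idx == b) else np.nan for b in range(len(bins) - 1)])
env = np.fmax.accumulate(np.nan_to_num(env))           # enforce a monotone upper bound
centers = 0.5 * (bins[:-1] + bins[1:])
for v, e in zip(centers, env):
    print(f"v = {v:.2f} m/s  envelope scour = {e:.2f} m")
```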

  16. Solar envelope concepts: moderate density building applications. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Knowles, R.L.; Berry, R.D.

    1980-04-01

    Solar energy utilization in urban areas requires public guarantees that all property owners have direct access to the sun. The study examines the implications of this premise in relation to the need for cities to also encourage or accommodate rebuilding and future development. The public policy mechanism for guaranteeing solar access is conceptualized as a solar zoning envelope that allows the largest possible building bulk on a land parcel without shadowing neighboring properties during specified times. Step-by-step methods for generating solar envelopes are described with extensive drawings, showing a variety of urban platting and lot configurations. Development and design possibilities are examined on a selected set of Los Angeles sites with typically diverse urban characteristics. Envelope attributes suitable for encouraging moderate-density commercial and residential building are examined in the context of two hypothetical but realistic development programs: one for speculative office buildings and one for condominium housing. Numerous illustrations of envelope forms and prototypical building designs are provided. The results of development simulation studies on all test sites are tabulated to show building bulk, density, land-coverage and open space characteristics obtainable under the hypothesized envelopes.

  17. Comparison of the Modeling Approach between Membrane Bioreactor and Conventional Activated Sludge Processes

    DEFF Research Database (Denmark)

    Jiang, Tao; Sin, Gürkan; Spanjers, Henri

    2009-01-01

    Activated sludge models (ASM) have been developed and largely applied in conventional activated sludge (CAS) systems. The applicability of ASM to model membrane bioreactors (MBR) and the differences in modeling approaches have not been studied in detail. A laboratory-scale MBR was modeled using ASM...

  18. Continuous Training and Wages: An Empirical Analysis Using a Comparison-Group Approach

    Science.gov (United States)

    Gorlitz, Katja

    2011-01-01

    Using German linked employer-employee data, this paper investigates the short-term impact of on-the-job training on wages. The applied estimation approach was first introduced by Leuven and Oosterbeek (2008). Wages of employees who intended to participate in training but did not do so because of a random event are compared to wages of training…

  19. Risk-based microbiological criteria for Campylobacter in broiler meat: A comparison of two approaches

    DEFF Research Database (Denmark)

    Nauta, Maarten; Andersen, Jens Kirk; Tuominen, Pirkko

    2015-01-01

    Risk-based microbiological criteria can offer a tool to control Campylobacter in the broiler meat production chain. Recently two approaches have been applied to derive such criteria and to analyse their potential impact in terms of human health risk reduction: the risk-based version...

  20. A Comparison of Online and Face-to-Face Approaches to Teaching Introduction to American Government

    Science.gov (United States)

    Bolsen, Toby; Evans, Michael; Fleming, Anna McCaghren

    2016-01-01

    This article reports results from a large study comparing four different approaches to teaching Introduction to American Government: (1) traditional, a paper textbook with 100% face-to-face lecture-style teaching; (2) breakout, a paper textbook with 50% face-to-face lecture-style teaching and 50% face-to-face small-group breakout discussion…

  1. Distinguishing Continuous and Discrete Approaches to Multilevel Mixture IRT Models: A Model Comparison Perspective

    Science.gov (United States)

    Zhu, Xiaoshu

    2013-01-01

    The current study introduced a general modeling framework, multilevel mixture IRT (MMIRT) which detects and describes characteristics of population heterogeneity, while accommodating the hierarchical data structure. In addition to introducing both continuous and discrete approaches to MMIRT, the main focus of the current study was to distinguish…

  2. Learning Biology through Innovative Curricula: A Comparison of Game- and Nongame-Based Approaches

    Science.gov (United States)

    Sadler, Troy D.; Romine, William L.; Menon, Deepika; Ferdig, Richard E.; Annetta, Leonard

    2015-01-01

    This study explored student learning in the context of innovative biotechnology curricula and the effects of gaming as a central element of the learning experience. The quasi-experimentally designed study compared learning outcomes between two curricular approaches: One built around a computer-based game, and the other built around a narrative…

  3. Psychiatric Cultures Compared : Psychiatry and Mental Health Care in the Twentieth Century: Comparisons and Approaches

    NARCIS (Netherlands)

    Gijswijt-Hofstra, Marijke; Oosterhuis, Harry; Vijselaar, Joost; Freeman, Hugh

    2005-01-01

    The history of mental health care in the twentieth century is a relatively uncharted territory. Exemplifying a new emphasis on the comparative approach, this volume offers overviews of various national psychiatric cultures and explores new research subjects. By confronting Dutch psychiatry with

  4. Detecting autologous blood transfusions: a comparison of three passport approaches and four blood markers

    DEFF Research Database (Denmark)

    Mørkeberg, J; Sharpe, K; Belhage, B

    2011-01-01

    Blood passport has been suggested as an indirect tool to detect various kinds of blood manipulations. Autologous blood transfusions are currently undetectable, and the objective of this study was to examine the sensitivities of different blood markers and blood passport approaches in order to det...

  5. A Critical Comparison of Transformation and Deep Approach Theories of Learning

    Science.gov (United States)

    Howie, Peter; Bagnall, Richard

    2015-01-01

    This paper reports a critical comparative analysis of two popular and significant theories of adult learning: the transformation and the deep approach theories of learning. These theories are operative in different educational sectors, are significant, respectively, in each, and they may be seen as both touching on similar concerns with learning…

  6. Comparison of Two Music Training Approaches on Music and Speech Perception in Cochlear Implant Users

    NARCIS (Netherlands)

    Fuller, Christina D; Galvin, John J; Maat, Bert; Başkent, Deniz; Free, Rolien H

    2018-01-01

    In normal-hearing (NH) adults, long-term music training may benefit music and speech perception, even when listening to spectro-temporally degraded signals as experienced by cochlear implant (CI) users. In this study, we compared two different music training approaches in CI users and their effects

  7. A comparison between prescriptive- and performance-based approaches in fire safety design of structures

    DEFF Research Database (Denmark)

    Budny, Iwona; Giuliani, Luisa

    2010-01-01

    methodology of the performance-based fire design approach is considered, with the aid of computer-aided simulations of the main frame of the car park. Nonlinear analyses, with respect to thermally induced effects and with emphasis on the collapse modality, are carried out on a frame of the considered...

  8. Comparison between AGC and a tuningless LFC approach based on direct observation of DERs

    DEFF Research Database (Denmark)

    Prostejovsky, Alexander Maria; Marinelli, Mattia

    2017-01-01

    , and the resulting reduction of available inertia. In this paper, we propose a tuningless Load-Frequency Control (LFC) approach able to cope with the changing dynamics of electric power grids. Harnessing the possibilities of modern monitoring and communication means, the so-called Direct Load-Frequency Control (DLFC...

  9. International Students' Motivation and Learning Approach: A Comparison with Local Students

    Science.gov (United States)

    Chue, Kah Loong; Nie, Youyan

    2016-01-01

    Psychological factors contribute to motivation and learning for international students as much as teaching strategies. 254 international students and 144 local students enrolled in a private education institute were surveyed regarding their perception of psychological needs support, their motivation and learning approach. The results from this…

  10. A comparison of two different approaches for mapping potential ozone damage to vegetation. A model study

    International Nuclear Information System (INIS)

    Simpson, D.; Ashmore, M.R.; Emberson, L.; Tuovinen, J.-P.

    2007-01-01

    Two very different types of approaches are currently in use today for indicating risk of ozone damage to vegetation in Europe. One approach is the so-called AOTX (accumulated exposure over threshold of X ppb) index, which is based upon ozone concentrations only. The second type of approach entails an estimate of the amount of ozone entering via the stomates of vegetation, the AFstY approach (accumulated stomatal flux over threshold of Y nmol m⁻² s⁻¹). The EMEP chemical transport model is used to map these different indicators of ozone damage across Europe, for two illustrative vegetation types, wheat and beech forests. The results show that exceedences of critical levels for either type of indicator are widespread, but that the indicators give very different spatial patterns across Europe. Model simulations for year 2020 scenarios suggest reductions in risks of vegetation damage whichever indicator is used, but suggest that AOT40 is much more sensitive to emission control than AFstY values. - Model calculations of AOT40 and AFstY show very different spatial variations in the risks of ozone damage to vegetation

  11. A Comparison of Three Approaches to Correct for Direct and Indirect Range Restrictions: A Simulation Study

    Science.gov (United States)

    Pfaffel, Andreas; Schober, Barbara; Spiel, Christiane

    2016-01-01

    A common methodological problem in the evaluation of the predictive validity of selection methods, e.g. in educational and employment selection, is that the correlation between predictor and criterion is biased. Thorndike's (1949) formulas are commonly used to correct for this biased correlation. An alternative approach is to view the selection…

  12. Post-closure biosphere assessment modelling: comparison of complex and more stylised approaches.

    Science.gov (United States)

    Walke, Russell C; Kirchner, Gerald; Xu, Shulan; Dverstorp, Björn

    2015-10-01

    Geological disposal facilities are the preferred option for high-level radioactive waste, due to their potential to provide isolation from the surface environment (biosphere) on very long timescales. Assessments need to strike a balance between stylised models and more complex approaches that draw more extensively on site-specific information. This paper explores the relative merits of complex versus more stylised biosphere models in the context of a site-specific assessment. The more complex biosphere modelling approach was developed by the Swedish Nuclear Fuel and Waste Management Co (SKB) for the Forsmark candidate site for a spent nuclear fuel repository in Sweden. SKB's approach is built on a landscape development model, whereby radionuclide releases to distinct hydrological basins/sub-catchments (termed 'objects') are represented as they evolve through land rise and climate change. Each of seventeen of these objects is represented with more than 80 site-specific parameters, of which about 22 are time-dependent, resulting in over 5000 input values per object. The more stylised biosphere models developed for this study represent releases to individual ecosystems without environmental change and include the most plausible transport processes. In the context of regulatory review of the landscape modelling approach adopted in the SR-Site assessment in Sweden, the more stylised representation has helped to build understanding of the more complex modelling approaches by providing bounding results, checking the reasonableness of the more complex modelling, highlighting uncertainties introduced through conceptual assumptions and helping to quantify the conservatisms involved. The more stylised biosphere models are also shown to be capable of reproducing the results of more complex approaches. A major recommendation is that biosphere assessments need to justify the degree of complexity in modelling approaches as well as simplifying and conservative assumptions. In light of

  13. Comparison of Two Music Training Approaches on Music and Speech Perception in Cochlear Implant Users.

    Science.gov (United States)

    Fuller, Christina D; Galvin, John J; Maat, Bert; Başkent, Deniz; Free, Rolien H

    2018-01-01

    In normal-hearing (NH) adults, long-term music training may benefit music and speech perception, even when listening to spectro-temporally degraded signals as experienced by cochlear implant (CI) users. In this study, we compared two different music training approaches in CI users and their effects on speech and music perception, as it remains unclear which approach to music training might be best. The approaches differed in terms of music exercises and social interaction. For the pitch/timbre group, melodic contour identification (MCI) training was performed using computer software. For the music therapy group, training involved face-to-face group exercises (rhythm perception, musical speech perception, music perception, singing, vocal emotion identification, and music improvisation). For the control group, training involved group nonmusic activities (e.g., writing, cooking, and woodworking). Training consisted of weekly 2-hr sessions over a 6-week period. Speech intelligibility in quiet and noise, vocal emotion identification, MCI, and quality of life (QoL) were measured before and after training. The different training approaches appeared to offer different benefits for music and speech perception. Training effects were observed within-domain (better MCI performance for the pitch/timbre group), with little cross-domain transfer of music training (emotion identification significantly improved for the music therapy group). While training had no significant effect on QoL, the music therapy group reported better perceptual skills across training sessions. These results suggest that more extensive and intensive training approaches that combine pitch training with the social aspects of music therapy may further benefit CI users.

  14. A comparison of direct aspiration versus stent retriever as a first approach ('COMPASS'): protocol.

    Science.gov (United States)

    Turk, Aquilla S; Siddiqui, Adnan H; Mocco, J

    2018-02-20

    Acute ischemic stroke is a potentially devastating condition and leading cause of morbidity and mortality, affecting an estimated 800 000 people per year in the USA. The natural history of untreated or unrevascularized large vessel occlusions in acute stroke patients results in mortality rates approaching 30%, with only 25% achieving good neurologic outcomes at 90 days. Recently, data have demonstrated that early endovascular recanalization of large vessel occlusions results in better outcomes than medical therapy alone. However, the majority of patients in these studies were treated with a stent retriever based approach. The purpose of COMPASS is to evaluate whether patients treated with a direct aspiration first pass (ADAPT) approach have non-inferior functional outcomes to those treated with a stent retriever as the firstline (SRFL) approach. All patients who meet the inclusion and exclusion criteria and consent to participate will be enrolled at participating centers. Treatment will be randomly assigned by a central web based system in a 1:1 manner to treatment with either ADAPT or SRFL thrombectomy. Statistical methodology is prespecified with details available in the statistical analysis plan. The trial recently completed enrollment, and data collection/verification is ongoing. The final results will be made available on completion of enrollment and follow-up. This paper details the design of the COMPASS trial, a randomized, blinded adjudicator, concurrent, controlled trial of patients treated with either ADAPT or SRFL approaches in order to evaluate whether ADAPT results in non-inferior functional outcome. NCT02466893, Results. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  15. A comparison of regional flood frequency analysis approaches in a simulation framework

    Science.gov (United States)

    Ganora, D.; Laio, F.

    2016-07-01

    Regional frequency analysis (RFA) is a well-established methodology to provide an estimate of the flood frequency curve at ungauged (or scarcely gauged) sites. Different RFA approaches exist, depending on the way the information is transferred to the site of interest, but it is not clear in the literature whether a specific method systematically outperforms the others. The aim of this study is to provide a framework for carrying out the intercomparison by building a virtual environment based on synthetically generated data. The considered regional approaches include: (i) a unique regional curve for the whole region; (ii) a multiple-region model where homogeneous subregions are determined through cluster analysis; (iii) a Region-of-Influence model which defines a homogeneous subregion for each site; (iv) a spatially smooth estimation procedure where the parameters of the regional model vary continuously in space. Virtual environments are generated considering different patterns of heterogeneity, including step changes and smooth variations. If the region is heterogeneous, with the parent distribution changing continuously within the region, the spatially smooth regional approach outperforms the others, with overall errors 10-50% lower than the other methods. In the case of a step change, the spatially smooth and clustering procedures perform similarly if the heterogeneity is moderate, while clustering procedures work better when the step change is severe. To extend our findings, an extensive sensitivity analysis has been performed to investigate the effect of sample length, number of virtual stations, return period of the predicted quantile, variability of the scale parameter of the parent distribution, number of predictor variables and different parent distributions. Overall, the spatially smooth approach appears to be the most robust approach as its performances are more stable across different patterns of heterogeneity, especially when short records are

  16. Improved Detection of Vowel Envelope Frequency Following Responses Using Hotelling's T2 Analysis.

    Science.gov (United States)

    Vanheusden, Frederique J; Bell, Steven L; Chesnaye, Michael A; Simpson, David M

    2018-05-11

    were generated. Performance of the algorithms was assessed based on the number of sets for which a response could be detected at each SNR. In simulation studies, HT2_3F significantly outperformed the other algorithms when detecting a vowel stimulus in noise. For simulations containing responses only at a single frequency, HT2_3F performs worse compared with the other approaches applied in this study, as the additional frequencies included do not contain additional information. For recorded EEG data, HT2_MC showed a significantly higher response detection rate compared with MSC and FA-F-Test. Both HT2_MC and HT2_F0 also showed a significant reduction in detection time compared with the FA-F-Test algorithm. Comparisons between different electrode locations confirmed a higher number of detections for electrodes close to Cz compared to more peripheral locations. The HT2 method is more sensitive than FA-F-Test and MSC in detecting responses to complex stimuli because it allows detection of multiple frequencies (HT2_3F) and multiple EEG channels (HT2_MC) simultaneously. This effect was shown in simulation studies for HT2_3F and in EEG data for the HT2_MC algorithm. The spread in detection time across subjects is also lower for the HT2 algorithm, with a decision on the presence of an eFFR possible within 5 min.
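
    The general idea behind a frequency-domain Hotelling's T² detector can be sketched as follows: the real and imaginary parts of the FFT at one or more stimulus-related frequencies are pooled across epochs and tested against a zero mean. This is a hedged illustration under simplified assumptions (single channel, known frequencies, synthetic data), not the authors' HT2_3F or HT2_MC implementations.

```python
import numpy as np
from scipy.stats import f as f_dist

def hotelling_t2_detect(epochs, fs, freqs_hz):
    """One-sample Hotelling's T2 on [Re, Im] of the FFT at the given frequencies, pooled over epochs."""
    n_epochs, n_samples = epochs.shape
    spectra = np.fft.rfft(epochs, axis=1)
    bins = np.round(np.asarray(freqs_hz) * n_samples / fs).astype(int)
    X = np.column_stack([spectra[:, bins].real, spectra[:, bins].imag])  # (n_epochs, 2 * n_freqs)
    p = X.shape[1]
    mean = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    t2 = n_epochs * mean @ np.linalg.solve(cov, mean)                    # T2 against a zero mean
    f_stat = (n_epochs - p) / (p * (n_epochs - 1)) * t2
    p_value = f_dist.sf(f_stat, p, n_epochs - p)
    return t2, p_value

# Hypothetical data: 60 one-second epochs at 1 kHz with a weak 110 Hz response buried in noise.
rng = np.random.default_rng(0)
t = np.arange(1000) / 1000.0
epochs = rng.normal(0, 1, (60, 1000)) + 0.15 * np.sin(2 * np.pi * 110 * t)
print(hotelling_t2_detect(epochs, fs=1000, freqs_hz=[110]))
```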

  17. Performance comparison of novel WNN approach with RBFNN in navigation of autonomous mobile robotic agent

    Directory of Open Access Journals (Sweden)

    Ghosh Saradindu

    2016-01-01

    Full Text Available This paper addresses the performance comparison of a Radial Basis Function Neural Network (RBFNN) with a novel Wavelet Neural Network (WNN) for designing intelligent controllers for path planning of a mobile robot in an unknown environment. In the proposed WNN, different types of activation functions, such as Mexican Hat, Gaussian and Morlet wavelet functions, are used in the hidden nodes. The neural networks are trained by an intelligent supervised learning technique so that the robot follows a collision-free path in the unknown environment during navigation from different starting points to targets/goals. The efficiency of the two algorithms is compared using MATLAB simulations and an experimental setup with an Arduino Mega 2560 microcontroller, in terms of path length and time taken to reach the target as indicators of the accuracy of the network models.
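
    The wavelet activation functions named above are easy to state explicitly. The following hedged Python sketch (the study itself used MATLAB simulations) defines Gaussian, Mexican hat and Morlet activations and evaluates one radial/wavelet hidden layer for a hypothetical 2-D sensor input; the centres, widths and input values are invented for illustration.

```python
import numpy as np

# Candidate hidden-node activations for an RBFNN / WNN (r is the scaled distance to the node centre).
def gaussian(r):
    return np.exp(-0.5 * r**2)

def mexican_hat(r):
    return (1.0 - r**2) * np.exp(-0.5 * r**2)

def morlet(r, w0=5.0):
    return np.cos(w0 * r) * np.exp(-0.5 * r**2)

def hidden_layer(x, centres, widths, activation):
    """Forward pass of one radial/wavelet hidden layer for a single input vector x."""
    r = np.linalg.norm(x - centres, axis=1) / widths
    return activation(r)

# Hypothetical example: 2-D sensor input (e.g. normalized obstacle distances), 4 hidden nodes.
centres = np.array([[0.2, 0.2], [0.2, 0.8], [0.8, 0.2], [0.8, 0.8]])
widths = np.full(4, 0.3)
x = np.array([0.25, 0.7])
for act in (gaussian, mexican_hat, morlet):
    print(act.__name__, hidden_layer(x, centres, widths, act))
```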

  18. Common Envelope Light Curves. I. Grid-code Module Calibration

    Energy Technology Data Exchange (ETDEWEB)

    Galaviz, Pablo; Marco, Orsola De; Staff, Jan E.; Iaconi, Roberto [Department of Physics and Astronomy, Macquarie University, Sydney, NSW (Australia); Passy, Jean-Claude, E-mail: Pablo.Galaviz@me.com [Argelander-Institut für Astronomie, Auf dem Hügel 71, D-53121 Bonn (Germany)

    2017-04-01

    The common envelope (CE) binary interaction occurs when a star transfers mass onto a companion that cannot fully accrete it. The interaction can lead to a merger of the two objects or to a close binary. The CE interaction is the gateway of all evolved compact binaries, all stellar mergers, and likely many of the stellar transients witnessed to date. CE simulations are needed to understand this interaction and to interpret stars and binaries thought to be the byproduct of this stage. At this time, simulations are unable to reproduce the few observational data available and several ideas have been put forward to address their shortcomings. The need for more definitive simulation validation is pressing and is already being fulfilled by observations from time-domain surveys. In this article, we present an initial method and its implementation for post-processing grid-based CE simulations to produce the light curve so as to compare simulations with upcoming observations. Here we implemented a zeroth order method to calculate the light emitted from CE hydrodynamic simulations carried out with the 3D hydrodynamic code Enzo used in unigrid mode. The code implements an approach for the computation of luminosity in both optically thick and optically thin regimes and is tested using the first 135 days of the CE simulation of Passy et al., where a 0.8 M⊙ red giant branch star interacts with a 0.6 M⊙ companion. This code is used to highlight two large obstacles that need to be overcome before realistic light curves can be calculated. We explain the nature of these problems and the attempted solutions and approximations in full detail to enable the next step to be identified and implemented. We also discuss our simulation in relation to recent data of transients identified as CE interactions.

  19. Assessment of academic departments efficiency using data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Salah R. Agha

    2011-07-01

    Full Text Available Purpose: In this age of knowledge economy, universities play an important role in the development of a country. As government subsidies to universities have been decreasing, more efficient use of resources becomes important for university administrators. This study evaluates the relative technical efficiencies of academic departments at the Islamic University in Gaza (IUG) during the years 2004-2006. Design/methodology/approach: This study applies Data Envelopment Analysis (DEA) to assess the relative technical efficiency of the academic departments. The inputs are operating expenses, credit hours and training resources, while the outputs are number of graduates, promotions and public service activities. The potential improvements and super efficiency are computed for inefficient and efficient departments respectively. Further, multiple linear regression is used to develop a relationship between super efficiency and the input and output variables. Findings: Results show that the average efficiency score is 68.5% and that there are 10 efficient departments out of the 30 studied. It is noted that departments in the faculty of science, engineering and information technology have to greatly reduce their laboratory expenses. The department of economics and finance was found to have the highest super efficiency score among the efficient departments. Finally, it was found that promotions have the greatest contribution to the super efficiency scores, while public service activities come next. Research limitations/implications: The paper focuses only on academic departments at a single university. Further, DEA is deterministic in nature. Practical implications: The findings offer insights on the inputs and outputs that significantly contribute to efficiencies so that inefficient departments can focus on these factors. Originality/value: Prior studies have used only one type of DEA (BCC) and they did not explicitly answer the question posed by the inefficient
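
    For reference, a standard input-oriented CCR DEA model can be solved per department with a small linear program. The sketch below is a hedged, generic illustration with made-up inputs and outputs; it is not the IUG data and does not implement the BCC or super-efficiency models used in the study.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input_oriented(X, Y):
    """Input-oriented CCR efficiency for each DMU. X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    scores = np.empty(n)
    for o in range(n):
        c = np.r_[1.0, np.zeros(n)]                      # minimize theta
        # Inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        b_in = np.zeros(m)
        # Outputs: -sum_j lambda_j * y_rj <= -y_ro  (composite output must reach own output)
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        b_out = -Y[o]
        res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                      bounds=[(None, None)] + [(0, None)] * n, method="highs")
        scores[o] = res.x[0]
    return scores

# Hypothetical departments: inputs = (operating expenses, credit hours), outputs = (graduates, service).
X = np.array([[100, 300], [120, 280], [90, 350], [150, 400]], dtype=float)
Y = np.array([[80, 10], [75, 14], [90, 8], [95, 20]], dtype=float)
print(np.round(dea_ccr_input_oriented(X, Y), 3))        # a score of 1.0 marks an efficient department
```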

  20. Predictive Utility of Personality Disorder in Depression: Comparison of Outcomes and Taxonomic Approach.

    Science.gov (United States)

    Newton-Howes, Giles; Mulder, Roger; Ellis, Pete M; Boden, Joseph M; Joyce, Peter

    2017-09-19

    There is debate around the best model for diagnosing personality disorder, both in terms of its relationship to the empirical data and clinical utility. Four randomized controlled trials examining various treatments for depression were analyzed at an individual patient level. Three different approaches to the diagnosis of personality disorder were analyzed in these patients. A total of 578 depressed patients were included in the analysis. Personality disorder, however measured, was of little predictive utility in the short term but added significantly to predictive modelling of medium-term outcomes, accounting for more than twice as much of the variance in social functioning outcome as depression psychopathology. Personality disorder assessment is of predictive utility with longer timeframes and when considering social outcomes as opposed to symptom counts. This utility is sufficiently great that there appears to be value in assessing personality; however, no particular approach outperforms any other.

  1. Comparison of different Kalman filter approaches in deriving time varying connectivity from EEG data.

    Science.gov (United States)

    Ghumare, Eshwar; Schrooten, Maarten; Vandenberghe, Rik; Dupont, Patrick

    2015-08-01

    Kalman filter approaches are widely applied to derive time varying effective connectivity from electroencephalographic (EEG) data. For multi-trial data, a classical Kalman filter (CKF), designed for the estimation of single trial data, can be implemented by trial-averaging the data or by averaging single trial estimates. A general linear Kalman filter (GLKF) provides an extension for multi-trial data. In this work, we studied the performance of the different Kalman filtering approaches for different values of signal-to-noise ratio (SNR), number of trials and number of EEG channels. We used a simulated model from which we calculated scalp recordings. From these recordings, we estimated cortical sources. Multivariate autoregressive model parameters and partial directed coherence were calculated for these estimated sources and compared with the ground truth. The results showed an overall superior performance of the GLKF except for low levels of SNR and low numbers of trials.
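
    The building block shared by these approaches is a Kalman filter that tracks time-varying autoregressive coefficients. The hedged sketch below tracks the drifting coefficient of a single first-order AR process for one channel and one trial; it is not the CKF or GLKF implementation of the study, and the noise variances are arbitrary tuning constants.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate y_t = a_t * y_{t-1} + noise with a slowly drifting AR coefficient a_t.
T = 500
a_true = 0.5 + 0.4 * np.sin(2 * np.pi * np.arange(T) / T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = a_true[t] * y[t - 1] + rng.normal(0, 0.1)

# Random-walk state model for a_t; scalar Kalman filter with observation y_t = a_t * y_{t-1} + v_t.
q, r = 1e-4, 0.1**2          # state and observation noise variances (tuning constants)
a_est, P = np.zeros(T), 1.0
for t in range(1, T):
    P = P + q                                # predict
    H = y[t - 1]                             # time-varying observation "matrix"
    K = P * H / (H * P * H + r)              # Kalman gain
    a_est[t] = a_est[t - 1] + K * (y[t] - H * a_est[t - 1])
    P = (1.0 - K * H) * P                    # update

print("mean abs tracking error:", np.mean(np.abs(a_est[50:] - a_true[50:])))
```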

  2. Robotic longitudinal pancreaticojejunostomy for chronic pancreatitis: Comparison of clinical outcomes and cost to the open approach.

    Science.gov (United States)

    Kirks, Russell C; Lorimer, Patrick D; Fruscione, Michael; Cochran, Allyson; Baker, Erin H; Iannitti, David A; Vrochides, Dionisios; Martinie, John B

    2017-09-01

    This study compares clinical and cost outcomes of robot-assisted laparoscopic (RAL) and open longitudinal pancreaticojejunostomy (LPJ) for chronic pancreatitis. Clinical and cost data were retrospectively compared between open and RAL LPJ performed at a single center from 2008-2015. Twenty-six patients underwent LPJ: 19 open and 7 RAL. Two robot-assisted cases converted to open were included in the open group for analysis. Patients undergoing RAL LPJ had less intraoperative blood loss, a shorter surgical length of stay, and lower medication costs. Operation supply cost was higher in the RAL group. No difference in hospitalization cost was found. Versus the open approach, RAL LPJ performed for chronic pancreatitis shortens hospitalization and reduces medication costs; hospitalization costs are equivalent. A higher operative cost for RAL LPJ is mitigated by a shorter hospitalization. Decreased morbidity and healthcare resource economy support use of the robotic approach for LPJ when appropriate. Copyright © 2017 John Wiley & Sons, Ltd.

  3. Alcohol intake and colorectal cancer: a comparison of approaches for including repeated measures of alcohol consumption

    DEFF Research Database (Denmark)

    Thygesen, Lau Caspar; Wu, Kana; Grønbaek, Morten

    2008-01-01

    BACKGROUND: In numerous studies, alcohol intake has been found to be positively associated with colorectal cancer risk. However, the majority of studies included only one exposure measurement, which may bias the results if long-term intake is relevant. METHODS: We compared different approaches for including repeated measures of alcohol intake among 47,432 US men enrolled in the Health Professionals Follow-up Study. Questionnaires including questions on alcohol intake had been completed in 1986, 1990, 1994, and 1998. The outcome was incident colorectal cancer during follow-up from 1986 to 2002. RESULTS: During follow-up, 868 members of the cohort experienced colorectal cancer. Baseline, updated, and cumulative average alcohol intakes were positively associated with colorectal cancer, with only minor differences among the approaches. These results support moderately increased risk for intake >30 g

  4. Neurobiological studies of risk assessment: A comparison of expected utility and mean-variance approaches

    OpenAIRE

    d'Acremont, M.; Bossaerts, Peter

    2008-01-01

    When modeling valuation under uncertainty, economists generally prefer expected utility because it has an axiomatic foundation, meaning that the resulting choices will satisfy a number of rationality requirements. In expected utility theory, values are computed by multiplying probabilities of each possible state of nature by the payoff in that state and summing the results. The drawback of this approach is that all state probabilities need to be dealt with separately, which becomes extremely ...
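
    The two valuation rules can be contrasted on a toy gamble. The hedged sketch below computes a CRRA expected-utility certainty equivalent and a mean-variance value for hypothetical payoffs and probabilities; the risk-aversion parameters are arbitrary choices for illustration and do not come from the study.

```python
import numpy as np

# A simple gamble: payoffs with their probabilities.
payoffs = np.array([50.0, 100.0, 200.0])
probs = np.array([0.3, 0.5, 0.2])

# Expected utility with CRRA utility u(x) = x^(1-g) / (1-g); the certainty equivalent inverts u.
def certainty_equivalent_eu(payoffs, probs, gamma=2.0):
    u = payoffs ** (1 - gamma) / (1 - gamma)
    eu = np.sum(probs * u)
    return ((1 - gamma) * eu) ** (1 / (1 - gamma))

# Mean-variance valuation: value = mean - 0.5 * risk_aversion * variance.
def value_mean_variance(payoffs, probs, risk_aversion=0.01):
    mean = np.sum(probs * payoffs)
    var = np.sum(probs * (payoffs - mean) ** 2)
    return mean - 0.5 * risk_aversion * var

print("EU certainty equivalent:", round(certainty_equivalent_eu(payoffs, probs), 2))
print("mean-variance value:    ", round(value_mean_variance(payoffs, probs), 2))
```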

  5. Sample preparation with solid phase microextraction and exhaustive extraction approaches: Comparison for challenging cases.

    Science.gov (United States)

    Boyacı, Ezel; Rodríguez-Lafuente, Ángel; Gorynski, Krzysztof; Mirnaghi, Fatemeh; Souza-Silva, Érica A; Hein, Dietmar; Pawliszyn, Janusz

    2015-05-11

    In chemical analysis, sample preparation is frequently considered the bottleneck of the entire analytical method. The success of the final method strongly depends on understanding the entire process of analysis of a particular type of analyte in a sample, namely: the physicochemical properties of the analytes (solubility, volatility, polarity etc.), the environmental conditions, and the matrix components of the sample. Various sample preparation strategies have been developed based on exhaustive or non-exhaustive extraction of analytes from matrices. Undoubtedly, amongst all sample preparation approaches, liquid extraction, including liquid-liquid (LLE) and solid phase extraction (SPE), are the most well-known, widely used, and commonly accepted methods by many international organizations and accredited laboratories. Both methods are well documented and there are many well defined procedures, which make them, at first sight, the methods of choice. However, many challenging tasks, such as complex matrix applications, on-site and in vivo applications, and determination of matrix-bound and free concentrations of analytes, are not easily attainable with these classical approaches for sample preparation. In the last two decades, the introduction of solid phase microextraction (SPME) has brought significant progress in the sample preparation area by facilitating on-site and in vivo applications, time weighted average (TWA) and instantaneous concentration determinations. Recently introduced matrix compatible coatings for SPME facilitate direct extraction from complex matrices and fill the gap in direct sampling from challenging matrices. Following introduction of SPME, numerous other microextraction approaches evolved to address limitations of the above mentioned techniques. There is not a single method that can be considered as a universal solution for sample preparation. This review aims to show the main advantages and limitations of the above mentioned sample

  6. A Comparison of Organizational Structure and Pedagogical Approach: Online versus Face-to-face

    OpenAIRE

    Donovan A. McFarlane

    2011-01-01

    This paper examines online versus face-to-face organizational structure and pedagogy in terms of education and the teaching and learning process. The author distinguishes several important terms related to distance/online/e-learning, virtual learning and brick-and-mortar learning interactions and concepts such as asynchronous and synchronous interactions, etc, before deliberating on perceived differences in organizational structure and pedagogical approaches of virtual and brick-and-mortar sc...

  7. Comparison between endoscopic and microscopic approaches for surgery of pituitary tumours.

    Science.gov (United States)

    Khan, Inamullah; Shamim, Muhammad Shahzad

    2017-11-01

    Surgical techniques for resection of pituitary tumours have come a long way since they were first introduced in the late 18th century. Nowadays, most pituitary surgeries are performed through a trans-nasal trans-sphenoidal approach, either using a microscope or an endoscope. Herein the authors review the literature and compare these two instruments with regard to their outcomes when used for resection of pituitary tumours.

  8. Optimal PID settings for first and second-order processes - Comparison with different controller tuning approaches

    OpenAIRE

    Pappas, Iosif

    2016-01-01

    PID controllers are extensively used in industry. Although many tuning methodologies exist, finding good controller settings is not an easy task and frequently optimization-based design is preferred to satisfy more complex criteria. In this thesis, the focus was to find which tuning approaches, if any, present close to optimal behavior. Pareto-optimal controllers were found for different first and second-order processes with time delay. Performance was quantified in terms of the integrat...

  9. Comparison of numerical approaches to solve a Poincare-covariant Faddeev equation

    International Nuclear Information System (INIS)

    Alkofer, R.; Eichmann, G.; Krassnigg, A.; Schwinzerl, M.

    2006-01-01

    Full text: The quark core of baryons can be described with the help of the numerical solution of the Poincare-Faddeev equation. The elements used, e.g. the quark propagator, are taken from non-perturbative studies of Landau-gauge QCD. Different numerical approaches to solving the relativistic three-quark problem in this way are compared, and benchmark results for the efficiency of different algorithms are presented. (author)

  10. Comparisons of single-stage and two-stage approaches to genomic selection.

    Science.gov (United States)

    Schulz-Streeck, Torben; Ogutu, Joseph O; Piepho, Hans-Peter

    2013-01-01

    Genomic selection (GS) is a method for predicting breeding values of plants or animals using many molecular markers that is commonly implemented in two stages. In plant breeding the first stage usually involves computation of adjusted means for genotypes which are then used to predict genomic breeding values in the second stage. We compared two classical stage-wise approaches, which either ignore or approximate correlations among the means by a diagonal matrix, and a new method, to a single-stage analysis for GS using ridge regression best linear unbiased prediction (RR-BLUP). The new stage-wise method rotates (orthogonalizes) the adjusted means from the first stage before submitting them to the second stage. This makes the errors approximately independently and identically normally distributed, which is a prerequisite for many procedures that are potentially useful for GS such as machine learning methods (e.g. boosting) and regularized regression methods (e.g. lasso). This is illustrated in this paper using componentwise boosting. The componentwise boosting method minimizes squared error loss using least squares and iteratively and automatically selects markers that are most predictive of genomic breeding values. Results are compared with those of RR-BLUP using fivefold cross-validation. The new stage-wise approach with rotated means was slightly more similar to the single-stage analysis than the classical two-stage approaches based on non-rotated means for two unbalanced datasets. This suggests that rotation is a worthwhile pre-processing step in GS for the two-stage approaches for unbalanced datasets. Moreover, the predictive accuracy of stage-wise RR-BLUP was higher (5.0-6.1%) than that of componentwise boosting.
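
    As a point of reference for the second stage, RR-BLUP amounts to ridge regression of (adjusted) phenotypic means on marker genotypes. The sketch below is a hedged illustration on synthetic data with an assumed variance ratio for the shrinkage parameter; it does not reproduce the stage-wise rotation, boosting or cross-validation procedures of the study.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic marker matrix (n genotypes x p markers, coded -1/0/1) and true marker effects.
n, p = 200, 1000
Z = rng.integers(-1, 2, size=(n, p)).astype(float)
beta_true = rng.normal(0, 0.05, p)
y = Z @ beta_true + rng.normal(0, 1.0, n)           # stand-in for adjusted means from stage one
y = y - y.mean()

# RR-BLUP: marker effects from ridge regression, shrinkage lambda = sigma_e^2 / sigma_marker^2.
lam = 1.0 / 0.05**2                                  # assumed variance ratio (a tuning choice)
beta_hat = np.linalg.solve(Z.T @ Z + lam * np.eye(p), Z.T @ y)

# Predict genomic breeding values for new genotypes.
Z_new = rng.integers(-1, 2, size=(50, p)).astype(float)
gebv = Z_new @ beta_hat
print("correlation with true genetic values:",
      np.round(np.corrcoef(gebv, Z_new @ beta_true)[0, 1], 2))
```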

  11. Comparison of approaches for the characterization of contamination at rural megasites

    DEFF Research Database (Denmark)

    Rein, Arno; Popp, Steffen; Zacharias, Steffen

    2011-01-01

    the complete area. The DP investigation provided information on the contamination distribution and yielded also important information on hydraulic conditions. Statistical analysis of the results applying indicator kriging revealed that the conventional approach is markedly risky when decision-making relies...... consideration into account, DP-based groundwater screening is recommended to obtain either first or complementary information on the entire site. Based on these data, also locations for a long-term monitoring could be selected if temporal variability is assumed relevant....

  12. High frequency vibration analysis by the complex envelope vectorization.

    Science.gov (United States)

    Giannini, O; Carcaterra, A; Sestieri, A

    2007-06-01

    The complex envelope displacement analysis (CEDA) is a procedure for solving high frequency vibration and vibro-acoustic problems, providing the envelope of the physical solution. CEDA is based on a variable transformation mapping the high frequency oscillations into signals of low frequency content and has been successfully applied to one-dimensional systems. However, the extension to plates and vibro-acoustic fields met serious difficulties, so that a general revision of the theory was carried out, leading finally to a new method, the complex envelope vectorization (CEV). In this paper the CEV method is described, its merits and limits are outlined, and a set of applications to vibration and vibro-acoustic problems of increasing complexity is presented.
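
    The variable transformation that CEDA and CEV build on is essentially the complex envelope of a rapidly oscillating response. The hedged sketch below recovers it for a synthetic amplitude-modulated signal via the analytic signal (Hilbert transform) followed by demodulation at an assumed carrier frequency; it is only an illustration of the transformation, not the CEV solver itself.

```python
import numpy as np
from scipy.signal import hilbert

fs = 10_000.0
t = np.arange(0, 0.5, 1 / fs)

# A rapidly oscillating "response" whose slowly varying amplitude is what we actually want.
carrier_hz = 1_000.0
amplitude = 1.0 + 0.5 * np.sin(2 * np.pi * 5 * t)          # slow envelope
x = amplitude * np.cos(2 * np.pi * carrier_hz * t)

# Analytic signal, then demodulate by the carrier: the result varies slowly and can be sampled coarsely.
analytic = hilbert(x)
complex_envelope = analytic * np.exp(-2j * np.pi * carrier_hz * t)

err = np.abs(np.abs(complex_envelope[200:-200]) - amplitude[200:-200])
print("max envelope error away from the edges:", err.max())
```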

  13. Accounting for context in studies of health inequalities: a review and comparison of analytic approaches.

    Science.gov (United States)

    Schempf, Ashley H; Kaufman, Jay S

    2012-10-01

    A common epidemiologic objective is to evaluate the contribution of residential context to individual-level disparities by race or socioeconomic position. We reviewed analytic strategies to account for the total (observed and unobserved factors) contribution of environmental context to health inequalities, including conventional fixed effects (FE) and hybrid FE implemented within a random effects (RE) or a marginal model. To illustrate results and limitations of the various analytic approaches of accounting for the total contextual component of health disparities, we used data on births nested within neighborhoods as an applied example of evaluating neighborhood confounding of racial disparities in gestational age at birth, including both a continuous and a binary outcome. Ordinary and RE models provided disparity estimates that can be substantially biased in the presence of neighborhood confounding. Both FE and hybrid FE models can account for cluster level confounding and provide disparity estimates unconfounded by neighborhood, with the latter having greater flexibility in allowing estimation of neighborhood-level effects and intercept/slope variability when implemented in a RE specification. Given the range of models that can be implemented in a hybrid approach and the frequent goal of accounting for contextual confounding, this approach should be used more often. Published by Elsevier Inc.

  14. Comparison of scientific and engineering approaches to the treatment of mixed wastes

    International Nuclear Information System (INIS)

    Gilbert, K.V.; Bowers, J.S.

    1993-12-01

    This paper discusses two approaches to the treatment of mixed waste. (Mixed waste, defined as radioactive waste that is co-contaminated with hazardous waste as defined in the Resource Conservation and Recovery Act, is presently stored throughout the United States awaiting the establishment of treatment capability.) The first approach employs conventional engineering that focuses on low-risk technology which has been proven in other industries in similar applications and is adaptable for waste treatment use. The term "low risk" means that implementation success is relatively certain, and the major uncertainty is the degree of success. Technologies under consideration include centrifugation, evaporation, microfiltration and stabilization. Process offgases are treated with traditional scrubbers and carbon absorption units. For the scientific approach, Lawrence Livermore National Laboratory is in the conceptual design phase of a project to demonstrate alternatives to incineration for destroying organic contaminants in radioactive waste streams. This Mixed Waste Management Facility will use approximately 15,000 square feet of an existing facility to demonstrate an integrated waste management system. Robotic and telerobotic systems will be employed for waste segregation, characterization and feed preparation. Waste feeds will be treated using molten salt oxidation, mediated electrochemical oxidation and wet oxidation. Residues, which can be managed as radioactive-only waste, will be immobilized in an organic matrix prior to shipment to an authorized disposal site

  15. A comparison of bilingual education and generalist teachers' approaches to scientific biliteracy

    Science.gov (United States)

    Garza, Esther

    The purpose of this study was to determine if educators were capitalizing on bilingual learners' use of their biliterate abilities to acquire scientific meaning and discourse that would formulate a scientific biliterate identity. Mixed methods were used to explore teachers' use of biliteracy and Funds of Knowledge (Moll, L., Amanti, C., Neff, D., & Gonzalez, N., 1992; Gonzales, Moll, & Amanti, 2005) from the students' Latino heritage while conducting science inquiry. The research study explored four constructs that conceptualized scientific biliteracy. The four constructs include science literacy, science biliteracy, reading comprehension strategies and students' cultural backgrounds. There were 156 4th-5th grade bilingual and general education teachers in South Texas that were surveyed using the Teacher Scientific Biliteracy Inventory (TSBI) and five teachers' science lessons were observed. Qualitative findings revealed that a variety of scientific biliteracy instructional strategies were frequently used in both bilingual and general education classrooms. The language used to deliver this instruction varied. A General Linear Model revealed that classroom assignment, bilingual or general education, had a significant effect on a teacher's instructional approach to employ scientific biliteracy. A simple linear regression found that the TSBI accounted for 17% of the variance on 4th grade reading benchmarks. Mixed methods results indicated that teachers were utilizing scientific biliteracy strategies in English, Spanish and/or both languages. Household items and science experimentation at home were encouraged by teachers to incorporate the students' cultural backgrounds. Finally, science inquiry was conducted through a universal approach to science learning versus a multicultural approach to science learning.

  16. A comparison of two surgical approaches to the scapulohumeral joint in dogs.

    Science.gov (United States)

    McLaughlin, R; Roush, J K

    1995-01-01

    Two scapulohumeral arthrotomy techniques were evaluated and compared in 10 normal, young adult greyhounds. A caudolateral approach with craniodorsal retraction of the teres minor muscle (no-tenotomy) and a craniolateral approach with tenotomy of the infraspinatus tendon were each performed unilaterally in 5 dogs. The dogs were evaluated using force plate gait analysis, lameness evaluation, radiography, and goniometry for 5 weeks and then euthanatized. Tenotomy sites and sections of the humeral articular cartilage were collected from the shoulder joints that had been operated on and examined microscopically. The same surgical approach was then performed on the contralateral shoulder in the cadavers, and exposure of the humeral articular cartilage was measured using planimetry. Peak vertical force applied to the operated limbs in the tenotomy group was significantly less than preoperative levels on day 3 and significantly less than in the no-tenotomy group on days 21 and 28. The peak vertical force applied to the operated limbs in the no-tenotomy group was not significantly different from preoperative levels during the study. Scapulohumeral arthrotomy by tenotomy of the infraspinatus resulted in decreased range of motion and joint extension compared with joints operated on without tenotomy, but provided significantly greater exposure of the articular surface. Scapulohumeral arthrotomy with craniodorsal retraction of the teres minor muscle did not significantly alter goniometric measurements compared with unoperated joints. Both techniques resulted in similar subjective lameness scores and caused no gross, microscopic or radiographic evidence of articular cartilage damage.

  17. Comparison of Australian and American orthodontic clinical approaches towards root resorption.

    Science.gov (United States)

    Lim, Elaine; Sameshima, Glenn; Petocz, Peter; Darendeliler, Ali

    2012-11-01

    As part of The Rocky Mountain Travelling Fellowship, a pilot survey was conducted to assess current diagnostic and clinical approaches to the management of orthodontic patients in relation to root resorption. Groups comprising Australians (Sydney, New South Wales) and North Americans (Los Angeles, California), in two stages of their orthodontic careers (post-graduate orthodontic students from the University of Sydney and University of Southern California and qualified practising orthodontists) were asked to complete a questionnaire. The questions examined diagnosis and management approaches related to root resorption used in their clinical practice. Replies demonstrated that there were differences in management depending on operator experience and the country of clinical practice. However, a summarised common approach to orthodontic root resorption comprised (1) the use of an orthopantomogram as a screening diagnostic tool, followed by periapical radiographs for those perceived as 'higher risk' patients, particularly individuals with a history of root resorption; (2) a six monthly radiographic review during treatment; (3) the use of light forces and/or rest periods (discontinuous forces) every two to three months; (4) the extraction of deciduous teeth if permanent successors were erupting ectopically and causing damage to adjacent root structures; and (5) the use of fixed retention after treatment. This project was intended to initiate discussion and form a basis for further investigation into the clinical management of orthodontic root resorption.

  18. Incarcerated inguinal hernia management in children: 'a comparison of the open and laparoscopic approach'.

    Science.gov (United States)

    Mishra, Pankaj Kumar; Burnand, Katherine; Minocha, Ashish; Mathur, Azad B; Kulkarni, Milind S; Tsang, Thomas

    2014-06-01

    To compare the outcomes of management of incarcerated inguinal hernia by the open versus the laparoscopic approach. This is a retrospective analysis of incarcerated inguinal hernia in a paediatric surgery centre involving four consultants. Manual reduction was attempted in all patients, and failure was managed by emergency surgery. The laparoscopy group had 27 patients. Four patients failed manual reduction and underwent emergency laparoscopic surgery. Three of them had small bowel strangulation, which was reduced laparoscopically. The strangulated bowel was dusky in colour initially but changed to normal colour subsequently under vision. The fourth patient required appendectomy for a strangulated appendix. One patient had concomitant repair of an umbilical hernia and one patient had laparoscopic pyloromyotomy at the same time. One patient had testicular atrophy, one had hydrocoele and one had recurrence of hernia on the asymptomatic side. The open surgery group had 45 patients. Eleven patients had failed manual reduction requiring emergency surgery; of these, two required resection and anastomosis of small intestine. One patient in this group had concomitant repair of an undescended testis. There was no recurrence in this group, one patient had testicular atrophy and seven had metachronous hernia. Both open herniotomy and laparoscopic repair offer safe surgery with comparable outcomes for incarcerated inguinal hernia in children. The laparoscopic approach, and hernioscopy at the time of the open approach, appear to show the advantage of repairing a contralateral patent processus vaginalis at the same time and avoiding metachronous inguinal hernia.

  19. Neurobiological studies of risk assessment: a comparison of expected utility and mean-variance approaches.

    Science.gov (United States)

    D'Acremont, Mathieu; Bossaerts, Peter

    2008-12-01

    When modeling valuation under uncertainty, economists generally prefer expected utility because it has an axiomatic foundation, meaning that the resulting choices will satisfy a number of rationality requirements. In expected utility theory, values are computed by multiplying probabilities of each possible state of nature by the payoff in that state and summing the results. The drawback of this approach is that all state probabilities need to be dealt with separately, which becomes extremely cumbersome when it comes to learning. Finance academics and professionals, however, prefer to value risky prospects in terms of a trade-off between expected reward and risk, where the latter is usually measured in terms of reward variance. This mean-variance approach is fast and simple and greatly facilitates learning, but it impedes assigning values to new gambles on the basis of those of known ones. To date, it is unclear whether the human brain computes values in accordance with expected utility theory or with mean-variance analysis. In this article, we discuss the theoretical and empirical arguments that favor one or the other theory. We also propose a new experimental paradigm that could determine whether the human brain follows the expected utility or the mean-variance approach. Behavioral results of implementation of the paradigm are discussed.
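
    The two valuation rules contrasted above can each be written in a couple of lines. The following sketch is purely illustrative: the two-outcome gamble, the logarithmic utility function and the risk-aversion coefficient lam are assumptions made for demonstration and are not taken from the study.

      import numpy as np

      # Hypothetical two-outcome gamble: probabilities and payoffs are illustrative only.
      probs = np.array([0.3, 0.7])
      payoffs = np.array([100.0, 20.0])

      # Expected-utility valuation: weight the utility of each state's payoff by its
      # probability and sum (here with an assumed logarithmic utility).
      def expected_utility(probs, payoffs, utility=np.log):
          return float(np.sum(probs * utility(payoffs)))

      # Mean-variance valuation: trade expected reward off against reward variance,
      # with an assumed risk-aversion coefficient lam.
      def mean_variance_value(probs, payoffs, lam=0.01):
          mean = float(np.sum(probs * payoffs))
          var = float(np.sum(probs * (payoffs - mean) ** 2))
          return mean - lam * var

      print(expected_utility(probs, payoffs))
      print(mean_variance_value(probs, payoffs))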

  20. Basic echocardiography for undergraduate students: a comparison of different peer-teaching approaches.

    Science.gov (United States)

    Gradl-Dietsch, G; Menon, A K; Gürsel, A; Götzenich, A; Hatam, N; Aljalloud, A; Schrading, S; Hölzl, F; Knobe, M

    2018-02-01

    The aim of this study was to assess the impact of different teaching interventions in a peer-teaching environment on basic echocardiography skills and to examine the influence of gender on learning outcomes. We randomly assigned 79 second-year medical students (55 women, 24 men) to one of four groups: peer teaching (PT), peer teaching using Peyton's four-step approach (PPT), team-based learning (TBL) and video-based learning (VBL). All groups received theoretical and practical hands-on training according to the different approaches. Using a pre-post design, we assessed differences in theoretical knowledge [multiple choice (MC) exam], practical skills (Objective Structured Practical Examination, OSPE) and evaluation results with respect to gender. There was a significant gain in theoretical knowledge for all students. There were no relevant differences between the four groups regarding the MC exam and OSPE results. The majority of students achieved good or very good results. Acceptance of the peer-teaching concept was moderate and all students preferred medical experts to peer tutors, even though the overall rating of the instructors was fairly good. Students in the video group would have preferred a different training method. There was no significant effect of gender on evaluation results. Using different peer-teaching concepts proved to be effective in teaching basic echocardiography. Gender does not seem to have an impact on the effectiveness of the instructional approach. Qualitative analysis revealed limited acceptance of peer teaching and especially of video-based instruction.

  1. Optimising MR perfusion imaging: comparison of different software-based approaches in acute ischaemic stroke

    Energy Technology Data Exchange (ETDEWEB)

    Schaafs, Lars-Arne [Charite-Universitaetsmedizin, Department of Radiology, Berlin (Germany); Charite-Universitaetsmedizin, Academic Neuroradiology, Department of Neurology and Center for Stroke Research, Berlin (Germany); Porter, David [Fraunhofer Institute for Medical Image Computing MEVIS, Bremen (Germany); Audebert, Heinrich J. [Charite-Universitaetsmedizin, Department of Neurology with Experimental Neurology, Berlin (Germany); Fiebach, Jochen B.; Villringer, Kersten [Charite-Universitaetsmedizin, Academic Neuroradiology, Department of Neurology and Center for Stroke Research, Berlin (Germany)

    2016-11-15

    Perfusion imaging (PI) is susceptible to confounding factors such as motion artefacts as well as delay and dispersion (D/D). We evaluate the influence of different post-processing algorithms on hypoperfusion assessment in PI analysis software packages to improve the clinical accuracy of stroke PI. Fifty patients with acute ischaemic stroke underwent MRI in the first 24 h after onset. Diverging approaches to motion and D/D correction were applied. The calculated MTT and CBF perfusion maps were assessed by volumetry of lesions and tested for agreement with a standard approach and with the final lesion volume (FLV) on day 6 in patients with persisting vessel occlusion. MTT map lesion volumes were significantly smaller throughout the software packages with correction of motion and D/D when compared to the commonly used approach with no correction (p = 0.001-0.022). Volumes on CBF maps did not differ significantly (p = 0.207-0.925). All packages with advanced post-processing algorithms showed a high level of agreement with FLV (ICC = 0.704-0.879). Correction of D/D had a significant influence on estimated lesion volumes and led to significantly smaller lesion volumes on MTT maps. This may improve patient selection. (orig.)

  2. Optimising MR perfusion imaging: comparison of different software-based approaches in acute ischaemic stroke

    International Nuclear Information System (INIS)

    Schaafs, Lars-Arne; Porter, David; Audebert, Heinrich J.; Fiebach, Jochen B.; Villringer, Kersten

    2016-01-01

    Perfusion imaging (PI) is susceptible to confounding factors such as motion artefacts as well as delay and dispersion (D/D). We evaluate the influence of different post-processing algorithms on hypoperfusion assessment in PI analysis software packages to improve the clinical accuracy of stroke PI. Fifty patients with acute ischaemic stroke underwent MRI in the first 24 h after onset. Diverging approaches to motion and D/D correction were applied. The calculated MTT and CBF perfusion maps were assessed by volumetry of lesions and tested for agreement with a standard approach and with the final lesion volume (FLV) on day 6 in patients with persisting vessel occlusion. MTT map lesion volumes were significantly smaller throughout the software packages with correction of motion and D/D when compared to the commonly used approach with no correction (p = 0.001-0.022). Volumes on CBF maps did not differ significantly (p = 0.207-0.925). All packages with advanced post-processing algorithms showed a high level of agreement with FLV (ICC = 0.704-0.879). Correction of D/D had a significant influence on estimated lesion volumes and led to significantly smaller lesion volumes on MTT maps. This may improve patient selection. (orig.)

  3. Indicators for assessment of rural electrification-An approach for the comparison of apples and pears

    International Nuclear Information System (INIS)

    Ilskog, Elisabeth

    2008-01-01

    Despite a large number of rural electrification projects being implemented in developing countries, there are few published in-depth evaluations of the effects of these projects on sustainable development. There is also no generally accepted method for the assessment of such effects that includes all relevant aspects of sustainability. An issue of growing importance is whether rural electrification implemented by private entrepreneurs or other non-governmental organisations contributes more effectively to sustainable development than the conventional approach in which rural electrification is the responsibility of a government utility. This paper presents a method for sustainability evaluation based on the use of 39 indicators. The proposed indicators cover the five dimensions of sustainability: technical, economic, social/ethical, environmental and institutional sustainability. The paper presents the indicators and gives a detailed example of the procedure for calculating an indicator based on information that can realistically be collected in field studies. It is suggested that this interdisciplinary approach will provide a better basis for the evaluation of projects than previous, more limited approaches. Projects promoted on the basis of information only about prioritised dimensions of sustainability, such as environment, may fail as a result of weaknesses in other dimensions. The proposed method may reduce this risk.

  4. Link removal for the control of stochastically evolving epidemics over networks: a comparison of approaches.

    Science.gov (United States)

    Enns, Eva A; Brandeau, Margaret L

    2015-04-21

    For many communicable diseases, knowledge of the underlying contact network through which the disease spreads is essential to determining appropriate control measures. When behavior change is the primary intervention for disease prevention, it is important to understand how to best modify network connectivity using the limited resources available to control disease spread. We describe and compare four algorithms for selecting a limited number of links to remove from a network: two "preventive" approaches (edge centrality, R0 minimization), where the decision of which links to remove is made prior to any disease outbreak and depends only on the network structure; and two "reactive" approaches (S-I edge centrality, optimal quarantining), where information about the initial disease states of the nodes is incorporated into the decision of which links to remove. We evaluate the performance of these algorithms in minimizing the total number of infections that occur over the course of an acute outbreak of disease. We consider different network structures, including both static and dynamic Erdös-Rényi random networks with varying levels of connectivity, a real-world network of residential hotels connected through injection drug use, and a network exhibiting community structure. We show that reactive approaches outperform preventive approaches in averting infections. Among reactive approaches, removing links in order of S-I edge centrality is favored when the link removal budget is small, while optimal quarantining performs best when the link removal budget is sufficiently large. The budget threshold above which optimal quarantining outperforms the S-I edge centrality algorithm is a function of both network structure (higher for unstructured Erdös-Rényi random networks compared to networks with community structure or the real-world network) and disease infectiousness (lower for highly infectious diseases). We conduct a value-of-information analysis of knowing which
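
    As a minimal illustration of the preventive idea, the sketch below removes the k most central links of a random network before any outbreak occurs, using edge betweenness centrality as a stand-in for the paper's edge-centrality measure; the graph, the budget k and the centrality choice are assumptions, and the reactive algorithms (S-I edge centrality, optimal quarantining) are not reproduced here.

      import networkx as nx

      # Preventive link removal: drop the k edges with the highest edge
      # betweenness centrality, independent of any initial infection state.
      def remove_top_central_edges(G, k):
          centrality = nx.edge_betweenness_centrality(G)
          top_edges = sorted(centrality, key=centrality.get, reverse=True)[:k]
          H = G.copy()
          H.remove_edges_from(top_edges)
          return H

      # Assumed example network and removal budget, for demonstration only.
      G = nx.erdos_renyi_graph(n=100, p=0.05, seed=1)
      H = remove_top_central_edges(G, k=10)
      print(G.number_of_edges(), H.number_of_edges())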

  5. The nuclear envelope from basic biology to therapy.

    Science.gov (United States)

    Worman, Howard J; Foisner, Roland

    2010-02-01

    The nuclear envelope has long been a focus of basic research for a highly specialized group of cell biologists. More recently, an expanding group of scientists and physicians have developed a keen interest in the nuclear envelope since mutations in the genes encoding lamins and associated proteins have been shown to cause a diverse range of human diseases often called laminopathies or nuclear envelopathies. Most of these diseases have tissue-selective phenotypes, suggesting that the nuclear envelope must function in cell-type- and developmental-stage-specific processes such as chromatin organization, regulation of gene expression, controlled nucleocytoplasmic transport and response to stress in metazoans. On 22-23 April 2009, Professor Christopher Hutchison organized the 4th British Nuclear Envelope Disease and Chromatin Organization meeting at the College of St Hild and St Bede at Durham University, sponsored by the Biochemical Society. In attendance were investigators with one common interest, the nuclear envelope, but with diverse expertise and training in animal and plant cell biology, genetics, developmental biology and medicine. We were each honoured to be keynote speakers. This issue of Biochemical Society Transactions contains papers written by some of the presenters at this scientifically exciting meeting, held in a bucolic setting where the food was tasty and the wine flowed freely. Perhaps at the end of this excellent meeting more questions were raised than answered, which will stimulate future research. However, what became clear is that the nuclear envelope is a cellular structure with critical functions in addition to its traditional role as a barrier separating the nuclear and cytoplasmic compartments in interphase eukaryotic cells.

  6. Envelope proteins of bovine herpesvirus 1: immunological and biochemical studies

    International Nuclear Information System (INIS)

    Rodriguez Roque, L.L.

    1986-01-01

    The authors studied immunological and biochemical properties of the bovid herpesvirus 1 (BHV-1) envelope proteins in order to understand the pathogenesis of BHV-1 infection and to provide basic information for the production of effective subunit vaccines against BHV-1. Ten glycoproteins of MW 180, 150, 130, 115, 97, 77, 74, 64, 55, and 45 kilodaltons (K), and a single non-glycosylated 108 K protein, were quantitatively removed from purified BHV-1 virions by detergent treatment. These glycoproteins were present on the virion envelope and on the surface of BHV-1 infected cells. The quantitative removal from virions by treatment with nonionic detergents and their presence on the surface of infected cells indicate that 180/97, 150/77, and 130/74/55 K are major components of the BHV-1 envelope and are also the targets of the virus-neutralizing humoral immune response. Envelope glycoproteins of herpes simplex virus type 1 (HSV-1) bind immunoglobulin by the Fc end, and it is suggested this may increase the pathogenicity of this virus. The authors searched for a similar function in BHV-1 by measuring the ability of BHV-1 infected cells and viral envelope proteins to bind radiolabelled rabbit and bovine IgG. Binding activity for rabbit IgG or bovine IgG-Fc could not be demonstrated by BHV-1 infected MDBK cells, whereas MDBK cells infected with HSV-1 bound rabbit IgG and bovine IgG-Fc. None of the three major envelope proteins of BHV-1 bound rabbit or bovine IgG. The results of this study indicate that BHV-1, unlike some other herpesviruses, lacks Fc binding activity.

  7. Quantitative Comparison of Ternary Eutectic Phase-Field Simulations with Analytical 3D Jackson-Hunt Approaches

    Science.gov (United States)

    Steinmetz, Philipp; Kellner, Michael; Hötzer, Johannes; Nestler, Britta

    2018-02-01

    For the analytical description of the relationship between undercoolings, lamellar spacings and growth velocities during the directional solidification of ternary eutectics in 2D and 3D, different extensions based on the theory of Jackson and Hunt are reported in the literature. Besides analytical approaches, the phase-field method has been established to study the spatially complex microstructure evolution during the solidification of eutectic alloys. The understanding of the fundamental mechanisms controlling the morphology development in multiphase, multicomponent systems is of high interest. For this purpose, a comparison is made between the analytical extensions and three-dimensional phase-field simulations of directional solidification in an ideal ternary eutectic system. Based on the observed accordance in two-dimensional validation cases, the experimentally reported, inherently three-dimensional chain-like pattern is investigated in extensive simulation studies. The results are quantitatively compared with the analytical results reported in the literature, and with a newly derived approach which uses equal undercoolings. A good accordance of the undercooling-spacing characteristics between simulations and the analytical Jackson-Hunt approaches is found. The results show that the applied phase-field model, which is based on the grand potential approach, is able to describe the analytically predicted relationship between the undercooling and the lamellar arrangements during the directional solidification of a ternary eutectic system in 3D.
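
    For orientation, the classic binary Jackson-Hunt relation that these ternary extensions build on links undercooling, spacing and velocity as dT = K1*v*lambda + K2/lambda. The sketch below evaluates it with purely hypothetical constants; it is not the ternary 2D/3D formulation compared in the paper.

      import numpy as np

      # Classic binary Jackson-Hunt relation between undercooling dT, lamellar
      # spacing lam and growth velocity v:  dT = K1 * v * lam + K2 / lam.
      # K1 and K2 are material-dependent constants; the values below are purely
      # illustrative placeholders, not those of the system studied in the paper.
      K1, K2 = 2.0e8, 1.0e-4

      def undercooling(lam, v, K1=K1, K2=K2):
          return K1 * v * lam + K2 / lam

      def minimum_undercooling_spacing(v, K1=K1, K2=K2):
          # d(dT)/d(lam) = 0  ->  lam_ext = sqrt(K2 / (K1 * v))
          return np.sqrt(K2 / (K1 * v))

      v = 1.0e-6                      # assumed growth velocity
      lam_ext = minimum_undercooling_spacing(v)
      print(lam_ext, undercooling(lam_ext, v))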

  8. Transoral endoscopic thyroidectomy vestibular approach (TOETVA) for Graves' disease: a comparison of surgical results with open thyroidectomy.

    Science.gov (United States)

    Jitpratoom, Pornpeera; Ketwong, Khwannara; Sasanakietkul, Thanyawat; Anuwong, Angkoon

    2016-12-01

    Transoral endoscopic thyroidectomy vestibular approach (TOETVA) provides excellent cosmetic results from its potential for scar-free operation. The procedure has been applied successfully for Graves' disease by the authors of this work and compared with the standard open cervical approach to evaluate its safety and outcomes. From January 2014 to November 2016, a total of 97 patients with Graves' disease were reviewed retrospectively. Open thyroidectomy (OT) and TOETVA were performed in 49 patients and 46 patients, respectively. For TOETVA, a three-port technique through the oral vestibule was utilized. The thyroidectomy was done endoscopically using conventional laparoscopic instruments and an ultrasonic device. Patient demographics and surgical variables, including operative time, blood loss, and complications, were investigated and compared. TOETVA was completed successfully in 45 patients; conversion to open surgery was deemed necessary in one patient. All patient characteristics for both groups were similar. Operative time was shorter for the OT group than for the TOETVA group (101.97 ± 24.618 versus 134.11 ± 31.48 minutes, respectively). TOETVA was found to be safe and feasible for Graves' disease in comparison to the standard open cervical approach, and it is considered a viable alternative for patients who have been indicated for surgery, with excellent cosmetic results.

  9. A Cross-Disciplinary Successful Aging Intervention and Evaluation: Comparison of Person-to-Person and Digital-Assisted Approaches

    Directory of Open Access Journals (Sweden)

    Hui-Chuan Hsu

    2018-05-01

    Background: Successful aging has been the paradigm of old-age life. The purpose of this study was to implement and evaluate a cross-disciplinary intervention program using two approaches for community-based older adults in Taichung, Taiwan. Methods: The content of the intervention included successful aging concepts and preparation, physical activity, chronic disease and health management, dietary and nutrition information, cognitive training, emotional awareness and coping skills, family relationship and resilience, legal concepts regarding financial protection, and Internet use. The traditional person-to-person (P2P) intervention approach was implemented among participants at urban centers, and the personal-and-digital (P&D) intervention approach was implemented among participants at rural centers; before the P&D group received the intervention, its participants were assessed as the control group for comparison. Results: Healthy behavior and nutrition improved in the P2P group, although not significantly. Strategies for adapting to old age and reducing ineffective coping improved significantly in the P2P group. The ability to search for health information improved in the P&D group, and knowledge of finance-related law increased in the P2P group. Conclusion: A continuous, well-designed and evidence-based intervention program is beneficial for improving the health of older adults, or at least delaying its decline.

  10. Enveloping algebras of Lie groups with discrete series

    International Nuclear Information System (INIS)

    Nguyen huu Anh; Vuong manh Son

    1990-09-01

    In this article we shall prove that the enveloping algebra of the Lie algebra of some unimodular Lie group having discrete series, when localized at some element of the center, is isomorphic to the tensor product of a Weyl algebra over the ring of Laurent polynomials of one variable and the enveloping algebra of some reductive Lie algebra. In particular, it will be proved that the Lie algebra of a unimodular solvable Lie group having discrete series satisfies the Gelfand-Kirillov conjecture. (author). 6 refs

  11. Constant envelope OFDM scheme for 6PolSK-QPSK

    Science.gov (United States)

    Li, Yupeng; Ding, Ding

    2018-03-01

    A constant envelope OFDM scheme with a phase modulator (PM-CE-OFDM) for 6PolSK-QPSK modulation was demonstrated. Performance under high fiber launch power is measured to assess its advantages in counteracting fiber nonlinear impairments. In our simulation, PM-CE-OFDM, RF-assisted constant envelope OFDM (RF-CE-OFDM) and conventional OFDM (Con-OFDM) are transmitted through 80 km of standard single mode fiber (SSMF) in single-channel and WDM systems. Simulation results confirm that PM-CE-OFDM has the best performance in resisting fiber nonlinearity. In addition, benefiting from its simple system structure, the complexity and cost of the PM-CE-OFDM system can be reduced effectively.
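
    The generic phase-modulated CE-OFDM construction behind such schemes can be sketched as follows: a real-valued OFDM message is generated via a Hermitian-symmetric IFFT and then fed to a phase modulator, so the transmitted signal has a strictly constant envelope. The subcarrier count, modulation index and QPSK mapping below are assumptions for illustration, not the authors' 6PolSK-QPSK design.

      import numpy as np

      rng = np.random.default_rng(0)

      N = 64                                   # subcarriers (assumed)
      h = 0.3                                  # modulation index (assumed)

      # Random QPSK data on the positive-frequency bins.
      re = rng.integers(0, 2, N // 2 - 1) * 2 - 1
      im = rng.integers(0, 2, N // 2 - 1) * 2 - 1
      qpsk = (re + 1j * im) / np.sqrt(2)

      # Hermitian-symmetric spectrum -> real-valued time-domain OFDM message.
      spectrum = np.zeros(N, dtype=complex)
      spectrum[1:N // 2] = qpsk
      spectrum[N // 2 + 1:] = np.conj(qpsk[::-1])
      m = np.fft.ifft(spectrum).real
      m /= np.max(np.abs(m))                   # normalize message amplitude

      # Phase modulation gives a constant-envelope transmit signal.
      s = np.exp(1j * 2 * np.pi * h * m)
      print(np.allclose(np.abs(s), 1.0))       # envelope is exactly constant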

  12. Data Envelopment Analysis (DEA) Model in Operation Management

    Science.gov (United States)

    Malik, Meilisa; Efendi, Syahril; Zarlis, Muhammad

    2018-01-01

    Quality management is an effective system in operation management that develops, maintains, and improves the quality delivered by groups of companies, allowing marketing, production, and service at the most economical level while ensuring customer satisfaction. Many companies practice quality management to improve their business performance. One way to measure performance is through the measurement of efficiency, and one of the tools that can be used to assess the efficiency of company performance is Data Envelopment Analysis (DEA). The aim of this paper is to use Data Envelopment Analysis (DEA) models to assess the efficiency of quality management. The paper explains the CCR, BCC, and SBM models for assessing the efficiency of quality management.
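
    As a minimal illustration of the envelopment form of the input-oriented CCR model mentioned above, the sketch below computes efficiency scores with a generic linear-programming solver; the small input/output data set is invented for demonstration and does not come from the paper.

      import numpy as np
      from scipy.optimize import linprog

      # Input-oriented CCR efficiency for one DMU (envelopment form):
      #   minimize theta
      #   s.t.  sum_j lambda_j * x[i, j] <= theta * x[i, j0]   for every input i
      #         sum_j lambda_j * y[r, j] >= y[r, j0]           for every output r
      #         lambda_j >= 0
      X = np.array([[4.0, 7.0, 8.0, 4.0],      # inputs  (rows) x DMUs (columns), illustrative
                    [3.0, 3.0, 1.0, 2.0]])
      Y = np.array([[1.0, 1.0, 1.0, 1.0]])     # outputs (rows) x DMUs (columns), illustrative

      def ccr_efficiency(X, Y, j0):
          m, n = X.shape
          s = Y.shape[0]
          # decision variables: [theta, lambda_1 ... lambda_n]
          c = np.zeros(n + 1)
          c[0] = 1.0                           # minimize theta
          A_in = np.hstack([-X[:, [j0]], X])   # X @ lam - theta*x[:, j0] <= 0
          b_in = np.zeros(m)
          A_out = np.hstack([np.zeros((s, 1)), -Y])   # -Y @ lam <= -y[:, j0]
          b_out = -Y[:, j0]
          res = linprog(c,
                        A_ub=np.vstack([A_in, A_out]),
                        b_ub=np.concatenate([b_in, b_out]),
                        bounds=[(None, None)] + [(0, None)] * n)
          return res.x[0]

      for j in range(X.shape[1]):
          print(f"DMU {j}: efficiency = {ccr_efficiency(X, Y, j):.3f}")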

  13. Asymmetry of the envelope of supernova 1987A

    Energy Technology Data Exchange (ETDEWEB)

    Papaliolios, C.; Karovska, M.; Koechlin, L.; Nisenson, P.; Standley, C.; Heathcote, S.

    1989-04-13

    The supernova SN1987A in the Large Magellanic Cloud has been observed by high-angular-resolution speckle interferometry since 25 March (30 days after the explosion) with the 4-m telescope at the Cerro Tololo Interamerican Observatory. Data obtained on 25 March and 2 April 1987 revealed a second bright 'companion' source separated from the supernova by 60 milliarcseconds and less than three magnitudes fainter than the supernova. Measurements of the average diameter of the supernova envelope have been made from data recorded from March 1987 to April 1988. Here we present a more detailed analysis of these data, which shows that the expanding envelope is asymmetric. (author).

  14. Asymmetry of the envelope of supernova 1987A

    International Nuclear Information System (INIS)

    Papaliolios, C.; Karovska, M.; Koechlin, L.; Nisenson, P.; Standley, C.; Heathcote, S.

    1989-01-01

    The supernova SN1987A in the Large Magellanic Cloud has been observed by high-angular-resolution speckle interferometry since 25 March (30 days after the explosion) with the 4-m telescope at the Cerro Tololo Interamerican Observatory. Data obtained on 25 March and 2 April 1987 revealed a second bright 'companion' source separated from the supernova by 60 milliarcseconds and less than three magnitudes fainter than the supernova. Measurements of the average diameter of the supernova envelope have been made from data recorded from March 1987 to April 1988. Here we present a more detailed analysis of these data, which shows that the expanding envelope is asymmetric. (author)

  15. Calculation of CWKB envelope in boson and fermion productions

    International Nuclear Information System (INIS)

    Biswas, S.; Chowdhury, I.

    2007-01-01

    We present the calculation of the envelope of boson production and of both low- and high-mass fermion production at the end of inflation, when the coherently oscillating inflatons decay into bosons and fermions. We consider three different models of inflation and use the CWKB technique to calculate the envelope in order to understand the structure of resonance band formation. We observe that although low-mass fermion production is not effective in preheating because of Pauli blocking, it is quite probable for high-mass fermions to take part in preheating. (author)

  16. Refractive index dispersion measurement using carrier-envelope phasemeters

    International Nuclear Information System (INIS)

    Hansinger, Peter; Töpfer, Philipp; Adolph, Daniel; Hoff, Dominik; Rathje, Tim; Sayler, A Max; Paulus, Gerhard G; Dimitrov, Nikolay; Dreischuh, Alexander

    2017-01-01

    We introduce a novel method for direct and accurate measurement of refractive index dispersion based on carrier-envelope phase detection of few-cycle laser pulses, exploiting the difference between phase and group velocity in a dispersive medium. In a layout similar to an interferometer, two carrier-envelope phasemeters are capable of measuring the dispersion of a transparent or reflective sample, where one phasemeter serves as the reference and the other records the influence of the sample. Here we report on proof-of-principle measurements that already reach relative uncertainties of a few 10⁻⁴. Further development is expected to allow for unprecedented precision. (paper)
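
    The underlying relation can be sketched numerically: per length L of a dispersive medium the carrier-envelope phase slips by dphi_CE = omega0*L*(1/v_g - 1/v_p) = -2*pi*L*(dn/dlambda), so a measured CEP difference yields the dispersion directly. The example below uses the commonly tabulated fused-silica Sellmeier coefficients and an 800 nm carrier purely as assumptions; it is not the authors' experimental configuration.

      import numpy as np

      # Sellmeier coefficients commonly quoted for fused silica (wavelength in micrometres);
      # they serve only as an example material here.
      B = np.array([0.6961663, 0.4079426, 0.8974794])
      C = np.array([0.0684043, 0.1162414, 9.896161])

      def n_fused_silica(lam_um):
          lam2 = lam_um ** 2
          return np.sqrt(1.0 + np.sum(B * lam2 / (lam2 - C ** 2)))

      def cep_shift(lam_um, L_um, dlam=1e-4):
          # Central finite difference for dn/dlambda (per micrometre), then
          # dphi_CE = -2*pi*L*dn/dlambda for a path length L in the medium.
          dn_dlam = (n_fused_silica(lam_um + dlam) - n_fused_silica(lam_um - dlam)) / (2 * dlam)
          return -2.0 * np.pi * L_um * dn_dlam

      print(cep_shift(lam_um=0.8, L_um=1000.0))   # CEP slip (rad) for 1 mm at 800 nm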

  17. Accelerated Enveloping Distribution Sampling: Enabling Sampling of Multiple End States while Preserving Local Energy Minima.

    Science.gov (United States)

    Perthold, Jan Walther; Oostenbrink, Chris

    2018-05-17

    Enveloping distribution sampling (EDS) is an efficient approach to calculate multiple free-energy differences from a single molecular dynamics (MD) simulation. However, the construction of an appropriate reference-state Hamiltonian that samples all states efficiently is not straightforward. We propose a novel approach for the construction of the EDS reference-state Hamiltonian, related to a previously described procedure to smoothen energy landscapes. In contrast to previously suggested EDS approaches, our reference-state Hamiltonian preserves local energy minima of the combined end-states. Moreover, we propose an intuitive, robust and efficient parameter optimization scheme to tune EDS Hamiltonian parameters. We demonstrate the proposed method with established and novel test systems and conclude that our approach allows for the automated calculation of multiple free-energy differences from a single simulation. Accelerated EDS promises to be a robust and user-friendly method to compute free-energy differences based on solid statistical mechanics.
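
    For context, the standard (non-accelerated) EDS reference state is a smooth envelope over the end-state potentials, V_R = -(1/(beta*s)) * ln(sum_i exp(-beta*s*(V_i - E_i^R))). The sketch below evaluates this textbook single-s form with illustrative numbers; it is not the accelerated reference-state construction proposed in the paper.

      import numpy as np

      # Standard single-s EDS reference potential: a smooth envelope over the
      # end-state potentials V_i with offsets E_i^R and smoothness parameter s.
      def eds_reference_energy(V, E_offsets, beta, s=1.0):
          x = -beta * s * (np.asarray(V) - np.asarray(E_offsets))
          # numerically stable log-sum-exp
          x_max = np.max(x)
          return -(x_max + np.log(np.sum(np.exp(x - x_max)))) / (beta * s)

      # Illustrative values: two end-state energies (kJ/mol), offsets, T = 300 K.
      beta = 1.0 / (0.008314 * 300.0)
      print(eds_reference_energy(V=[-10.0, -12.5], E_offsets=[0.0, 1.5], beta=beta, s=0.2))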

  18. Monitoring the training intensity and recovery with a psychometrics approach: a gender comparison with young athletes

    Directory of Open Access Journals (Sweden)

    Ramon Cruz

    2017-12-01

    AIMS: The purpose of the present study was to verify whether the session RPE differs between females and males during track and field training and whether biological maturity (BM) influences this response. METHODS: Seventy-five athletes (13-15 years old) participated in the study (38 males and 37 females). Five track and field training sessions were prescribed and monitored using the session RPE (intensity) and the Total Quality Recovery (TQR) scale (recovery). RESULTS: There was no statistical difference between males and females for the 75-meter run, long jump and shot put. However, for the 250- and 1000-meter training sessions females reported higher RPE values than males (3.68 ± 0.79 vs. 3.26 ± 0.56, p < 0.01, and 4.14 ± 0.94 vs. 3.72 ± 0.89, p < 0.05, respectively). Even when controlling for the effect of biological maturity, the same results were observed for 250 meters (F(1,73) = 2.060; p = 0.002) and 1000 meters (F(1,73) = 0.997; p = 0.036). There was no difference in TQR between genders. CONCLUSION: The comparison of the session RPE of females and males indicated differences for the 250- and 1000-m training sessions, with females reporting higher session RPE than males. Additionally, there were no differences between genders for recovery parameters, even when controlling for BM.

  19. MR-guided stereotactic neurosurgery-comparison of fiducial-based and anatomical landmark transformation approaches

    International Nuclear Information System (INIS)

    Hunsche, S; Sauner, D; Maarouf, M; Hoevels, M; Luyken, K; Schulte, O; Lackner, K; Sturm, V; Treuer, H

    2004-01-01

    For application in magnetic resonance (MR) guided stereotactic neurosurgery, two methods exist for transforming MR-image coordinates into stereotactic, frame-based coordinates: the direct stereotactic fiducial-based transformation method and the indirect anatomical landmark method. In contrast to direct stereotactic MR transformation, indirect transformation is based on anatomical landmark coregistration of stereotactic computerized tomography and non-stereotactic MR images. In a patient study, both transformation methods were investigated with visual inspection and mutual information analysis. Comparison was done for our standard imaging protocol, including t2-weighted spin-echo as well as contrast-enhanced t1-weighted gradient-echo imaging. For t2-weighted spin-echo imaging, both methods showed almost similar and satisfying performance, with a small but significant advantage for fiducial-based transformation. In contrast, for t1-weighted gradient-echo imaging, which suffers more geometric distortion than t2-weighted spin-echo imaging due to field inhomogeneities and gradient nonlinearity, mainly caused by a reduced bandwidth per pixel, anatomical landmark transformation delivered markedly better results. Here, fiducial-based transformation yielded results which are intolerable for stereotactic neurosurgery. Mean Euclidean distances between both transformation methods were 0.96 mm for t2-weighted spin-echo and 1.67 mm for t1-weighted gradient-echo imaging. Maximum deviations were 1.72 mm and 3.06 mm, respectively.

  20. Comparison of NDE standards in the frame of fracture mechanics approach

    International Nuclear Information System (INIS)

    Reale, S.; Capurro, E.; Corvi, A.

    1991-01-01

    The Design and Construction Codes are a set of rules brought together because they were considered the best available when the Codes were issued. A permanent objective must be to complete and improve these rules. This objective can be attained as the result of industrial experience and by means of research and development activities. Until recently, high-risk plants like nuclear plants were designed and built on the basis of the codes and standards of the country where the plant was to be built and operated, and this caused many disadvantages. On the contrary, the use of common codes and standards offers many advantages. A general objective is to compare codes in order to identify the differences in national rules and standards. The acceptance criteria based on nondestructive testing to reject dangerous defects are discussed. In this paper, the standards adopted in France, Germany, Italy and the United Kingdom are taken into consideration, and ultrasonic and radiographic inspections are selected. The methodology of this activity and the results of the comparison are reported. (K.I.)