WorldWideScience

Sample records for process efficiency analysis

  1. Efficiency analysis of wood processing industry in China during 2006-2015

    Science.gov (United States)

    Zhang, Kun; Yuan, Baolong; Li, Yanxuan

    2018-03-01

    The wood processing industry is an important industry that affects the national economy and social development. The data envelopment analysis (DEA) model is a quantitative evaluation method for studying industrial efficiency. In this paper, the wood processing industry of 8 provinces in southern China is taken as the study object; the efficiency of each province from 2006 to 2015 is measured and calculated with the DEA method, and the efficiency changes, technological changes and Malmquist index are analyzed dynamically. The empirical results show a widening gap in the efficiency of the wood processing industry across the 8 provinces, and that technological progress has lagged in promoting the industry's development. Based on these conclusions, and on the state of wood processing industry development at home and abroad, the government should introduce relevant policies to strengthen both the industry's technology innovation policy system and its coordinated industrial development system.
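
    As background on the method (not the authors' code), the sketch below shows how input-oriented CCR DEA efficiency scores of the kind reported in this study can be computed with an off-the-shelf linear-programming solver; the provincial input/output figures are invented placeholders, and the Malmquist decomposition over time is omitted.

      # Minimal sketch: input-oriented CCR DEA scores via scipy's LP solver.
      # The data below are made-up placeholders, not the paper's dataset.
      import numpy as np
      from scipy.optimize import linprog

      def ccr_efficiency(X, Y):
          """X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs). Returns CCR scores."""
          n, m = X.shape
          s = Y.shape[1]
          scores = []
          for o in range(n):
              # Decision vector z = [theta, lambda_1..lambda_n]; minimise theta.
              c = np.zeros(n + 1)
              c[0] = 1.0
              A_ub, b_ub = [], []
              for i in range(m):          # sum_j lambda_j * x_ij <= theta * x_io
                  A_ub.append(np.r_[-X[o, i], X[:, i]])
                  b_ub.append(0.0)
              for r in range(s):          # sum_j lambda_j * y_rj >= y_ro
                  A_ub.append(np.r_[0.0, -Y[:, r]])
                  b_ub.append(-Y[o, r])
              res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                            bounds=[(0, None)] * (n + 1), method="highs")
              scores.append(res.x[0])
          return np.array(scores)

      # Hypothetical example: 8 DMUs (provinces), 2 inputs, 1 output.
      rng = np.random.default_rng(0)
      X = rng.uniform(1, 10, size=(8, 2))
      Y = rng.uniform(1, 10, size=(8, 1))
      print(np.round(ccr_efficiency(X, Y), 3))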

  2. Process-based organization design and hospital efficiency.

    Science.gov (United States)

    Vera, Antonio; Kuntz, Ludwig

    2007-01-01

    The central idea of process-based organization design is that organizing a firm around core business processes leads to cost reductions and quality improvements. We investigated theoretically and empirically whether the implementation of a process-based organization design is advisable in hospitals. The data came from a database compiled by the Statistical Office of the German federal state of Rheinland-Pfalz and from a written questionnaire, which was sent to the chief executive officers (CEOs) of all 92 hospitals in this federal state. We used data envelopment analysis (DEA) to measure hospital efficiency, and factor analysis and regression analysis to test our hypothesis. Our principal finding is that a high degree of process-based organization has a moderate but significant positive effect on the efficiency of hospitals. The main implication is that hospitals should implement a process-based organization to improve their efficiency. However, to actually achieve positive effects on efficiency, it is of paramount importance to observe some implementation rules, in particular to mobilize physician participation and to create an adequate organizational culture.

  3. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML

    Directory of Open Access Journals (Sweden)

    Rhodri Cusack

    2015-01-01

    Recent years have seen neuroimaging data becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complex to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast and efficient for simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.

  4. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML.

    Science.gov (United States)

    Cusack, Rhodri; Vicente-Grabovetsky, Alejandro; Mitchell, Daniel J; Wild, Conor J; Auer, Tibor; Linke, Annika C; Peelle, Jonathan E

    2014-01-01

    Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast, and efficient for simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.
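
    aa itself is implemented in Matlab and configured with XML, so the snippet below is only a language-neutral illustration (here in Python, with hypothetical module names) of the dependency-tracking idea the abstract describes: order modules by their upstream dependencies and re-run only what has not been completed.

      # Illustrative sketch only -- not aa's actual engine or API.
      from graphlib import TopologicalSorter

      # Hypothetical module dependency map (module -> upstream modules it needs).
      pipeline = {
          "realign": set(),
          "coregister": {"realign"},
          "normalise": {"coregister"},
          "smooth": {"normalise"},
          "firstlevel": {"smooth"},
      }

      completed = {"realign", "coregister"}   # e.g. restored from a previous run

      def run(module):
          print(f"running {module}")          # stand-in for the real analysis step

      # Execute only what is missing, in dependency order; independent modules
      # at the same depth could be dispatched in parallel to a cluster or cloud.
      for module in TopologicalSorter(pipeline).static_order():
          if module in completed:
              print(f"skipping {module} (already done)")
          else:
              run(module)
              completed.add(module)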

  5. Comparison of second-generation processes for the conversion of sugarcane bagasse to liquid biofuels in terms of energy efficiency, pinch point analysis and Life Cycle Analysis

    International Nuclear Information System (INIS)

    Petersen, A.M.; Melamu, Rethabi; Knoetze, J.H.; Görgens, J.F.

    2015-01-01

    Highlights: • Process evaluation of thermochemical and biological routes for bagasse to fuels. • Pinch point analysis increases overall efficiencies by reducing utility consumption. • The advanced biological route increased efficiency and local environmental impacts. • Thermochemical routes have the highest efficiencies and low life cycle impacts. - Abstract: Three alternative processes for the production of liquid transportation biofuels from sugarcane bagasse were compared from the perspective of energy efficiency, using process modelling, Process Environmental Assessments and Life Cycle Assessment. Bio-ethanol via two biological processes was considered, i.e. Separate Hydrolysis and Fermentation (Process 1) and Simultaneous Saccharification and Fermentation (Process 2), in comparison to Gasification and Fischer-Tropsch synthesis for the production of synthetic fuels (Process 3). The energy efficiency of each process scenario was maximised by pinch point analysis for heat integration. The more advanced bio-ethanol process was Process 2, which had the higher energy efficiency, at 42.3%. Heat integration was critical for Process 3, whose energy efficiency was increased from 51.6% to 55.7%. For both the Process Environmental and Life Cycle Assessments, Process 3 had the least potential for detrimental environmental impacts, due to its relatively high energy efficiency. Process 2 had the greatest Process Environmental Impact due to the intensive use of processing chemicals. Regarding the Life Cycle Assessments, Process 1 was the most severe due to its low energy efficiency.
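
    The pinch point analysis referred to above reduces, at its core, to the standard problem-table cascade for utility targeting; the sketch below illustrates it on two invented hot and two invented cold streams (not the bagasse process streams), returning minimum hot and cold utility targets and the pinch temperature.

      # Minimal problem-table (pinch analysis) sketch with made-up stream data.
      # CP is the heat-capacity flow rate; temperatures in deg C, duties in kW.
      def pinch_targets(hot, cold, dt_min=10.0):
          """hot/cold: lists of (T_supply, T_target, CP). Returns (Qh_min, Qc_min, T_pinch)."""
          shifted = []                         # put both sides on one shifted scale
          for Ts, Tt, cp in hot:
              shifted.append((Ts - dt_min / 2, Tt - dt_min / 2, cp, "hot"))
          for Ts, Tt, cp in cold:
              shifted.append((Ts + dt_min / 2, Tt + dt_min / 2, cp, "cold"))
          bounds = sorted({t for s in shifted for t in s[:2]}, reverse=True)

          cascade, heat = [0.0], 0.0
          for Thi, Tlo in zip(bounds, bounds[1:]):
              net_cp = 0.0
              for Ts, Tt, cp, kind in shifted:
                  lo, hi = min(Ts, Tt), max(Ts, Tt)
                  if lo <= Tlo and hi >= Thi:  # stream spans this interval
                      net_cp += cp if kind == "hot" else -cp
              heat += net_cp * (Thi - Tlo)     # interval surplus (+) or deficit (-)
              cascade.append(heat)

          qh_min = -min(cascade)               # hot utility needed to avoid negatives
          qc_min = cascade[-1] + qh_min        # heat left at the bottom of the cascade
          t_pinch = bounds[cascade.index(min(cascade))]
          return qh_min, qc_min, t_pinch

      hot_streams = [(250.0, 40.0, 0.15), (200.0, 80.0, 0.25)]
      cold_streams = [(20.0, 180.0, 0.20), (140.0, 230.0, 0.30)]
      print(pinch_targets(hot_streams, cold_streams, dt_min=10.0))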

  6. Efficiency analysis system of material management

    Directory of Open Access Journals (Sweden)

    Bogusław Śliwczyński

    2012-12-01

    Background: A significant part of managing enterprise efficiency is improving the material management process at both the strategic and operational levels. The complexity of material flow processes creates the risk that analysis becomes scattered and disintegrated, since it must cover many different factors influencing effective sourcing and procurement management, transport and warehousing processes, inventory management, and working capital and cash flow management. Material and methods: The article focuses on a multidimensional and multi-criteria analysis of material management efficiency, considered as a decision support system. The authors present research results showing that ineffective material management is accompanied by insufficient analytical support for various procurement decisions. Results and conclusions: Based on the research results, the authors present a model of an efficiency analysis system for material management.

  7. Energy and environment efficiency analysis based on an improved environment DEA cross-model: Case study of complex chemical processes

    International Nuclear Information System (INIS)

    Geng, ZhiQiang; Dong, JunGen; Han, YongMing; Zhu, QunXiong

    2017-01-01

    Highlights: • An improved environment DEA cross-model method is proposed. • An energy and environment efficiency analysis framework for complex chemical processes is obtained. • The proposed method is efficient for energy saving and emission reduction in complex chemical processes. - Abstract: The complex chemical process is a high-pollution and high-energy-consumption industrial process. Therefore, it is very important to analyze and evaluate the energy and environment efficiency of the complex chemical process. Data Envelopment Analysis (DEA) is used to evaluate the relative effectiveness of decision-making units (DMUs). However, the traditional DEA method usually cannot genuinely distinguish effective from inefficient DMUs due to its extreme or unreasonable weight distribution over input and output variables. Therefore, this paper proposes an energy and environment efficiency analysis method based on an improved environment DEA cross-model (DEACM) method. The inputs of the complex chemical process are divided into energy and non-energy inputs, while the outputs are divided into desirable and undesirable outputs. Then the energy and environment performance index (EEPI), based on cross evaluation, is used to represent the overall performance of each DMU. Moreover, the improvement direction for energy saving and carbon emission reduction of each inefficient DMU is quantitatively obtained from the self-evaluation model of the improved environment DEACM. The results show that the improved environment DEACM method discriminates effective DMUs better than the original DEA method when analyzing the energy and environment efficiency of the ethylene production process in complex chemical processes, and it can obtain the energy-saving and carbon emission reduction potential of ethylene plants, especially the improvement direction of inefficient DMUs to improve energy efficiency and reduce carbon emissions.
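
    For readers unfamiliar with cross evaluation, the sketch below shows a plain DEA cross-efficiency score; the paper's environment DEACM additionally separates energy from non-energy inputs and desirable from undesirable outputs, which is omitted here, and the data are hypothetical.

      # Sketch of plain DEA cross-efficiency scoring (hypothetical data; the
      # environment-adjusted variant of the paper is not reproduced here).
      import numpy as np
      from scipy.optimize import linprog

      def ccr_weights(X, Y, k):
          """Multiplier-form CCR for DMU k: returns optimal input/output weights."""
          n, m = X.shape
          s = Y.shape[1]
          c = np.r_[np.zeros(m), -Y[k]]                    # maximise u . y_k
          A_eq = np.r_[X[k], np.zeros(s)].reshape(1, -1)   # subject to v . x_k = 1
          A_ub = np.hstack([-X, Y])                        # u . y_j - v . x_j <= 0
          res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0],
                        bounds=[(0, None)] * (m + s), method="highs")
          return res.x[:m], res.x[m:]

      def cross_efficiency(X, Y):
          n = X.shape[0]
          E = np.zeros((n, n))
          for k in range(n):
              v, u = ccr_weights(X, Y, k)
              E[k] = (Y @ u) / (X @ v)          # rate every DMU with DMU k's weights
          return E.mean(axis=0)                 # average cross-evaluation score per DMU

      rng = np.random.default_rng(1)
      X = rng.uniform(1, 5, size=(6, 2))        # e.g. energy and non-energy inputs
      Y = rng.uniform(1, 5, size=(6, 1))        # e.g. a desirable output
      print(np.round(cross_efficiency(X, Y), 3))

    Because CCR weights can have alternative optima, practical cross-efficiency studies usually also fix a secondary (aggressive or benevolent) goal when choosing the weights; that refinement is likewise omitted from this sketch.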

  8. The impact of transport processes standardization on supply chain efficiency

    Directory of Open Access Journals (Sweden)

    Maciej Stajniak

    2016-03-01

    Background: Under continuous market competition focused on customer service levels, lead times and supply flexibility, it is very important to analyze the efficiency of logistics processes. Analysis of supply chain efficiency is one of the fundamental elements of controlling analysis. Transport processes are the key processes that provide the physical material flow through the supply chain. Therefore, in this article the authors focus on the efficiency of transport processes. Methods: The research was carried out in the second half of 2014 in 210 enterprises of the Wielkopolska Region. Observations and business practice studies conducted by the authors demonstrate a significant impact of process standardization on supply chain efficiency. Based on the research results, the processes assessed as necessary to standardize in business practice were identified. Results: Building on these results and observations, the authors developed standards for transport processes in BPMN notation. BPMN allows the authors to conduct multivariate simulations of these processes in further stages of the research. Conclusions: The developed standards are the initial stage of the authors' research on the assessment of transport process efficiency. A further research direction is to analyze how efficiently the transport process standards are used in business practice and their impact on the effectiveness of the entire supply chain.

  9. The analysis of the process in the cooling tower with the low efficiency

    Science.gov (United States)

    Badriev, A. I.; Sharifullin, V. N.

    2017-11-01

    Maintaining a temperature drop of 11-12 degrees in the cooling towers of thermal power plants, so as to ensure the required depth of cooling and vacuum in the condenser, is quite a difficult task. When the efficiency of the apparatus is low, this requirement can only be met by reducing the hydraulic load. The aim of this work was to analyze the process in such a unit and to identify the causes of its poor performance. One possible cause is the heterogeneity of the process within the volume of the apparatus. We therefore investigated experimentally the distribution of the irrigation water and the air flow over the cross section of industrial cooling towers. We found a significantly uneven distribution of water and air flows within the volume of the apparatus, and we showed theoretically that the uneven distribution of irrigation leads to a significant decrease in the efficiency of evaporation in the cooling tower. The velocity distribution of the air, both across the tower sections and inside each section, is of particular interest. The experimental data made it possible to establish the internal relationships between the distribution of irrigation density over the sections of the apparatus and the distributions of temperature change and air velocity. These results allowed us to formulate a methodology for identifying process problems and to develop measures to increase the efficiency of the cooling tower.

  10. A novel application of exergy analysis: Lean manufacturing tool to improve energy efficiency and flexibility of hydrocarbon processing

    International Nuclear Information System (INIS)

    Haragovics, Máté; Mizsey, Péter

    2014-01-01

    This work investigates techniques for evaluating distillation structures from a lean manufacturing point of view. The oil and gas industry has already started adopting lean manufacturing principles in different types of processes, from information flow to processing technologies. Generally, energy costs are the most important factors in processing hydrocarbons. Introducing the flexibility desired by lean principles may conflict with the energy efficiency of the system; moreover, the economic optimum is not necessarily the energetic optimum. Therefore, all possible changes due to temporarily stopped or not fully utilised plants have to be investigated, resulting in a large number of cases that have to be evaluated. Exergy analysis can be used for this evaluation, as it covers all energy types and the evaluation is straightforward. In this paper plain distillation structures are investigated, and the boundaries of the systems are set up according to the status of the site. Four-component case studies are presented which show that the very same distillation structure can be more or less efficient depending on the status of the industrial site. It is also shown that exergy analysis applied with different boundaries to the same system can reveal the flexibility of the system and its potentials. - Highlights: • The article focuses on the flexibility aspect of lean manufacturing. • Exergy analysis of distillation scheme alternatives and their energy efficiency. • Different boundaries defining different scenarios of the same system are investigated. • The energy efficiency of distillation schemes also depends on their operating mode. • The exergy reserves of a distillation system can be revealed with exergy analysis.
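
    For orientation, the exergy quantities behind such an analysis follow the standard textbook definitions (stated here as background, not taken from the paper): for a stream of enthalpy H and entropy S referred to the environment state (T_0, p_0),

      \[
      Ex = (H - H_0) - T_0\,(S - S_0), \qquad
      \eta_{Ex} = \frac{\sum Ex_{\text{useful out}}}{\sum Ex_{\text{in}}}, \qquad
      Ex_{\text{destroyed}} = \sum Ex_{\text{in}} - \sum Ex_{\text{out}} .
      \]

    Because exergy weights every energy flow by its capacity to do work, steam, fuel and electricity are measured with the same yardstick, which is what makes the comparison across boundaries and operating modes straightforward.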

  11. Production and efficiency analysis with R

    CERN Document Server

    Behr, Andreas

    2015-01-01

    This textbook introduces essential topics and techniques in production and efficiency analysis and shows how to apply these methods using the statistical software R. Numerous small simulations lead to a deeper understanding of random processes assumed in the models and of the behavior of estimation techniques. Step-by-step programming provides an understanding of advanced approaches such as stochastic frontier analysis and stochastic data envelopment analysis. The text is intended for master students interested in empirical production and efficiency analysis. Readers are assumed to have a general background in production economics and econometrics, typically taught in introductory microeconomics and econometrics courses.

  12. Relative Efficiency Indicators of the Credit Management Process in a Colombian Bank by Means of Data Envelopment Analysis (DEA)

    OpenAIRE

    Sánchez-Gooding, Sandra Paola; Universidad Nacional de Colombia; Rodríguez-Lozano, Gloria Isabel; Universidad Nacional de Colombia

    2017-01-01

    The purpose of this work is to measure the relative efficiency of the units that take part in the credit management process in a Colombian bank by means of the use of Data Envelopment Analysis (DEA). Using a double optimization process, this advanced linear programming methodology generates a single relative efficiency index for each one of the units being studied, although it is capable of including multiple resources and multiple outputs. In the bank that was used as the object of this study...

  13. Development of energy-efficient processes for natural gas liquids recovery

    International Nuclear Information System (INIS)

    Yoon, Sekwang; Binns, Michael; Park, Sangmin; Kim, Jin-Kuk

    2017-01-01

    A new NGL (natural gas liquids) recovery process configuration is proposed which can offer improved energy efficiency and hydrocarbon recovery. The new process configuration is an evolution of the conventional turboexpander processes with the introduction of a split stream transferring part of the feed to the demethanizer column. In this way additional heat recovery is possible, which improves the energy efficiency of the process. To evaluate the new process configuration, a number of different NGL recovery process configurations are optimized and compared using a process simulator linked interactively with external optimization methods. Process integration methodology is applied as part of the optimization to improve energy recovery during the optimization. Analysis of the new process configuration compared with conventional turbo-expander process designs demonstrates the benefits of the new process configuration. - Highlights: • Development of a new energy-efficient natural gas liquids recovery process. • Improving energy recovery with application of process integration techniques. • Considering multiple different structural changes leads to considerable energy savings.

  14. Influence analysis of sewage sludge methane fermentation parameters on process efficiency

    OpenAIRE

    Катерина Борисівна Сорокіна

    2016-01-01

    The dependence of the efficiency of sewage sludge organic matter decomposition on the organization and conditions of the process is analyzed. Maintaining the optimal values of several parameters ensures the completeness of the sludge fermentation process and the production of biogas in the calculated amount. Biogas utilization reduces reactor heating costs and provides additional energy of other types.

  15. Influence analysis of sewage sludge methane fermentation parameters on process efficiency

    Directory of Open Access Journals (Sweden)

    Катерина Борисівна Сорокіна

    2016-12-01

    The dependence of the efficiency of sewage sludge organic matter decomposition on the organization and conditions of the process is analyzed. Maintaining the optimal values of several parameters ensures the completeness of the sludge fermentation process and the production of biogas in the calculated amount. Biogas utilization reduces reactor heating costs and provides additional energy of other types.

  16. Increased Efficiencies in the INEEL SAR/TSR/USQ Process

    International Nuclear Information System (INIS)

    Cole, N.E.

    2002-01-01

    The Idaho National Engineering and Environmental Laboratory (INEEL) has implemented a number of efficiencies to reduce the time and cost of preparing safety basis documents. The INEEL is continuing to examine other aspects of the safety basis process to identify further efficiencies that can be implemented while remaining in compliance with Title 10 Code of Federal Regulations (CFR) Part 830. A six-sigma approach is used to identify areas for efficiency improvement and to develop the action plan for implementing the new process, as applicable. Three improvement processes have been implemented. The first was the development of standardized Documented Safety Analysis (DSA) and technical safety requirement (TSR) documents that all nuclear facilities use by adding facility-specific details. The second is a material procurement process based on the safety systems specified in the individual safety basis documents. The third is a restructuring of the entire safety basis preparation and approval process. Significant savings in the time to prepare safety basis documents, the cost of materials, and the total cost of the documents are currently being realized.

  17. EFFICIENT QUANTITATIVE RISK ASSESSMENT OF JUMP PROCESSES: IMPLICATIONS FOR FOOD SAFETY

    OpenAIRE

    Nganje, William E.

    1999-01-01

    This paper develops a dynamic framework for efficient quantitative risk assessment from the simplest general risk, combining three parameters (contamination, exposure, and dose response) in a Kataoka safety-first model and a Poisson probability representing the uncertainty effect or jump processes associated with food safety. Analysis indicates that incorporating jump processes in food safety risk assessment provides more efficient cost/risk tradeoffs. Nevertheless, increased margin of safety...

  18. Overall equipment efficiency of Flexographic Printing process: A case study

    Science.gov (United States)

    Zahoor, S.; Shehzad, A.; Mufti, NA; Zahoor, Z.; Saeed, U.

    2017-12-01

    This paper reports the efficiency improvement of a flexographic printing machine by reducing breakdown time with the help of a total productive maintenance measure called overall equipment efficiency (OEE). The methodology comprises calculating the OEE of the machine before and after identifying the causes of the problems. A Pareto diagram is used to prioritize the main problem areas, and a 5-whys analysis is used to identify the root causes of these problems. The OEE of the process improved from 34% to 40.2% over a 30-day period. It is concluded that OEE and 5-whys analysis are useful for improving equipment effectiveness as well as for continuous process improvement.
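
    The OEE figure itself is the product of availability, performance and quality; the snippet below illustrates the standard calculation on an invented shift record (the press's actual loss data are not reproduced here).

      # Standard OEE = availability x performance x quality; figures are invented.
      def oee(planned_time, downtime, ideal_cycle_time, total_count, good_count):
          run_time = planned_time - downtime
          availability = run_time / planned_time
          performance = (ideal_cycle_time * total_count) / run_time
          quality = good_count / total_count
          return availability * performance * quality

      # Hypothetical 8-hour (480 min) flexographic press shift:
      print(round(oee(planned_time=480, downtime=90, ideal_cycle_time=0.8,
                      total_count=400, good_count=368), 3))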

  19. Measuring the efficiency of zakat collection process using data envelopment analysis

    Science.gov (United States)

    Hamzah, Ahmad Aizuddin; Krishnan, Anath Rau

    2016-10-01

    It is necessary for each zakat institution in the nation to measure and understand, in a timely manner, its efficiency in collecting zakat for the sake of continuous improvement. Pusat Zakat Sabah, Malaysia, which began operating in early 2007, is no exception. However, measuring collection efficiency is not an easy task, as it usually requires the consideration of multiple inputs and/or outputs. This paper sequentially employed three data envelopment analysis models, namely the Charnes-Cooper-Rhodes (CCR) primal model, the CCR dual model, and the slack-based model, to quantitatively evaluate the efficiency of zakat collection in Sabah from 2007 to 2015, treating each year as a decision-making unit. The three models were developed based on two inputs (i.e. number of zakat branches and number of staff) and one output (i.e. total collection). The causes of not achieving efficiency, and suggestions on how the efficiency in each year could have been improved, are presented.

  20. Decomposition based parallel processing technique for efficient collaborative optimization

    International Nuclear Information System (INIS)

    Park, Hyung Wook; Kim, Sung Chan; Kim, Min Soo; Choi, Dong Hoon

    2000-01-01

    In practical design studies, most designers solve multidisciplinary problems with a complex design structure. These multidisciplinary problems involve hundreds of analyses and thousands of variables. The sequence of processes used to solve these problems affects the speed of the total design cycle, so it is very important for designers to reorder the original design processes to minimize total cost and time. This is accomplished by decomposing the large multidisciplinary problem into several MultiDisciplinary Analysis SubSystems (MDASS) and processing them in parallel. This paper proposes a new strategy for parallel decomposition of multidisciplinary problems that raises design efficiency by using a genetic algorithm, and shows the relationship between decomposition and Multidisciplinary Design Optimization (MDO) methodology.

  1. Development of an eco-efficient product/process for the vulcanising industry

    Directory of Open Access Journals (Sweden)

    Becerra, M. B.

    2014-08-01

    This paper presents the development of an eco-efficient product/process with improved mechanical properties obtained by introducing natural fibres into the EPDM (Ethylene-Propylene-Diene-Terpolymer) rubber formulation. The optimisation analysis is carried out with a 2^(11-7) fractional factorial design. Different formulations were evaluated using a multi-response desirability function, with the aim of finding efficient levels for the manufacturing time-cycle, improving the mechanical properties of the product, and reducing the raw material costs. The development of an eco-efficient product/process provides a sustainable alternative to conventional manufacturing.
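
    The multi-response desirability approach scores each response on a 0-1 scale and combines the scores by a geometric mean; the sketch below illustrates this Derringer-Suich-style calculation with invented responses and limits rather than the study's measured values.

      # Desirability-function sketch; the response values and limits are hypothetical.
      import numpy as np

      def d_larger(y, low, target):
          """Desirability for a response to maximise (e.g. tensile strength)."""
          return float(np.clip((y - low) / (target - low), 0.0, 1.0))

      def d_smaller(y, target, high):
          """Desirability for a response to minimise (e.g. cycle time, cost)."""
          return float(np.clip((high - y) / (high - target), 0.0, 1.0))

      def overall_desirability(ds):
          return float(np.prod(ds) ** (1.0 / len(ds)))   # geometric mean

      # One hypothetical EPDM/natural-fibre formulation:
      d = [
          d_larger(y=11.2, low=8.0, target=14.0),        # tensile strength (MPa)
          d_smaller(y=9.5, target=7.0, high=12.0),       # manufacturing time-cycle (min)
          d_smaller(y=2.1, target=1.5, high=3.0),        # raw-material cost (USD/kg)
      ]
      print(round(overall_desirability(d), 3))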

  2. Parallel processing based decomposition technique for efficient collaborative optimization

    International Nuclear Information System (INIS)

    Park, Hyung Wook; Kim, Sung Chan; Kim, Min Soo; Choi, Dong Hoon

    2001-01-01

    In practical design studies, most designers solve multidisciplinary problems with large and complex design systems. These multidisciplinary problems involve hundreds of analyses and thousands of variables. The sequence of processes used to solve these problems affects the speed of the total design cycle, so it is very important for designers to reorder the original design processes to minimize total computational cost. This is accomplished by decomposing the large multidisciplinary problem into several MultiDisciplinary Analysis SubSystems (MDASS) and processing them in parallel. This paper proposes a new strategy for parallel decomposition of multidisciplinary problems that raises design efficiency by using a genetic algorithm, and shows the relationship between decomposition and Multidisciplinary Design Optimization (MDO) methodology.

  3. Idaho Chemical Processing Plant Process Efficiency Improvements

    International Nuclear Information System (INIS)

    Griebenow, B.

    1996-03-01

    In response to decreasing funding levels available to support activities at the Idaho Chemical Processing Plant (ICPP) and a desire to be cost competitive, the Department of Energy Idaho Operations Office (DOE-ID) and Lockheed Idaho Technologies Company have increased their emphasis on cost-saving measures. The ICPP Effectiveness Improvement Initiative involves many activities to improve cost effectiveness and competitiveness. This report documents the methodology and results of one of those cost-cutting measures, the Process Efficiency Improvement Activity. The Process Efficiency Improvement Activity performed a systematic review of major work processes at the ICPP to increase productivity and to identify non-value-added requirements. A two-phase approach was selected for the activity to allow for near-term implementation of relatively easy process modifications in the first phase while obtaining long-term continuous improvement in the second phase and beyond. Phase I of the initiative included a concentrated review of processes that had a high potential for cost savings, with the intent of realizing savings in Fiscal Year 1996 (FY-96). Phase II consists of implementing long-term strategies too complex for Phase I implementation and evaluating processes not targeted for Phase I review. The Phase II effort is targeted at realizing cost savings in FY-97 and beyond.

  4. Efficient Bayesian hierarchical functional data analysis with basis function approximations using Gaussian-Wishart processes.

    Science.gov (United States)

    Yang, Jingjing; Cox, Dennis D; Lee, Jong Soo; Ren, Peng; Choi, Taeryon

    2017-12-01

    Functional data are defined as realizations of random functions (mostly smooth functions) varying over a continuum, which are usually collected on discretized grids with measurement errors. In order to accurately smooth noisy functional observations and deal with the issue of high-dimensional observation grids, we propose a novel Bayesian method based on the Bayesian hierarchical model with a Gaussian-Wishart process prior and basis function representations. We first derive an induced model for the basis-function coefficients of the functional data, and then use this model to conduct posterior inference through Markov chain Monte Carlo methods. Compared to standard Bayesian inference, which suffers from a serious computational burden and instability in analyzing high-dimensional functional data, our method greatly improves the computational scalability and stability, while inheriting the advantage of simultaneously smoothing raw observations and estimating the mean-covariance functions in a nonparametric way. In addition, our method can naturally handle functional data observed on random or uncommon grids. Simulation and real-data studies demonstrate that our method produces similar results to those obtainable by the standard Bayesian inference with low-dimensional common grids, while efficiently smoothing and estimating functional data with random and high-dimensional observation grids when the standard Bayesian inference fails. In conclusion, our method can efficiently smooth and estimate high-dimensional functional data, providing one way to resolve the curse of dimensionality for Bayesian functional data analysis with Gaussian-Wishart processes. © 2017, The International Biometric Society.
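
    A heavily simplified, non-Bayesian analogue of the basis-function step is sketched below: noisy observations on a dense grid are projected onto a small basis by ridge-regularised least squares, so that later analysis can work with a short coefficient vector; the Gaussian-Wishart prior and the MCMC inference of the actual method are not reproduced.

      # Simplified illustration only: basis-function compression of functional data.
      import numpy as np

      rng = np.random.default_rng(0)
      grid = np.linspace(0, 1, 500)                      # high-dimensional observation grid
      truth = np.sin(2 * np.pi * grid) + 0.5 * np.cos(6 * np.pi * grid)
      y = truth + rng.normal(scale=0.3, size=grid.size)  # noisy functional observation

      # Gaussian radial basis functions centred on a coarse set of knots.
      knots = np.linspace(0, 1, 15)
      width = 0.08
      B = np.exp(-0.5 * ((grid[:, None] - knots[None, :]) / width) ** 2)  # (500, 15)

      # Ridge-regularised least squares for the basis coefficients.
      lam = 1e-2
      coef = np.linalg.solve(B.T @ B + lam * np.eye(knots.size), B.T @ y)
      smooth = B @ coef

      print("grid size:", grid.size, "-> coefficients:", coef.size)
      print("RMSE vs truth:", round(float(np.sqrt(np.mean((smooth - truth) ** 2))), 3))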

  5. Process efficiency. Redesigning social networks to improve surgery patient flow.

    Science.gov (United States)

    Samarth, Chandrika N; Gloor, Peter A

    2009-01-01

    We propose a novel approach to improve throughput of the surgery patient flow process of a Boston area teaching hospital. A social network analysis was conducted in an effort to demonstrate that process efficiency gains could be achieved through redesign of social network patterns at the workplace, in conjunction with redesign of the organization structure and the implementation of workflow over an integrated information technology system. Key knowledge experts and coordinators in times of crisis were identified, and a new communication structure more conducive to trust and knowledge sharing was suggested. The new communication structure is scalable without compromising the coordination required among key roles in the network for achieving efficiency gains.

  6. IRB Process Improvements: A Machine Learning Analysis.

    Science.gov (United States)

    Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A

    2017-06-01

    Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single-variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on the initially identified predictors, changes to IRB workflow and staffing procedures were instituted and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process, including the type of IRB review to be conducted, whether a protocol falls under Veterans Administration purview, and the specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.

  7. Efficiency of manufacturing processes: energy and ecological perspectives

    CERN Document Server

    Li, Wen

    2015-01-01

    This monograph presents a reliable methodology for characterising the energy and eco-efficiency of unit manufacturing processes. The Specific Energy Consumption, SEC, will be identified as the key indicator for the energy efficiency of unit processes. An empirical approach will be validated on different machine tools and manufacturing processes to depict the relationship between process parameters and energy consumptions. Statistical results and additional validation runs will corroborate the high level of accuracy in predicting the energy consumption. In relation to the eco-efficiency, the value and the associated environmental impacts of manufacturing processes will also be discussed. The interrelationship between process parameters, process value and the associated environmental impact will be integrated in the evaluation of eco-efficiency. The book concludes with a further investigation of the results in order to develop strategies for further efficiency improvement. The target audience primarily co...

  8. Development of a strategy for energy efficiency improvement in a Kraft process based on systems interactions analysis

    Science.gov (United States)

    Mateos-Espejel, Enrique

    The objective of this thesis is to develop, validate, and apply a unified methodology for the energy efficiency improvement of a Kraft process that addresses globally the interactions of the various process systems that affect its energy performance. An implementation strategy is the final result. An operating Kraft pulping mill situated in Eastern Canada with a production of 700 adt/d of high-grade bleached pulp was the case study. The Pulp and Paper industry is Canada's premier industry. It is characterized by large thermal energy and water consumption. Rising energy costs and more stringent environmental regulations have led the industry to refocus its efforts toward identifying ways to improve energy and water conservation. Energy and water aspects are usually analyzed independently, but in reality they are strongly interconnected. Therefore, there is a need for an integrated methodology, which considers energy and water aspects, as well as the optimal utilization and production of the utilities. The methodology consists of four successive stages. The first stage is the base case definition. The development of a focused, reliable and representative model of an operating process is a prerequisite to the optimization and fine-tuning of its energy performance. A four-pronged procedure has been developed: data gathering, master diagram, utilities systems analysis, and simulation. The computer simulation has been focused on the energy and water systems. The second stage corresponds to the benchmarking analysis. The benchmarking of the base case has the objectives of identifying the process inefficiencies and establishing guidelines for the development of effective enhancement measures. The studied process is evaluated by a comparison of its efficiency to the current practice of the industry and by the application of new energy and exergy content indicators. The minimum energy and water requirements of the process are also determined in this step. The third stage is

  9. Efficient surrogate models for reliability analysis of systems with multiple failure modes

    International Nuclear Information System (INIS)

    Bichon, Barron J.; McFarland, John M.; Mahadevan, Sankaran

    2011-01-01

    Despite many advances in the field of computational reliability analysis, the efficient estimation of the reliability of a system with multiple failure modes remains a persistent challenge. Various sampling and analytical methods are available, but they typically require accepting a tradeoff between accuracy and computational efficiency. In this work, a surrogate-based approach is presented that simultaneously addresses the issues of accuracy, efficiency, and unimportant failure modes. The method is based on the creation of Gaussian process surrogate models that are required to be locally accurate only in the regions of the component limit states that contribute to system failure. This approach to constructing surrogate models is demonstrated to be both an efficient and accurate method for system-level reliability analysis. - Highlights: → Extends efficient global reliability analysis to systems with multiple failure modes. → Constructs locally accurate Gaussian process models of each response. → Highly efficient and accurate method for assessing system reliability. → Effectiveness is demonstrated on several test problems from the literature.
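
    The idea can be illustrated with a stripped-down example: fit a Gaussian process surrogate to each component limit state from a modest number of model runs, then estimate the series-system failure probability by Monte Carlo on the surrogates. The limit states below are invented, and the adaptive, locally focused refinement near the limit states that the paper relies on is omitted.

      # Simplified sketch of surrogate-based system reliability (two failure modes).
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      def g1(x):  # hypothetical failure mode 1: fails where g1 < 0
          return 3.0 - x[:, 0] ** 2 + 0.5 * x[:, 1]

      def g2(x):  # hypothetical failure mode 2
          return 2.5 + x[:, 0] - x[:, 1] ** 2

      rng = np.random.default_rng(0)
      X_train = rng.normal(size=(60, 2))            # "expensive" model evaluations
      gps = [GaussianProcessRegressor(kernel=RBF(1.0), alpha=1e-8).fit(X_train, g(X_train))
             for g in (g1, g2)]

      X_mc = rng.normal(size=(50_000, 2))           # cheap Monte Carlo on the surrogates
      g_min = np.min([gp.predict(X_mc) for gp in gps], axis=0)
      pf_surrogate = np.mean(g_min < 0.0)           # series system: fails if any mode fails
      pf_direct = np.mean(np.minimum(g1(X_mc), g2(X_mc)) < 0.0)
      print(f"P_f surrogate ~ {pf_surrogate:.4f}, direct MC ~ {pf_direct:.4f}")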

  10. Thermodynamic analysis applied to a food-processing plant

    Energy Technology Data Exchange (ETDEWEB)

    Ho, J C; Chandratilleke, T T

    1987-01-01

    Two production lines of a multi-product, food-processing plant are selected for energy auditing and analysis. Thermodynamic analysis showed that the first-law and second-law efficiencies are 81.5% and 26.1% for the instant-noodles line and 23.6% and 7.9% for the malt-beverage line. These efficiency values are dictated primarily by the major energy-consuming sub-processes of each production line. Improvements in both first-law and second-law efficiencies are possible for the plants if the use of steam for heating is replaced by gaseous or liquid fuels, the steam ejectors used for creating vacuum are replaced by a mechanical pump, and the cooler surroundings are employed to assist in the cooling process.
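
    The two efficiencies quoted above follow the usual first-law and second-law definitions (standard thermodynamics, not specific to this study):

      \[
      \eta_{\mathrm{I}} = \frac{\text{useful energy delivered}}{\text{energy input}}, \qquad
      \eta_{\mathrm{II}} = \frac{\text{exergy delivered in the useful output}}{\text{exergy input}} .
      \]

    The large gap between them (81.5% versus 26.1% for the instant-noodles line) signals substantial exergy destruction, consistent with the suggested measures of replacing steam heating with direct fuel firing, replacing the steam ejectors with a mechanical pump, and using the cooler surroundings to assist cooling.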

  11. Simulation based energy-resource efficient manufacturing integrated with in-process virtual management

    Science.gov (United States)

    Katchasuwanmanee, Kanet; Cheng, Kai; Bateman, Richard

    2016-09-01

    As energy efficiency is one of the key essentials towards sustainability, the development of an energy-resource efficient manufacturing system is among the great challenges facing the current industry. Meanwhile, the availability of advanced technological innovation has created more complex manufacturing systems that involve a large variety of processes and machines serving different functions. To extend the limited knowledge on energy-efficient scheduling, the research presented in this paper attempts to model the production schedule of an operation process by considering the balance of energy consumption reduction in production, production work flow (productivity) and quality. An innovative systematic approach to manufacturing energy-resource efficiency is proposed with virtual simulation as a predictive modelling enabler, which provides real-time manufacturing monitoring, virtual displays and decision-making, and consequently an analytical and multidimensional correlation analysis of the interdependent relationships among energy consumption, work flow and quality errors. The regression analysis results demonstrate positive relationships between work flow and quality errors and between work flow and energy consumption. When production scheduling is controlled through optimization of work flow, quality errors and overall energy consumption, energy-resource efficiency can be achieved in production. Together, this proposed multidimensional modelling and analysis approach provides optimal conditions for production scheduling at the manufacturing system by taking account of production quality, energy consumption and resource efficiency, which can lead to key competitive advantages and sustainability of the system operations in the industry.

  12. Eco-efficiency of grinding processes and systems

    CERN Document Server

    Winter, Marius

    2016-01-01

    This research monograph aims at presenting an integrated assessment approach to describe, model, evaluate and improve the eco-efficiency of existing and new grinding processes and systems. Various combinations of grinding process parameters and system configurations can be evaluated based on the eco-efficiency. The book presents the novel concept of empirical and physical modeling of technological, economic and environmental impact indicators. This includes the integrated evaluation of different grinding process and system scenarios. The book is a valuable read for research experts and practitioners in the field of eco-efficiency of manufacturing processes, but it may also be beneficial for graduate students.

  13. Energy and exergy analysis of the silicon production process

    International Nuclear Information System (INIS)

    Takla, M.; Kamfjord, N.E.; Tveit, Halvard; Kjelstrup, S.

    2013-01-01

    We used energy and exergy analysis to evaluate two industrial and one ideal (theoretical) production process for silicon. The industrial processes were considered in the absence and presence of power production from waste heat in the off-gas. The theoretical process, with pure reactants and no side-reactions, was used to provide a more realistic upper limit of performance for the others. The energy analysis documented the large thermal energy source in the off-gas system, while the exergy analysis documented the potential for efficiency improvement. We found an exergetic efficiency equal to 0.33 ± 0.02 for the process without power production. The value increased to 0.41 ± 0.03 when waste heat was utilized. For the ideal process, we found an exergetic efficiency of 0.51. Utilization of thermal exergy in an off-gas of 800 °C increased this exergetic efficiency to 0.71. Exergy destroyed due to combustion of by-product gases and exergy lost with the furnace off-gas were the largest contributors to the thermodynamic inefficiency of all processes. - Highlights: • The exergetic efficiency for an industrial silicon production process when silicon is the only product was estimated at 0.33. • With additional power production from thermal energy in the off-gas, we estimated the exergetic efficiency at 0.41. • The theoretical silicon production process is established as the reference case. • Exergy lost with the off-gas and exergy destroyed due to combustion account for roughly 75% of the total losses. • With utilization of the thermal exergy in the off-gas at a temperature of 800 °C, the exergetic efficiency was 0.71.

  14. Tolerance analysis on diffraction efficiency and polychromatic integral diffraction efficiency for harmonic diffractive optics

    Science.gov (United States)

    Shan, Mao

    2016-10-01

    In this dissertation, a mathematical model of the effect of manufacturing errors, including the microstructure relative height error and relative width error, on the diffraction efficiency of harmonic diffractive optical elements (HDEs) is set up. Starting from the expressions for the phase delay and the diffraction efficiency of HDEs, an expression for the diffraction efficiency of a refractive-diffractive optical element with microstructure height and period width errors introduced in the fabrication process is presented. The effect of these manufacturing errors on the diffraction efficiency of HDEs is then studied, and the change in diffraction efficiency is analyzed for relative microstructure height errors of the same and opposite sign, as well as for relative width errors of the same and opposite sign. An example in the infrared using germanium (Ge) is discussed. The effect of the two kinds of manufacturing errors on the diffraction efficiency and polychromatic integral diffraction efficiency (PIDE) of HDEs is studied for a 3.7-4.3 um mid-infrared and an 8.7-11.5 um far-infrared optical system. The results can be used for controlling manufacturing errors in microstructure height and period width and for HDE processing.
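
    As background (the standard scalar-diffraction treatment is assumed here; the dissertation's exact model may differ in detail), the efficiency of a surface-relief diffractive element in order m is

      \[
      \eta_m(\lambda) = \operatorname{sinc}^2\!\left(m - \frac{\phi(\lambda)}{2\pi}\right),
      \qquad
      \phi(\lambda) = \frac{2\pi\,(1+\varepsilon_h)\,h\,\bigl[n(\lambda)-1\bigr]}{\lambda},
      \]

    where h is the nominal microstructure height, n(λ) the refractive index, and ε_h the relative height error; the error pulls the phase away from its integer design value and lowers η_m, and the polychromatic integral diffraction efficiency (PIDE) is the average of η_m(λ) over the working waveband.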

  15. Modelling and analysis of solar cell efficiency distributions

    Science.gov (United States)

    Wasmer, Sven; Greulich, Johannes

    2017-08-01

    We present an approach to model the distribution of solar cell efficiencies achieved in production lines based on numerical simulations, metamodeling and Monte Carlo simulations. We validate our methodology using the example of an industrially feasible p-type multicrystalline silicon “passivated emitter and rear cell” process. Applying the metamodel, we investigate the impact of each input parameter on the distribution of cell efficiencies in a variance-based sensitivity analysis, identifying the parameters and processes that need to be improved and controlled most accurately. We show that if these could be optimized, the mean cell efficiencies of our examined cell process would increase from 17.62% ± 0.41% to 18.48% ± 0.09%. As the method relies on advanced characterization and simulation techniques, we furthermore introduce a simplification that enhances applicability by only requiring two common measurements of finished cells. The presented approaches can be especially helpful for ramping up production, but can also be applied to enhance established manufacturing.

  16. High efficiency processing for reduced amplitude zones detection in the HRECG signal

    Science.gov (United States)

    Dugarte, N.; Álvarez, A.; Balacco, J.; Mercado, G.; Gonzalez, A.; Dugarte, E.; Olivares, A.

    2016-04-01

    Summary - This article presents part of more detailed research planned for the medium to long term, with the intention of establishing a new philosophy of surface electrocardiogram analysis. This research aims to find indicators of cardiovascular disease at an early stage that may go unnoticed with conventional electrocardiography. This paper reports the development of processing software that combines some existing techniques and incorporates novel methods for the detection of reduced amplitude zones (RAZ) in the high resolution electrocardiographic signal (HRECG). The algorithm consists of three stages: efficient processing for QRS detection, an averaging filter using correlation techniques, and a RAZ detection step. Preliminary results show the efficiency of the system and point to the incorporation of new signal analysis techniques involving 12 leads.

  17. Combined process automation for large-scale EEG analysis.

    Science.gov (United States)

    Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E

    2012-01-01

    Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.

  18. IMPROVING TACONITE PROCESSING PLANT EFFICIENCY BY COMPUTER SIMULATION, Final Report

    Energy Technology Data Exchange (ETDEWEB)

    William M. Bond; Salih Ersayin

    2007-03-30

    This project involved industrial-scale testing of a mineral processing simulator to improve the efficiency of a taconite processing plant, namely the Minorca mine. The Concentrator Modeling Center at the Coleraine Minerals Research Laboratory, University of Minnesota Duluth, enhanced the capabilities of available software, Usim Pac, by developing mathematical models needed for accurate simulation of taconite plants. This project provided funding for this technology to prove itself in the industrial environment. As the first step, data representing existing plant conditions were collected by sampling and sample analysis. Data were then balanced and provided a basis for assessing the efficiency of individual devices and the plant, and also for performing simulations aimed at improving plant efficiency. Performance evaluation served as a guide in developing alternative process strategies for more efficient production. A large number of computer simulations were then performed to quantify the benefits and effects of implementing these alternative schemes. Modification of makeup ball size was selected as the most feasible option for the target performance improvement. This was combined with replacement of existing hydrocyclones with more efficient ones. After plant implementation of these modifications, plant sampling surveys were carried out to validate findings of the simulation-based study. Plant data showed very good agreement with the simulated data, confirming results of simulation. After the implementation of modifications in the plant, several upstream bottlenecks became visible. Despite these bottlenecks limiting full capacity, a concentrator energy improvement of 7% was obtained. Further improvements in energy efficiency are expected in the near future. The success of this project demonstrated the feasibility of a simulation-based approach. Currently, the Center provides simulation-based service to all the iron ore mining companies operating in northern

  19. Efficiency analysis of hydrogen production methods from biomass

    NARCIS (Netherlands)

    Ptasinski, K.J.

    2008-01-01

    Abstract: Hydrogen is considered as a universal energy carrier for the future, and biomass has the potential to become a sustainable source of hydrogen. This article presents an efficiency analysis of hydrogen production processes from a variety of biomass feedstocks by a thermochemical method –

  20. Process Integration Analysis of an Industrial Hydrogen Production Process

    OpenAIRE

    Stolten, Detlef; Grube, Thomas; Tock, Laurence; Maréchal, François; Metzger, Christian; Arpentinier, Philippe

    2010-01-01

    The energy efficiency of an industrial hydrogen production process using steam methane reforming (SMR) combined with the water gas shift reaction (WGS) is analyzed using process integration techniques based on heat cascade calculation and pinch analysis with the aim of identifying potential measures to enhance the process performance. The challenge is to satisfy the high temperature heat demand of the SMR reaction by minimizing the consumption of natural gas to feed the combustion and to expl...

  1. The Efficiency of Halal Processed Food Industry in Malaysia

    Directory of Open Access Journals (Sweden)

    Mohd Ali Mohd Noor

    2016-06-01

    Efficiency is indispensable for an industry to ensure cost reduction and profit maximization. It also helps the industry to be competitive and remain in the market. In 2010, Malaysia aimed to become the world halal hub; the hub was to capture at least five percent of the world halal market with at least 10,000 exporting firms. However, this goal was not met, owing to the small number of efficient firms, which in turn limited the number of exporting firms. Thus, this study aimed to measure the efficiency of the halal processed food industry in Malaysia using Data Envelopment Analysis (DEA). The input variables used were local raw inputs, labour, and the monetary assets of the halal food industry in Malaysia, while the output used was the total sales revenue of the halal industry in Malaysia. The study shows that very few industries are efficient in each category, led by the meat, dairy, cordials and juices, marine products, food crops, and grains industries. Therefore, the government needs to emphasize the industry's efficiency if Malaysia is to be competitive and become the world halal hub in the future.

  2. Efficiency of optical-electronic systems: methods application for the analysis of structural changes in the process of eye grounds diagnosis

    Science.gov (United States)

    Saldan, Yosyp R.; Pavlov, Sergii V.; Vovkotrub, Dina V.; Saldan, Yulia Y.; Vassilenko, Valentina B.; Mazur, Nadia I.; Nikolaichuk, Daria V.; Wójcik, Waldemar; Romaniuk, Ryszard; Suleimenov, Batyrbek; Bainazarov, Ulan

    2017-08-01

    The process of obtaining eye tomograms by means of optical coherence tomography is studied. Stages of idiopathic macular hole formation in the process of eye fundus diagnostics are considered, and the main stages of retinal pathology progression are determined. Fuzzy logic units for obtaining reliable conclusions regarding the diagnostic result are developed. Based on the results of theoretical and practical research, a system and technique for analyzing the state of the retinal macular region of the eye are developed; application of the system, based on a fuzzy logic device, improves the efficiency of complex examination of the eye retina.

  3. Efficient processing of fluorescence images using directional multiscale representations.

    Science.gov (United States)

    Labate, D; Laezza, F; Negi, P; Ozcan, B; Papadakis, M

    2014-01-01

    Recent advances in high-resolution fluorescence microscopy have enabled the systematic study of morphological changes in large populations of cells induced by chemical and genetic perturbations, facilitating the discovery of signaling pathways underlying diseases and the development of new pharmacological treatments. In these studies, though, due to the complexity of the data, quantification and analysis of morphological features are for the most part handled manually, significantly slowing data processing and often limiting the information gained to a descriptive level. Thus, there is an urgent need for developing highly efficient automated analysis and processing tools for fluorescent images. In this paper, we present the application of a method based on the shearlet representation for confocal image analysis of neurons. The shearlet representation is a newly emerged method designed to combine multiscale data analysis with superior directional sensitivity, making this approach particularly effective for the representation of objects defined over a wide range of scales and with highly anisotropic features. Here, we apply the shearlet representation to problems of soma detection of neurons in culture and extraction of geometrical features of neuronal processes in brain tissue, and propose it as a new framework for large-scale fluorescent image analysis of biomedical data.

  4. Doping efficiency analysis of highly phosphorous doped epitaxial/amorphous silicon emitters grown by PECVD for high efficiency silicon solar cells

    Energy Technology Data Exchange (ETDEWEB)

    El-Gohary, H.G.; Sivoththaman, S. [Waterloo Univ., ON (Canada). Dept. of Electrical and Computer Engineering

    2008-08-15

    The efficient doping of hydrogenated amorphous and crystalline silicon thin films is a key factor in the fabrication of silicon solar cells. The most popular method for depositing these films is plasma enhanced chemical vapor deposition (PECVD) because it minimizes defect density and improves doping efficiency. This paper discussed the low-temperature preparation of phosphorus-doped silicon emitters of different structures, ranging from epitaxial to amorphous films. Phosphine (PH{sub 3}) was employed as the doping gas source, with the same gas concentration for both epitaxial and amorphous silicon emitters. The paper presented an analysis of dopant activation by applying a very short rapid thermal annealing process (RTP). A spreading resistance profile (SRP) and SIMS analysis were used to determine the active dopant and the total dopant concentrations, respectively. The paper also provided the results of a structural analysis of both the bulk and the cross-section at the interface, using high-resolution transmission electron microscopy and Raman spectroscopy, for the epitaxial and amorphous films. It was concluded that unity doping efficiency could be achieved in epitaxial layers by applying an optimized temperature profile using a short-time rapid thermal processing technique. The high-quality, one-step epitaxial layers led to layers with both high conductivity and high doping efficiency.

  5. Efficient simulation of press hardening process through integrated structural and CFD analyses

    International Nuclear Information System (INIS)

    Palaniswamy, Hariharasudhan; Mondalek, Pamela; Wronski, Maciek; Roy, Subir

    2013-01-01

    Press hardened steel parts are being increasingly used in automotive structures for their higher strength to meet safety standards while reducing vehicle weight to improve fuel consumption. However, manufacturing sheet metal parts by the press hardening process to achieve desired properties is extremely challenging, as it involves a complex interaction of plastic deformation, metallurgical change, thermal distribution, and fluid flow. Numerical simulation is critical for successful design of the process and for understanding the interaction among the numerous process parameters in order to control the press hardening process and consistently achieve the desired part properties. Until now there has been no integrated commercial software solution that can efficiently model the complete process: forming of the blank, heat transfer between the blank and tool, microstructure evolution in the blank, and heat loss from the tool to the fluid that flows through water channels in the tools. In this study, a numerical solution based on the Altair HyperWorks® product suite, involving RADIOSS®, a non-linear finite element based structural analysis solver, and AcuSolve®, an incompressible fluid flow solver based on the Galerkin Least Squares Finite Element Method, has been utilized to develop an efficient solution for complete press hardening process design and analysis. RADIOSS is used to handle the plastic deformation, heat transfer between the blank and tool, and microstructure evolution in the blank during cooling, while AcuSolve is used to efficiently model heat loss from the tool to the fluid that flows through the water channels in the tools. The approach is demonstrated through some case studies.

  6. Assessing efficiency and effectiveness of Malaysian Islamic banks: A two stage DEA analysis

    Science.gov (United States)

    Kamarudin, Norbaizura; Ismail, Wan Rosmanira; Mohd, Muhammad Azri

    2014-06-01

    Islamic banks in Malaysia are indispensable players in the financial industry given the growing need for a syariah-compliant system. In the banking industry, most recent studies have been concerned only with operational efficiency, and rarely with operational effectiveness. Since the production process of the banking industry can be described as a two-stage process, two-stage Data Envelopment Analysis (DEA) can be applied to measure bank performance. This study was designed to measure the overall performance, in terms of efficiency and effectiveness, of Islamic banks in Malaysia using a two-stage DEA approach. This paper presents the analysis of a DEA model which separates efficiency and effectiveness in order to evaluate the performance of ten selected Islamic banks in Malaysia for the financial year ended 2011. The analysis shows that the average efficiency score is higher than the average effectiveness score; thus Malaysian Islamic banks were more efficient than effective. Furthermore, none of the banks exhibited best practice in both stages, indicating that a bank with better efficiency does not always have better effectiveness at the same time.
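
    The series structure assumed by a two-stage DEA model of this kind can be written compactly as below. This is a generic ratio-form sketch (x = inputs, z = intermediate measures, y = final outputs, subscript o = the bank under evaluation), not necessarily the exact specification used by the authors.

```latex
% Stage 1 (efficiency): inputs x -> intermediates z.
% Stage 2 (effectiveness): intermediates z -> final outputs y.
\[
\theta_o^{(1)} = \max_{v,w}\ \frac{\sum_d w_d z_{do}}{\sum_i v_i x_{io}},
\qquad
\theta_o^{(2)} = \max_{w,u}\ \frac{\sum_r u_r y_{ro}}{\sum_d w_d z_{do}},
\]
\[
\theta_o^{\text{overall}} = \theta_o^{(1)} \cdot \theta_o^{(2)},
\qquad \text{with each ratio constrained to be } \le 1 \text{ for every bank.}
\]
```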

  7. Analyzing the Efficient Execution of In-Store Logistics Processes in Grocery Retailing

    DEFF Research Database (Denmark)

    Reiner, Gerald; Teller, Christop; Kotzab, Herbert

    2013-01-01

    In this article, we examine in-store logistics processes for handling dairy products, from the incoming dock to the shelves of supermarkets and hypermarkets. The efficient execution of the in-store logistics related to such fast-moving, sensitive, and essential items is challenging and crucial for grocery retailers' sales, profits, and image. In our empirical study, we survey in-store logistics processes in 202 grocery supermarkets and hypermarkets belonging to a major retail chain in central Europe. Using a data envelopment analysis (DEA) and simulation, we facilitate process benchmarking. In particular, we identify ways of improving in-store logistics processes by showing the performance impacts of different managerial strategies and tactics. The DEA results indicate different efficiency levels for different store formats; the hybrid store format of the small hypermarket exhibits a comparatively...

  8. Conversion efficiency in the process of copolarized spontaneous four-wave mixing

    International Nuclear Information System (INIS)

    Garay-Palmett, Karina; U'Ren, Alfred B.; Rangel-Rojo, Raul

    2010-01-01

    We study the process of copolarized spontaneous four-wave mixing in single-mode optical fibers, with an emphasis on an analysis of the conversion efficiency. We consider both the monochromatic-pump and pulsed-pump regimes, as well as both the degenerate-pump and nondegenerate-pump configurations. We present analytical expressions for the conversion efficiency, which are given in terms of double integrals. In the case of pulsed pumps we take these expressions to closed analytical form with the help of certain approximations. We present results of numerical simulations, and compare them to values obtained from our analytical expressions, for the conversion efficiency as a function of several key experimental parameters.
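
    As a reminder of what governs the conversion efficiency in four-wave mixing, the textbook low-gain relations for a degenerate pump are sketched below; γ is the fibre nonlinear parameter, P the pump peak power and L the fibre length. These are generic scaling relations, not the paper's closed-form double-integral expressions.

```latex
% Energy conservation and phase matching for degenerate-pump four-wave mixing,
% and the low-gain scaling of the conversion efficiency.
\[
2\omega_p = \omega_s + \omega_i ,
\qquad
\Delta k = k(\omega_s) + k(\omega_i) - 2k(\omega_p) + 2\gamma P \;\approx\; 0 ,
\]
\[
\eta \;\propto\; (\gamma P L)^2 \,\operatorname{sinc}^2\!\left(\frac{\Delta k L}{2}\right).
\]
```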

  9. Process development for high-efficiency silicon solar cells

    Energy Technology Data Exchange (ETDEWEB)

    Gee, J.M.; Basore, P.A.; Buck, M.E.; Ruby, D.S.; Schubert, W.K.; Silva, B.L.; Tingley, J.W.

    1991-12-31

    Fabrication of high-efficiency silicon solar cells in an industrial environment requires a different optimization than in a laboratory environment. Strategies are presented for process development of high-efficiency silicon solar cells, with a goal of simplifying technology transfer into an industrial setting. The strategies emphasize the use of statistical experimental design for process optimization, and the use of baseline processes and cells for process monitoring and quality control. 8 refs.

  10. Pinch analysis for bioethanol production process from lignocellulosic biomass

    International Nuclear Information System (INIS)

    Fujimoto, S.; Yanagida, T.; Nakaiwa, M.; Tatsumi, H.; Minowa, T.

    2011-01-01

    Bioethanol produced from carbon-neutral and renewable biomass resources is attractive for mitigating greenhouse gas emissions from vehicle exhaust. This study investigated energy utilization during bioethanol production from lignocellulose, which avoids competition with food production from corn, and considered the potential mitigation of greenhouse gases. Process design and simulations were performed for bioethanol production using concentrated sulfuric acid. Mass and heat balances were obtained by process simulations, and the heat recovery ratio was determined by pinch analysis. An energy saving of 38% was achieved. However, energy supply and demand were not effectively utilized in the temperature range from 95 to 100 °C. Therefore, a heat pump was used to improve the temperature range of efficient energy supply and demand. Results showed that the energy required for the process could be supplied by heat released during the process. Additionally, the power required was supplied by surplus power generated during the process. Thus, pinch analysis was used to improve the energy efficiency of the process. - Highlights: → Effective energy utilization in bioethanol production was studied using pinch analysis. → It was found that energy was not effectively utilized in the temperature range from 95 to 100 °C. → Use of a heat pump was considered to improve this ineffective utilization. → Remarkable energy savings could thereby be achieved. → Pinch analysis effectively improved the energy efficiency of bioethanol production.
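
    For readers unfamiliar with how pinch analysis yields savings figures like the 38% above, the sketch below implements the classic problem-table (heat cascade) algorithm for a handful of invented streams. The stream data, ΔTmin and the function name are illustrative assumptions, not the bioethanol-plant data.

```python
# Minimal problem-table (heat cascade) sketch for pinch analysis.
def pinch(streams, dt_min=10.0):
    """streams: list of (T_supply, T_target, CP) in degC and kW/degC."""
    shifted = []
    for ts, tt, cp in streams:
        if ts > tt:          # hot stream: shift temperatures down by DTmin/2
            shifted.append((ts - dt_min / 2, tt - dt_min / 2, cp, "hot"))
        else:                # cold stream: shift temperatures up by DTmin/2
            shifted.append((ts + dt_min / 2, tt + dt_min / 2, cp, "cold"))
    bounds = sorted({t for ts, tt, _, _ in shifted for t in (ts, tt)}, reverse=True)
    surplus = []
    for hi, lo in zip(bounds, bounds[1:]):       # shifted temperature intervals
        net_cp = sum(cp if kind == "hot" else -cp
                     for ts, tt, cp, kind in shifted
                     if min(ts, tt) <= lo and max(ts, tt) >= hi)
        surplus.append(net_cp * (hi - lo))
    cascade, heat = [0.0], 0.0
    for q in surplus:                            # cascade the interval surpluses
        heat += q
        cascade.append(heat)
    q_hot_min = -min(cascade)                    # minimum hot utility
    q_cold_min = cascade[-1] + q_hot_min         # minimum cold utility
    pinch_t = bounds[cascade.index(min(cascade))]  # shifted pinch temperature
    return q_hot_min, q_cold_min, pinch_t

streams = [(180, 60, 3.0), (150, 30, 1.5),   # two hot streams (hypothetical)
           (20, 135, 2.0), (80, 140, 4.0)]   # two cold streams (hypothetical)
print(pinch(streams))
```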

  11. Efficiency analysis of a cogeneration and district energy system

    International Nuclear Information System (INIS)

    Rosen, Marc A.; Le, Minh N.; Dincer, Ibrahim

    2005-01-01

    This paper presents an efficiency analysis, accounting for both energy and exergy considerations, of a design for a cogeneration-based district energy system. A case study is considered for the city of Edmonton, Canada, by the utility Edmonton Power. The original concept using central electric chillers, as well as two variations (one considering single-effect and the other double-effect absorption chillers), are examined. The energy- and exergy-based results differ markedly (e.g., overall energy efficiencies are shown to vary for the three configurations considered from 83% to 94%, and exergy efficiencies from 28% to 29%, respectively). For the overall processes, as well as individual subprocesses and selected combinations of subprocesses, the exergy efficiencies are generally found to be more meaningful and indicative of system behaviour than the energy efficiencies.
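
    The wide gap between the two sets of figures follows from the standard definitions, sketched below with T_0 the reference-environment temperature: heat delivered at temperatures close to T_0 carries little exergy, so exergy efficiencies remain near 28-29% even when energy efficiencies reach 83-94%. The notation is generic, not the paper's.

```latex
% Energy vs. exergy efficiency, and the exergy content of a heat transfer Q
% delivered at temperature T relative to an environment at T_0.
\[
\eta = \frac{\text{energy recovered in products}}{\text{energy input}},
\qquad
\psi = \frac{\text{exergy recovered in products}}{\text{exergy input}},
\qquad
Ex_Q = Q\left(1 - \frac{T_0}{T}\right).
\]
```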

  12. Design of sample analysis device for iodine adsorption efficiency test in NPPs

    International Nuclear Information System (INIS)

    Ji Jinnan

    2015-01-01

    In nuclear power plants, the iodine adsorption efficiency test is used to check the iodine adsorption efficiency of the iodine adsorber. The iodine adsorption efficiency can be calculated through analysis of the test sample, and thus it can be determined whether the performance of the adsorber meets the requirements for equipment operation and emissions. Considering the test process and actual demand, a special device for the analysis of this kind of test sample is designed in this paper. Application shows that the device is convenient to operate, highly reliable and accurate in its calculations, improving experimental efficiency and reducing experimental risk. (author)

  13. Economic efficiency analysis of electron accelerator for irradiation processing

    International Nuclear Information System (INIS)

    Shi Huidong; Chen Ronghui

    2003-01-01

    The fixed assets, running cost and economic efficiency are discussed in this paper. For a 10 MeV, 3 kW electron accelerator, the running cost is roughly double that of a cobalt source of 2.22 × 10^15 Bq activity, but the economic efficiency of building an electron accelerator is much higher than that of building a cobalt source.

  14. Study on Parallel Processing for Efficient Flexible Multibody Analysis based on Subsystem Synthesis Method

    Energy Technology Data Exchange (ETDEWEB)

    Han, Jong-Boo; Song, Hajun; Kim, Sung-Soo [Chungnam Nat’l Univ., Daejeon (Korea, Republic of)

    2017-06-15

    Flexible multibody simulations are widely used in the industry to design mechanical systems. In flexible multibody dynamics, deformation coordinates are described either relatively in the body reference frame that is floating in the space or in the inertial reference frame. Moreover, these deformation coordinates are generated based on the discretization of the body according to the finite element approach. Therefore, the formulation of the flexible multibody system always deals with a huge number of degrees of freedom and the numerical solution methods require a substantial amount of computational time. Parallel computational methods are a solution for efficient computation. However, most of the parallel computational methods are focused on the efficient solution of large-sized linear equations. For multibody analysis, we need to develop an efficient formulation that could be suitable for parallel computation. In this paper, we developed a subsystem synthesis method for a flexible multibody system and proposed efficient parallel computational schemes based on the OpenMP API in order to achieve efficient computation. Simulations of a rotating blade system, which consists of three identical blades, were carried out with two different parallel computational schemes. Actual CPU times were measured to investigate the efficiency of the proposed parallel schemes.

  15. Industrial applications using BASF eco-efficiency analysis: perspectives on green engineering principles.

    Science.gov (United States)

    Shonnard, David R; Kicherer, Andreas; Saling, Peter

    2003-12-01

    Life without chemicals would be inconceivable, but the potential risks and impacts to the environment associated with chemical production and chemical products are viewed critically. Eco-efficiency analysis considers the economic and life cycle environmental effects of a product or process, giving these equal weighting. The major elements of the environmental assessment include primary energy use, raw materials utilization, emissions to all media, toxicity, safety risk, and land use. The relevance of each environmental category and also for the economic versus the environmental impacts is evaluated using national emissions and economic data. The eco-efficiency analysis method of BASF is briefly presented, and results from three applications to chemical processes and products are summarized. Through these applications, the eco-efficiency analyses mostly confirm the 12 Principles listed in Anastas and Zimmerman (Environ. Sci. Technol. 2003, 37(5), 94A), with the exception that, in one application, production systems based on bio-based feedstocks were not the most eco-efficient as compared to those based on fossil resources. Over 180 eco-efficiency analyses have been conducted at BASF, and their results have been used to support strategic decision-making, marketing, research and development, and communication with external parties. Eco-efficiency analysis, as one important strategy and success factor in sustainable development, will continue to be a very strong operational tool at BASF.

  16. ADDED VALUE AS EFFICIENCY CRITERION FOR INDUSTRIAL PRODUCTION PROCESS

    Directory of Open Access Journals (Sweden)

    L. M. Korotkevich

    2016-01-01

    Full Text Available A literature analysis has shown that the majority of researchers use classical efficiency criteria for constructing an optimization model of the production process: profit maximization; cost minimization; maximization of commercial product output; minimization of the backlog of product demand; minimization of the total time lost due to production changes. The paper proposes to use an index of added value as an efficiency criterion because it combines the economic and social interests of all the main stakeholders of the business activity: national government, property owners, employees and investors. The following types of added value are considered in the paper: joint-stock, market, monetary, economic and notional (gross, net, real). The paper suggests using an index of real value added as an efficiency criterion. Such an approach makes notional added value comparable, because added value can be increased not only through efficiency improvements in enterprise activity but also through environmental factors – an excess of the rate of export price increases over the rate of import price growth. An analysis of methods for calculating real value added has been made on a country-by-country basis (extrapolation, single and double deflation). The method of double deflation was selected on the basis of this analysis and is calculated according to the Laspeyres, Paasche and Fisher indices. It is concluded that the expressions used do not fully take into account the economic peculiarities of the Republic of Belarus: they are considered inappropriate when product cost is differentiated according to marketing outlets, and they do not take account of differences in the exchange rates of several currencies, which are reflected in the export price of a released product and the import prices of raw materials, supplies and component parts. Taking this into consideration, the expressions for calculating real value added have been refined.
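
    For reference, the index-number formulas named at the end of the abstract, and the double-deflation identity for real value added, take the standard forms below (p = price, q = quantity, subscripts 0/1 = base/current period, GO = gross output, II = intermediate inputs, I = the corresponding deflator). The notation is a generic sketch, not the author's.

```latex
% Laspeyres, Paasche and Fisher price indices, and real value added obtained by
% double deflation (output and intermediate inputs deflated separately).
\[
L = \frac{\sum p_1 q_0}{\sum p_0 q_0}, \qquad
P = \frac{\sum p_1 q_1}{\sum p_0 q_1}, \qquad
F = \sqrt{L \cdot P},
\]
\[
VA^{\text{real}} = \frac{GO^{\text{nominal}}}{I_{GO}} - \frac{II^{\text{nominal}}}{I_{II}} .
\]
```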

  17. Refractories for Industrial Processing. Opportunities for Improved Energy Efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Hemrick, James G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hayden, H. Wayne [Metals Manufacture Process and Controls Technology, Inc., Oak Ridge, TN (United States); Angelini, Peter [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Moore, Robert E. [R.E. Moore Associates, Maricopa, AZ (United States); Headrick, William L. [R.E. Moore Associates, Maricopa, AZ (United States)

    2005-01-01

    Refractories are a class of materials of critical importance to manufacturing industries with high-temperature unit processes. This study describes industrial refractory applications and identifies refractory performance barriers to energy efficiency for processing. The report provides recommendations for R&D pathways leading to improved refractories for energy-efficient manufacturing and processing.

  18. Energy efficient process planning based on numerical simulations

    OpenAIRE

    Neugebauer, Reimund; Hochmuth, C.; Schmidt, G.; Dix, M.

    2011-01-01

    The main goal of energy-efficient manufacturing is to generate products with maximum value-added at minimum energy consumption. To this end, in metal cutting processes, it is necessary to reduce the specific cutting energy while, at the same time, precision requirements have to be ensured. Precision is critical in metal cutting processes because they often constitute the final stages of metalworking chains. This paper presents a method for the planning of energy-efficient machining processes ...

  19. Simple processing of high efficiency silicon solar cells

    International Nuclear Information System (INIS)

    Hamammu, I.M.; Ibrahim, K.

    2006-01-01

    Cost-effective photovoltaic devices have been an area of research since the development of the first solar cells, as cost is the major factor in their usage. Silicon solar cells have the biggest share of the photovoltaic market, though silicon is not the optimal material for solar cells. This work introduces a simplified approach for high-efficiency silicon solar cell processing, minimizing the processing steps and thereby reducing cost. The suggested procedure might also allow the use of lower-quality materials compared to those used today. The main features of the present work are a simplified diffusion process, edge shunt isolation, and acidic texturing instead of the standard alkaline processing. Solar cells of 17% efficiency have been produced using this procedure. Investigations into the possibility of improving the efficiency and using lower-quality material are still underway.

  20. Assessing eco-efficiency: A metafrontier directional distance function approach using life cycle analysis

    International Nuclear Information System (INIS)

    Beltrán-Esteve, Mercedes; Reig-Martínez, Ernest; Estruch-Guitart, Vicent

    2017-01-01

    Sustainability analysis requires a joint assessment of environmental, social and economic aspects of production processes. Here we propose the use of Life Cycle Analysis (LCA), a metafrontier (MF) directional distance function (DDF) approach, and Data Envelopment Analysis (DEA), to assess technological and managerial differences in eco-efficiency between production systems. We use LCA to compute six environmental and health impacts associated with the production processes of nearly 200 Spanish citrus farms belonging to organic and conventional farming systems. DEA is then employed to obtain joint economic-environmental farm scores that we refer to as eco-efficiency. DDF allows us to determine farms' global eco-efficiency scores, as well as eco-efficiency scores with respect to specific environmental impacts. Furthermore, the use of an MF helps us to disentangle technological and managerial eco-inefficiencies by comparing the eco-efficiency of both farming systems with regard to a common benchmark. Our core results suggest that the shift from conventional to organic farming technology would allow a potential reduction in environmental impacts of 80% without resulting in any decline in economic performance. In contrast, as regards farmers' managerial capacities, both systems display quite similar mean scores.
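
    The directional distance function at the core of this approach can be sketched generically as below, with x inputs, y the economic output, b the environmental pressures, g = (g_y, g_b) the direction of improvement, and T the reference technology (group frontier or metafrontier). The additive split in the second line is one common way of separating technological from managerial inefficiency, not necessarily the authors' exact formulation.

```latex
% Directional distance function and a common metafrontier decomposition.
\[
\vec{D}(x, y, b; g_y, g_b) \;=\; \max\bigl\{\beta : (x,\; y + \beta g_y,\; b - \beta g_b) \in T \bigr\},
\]
\[
\vec{D}^{\,\text{meta}} \;=\; \vec{D}^{\,\text{group}} + \text{TGI},
\qquad \text{TGI} \ge 0 \ \text{(technology gap inefficiency).}
\]
```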

  1. Assessing eco-efficiency: A metafrontier directional distance function approach using life cycle analysis

    Energy Technology Data Exchange (ETDEWEB)

    Beltrán-Esteve, Mercedes, E-mail: mercedes.beltran@uv.es [Department of Applied Economics II, University of Valencia (Spain); Reig-Martínez, Ernest [Department of Applied Economics II, University of Valencia, Ivie (Spain); Estruch-Guitart, Vicent [Department of Economy and Social Sciences, Polytechnic University of Valencia (Spain)

    2017-03-15

    Sustainability analysis requires a joint assessment of environmental, social and economic aspects of production processes. Here we propose the use of Life Cycle Analysis (LCA), a metafrontier (MF) directional distance function (DDF) approach, and Data Envelopment Analysis (DEA), to assess technological and managerial differences in eco-efficiency between production systems. We use LCA to compute six environmental and health impacts associated with the production processes of nearly 200 Spanish citrus farms belonging to organic and conventional farming systems. DEA is then employed to obtain joint economic-environmental farm scores that we refer to as eco-efficiency. DDF allows us to determine farms' global eco-efficiency scores, as well as eco-efficiency scores with respect to specific environmental impacts. Furthermore, the use of an MF helps us to disentangle technological and managerial eco-inefficiencies by comparing the eco-efficiency of both farming systems with regard to a common benchmark. Our core results suggest that the shift from conventional to organic farming technology would allow a potential reduction in environmental impacts of 80% without resulting in any decline in economic performance. In contrast, as regards farmers' managerial capacities, both systems display quite similar mean scores.

  2. EFFICIENT LIDAR POINT CLOUD DATA MANAGING AND PROCESSING IN A HADOOP-BASED DISTRIBUTED FRAMEWORK

    Directory of Open Access Journals (Sweden)

    C. Wang

    2017-10-01

    Full Text Available Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering and other fields. However, it is challenging to efficiently store, query and analyze the high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, which takes advantage of Hadoop's storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to run the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experiment results show that the proposed framework can efficiently manage and process big LiDAR data.

  3. Efficient LIDAR Point Cloud Data Managing and Processing in a Hadoop-Based Distributed Framework

    Science.gov (United States)

    Wang, C.; Hu, F.; Sha, D.; Han, X.

    2017-10-01

    Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering and other fields. However, it is challenging to efficiently store, query and analyze the high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, which takes advantage of Hadoop's storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to run the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experiment results show that the proposed framework can efficiently manage and process big LiDAR data.
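
    A Hadoop Streaming job is a low-overhead way to picture MapReduce over point clouds; the sketch below bins ASCII LiDAR records (x y z per line) into a 2D grid and counts points per cell. The cell size, record layout and script names are assumptions for illustration only; the framework described above integrates PCL with HDFS and MapReduce rather than streaming plain text.

```python
#!/usr/bin/env python3
# mapper.py -- emit "cell_key <tab> 1" for each LiDAR point (x y z per line).
import sys

CELL = 10.0  # assumed grid cell size, in the point cloud's coordinate units

for line in sys.stdin:
    try:
        x, y, z = map(float, line.split()[:3])
    except ValueError:
        continue                                  # skip malformed records
    print(f"{int(x // CELL)}_{int(y // CELL)}\t1")
```

```python
#!/usr/bin/env python3
# reducer.py -- sum the counts per grid cell (streaming input arrives key-sorted).
import sys

current, count = None, 0
for line in sys.stdin:
    key, value = line.rstrip("\n").split("\t")
    if key != current:
        if current is not None:
            print(f"{current}\t{count}")
        current, count = key, 0
    count += int(value)
if current is not None:
    print(f"{current}\t{count}")
```

    Submitted with the usual streaming options (-input, -output, -mapper, -reducer, plus shipping the two scripts to the cluster), this produces a per-cell point-density table that downstream PCL-based analyses could consume.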

  4. A test of processing efficiency theory in a team sport context.

    Science.gov (United States)

    Smith, N C; Bellamy, M; Collins, D J; Newell, D

    2001-05-01

    In this study, we tested some key postulates of Eysenck and Calvo's processing efficiency theory in a team sport. The participants were 12 elite male volleyball players who were followed throughout the course of a competitive season. Self-report measures of pre-match and in-game cognitive anxiety and mental effort were collected in groups of players high and low in dispositional anxiety. Player performance was determined from the statistical analysis of match-play. Sets were classified according to the point spread separating the two teams into one of three levels of criticality. Game momentum was also analysed to determine its influence on in-game state anxiety. Significant differences in in-game cognitive anxiety were apparent between high and low trait anxiety groups. An interaction between anxiety grouping and momentum condition was also evident in cognitive anxiety. Differences in set criticality were reflected in significant elevations in mental effort, an effect more pronounced in dispositionally high anxious performers. Consistent with the predictions of processing efficiency theory, mental effort ratings were higher in high trait-anxious players in settings where their performance was equivalent to that of low trait-anxious performers. The usefulness of processing efficiency theory as an explanatory framework in sport anxiety research is discussed in the light of these findings.

  5. Analytic Hierarchy Process-Based Analysis to Determine the Barriers to Implementing a Material Efficiency Strategy: Electrical and Electronics’ Companies in the Malaysian Context

    Directory of Open Access Journals (Sweden)

    Fu Haw Ho

    2016-10-01

    Full Text Available Material efficiency is one of the most important strategies for helping manufacturing companies achieve sustainability in their production activities. However, there are many barriers to the implementation of material efficiency strategies in the manufacturing processes and overall business operations. The aim of this study is to identify and evaluate the barriers faced by Electrical and Electronics (E&E manufacturing companies in Malaysia in implementing material efficiency strategies. A mixed-mode research method was employed to collect data from these companies. Semi-structured interviews were used to identify the barriers faced by the Malaysian Electrical and Electronics (E&E industry, while an Analytic Hierarchy Process (AHP survey was utilized to determine the importance of each barrier. Seven companies participated in the semi-structured interviews, and 18 companies took part in the AHP survey. Nine barriers were generated from analysis of the interviews, and were then ranked by priority using the AHP method. These important findings could be used as a guide for E&E companies in managing or overcoming barriers during the implementation of material efficiency strategies and other sustainable manufacturing activities.
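
    Computationally, the AHP step described here reduces to extracting a priority vector from each pairwise comparison matrix and checking its consistency. The sketch below shows the usual principal-eigenvector calculation; the 4x4 matrix and the function name are invented for illustration and are not the survey data from the 18 companies.

```python
# Minimal AHP sketch: priority weights and consistency ratio from a pairwise
# comparison matrix (Saaty's 1-9 scale); the matrix below is made up.
import numpy as np

RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}  # random indices

def ahp_weights(A):
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                  # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                 # priority vector
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)         # consistency index
    return w, ci / RI[n]                         # CR < 0.1 is usually deemed acceptable

A = np.array([[1,   3,   5,   7],                # pairwise comparisons of four
              [1/3, 1,   3,   5],                # hypothetical barriers
              [1/5, 1/3, 1,   3],
              [1/7, 1/5, 1/3, 1]], dtype=float)
weights, cr = ahp_weights(A)
print(np.round(weights, 3), round(cr, 3))
```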

  6. Barriers' and policies' analysis of China's building energy efficiency

    International Nuclear Information System (INIS)

    Zhang, Yurong; Wang, Yuanfeng

    2013-01-01

    With the rapid economic growth and the improvement of people's living standards, China's building energy consumption has kept rising during the past 15 years. Through the efforts of the Chinese government and society, China's building energy efficiency has made certain achievements. However, the implementation of building energy efficiency in China is still far from its potential. Based on the analysis of the existing policies implemented in China, the article concludes that the most essential and effective way to promote building energy efficiency is the government's involvement, together with economic and financial incentives. In addition, the main barriers in the process of promoting building energy efficiency in China are identified in six aspects. It has been found that the legal system and administrative issues constitute the major barriers, and that the lack of financial incentives and the mismatch of market mechanisms also hamper the promotion of building energy efficiency. Finally, in view of the analysis of existing policies and barriers, three corresponding policy proposals are presented. -- Highlights: •The existing policies implemented in China are presented and analysed from three aspects. •The government's involvement is the most essential and effective way to promote building energy efficiency. •Six aspects of barriers to promoting building energy efficiency in China are identified. •The legal system and administrative issues constitute the major barriers. •Three policy proposals to further promote building energy efficiency in China are proposed.

  7. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  8. Auditory processing efficiency deficits in children with developmental language impairments

    Science.gov (United States)

    Hartley, Douglas E. H.; Moore, David R.

    2002-12-01

    The “temporal processing hypothesis” suggests that individuals with specific language impairments (SLIs) and dyslexia have severe deficits in processing rapidly presented or brief sensory information, both within the auditory and visual domains. This hypothesis has been supported through evidence that language-impaired individuals have excess auditory backward masking. This paper presents an analysis of masking results from several studies in terms of a model of temporal resolution. Results from this modeling suggest that the masking results can be better explained by an “auditory efficiency” hypothesis. If impaired or immature listeners have a normal temporal window, but require a higher signal-to-noise level (poor processing efficiency), this hypothesis predicts the observed small deficits in the simultaneous masking task, and the much larger deficits in backward and forward masking tasks amongst those listeners. The difference in performance on these masking tasks is predictable from the compressive nonlinearity of the basilar membrane. The model also correctly predicts that backward masking (i) is more prone to training effects, (ii) has greater inter- and intrasubject variability, and (iii) increases less with masker level than do other masking tasks. These findings provide a new perspective on the mechanisms underlying communication disorders and auditory masking.

  9. Energy efficiency of China's industry sector: An adjusted network DEA (data envelopment analysis)-based decomposition analysis

    International Nuclear Information System (INIS)

    Liu, Yingnan; Wang, Ke

    2015-01-01

    The process of energy conservation and emission reduction in China requires the specific and accurate evaluation of the energy efficiency of the industry sector because this sector accounts for 70 percent of China's total energy consumption. Previous studies have used a “black box” DEA (data envelopment analysis) model to obtain the energy efficiency without considering the inner structure of the industry sector. However, differences in the properties of energy utilization (final consumption or intermediate conversion) in different industry departments may lead to bias in energy efficiency measures under such “black box” evaluation structures. Using the network DEA model and efficiency decomposition technique, this study proposes an adjusted energy efficiency evaluation model that can characterize the inner structure and associated energy utilization properties of the industry sector so as to avoid evaluation bias. By separating the energy-producing department and energy-consuming department, this adjusted evaluation model was then applied to evaluate the energy efficiency of China's provincial industry sector. - Highlights: • An adjusted network DEA (data envelopment analysis) model for energy efficiency evaluation is proposed. • The inner structure of industry sector is taken into account for energy efficiency evaluation. • Energy final consumption and energy intermediate conversion processes are separately modeled. • China's provincial industry energy efficiency is measured through the adjusted model.

  10. Novel patch modelling method for efficient simulation and prediction uncertainty analysis of multi-scale groundwater flow and transport processes

    Science.gov (United States)

    Sreekanth, J.; Moore, Catherine

    2018-04-01

    The application of global sensitivity and uncertainty analysis techniques to groundwater models of deep sedimentary basins are typically challenged by large computational burdens combined with associated numerical stability issues. The highly parameterized approaches required for exploring the predictive uncertainty associated with the heterogeneous hydraulic characteristics of multiple aquifers and aquitards in these sedimentary basins exacerbate these issues. A novel Patch Modelling Methodology is proposed for improving the computational feasibility of stochastic modelling analysis of large-scale and complex groundwater models. The method incorporates a nested groundwater modelling framework that enables efficient simulation of groundwater flow and transport across multiple spatial and temporal scales. The method also allows different processes to be simulated within different model scales. Existing nested model methodologies are extended by employing 'joining predictions' for extrapolating prediction-salient information from one model scale to the next. This establishes a feedback mechanism supporting the transfer of information from child models to parent models as well as parent models to child models in a computationally efficient manner. This feedback mechanism is simple and flexible and ensures that while the salient small scale features influencing larger scale prediction are transferred back to the larger scale, this does not require the live coupling of models. This method allows the modelling of multiple groundwater flow and transport processes using separate groundwater models that are built for the appropriate spatial and temporal scales, within a stochastic framework, while also removing the computational burden associated with live model coupling. The utility of the method is demonstrated by application to an actual large scale aquifer injection scheme in Australia.

  11. Efficient separations & processing crosscutting program

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-08-01

    The Efficient Separations and Processing Crosscutting Program (ESP) was created in 1991 to identify, develop, and perfect chemical and physical separations technologies and chemical processes which treat wastes and address environmental problems throughout the DOE complex. The ESP funds several multiyear tasks that address high-priority waste remediation problems involving high-level, low-level, transuranic, hazardous, and mixed (radioactive and hazardous) wastes. The ESP supports applied research and development (R & D) leading to the demonstration or use of these separations technologies by other organizations within the Department of Energy (DOE), Office of Environmental Management.

  12. Energy efficiency analysis method based on fuzzy DEA cross-model for ethylene production systems in chemical industry

    International Nuclear Information System (INIS)

    Han, Yongming; Geng, Zhiqiang; Zhu, Qunxiong; Qu, Yixin

    2015-01-01

    DEA (data envelopment analysis) has been widely used for the efficiency analysis of industrial production processes. However, the conventional DEA model has difficulty distinguishing the pros and cons of multiple DMUs (decision-making units). The DEACM (DEA cross-model) can distinguish the pros and cons of the effective DMUs, but it is unable to take the effect of uncertain data into account. This paper proposes an efficiency analysis method based on FDEACM (fuzzy DEA cross-model) with fuzzy data. The proposed method has better objectivity and resolving power for decision-making. First we obtain the minimum, median and maximum values of the multi-criteria ethylene energy consumption data by data fuzzification. On the basis of the multi-criteria fuzzy data, the benchmark of effective production situations and the improvement directions for the ineffective ethylene plants under different production data configurations are obtained by the FDEACM. The experimental results show that the proposed method can improve ethylene production conditions and guide the efficiency of energy utilization during the ethylene production process. - Highlights: • This paper proposes an efficiency analysis method based on FDEACM (fuzzy DEA cross-model) with data fuzzification. • The proposed method is more efficient and accurate than other methods. • We obtain an energy efficiency analysis framework and process based on FDEACM in the ethylene production industry. • The proposed method is valid and efficient in the improvement of energy efficiency in ethylene plants.
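
    The cross-model idea that FDEACM extends can be summarised as follows: each DMU k first solves its own CCR multiplier problem, its optimal weights (u_k, v_k) are then used to score every other DMU, and the scores are averaged; in the fuzzy variant the inputs x and outputs y enter as (minimum, median, maximum) triples. The notation below is a generic sketch, not the paper's exact model.

```latex
% Cross-efficiency of DMU j evaluated with DMU k's optimal CCR multipliers,
% and the averaged cross-efficiency score used for ranking.
\[
E_{kj} \;=\; \frac{\sum_r u_{rk}\, y_{rj}}{\sum_i v_{ik}\, x_{ij}},
\qquad
\bar{E}_j \;=\; \frac{1}{n}\sum_{k=1}^{n} E_{kj}.
\]
```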

  13. Event-related potential evidence for the processing efficiency theory.

    Science.gov (United States)

    Murray, N P; Janelle, C M

    2007-01-15

    The purpose of this study was to examine the central tenets of the processing efficiency theory using psychophysiological measures of attention and effort. Twenty-eight participants were divided equally into either a high or low trait anxiety group. They were then required to perform a simulated driving task while responding to one of four target light-emitting diodes. Cortical activity and dual task performance were recorded under two conditions -- baseline and competition -- with cognitive anxiety being elevated in the competitive session by an instructional set. Although driving speed was similar across sessions, a reduction in P3 amplitude to cue onset in the light detection task occurred for both groups during the competitive session, suggesting a reduction in processing efficiency as participants became more state anxious. Our findings provide more comprehensive and mechanistic evidence for processing efficiency theory, and confirm that increases in cognitive anxiety can result in a reduction of processing efficiency with little change in performance effectiveness.

  14. Technology’s present situation and the development prospects of energy efficiency monitoring as well as performance testing & analysis for process flow compressors

    Science.gov (United States)

    Li, L.; Zhao, Y.; Wang, L.; Yang, Q.; Liu, G.; Tang, B.; Xiao, J.

    2017-08-01

    In this paper, the background of performance testing of in-service process flow compressors installed in the user's field is introduced, the main technical barriers encountered in field testing are summarized, and the factors that cause the real efficiencies of most process flow compressors to be lower than those guaranteed by the manufacturer are analysed. The authors investigated the present operational situation of process flow compressors in China and found that the low-efficiency operation of flow compressors arises because the compressed gas is generally forced to flow back into the inlet pipe to adapt to variations in the process parameters; for example, the anti-surge valve of a centrifugal compressor is often kept open. To improve the operating efficiency of process compressors, energy efficiency monitoring technology is reviewed and some suggestions are proposed in the paper, which form the basis of research on energy efficiency evaluation and/or labelling of process compressors.

  15. Latent human error analysis and efficient improvement strategies by fuzzy TOPSIS in aviation maintenance tasks.

    Science.gov (United States)

    Chiu, Ming-Chuan; Hsieh, Min-Chih

    2016-05-01

    The purposes of this study were to develop a latent human error analysis process, to explore the factors of latent human error in aviation maintenance tasks, and to provide an efficient improvement strategy for addressing those errors. First, we used HFACS and RCA to define the error factors related to aviation maintenance tasks. Fuzzy TOPSIS with four criteria was applied to evaluate the error factors. Results show that 1) adverse physiological states, 2) physical/mental limitations, and 3) coordination, communication, and planning are the factors related to airline maintenance tasks that could be addressed easily and efficiently. This research establishes a new analytic process for investigating latent human error and provides a strategy for analyzing human error using fuzzy TOPSIS. Our analysis process complements shortages in existing methodologies by incorporating improvement efficiency, and it enhances the depth and broadness of human error analysis methodology. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
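
    To make the ranking step concrete, the sketch below implements crisp TOPSIS with NumPy; fuzzy TOPSIS replaces the crisp scores with fuzzy numbers and fuzzy distance measures, but the normalise-weight-closeness logic is the same. The decision matrix, weights and criterion directions are invented, not the study's expert ratings.

```python
# Crisp TOPSIS sketch: rank alternatives by closeness to the ideal solution.
import numpy as np

def topsis(X, w, benefit):
    """X: alternatives x criteria, w: weights, benefit: True where larger is better."""
    V = (X / np.linalg.norm(X, axis=0)) * w          # weighted normalised matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)                   # closeness coefficient

X = np.array([[7.0, 5.0, 8.0, 6.0],                  # four hypothetical error factors
              [6.0, 7.0, 5.0, 7.0],                  # scored on four criteria
              [8.0, 6.0, 6.0, 5.0],
              [5.0, 8.0, 7.0, 8.0]])
w = np.array([0.3, 0.3, 0.2, 0.2])
benefit = np.array([True, True, False, True])        # third criterion is a cost
print(np.round(topsis(X, w, benefit), 3))            # higher = address first
```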

  16. Efficiency of European Dairy Processing Firms

    NARCIS (Netherlands)

    Soboh, R.A.M.E.; Oude Lansink, A.G.J.M.; Dijk, van G.

    2014-01-01

    This paper compares the technical efficiency and production frontier of dairy processing cooperatives and investor owned firms in six major dairy producing European countries. Two parametric production frontiers are estimated, i.e. for cooperatives and investor owned firms separately, which are

  17. Exergy analysis of the LFC process

    International Nuclear Information System (INIS)

    Li, Qingsong; Lin, Yuankui

    2016-01-01

    Highlights: • Mengdong lignite was upgraded by the liquids from coal (LFC) process at laboratory scale. • True boiling point distillation of tar was performed. • Based on experimental data, the LFC process was simulated in Aspen Plus. • Amounts of exergy destruction and efficiencies of blocks were calculated. • Potential measures for improving the LFC process are suggested. - Abstract: Liquid from coal (LFC) is a pyrolysis technology for upgrading lignite. LFC is close to viability as a large-scale commercial technology and is strongly promoted by the Chinese government. This paper presents an exergy analysis of the LFC process producing semicoke and tar, simulated in Aspen Plus. The simulation included the drying unit, pyrolysis unit, tar recovery unit and combustion unit. To obtain the data required for the simulation, Mengdong lignite was upgraded using a laboratory-scale experimental facility based on LFC technology. True boiling point distillation of tar was performed. Based on thermodynamic data obtained from the simulation, chemical exergy and physical exergy were determined for process streams and exergy destruction was calculated. The exergy budget of the LFC process is presented as a Grassmann flow diagram. The overall exergy efficiency was 76.81%, with the combustion unit causing the highest exergy destruction. The study found that overall exergy efficiency can be increased by reducing moisture in lignite and making full use of physical exergy of pyrolysates. A feasible method for making full use of physical exergy of semicoke was suggested.

  18. An efficiency improvement in warehouse operation using simulation analysis

    Science.gov (United States)

    Samattapapong, N.

    2017-11-01

    In general, industry requires an efficient system for warehouse operation. There are many important factors that must be considered when designing an efficient warehouse system. The most important is an effective warehouse operation system that can help transfer raw material, reduce costs and support transportation. Given these factors, the researchers are interested in studying work systems and warehouse distribution. We started by collecting the important data for storage, such as information on products, information on size and location, information on data collection and information on production, and used all this information to build a simulation model in the Flexsim® simulation software. The simulation analysis found that the conveyor belt was a bottleneck in the warehouse operation. Therefore, many scenarios to improve that problem were generated and tested through the simulation analysis process. The results showed that the average queuing time was reduced from 89.8% to 48.7% and the ability to transport the product increased from 10.2% to 50.9%. Thus, it can be stated that this is the best method for increasing efficiency in the warehouse operation.
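
    The bottleneck reported here is queueing behaviour, and even a dependency-free discrete-event sketch reproduces the qualitative effect of adding conveyor capacity. The arrival and service rates, job count and function name below are invented, not parameters of the Flexsim model.

```python
# Single-server queue sketch: the conveyor as a server with random arrivals.
import random

def simulate(arrival_rate, service_rate, n_jobs=10_000, seed=1):
    random.seed(seed)
    t_arrive, server_free, total_wait, busy = 0.0, 0.0, 0.0, 0.0
    for _ in range(n_jobs):
        t_arrive += random.expovariate(arrival_rate)   # next pallet arrives
        start = max(t_arrive, server_free)             # wait if the conveyor is busy
        service = random.expovariate(service_rate)
        total_wait += start - t_arrive
        busy += service
        server_free = start + service
    return total_wait / n_jobs, busy / server_free     # mean wait, utilisation

print(simulate(arrival_rate=0.8, service_rate=1.0))    # heavily loaded conveyor
print(simulate(arrival_rate=0.8, service_rate=1.6))    # after a capacity upgrade
```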

  19. Solar fuel processing efficiency for ceria redox cycling using alternative oxygen partial pressure reduction methods

    International Nuclear Information System (INIS)

    Lin, Meng; Haussener, Sophia

    2015-01-01

    Solar-driven non-stoichiometric thermochemical redox cycling of ceria for the conversion of solar energy into fuels shows promise in achieving high solar-to-fuel efficiency. This efficiency is significantly affected by the operating conditions, e.g. redox temperatures, reduction and oxidation pressures, solar irradiation concentration, or heat recovery effectiveness. We present a thermodynamic analysis of five redox cycle designs to investigate the effects of working conditions on the fuel production. We focused on the influence of approaches to reduce the partial pressure of oxygen in the reduction step, namely by mechanical approaches (sweep gassing or vacuum pumping), chemical approaches (chemical scavenger), and combinations thereof. The results indicated that the sweep gas schemes work more efficient at non-isothermal than isothermal conditions, and efficient gas phase heat recovery and sweep gas recycling was important to ensure efficient fuel processing. The vacuum pump scheme achieved best efficiencies at isothermal conditions, and at non-isothermal conditions heat recovery was less essential. The use of oxygen scavengers combined with sweep gas and vacuum pump schemes further increased the system efficiency. The present work can be used to predict the performance of solar-driven non-stoichiometric redox cycles and further offers quantifiable guidelines for system design and operation. - Highlights: • A thermodynamic analysis was conducted for ceria-based thermochemical cycles. • Five novel cycle designs and various operating conditions were proposed and investigated. • Pressure reduction method affects optimal operating conditions for maximized efficiency. • Chemical oxygen scavenger proves to be promising in further increasing efficiency. • Formulation of quantifiable design guidelines for economical competitive solar fuel processing

  20. Efficient energy conversion in the pulp and paper industry: application to a sulfite wood pulping process

    Energy Technology Data Exchange (ETDEWEB)

    Marechal, F.

    2007-07-01

    This report summarizes the actions performed in 2006 and those planned for 2007 within the framework of the project Efficient Energy Conversion in the Pulp and Paper Industry. In addition to the data reconciliation models of the steam and condensate networks and of the process of Borregaard Schweiz AG, process models have been developed with the goal of defining the heat requirements of the process. Combining utility-system data reconciliation with the process models considerably reduces the need for detailed process modelling and for on-site data collection and measurement. A systematic definition of the hot and cold streams in the process has been developed in order to compute the minimum energy requirement of the process. The process requirements have been defined using the dual representation concept, in which the energy requirements of the process unit operations are systematically analysed both from their thermodynamic requirement and from the way they are satisfied by the technology that implements the operation. Describing the same energy requirement as realised at different temperatures allows, on the one hand, the exergy efficiency of the heat transfer system in each process unit operation to be defined and, on the other, possible energy savings by heat exchange in the system to be identified. The analysis has been completed by defining the possible energy recovery from waste streams. The minimum energy requirement of the process has been computed using the different requirement representations, and the analysis of the energy savings opportunities is now under preparation. This next step will first address the definition of the utility system integration and the systematic analysis of the energy savings opportunities, followed by the techno-economic evaluation of the most profitable energy savings options in the process. National and international collaborations also constitute an important part of this project. The project is done in close

  1. Comparative analysis of energy efficiency in water users associations

    Energy Technology Data Exchange (ETDEWEB)

    Abadia, R.; Rocamora, M. C.; Corcoles, J. I.; Ruiz-Canales, A.; Martinez-Romero, A.; Moreno, M. A.

    2010-07-01

    The government of Spain has developed an energy strategy that includes a campaign of energy audits in water users associations (WUAs) in order to improve energy efficiency in irrigation. A guideline for energy audits has been developed, standardizing the audit process in WUAs. This guideline has been implemented in 22 WUAs in the Castilla-La Mancha, Valencia, and Murcia Regions. In this paper, an analysis of the indicators proposed in the guideline is performed, and the indicators that best represent the energy efficiency of WUAs are identified. Also, the suitability of the proposed indicators and classifications under different conditions is discussed. In addition, a cluster analysis is performed on the WUAs to classify them according to their energetic aspects. Results show that the indicators global energy efficiency (GEE) and active energy consumed per hectare (EacSr) are not adequate for analysing the evolution of energy consumption in a WUA. The most representative energy indicators are those expressing ratios between energy consumption and the water volume supplied to the users, such as active energy consumed per volume unit (EacVs) and energy cost per volume unit (CENVs). It is concluded that, using the current methodology for calculating the supply energy efficiency indicator (SEE), GEE is not an adequate indicator for the energy classification of WUAs, and also that the results of the energy analysis must be used to propose measures for energy conservation and energy cost reduction. (Author) 14 refs.

  2. Cost-efficient enactment of stream processing topologies

    Directory of Open Access Journals (Sweden)

    Christoph Hochreiner

    2017-12-01

    Full Text Available The continuous increase of unbounded streaming data poses several challenges to established data stream processing engines. One of the most important challenges is the cost-efficient enactment of stream processing topologies under changing data volumes. These changing volumes impose varying loads on stream processing systems, whose resource provisioning needs to be continuously updated at runtime. First approaches already allow for resource provisioning at the level of virtual machines (VMs), but this only allows for coarse resource provisioning strategies. Based on current advances and benefits of containerized software systems, we have designed a cost-efficient resource provisioning approach and integrated it into the runtime of the Vienna ecosystem for elastic stream processing. Our resource provisioning approach aims to maximize the resource usage of VMs obtained from cloud providers. This strategy releases processing capabilities only at the end of a VM's minimal leasing duration, instead of releasing them eagerly as soon as possible, as is the case for threshold-based approaches. This strategy allows us to improve service level agreement compliance by up to 25% and to reduce operational cost by up to 36%.

  3. CPAS Preflight Drop Test Analysis Process

    Science.gov (United States)

    Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.

    2015-01-01

    Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.

  4. Oxygen blast furnace and combined cycle (OBF-CC) - an efficient iron-making and power generation process

    International Nuclear Information System (INIS)

    Jianwei, Y.; Guolong, S.; Cunjiang, K.; Tianjun, Y.

    2003-01-01

    A new iron-making and power generation process, the oxygen blast furnace and combined cycle (OBF-CC), is presented. To support this proposal, the features of the oxygen blast furnace and of the integrated coal gasification combined cycle (IGCC) are summarized. The relation between the blasting parameters and the output gas quantity, as well as its calorific value, is calculated based on mass and energy balances. Analysis and calculation indicate that the OBF-CC will be an efficient iron-making and power generation process with higher energy efficiency and less pollution.

  5. Evaluating the efficiency of municipalities in collecting and processing municipal solid waste: A shared input DEA-model

    International Nuclear Information System (INIS)

    Rogge, Nicky; De Jaeger, Simon

    2012-01-01

    Highlights: ► Complexity in local waste management calls for more in-depth efficiency analysis. ► Shared-input Data Envelopment Analysis can provide a solution. ► Considerable room for the Flemish municipalities to improve their cost efficiency. - Abstract: This paper proposes an adjusted “shared-input” version of the popular efficiency measurement technique Data Envelopment Analysis (DEA) that enables evaluating municipal waste collection and processing performance in settings in which one input (waste costs) is shared among the treatment efforts of multiple municipal solid waste fractions. The main advantage of this version of DEA is that it provides not only an estimate of each municipality's overall cost efficiency but also estimates of the municipalities' cost efficiency in the treatment of the different fractions of municipal solid waste (MSW). To illustrate the practical usefulness of the shared-input DEA model, we apply the model to data on 293 municipalities in Flanders, Belgium, for the year 2008.
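
    Since several records in this list rely on DEA, a minimal sketch of the plain input-oriented CCR envelopment model may help make the technique concrete. It is the standard building block only, not the shared-input extension proposed above, and the toy data (waste cost and staff as inputs, tonnes treated as output) are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_input_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o. X: (m, n) inputs, Y: (s, n) outputs; columns are DMUs."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                   # minimise theta
    A_in = np.hstack([-X[:, [o]], X])            # sum_j lam_j x_ij <= theta * x_io
    A_out = np.hstack([np.zeros((s, 1)), -Y])    # sum_j lam_j y_rj >= y_ro
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[:, o]]))
    return res.x[0]

X = np.array([[100.0, 80.0, 120.0],              # input 1: waste cost
              [10.0, 12.0, 9.0]])                # input 2: staff
Y = np.array([[500.0, 450.0, 520.0]])            # output: tonnes of MSW treated
print([round(ccr_input_efficiency(X, Y, o), 3) for o in range(3)])
```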

  6. Eco-Efficient Process Improvement at the Early Development Stage: Identifying Environmental and Economic Process Hotspots for Synergetic Improvement Potential.

    Science.gov (United States)

    Piccinno, Fabiano; Hischier, Roland; Seeger, Stefan; Som, Claudia

    2018-05-15

    We present here a new eco-efficiency process-improvement method to highlight combined environmental and cost hotspots of the production process of a new material at a very early development stage. Production-specific and scaled-up results for life cycle assessment (LCA) and production costs are combined in a new analysis to identify synergetic improvement potentials and trade-offs, setting goals for the eco-design of new processes. The identified hotspots and bottlenecks will help users to focus on the relevant steps for improvements from an eco-efficiency perspective and potentially reduce the associated environmental impacts and production costs. Our method is illustrated with a case study of nanocellulose. The results indicate that the production route should start with carrot pomace, use heat and solvent recovery, and deactivate the enzymes with bleach instead of heat. To further improve the process, the results show that focus should be placed on the carrier polymer, sodium alginate, and the production of the GripX coating. Overall, the method shows that the underlying LCA scale-up framework is valuable for purposes beyond conventional LCA studies and is applicable at a very early stage to provide researchers with a better understanding of their production process.

  7. ANALYSIS AND IMPROVEMENT OF PRODUCTION EFFICIENCY IN A CONSTRUCTION MACHINE ASSEMBLY LINE

    Directory of Open Access Journals (Sweden)

    Alidiane Xavier

    2016-07-01

    Full Text Available The increased competitiveness in the market encourages the ongoing development of systems and production processes. The aim is to increase production efficiency so that production costs and waste are reduced as far as possible, allowing increased product competitiveness. The objective of this study was to analyze the overall results of implementing a Kaizen philosophy in a manufacturer of construction machinery, using the action research methodology, in which the macro production process was studied in situ, from the receipt of parts to the end of the assembly line, prioritizing the analysis of shipping and handling times. The results show that the continuous improvement activities directly impact the elimination of waste from the assembly process, mainly related to shipping and handling, improving production efficiency by 30% in the studied processes.

  8. Maintenance Process Strategic Analysis

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Stachowiak, A.

    2016-08-01

    The performance and competitiveness of manufacturing companies is dependent on the availability, reliability and productivity of their production facilities. Low productivity, downtime, and poor machine performance are often linked to inadequate plant maintenance, which in turn can lead to reduced production levels, increasing costs, lost market opportunities, and lower profits. These pressures have given firms worldwide the motivation to explore and embrace proactive maintenance strategies over the traditional reactive firefighting methods. The traditional view of maintenance has shifted into an overall view that encompasses Overall Equipment Efficiency, Stakeholders Management and Life Cycle Assessment. From a practical point of view, it requires changes in the approach to maintenance represented by managers and changes in the actions performed within the maintenance area. Managers have to understand that maintenance is not only about repairs and conservation of machines and devices, but also about actions striving for more efficient resource management and care for the safety and health of employees. The purpose of the work is to present a strategic analysis based on SWOT analysis to identify the opportunities and strengths of the maintenance process, to benefit from them as much as possible, as well as to identify weaknesses and threats, so that they can be eliminated or minimized.

  9. Eco-efficiency of the world cement industry: A data envelopment analysis

    Energy Technology Data Exchange (ETDEWEB)

    Oggioni, G., E-mail: oggioni@eco.unibs.i [University of Brescia, Faculty of Economics, Department of Quantitative Methods, IT-25122 Brescia (Italy); Riccardi, R., E-mail: riccardi@ec.unipi.i [University of Brescia, Faculty of Economics, Department of Quantitative Methods, IT-25122 Brescia (Italy); Toninelli, R., E-mail: roberta.toninelli@unifi.i [University of Pisa, Faculty of Economics, Department of Statistics and Applied Mathematics, IT-56124 Pisa (Italy)

    2011-05-15

    Chemical reactions and the combustion of dirty fuels, such as coal and petroleum coke (petcoke), that are used in cement production processes generate a significant amount of CO{sub 2} emissions. In this paper, we provide an eco-efficiency measure for 21 prototypes of cement industries operating in many countries by applying both a data envelopment analysis (DEA) and a directional distance function approach, which are particularly suitable for models where several production inputs and desirable and undesirable outputs are taken into account. To understand whether this eco-efficiency is due to a rational utilization of inputs or to a real carbon dioxide reduction as a consequence of environmental regulation, we analyze the cases where CO{sub 2} emissions can either be considered as an input or as an undesirable output. Empirical results show that countries where cement industries invest in technologically advanced kilns and adopt alternative fuels and raw materials in their production processes are eco-efficient. This gives a comparative advantage to emerging countries, such as India and China, which are incentivized to modernize their production processes.

  10. Eco-efficiency of the world cement industry: A data envelopment analysis

    International Nuclear Information System (INIS)

    Oggioni, G.; Riccardi, R.; Toninelli, R.

    2011-01-01

    Chemical reactions and the combustion of dirty fuels, such as coal and petroleum coke (petcoke), that are used in cement production processes generate a significant amount of CO2 emissions. In this paper, we provide an eco-efficiency measure for 21 prototypes of cement industries operating in many countries by applying both a data envelopment analysis (DEA) and a directional distance function approach, which are particularly suitable for models where several production inputs and desirable and undesirable outputs are taken into account. To understand whether this eco-efficiency is due to a rational utilization of inputs or to a real carbon dioxide reduction as a consequence of environmental regulation, we analyze the cases where CO2 emissions can either be considered as an input or as an undesirable output. Empirical results show that countries where cement industries invest in technologically advanced kilns and adopt alternative fuels and raw materials in their production processes are eco-efficient. This gives a comparative advantage to emerging countries, such as India and China, which are incentivized to modernize their production processes.

  11. Correction in the efficiency of uranium purification process by solvent extraction

    International Nuclear Information System (INIS)

    Franca Junior, J.M.

    1981-01-01

    A uranium solvent extraction process with a high degree of purification, taking full advantage of the uranium absorbed at the beginning of the process, is described. By including a pulsed column, called the correction column, the efficiency of the whole process is increased, dispensing with the recycling of uranium losses from the leaching column. With the correction column the uranium losses pass continuously to the reextraction column, increasing the efficiency of the process. The purified uranium is removed in the reextraction column in the aqueous phase. The correction process can be carried out with full efficiency using pulsed columns or chemical mixer-settlers. (M.C.K.) [pt

  12. Computer-Aided Sustainable Process Synthesis-Design and Analysis

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan

    Process synthesis involves the investigation of chemical reactions needed to produce the desired product, selection of the separation techniques needed for downstream processing, as well as taking decisions on sequencing the involved separation operations. For an effective, efficient and flexible ... focuses on the development and application of a computer-aided framework for sustainable synthesis-design and analysis of process flowsheets by generating feasible alternatives covering the entire search space and includes analysis tools for sustainability, LCA and economics. The synthesis method is based ... -groups is that the performance of the entire process can be evaluated from the contributions of the individual process-groups towards the selected flowsheet property (for example, energy consumed). The developed flowsheet property models include energy consumption, carbon footprint, product recovery, product ...

  13. Identifying Organizational Inefficiencies with Pictorial Process Analysis (PPA)

    Directory of Open Access Journals (Sweden)

    David John Patrishkoff

    2013-11-01

    Full Text Available Pictorial Process Analysis (PPA) was created by the author in 2004. PPA is a unique methodology which offers ten layers of additional analysis when compared to standard process mapping techniques. The goal of PPA is to identify and eliminate waste, inefficiencies and risk in manufacturing or transactional business processes at 5 levels in an organization. The highest level assessed is process management, followed by the process work environment, detailed work habits, process performance metrics and general attitudes towards the process. This detailed process assessment and analysis is carried out during process improvement brainstorming efforts and Kaizen events. PPA creates a detailed visual efficiency rating for each step of the process under review. A selection of 54 pictorial Inefficiency Icons (cards) is available to highlight major inefficiencies and risks that are present in the business process under review. These inefficiency icons were identified during the author's independent research on the topic of why things go wrong in business. This paper will highlight how PPA was developed and show the steps required to conduct Pictorial Process Analysis on a sample manufacturing process. The author has successfully used PPA to dramatically improve business processes in over 55 different industries since 2004.

  14. Comparison of the energy efficiency to produce agroethanol between various industries and processes: Synthesis

    International Nuclear Information System (INIS)

    Chavanne, Xavier; Frangi, Jean-Pierre

    2011-01-01

    The article assesses the energy R required by a system to transform a cereal or sugar plant into ethanol. From the specific consumption r_j of each process j and its weight w_j in the system, the process consumption share R_j is deduced and hence R, the sum of the R_j. Depending on the definition of w_j, R_j and R are relative to either 100 J of ethanol produced or 100 J of plant harvested. Depending on the nature of r_j, R_j and R represent either only primary external energies, or all fuel and electricity consumed directly, or external and internal energies. From one definition to another, R for average sugar cane based industries is the best or the worst relative to other plants. This results also from the use of cane residues as fuels while operating outdated processes. Through r_j, the process-based analysis makes it possible to examine for each system the impact of modern processes or of a different use of residues. All systems benefit except the sugar beet based industry, which is close to its best efficiency. This flexibility even permits building a self-sufficient system where existing processes produce, from system resources, substitutes for external energies. R then becomes an unambiguous definition of a system's efficiency. It shows that all agroethanol systems are more energy consuming than the petroleum industry. The system can be expanded to the vehicle stage to compare with alternatives to ethanol such as electricity and biogas. Wheat straw burnt to produce electricity used in an electric vehicle will present an R close to that of the petroleum industry. -- Highlights: → Study of the energy consumptions of agroethanol industries with a process-based analysis. → Different definitions of energy efficiency with potentially opposite conclusions. → Previous highlight is overcome using self-sufficient systems with existing processes. → Consumptions of average and improved agroethanol industries larger than for petroleum industries. → Electricity from wheat straw combustion can compete with gasoline from crude oil.
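
    Read literally, the abstract implies the aggregation below, with r_j the specific consumption of process j and w_j its weight in the system (a plausible reconstruction of the stated relations, not a formula quoted from the paper):

    \[ R_j = w_j \, r_j, \qquad R = \sum_j R_j, \]

    where R is expressed per 100 J of ethanol produced or per 100 J of plant harvested, depending on how w_j is defined.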

  15. Energy efficiency in Germany - a meta-analysis. Analysis and recommendations

    International Nuclear Information System (INIS)

    Bauernhansl, Thomas

    2014-01-01

    The Stuttgart Institute of Energy Efficiency in Production compiled the first meta-analysis ''Energy Efficiency in Germany''. It provides facts and figures on the development status and knowledge level of energy efficiency in Germany. The study shows the contributions that have been made by individual measures and the potentials which are known but have not yet been tapped. For this meta-analysis, more than 250 publications from research institutions, government departments, and professional and industry associations with energy efficiency as their main topic were identified and evaluated. The study provides an overview of the status of development and is an important reference work for industry, associations and politics. [de

  16. Process synthesis, design and analysis using a process-group contribution method

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Eden, Mario R.; Gani, Rafiqul

    2015-01-01

    This paper describes the development and application of a process-group contribution method to model, simulate and synthesize chemical processes. Process flowsheets are generated in the same way as atoms or groups of atoms are combined to form molecules in computer aided molecular design (CAMD) techniques. The fundamental pillars of this framework are the definition and use of functional process-groups (building blocks) representing a wide range of process operations, flowsheet connectivity rules to join the process-groups to generate all the feasible flowsheet alternatives, and flowsheet property models like energy consumption, atom efficiency and environmental impact to evaluate the performance of the generated alternatives. In this way, a list of feasible flowsheets is quickly generated, screened and selected for further analysis. Since the flowsheet is synthesized and the operations ...

  17. Improved energy efficiency in the process industries

    Energy Technology Data Exchange (ETDEWEB)

    Pilavachi, P A [Commission of the European Communities, Brussels (Belgium)]

    1992-12-31

    The European Commission, through the JOULE Programme, is promoting energy efficient technologies in the process industries; the topics of the various R and D activities are: heat exchangers (enhanced evaporation, shell and tube heat exchangers including distribution of fluids, and fouling), low energy separation processes (adsorption, melt-crystallization and supercritical extraction), chemical reactors (methanol synthesis and reactors with integral heat exchangers), other unit operations (evaporators, glass-melting furnaces, cement kilns and baking ovens, dryers and packed columns and replacements for R12 in refrigeration), energy and system process models (batch processes, simulation and control of transients and energy synthesis), development of advanced sensors.

  18. Working memory capacity and redundant information processing efficiency.

    Science.gov (United States)

    Endres, Michael J; Houpt, Joseph W; Donkin, Chris; Finn, Peter R

    2015-01-01

    Working memory capacity (WMC) is typically measured by the amount of task-relevant information an individual can keep in mind while resisting distraction or interference from task-irrelevant information. The current research investigated the extent to which differences in WMC were associated with performance on a novel redundant memory probes (RMP) task that systematically varied the amount of to-be-remembered (targets) and to-be-ignored (distractor) information. The RMP task was designed to both facilitate and inhibit working memory search processes, as evidenced by differences in accuracy, response time, and Linear Ballistic Accumulator (LBA) model estimates of information processing efficiency. Participants (N = 170) completed standard intelligence tests and dual-span WMC tasks, along with the RMP task. As expected, accuracy, response-time, and LBA model results indicated memory search and retrieval processes were facilitated under redundant-target conditions, but also inhibited under mixed target/distractor and redundant-distractor conditions. Repeated measures analyses also indicated that, while individuals classified as high (n = 85) and low (n = 85) WMC did not differ in the magnitude of redundancy effects, groups did differ in the efficiency of memory search and retrieval processes overall. Results suggest that redundant information reliably facilitates and inhibits the efficiency or speed of working memory search, and these effects are independent of more general limits and individual differences in the capacity or space of working memory.

  19. Site Suitability Analysis for Beekeeping via Analythical Hyrearchy Process, Konya Example

    Science.gov (United States)

    Sarı, F.; Ceylan, D. A.

    2017-11-01

    Over the past decade, the importance of beekeeping activities has been emphasized in the fields of biodiversity, ecosystems, agriculture and human health. Thus, efficient management and deciding on correct beekeeping activities seem essential to maintain and improve productivity and efficiency. Due to this importance, and considering the economic contributions to the rural area, the need for a suitability analysis concept has been revealed. At this point, Multi Criteria Decision Analysis (MCDA) and Geographical Information Systems (GIS) integration provides efficient solutions to the complex structure of the decision-making process for beekeeping activities. In this study, a site suitability analysis via the Analytical Hierarchy Process (AHP) was carried out for Konya city in Turkey. Slope, elevation, aspect, distance to water resources, roads and settlements, precipitation and flora criteria are included to determine suitability. The requirements, expectations and limitations of beekeeping activities are specified with the participation of experts and stakeholders. The final suitability map was validated with 117 existing beekeeping locations and Turkish Statistical Institute 2016 beekeeping statistics for Konya province.
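
    To make the AHP step concrete, the sketch below derives criterion weights from a Saaty pairwise comparison matrix via the principal eigenvector and checks consistency. The 3x3 matrix of judgements is purely hypothetical, not the matrix elicited from the experts and stakeholders in the study, which covered eight criteria.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights and consistency ratio from a Saaty pairwise comparison matrix."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                  # normalised principal eigenvector
    ci = (eigvals[k].real - n) / (n - 1)          # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41}[n]  # Saaty random indices
    return w, ci / ri                             # weights, consistency ratio (should be < 0.1)

# Hypothetical judgements for three criteria (slope, distance to water, flora).
A = [[1, 3, 5],
     [1/3, 1, 2],
     [1/5, 1/2, 1]]
w, cr = ahp_weights(A)
print(np.round(w, 3), round(cr, 3))
```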

  20. DEA Window Analysis for Measuring Port Efficiencies in Serbia

    Directory of Open Access Journals (Sweden)

    Danijela Pjevčević

    2012-01-01

    Full Text Available The aim of the paper is to apply the Data Envelopment Analysis (DEA) method to measuring and analyzing the efficiencies of ports on the river Danube. DEA window analysis is used to determine the efficiency of the ports and to observe possible changes in port efficiency over time. A study is conducted to evaluate the efficiencies of ports on the territory of Serbia in order to identify the sources of inefficiencies and formulate proposals for improving the services of those ports and their operations, through a four-year window analysis with port efficiency trends and average efficiencies. Progress is made in the measurement of port efficiency in relation to port productive activities - total area of warehouses, quay length, number of cranes and port throughput - for the Serbian river ports. Keywords: river ports, total area of warehouses, quay length, number of cranes, port throughput, port efficiency, DEA window analysis

  1. Summary of process research analysis efforts

    Science.gov (United States)

    Burger, D. R.

    1985-01-01

    A summary of solar-cell process research analysis efforts is presented. Process design and cell design are interactive efforts in which technology from integrated circuit processes and other processes is blended. The primary factors that control cell efficiency are: (1) the bulk parameters of the available sheet material, (2) the retention and enhancement of these bulk parameters, and (3) the cell design and the cost to produce versus the finished cell's performance. The process sequences need to be tailored to be compatible with the sheet form, the cell shape, and the processing equipment. New process options that require further evaluation and utilization are lasers, robotics, thermal pulse techniques, and new materials. There are numerous process control techniques that can be adapted and used to improve product uniformity and reduce costs. Two factors that can lead to longer-life modules are the use of solar cell diffusion barriers and improved encapsulation.

  2. Integration of solar thermal for improved energy efficiency in low-temperature-pinch industrial processes

    International Nuclear Information System (INIS)

    Atkins, Martin J.; Walmsley, Michael R.W.; Morrison, Andrew S.

    2010-01-01

    Solar thermal systems have the potential to provide renewable industrial process heat and are especially suited for low pinch temperature processes such as those in the food, beverage, and textile sectors. When correctly integrated within an industrial process, they can provide significant progress towards both increased energy efficiency and reduction in emissions. However, the integration of renewable solar energy into industrial processes presents a challenge for existing process integration techniques due to the non-continuous nature of the supply. A thorough pinch analysis study of the industrial process, taking into account non-continuous operating rates, should be performed to evaluate the utility demand profile. Solar collector efficiency data under variable climatic conditions should also be collected for the specific site. A systematic method of combining this information leads to improved design and an optimal operating strategy. This approach has been applied to a New Zealand milk powder plant and the benefits of several integration strategies, including mass integration, are investigated. The appropriate placement of the solar heat is analogous to the placement of a hot utility source and an energy penalty will be incurred when the solar thermal system provides heat below the pinch temperature.

  3. Integration of solar thermal for improved energy efficiency in low-temperature-pinch industrial processes

    Energy Technology Data Exchange (ETDEWEB)

    Atkins, Martin J.; Walmsley, Michael R.W.; Morrison, Andrew S. [Energy Research Group, School of Science and Engineering, University of Waikato, Private Bag 3105, Hamilton 3240 (New Zealand)]

    2010-05-15

    Solar thermal systems have the potential to provide renewable industrial process heat and are especially suited for low pinch temperature processes such as those in the food, beverage, and textile sectors. When correctly integrated within an industrial process, they can provide significant progress towards both increased energy efficiency and reduction in emissions. However, the integration of renewable solar energy into industrial processes presents a challenge for existing process integration techniques due to the non-continuous nature of the supply. A thorough pinch analysis study of the industrial process, taking into account non-continuous operating rates, should be performed to evaluate the utility demand profile. Solar collector efficiency data under variable climatic conditions should also be collected for the specific site. A systematic method of combining this information leads to improved design and an optimal operating strategy. This approach has been applied to a New Zealand milk powder plant and the benefits of several integration strategies, including mass integration, are investigated. The appropriate placement of the solar heat is analogous to the placement of a hot utility source and an energy penalty will be incurred when the solar thermal system provides heat below the pinch temperature. (author)

  4. A modified indirect mathematical model for evaluation of ethanol production efficiency in industrial-scale continuous fermentation processes.

    Science.gov (United States)

    Canseco Grellet, M A; Castagnaro, A; Dantur, K I; De Boeck, G; Ahmed, P M; Cárdenas, G J; Welin, B; Ruiz, R M

    2016-10-01

    To calculate fermentation efficiency in a continuous ethanol production process, we aimed to develop a robust mathematical method based on the analysis of metabolic by-product formation. This method is in contrast to the traditional way of calculating ethanol fermentation efficiency, where the ratio between the ethanol produced and the sugar consumed is expressed as a percentage of the theoretical conversion yield. Comparison between the two methods, at industrial scale and in sensitivity studies, showed that the indirect method was more robust and gave slightly higher fermentation efficiency values, although fermentation efficiency of the industrial process was found to be low (~75%). The traditional calculation method is simpler than the indirect method as it only requires a few chemical determinations in samples collected. However, a minor error in any measured parameter will have an important impact on the calculated efficiency. In contrast, the indirect method of calculation requires a greater number of determinations but is much more robust since an error in any parameter will only have a minor effect on the fermentation efficiency value. The application of the indirect calculation methodology in order to evaluate the real situation of the process and to reach an optimum fermentation yield for an industrial-scale ethanol production is recommended. Once a high fermentation yield has been reached the traditional method should be used to maintain the control of the process. Upon detection of lower yields in an optimized process the indirect method should be employed as it permits a more accurate diagnosis of causes of yield losses in order to correct the problem rapidly. The low fermentation efficiency obtained in this study shows an urgent need for industrial process optimization where the indirect calculation methodology will be an important tool to determine process losses. © 2016 The Society for Applied Microbiology.
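
    For reference, the traditional (direct) calculation that the indirect method is being compared with can be written in a few lines; the 0.511 g/g factor is the stoichiometric (Gay-Lussac) yield of ethanol from hexose sugar. The plant figures are hypothetical, not data from the study, and the second call illustrates why the direct method is fragile: a 2% error in the measured sugar shifts the efficiency by more than a percentage point.

```python
THEORETICAL_YIELD = 0.511  # g ethanol per g hexose sugar (stoichiometric maximum)

def direct_efficiency(ethanol_g, sugar_consumed_g):
    """Ethanol produced over the theoretically attainable ethanol, in percent."""
    return 100.0 * ethanol_g / (sugar_consumed_g * THEORETICAL_YIELD)

print(round(direct_efficiency(ethanol_g=38.5, sugar_consumed_g=100.0), 1))  # ~75.3 %
print(round(direct_efficiency(ethanol_g=38.5, sugar_consumed_g=98.0), 1))   # ~76.9 %
```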

  5. A method to generate fully multi-scale optimal interpolation by combining efficient single process analyses, illustrated by a DINEOF analysis spiced with a local optimal interpolation

    Directory of Open Access Journals (Sweden)

    J.-M. Beckers

    2014-10-01

    Full Text Available We present a method in which the optimal interpolation of multi-scale processes can be expanded into a succession of simpler interpolations. First, we prove how the optimal analysis of a superposition of two processes can be obtained by different mathematical formulations involving iterations and analyses focusing on a single process. From the different mathematically equivalent formulations, we then select the most efficient ones by analyzing the behavior of the different possibilities in a simple and well-controlled test case. The clear guidelines deduced from this experiment are then applied to a real situation in which we combine a large-scale analysis of hourly Spinning Enhanced Visible and Infrared Imager (SEVIRI) satellite images using data interpolating empirical orthogonal functions (DINEOF) with a local optimal interpolation using a Gaussian covariance. It is shown that the optimal combination indeed provides the best reconstruction and can therefore be exploited to extract the maximum amount of useful information from the original data.
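
    The local step of such a combination can be sketched as a one-dimensional optimal interpolation with a Gaussian background covariance, applied to the residual left after the large-scale (e.g. DINEOF) analysis has been removed. The grid, observations, correlation length and error variances below are hypothetical, not the SEVIRI configuration of the paper.

```python
import numpy as np

def oi_analysis(xb, grid, obs, obs_loc, L=50.0, sig_b=1.0, sig_o=0.2):
    """1-D optimal interpolation: xa = xb + B H' (H B H' + R)^-1 (y - H xb)."""
    B_go = sig_b**2 * np.exp(-((grid[:, None] - obs_loc[None, :]) / L) ** 2)     # cov(grid, obs)
    B_oo = sig_b**2 * np.exp(-((obs_loc[:, None] - obs_loc[None, :]) / L) ** 2)  # cov(obs, obs)
    R = sig_o**2 * np.eye(obs.size)                  # observation-error covariance
    Hxb = np.interp(obs_loc, grid, xb)               # background interpolated to obs locations
    w = np.linalg.solve(B_oo + R, obs - Hxb)         # weights applied to the innovations
    return xb + B_go @ w

grid = np.linspace(0.0, 500.0, 101)
xb = np.zeros_like(grid)                 # residual field after removing the large-scale analysis
obs = np.array([0.8, -0.3])
obs_loc = np.array([120.0, 300.0])
xa = oi_analysis(xb, grid, obs, obs_loc)
print(round(xa[24], 2))                  # grid point at x = 120, pulled towards the first observation
```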

  6. Exploring Efficiencies in Data Reduction, Analysis, and Distribution in the Exascale Era

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    DataDirect Networks (DDN) is a world leader in massively scalable storage. Fellinger will discuss how the growth of data sets beyond the petabyte boundary presents new challenges to researchers in delivering and extracting usable information. He will also explore a Big Data processing architecture that overcomes constraints in network bandwidth and service layers in large-scale data distribution to enable researchers to request raw data through a filter or an analysis library. This technology, operating directly within the storage device, enables the reduction of service latency and process cycles to provide a more efficient feedback loop in iterative scientific experiments and to increase data-intensive processing efficiency by up to 400%. About the speaker Dave Fellinger has over three decades of engineering experience, including film systems, ASIC design and development, GaAs semiconductor manufacture, RAID and storage systems, and video processing devices, and has architected high-performance storage syst...

  7. Exergetic efficiency analysis of hydrogen–air detonation in pulse detonation combustor using computational fluid dynamics

    Directory of Open Access Journals (Sweden)

    Pinku Debnath

    2017-03-01

    Full Text Available Exergy losses during the combustion process, heat transfer, and fuel utilization play a vital role in the analysis of the exergetic efficiency of the combustion process. Detonation is thermodynamically more efficient than the deflagration mode of combustion. Detonation combustion technology inside a pulse detonation engine using hydrogen as fuel is an energetic propulsion system for the next generation. The main objective of this work is to quantify the exergetic efficiency of hydrogen–air combustion for the deflagration and detonation combustion processes. Detonation parameters are further calculated for H2 mass fractions of 0.25, 0.35, and 0.55 in the combustion process. The simulations were performed to convergence using the commercial computational fluid dynamics package Ansys Fluent. The details of the combustion physics in the chemically reacting flow of the hydrogen–air mixture in two control volumes were simulated using a species transport model with eddy dissipation turbulence–chemistry interaction. From these simulations it was observed that the exergy loss in the deflagration combustion process is higher than in the detonation combustion process. The major observation was that the pilot fuel economy for the two combustion processes and the augmentation of exergetic efficiency are better in the detonation combustion process. Maximum exergetic efficiencies of 55.12%, 53.19%, and 23.43% were obtained from the deflagration combustion process, and 67.55%, 57.49%, and 24.89% from the detonation combustion process, for the aforesaid H2 mass fractions. It was also found that higher exergetic efficiency was observed for lower fuel mass fractions.

  8. Processing speed training increases the efficiency of attentional resource allocation in young adults

    Directory of Open Access Journals (Sweden)

    Wesley K Burge

    2013-10-01

    Full Text Available Cognitive training has been shown to improve performance on a range of tasks. However, the mechanisms underlying these improvements are still unclear. Given the wide range of transfer effects, it is likely that these effects are due to a factor common to a wide range of tasks. One such factor is a participant’s efficiency in allocating limited cognitive resources. The impact of a cognitive training program, Processing Speed Training (PST), on the allocation of resources to a set of visual tasks was measured using pupillometry in 10 young adults as compared to a control group of 10 young adults (n = 20). PST is a well-studied computerized training program that involves identifying simultaneously presented central and peripheral stimuli. As training progresses, the task becomes increasingly more difficult, by including peripheral distracting stimuli and decreasing the duration of stimulus presentation. Analysis of baseline data confirmed that pupil diameter reflected cognitive effort. After training, participants randomized to PST used fewer attentional resources to perform complex visual tasks as compared to the control group. These pupil diameter data indicated that PST appears to increase the efficiency of attentional resource allocation. Increases in cognitive efficiency have been hypothesized to underlie improvements following experience with action video games, and improved cognitive efficiency has been hypothesized to underlie the benefits of processing speed training in older adults. These data reveal that these training schemes may share a common underlying mechanism of increasing cognitive efficiency in younger adults.

  9. Exergetic analysis of a biodiesel production process from Jatropha curcas

    International Nuclear Information System (INIS)

    Blanco-Marigorta, A.M.; Suárez-Medina, J.; Vera-Castellano, A.

    2013-01-01

    Highlights: ► Exergetic analysis of a biodiesel production process from Jatropha curcas. ► 95% of the inefficiencies are located in the transesterification reactor. ► Exergetic efficiency of the steam generator amounts to 37.6%. ► Chemical reactions cause most of the irreversibilities of the process. ► Exergetic efficiency of the overall process is over 63%. -- Abstract: As fossil fuels are depleting day by day, it is necessary to find an alternative fuel to fulfill the energy demand of the world. Biodiesel is considered an environmentally friendly renewable diesel fuel alternative. The interest in using Jatropha curcas as a feedstock for the production of biodiesel is rapidly growing. On the one hand, J. curcas’ oil does not compete with the food sector due to its toxic nature and to the fact that it must be cultivated in marginal/poor soil. On the other, its price is low and stable. In the last decade, the investigation of biodiesel production was centered on the choice of a suitable raw material and on the optimization of the process operating conditions. Nowadays, research is focused on the improvement of the energetic performance and on diminishing the inefficiencies in the different process components. The method of exergy analysis is well suited for furthering this goal, for it is a powerful tool for developing, evaluating and improving an energy conversion system. In this work, we identify the location, magnitude and sources of thermodynamic inefficiencies in a biodiesel production process from J. curcas by means of an exergy analysis. The thermodynamic properties were calculated from existing databases or estimated when necessary. The highest exergy destruction takes place in the transesterification reactor due to chemical reactions. Almost 95% of the exergy of the fuel is destroyed in this reactor. The exergetic efficiency of the overall process is 63%.

  10. Thermodynamic analysis of a milk pasteurization process assisted by geothermal energy

    International Nuclear Information System (INIS)

    Yildirim, Nurdan; Genc, Seda

    2015-01-01

    Renewable energy systems are an important concern for the sustainable development of the world. Thermodynamic analysis, especially exergy analysis, is a powerful tool to assess the sustainability of such systems. The food processing industry is one of the energy-intensive sectors, and the dairy industry consumes a substantial amount of energy among food industry segments. Therefore, in this study, a thermodynamic analysis of a milk pasteurization process assisted by geothermal energy was carried out. In the system, a water–ammonia VAC (vapor absorption cycle), a cooling section, a pasteurizer and a regenerator were used for milk pasteurization. The exergetic efficiencies of each component and of the whole system were calculated separately. A parametric study was undertaken. In this regard, the effects of the geothermal resource temperature on (i) the total exergy destruction of the absorption cycle and the whole system, (ii) the efficiency of the VAC, the efficiency of the whole system and the COP (coefficient of performance) of the VAC, and (iii) the flow rate of the pasteurized milk were first investigated. Then, the effect of the geothermal resource flow rate on the pasteurization load was analyzed. The exergetic efficiency of the whole system was calculated as 56.81%, with a total exergy destruction rate of 13.66 kW. The exergetic results are also illustrated in a Grassmann diagram. - Highlights: • A geothermal energy assisted milk pasteurization system was studied thermodynamically. • The first study on the exergetic analysis of a milk pasteurization process with a VAC. • The thermodynamic properties of the water–ammonia mixture were calculated using EES. • Energetic and exergetic efficiencies calculated as 71.05 and 56.81%, respectively.

  11. Increasing the efficiency of designing hemming processes by using an element-based metamodel approach

    Science.gov (United States)

    Kaiser, C.; Roll, K.; Volk, W.

    2017-09-01

    In the automotive industry, the manufacturing of automotive outer panels requires hemming processes in which two sheet metal parts are joined together by bending the flange of the outer part over the inner part. Because of decreasing development times and the steadily growing number of vehicle derivatives, efficient digital product and process validation is necessary. Commonly used simulations, which are based on the finite element method, demand significant modelling effort, which results in disadvantages especially in the early product development phase. To increase the efficiency of designing hemming processes, this paper presents a hemming-specific metamodel approach. The approach includes a part analysis in which the outline of the automotive outer panel is initially split into individual segments. By parametrizing each of the segments and assigning basic geometric shapes, the outline of the part is approximated. Based on this, the hemming parameters such as flange length, roll-in, wrinkling and plastic strains are calculated for each of the basic geometric shapes by performing a metamodel-based segmental product validation. The metamodel is based on an element-similar formulation that includes a reference dataset of various basic geometric shapes. A given automotive outer panel can then be analysed and optimized based on the hemming-specific database. By implementing this approach in a planning system, efficient optimization of hemming process design is enabled. Furthermore, valuable time and cost benefits can be realized in a vehicle’s development process.

  12. Multi-Directional Non-Parametric Analysis of Agricultural Efficiency

    DEFF Research Database (Denmark)

    Balezentis, Tomas

    This thesis seeks to develop methodologies for assessment of agricultural efficiency and employ them to Lithuanian family farms. In particular, we focus on three particular objectives throughout the research: (i) to perform a fully non-parametric analysis of efficiency effects, (ii) to extend...... to the Multi-Directional Efficiency Analysis approach when the proposed models were employed to analyse empirical data of Lithuanian family farm performance, we saw substantial differences in efficiencies associated with different inputs. In particular, assets appeared to be the least efficiently used input...... relative to labour, intermediate consumption and land (in some cases land was not treated as a discretionary input). These findings call for further research on relationships among financial structure, investment decisions, and efficiency in Lithuanian family farms. Application of different techniques...

  13. Modeling and techno-economic analysis of shale-to-liquid and coal-to-liquid fuels processes

    International Nuclear Information System (INIS)

    Zhou, Huairong; Yang, Siyu; Xiao, Honghua; Yang, Qingchun; Qian, Yu; Gao, Li

    2016-01-01

    To alleviate the conflict between oil supply and demand, the Chinese government has accelerated the exploration and exploitation of alternative oil production. STL (shale-to-liquid) processes and CTL (coal-to-liquid) processes are promising choices for supplying oil. However, few analyses have been made of their energy efficiency and economic performance. This paper conducts a detailed analysis of an STL process and a CTL process based on mathematical modeling and simulation. The analysis shows that the low efficiency of the STL process is due to the low oil yield of the Fushun-type retorting technology. For the CTL process, the utility system accounts for nearly 34% of the total energy consumption. This is because CTL technologies are at an early stage of development and no heat integration between units is implemented. The economic analysis reveals that the total capital investment of the CTL process is higher than that of the STL process. The production cost of the CTL process is on the same level as that of the STL process. For better techno-economic performance, it is suggested to develop a new retorting technology with high oil yield for the STL process. The remaining retorting gas should be converted to hydrogen and then used for shale oil hydrogenation. For the CTL process, developing an appropriate heat network is an efficient way to apply heat integration. In addition, the CTL process should be integrated with hydrogen-rich gas to adjust the H_2/CO ratio for better resource utilization. - Highlights: • Aspen Plus software is used for modeling and simulation of a shale-to-liquid (STL) and a coal-to-liquid (CTL) process. • A techno-economic analysis of the STL and CTL processes is conducted. • Suggestions are given for improving the energy efficiency and economic performance of the STL and CTL processes.

  14. Technical Efficiency of Thai Manufacturing SMEs: A Stochastic Frontier Analysis

    Directory of Open Access Journals (Sweden)

    Teerawat Charoenrat

    2013-03-01

    Full Text Available A major motivation of this study is to examine the factors that are the most important in contributing to the relatively poor efficiency performance of Thai manufacturing small and medium sized enterprises (SMEs). The results obtained will be significant in devising effective policies aimed at tackling this poor performance. This paper uses data on manufacturing SMEs in the North-eastern region of Thailand in 2007 as a case study, applying a stochastic frontier analysis (SFA) and a technical inefficiency effects model. The empirical results indicate that the mean technical efficiency of all categories of manufacturing SMEs in the North-eastern region is 43%, implying that manufacturing SMEs have high levels of technical inefficiency in their production processes. Manufacturing SMEs in the North-eastern region are particularly labour-intensive. The empirical results of the technical inefficiency effects model suggest that skilled labour, the municipal area and ownership characteristics are important firm-specific factors affecting technical efficiency. The paper argues that the government should play a more substantial role in developing manufacturing SMEs in the North-eastern provinces by: providing training programs for employees and employers; encouraging greater usage of capital and technology in the production processes of SMEs; enhancing the efficiency of state-owned enterprises; encouraging a wide range of ownership forms; and improving information and communications infrastructure.
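
    For readers unfamiliar with SFA, the textbook specification that a stochastic frontier with a technical inefficiency effects model builds on can be summarised as follows (this is the generic form, not the exact model estimated in the paper):

    \[ \ln y_i = \mathbf{x}_i'\boldsymbol{\beta} + v_i - u_i, \qquad v_i \sim N(0, \sigma_v^2), \quad u_i \ge 0, \qquad \mathrm{TE}_i = \exp(-u_i), \]

    where y_i is the output of firm i, v_i is symmetric statistical noise, u_i is the one-sided inefficiency term whose mean may depend on firm-specific variables (e.g. skilled labour, location, ownership), and TE_i is the technical efficiency score between 0 and 1, whose sample mean is the 43% reported above.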

  15. Efficiency of cellular information processing

    International Nuclear Information System (INIS)

    Barato, Andre C; Hartich, David; Seifert, Udo

    2014-01-01

    We show that a rate of conditional Shannon entropy reduction, characterizing the learning of an internal process about an external process, is bounded by the thermodynamic entropy production. This approach allows for the definition of an informational efficiency that can be used to study cellular information processing. We analyze three models of increasing complexity inspired by the Escherichia coli sensory network, where the external process is an external ligand concentration jumping between two values. We start with a simple model for which ATP must be consumed so that a protein inside the cell can learn about the external concentration. With a second model for a single receptor we show that the rate at which the receptor learns about the external environment can be nonzero even without any dissipation inside the cell since chemical work done by the external process compensates for this learning rate. The third model is more complete, also containing adaptation. For this model we show inter alia that a bacterium in an environment that changes at a very slow time-scale is quite inefficient, dissipating much more than it learns. Using the concept of a coarse-grained learning rate, we show for the model with adaptation that while the activity learns about the external signal the option of changing the methylation level increases the concentration range for which the learning rate is substantial. (paper)
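
    A hedged way to make the efficiency mentioned above concrete, using only what the abstract states: if l denotes the rate at which the internal process reduces its conditional Shannon entropy about the external process (the learning rate) and sigma the thermodynamic entropy production rate, the stated bound suggests

    \[ \eta = \frac{l}{\sigma} \le 1, \]

    so the "quite inefficient" bacterium in a slowly changing environment is one with eta far below 1: it dissipates much more than it learns. The paper's own notation and exact definition may differ.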

  16. SITE SUITABILITY ANALYSIS FOR BEEKEEPING VIA ANALYTHICAL HYREARCHY PROCESS, KONYA EXAMPLE

    Directory of Open Access Journals (Sweden)

    F. Sarı

    2017-11-01

    Full Text Available Over the past decade, the importance of beekeeping activities has been emphasized in the fields of biodiversity, ecosystems, agriculture and human health. Thus, efficient management and deciding on correct beekeeping activities seem essential to maintain and improve productivity and efficiency. Due to this importance, and considering the economic contributions to the rural area, the need for a suitability analysis concept has been revealed. At this point, Multi Criteria Decision Analysis (MCDA) and Geographical Information Systems (GIS) integration provides efficient solutions to the complex structure of the decision-making process for beekeeping activities. In this study, a site suitability analysis via the Analytical Hierarchy Process (AHP) was carried out for Konya city in Turkey. Slope, elevation, aspect, distance to water resources, roads and settlements, precipitation and flora criteria are included to determine suitability. The requirements, expectations and limitations of beekeeping activities are specified with the participation of experts and stakeholders. The final suitability map was validated with 117 existing beekeeping locations and Turkish Statistical Institute 2016 beekeeping statistics for Konya province.

  17. Thermodynamic analysis of combined Solid Oxide Electrolyzer and Fischer–Tropsch processes

    International Nuclear Information System (INIS)

    Stempien, Jan Pawel; Ni, Meng; Sun, Qiang; Chan, Siew Hwa

    2015-01-01

    In this paper a thermodynamic analysis and simple optimization of combined Solid Oxide Electrolyzer Cell and Fischer–Tropsch Synthesis processes for sustainable hydrocarbon fuel production is reported. Comprehensive models are employed to describe the effects of temperature, pressure, reactant composition and molar flux and flow on the system efficiency and the final product distribution. The electrolyzer model was developed in-house and validated with experimental data of a typical Solid Oxide Electrolyzer. The Fischer–Tropsch Synthesis model employed lumped kinetics of syngas utilization, which includes the inhibiting effect of water content and the kinetics of the Water–Gas Shift reaction. The product distribution model incorporated olefin re-adsorption and the varying physisorption and solubility of hydrocarbons with their carbon number. The results were compared with those reported by Becker et al., who presented a simplified analysis of such a process. In the present study an opposite effect of operation at elevated pressure was observed. The proposed optimized system achieved an overall efficiency of 66.67% and an almost equal spread of light (31 wt%), mid (36 wt%) and heavy hydrocarbons (33 wt%). Paraffins contributed the majority of the yield. - Highlights: • Analysis of a Solid Oxide Electrolyzer combined with the Fischer–Tropsch process. • Efficiency of converting water and carbon dioxide into synthetic fuels above 66%. • Effects of process temperature, pressure, gas flux and compositions were analyzed.

  18. Healthcare efficiency assessment using DEA analysis in the Slovak Republic.

    Science.gov (United States)

    Stefko, Robert; Gavurova, Beata; Kocisova, Kristina

    2018-03-09

    Regional disparity is becoming an increasingly important growth constraint. Policy makers need quantitative knowledge to design effective and targeted policies. In this paper, the regional efficiency of healthcare facilities in Slovakia is measured (2008-2015) using data envelopment analysis (DEA). DEA is the dominant approach to assessing the efficiency of the healthcare system, as well as of other economic areas. In this study, the window approach is introduced as an extension to the basic DEA models to evaluate healthcare technical efficiency in individual regions and to quantify the basic regional disparities and discrepancies. The window DEA method was chosen since it leads to increased discrimination in results, especially when applied to small samples, and it enables year-by-year comparisons of the results. Two stable inputs (number of beds, number of medical staff), three variable inputs (number of all medical equipment, number of magnetic resonance (MR) devices, number of computed tomography (CT) devices) and two stable outputs (use of beds, average nursing time) were chosen as production variables in an output-oriented 4-year window DEA model for the assessment of technical efficiency in 8 regions. The database was made available from the National Health Information Center and the Slovak Statistical Office, as well as from the online databases Slovstat and DataCube. The aim of the paper is to quantify the impact of non-standard Data Envelopment Analysis (DEA) variables, such as the use of medical technologies (MR, CT), on the results of the assessment of the efficiency of healthcare facilities and their adequacy in the evaluation of the monitored processes. The results of the analysis have shown that there is an indirect dependence between the values of the variables over time and the estimated efficiency results in all regions. The regions that had low values of the variables over time achieved a high degree of efficiency and vice versa. Interesting...
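
    As a sketch of how the year-by-year comparison arises, the snippet below builds the 4-year windows for the 2008-2015 panel described above; each region-year inside a window is then treated as a separate DMU and scored with an ordinary DEA model (that scoring step is omitted here).

```python
def dea_windows(years, width=4):
    """Overlapping year windows for a DEA window analysis."""
    return [tuple(years[i:i + width]) for i in range(len(years) - width + 1)]

for w in dea_windows(list(range(2008, 2016))):
    print(w)
# Five windows: (2008..2011), (2009..2012), ..., (2012..2015). A region observed in all
# years appears in up to four windows, so its efficiency can be compared across windows.
```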

  19. Design process of an area-efficient photobioreactor

    NARCIS (Netherlands)

    Zijffers, J.F.; Janssen, M.G.J.; Tramper, J.; Wijffels, R.H.

    2008-01-01

    This article describes the design process of the Green Solar Collector (GSC), an area-efficient photobioreactor for the outdoor cultivation of microalgae. The overall goal has been to design a system in which all incident sunlight on the area covered by the reactor is delivered to the algae at such

  20. Contingency Analysis Post-Processing With Advanced Computing and Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt; Fitzhenry, Erin

    2017-07-01

    Contingency analysis is a critical function widely used in energy management systems to assess the impact of power system component failures. Its outputs are important for power system operation for improved situational awareness, for power system planning studies, and for power market operations. With the increased complexity of power system modeling and simulation caused by increased energy production and demand, the penetration of renewable energy and the fast deployment of smart grid devices, and the trend of operating grids closer to their capacity for better efficiency, more and more contingencies must be executed and analyzed quickly in order to ensure grid reliability and accuracy for the power market. Currently, many researchers have proposed different techniques to accelerate the computational speed of contingency analysis, but not much work has been published on how to post-process the large volume of contingency outputs quickly. This paper proposes a parallel post-processing function that can analyze contingency analysis outputs faster and display them in a web-based visualization tool to help power engineers improve their work efficiency through fast information digestion. Case studies using an ESCA-60 bus system and a WECC planning system are presented to demonstrate the functionality of the parallel post-processing technique and the web-based visualization tool.
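
    A minimal sketch of the parallel post-processing idea: contingency outputs are summarised independently, so they can be mapped over a process pool. The summarise() logic and the synthetic outputs are hypothetical placeholders, not the paper's implementation.

```python
import multiprocessing as mp

def summarise(contingency_result):
    """Count branch flows above 1.0 p.u. (overloads) for one contingency."""
    violations = [f for f in contingency_result["flows"] if f > 1.0]
    return contingency_result["id"], len(violations)

if __name__ == "__main__":
    fake_outputs = [{"id": i, "flows": [0.7, 0.9 + 0.001 * i, 1.05]} for i in range(1000)]
    with mp.Pool() as pool:
        summary = dict(pool.map(summarise, fake_outputs))
    print(sum(1 for n in summary.values() if n > 0), "contingencies with at least one violation")
```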

  1. The Transfer efficiency analysis and modeling technology of new non-contact power transmission equipment

    Directory of Open Access Journals (Sweden)

    Cao Shi

    2017-01-01

    Full Text Available Due to the shortcomings of the power transmission currently used in ultrasound-assisted machining, and the variation in transfer efficiency caused by the related parameters of the electromagnetic converter, this paper proposes an analysis model of a new non-contact power transmission device with more stable output and higher transmission efficiency. Using Maxwell finite element analysis software, the paper then studies the behaviour of the transfer efficiency of the new non-contact transformer and compares the new type with the traditional type by setting the boundary conditions of the non-contact power supply device. Finally, in combination with the practical application, relevant requirements that have reference value for actual processing are put forward.

  2. Parallel and Efficient Sensitivity Analysis of Microscopy Image Segmentation Workflows in Hybrid Systems.

    Science.gov (United States)

    Barreiros, Willian; Teodoro, George; Kurc, Tahsin; Kong, Jun; Melo, Alba C M A; Saltz, Joel

    2017-09-01

    We investigate efficient sensitivity analysis (SA) of algorithms that segment and classify image features in a large dataset of high-resolution images. Algorithm SA is the process of evaluating variations of methods and parameter values to quantify differences in the output. An SA can be very compute-demanding because it requires re-processing the input dataset several times with different parameters to assess variations in output. In this work, we introduce strategies to efficiently speed up SA via runtime optimizations targeting distributed hybrid systems and reuse of computations from runs with different parameters. We evaluate our approach using a cancer image analysis workflow on a hybrid cluster with 256 nodes, each with an Intel Phi and a dual-socket CPU. The SA attained a parallel efficiency of over 90% on 256 nodes. The cooperative execution using the CPUs and the Phi available in each node with smart task assignment strategies resulted in an additional speedup of about 2×. Finally, multi-level computation reuse led to an additional speedup of up to 2.46× on the parallel version. The level of performance attained with the proposed optimizations will allow the use of SA in large-scale studies.
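
    The computation-reuse idea can be illustrated with a toy parameter sweep in which an expensive upstream stage is memoised and shared across downstream parameter combinations; the stage names and parameters are hypothetical, not the cancer image analysis workflow itself.

```python
from functools import lru_cache
from itertools import product

@lru_cache(maxsize=None)
def segment(threshold):
    """Expensive stage: runs once per distinct threshold and is reused afterwards."""
    return f"mask(threshold={threshold})"

def classify(mask, min_area):
    """Cheap downstream stage that depends on the segmentation result."""
    return f"{mask} | area>{min_area}"

results = {(t, a): classify(segment(t), a)
           for t, a in product([0.2, 0.4], [50, 100, 200])}

info = segment.cache_info()
print(len(results), "outputs from", info.misses, "segmentation runs")  # 6 outputs from 2 runs
```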

  3. Big Data Analysis of Manufacturing Processes

    Science.gov (United States)

    Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert

    2015-11-01

    The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results.
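    The abstract does not disclose the detection algorithms, so the following is only a generic illustration of automatic anomaly detection on process data: a rolling z-score flag on a single sensor signal, with synthetic data standing in for plant measurements.

```python
# Hedged illustration: rolling z-score anomaly flags on a synthetic sensor signal.
import numpy as np

rng = np.random.default_rng(1)
signal = np.sin(np.linspace(0, 20, 2000)) + rng.normal(0, 0.1, 2000)
signal[1500] += 3.0                      # injected fault

window = 100
anomalies = []
for i in range(window, len(signal)):
    hist = signal[i - window:i]
    z = (signal[i] - hist.mean()) / (hist.std() + 1e-9)
    if abs(z) > 4.0:                     # assumed alarm threshold
        anomalies.append(i)

print("anomalous samples:", anomalies)
```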

  4. Big Data Analysis of Manufacturing Processes

    International Nuclear Information System (INIS)

    Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert

    2015-01-01

    The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results. (paper)

  5. Boosting the IGCLC process efficiency by optimizing the desulfurization step

    International Nuclear Information System (INIS)

    Hamers, H.P.; Romano, M.C.; Spallina, V.; Chiesa, P.; Gallucci, F.; Sint Annaland, M. van

    2015-01-01

    Highlights: • Pre-CLC hot gas desulfurization and post-CLC desulfurization are assessed. • Process efficiency increases by 0.5–1% points with alternative desulfurization methods. • Alternative desulfurization methods are more beneficial for CFB configurations. - Abstract: In this paper the influence of the desulfurization method on the process efficiency of an integrated gasification chemical-looping combustion (IGCLC) system is investigated for both packed bed and circulating fluidized bed CLC systems. Both reactor types have been integrated in an IGCLC power plant, in which three desulfurization methods have been compared: conventional cold gas desulfurization with Selexol (CGD), hot gas desulfurization with ZnO (HGD) and flue gas desulfurization after the CLC reactors (post-CLC). For CLC with packed bed reactors, the efficiency gain of the alternative desulfurization methods is about 0.5–0.7% points. This is relatively small, because of the relatively large amount of steam that has to be mixed with the fuel to avoid carbon deposition on the oxygen carrier. The HGD and post-CLC configurations do not contain a saturator and therefore more steam has to be mixed in, with a negative influence on the process efficiency. Carbon deposition is not an issue for circulating fluidized bed systems, and therefore a somewhat higher efficiency gain of 0.8–1.0% points can be reached for this reactor system, assuming that complete fuel conversion can be reached and no sulfur species are formed on the solid, which is however thermodynamically possible for iron and manganese based oxygen carriers. From this study, it can be concluded that the adaptation of the desulfurization method results in higher process efficiencies, especially for the circulating fluidized bed system, while the number of operating units is reduced.

  6. Energy-efficient hierarchical processing in the network of wireless intelligent sensors (WISE)

    Science.gov (United States)

    Raskovic, Dejan

    Sensor network nodes have benefited from technological advances in the field of wireless communication, processing, and power sources. However, the processing power of microcontrollers is often not sufficient to perform sophisticated processing, while the power requirements of digital signal processing boards or handheld computers are usually too demanding for prolonged system use. We are matching the intrinsic hierarchical nature of many digital signal-processing applications with the natural hierarchy in distributed wireless networks, and building the hierarchical system of wireless intelligent sensors. Our goal is to build a system that will exploit the hierarchical organization to optimize the power consumption and extend battery life for the given time and memory constraints, while providing real-time processing of sensor signals. In addition, we are designing our system to be able to adapt to the current state of the environment, by dynamically changing the algorithm through procedure replacement. This dissertation presents the analysis of hierarchical environment and methods for energy profiling used to evaluate different system design strategies, and to optimize time-effective and energy-efficient processing.

  7. Analysis of efficiency of the Brazilian international airports

    Directory of Open Access Journals (Sweden)

    Marina Rodriguez Brochado

    2008-07-01

    Brazil. Analysis of correlations was used to select the variables most representative of this system; after that, the efficiency of the system was calculated for the classic and inverted frontiers by means of the output-oriented BCC model. Finally, it was possible to identify which improvements are necessary at the operational level for the inefficient airports to reach efficiency. Key-words: Data Envelopment Analysis, Efficiency, Airports.

  8. Eco-efficiency Analysis of Furniture Product Using Life Cycle Assessment

    Directory of Open Access Journals (Sweden)

    Ika Rinawati Dyah

    2018-01-01

    Full Text Available Furniture is one of Indonesia's main commodities, playing a strategic role in economic growth and employment in Indonesia. The production process generates many wastes, such as sawdust, wood off-cuts, components that do not conform to specifications, and the edges of wood from a log. In contrast with the timber requirements of the furniture industries, the availability of raw material sources is decreasing because of limited forest areas. Besides that, the use of electricity and chemical materials in the furniture production process has an impact on the environment. This study aims to assess the eco-cost and eco-efficiency ratio of the product so that strategic recommendations to improve the eco-efficiency of products can be designed. The results of data processing showed that the environmental costs of the furniture production process amount to Rp 30.887.84. The eco-efficiency index of the furniture products studied was 4.79, with an eco-efficiency ratio of 79.12%. This result means that the measured furniture products are already profitable and sustainable, and the production process is already fairly efficient. However, the performance of the production process can still be improved to increase the eco-efficiency by minimizing the use of raw materials.

  9. Eco-efficiency Analysis of Furniture Product Using Life Cycle Assessment

    Science.gov (United States)

    Rinawati, Dyah Ika; Sriyanto; Sari, Diana Puspita; Prayodha, Andana Cantya

    2018-02-01

    Furniture is one of Indonesia's main commodities, playing a strategic role in economic growth and employment in Indonesia. The production process generates many wastes, such as sawdust, wood off-cuts, components that do not conform to specifications, and the edges of wood from a log. In contrast with the timber requirements of the furniture industries, the availability of raw material sources is decreasing because of limited forest areas. Besides that, the use of electricity and chemical materials in the furniture production process has an impact on the environment. This study aims to assess the eco-cost and eco-efficiency ratio of the product so that strategic recommendations to improve the eco-efficiency of products can be designed. The results of data processing showed that the environmental costs of the furniture production process amount to Rp 30.887.84. The eco-efficiency index of the furniture products studied was 4.79, with an eco-efficiency ratio of 79.12%. This result means that the measured furniture products are already profitable and sustainable, and the production process is already fairly efficient. However, the performance of the production process can still be improved to increase the eco-efficiency by minimizing the use of raw materials.

  10. Profitability Analysis of Rice Processing and Marketing in Kano State ...

    African Journals Online (AJOL)

    Profitability Analysis of Rice Processing and Marketing in Kano State, Nigeria. ... added to the commodity at each stage in the study area and determine the most efficient services produce. ...

  11. Workflow efficiency for the treatment planning process in CT-guided high-dose-rate brachytherapy for cervical cancer.

    Science.gov (United States)

    Michaud, Anthony L; Benedict, Stanley; Montemayor, Eliseo; Hunt, Jon Paul; Wright, Cari; Mathai, Mathew; Mayadev, Jyoti S

    2016-01-01

    To investigate process efficiency, we present a prospective investigation of the treatment planning phase of image-guided brachytherapy (BT) for cervical cancer using a specific checklist. From October 2012 to January 2014, 76 BT procedures were consecutively performed. Prospective data on the CT-based treatment planning process was collected using a specific checklist which details the following steps: (1) dosimetry planning, (2) physician review start, (3) physician review time, (4) dosimetry processing, (5) physics review start, (6) physics review, and (7) procedural pause. Variables examined included the use of a pre-BT MRI, clinic duty conflicts, resident teaching, and the use of specific BT planners. Analysis was performed using descriptive statistics, t-test, and analysis of variance. Seventy-five prospectively gathered checklists comprised this analysis. The mean time for treatment planning was 95 minutes (med 94, std 18). The mean intervals in the above steps were (1) = 42, (2) = 5, (3) = 19, (4) = 10, (5) = 6, (6) = 13, and (7) = 26 minutes. There was no statistical difference in patients who had a pre-BT MRI. Resident teaching did not influence time, p = 0.17. Treatment planning time was decreased with a specific planner, p = 0.0015. A skillful team approach is required for treatment planning efficiency in image-guided BT. We have found that the specific BT planners can have a significant effect on the overall planning efficiency. We continue to examine clinical and workflow-related factors that will enhance our safety and workflow process with BT. Published by Elsevier Inc.

  12. Efficient collective influence maximization in cascading processes with first-order transitions

    Science.gov (United States)

    Pei, Sen; Teng, Xian; Shaman, Jeffrey; Morone, Flaviano; Makse, Hernán A.

    2017-01-01

    In many social and biological networks, the collective dynamics of the entire system can be shaped by a small set of influential units through a global cascading process, manifested by an abrupt first-order transition in dynamical behaviors. Despite its importance in applications, efficient identification of multiple influential spreaders in cascading processes still remains a challenging task for large-scale networks. Here we address this issue by exploring the collective influence in general threshold models of cascading process. Our analysis reveals that the importance of spreaders is fixed by the subcritical paths along which cascades propagate: the number of subcritical paths attached to each spreader determines its contribution to global cascades. The concept of subcritical path allows us to introduce a scalable algorithm for massively large-scale networks. Results in both synthetic random graphs and real networks show that the proposed method can achieve larger collective influence given the same number of seeds compared with other scalable heuristic approaches. PMID:28349988
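    A minimal sketch of the setting (a cascade in a threshold model on a random graph, seeded by a naive high-degree heuristic), not the collective-influence algorithm proposed in the paper; parameters are illustrative.

```python
# Hedged sketch: linear-threshold cascade seeded by the highest-degree nodes.
import networkx as nx

def threshold_cascade(G, seeds, theta=0.3):
    """Activate seeds, then any node whose active-neighbor fraction >= theta."""
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for v in G.nodes:
            if v in active:
                continue
            nbrs = list(G.neighbors(v))
            if nbrs and sum(u in active for u in nbrs) / len(nbrs) >= theta:
                active.add(v)
                changed = True
    return active

if __name__ == "__main__":
    G = nx.erdos_renyi_graph(1000, 0.01, seed=0)
    seeds = sorted(G.nodes, key=G.degree, reverse=True)[:20]   # naive baseline seeding
    print("cascade size:", len(threshold_cascade(G, seeds)))
```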

  13. Origin of poor doping efficiency in solution processed organic semiconductors.

    Science.gov (United States)

    Jha, Ajay; Duan, Hong-Guang; Tiwari, Vandana; Thorwart, Michael; Miller, R J Dwayne

    2018-05-21

    Doping is an extremely important process where intentional insertion of impurities in semiconductors controls their electronic properties. In organic semiconductors, one of the convenient, but inefficient, ways of doping is the spin casting of a precursor mixture of components in solution, followed by solvent evaporation. Active control over this process holds the key to significant improvements over current poor doping efficiencies. Yet, an optimized control can only come from a detailed understanding of electronic interactions responsible for the low doping efficiencies. Here, we use two-dimensional nonlinear optical spectroscopy to examine these interactions in the course of the doping process by probing the solution mixture of doped organic semiconductors. A dopant accepts an electron from the semiconductor and the two ions form a duplex of interacting charges known as ion-pair complexes. Well-resolved off-diagonal peaks in the two-dimensional spectra clearly demonstrate the electronic connectivity among the ions in solution. This electronic interaction represents a well resolved electrostatically bound state, as opposed to a random distribution of ions. We developed a theoretical model to recover the experimental data, which reveals an unexpectedly strong electronic coupling of ∼250 cm⁻¹ with an intermolecular distance of ∼4.5 Å between ions in solution, which is approximately the expected distance in processed films. The fact that this relationship persists from solution to the processed film gives direct evidence that Coulomb interactions are retained from the precursor solution to the processed films. This memory effect renders the charge carriers equally bound also in the film and, hence, results in poor doping efficiencies. This new insight will help pave the way towards rational tailoring of the electronic interactions to improve doping efficiencies in processed organic semiconductor thin films.

  14. On the Use of Student Data in Efficiency Analysis--Technical Efficiency in Swedish Upper Secondary School

    Science.gov (United States)

    Waldo, Staffan

    2007-01-01

    While individual data form the base for much empirical analysis in education, this is not the case for analysis of technical efficiency. In this paper, efficiency is estimated using individual data which is then aggregated to larger groups of students. Using an individual approach to technical efficiency makes it possible to carry out studies on a…

  15. Risk analysis: opening the process

    International Nuclear Information System (INIS)

    Hubert, Ph.; Mays, C.

    1998-01-01

    This conference on risk analysis took place in Paris, 11-14 October 1999. Over 200 papers were presented in the seven following sessions: perception; environment and health; persuasive risks; objects and products; personal and collective involvement; assessment and valuation; management. A rational approach to risk analysis has been developed in the three last decades. Techniques for risk assessment have been thoroughly enhanced, risk management approaches have been developed, decision making processes have been clarified, and the social dimensions of risk perception and management have been investigated. Nevertheless this construction is being challenged by recent events which reveal how deficits in stakeholder involvement, openness and democratic procedures can undermine risk management actions. Indeed, most components of the global risk analysis process may be radically called into question. Food safety has lately been a prominent issue, but now debates appear, or old debates are revisited, in the domains of public health, consumer products safety, waste management, environmental risks, nuclear installations, automobile safety and pollution. To meet the growing pressures for efficiency, openness, accountability, and multi-partner communication in risk analysis, institutional changes are underway in many European countries. However, the need for stakeholders to develop better insight into the process may lead to an evolution of all the components of risk analysis, even in its most 'technical' steps. For stakeholders of different professional backgrounds, political projects, and responsibilities, risk identification procedures must be rendered understandable, and quantitative risk assessment must be intelligible and accommodated in action proposals, ranging from countermeasures to educational programs to insurance mechanisms. Management formats must be open to local and political input and other types of operational feedback. (authors)

  16. Analysis of the external and internal quantum efficiency of multi-emitter, white organic light emitting diodes

    Science.gov (United States)

    Furno, Mauro; Rosenow, Thomas C.; Gather, Malte C.; Lüssem, Björn; Leo, Karl

    2012-10-01

    We report on a theoretical framework for the efficiency analysis of complex, multi-emitter organic light emitting diodes (OLEDs). The calculation approach makes use of electromagnetic modeling to quantify the overall OLED photon outcoupling efficiency and a phenomenological description for electrical and excitonic processes. From the comparison of optical modeling results and measurements of the total external quantum efficiency, we obtain reliable estimates of internal quantum yield. As application of the model, we analyze high-efficiency stacked white OLEDs and comment on the various efficiency loss channels present in the devices.

  17. Operating Room Efficiency before and after Entrance in a Benchmarking Program for Surgical Process Data.

    Science.gov (United States)

    Pedron, Sara; Winter, Vera; Oppel, Eva-Maria; Bialas, Enno

    2017-08-23

    Operating room (OR) efficiency continues to be a high priority for hospitals. In this context the concept of benchmarking has gained increasing importance as a means to improve OR performance. The aim of this study was to investigate whether and how participation in a benchmarking and reporting program for surgical process data was associated with a change in OR efficiency, measured through raw utilization, turnover times, and first-case tardiness. The main analysis is based on panel data from 202 surgical departments in German hospitals, which were derived from the largest database for surgical process data in Germany. Panel regression modelling was applied. Results revealed no clear and univocal trend of participation in a benchmarking and reporting program for surgical process data. The largest trend was observed for first-case tardiness. In contrast to expectations, turnover times showed a generally increasing trend during participation. For raw utilization no clear and statistically significant trend could be evidenced. Subgroup analyses revealed differences in effects across different hospital types and department specialties. Participation in a benchmarking and reporting program and thus the availability of reliable, timely and detailed analysis tools to support the OR management seemed to be correlated especially with an increase in the timeliness of staff members regarding first-case starts. The increasing trend in turnover time revealed the absence of effective strategies to improve this aspect of OR efficiency in German hospitals and could have meaningful consequences for the medium- and long-run capacity planning in the OR.

  18. Predictive information speeds up visual awareness in an individuation task by modulating threshold setting, not processing efficiency.

    Science.gov (United States)

    De Loof, Esther; Van Opstal, Filip; Verguts, Tom

    2016-04-01

    Theories on visual awareness claim that predicted stimuli reach awareness faster than unpredicted ones. In the current study, we disentangle whether prior information about the upcoming stimulus affects visual awareness of stimulus location (i.e., individuation) by modulating processing efficiency or threshold setting. Analogous research on stimulus identification revealed that prior information modulates threshold setting. However, as identification and individuation are two functionally and neurally distinct processes, the mechanisms underlying identification cannot simply be extrapolated directly to individuation. The goal of this study was therefore to investigate how individuation is influenced by prior information about the upcoming stimulus. To do so, a drift diffusion model was fitted to estimate the processing efficiency and threshold setting for predicted versus unpredicted stimuli in a cued individuation paradigm. Participants were asked to locate a picture, following a cue that was congruent, incongruent or neutral with respect to the picture's identity. Pictures were individuated faster in the congruent and neutral condition compared to the incongruent condition. In the diffusion model analysis, the processing efficiency was not significantly different across conditions. However, the threshold setting was significantly higher following an incongruent cue compared to both congruent and neutral cues. Our results indicate that predictive information about the upcoming stimulus influences visual awareness by shifting the threshold for individuation rather than by enhancing processing efficiency. Copyright © 2016 Elsevier Ltd. All rights reserved.
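    A minimal simulation sketch of the interpretation above, assuming a basic drift diffusion model: keeping the drift rate (processing efficiency) fixed and raising only the decision threshold lengthens mean response time. Parameter values are illustrative, not the fitted ones from the study.

```python
# Hedged sketch: simple DDM simulation; higher threshold, same drift, slower responses.
import numpy as np

def mean_rt(drift, threshold, n_trials=1000, dt=0.001, noise=1.0, seed=0):
    rng = np.random.default_rng(seed)
    rts = []
    for _ in range(n_trials):
        x, t = 0.0, 0.0
        while abs(x) < threshold:                 # accumulate evidence to either bound
            x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
            t += dt
        rts.append(t)
    return float(np.mean(rts))

if __name__ == "__main__":
    print("congruent/neutral cue (lower threshold):", mean_rt(drift=1.5, threshold=1.0))
    print("incongruent cue (higher threshold):     ", mean_rt(drift=1.5, threshold=1.4))
```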

  19. Process improvement methods increase the efficiency, accuracy, and utility of a neurocritical care research repository.

    Science.gov (United States)

    O'Connor, Sydney; Ayres, Alison; Cortellini, Lynelle; Rosand, Jonathan; Rosenthal, Eric; Kimberly, W Taylor

    2012-08-01

    Reliable and efficient data repositories are essential for the advancement of research in Neurocritical care. Various factors, such as the large volume of patients treated within the neuro ICU, their differing length and complexity of hospital stay, and the substantial amount of desired information can complicate the process of data collection. We adapted the tools of process improvement to the data collection and database design of a research repository for a Neuroscience intensive care unit. By the Shewhart-Deming method, we implemented an iterative approach to improve the process of data collection for each element. After an initial design phase, we re-evaluated all data fields that were challenging or time-consuming to collect. We then applied root-cause analysis to optimize the accuracy and ease of collection, and to determine the most efficient manner of collecting the maximal amount of data. During a 6-month period, we iteratively analyzed the process of data collection for various data elements. For example, the pre-admission medications were found to contain numerous inaccuracies after comparison with a gold standard (sensitivity 71% and specificity 94%). Also, our first method of tracking patient admissions and discharges contained higher than expected errors (sensitivity 94% and specificity 93%). In addition to increasing accuracy, we focused on improving efficiency. Through repeated incremental improvements, we reduced the number of subject records that required daily monitoring from 40 to 6 per day, and decreased daily effort from 4.5 to 1.5 h/day. By applying process improvement methods to the design of a Neuroscience ICU data repository, we achieved a threefold improvement in efficiency and increased accuracy. Although individual barriers to data collection will vary from institution to institution, a focus on process improvement is critical to overcoming these barriers.

  20. Quantum efficiency harmonic analysis of exciton annihilation in organic light emitting diodes

    Energy Technology Data Exchange (ETDEWEB)

    Price, J. S.; Giebink, N. C., E-mail: ncg2@psu.edu [Department of Electrical Engineering, The Pennsylvania State University, University Park, Pennsylvania 16802 (United States)

    2015-06-29

    Various exciton annihilation processes are known to impact the efficiency roll-off of organic light emitting diodes (OLEDs); however, isolating and quantifying their contribution in the presence of other factors such as changing charge balance continue to be a challenge for routine device characterization. Here, we analyze OLED electroluminescence resulting from a sinusoidal dither superimposed on the device bias and show that nonlinearity between recombination current and light output arising from annihilation mixes the quantum efficiency measured at different dither harmonics in a manner that depends uniquely on the type and magnitude of the annihilation process. We derive a series of analytical relations involving the DC and first harmonic external quantum efficiency that enable annihilation rates to be quantified through linear regression independent of changing charge balance and evaluate them for prototypical fluorescent and phosphorescent OLEDs based on the emitters 4-(dicyanomethylene)-2-methyl-6-(4-dimethylaminostyryl)-4H-pyran and platinum octaethylporphyrin, respectively. We go on to show that, in most cases, it is sufficient to calculate the needed quantum efficiency harmonics directly from derivatives of the DC light versus current curve, thus enabling this analysis to be conducted solely from standard light-current-voltage measurement data.
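    A minimal numerical sketch of the closing remark, assuming a measured light-versus-current curve: the DC quantum efficiency scales with L/I while the small-signal (first-harmonic) quantity scales with dL/dI, so both can be estimated from an ordinary L-I sweep. The toy curve and proportionality factors are illustrative only.

```python
# Hedged sketch: DC vs first-harmonic efficiency estimated from an L-I curve.
import numpy as np

I = np.linspace(1e-4, 0.1, 500)        # drive current (A), synthetic sweep
L = 0.9 * I - 2.0 * I**2               # toy light output with efficiency roll-off

eqe_dc = L / I                         # proportional to the DC external QE
eqe_h1 = np.gradient(L, I)             # proportional to the first-harmonic (small-signal) QE

# The gap between the two curves grows as annihilation-driven roll-off sets in.
for k in range(0, len(I), 100):
    print(f"I = {I[k]:.3f} A   DC ~ {eqe_dc[k]:.3f}   H1 ~ {eqe_h1[k]:.3f}")
```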

  1. Prediction of strontium bromide laser efficiency using cluster and decision tree analysis

    Directory of Open Access Journals (Sweden)

    Iliev Iliycho

    2018-01-01

    Full Text Available The subject of investigation is a new high-powered strontium bromide (SrBr2) vapor laser emitting in a multiline region of wavelengths. The laser is an alternative to atomic strontium lasers and free-electron lasers, especially at the 6.45 μm line, which is used in surgery for medical processing of biological tissues and bones with minimal damage. In this paper the experimental data from measurements of operational and output characteristics of the laser are statistically processed by means of cluster analysis and tree-based regression techniques. The aim is to extract from the available data the more important relationships and dependences which influence the increase of the overall laser efficiency. A set of cluster models is constructed and analyzed. It is shown, by using different cluster methods, that the seven investigated operational characteristics (laser tube diameter, length, supplied electrical power, and others) and laser efficiency are combined in two clusters. Regression tree models built using the Classification and Regression Trees (CART) technique yield dependences for predicting the values of efficiency, and especially the maximum efficiency, with over 95% accuracy.
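    A minimal sketch of the modelling step (a CART regression tree predicting efficiency from operating parameters), using scikit-learn on synthetic data; the feature names and the toy relationship stand in for the paper's measurements.

```python
# Hedged sketch: CART regression on synthetic laser operating parameters.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.uniform(2, 8, n),       # tube diameter (cm), hypothetical range
    rng.uniform(40, 120, n),    # tube length (cm), hypothetical range
    rng.uniform(1, 5, n),       # supplied electrical power (kW), hypothetical range
])
y = 0.02 * X[:, 2] + 0.001 * X[:, 1] + rng.normal(0, 0.01, n)   # toy efficiency response

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out data:", round(tree.score(X_te, y_te), 3))
```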

  2. Measuring sustainability by Energy Efficiency Analysis for Korean Power Companies: A Sequential Slacks-Based Efficiency Measure

    Directory of Open Access Journals (Sweden)

    Ning Zhang

    2014-03-01

    Full Text Available Improving energy efficiency has been widely regarded as one of the most cost-effective ways to improve sustainability and mitigate climate change. This paper presents a sequential slack-based efficiency measure (SSBM application to model total-factor energy efficiency with undesirable outputs. This approach simultaneously takes into account the sequential environmental technology, total input slacks, and undesirable outputs for energy efficiency analysis. We conduct an empirical analysis of energy efficiency incorporating greenhouse gas emissions of Korean power companies during 2007–2011. The results indicate that most of the power companies are not performing at high energy efficiency. Sequential technology has a significant effect on the energy efficiency measurements. Some policy suggestions based on the empirical results are also presented.

  3. Manufacturing polymer light emitting diode with high luminance efficiency by solution process

    Science.gov (United States)

    Kim, Miyoung; Jo, SongJin; Yang, Ho Chang; Yoon, Dang Mo; Kwon, Jae-Taek; Lee, Seung-Hyun; Choi, Ju Hwan; Lee, Bum-Joo; Shin, Jin-Koog

    2012-06-01

    In polymer light emitting diodes (polymer-LEDs) fabricated by a solution process, surface roughness influences the electro-optical (E-O) characteristics. We expect that E-O characteristics such as luminance and power efficiency are related to the surface roughness and thickness of the emitting layer with poly-9-vinylcarbazole. In this study, we fabricated polymer organic light emitting diodes by a solution process, which guarantees easy, eco-friendly and low-cost manufacturing for flexible display applications. In order to obtain high luminescence efficiency, the E-O characteristics of these devices were investigated by varying the parameters of the printing process. We optimized the process conditions for polymer-LEDs by adjusting the annealing temperature and the thickness of the emission layer, obtaining an efficiency of 10.8 cd/A at 10 mA/cm2. We also examined the wavelength-dependent electroluminescence spectrum in order to find the correlation between the variation of efficiency and the thickness of the layer.

  4. Hierarchically structured exergetic and exergoeconomic analysis and evaluation of energy conversion processes

    International Nuclear Information System (INIS)

    Hebecker, Dietrich; Bittrich, Petra; Riedl, Karsten

    2005-01-01

    Evaluation of the efficiency and economic benefit of energy conversion processes and technologies requires a scientifically based analysis. The hierarchically structured exergetic analysis provides a detailed characterization of complex technical systems. By defining corresponding evaluation coefficients, the exergetic efficiency can be assessed for units within the whole system. Based on this exergetic analysis, a thermoeconomic evaluation method is developed. A cost function is defined for all units, subsystems and the total plant, so that the cost flow in the system can be calculated. Three dimensionless coefficients, the Pauer factor, the loss coefficient and the cost factor, enable pinpointing cost intensive process units, allocating cost in cases of co-production and gaining insight for future design improvements. The methodology is demonstrated by a biomass gasification plant producing electricity, heat and cold

  5. Milk pasteurization: efficiency of the HTST process according to its bacterial concentration

    Directory of Open Access Journals (Sweden)

    Ari Ajzental

    1993-12-01

    Full Text Available The efficiency of milk pasteurization (HTST) in relation to standard plate count (SPC) values was assessed in 41 milk samples using laboratory-designed pasteurizing equipment. Based on the results, it is demonstrated that the efficiency of the process is affected by the bacterial concentration, with lower SPC values meaning a decrease in efficiency, and that the performance of the process is not affected in the presence of high SPC values in the raw product.

  6. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.

    1996-01-01

    Structural integrity analysis forms an important role in assessing and demonstrating the safety of nuclear reactor components. This analysis is performed using analytical tools such as Finite Element Method (FEM) with the help of digital computers. The complexity of the problems involved in nuclear engineering demands high speed computation facilities to obtain solutions in reasonable amount of time. Parallel processing systems such as ANUPAM provide an efficient platform for realising the high speed computation. The development and implementation of software on parallel processing systems is an interesting and challenging task. The data and algorithm structure of the codes plays an important role in exploiting the parallel processing system capabilities. Structural analysis codes based on FEM can be divided into two categories with respect to their implementation on parallel processing systems. The first category codes such as those used for harmonic analysis, mechanistic fuel performance codes need not require the parallelisation of individual modules of the codes. The second category of codes such as conventional FEM codes require parallelisation of individual modules. In this category, parallelisation of equation solution module poses major difficulties. Different solution schemes such as domain decomposition method (DDM), parallel active column solver and substructuring method are currently used on parallel processing systems. Two codes, FAIR and TABS belonging to each of these categories have been implemented on ANUPAM. The implementation details of these codes and the performance of different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab

  7. Formation of the Integral Ecological Quality Index of the Technological Processes in Machine Building Based on Their Energy Efficiency

    Science.gov (United States)

    Egorov, Sergey B.; Kapitanov, Alexey V.; Mitrofanov, Vladimir G.; Shvartsburg, Leonid E.; Ivanova, Natalia A.; Ryabov, Sergey A.

    2016-01-01

    The aim of the article is to develop a unified assessment methodology in relation to various technological processes and the actual conditions of their implementation, and to carry out the energy efficiency analysis of the technological processes through comparison of the established power and the power consumed by the actual technological…

  8. Product Chemistry and Process Efficiency of Biomass Torrefaction, Pyrolysis and Gasification Studied by High-Throughput Techniques and Multivariate Analysis

    Science.gov (United States)

    Xiao, Li

    Despite the great passion and endless efforts on the development of renewable energy from biomass, the commercialization and scale-up of biofuel production is still under pressure and facing challenges. New ideas and facilities are being tested around the world targeting reduced cost and improved product value. Cutting-edge technologies involving analytical chemistry, statistical analysis, industrial engineering, computer simulation, and mathematical modeling keep integrating modern elements into this classic research. One of the challenges of commercializing biofuel production is the complexity of the chemical composition of biomass feedstocks and the products. Because of this, feedstock selection and process optimization cannot be conducted efficiently. This dissertation attempts to further evaluate the biomass thermal decomposition process using both traditional methods and an advanced technique (Pyrolysis Molecular Beam Mass Spectrometry). Focus has been placed on generating a database of thermal decomposition products from biomass at different temperatures, finding the relationship between traditional methods and advanced techniques, evaluating process efficiency and optimizing reaction conditions, comparing typically utilized biomass feedstocks and searching for innovative species for economically viable feedstock preparation concepts, etc. Lab-scale quartz tube reactors and 80 μL stainless steel sample cups coupled with an auto-sampling system were utilized to simulate the complicated reactions that happen in real fluidized or entrained flow reactors. The two main high-throughput analytical techniques used are Near Infrared Spectroscopy (NIR) and Pyrolysis Molecular Beam Mass Spectrometry (Py-MBMS). Mass balance, carbon balance, and product distribution are presented in detail. The thermal decomposition temperature ranges from 200°C to 950°C. Feedstocks used in the study involve typical hardwood and softwood (red oak, white oak, yellow poplar, loblolly pine

  9. Efficient Adoption and Assessment of Multiple Process Improvement Reference Models

    Directory of Open Access Journals (Sweden)

    Simona Jeners

    2013-06-01

    Full Text Available A variety of reference models such as CMMI, COBIT or ITIL support IT organizations in improving their processes. These process improvement reference models (IRMs) cover different domains such as IT development, IT Services or IT Governance but also share some similarities. As there are organizations that address multiple domains and need to coordinate their processes in their improvement, we present MoSaIC, an approach to support organizations in efficiently adopting and conforming to multiple IRMs. Our solution realizes a semantic integration of IRMs based on common meta-models. The resulting IRM integration model enables organizations to efficiently implement and assess multiple IRMs and to benefit from synergy effects.

  10. The Efficient Separations and Processing Integrated Program

    International Nuclear Information System (INIS)

    Kuhn, W.L.; Gephart, J.M.

    1994-08-01

    The Efficient Separations and Processing Integrated Program (ESPIP) was created in 1991 to identify, develop, and perfect separations technologies and processes to treat wastes and address environmental problems throughout the US Department of Energy (DOE) complex. The ESPIP funds several multiyear tasks that address high-priority waste remediation problems involving high-level, low-level, transuranic, hazardous, and mixed (radioactive and hazardous) wastes. The ESPIP supports applied R&D leading to demonstration or use of these separations technologies by other organizations within DOE's Office of Environmental Restoration and Waste Management. Examples of current ESPIP-funded separations technologies are described here.

  11. Economic Analysis of Factors Affecting Technical Efficiency of ...

    African Journals Online (AJOL)

    Economic Analysis of Factors Affecting Technical Efficiency of Smallholders ... socio-economic characteristics which influence technical efficiency in maize production. ... Ministry of Agriculture and livestock, records, books, reports and internet.

  12. Efficient Processing of Multiple DTW Queries in Time Series Databases

    DEFF Research Database (Denmark)

    Kremer, Hardy; Günnemann, Stephan; Ivanescu, Anca-Maria

    2011-01-01

    . In many of today’s applications, however, large numbers of queries arise at any given time. Existing DTW techniques do not process multiple DTW queries simultaneously, a serious limitation which slows down overall processing. In this paper, we propose an efficient processing approach for multiple DTW...... for multiple DTW queries....
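    For context, a minimal sketch of the classical DTW distance itself (the measure these queries are based on); it is not the multi-query processing approach proposed in the paper.

```python
# Hedged sketch: classical O(n*m) dynamic time warping with absolute-difference cost.
import numpy as np

def dtw_distance(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

if __name__ == "__main__":
    q = np.sin(np.linspace(0, 2 * np.pi, 50))            # query series
    s = np.sin(np.linspace(0, 2 * np.pi, 70) + 0.3)      # warped/shifted candidate
    print(round(dtw_distance(q, s), 3))
```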

  13. Clean and efficient energy conversion processes (Cecon-project). Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-12-31

    The objectives of the work programme reported are the development and testing of two optimised energy conversion processes, both consisting of a radiant surface gas burner and a ceramic heat exchanger. The first sub-objective of the programme is related to industrial heating, drying and curing processes requiring low and medium heat fluxes. It is estimated that around one tenth of the total EC industrial energy use is associated with such processes. The majority of these processes currently use convection and conduction as the main heat transfer mechanisms and overall energy efficiencies are typically below 25%. For many drying and finishing processes (such as curing powder coatings and drying paints, varnishes, inks, and for the fabrication of paper and textiles), radiant heating can achieve much faster drying rates and higher energy efficiency than convective heating. In the project new concepts of natural gas fired radiant heating have been investigated which would be much more efficient than the existing processes. One element of the programme was the development of gas burners having enhanced radiant efficiencies. A second concerned the investigation of the safety of gas burners containing significant volumes of mixed gas and air. Finally the new gas burners were tested in combination with the high temperature heat exchanger to create highly efficient radiant heating systems. The second sub-objective concerned the development of a compact low cost heat exchanger capable of achieving high levels of heat recovery (up to 60%) which could be easily installed on industrial processes. This would make heat recovery a practical proposition on processes where existing heat recovery technology is currently not cost effective. The project will have an impact on industrial processes consuming around 80 MTOE of energy per year within EU countries (1 MTOE equals 41.8 PJ). The overall energy saving potential of the project is estimated to be around 22 MTOE which is around 10

  14. Smoothed Particle Hydro-dynamic Analysis of Improvement in Sludge Conveyance Efficiency of Screw Decanter Centrifuge

    Energy Technology Data Exchange (ETDEWEB)

    Park, Dae Woong [Korea Testing and Research Institute, Kwachun (Korea, Republic of)

    2015-03-15

    A centrifuge works on the principle that particles with different densities will separate at a rate proportional to the centrifugal force during high-speed rotation. Dense particles are quickly precipitated, and particles with relatively smaller densities are precipitated more slowly. A decanter-type centrifuge is used to remove, concentrate, and dehydrate sludge in a water treatment process. This is a core technology for measuring the sludge conveyance efficiency improvement. In this study, a smoothed particle hydro-dynamic analysis was performed for a decanter centrifuge used to convey sludge to evaluate the efficiency improvement. This analysis was applied to both the original centrifugal model and the design change model, which was a ball-plate rail model, to evaluate the sludge transfer efficiency.

  15. Smoothed Particle Hydro-dynamic Analysis of Improvement in Sludge Conveyance Efficiency of Screw Decanter Centrifuge

    International Nuclear Information System (INIS)

    Park, Dae Woong

    2015-01-01

    A centrifuge works on the principle that particles with different densities will separate at a rate proportional to the centrifugal force during high-speed rotation. Dense particles are quickly precipitated, and particles with relatively smaller densities are precipitated more slowly. A decanter-type centrifuge is used to remove, concentrate, and dehydrate sludge in a water treatment process. This is a core technology for measuring the sludge conveyance efficiency improvement. In this study, a smoothed particle hydro-dynamic analysis was performed for a decanter centrifuge used to convey sludge to evaluate the efficiency improvement. This analysis was applied to both the original centrifugal model and the design change model, which was a ball-plate rail model, to evaluate the sludge transfer efficiency.

  16. Operating Room Efficiency before and after Entrance in a Benchmarking Program for Surgical Process Data

    DEFF Research Database (Denmark)

    Pedron, Sara; Winter, Vera; Oppel, Eva-Maria

    2017-01-01

    Operating room (OR) efficiency continues to be a high priority for hospitals. In this context the concept of benchmarking has gained increasing importance as a means to improve OR performance. The aim of this study was to investigate whether and how participation in a benchmarking and reporting...... program for surgical process data was associated with a change in OR efficiency, measured through raw utilization, turnover times, and first-case tardiness. The main analysis is based on panel data from 202 surgical departments in German hospitals, which were derived from the largest database for surgical...... the availability of reliable, timely and detailed analysis tools to support the OR management seemed to be correlated especially with an increase in the timeliness of staff members regarding first-case starts. The increasing trend in turnover time revealed the absence of effective strategies to improve this aspect...

  17. Energy saving analysis and management modeling based on index decomposition analysis integrated energy saving potential method: Application to complex chemical processes

    International Nuclear Information System (INIS)

    Geng, Zhiqiang; Gao, Huachao; Wang, Yanqing; Han, Yongming; Zhu, Qunxiong

    2017-01-01

    Highlights: • An integrated framework that combines IDA with the energy-saving potential method is proposed. • An energy saving analysis and management framework for complex chemical processes is obtained. • The proposed method is effective for energy optimization and carbon emission reduction in complex chemical processes. - Abstract: Energy saving and management of complex chemical processes play a crucial role in the sustainable development procedure. In order to analyze the effect that technology, management level, and production structure have on energy efficiency and energy saving potential, this paper proposes a novel integrated framework that combines index decomposition analysis (IDA) with the energy saving potential method. The IDA method can effectively obtain the energy activity, energy hierarchy and energy intensity levels in a data-driven way to reflect the impact of energy usage. The energy saving potential method can verify the correctness of the improvement direction proposed by the IDA method. Meanwhile, energy efficiency improvement, energy consumption reduction and energy savings can be visually discovered by the proposed framework. The demonstration analysis of ethylene production has verified the practicality of the proposed method. Moreover, we can obtain the corresponding improvements for ethylene production based on the demonstration analysis. The energy efficiency index and the energy saving potential of the worst-performing months can be increased by 6.7% and 7.4%, respectively, and the carbon emissions can be reduced by 7.4–8.2%.
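    A minimal sketch of one index decomposition step, using the additive LMDI formula as a common IDA variant (the abstract does not state which IDA form is used); the activity and intensity numbers are illustrative, not from the ethylene case study.

```python
# Hedged sketch: additive LMDI decomposition of a change in energy use
# E = activity * intensity into an activity effect and an intensity effect.
import math

def lmdi_term(x0, x1, e0, e1):
    """Additive LMDI contribution of one factor: L(E1, E0) * ln(x1/x0)."""
    L = (e1 - e0) / (math.log(e1) - math.log(e0)) if e1 != e0 else e1   # logarithmic mean
    return L * math.log(x1 / x0)

activity0, activity1 = 100.0, 110.0          # e.g. tonnes of product (illustrative)
intensity0, intensity1 = 2.0, 1.8            # energy per tonne (illustrative)
E0, E1 = activity0 * intensity0, activity1 * intensity1

d_activity = lmdi_term(activity0, activity1, E0, E1)
d_intensity = lmdi_term(intensity0, intensity1, E0, E1)
print(E1 - E0, d_activity + d_intensity)     # the decomposition sums to the total change
```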

  18. Multidirectional analysis of technical efficiency for pig production systems

    DEFF Research Database (Denmark)

    Labajavo, Katarina; Hansson, Helena; Asmild, Mette

    2016-01-01

    Declining profitability and ongoing structural changes in the pig sector require thorough efficiency analysis of individual production factors. In this study we calculated technical efficiency indices for each input and output using multidirectional efficiency analysis and examined the relationship...... between ‘farm-specific characteristics’ and input and output technical efficiencies by production type (piglet, growing-finishing, finish-to-farrow). The results indicated that advisory services and farm location were not significantly correlated with technical efficiency. Similar results were obtained...... for ‘housing practices’, with the exception of the latest technology such as heated floors in relation to input labour technical efficiency for growing-finishing and finish-to-farrow productions. Use of written instructions for feeding for growing-finishing and finish-to-farrow production and written...

  19. ANALYSIS OF RESOURCE USE EFFICIENCY AMONG SOYBEAN ...

    African Journals Online (AJOL)

    2015-02-25

    Feb 25, 2015 ... KEYWORDS: Analysis, Resource use efficiency, Farmers, production function analysis, Benue, Nigeria. ... Soybean seeds also contain about 20% oil on a dry matter basis, and this ..... Manual for training in Seed Technology.

  20. Measuring and explaining eco-efficiencies of wastewater treatment plants in China: An uncertainty analysis perspective.

    Science.gov (United States)

    Dong, Xin; Zhang, Xinyi; Zeng, Siyu

    2017-04-01

    In the context of sustainable development, there has been an increasing requirement for an eco-efficiency assessment of wastewater treatment plants (WWTPs). Data envelopment analysis (DEA), a technique that is widely applied for relative efficiency assessment, is used in combination with the tolerances approach to handle WWTPs' multiple inputs and outputs as well as their uncertainty. The economic cost, energy consumption, contaminant removal, and global warming effect during the treatment processes are integrated to interpret the eco-efficiency of WWTPs. A total of 736 sample plants from across China are assessed, and large sensitivities to variations in inputs and outputs are observed for most samples, with only three WWTPs identified as being stably efficient. Size of plant, overcapacity, climate type, and influent characteristics are proven to have a significant influence on both the mean efficiency and performance sensitivity of WWTPs, while no clear relationships were found between eco-efficiency and technology under the framework of uncertainty analysis. The incorporation of uncertainty quantification and environmental impact consideration has improved the liability and applicability of the assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Automation and efficiency in the operational processes: a case study in a logistics operator

    OpenAIRE

    Nascimento, Dener Gomes do; Silva, Giovanni Henrique da

    2017-01-01

    Globalization has made automation increasingly feasible, and with technological development many operations can be optimized, bringing productivity gains. Logistics is a major beneficiary of all this development, because it operates in an extremely competitive environment, in which being efficient is a requirement to stay alive in the market. Inserted in this context, this article seeks, from the analysis of the processes in a distribution center, to identify opportunities to automate operations to gai...

  2. Life-cycle cost analysis of energy efficiency design options for residential furnaces and boilers

    International Nuclear Information System (INIS)

    Lutz, James; Lekov, Alex; Chan, Peter; Whitehead, Camilla Dunham; Meyers, Steve; McMahon, James

    2006-01-01

    In 2001, the US Department of Energy (DOE) initiated a rulemaking process to consider whether to amend the existing energy efficiency standards for furnaces and boilers. A key factor in DOE's consideration of new standards is the economic impacts on consumers of possible revisions to energy-efficiency standards. Determining cost-effectiveness requires an appropriate comparison of the additional first cost of energy efficiency design options with the savings in operating costs. DOE's preferred approach involves comparing the total life-cycle cost (LCC) of owning and operating a more efficient appliance with the LCC for a baseline design. This study describes the method used to conduct the LCC analysis and presents the estimated change in LCC associated with more energy-efficient equipment. The results indicate that efficiency improvement relative to the baseline design can reduce the LCC in each of the product classes considered
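    A minimal sketch of the LCC comparison described above: first cost plus the present value of annual operating costs over the equipment lifetime. The costs, lifetime, and discount rate are illustrative assumptions, not DOE's rulemaking inputs.

```python
# Hedged sketch: compare life-cycle cost of a baseline vs a more efficient design.
def life_cycle_cost(first_cost, annual_energy_cost, lifetime_yr, discount_rate):
    pv_factor = sum(1.0 / (1.0 + discount_rate) ** t for t in range(1, lifetime_yr + 1))
    return first_cost + annual_energy_cost * pv_factor

baseline = life_cycle_cost(first_cost=1500, annual_energy_cost=600, lifetime_yr=20, discount_rate=0.05)
efficient = life_cycle_cost(first_cost=1900, annual_energy_cost=480, lifetime_yr=20, discount_rate=0.05)
print(round(baseline), round(efficient), "LCC savings:", round(baseline - efficient))
```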

  3. Image Segmentation and Processing for Efficient Parking Space Analysis

    OpenAIRE

    Tutika, Chetan Sai; Vallapaneni, Charan; R, Karthik; KP, Bharath; Muthu, N Ruban Rajesh Kumar

    2018-01-01

    In this paper, we develop a method to detect vacant parking spaces in an environment with unclear segments and contours with the help of MATLAB image processing capabilities. Due to the anomalies present in the parking spaces, such as uneven illumination, distorted slot lines and overlapping of cars, present-day conventional algorithms have difficulties processing the image for accurate results. The algorithm proposed uses a combination of image pre-processing and false contour detection ...

  4. Research efficiency assessment of Colombian public universities 2003-2012: data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Gabriel Yáñez Canal

    2015-07-01

    Full Text Available In 2003, the process of evaluating public universities began. For this purpose, a set of performance indicators constructed by the Public University System (SUE, by its acronym in Spanish) in alliance with the Ministry of National Education (MEN) was used. In an effort to determine the research efficiency level of public universities in the period 2003-2012, an analysis of the results of these indicators was executed using Data Envelopment Analysis. In particular, the output-oriented CCR model was applied. Although many universities have experienced sustained development in some of the indicators analyzed and show high relative levels of efficiency, the results show that, as a whole, the Public University System still has much to improve regarding its scientific mission, especially those aspects related to graduate programs and scientific journals.
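    A minimal sketch of an output-oriented CCR efficiency score solved as a linear program with SciPy; the input/output data are synthetic stand-ins for the SUE indicators, and the formulation is the textbook envelopment form rather than the exact setup of the paper.

```python
# Hedged sketch: output-oriented CCR (DEA) efficiency via linear programming.
import numpy as np
from scipy.optimize import linprog

def ccr_output_oriented(X, Y, j0):
    """X: inputs (m x n), Y: outputs (s x n), j0: index of the evaluated DMU.
    Returns the efficiency score 1/phi (1.0 means efficient)."""
    m, n = X.shape
    s, _ = Y.shape
    c = np.r_[-1.0, np.zeros(n)]                       # maximize phi
    A_ub, b_ub = [], []
    for i in range(m):                                  # sum_j lam_j * x_ij <= x_i,j0
        A_ub.append(np.r_[0.0, X[i]]); b_ub.append(X[i, j0])
    for r in range(s):                                  # phi * y_r,j0 - sum_j lam_j * y_rj <= 0
        A_ub.append(np.r_[Y[r, j0], -Y[r]]); b_ub.append(0.0)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (n + 1), method="highs")
    return 1.0 / res.x[0]

X = np.array([[20, 30, 40, 25], [5, 8, 6, 7]], float)     # e.g. staff, budget (synthetic)
Y = np.array([[60, 70, 120, 50], [10, 14, 20, 9]], float)  # e.g. papers, graduates (synthetic)
for j in range(X.shape[1]):
    print(f"DMU {j}: efficiency = {ccr_output_oriented(X, Y, j):.3f}")
```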

  5. Energy efficiency opportunities in the production process of cast iron foundries: An experience in Italy

    International Nuclear Information System (INIS)

    Lazzarin, Renato M.; Noro, Marco

    2015-01-01

    The foundry sector is one of the most energy-intensive in industry. Energy audits performed in 5 Italian cast iron foundries made it possible to identify energy utilization in the various processes that, from the melting of the iron, arrive at the finishing of the casting. Main equipment was surveyed, evaluating its influence on the overall energy consumption and producing a detailed analysis of energy use per department and energy performance indexes. A separate study was carried out for foundries with induction furnaces and cold or hot blast cupolas. Possibilities for heat recovery were identified, particularly in combustion air preheating, but also for building heating or to power direct cycles to produce electricity. Better insulation and new insulating materials can improve the efficiency and the quality of the processes. Suggestions are supplied for energy saving in the various foundry departments. Possible energy saving actions on the service plants will be dealt with in a separate paper. - Highlights: • The Authors performed energy audits in 5 Italian cast iron foundries. • Main equipment was surveyed, evaluating the influence on the overall energy consumption. • An analysis of energy use per department and energy performance indexes was performed. • Possibilities of heat recovery were identified in combustion air preheating and for building heating. • Better and new insulating materials were analyzed to improve the efficiency and process quality.

  6. Development and efficiency assessment of process lubrication for hot forging

    Science.gov (United States)

    Kargin, S.; Artyukh, Viktor; Ignatovich, I.; Dikareva, Varvara

    2017-10-01

    The article considers innovative technologies in the testing and production of process lubricants for hot bulk forging. New compositions of eco-friendly water-graphite process lubricants for hot extrusion and forging were developed. New approaches to the efficiency assessment of process lubricants are developed and described in this article. Laboratory and field results are presented.

  7. Business process reengineering and Nigerian banking system efficiency

    Directory of Open Access Journals (Sweden)

    John N. N. Ugoani

    2017-12-01

    Full Text Available Prior to 2000, before banks in Nigeria embraced BPR, the NBS was inefficient, characterized by frauds, long queues, nonperforming loans, illiquidity and distress. As one way of overcoming these challenges, banks started to focus on BPR as a veritable tool to drive efficiency, customer satisfaction and improved shareholder value. With the advent of BPR and process improvement, efficiency gradually returned to the NBS. Against the pre-reengineering era, when the liquidity ratio of the NBS was minus 15.92 percent in 1996 with no bank meeting the 30 percent minimum prudential requirement, the NBS had a positive average liquidity ratio of 65.69 percent in 2011 with all the banks meeting the 30 percent minimum liquidity ratio. The banks that introduced BPR early in the 2000s have remained free of distress, liquid, and efficient, with high growth in gross earnings, total assets, profitability and total equity. The research design was deployed for the study, and it was found that BPR has a positive effect on NBS efficiency.

  8. Efficient point cloud data processing in shipbuilding: Reformative component extraction method and registration method

    Directory of Open Access Journals (Sweden)

    Jingyu Sun

    2014-07-01

    Full Text Available To survive in the current shipbuilding industry, it is of vital importance for shipyards to have the accuracy of ship components evaluated efficiently during most of the manufacturing steps. Evaluating component accuracy by comparing each component's point cloud data, scanned by laser scanners, with the ship's design data formatted in CAD cannot be processed efficiently when (1) the components extracted from the point cloud data contain irregular obstacles, or when (2) the registration of the two data sets has no clear initial direction setting. This paper presents reformative point cloud data processing methods to solve these problems. K-d tree construction of the point cloud data speeds up the neighbor search for each point. A region growing method performed on the neighbor points of the seed point extracts the continuous part of the component, while curved surface fitting and B-spline curve fitting at the edge of the continuous part recognize neighboring domains of the same component divided by obstacles' shadows. The ICP (Iterative Closest Point) algorithm registers the two data sets after the proper registration direction is decided by principal component analysis. In experiments conducted at the shipyard, 200 curved shell plates were extracted from the scanned point cloud data and registered against the designed CAD data using the proposed methods for accuracy evaluation. Results show that the proposed methods support accuracy-evaluation-targeted point cloud data processing efficiently in practice.
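
    The processing chain described above (k-d tree neighbour search, region growing from a seed point, PCA to fix the registration direction, then ICP) can be sketched compactly. The snippet below is a simplified stand-in rather than the authors' implementation: it omits the curved-surface and B-spline edge fitting, uses a plain distance criterion for growing, and the synthetic plate and all tolerances are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def region_grow(points, seed_idx, radius=0.05):
    """Collect all points connected to the seed through neighbours closer
    than `radius` (a stand-in for the paper's continuity criterion)."""
    tree = cKDTree(points)
    visited, frontier = {seed_idx}, [seed_idx]
    while frontier:
        idx = frontier.pop()
        for nb in tree.query_ball_point(points[idx], radius):
            if nb not in visited:
                visited.add(nb)
                frontier.append(nb)
    return np.fromiter(visited, dtype=int)

def pca_axes(points):
    """Principal directions, usable to choose an initial registration direction."""
    centred = points - points.mean(axis=0)
    return np.linalg.svd(centred, full_matrices=False)[2]

def icp(source, target, iters=20):
    """Point-to-point ICP; each iteration solves the best rigid fit by SVD."""
    tree = cKDTree(target)
    src = source.copy()
    for _ in range(iters):
        _, nn = tree.query(src)                   # closest design point per scan point
        tgt = target[nn]
        mu_s, mu_t = src.mean(axis=0), tgt.mean(axis=0)
        U, _, Vt = np.linalg.svd((src - mu_s).T @ (tgt - mu_t))
        if np.linalg.det(Vt.T @ U.T) < 0:         # guard against reflections
            Vt[-1] *= -1
        R = Vt.T @ U.T
        src = (src - mu_s) @ R.T + mu_t
    return src

# Toy usage: refine the alignment of a slightly rotated, noisy synthetic plate.
rng = np.random.default_rng(0)
plate = rng.uniform(size=(500, 3)) * [1.0, 0.5, 0.02]
a = np.deg2rad(10.0)
Rz = np.array([[np.cos(a), -np.sin(a), 0.0], [np.sin(a), np.cos(a), 0.0], [0.0, 0.0, 1.0]])
scanned = plate @ Rz.T + 0.001 * rng.normal(size=plate.shape)
axes = pca_axes(scanned)      # principal directions for the initial orientation
aligned = icp(scanned, plate)
```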

  9. Efficiency in the Worst Production Situation Using Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Md. Kamrul Hossain

    2013-01-01

    Full Text Available Data envelopment analysis (DEA) measures relative efficiency among decision making units (DMUs) without considering noise in the data. The least efficient DMU indicates that it is in the worst situation. In this paper, we measure the efficiency of an individual DMU whenever it loses the maximum output, while the efficiency of the other DMUs is measured in the observed situation; this efficiency is the minimum efficiency of a DMU. Stochastic data envelopment analysis (SDEA), a DEA method that considers noise in the data, is proposed in this study. Using the bounded Pareto distribution, we estimate the DEA efficiency from the efficiency interval; a small value of the shape parameter allows the efficiency to be estimated more accurately. Rank correlations were estimated between observed efficiencies and minimum efficiencies, as well as between observed and estimated efficiencies. The correlations indicate the effectiveness of this SDEA model.
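
    For reference, a common parameterisation of the bounded Pareto distribution on an efficiency interval [L, H] with shape parameter α (the paper's exact parameterisation may differ) is
\[
f(x) = \frac{\alpha L^{\alpha} x^{-\alpha-1}}{1-(L/H)^{\alpha}}, \qquad L \le x \le H,
\qquad
E[X] = \frac{\alpha L^{\alpha}}{1-(L/H)^{\alpha}}\cdot\frac{H^{1-\alpha}-L^{1-\alpha}}{1-\alpha} \quad (\alpha \neq 1),
\]
    so once L and H (the endpoints of the efficiency interval, e.g. the minimum and the observed efficiency) and α are fixed, a point estimate of efficiency can be read off from the mean or another summary of this distribution.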

  10. The Design-to-Analysis Process at Sandia National Laboratories Observations and Recommendations; TOPICAL

    International Nuclear Information System (INIS)

    BURNS, SHAWN P.; HARRISON, RANDY J.; DOBRANICH, DEAN

    2001-01-01

    The efficiency of the design-to-analysis process for translating solid-model-based design data to computational analysis model data plays a central role in the application of computational analysis to engineering design and certification. A review of the literature from within Sandia as well as from industry shows that the design-to-analysis process involves a number of complex organizational and technological issues. This study focuses on the design-to-analysis process from a business process standpoint and is intended to generate discussion regarding this important issue. Observations obtained from interviews with Sandia staff members and management suggest that the current Sandia design-to-analysis process is not mature and that this cross-organizational issue requires committed high-level ownership. A key recommendation of the study is that additional resources should be provided to the computer aided design organizations to support design-to-analysis. A robust community of practice is also needed to continuously improve the design-to-analysis process and to provide a corporate perspective.

  11. Energy efficient processing of natural resources; Energieeffiziente Verarbeitung natuerlicher Rohstoffe

    Energy Technology Data Exchange (ETDEWEB)

    Pehlken, Alexandra [Univ. Bremen (Germany). Projekt FU2; Hans, Carl [Bremer Institut fuer Produktion und Logistik GmbH BIBA, Bremen (Germany). Abt. Intelligente Informations- und Kommunikationsumgebungen fuer die kooperative Produktion im Forschungsbereich Informations- und Kommunikationstechnische Anwendungen; Thoben, Klaus-Dieter [Univ. Bremen (Germany). Inst. fuer integrierte Produktentwicklung; Bremer Institut fuer Produktion und Logistik GmbH BIBA, Bremen (Germany). Forschungsbereich Informations- und kommunikationstechnische Anwendungen; Austing, Bernhard [Fa. Austing, Damme (Germany)

    2012-10-15

    Energy efficiency is gaining high importance in production processes. High energy consumption is directly related to high costs. The processing of natural resources requires additional energy input because of defined output quality demands. This paper discusses approaches and IT solutions for the automatic adjustment of production processes to cope with varying input qualities. The intention is to achieve the lowest energy input into the process without compromising quality.

  12. Wavelet analysis of molecular dynamics: Efficient extraction of time-frequency information in ultrafast optical processes

    International Nuclear Information System (INIS)

    Prior, Javier; Castro, Enrique; Chin, Alex W.; Almeida, Javier; Huelga, Susana F.; Plenio, Martin B.

    2013-01-01

    New experimental techniques based on nonlinear ultrafast spectroscopies have been developed over the last few years, and have been demonstrated to provide powerful probes of quantum dynamics in different types of molecular aggregates, including both natural and artificial light harvesting complexes. Fourier transform-based spectroscopies have been particularly successful, yet “complete” spectral information normally necessitates the loss of all information on the temporal sequence of events in a signal. This information, though, is particularly important in transient or multi-stage processes, in which the spectral decomposition of the data evolves in time. By going through several examples of ultrafast quantum dynamics, we demonstrate that the use of wavelets provides an efficient and accurate way to simultaneously acquire both temporal and frequency information about a signal, and argue that this greatly aids the elucidation and interpretation of the physical processes responsible for non-stationary spectroscopic features, such as those encountered in coherent excitonic energy transport.
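
    As a minimal illustration of the kind of time-frequency analysis described above (not the authors' code; the sampling rate, test signal and wavelet choice are assumptions), a continuous wavelet transform can be computed with the PyWavelets package:

```python
import numpy as np
import pywt  # PyWavelets, assumed available

fs = 1000.0                                   # sampling rate in Hz (assumed)
t = np.arange(0.0, 2.0, 1.0 / fs)
# A chirp-like, non-stationary test signal standing in for a transient feature.
signal = np.sin(2.0 * np.pi * (20.0 + 30.0 * t) * t)

scales = np.arange(1, 128)
coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1.0 / fs)
power = np.abs(coeffs) ** 2                   # (n_scales, n_samples) time-frequency map
print(power.shape, freqs[:3])
```

    The rows of `power` give the signal's energy at different scales (frequencies) as a function of time, which is exactly the simultaneous temporal and spectral view that a plain Fourier transform discards.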

  13. On analysis of operating efficiency of autonomous ventilation systems

    Directory of Open Access Journals (Sweden)

    Kostuganov Arman

    2017-01-01

    Full Text Available The paper deals with the causes and consequences of malfunctioning of natural and mechanical ventilation systems in civil buildings in Russia. Furthermore, it gives their classification and analysis based on a literature review. On the basis of this analysis, technical solutions for improving the efficiency of ventilation systems in civil buildings are summarized and their field of application is specified. Among the offered technical solutions, the use of autonomous ventilation systems with heat recovery is highlighted as one of the most promising and understudied, with a wide range of applications. The paper reviews and analyzes the main Russian and foreign designs of ventilation systems with heat recovery that are mostly used in practice. Three such systems (UVRK-50, Prana-150 and TeFo) are chosen for consideration. A sequence of field tests of the selected autonomous ventilation systems has been carried out in order to determine the actual air exchange and the efficiency of heat recovery. The paper presents the processed results of the research, on the basis of which advantages and disadvantages of the tested ventilation systems are identified and recommendations for the engineering and manufacturing of new designs of autonomous ventilation systems with heat recovery are formulated.
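
    For context, a common figure of merit for such units is the temperature (sensible heat-recovery) effectiveness; the paper's tests may use a different metric, but a standard definition is
\[
\eta_t = \frac{\theta_{\mathrm{supply,out}} - \theta_{\mathrm{outdoor}}}{\theta_{\mathrm{indoor}} - \theta_{\mathrm{outdoor}}},
\]
    i.e. the fraction of the indoor-outdoor temperature difference that the recovered heat restores to the incoming supply air.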

  14. Life-cycle cost analysis of energy efficiency design options for residential furnaces and boilers

    Energy Technology Data Exchange (ETDEWEB)

    Lutz, J.; Lekov, A.; Chan, P.; Dunham Whitehead, C.; Meyers, S.; McMahon, J. [Lawrence Berkeley National Laboratory, Berkeley, CA (United States). Environmental Energy Technologies Div.

    2006-03-01

    In 2001, the US Department of Energy (DOE) initiated a rulemaking process to consider whether to amend the existing energy efficiency standards for furnaces and boilers. A key factor in DOE's consideration of new standards is the economic impact on consumers of possible revisions to energy-efficiency standards. Determining cost-effectiveness requires an appropriate comparison of the additional first cost of energy efficiency design options with the savings in operating costs. DOE's preferred approach involves comparing the total life-cycle cost (LCC) of owning and operating a more efficient appliance with the LCC for a baseline design. This study describes the method used to conduct the LCC analysis and presents the estimated change in LCC associated with more energy-efficient equipment. The results indicate that efficiency improvements relative to the baseline design can reduce the LCC in each of the product classes considered. (author)
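
    A minimal sketch of the LCC comparison logic follows; the numbers are illustrative assumptions only (DOE's actual analysis samples usage patterns, energy prices and discount rates from distributions rather than using single values).

```python
def lcc(first_cost, annual_operating_cost, discount_rate=0.05, lifetime=20):
    """First cost plus the present value of the annual operating-cost stream."""
    pv_factor = sum(1.0 / (1.0 + discount_rate) ** year
                    for year in range(1, lifetime + 1))
    return first_cost + annual_operating_cost * pv_factor

baseline  = lcc(first_cost=2200.0, annual_operating_cost=650.0)  # baseline design (assumed)
efficient = lcc(first_cost=3100.0, annual_operating_cost=540.0)  # higher-efficiency design (assumed)
print(round(baseline), round(efficient), "LCC saving:", round(baseline - efficient))
```

    A design option is cost-effective in this sense whenever its higher first cost is more than offset by the discounted operating-cost savings over the product's lifetime.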

  15. Process development and exergy cost sensitivity analysis of a hybrid molten carbonate fuel cell power plant and carbon dioxide capturing process

    Science.gov (United States)

    Mehrpooya, Mehdi; Ansarinasab, Hojat; Moftakhari Sharifzadeh, Mohammad Mehdi; Rosen, Marc A.

    2017-10-01

    An integrated power plant with a net electrical power output of 3.71 × 10⁵ kW is developed and investigated. The electrical efficiency of the process is found to be 60.1%. The process includes three main sub-systems: the molten carbonate fuel cell system, the heat recovery section and the cryogenic carbon dioxide capturing process. Conventional and advanced exergoeconomic methods are used for analyzing the process. Advanced exergoeconomic analysis is a comprehensive evaluation tool which combines an exergetic approach with economic analysis procedures. With this method, investment and exergy destruction costs of the process components are divided into endogenous/exogenous and avoidable/unavoidable parts. Results of the conventional exergoeconomic analyses demonstrate that the combustion chamber has the largest exergy destruction rate (182 MW) and cost rate (13,100 $/h). Also, the total process cost rate can be decreased by reducing the cost rate of the fuel cell and improving the efficiency of the combustion chamber and the heat recovery steam generator. Based on the total avoidable endogenous cost rate, the priority for modification is the heat recovery steam generator, a compressor and a turbine of the power plant, in rank order. A sensitivity analysis is performed to investigate the exergoeconomic factor parameters by varying the effective parameters.
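
    For reference, the cost splittings that advanced exergoeconomics applies to each component k are commonly written as
\[
\dot{C}_{D,k} = \dot{C}_{D,k}^{\mathrm{EN}} + \dot{C}_{D,k}^{\mathrm{EX}} = \dot{C}_{D,k}^{\mathrm{AV}} + \dot{C}_{D,k}^{\mathrm{UN}},
\]
    where \(\dot{C}_{D,k}\) is the cost rate of exergy destruction and the superscripts denote the endogenous/exogenous and avoidable/unavoidable parts (the investment cost rate \(\dot{Z}_k\) is split in the same way). Combining the two splittings yields the avoidable endogenous part \(\dot{C}_{D,k}^{\mathrm{AV,EN}}\), which is the quantity behind the modification priorities quoted above; the exact notation used in the paper may differ.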

  16. Using Field Data for Energy Efficiency Based on Maintenance and Operational Optimisation. A Step towards PHM in Process Plants

    Directory of Open Access Journals (Sweden)

    Micaela Demichela

    2018-03-01

    Full Text Available Energy saving is an important issue for any industrial sector; in particular, for the process industry, it can help to minimize both energy costs and environmental impact. Maintenance optimization and operational procedures can offer margins to increase energy efficiency in process plants, even if they are seldom explicitly taken into account in the predictive models guiding energy saving policies. To ensure that the plant achieves the desired performance, maintenance operations and maintenance results should be monitored, and the connection between the inputs and the outcomes of the maintenance process, in terms of total contribution to manufacturing performance, should be made explicit. In this study, a model for energy efficiency analysis was developed, based on a cost and benefit balance. It is aimed at supporting decision making on technical and operational solutions for energy efficiency, through the optimization of maintenance interventions and operational procedures. A case study is described here: the effects on energy efficiency of technical and operational optimization measures for bituminous materials production process equipment. The idea of the Conservation Supply Curve (CSC) was used to capture both the cost effectiveness and the energy efficiency effectiveness of the measures. The optimization was thus based on the energy consumption data registered on-site: data collection and modelling of the relevant data were used as a basis to implement a prognostics and health management (PHM) policy in the company. Based on the results from the analysis, efficiency measures for the industrial case study were proposed, also in relation to maintenance optimization and operating procedures. In the end, the impacts of the implementation of energy saving measures on the performance of the system, in terms of technical and economic feasibility, were demonstrated. The results showed that maintenance optimization could help in reaching
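
    The Conservation Supply Curve mentioned above ranks candidate measures by their cost of conserved energy (CCE), i.e. the annualised investment divided by the annual energy saving; a measure is attractive when its CCE is below the energy price. The sketch below uses made-up measures, costs and prices, not the plant's data.

```python
def crf(discount_rate, lifetime_years):
    """Capital recovery factor used to annualise an up-front investment."""
    return discount_rate / (1.0 - (1.0 + discount_rate) ** -lifetime_years)

def cost_of_conserved_energy(investment, annual_saving_kwh,
                             discount_rate=0.08, lifetime_years=10):
    return investment * crf(discount_rate, lifetime_years) / annual_saving_kwh

measures = {                       # hypothetical maintenance/operational measures
    "burner tuning":      (4000.0, 60000.0),   # (investment, kWh saved per year)
    "drum insulation":    (9000.0, 75000.0),
    "filter replacement": (1500.0, 12000.0),
}
energy_price = 0.11                # currency units per kWh, assumed
for name, (inv, saving) in sorted(measures.items(),
                                  key=lambda kv: cost_of_conserved_energy(*kv[1])):
    cce = cost_of_conserved_energy(inv, saving)
    print(f"{name:18s} CCE = {cce:.3f}   cost-effective: {cce < energy_price}")
```

    Plotting cumulative annual savings on the x-axis against CCE on the y-axis, in this sorted order, gives the supply curve itself.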

  17. Technical efficiency of women's health prevention programs in Bucaramanga, Colombia: a four-stage analysis.

    Science.gov (United States)

    Ruiz-Rodriguez, Myriam; Rodriguez-Villamizar, Laura A; Heredia-Pi, Ileana

    2016-10-13

    Primary Health Care (PHC) is an efficient strategy to improve health outcomes in populations. Nevertheless, studies of technical efficiency in health care have focused on hospitals, with very little on primary health care centers. The objective of the present study was to use Data Envelopment Analysis to estimate the technical efficiency of three women's health promotion and disease prevention programs offered by primary care centers in Bucaramanga, Colombia. Efficiency was measured using a four-stage data envelopment analysis with a series of Tobit regressions to account for the effect of quality outcomes and context variables. Input/output information was collected from the institutions' records, chart reviews and personal interviews. Information about contextual variables was obtained from databases of the primary health program in the municipality. A jackknife analysis was used to assess the robustness of the results. The analysis was based on data from 21 public primary health care centers. The average efficiency scores, after adjusting for quality and context, were 92.4 %, 97.5 % and 86.2 % for the antenatal care (ANC), early detection of cervical cancer (EDCC) and family planning (FP) programs, respectively. In each program, 12 of the 21 (57.1 %) health centers were found to be technically efficient, defining the best-practice frontier. Adjusting for context variables changed the scores and reference rankings of the three programs offered by the health centers. The performance of the women's health prevention programs offered by the centers was found to be heterogeneous. Adjusting for context and health care quality variables had a significant effect on the technical efficiency scores and rankings. The results can serve as a guide to strengthen management, organizational and planning processes related to local primary care services operating within a market-based model such as the one in Colombia.

  18. Efficient processing of two-dimensional arrays with C or C++

    Science.gov (United States)

    Donato, David I.

    2017-07-20

    Because fast and efficient serial processing of raster-graphic images and other two-dimensional arrays is a requirement in land-change modeling and other applications, the effects of 10 factors on the runtimes for processing two-dimensional arrays with C and C++ are evaluated in a comparative factorial study. This study’s factors include the choice among three C or C++ source-code techniques for array processing; the choice of Microsoft Windows 7 or a Linux operating system; the choice of 4-byte or 8-byte array elements and indexes; and the choice of 32-bit or 64-bit memory addressing. This study demonstrates how programmer choices can reduce runtimes by 75 percent or more, even after compiler optimizations. Ten points of practical advice for faster processing of two-dimensional arrays are offered to C and C++ programmers. Further study and the development of a C and C++ software test suite are recommended. Key words: array processing, C, C++, compiler, computational speed, land-change modeling, raster-graphic image, two-dimensional array, software efficiency

  19. [Efficiency indicators to assess the organ donation and transplantation process: systematic review of the literature].

    Science.gov (United States)

    Siqueira, Marina Martins; Araujo, Claudia Affonso; de Aguiar Roza, Bartira; Schirmer, Janine

    2016-08-01

    To search the literature, identify indicators used to monitor and control the organ donation and transplantation process, and group these indicators into categories. In November 2014, a systematic review of the literature was carried out in the following databases: Biblioteca Virtual em Saúde (BVS), EBSCO, Emerald, Proquest, Science Direct, and Web of Science. The following search terms (and the corresponding terms in Brazilian Portuguese) were employed: "efficiency," "indicators," "organ donation," "tissue and organ procurement," and "organ transplantation." Of the 344 articles retrieved, 23 original articles published between 1992 and 2013 were selected and reviewed for analysis of efficiency indicators. The review revealed 117 efficiency indicators, which were grouped according to similarity of content and divided into three categories: 1) 71 indicators related to organ donation, covering mortality statistics, communication of brain death, clinical status of donors and exclusion of donors for medical reasons, attitude of families, confirmation of donations, and extraction of organs and tissues; 2) 22 indicators related to organ transplantation, covering the surgical procedure per se and post-transplantation follow-up; and 3) 24 indicators related to the demand for organs and the resources of hospitals involved in the process. Even though organ transplantation is a relatively recent phenomenon, the high number of efficiency indicators described in the literature suggests that scholars interested in this field have been searching for ways to measure performance. However, there is little standardization of the indicators used. Also, most indicators focus on the donation step, suggesting gaps in the measurement of efficiency at other points in the process. Additional indicators are needed to monitor important stages, such as organ distribution (for example, organ loss indicators) and post-transplantation aspects (for example, survival and quality of life).

  20. Data Envelopment Analysis as an Instrument for Measuring the Efficiency of Courts

    Directory of Open Access Journals (Sweden)

    Wojciech Major

    2015-01-01

    Full Text Available The paper addresses the problem of measuring the efficiency of civil jurisdiction courts. Non-parametric data envelopment analysis (DEA) has been proposed as a measurement instrument. Hearing (settling) a case within a reasonable time, as seen from the perspective of a citizen, is defined to be a positive output (result) of a court action. The production factors considered include human resources directly related to legal processes. The analysis was carried out for the 26 Cracow district courts. The goal has been assumed to be achieving the best possible outputs without increasing resources. The results obtained show that there are reserves within these organizations that would allow them to shorten the queue of pending cases. The proposed method of measuring efficiency may constitute a starting point for further work on creating measurable standards for the functioning of the judiciary in Poland. (original abstract)

  1. Efficient Analysis of Simulations of the Sun's Magnetic Field

    Science.gov (United States)

    Scarborough, C. W.; Martínez-Sykora, J.

    2014-12-01

    Dynamics in the solar atmosphere, including solar flares, coronal mass ejections, micro-flares and different types of jets, are powered by the evolution of the sun's intense magnetic field. 3D radiative magnetohydrodynamics (MHD) computer simulations have furthered our understanding of the processes involved: when non-aligned magnetic field lines reconnect, the alteration of the magnetic topology causes stored magnetic energy to be converted into thermal and kinetic energy. Detailed analysis of this evolution entails tracing magnetic field lines, an operation which is not time-efficient on a single processor. By utilizing a graphics card (GPU) to trace lines in parallel, conducting such analysis is made feasible. We applied our GPU implementation to the most advanced 3D radiative-MHD simulations (Bifrost, Gudiksen et al. 2011) of the solar atmosphere in order to better understand the evolution of the modeled field lines.
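
    Field-line tracing is embarrassingly parallel: each seed point can be integrated independently, which is exactly the structure a GPU exploits. The hedged sketch below shows that structure on the CPU with a vectorized fourth-order Runge–Kutta integrator and a synthetic divergence-free field; the Bifrost data layout and the actual GPU kernel are not reproduced here.

```python
import numpy as np

def unit(v):
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

def B_field(xyz):
    """Synthetic divergence-free field standing in for simulation output."""
    z = xyz[..., 2]
    return np.stack([np.sin(z), np.cos(z), np.ones_like(z)], axis=-1)

def trace_field_lines(seeds, step=0.01, n_steps=500):
    """Advance every seed in lock-step along the field direction with RK4."""
    pos = seeds.astype(float).copy()
    path = np.empty((n_steps + 1,) + pos.shape)
    path[0] = pos
    for i in range(n_steps):
        k1 = unit(B_field(pos))
        k2 = unit(B_field(pos + 0.5 * step * k1))
        k3 = unit(B_field(pos + 0.5 * step * k2))
        k4 = unit(B_field(pos + step * k3))
        pos = pos + step * (k1 + 2.0 * k2 + 2.0 * k3 + k4) / 6.0
        path[i + 1] = pos
    return path                      # shape: (n_steps + 1, n_seeds, 3)

seeds = np.random.default_rng(1).uniform(-1.0, 1.0, size=(2000, 3))
lines = trace_field_lines(seeds)
print(lines.shape)
```

    On a GPU, each seed (or small block of seeds) would map to a thread, so thousands of lines are advanced concurrently instead of sequentially.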

  2. Lean manufacturing analysis to reduce waste on production process of fan products

    Science.gov (United States)

    Siregar, I.; Nasution, A. A.; Andayani, U.; Sari, R. M.; Syahputri, K.; Anizar

    2018-02-01

    This research is based on a case study at an electrical company. One of the products studied is the fan; when running its production process there is time that is not value-added, including inefficient movement of material between the raw materials and fan component molding areas. This study aims to reduce waste or non-value-added activities and to shorten the total lead time by using Value Stream Mapping tools. Lean manufacturing methods were used to analyze and reduce the non-value-added activities, namely the value stream mapping analysis tools, process activity mapping with 5W1H, and the 5 Whys tool. The research shows that non-value-added activities in the fan production process amount to 647.94 minutes of the total lead time of 725.68 minutes. The process cycle efficiency of the fan production process is therefore still very low, at 11%. Estimates for the proposed improvements show a decrease in total lead time to 340.9 minutes and a process cycle efficiency of 24%, indicating a better production process.
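
    For reference, the process cycle efficiency quoted above follows directly from the standard definition (value-added time is total lead time minus non-value-added time):
\[
\mathrm{PCE} = \frac{\text{value-added time}}{\text{total lead time}} = \frac{725.68 - 647.94}{725.68} \approx 0.107 \approx 11\%.
\]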

  3. Enhancement of the efficiency of the Open Cycle Phillips Optimized Cascade LNG process

    International Nuclear Information System (INIS)

    Fahmy, M.F.M.; Nabih, H.I.; El-Nigeily, M.

    2016-01-01

    Highlights: • Expanders replaced JT valves in the Phillips Optimized Cascade liquefaction process. • The improvement in plant liquefaction efficiency in the presence of expanders was evaluated. • A comparison of the different optimum cases for the liquefaction process was presented. - Abstract: This study aims to improve the performance of the Open Cycle Phillips Optimized Cascade Process for the production of liquefied natural gas (LNG) through the replacement of Joule–Thomson (JT) valves by expanders. The expander has a higher thermodynamic efficiency than the JT valve. Moreover, the shaft power produced by the expander is integrated into the process. The study is conducted using Aspen HYSYS V7 simulation software to simulate the Open Cycle Phillips Optimized Cascade Process with the JT valves. Several proposed cases are then simulated in which expanders are used instead of JT valves at different locations in the process: the propane cycle, the ethylene cycle, the methane cycle and upstream of the heavies removal column. The optimum cases clearly indicate that expanders not only produce power, but also offer significant improvements in process performance, as shown by the total plant power consumption, LNG production, thermal efficiency, plant specific power and CO₂ emissions reduction. Results also reveal that replacing JT valves by expanders in the methane cycle has a dominating influence on all performance criteria and hence can be considered the main key contributor affecting the Phillips Optimized Cascade Process, leading to a notable enhancement in its efficiency. This replacement of JT valves by liquid expanders at different locations of the methane cycle yields power savings in the range of 4.92–5.72%, plant thermal efficiency of 92.64–92.97% and an increase in LNG production of 5.77–7.04%. Moreover, applying liquid expanders at the determined optimum cases for the different cycles improves process performance and

  4. Impact of lean six sigma process improvement methodology on cardiac catheterization laboratory efficiency.

    Science.gov (United States)

    Agarwal, Shikhar; Gallo, Justin J; Parashar, Akhil; Agarwal, Kanika K; Ellis, Stephen G; Khot, Umesh N; Spooner, Robin; Murat Tuzcu, Emin; Kapadia, Samir R

    2016-03-01

    Operational inefficiencies are ubiquitous in several healthcare processes. To improve the operational efficiency of our catheterization laboratory (Cath Lab), we implemented a lean six sigma process improvement initiative, starting in June 2010. We aimed to study the impact of lean six sigma implementation on improving the efficiency and the patient throughput in our Cath Lab. All elective and urgent cardiac catheterization procedures, including diagnostic coronary angiography, percutaneous coronary interventions, structural interventions and peripheral interventions, performed between June 2009 and December 2012 were included in the study. Performance metrics utilized for analysis included turn-time, physician downtime, on-time patient arrival, on-time physician arrival, on-time start and manual sheath-pulls inside the Cath Lab. After implementation of lean six sigma in the Cath Lab, we observed a significant improvement in turn-time, physician downtime, on-time patient arrival, on-time physician arrival, on-time start as well as sheath-pulls inside the Cath Lab. The percentage of cases with optimal turn-time increased from 43.6% in 2009 to 56.6% in 2012 (significant p-trend). We thus demonstrate the impact of a process improvement initiative, lean six sigma, on improving and sustaining the efficiency of our Cath Lab operation. After the successful implementation of this continuous quality improvement initiative, there was a significant improvement in the selected performance metrics, namely turn-time, physician downtime, on-time patient arrival, on-time physician arrival, on-time start as well as sheath-pulls inside the Cath Lab. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Decontamination Efficiency of Fish Bacterial Flora from Processing Surfaces

    Directory of Open Access Journals (Sweden)

    Birna Guðbjörnsdóttir

    2009-01-01

    Full Text Available There are numerous parameters that can influence bacterial decontamination during washing of machinery and equipment in a food processing establishment. Incomplete decontamination will increase the risk of biofilm formation and consequently the risk of pathogen contamination or the prevalence of other undesirable microorganisms, such as spoilage bacteria, in the processing line. The efficiency of a typical washing protocol has been determined by testing three critical parameters and their effects on bacterial decontamination. Two surface materials (plastic and stainless steel), two water temperatures (7 and 25 °C) and two detergent concentrations (2 and 4 %) were used for this purpose, in combination with two types of detergents. Biofilm was prepared on the surfaces with an undefined bacterial flora obtained from minced cod fillets. The bacterial flora of the biofilm was characterised by cultivation and molecular analysis of 16S rRNA genes. All the different combinations of washing protocols tested were able to remove more than 99.9 % of the bacteria in the biofilm and reduce the cell number from 7 to 0 or 2 log units of bacteria/cm². The results show that it is possible to use less diluted detergents than recommended with comparable success, and that it is easier to clean surfaces made of stainless steel than of polyethylene plastic.

  6. Unified Analysis of Multi-Chamber Contact Tanks and Mixing Efficiency Based on Vorticity Field. Part I: Hydrodynamic Analysis

    Directory of Open Access Journals (Sweden)

    Ender Demirel

    2016-11-01

    Full Text Available Multi-chamber contact tanks have been extensively used in industry for water treatment to provide potable water to communities, which is essential for human health. To evaluate the efficiency of this treatment process, flow and tracer transport analyses have been carried out in the literature using Reynolds averaged Navier–Stokes (RANS) and large-eddy simulations (LES). The purpose of this study is two-fold. First, a unifying analysis of the flow field is presented and the similarities and differences in the numerical results reported in the literature are discussed. Second, the vorticity field is identified as the key parameter to use in separating the mean flow (jet zone) and the recirculating zones. Based on the concepts of the vorticity gradient and the flexion product, it is demonstrated that the separation of the recirculation zone and the jet zone, a fluid-fluid flow separation, is possible. The separation of the recirculation zones and the vortex core lines is characterized using the definition of the Lamb vector. The separated regions are used to characterize the mixing efficiency in the chambers of the contact tank. This analysis indicates that the recirculation zone and jet zone formation are three-dimensional and require simulations over a long period of time to reach stability. It is recognized that the characteristics of the jet zones and the recirculation zones are distinct for each chamber and that they follow a particular pattern and symmetry between alternating chambers. Hydraulic efficiency coefficients calculated for each chamber show that chambers having an inlet adjacent to the free surface may be designed with larger volumes than chambers having wall-bounded inlets to improve the efficiency of the contact tank. This is a simple design alternative that would increase the efficiency of the system. Other observations made through the chamber analysis are also informative in redefining the characteristics of the efficiency of the
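
    For reference, the vorticity-based quantities used in this line of analysis follow the standard definitions
\[
\boldsymbol{\omega} = \nabla \times \mathbf{u}, \qquad \mathbf{l} = \boldsymbol{\omega} \times \mathbf{u},
\]
    where \(\mathbf{u}\) is the velocity field, \(\boldsymbol{\omega}\) the vorticity and \(\mathbf{l}\) the Lamb vector; the paper's flexion-product and vorticity-gradient criteria for separating jet and recirculation zones are built on top of these fields.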

  7. ASSESSMENT OF REGIONAL EFFICIENCY IN CROATIA USING DATA ENVELOPMENT ANALYSIS

    Directory of Open Access Journals (Sweden)

    Danijela Rabar

    2013-02-01

    Full Text Available In this paper, the regional efficiency of Croatian counties is measured over a three-year period (2005-2007) using Data Envelopment Analysis (DEA). The set of inputs and outputs consists of seven socioeconomic indicators. The analysis is carried out using models with the assumption of variable returns to scale. DEA identifies efficient counties as benchmark members and inefficient counties, which are analyzed in detail to determine the sources and the amounts of their inefficiency in each source. To enable proper monitoring of development dynamics, window analysis is applied. Based on the results, guidelines for implementing the improvements necessary to achieve efficiency are given. The analysis reveals great disparities among counties. In order to alleviate the naturally, historically and politically conditioned unequal county positions, over which economic policy makers do not have total control, a categorical approach is introduced as an extension of the basic DEA models. This approach, combined with window analysis, changes the relations among efficiency scores in favor of the continental counties.

  8. Evaluation of energy efficiency options in steam assisted gravity drainage oil sands surface facilities via process integration

    International Nuclear Information System (INIS)

    Carreon, Carlos E.; Mahmoudkhani, Maryam; Alva-Argaez, Alberto; Bergerson, Joule

    2015-01-01

    While new technologies are being developed for extracting unconventional oil, in the near term economic benefits and footprint reductions can be achieved by enhancing the energy efficiency of existing facilities. The objective of this work is to evaluate energy efficiency opportunities for in situ extraction of Canada's oil sands resource using pinch analysis. Modifications to an original plant design are analyzed in order to estimate utility savings beyond those obtained for the initial process configuration. The modifications explored in this paper are estimated to deliver energy savings of up to 6% beyond ‘business as usual’, corresponding to a GHG emissions reduction of approximately 5%. However, in some cases, this increase in energy savings comes at the cost of increasing the demand for make-up water and the volume of disposal water. Surplus generation of steam beyond heating requirements in the water treatment system leads to energy inefficiencies. Additional cost and energy savings are obtained by reducing or eliminating the use of glycol in the cooling circuit. - Highlights: • Pinch analysis performed for an unconventional oil recovery process to identify inefficiencies. • Both the removal of pinch violations and process modifications lead to savings. • The effect of energy savings on water consumption for the process is considered. • Greenhouse gas emissions reduction and economic benefit are estimated for the studied cases

  9. Analysis of Wastewater Treatment Efficiency in a Soft Drinks Industry

    Science.gov (United States)

    Boguniewicz-Zabłocka, Joanna; Capodaglio, Andrea G.; Vogel, Daniel

    2017-10-01

    During manufacturing processes, most industrial plants generate wastewater which could become harmful to the environment. Discharge of untreated or improperly treated industrial wastewater into surface water could, in fact, lead to deterioration of the receiving water body's quality. This paper concerns wastewater treatment solutions used in the soft drink production industry: the effectiveness of a wastewater treatment plant was analysed in terms of basic pollution indicators, such as BOD, COD, TSS and pH. Initially, the performance of the mechanical-biological system treating wastewater from a specific beverage production process was studied in different periods, owing to wastewater flow fluctuations. The study then showed the positive effects on treatment of augmenting the wastewater with methanol and with nitrogen and phosphorus salts dosed during the treatment process. The results confirm that, after the implemented modification (methanol, nitrogen and phosphorus additions), pollutant removal mostly occurs with higher efficiency.
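
    The removal efficiencies behind such assessments are normally computed, for each indicator (BOD, COD, TSS), as
\[
\eta = \frac{C_{\mathrm{in}} - C_{\mathrm{out}}}{C_{\mathrm{in}}} \times 100\%,
\]
    where \(C_{\mathrm{in}}\) and \(C_{\mathrm{out}}\) are the influent and effluent concentrations; this standard definition is assumed here, since the abstract does not spell out its formula.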

  10. Highly efficient electroluminescence from a solution-processable thermally activated delayed fluorescence emitter

    Energy Technology Data Exchange (ETDEWEB)

    Wada, Yoshimasa; Kubo, Shosei; Suzuki, Katsuaki; Kaji, Hironori, E-mail: kaji@scl.kyoto-u.ac.jp [Institute for Chemical Research, Kyoto University, Uji, Kyoto 611-0011 (Japan); Shizu, Katsuyuki [Institute for Chemical Research, Kyoto University, Uji, Kyoto 611-0011 (Japan); Center for Organic Photonics and Electronics Research (OPERA), Kyushu University, 744 Motooka, Nishi, Fukuoka 819-0395 (Japan); Tanaka, Hiroyuki [Center for Organic Photonics and Electronics Research (OPERA), Kyushu University, 744 Motooka, Nishi, Fukuoka 819-0395 (Japan); Adachi, Chihaya [Center for Organic Photonics and Electronics Research (OPERA), Kyushu University, 744 Motooka, Nishi, Fukuoka 819-0395 (Japan); Japan Science and Technology Agency (JST), ERATO, Adachi Molecular Exciton Engineering Project, 744 Motooka, Nishi, Fukuoka 819-0395 (Japan)

    2015-11-02

    We developed a thermally activated delayed fluorescence (TADF) emitter, 2,4,6-tris(4-(9,9-dimethylacridan-10-yl)phenyl)-1,3,5-triazine (3ACR-TRZ), suitable for use in solution-processed organic light-emitting diodes (OLEDs). When doped into 4,4′-bis(carbazol-9-yl)biphenyl (CBP) host at 16 wt. %, 3ACR-TRZ showed a high photoluminescence quantum yield of 98%. Transient photoluminescence decay measurements of the 16 wt. % 3ACR-TRZ:CBP film confirmed that 3ACR-TRZ exhibits efficient TADF with a triplet-to-light conversion efficiency of 96%. This high conversion efficiency makes 3ACR-TRZ attractive as an emitting dopant in OLEDs. Using 3ACR-TRZ as an emitter, we fabricated a solution-processed OLED exhibiting a maximum external quantum efficiency of 18.6%.

  11. A Comparative Analysis of Extract, Transformation and Loading (ETL) Process

    Science.gov (United States)

    Runtuwene, J. P. A.; Tangkawarow, I. R. H. T.; Manoppo, C. T. M.; Salaki, R. J.

    2018-02-01

    Data and information are currently growing rapidly in varying amounts and media. This type of development eventually produces large volumes of data, better known as Big Data. Business Intelligence (BI) utilizes large amounts of data and information for analysis so that important information can be obtained; this type of information can be used to support decision-making processes. In practice, a process integrating existing data and information into a data warehouse is needed. This data integration process is known as Extract, Transformation and Loading (ETL). Many applications have been developed to carry out the ETL process, but selecting which applications are more time-, cost- and power-efficient can become a challenge. Therefore, the objective of the study was to provide a comparative analysis of the ETL process using Microsoft SQL Server Integration Services (SSIS) and Pentaho Data Integration (PDI).

  12. An Efficient Secret Key Homomorphic Encryption Used in Image Processing Service

    Directory of Open Access Journals (Sweden)

    Pan Yang

    2017-01-01

    Full Text Available Homomorphic encryption can protect a user’s privacy when operating on the user’s data in cloud computing, but it is not yet practical for wide use, as the data and service types in cloud computing are diverse. Among these data types, digital images are important personal data for users, and there are many image processing services in cloud computing. To protect user privacy in these services, this paper proposes a scheme that uses homomorphic encryption in image processing. Firstly, a secret key homomorphic encryption scheme (IGHE) was constructed for encrypting images. IGHE can operate on encrypted floating-point numbers efficiently, to adapt to the image processing service. Then, by translating traditional image processing methods into operations on encrypted pixels, the encrypted image can be processed homomorphically. That is, the service can process the encrypted image directly, and the result after decryption is the same as that of processing the plain image. To illustrate the scheme, three common image processing instances are given in this paper. The experiments show that the scheme is secure, correct, and efficient enough to be used in practical image processing applications.
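
    The IGHE scheme itself is a secret-key construction over floating-point pixel values and is not reproduced here. As a generic, hedged illustration of why additive homomorphism is useful for outsourced image processing, the textbook Paillier scheme (public-key, integer messages, toy parameters, not secure) lets a service brighten an encrypted image without ever seeing the pixels:

```python
import math
import random

# Toy Paillier key pair with small, illustrative primes (not secure).
p, q = 104729, 104723
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)               # valid because the generator is g = n + 1

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Homomorphic brightness shift: add 40 to every pixel without decrypting.
pixels = [12, 200, 87]
enc = [encrypt(px) for px in pixels]
enc_shift = encrypt(40)
brightened = [(c * enc_shift) % n2 for c in enc]
print([decrypt(c) for c in brightened])        # expected: [52, 240, 127]
```

    Multiplying ciphertexts adds the underlying plaintexts, so the server applies the brightness adjustment while the image stays encrypted; handling floating-point values and richer filters efficiently is exactly where schemes such as IGHE go beyond this toy example.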

  13. Fuzzy Rule-based Analysis of Promotional Efficiency in Vietnam’s Tourism Industry

    Directory of Open Access Journals (Sweden)

    Nguyen Quang VINH

    2015-06-01

    Full Text Available This study aims to determine an effective method of measuring the efficiency of promotional strategies for tourist destinations. Complicating factors that influence promotional efficiency (PE), such as promotional activities (PA), destination attribute (DA), and destination image (DI), make it difficult to evaluate the effectiveness of PE. This study develops a rule-based decision support mechanism using fuzzy set theory and the Analytic Hierarchy Process (AHP) to evaluate the effectiveness of promotional strategies. Additionally, a statistical analysis is conducted using SPSS (Statistical Package for the Social Sciences) to confirm the results of the fuzzy AHP analysis. This study finds that government policy is the most important factor for PE and that service staff (internal beauty) is more important than tourism infrastructure (external beauty) in terms of customer satisfaction and long-term strategy in PE. With respect to DI, experts are concerned first with tourist perceived value, second with tourist satisfaction and finally with tourist loyalty.

  14. Efficiency of the Slovak forestry in comparison to other European countries: An application of Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Kovalčík Miroslav

    2018-03-01

    Full Text Available Efficiency improvement is important for increasing the competitiveness of any sector, and the same is essential for the forestry sector. A non-parametric approach, Data Envelopment Analysis (DEA), was used for the assessment of forestry efficiency. The paper presents the results of the efficiency evaluation of forestry in European countries using DEA. One basic and two modified models (labour and wood sale) were proposed, based on the input and output data available from the Integrated Environmental and Economic Accounts for Forests and on the specific conditions of forestry. The sample size was 22 countries and data for 2005–2008 were processed. The results obtained show average efficiency in the range of 69–90% (depending on the model). Based on the results of the analysis, the following can be concluded: Slovak forestry achieved below-average efficiency in comparison to other European countries; there were great differences in efficiency among individual countries; and the state of the economy (advanced countries versus countries with economies in transition) and the region did not influence efficiency in a statistically significant way.

  15. Analysis of Variance in Statistical Image Processing

    Science.gov (United States)

    Kurz, Ludwik; Hafed Benteftifa, M.

    1997-04-01

    A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.

  16. About numerical analysis of electromagnetic field induce in gear wheels during hardening process

    Directory of Open Access Journals (Sweden)

    Gabriel Cheregi

    2008-05-01

    Full Text Available The paper presents the results of a numerical simulation using finite element analysis for a coupled magneto-thermal problem specific to induction hardening processes. The analysis takes into account the relative movement between the inductor and the heated part. Numerical simulation allows the thermal regime of the induction heating process and the optimal parameters offering maximum efficiency to be determined accurately. Therefore, the number of experiments in the design process can be decreased and a better knowledge of the process can be obtained.

  17. Systematic, efficient and consistent LCA calculations for chemical and biochemical processes

    DEFF Research Database (Denmark)

    Petchkaewkul, Kaesinee; Malakul, Pomthong; Gani, Rafiqul

    2016-01-01

    Life Cycle Assessment (LCA) is a technique applied for the study and evaluation of quantitative environmental impacts through the entire life cycle of products, processes or services, in order to improve and/or evaluate the design of existing as well as new processes. The LCA factors can … that allow a wider coverage of chemical and biochemical processes. Improvements of LCIA calculations and eco-efficiency evaluation are introduced. Also, a new model for photochemical ozone formation has been developed and implemented. The performance of LCSoft in terms of accuracy and reliability is compared … with another well-known LCA software, SimaPro, for a biochemical process – the production of bioethanol from cassava rhizome. The results show a very good match of the newly added impact categories. Also, results from a new feature in LCSoft, eco-efficiency evaluation, are presented.

  18. Unified Analysis of Multi-Chamber Contact Tanks and Mixing Efficiency Evaluation Based on Vorticity Field. Part II: Transport Analysis

    Directory of Open Access Journals (Sweden)

    Ender Demirel

    2016-11-01

    Full Text Available The mixing characteristics of the multi-chambered contact tank are analyzed employing the validated three-dimensional numerical model developed in the companion paper. Based on the flow characterization, novel volumetric mixing efficiency definitions are proposed for the assessment of the hydrodynamic and chemical transport properties of the contact tank and its chambers. Residence time distribution functions are analyzed not only at the outlet of each chamber but also inside the chambers, using the efficiency definitions for both Reynolds averaged Navier–Stokes (RANS) and large eddy simulation (LES) results. A novel tracer mixing index is defined to characterize short-circuiting and mixing effects in the contact system. Comparisons of the results of these indexes for the RANS and LES solutions indicate that mixing characteristics are stronger in LES due to unsteady turbulent eddy mixing, even though short-circuiting effects are also more prominent in the LES results. This indicates that the mixing analysis based on the LES results captures the mixing characteristics instantaneously, which is more realistic than in RANS. Since LES can capture turbulent eddy mixing better than RANS, the interaction of recirculation and jet zones is captured more effectively in LES, which tends to predict higher turbulent mixing in the contact system. The analysis also shows that the mixing efficiency of each chamber of the contact tank is different; thus it is necessary to consider distinct chemical release and volumetric designs for each chamber in order to maximize the mixing efficiency of the overall process in a contact tank system.

  19. Evaluating the efficiency of municipalities in collecting and processing municipal solid waste: a shared input DEA-model.

    Science.gov (United States)

    Rogge, Nicky; De Jaeger, Simon

    2012-10-01

    This paper proposed an adjusted "shared-input" version of the popular efficiency measurement technique Data Envelopment Analysis (DEA) that enables the evaluation of municipal waste collection and processing performance in settings in which one input (waste costs) is shared among the treatment efforts for multiple municipal solid waste fractions. The main advantage of this version of DEA is that it provides not only an estimate of each municipality's overall cost efficiency but also estimates of its cost efficiency in the treatment of the different fractions of municipal solid waste (MSW). To illustrate the practical usefulness of the shared-input DEA model, we apply the model to data on 293 municipalities in Flanders, Belgium, for the year 2008. Copyright © 2012 Elsevier Ltd. All rights reserved.

  20. Energy efficiency in the industrial sector. Model based analysis of the efficient use of energy in the EU-27 with focus on the industrial sector

    International Nuclear Information System (INIS)

    Kuder, Ralf

    2014-01-01

    … of the industry could be split up into energy-intensive subsectors, where single production processes dominate the energy consumption, and non-energy-intensive subsectors. Ways to reduce the energy consumption of the industrial sector are the use of alternative or improved production or cross-cutting technologies and the use of energy saving measures to reduce the demand for useable energy. Based on the analysis within this study, 21 % of the current energy consumption of the industrial sector of the EU and 17 % in Germany could be reduced. Based on the extended understanding of energy efficiency, the model-based scenario analysis of the European energy system with the further developed energy system model TIMES PanEU shows that the efficient use of energy at an emission reduction level of 75 % goes along with a slightly increasing primary energy consumption. The primary energy consumption is characterised by a diversified energy carrier and technology mix. Renewable energy sources, nuclear energy and CCS play a key role in the long term. In addition, the electricity demand, in combination with a strong decarbonisation of electricity generation, is increasing constantly. In the industrial sector the emission reduction is driven by the extended use of electricity, CCS and renewables, as well as by the use of improved or alternative process and supply technologies with lower specific energy consumption. Thereby the final energy consumption stays at an almost constant level, with increasing importance of electricity and biomass. Both regulatory interventions in the electricity sector and energy saving targets on primary energy demand lead to higher energy system costs and therewith to a decrease of efficiency based on the extended understanding: energy demand is reduced more strongly than is efficient, and the saving targets lead to the extended use of other resources, resulting in higher total costs. The integrated system analysis in this study points out the interactions

  1. Study on highly efficient seismic data acquisition and processing methods based on sparsity constraint

    Science.gov (United States)

    Wang, H.; Chen, S.; Tao, C.; Qiu, L.

    2017-12-01

    High-density, high-fold and wide-azimuth seismic data acquisition methods are widely used to address increasingly sophisticated exploration targets, but the acquisition period grows longer and the acquisition cost higher. We carry out a study of highly efficient seismic data acquisition and processing methods based on sparse representation (compressed sensing) theory, and achieve some innovative results. The theoretical principles of highly efficient acquisition and processing are studied. We first present a sparse representation theory based on the wave equation. Then we study highly efficient seismic sampling methods and present an optimized piecewise-random sampling method based on sparsity prior information. Finally, a reconstruction strategy with a sparsity constraint is developed; a two-step recovery approach combining a sparsity-promoting method and the hyperbolic Radon transform is also put forward. These three aspects constitute the enhanced theory of highly efficient seismic data acquisition. The specific implementation strategies of highly efficient acquisition and processing are studied according to this theory. Firstly, we propose a method for designing highly efficient acquisition networks with the help of the optimized piecewise-random sampling method. Secondly, we propose two types of highly efficient seismic data acquisition methods based on (1) single sources and (2) blended (or simultaneous) sources. Thirdly, the reconstruction procedures corresponding to the above two types of highly efficient seismic data acquisition methods are proposed to obtain the seismic data on the regular acquisition network. The impact of blended shooting on the imaging result is discussed. In the end, we implement numerical tests based on the Marmousi model. The achieved results show: (1) the theoretical framework of highly efficient seismic data acquisition and processing
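
    As a toy version of the reconstruction step, the snippet below randomly discards about half the samples of a signal that is sparse in the Fourier domain and recovers it by iterative thresholding with a data-consistency projection (a POCS-style stand-in for the sparsity-promoting/Radon two-step approach described above; every parameter here is an assumption).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 512
t = np.arange(n)
# On-grid harmonics, so the signal is exactly sparse in the Fourier domain;
# this is a crude stand-in for a seismic record in a sparsifying transform.
full = np.sin(2 * np.pi * 16 * t / n) + 0.5 * np.sin(2 * np.pi * 57 * t / n)

mask = rng.random(n) < 0.5          # keep roughly half the samples at random
observed = full * mask

x = observed.copy()
thresh_frac = 0.2
for _ in range(50):
    X = np.fft.fft(x)
    X[np.abs(X) < thresh_frac * np.abs(X).max()] = 0.0   # keep the strong coefficients
    x = np.real(np.fft.ifft(X))
    x[mask] = full[mask]                                 # re-insert the known samples
    thresh_frac *= 0.9                                   # gradually relax the threshold

print("relative error:", np.linalg.norm(x - full) / np.linalg.norm(full))
```

    The same idea, with a better-suited sparsifying transform (for example the hyperbolic Radon transform mentioned above) and with optimized rather than purely random sampling, underlies the reconstruction of regular-grid data from the highly efficient acquisitions described in the record.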

  2. Competition Efficiency Analysis of Croatian Junior Wrestlers in European Championship

    Directory of Open Access Journals (Sweden)

    Kristijan Slacanac

    2017-06-01

    Full Text Available A Croatian junior wrestler won a bronze medal at the 2016 European Championship. Considering the potential of our wrestlers, there is an obvious need for technical and tactical analysis so that our junior and senior U23 wrestlers can achieve even better results. Match analyses were conducted with LongoMatch 0.20.1. Seven matches of Croatian wrestlers were analysed. Time parameters, score efficiency, technical efficiency and tactical structure were observed and analysed from the aspect of the attack and defence phases and of successful/unsuccessful techniques. This paper presents descriptive parameters, and competitive efficiency was calculated. The results show a greater number of positive scores in the standing position relative to the parterre position. The parameters of competitive efficiency (0.49 points per minute) show better attacking efficiency (1.32 points per minute) in relation to defence efficiency (0.83 points per minute). Croatian wrestlers achieve fewer points per minute than elite wrestlers, but significant progress in technical and tactical efficiency is visible over the past three years. According to the place of technique realization, Croatian wrestlers realized more techniques in the centre, while opponents realized techniques in the zone and while moving towards the zone. Further analysis of efficiency and individualisation of training will improve the efficiency of Croatian national team wrestlers.

  3. A new approach to correlate transport processes and optical efficiency in GaN-based LEDs

    International Nuclear Information System (INIS)

    Pavesi, M; Manfredi, M; Rossi, F; Salviati, G; Meneghini, M; Zanoni, E

    2009-01-01

    Carrier injection and non-radiative processes are determinants of the optical efficiency of InGaN/GaN LEDs. Among the transport mechanisms, tunnelling is crucial for device functioning, but other contributions can be decisive as the bias varies. It is not easy to identify the weights and roles of these terms by a simple current-voltage characterization, so careful investigation by means of complementary experimental techniques is needed. The correlation between luminescence and microscopic transport processes in InGaN/GaN LEDs has been investigated by means of a set of techniques: electroluminescence, cathodoluminescence, dc current-voltage measurements and thermal admittance spectroscopy. Green and blue LEDs, designed with a multi-quantum-well injector layer and an optically active single quantum well, have been tested. They showed distinctive current and temperature dependences of the optical efficiency, with better performance at room temperature observed for the green devices. This was discussed in terms of the carrier injection efficiency controlled by electrically active traps. The comparative analysis of the optical and electrical experimental data provides a methodological approach for correlating the emission properties with the carrier injection mechanisms and improving the functionality of a large number of quantum well heterostructures for lighting applications.

  4. Measuring efficiency in logistics

    Directory of Open Access Journals (Sweden)

    Milan Milovan Andrejić

    2013-06-01

    Full Text Available Dynamic market and environmental changes greatly affect the operation of logistics systems. Logistics systems have to realize their activities and processes in an efficient way. The main objective of this paper is to analyze different aspects of efficiency measurement in logistics and to propose appropriate measurement models. Measuring efficiency in logistics is a complex process that requires consideration of all subsystems, processes and activities, as well as the impact of various financial, operational, environmental, quality and other factors. The proposed models are based on the Data Envelopment Analysis method. They could help managers in decision-making and corrective-action processes. The tests and results of the models show the importance of input and output variable selection.

  5. Future energy-efficient and low-emissions glass melting processes

    NARCIS (Netherlands)

    Beerkens, R.G.C.; Limpt, J.A.C. van; Lankhorst, A.M.; Santen, P.J. van

    2012-01-01

    All over the world, there is an increasing drive to develop new technologies or concepts for industrial glass melting furnaces, with the main aim of increasing energy efficiency, stabilizing production and reducing emissions. The application of new process sensors, improved furnace design, intelligent

  6. Efficiency analysis of energy networks: An international survey of regulators

    International Nuclear Information System (INIS)

    Haney, Aoife Brophy; Pollitt, Michael G.

    2009-01-01

    Incentive regulation for networks has been an important part of the reform agenda in a number of countries. As part of this regulatory process, incentives are put in place to improve the cost efficiency of network companies by rewarding good performance relative to a pre-defined benchmark. The techniques used to establish benchmarks are central to the efficiency improvements that are ultimately achieved. Much experience has been gained internationally in the application of benchmarking techniques and we now have a solid understanding of the main indicators of best practice. What we are lacking is a more complete understanding of the factors that influence choice of methods by regulators. In this paper, we present the results of an international survey of energy regulators in 40 countries conducted electronically between June and October 2008. Regulators from European, Australasian and Latin American countries are represented in the survey. Our results show that benchmarking techniques are now widespread in the regulation of gas and electricity networks. Best practice, however, is limited to a small number of regulators. We conclude by summarising existing trends and offering some recommendations on overcoming barriers to best practice efficiency analysis.

  7. Assessing the technical efficiency of health posts in rural Guatemala: a data envelopment analysis.

    Science.gov (United States)

    Hernández, Alison R; San Sebastián, Miguel

    2014-01-01

    Strengthening health service delivery to the rural poor is an important means of redressing inequities. Meso-level managers can help enhance efficiency in the utilization of existing resources through the application of practical tools to analyze routinely collected data reflecting inputs and outputs. This study aimed to assess the efficiency and change in productivity of health posts over two years in a rural department of Guatemala. Data envelopment analysis was used to measure health posts' technical efficiency and productivity change for 2008 and 2009. Input/output data were collected from the regional health office of Alta Verapaz for 34 health posts from the 19 districts comprising the health region. Technical efficiency varied widely across health posts, with mean scores of 0.78 (SD=0.24) and 0.75 (SD=0.21) in 2008 and 2009, respectively. Overall, productivity increased by 4%, though 47% of health posts experienced a decline in productivity. Results were combined on a bivariate plot to identify health posts at the high and low extremes of efficiency, which should be followed up to determine how and why their production processes are operating differently. Assessing efficiency using the data that are available at the meso-level can serve as a first step in strengthening performance. Further work is required to support managers in the routine application of efficiency analysis and putting the results to use in guiding efforts to improve service delivery and increase utilization.

  8. Analysis of the Requirements Generation Process for the Logistics Analysis and Wargame Support Tool

    Science.gov (United States)

    2017-06-01

    Fragmentary record of a June 2017 thesis by Jonathan M. Swan on the requirements generation process for the Logistics Analysis and Wargame Support Tool. The recoverable excerpts mention impacts on everything from strategic logistics operations down to the energy demands at the company level, a look at the force structure of the (...), and requirement 34: "The system shall determine the efficiency of the logistics network with respect to an estimated cost of fuel used to deliver (...)".

  9. Efficiency of Polish metallurgical industry based on data envelopment analysis

    Directory of Open Access Journals (Sweden)

    J. Baran

    2016-04-01

    Full Text Available The main purpose of this paper is to compare the technical efficiency of 12 sectors manufacturing basic metals and metal products in Poland. This article presents the use of Data Envelopment Analysis models to determine the overall technical efficiency, pure technical efficiency and scale efficiency of metallurgical branches in Poland. The average technical efficiency of the metallurgical industry in Poland was quite high, and the analysis makes it possible to rank the sectors. Three branches were found to be fully efficient: manufacture of basic iron and steel and of ferroalloys; manufacture of basic precious and other non-ferrous metals; and manufacture of tubes, pipes, hollow profiles and related fittings, of steel. The results point out the reasons for the inefficiency and provide improvement directions for the inefficient sectors.

  10. Cost-effectiveness analysis and efficient use of the pharmaceutical budget: the key role of clinical pharmacologists.

    Science.gov (United States)

    Edlin, Richard; Round, Jeff; Hulme, Claire; McCabe, Christopher

    2010-09-01

    The purpose of this paper is to provide information about cost-effectiveness analysis and the roles of clinical pharmacologists generally in providing efficient health care. The paper highlights the potential consequences of 'off-label prescribing' and 'indication creep' behaviour given slower growth (or potential cuts) in the NHS budget. This paper highlights the key roles of clinical pharmacologists in delivering an efficient health care system when resources are allocated using cost-effectiveness analyses. It describes what cost-effectiveness analysis (CEA) is and how incremental cost-effectiveness ratios (ICERs) are used to identify efficient options. After outlining the theoretical framework within which using CEA can promote the efficient allocation of the health care budget, it considers the place of disinvestment within achieving efficient resource allocation. Clinical pharmacologists are argued to be critical to providing improved population health under CEA-based resource allocation processes because of their roles in implementation and disinvestment. Given that the challenges facing the United Kingdom National Health Service (NHS) are likely to increase, this paper sets out the stark choices facing clinical pharmacologists.
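
    For readers unfamiliar with the metric, an incremental cost-effectiveness ratio is simply the cost difference between two options divided by their difference in health effect. The sketch below uses invented figures purely to show the arithmetic; the threshold and currency are assumptions, not values from the paper.

      # Worked ICER example with made-up numbers: incremental cost divided by
      # incremental health effect (here QALYs), compared against a threshold.
      def icer(cost_new, cost_old, effect_new, effect_old):
          return (cost_new - cost_old) / (effect_new - effect_old)

      ratio = icer(cost_new=12000.0, cost_old=9000.0,   # hypothetical costs
                   effect_new=6.2, effect_old=5.9)      # hypothetical QALYs
      threshold = 20000.0                               # illustrative willingness to pay per QALY
      verdict = "cost-effective" if ratio <= threshold else "not cost-effective"
      print(f"ICER = {ratio:.0f} per QALY gained -> {verdict} at a threshold of {threshold:.0f}")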

  11. Detailed analysis of the effect of the turbine and compressor isentropic efficiency on the thermal and exergy efficiency of a Brayton cycle

    Directory of Open Access Journals (Sweden)

    Živić Marija

    2014-01-01

    Full Text Available Energy and exergy analysis of a Brayton cycle with an ideal gas is given. The irreversibility of the adiabatic processes in the turbine and compressor is taken into account through their isentropic efficiencies. The net work per cycle, the thermal efficiency and the two exergy efficiencies are expressed as functions of four dimensionless variables: the isentropic efficiencies of the turbine and compressor, the pressure ratio, and the temperature ratio. It is shown that the maximal values of the net work per cycle, the thermal efficiency and the exergy efficiency are achieved when the isentropic efficiencies and the temperature ratio are as high as possible, while the pressure ratios that maximize the net work per cycle, the thermal efficiency and the exergy efficiencies differ from one another. These pressure ratios increase with increasing temperature ratio and increasing isentropic efficiencies of the compressor and turbine. An increase of the turbine isentropic efficiency has a greater impact on the net work per cycle and the thermal efficiency of a Brayton cycle than the same increase of the compressor isentropic efficiency. Finally, two goal functions are proposed for thermodynamic optimization of a Brayton cycle for given values of the temperature ratio and the compressor and turbine isentropic efficiencies. The first maximizes the sum of the net work per cycle and the thermal efficiency, while the second maximizes the sum of the net work per cycle and the exergy efficiency. In both cases the optimal pressure ratio is closer to the pressure ratio that maximizes the net work per cycle.
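
    The relations described above are easy to explore numerically. The sketch below evaluates an air-standard Brayton cycle with given compressor and turbine isentropic efficiencies and reports which pressure ratios maximise the net work and the thermal efficiency; the gas properties and efficiency values are illustrative assumptions, not data from the paper.

      # Air-standard Brayton cycle: net specific work and thermal efficiency as
      # functions of pressure ratio rp, temperature ratio tau = T3/T1 and the
      # isentropic efficiencies of compressor and turbine (illustrative values).
      import numpy as np

      def brayton(rp, tau, eta_c, eta_t, T1=300.0, cp=1005.0, gamma=1.4):
          k = (gamma - 1.0) / gamma
          T3 = tau * T1
          w_comp = cp * T1 * (rp**k - 1.0) / eta_c     # compressor work input, J/kg
          w_turb = eta_t * cp * T3 * (1.0 - rp**(-k))  # turbine work output, J/kg
          T2 = T1 + w_comp / cp                        # compressor outlet temperature
          q_in = cp * (T3 - T2)                        # heat added, J/kg
          w_net = w_turb - w_comp
          return w_net, w_net / q_in

      ratios = np.arange(2, 31)
      results = [brayton(rp, tau=4.0, eta_c=0.85, eta_t=0.90) for rp in ratios]
      rp_work = ratios[int(np.argmax([w for w, _ in results]))]
      rp_eff = ratios[int(np.argmax([eta for _, eta in results]))]
      print(f"pressure ratio maximising net work: {rp_work}, "
            f"maximising thermal efficiency: {rp_eff}")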

  12. Thermodynamic analysis of the efficiency of high-temperature steam electrolysis system for hydrogen production

    Science.gov (United States)

    Mingyi, Liu; Bo, Yu; Jingming, Xu; Jing, Chen

    High-temperature steam electrolysis (HTSE), in principle the reverse process of a solid oxide fuel cell (SOFC), is a promising method for highly efficient large-scale hydrogen production. In our study, the overall efficiency of the HTSE system was calculated through electrochemical and thermodynamic analysis. A thermodynamic model of the efficiency of the HTSE system was established and the quantitative effects of three key parameters, electrical efficiency (η_el), electrolysis efficiency (η_es), and thermal efficiency (η_th), on the overall efficiency (η_overall) of the HTSE system were investigated. Results showed that the contributions of η_el, η_es, and η_th to the overall efficiency were about 70%, 22%, and 8%, respectively. As temperatures increased from 500 °C to 1000 °C, the effect of η_el on η_overall decreased gradually and the η_es effect remained almost constant, while the η_th effect increased gradually. The overall efficiency of the high-temperature gas-cooled reactor (HTGR) coupled with the HTSE system under different conditions was also calculated. With the increase of electrical, electrolysis, and thermal efficiency, the overall efficiencies were anticipated to increase from 33% to a maximum of 59% at 1000 °C, which is over two times higher than that of conventional alkaline water electrolysis.

  13. Techno-economic analysis of the coal-to-olefins process in comparison with the oil-to-olefins process

    International Nuclear Information System (INIS)

    Xiang, Dong; Qian, Yu; Man, Yi; Yang, Siyu

    2014-01-01

    Highlights: • Present the opportunities and challenges of coal-to-olefins (CTO) development. • Conduct a techno-economic analysis of CTO compared with oil-to-olefins (OTO). • Suggest approaches for improving the energy efficiency and economic performance of CTO. • Analyze effects of plant scale, feedstock price, and CO2 tax on CTO and OTO. - Abstract: Olefins are among the most important oil derivatives and are widely used in industry. To reduce the dependence of the olefins industry on oil, China is increasing the production of olefins from alternative energy resources, especially from coal. This study is concerned with the opportunities and obstacles of coal-to-olefins development, and focuses on an overall techno-economic analysis of a coal-to-olefins plant with a capacity of 0.7 Mt/a of olefins. A comparison is made with a 1.5 Mt/a oil-to-olefins plant based on three criteria: energy efficiency, capital investment, and product cost. It was found that the coal-based olefins process shows a prominent advantage in product cost because of the low price of its feedstock. However, it suffers from the limitations of higher capital investment, lower energy efficiency, and higher emissions. Production scale, raw material price, and carbon tax were varied for the two production routes, and the operating regions in which the coal-to-olefins process is competitive were identified.

  14. Multivariate statistical analysis of multi-step industrial processes

    DEFF Research Database (Denmark)

    Reinikainen, S.P.; Høskuldsson, Agnar

    2007-01-01

    Monitoring and quality control of industrial processes often produce information on how the data have been obtained. In batch processes, for instance, the process is carried out in stages; some process or control parameters are set at each stage. However, the obtained data might not be utilized ... efficiently, even if this information may reveal significant knowledge about process dynamics or ongoing phenomena. When studying the process data, it may be important to analyse the data in the light of the physical or time-wise development of each process step. In this paper, a unified approach to analyse ... multivariate multi-step processes, where results from each step are used to evaluate future results, is presented. The methods presented are based on Priority PLS Regression. The basic idea is to compute the weights in the regression analysis for given steps, but adjust all data by the resulting score vectors ...
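
    Priority PLS Regression is not part of common statistics libraries, so the sketch below falls back on ordinary PLS regression from scikit-learn, applied to synthetic two-step process data, simply to illustrate the kind of score-based, step-wise analysis the abstract refers to; it is not the authors' method.

      # Stand-in example: ordinary PLS regression (scikit-learn) on synthetic
      # two-step process data; Priority PLS itself is not reproduced here.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(0)
      X_step1 = rng.normal(size=(50, 6))            # settings recorded in step 1
      X_step2 = rng.normal(size=(50, 4))            # settings recorded in step 2
      y = (X_step1[:, 0] - 0.5 * X_step2[:, 1]      # hypothetical quality response
           + 0.1 * rng.normal(size=50)).reshape(-1, 1)

      X_all = np.hstack([X_step1, X_step2])
      pls = PLSRegression(n_components=2).fit(X_all, y)
      scores = pls.transform(X_all)                 # latent scores summarising both steps
      print("score matrix shape:", scores.shape)
      print("R^2 on the training data:", round(pls.score(X_all, y), 3))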

  15. Measuring efficiency of lean six sigma project implementation using data envelopment analysis at Nasa

    Directory of Open Access Journals (Sweden)

    David Meza

    2013-06-01

    Full Text Available Purpose: This study aims to review the implementation of the Lean Six Sigma project methodology in the Johnson Space Center (JSC) business environment of the National Aeronautics and Space Administration (NASA), with the objective of evaluating the performance of individual projects and developing recommendations for strategies to improve operational efficiencies based on Data Envelopment Analysis (DEA). Design/methodology/approach: In this study, the authors propose the Lean Six Sigma project performance evaluation model (LSS-PPEM), based on DEA, where Critical Success Factors (CSFs) and Total Team Hours serve as inputs while Process Sigma and Cost Avoidance are used as outputs. The CSFs are factors that critically affect the performance of LSS at JSC. Six of those are identified by the Black Belts through the Analytical Hierarchical Process, and their values are decided by project leaders and Green Belts through a survey. Eighteen LSS projects are evaluated, and their results are analyzed. Findings and Originality/value: Ultimately, four of the six CSFs are adopted for this study based upon Pearson correlation analysis; those four are project execution and follow-up of results; top management's commitment and participation; the use of data analysis with easily obtainable data; and attention given to both long- and short-term targets. Using data between the years 2009 and 2011, seven of the eighteen projects are found to be efficient. Benchmark analysis and slack analysis are conducted to provide further recommendations for JSC managers. Three of those seven efficient projects are most frequently used as efficient peers. Practical implications: Traditionally, DEA has been considered a data-driven approach. In this study, the authors incorporate the survey-based CSFs into the DEA framework. Since many organizations may have different CSFs, the framework presented in this study can be easily applied to other organizations. Originality/value: This study

  16. An efficient approach to the evaluation of mid-term dynamic processes in power systems

    Energy Technology Data Exchange (ETDEWEB)

    Zivanovic, R M [Pretoria Technikon (South Africa); Popovic, D P [Nikola Tesla Inst., Belgrade (Yugoslavia). Power System Dept.

    1993-01-01

    This paper presents some improvements in the methodology for analysing mid-term dynamic processes in power systems. These improvements are: an efficient application of the hierarchical clustering algorithm to adaptive identification of coherent generator groups and a significant reduction of the mathematical model, on the basis of monitoring the state of only one generator in one of the established coherent groups. This enables a flexible, simple and fast transformation from the full to the reduced model and vice versa, a significant acceleration of the simulation while keeping the desired accuracy and the automatic use in continual dynamic analysis. Verification of the above mentioned contributions was performed on examples of the dynamic analysis of New England and Yugoslav power systems. (author)
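
    The coherency identification step can be pictured with an ordinary hierarchical clustering of simulated rotor-angle swing curves. The sketch below uses synthetic trajectories and SciPy's generic clustering routines; it is only an illustration of the idea, not the adaptive identification scheme of the paper.

      # Group generators whose simulated rotor-angle swings look alike, using
      # hierarchical clustering on synthetic trajectories (illustration only).
      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster

      time = np.linspace(0.0, 5.0, 200)
      swings = np.vstack([np.sin(2 * np.pi * f * time + p)      # rotor-angle deviations
                          for f, p in [(0.8, 0.0), (0.8, 0.1),  # generators 1-2: one mode
                                       (1.5, 0.0), (1.5, 0.2),  # generators 3-4: another mode
                                       (0.8, 0.05)]])           # generator 5: close to 1-2

      Z = linkage(swings, method="average", metric="euclidean")
      groups = fcluster(Z, t=2, criterion="maxclust")           # ask for two coherent groups
      print("coherent group of each generator:", groups)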

  17. Design and analysis of nuclear processes with the APROS

    International Nuclear Information System (INIS)

    Haenninen, M.; Puska, E.K.; Nystroem, P.

    1987-01-01

    APROS (Advanced Process Simulator) is the product being developed in the process simulators project of Imatran Voima Co. and the Technical Research Centre of Finland. The aim is to design and construct an efficient and easy-to-use computer simulation system for process and automation system design, evaluation, analysis, testing and training purposes. At the halfway point of the project, a working system already exists with a large number of proven routines and models; however, considerable development is still foreseen before the project is finished. This article gives an overview of APROS in general and of its nuclear features in particular. The computational capabilities of the system are illustrated with one example. (orig.)

  18. Highly Efficient Reproducible Perovskite Solar Cells Prepared by Low-Temperature Processing

    Directory of Open Access Journals (Sweden)

    Hao Hu

    2016-04-01

    Full Text Available In this work, we describe the role of the different layers in perovskite solar cells to achieve reproducible, ~16% efficient perovskite solar cells. We used a planar device architecture with PEDOT:PSS on the bottom, followed by the perovskite layer and an evaporated C60 layer before deposition of the top electrode. No high temperature annealing step is needed, which also allows processing on flexible plastic substrates. Only the optimization of all of these layers leads to highly efficient and reproducible results. In this work, we describe the effects of different processing conditions, especially the influence of the C60 top layer on the device performance.

  19. A Spatiotemporal Indexing Approach for Efficient Processing of Big Array-Based Climate Data with MapReduce

    Science.gov (United States)

    Li, Zhenlong; Hu, Fei; Schnase, John L.; Duffy, Daniel Q.; Lee, Tsengdar; Bowen, Michael K.; Yang, Chaowei

    2016-01-01

    Climate observations and model simulations are producing vast amounts of array-based spatiotemporal data. Efficient processing of these data is essential for assessing global challenges such as climate change, natural disasters, and diseases. This is challenging not only because of the large data volume, but also because of the intrinsic high-dimensional nature of geoscience data. To tackle this challenge, we propose a spatiotemporal indexing approach to efficiently manage and process big climate data with MapReduce in a highly scalable environment. Using this approach, big climate data are directly stored in a Hadoop Distributed File System in its original, native file format. A spatiotemporal index is built to bridge the logical array-based data model and the physical data layout, which enables fast data retrieval when performing spatiotemporal queries. Based on the index, a data-partitioning algorithm is applied to enable MapReduce to achieve high data locality, as well as balancing the workload. The proposed indexing approach is evaluated using the National Aeronautics and Space Administration (NASA) Modern-Era Retrospective Analysis for Research and Applications (MERRA) climate reanalysis dataset. The experimental results show that the index can significantly accelerate querying and processing (a 10× speedup compared to the baseline test using the same computing cluster), while keeping the index-to-data ratio small (0.0328). The applicability of the indexing approach is demonstrated by a climate anomaly detection deployed on a NASA Hadoop cluster. This approach is also able to support efficient processing of general array-based spatiotemporal data in various geoscience domains without special configuration on a Hadoop cluster.
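
    The core idea of the index, mapping regular (time, latitude, longitude) chunks of an array to the byte ranges that hold them so that a query only touches the chunks it overlaps, can be sketched in a few lines. The toy example below is a hypothetical in-memory illustration, not the MERRA/Hadoop implementation described in the paper.

      # Toy spatiotemporal chunk index: record where each regular chunk of an
      # array-based dataset would live, then answer a box query by listing only
      # the overlapping chunks (hypothetical illustration).
      import itertools
      import numpy as np

      data = np.random.rand(24, 180, 360).astype(np.float32)  # hourly global grid (toy)
      chunk = (6, 90, 180)                                     # chunk shape along t, lat, lon

      index, offset = {}, 0
      for t0, y0, x0 in itertools.product(range(0, 24, chunk[0]),
                                          range(0, 180, chunk[1]),
                                          range(0, 360, chunk[2])):
          block = data[t0:t0 + chunk[0], y0:y0 + chunk[1], x0:x0 + chunk[2]]
          index[(t0, y0, x0)] = (offset, block.nbytes)         # byte range of the chunk
          offset += block.nbytes

      def chunks_for_query(t_rng, lat_rng, lon_rng):
          """Keys of the chunks overlapping a (time, lat, lon) query box."""
          return [k for k in index
                  if k[0] < t_rng[1] and k[0] + chunk[0] > t_rng[0]
                  and k[1] < lat_rng[1] and k[1] + chunk[1] > lat_rng[0]
                  and k[2] < lon_rng[1] and k[2] + chunk[2] > lon_rng[0]]

      print(chunks_for_query((0, 12), (0, 90), (170, 190)))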

  20. Increasing operational efficiency in a radioactive waste processing plant - 16100

    International Nuclear Information System (INIS)

    Turner, T.W.; Watson, S.N.

    2009-01-01

    The solid waste plant at Harwell in Oxfordshire, contains a purpose built facility to input, assay, visually inspect and sort remote handled intermediate level radioactive waste (RHILW). The facility includes a suite of remote handling cells, known as the head-end cells (HEC), which waste must pass through in order to be repackaged. Some newly created waste from decommissioning works on site passes through the cells, but the vast majority of waste for processing is historical waste, stored in below ground tube stores. Existing containers are not suitable for long term storage, many are already badly corroded, so the waste must be efficiently processed and repackaged in order to achieve passive safety. The Harwell site is currently being decommissioned and the land is being restored. The site is being progressively de-licensed, and redeveloped as a business park, which can only be completed when all the nuclear liabilities have been removed. The recovery and processing of old waste in the solid waste plant is a key project linked to de-licensing of a section of the site. Increasing the operational efficiency of the waste processing plant could shorten the time needed to clear the site and has the potential to save money for the Nuclear Decommissioning Authority (NDA). The waste processing facility was constructed in the mid 1990's, and commissioned in 1999. Since operations began, the yearly throughput of the cells has increased significantly every year. To achieve targets set out in the lifetime plan (LTP) for the site, throughput must continue to increase. The operations department has measured the overall equipment effectiveness (OEE) of the process for the last few years, and has used continuous improvement techniques to decrease the average cycle time. Philosophies from operational management practices such as 'lean' and 'kaizen' have been employed successfully to drive out losses and increase plant efficiency. This paper will describe how the solid waste plant
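
    The OEE figure mentioned above is conventionally the product of availability, performance and quality. The sketch below uses invented shift figures for a remote-handling cell purely to show the calculation; none of the numbers come from the Harwell plant.

      # Overall equipment effectiveness with made-up figures:
      # OEE = availability x performance x quality.
      planned_time = 480.0   # minutes in the shift
      downtime = 60.0        # breakdowns, waiting for waste transfers, ...
      ideal_cycle = 25.0     # minutes per container at the design rate
      containers = 14        # containers actually processed
      rework = 1             # containers that had to be repackaged again

      availability = (planned_time - downtime) / planned_time
      performance = (ideal_cycle * containers) / (planned_time - downtime)
      quality = (containers - rework) / containers
      print(f"OEE = {availability * performance * quality:.1%}")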

  1. Fit Gap Analysis – The Role of Business Process Reference Models

    Directory of Open Access Journals (Sweden)

    Dejan Pajk

    2013-12-01

    Full Text Available Enterprise resource planning (ERP systems support solutions for standard business processes such as financial, sales, procurement and warehouse. In order to improve the understandability and efficiency of their implementation, ERP vendors have introduced reference models that describe the processes and underlying structure of an ERP system. To select and successfully implement an ERP system, the capabilities of that system have to be compared with a company’s business needs. Based on a comparison, all of the fits and gaps must be identified and further analysed. This step usually forms part of ERP implementation methodologies and is called fit gap analysis. The paper theoretically overviews methods for applying reference models and describes fit gap analysis processes in detail. The paper’s first contribution is its presentation of a fit gap analysis using standard business process modelling notation. The second contribution is the demonstration of a process-based comparison approach between a supply chain process and an ERP system process reference model. In addition to its theoretical contributions, the results can also be practically applied to projects involving the selection and implementation of ERP systems.

  2. Efficient multitasking: parallel versus serial processing of multiple tasks.

    Science.gov (United States)

    Fischer, Rico; Plessow, Franziska

    2015-01-01

    In the context of performance optimizations in multitasking, a central debate has unfolded in multitasking research around whether cognitive processes related to different tasks proceed only sequentially (one at a time), or can operate in parallel (simultaneously). This review features a discussion of theoretical considerations and empirical evidence regarding parallel versus serial task processing in multitasking. In addition, we highlight how methodological differences and theoretical conceptions determine the extent to which parallel processing in multitasking can be detected, to guide their employment in future research. Parallel and serial processing of multiple tasks are not mutually exclusive. Therefore, questions focusing exclusively on either task-processing mode are too simplified. We review empirical evidence and demonstrate that shifting between more parallel and more serial task processing critically depends on the conditions under which multiple tasks are performed. We conclude that efficient multitasking is reflected by the ability of individuals to adjust multitasking performance to environmental demands by flexibly shifting between different processing strategies of multiple task-component scheduling.

  3. Energy efficiency solutions for driers used in the glass manufacturing and processing industry

    Directory of Open Access Journals (Sweden)

    Pătrașcu Roxana

    2017-07-01

    Full Text Available Energy conservation is relevant to increasing efficiency in energy projects, by saving energy, by its rational use or by switching to other forms of energy. The goal is to secure energy supply in the short and long term, while increasing efficiency. These goals are pursued by evaluating the companies' energy status, by monitoring and adjusting energy consumption and by organising coherent energy management. The manufacturing process is described, starting from the state and properties of the raw material and ending with the glass drying technological processes involved. Raw materials are selected considering technological and economic criteria. Manufacturing is treated as a two-stage process, consisting of the logistic, preparation aspect of unloading, transporting and storing materials, and the manufacturing process itself, in which the glass is sifted, shredded, deferrized and dried. Analysing the latter is of interest because it has a large impact on the final energy consumption values; hence, in order to improve the general performance, the driers' energy losses are to be reduced. Technological, energy and management solutions are proposed to address this problem. In the present paper, the emphasis is on the energy perspective of enhancing the overall efficiency. The case study stresses the effects of heat recovery on the efficiency of a glass drier. Audits are conducted, both before and after its implementation, to observe precisely the balance between the entering and exiting heat in the drying process. The reduction in fuel consumption and the improvements in thermal performance and fuel utilisation reveal the importance of using all available exiting heat from processes. Technical faults, either in operation or in management, lead to additional expenses. Correcting them is consistent with the energy conservation concept and with the Energy Efficiency Improvement Program for industrial facilities.
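
    The benefit of recovering exhaust heat from a drier can be estimated from a simple sensible-heat balance, which is the type of calculation an audit performs before and after the heat exchanger is installed. All values in the sketch below are assumptions for illustration, not figures from the case study.

      # Rough sensible-heat recovery estimate for a drier exhaust stream
      # (all numbers are assumed, for illustration only).
      m_dot = 1.2          # exhaust mass flow, kg/s
      cp = 1.05            # specific heat of the flue gas, kJ/(kg K)
      t_exhaust = 180.0    # exhaust temperature before recovery, deg C
      t_return = 80.0      # temperature after the heat exchanger, deg C
      lhv = 34000.0        # lower heating value of the fuel, kJ/m3
      eta_burner = 0.88    # assumed combustion efficiency

      q_recovered = m_dot * cp * (t_exhaust - t_return)   # kW of heat recovered
      fuel_saved = q_recovered / (lhv * eta_burner)       # m3/s of fuel avoided
      print(f"recovered heat: {q_recovered:.0f} kW, fuel saved: {fuel_saved * 3600:.1f} m3/h")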

  4. An efficient parallel stochastic simulation method for analysis of nonviral gene delivery systems

    KAUST Repository

    Kuwahara, Hiroyuki

    2011-01-01

    Gene therapy has great potential to become an effective treatment for a wide variety of diseases. One of the main challenges in making gene therapy practical in clinical settings is the development of efficient and safe mechanisms to deliver foreign DNA molecules into the nucleus of target cells. Several computational and experimental studies have shown that the design process of synthetic gene transfer vectors can be greatly enhanced by computational modeling and simulation. This paper proposes a novel, effective parallelization of the stochastic simulation algorithm (SSA) for pharmacokinetic models that characterize the rate-limiting, multi-step processes of intracellular gene delivery. While efficient parallelizations of the SSA are still an open problem in a general setting, the proposed parallel simulation method is able to substantially accelerate the next reaction selection scheme and the reaction update scheme in the SSA by exploiting and decomposing the structures of stochastic gene delivery models. This makes computationally intensive analyses such as parameter optimization and gene dosage control for specific cell types, gene vectors, and transgene expression stability substantially more practical than they would otherwise be with the standard SSA. Here, we translated the nonviral gene delivery model based on mass-action kinetics by Varga et al. [Molecular Therapy, 4(5), 2001] into a more realistic model that captures intracellular fluctuations based on stochastic chemical kinetics, and as a case study we applied our parallel simulation to this stochastic model. Our results show that our simulation method is able to increase the efficiency of statistical analysis by at least 50% in various settings. © 2011 ACM.
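
    As a point of reference for the algorithm being parallelized, the sketch below is a minimal serial Gillespie SSA for a toy birth-death gene-expression model; the paper's contribution, a parallel variant applied to a much larger delivery model, is not reproduced here.

      # Minimal serial Gillespie stochastic simulation algorithm for a toy
      # birth-death model (production and degradation of one species).
      import numpy as np

      def ssa(k_make=2.0, k_deg=0.1, x0=0, t_end=100.0, seed=1):
          rng = np.random.default_rng(seed)
          t, x = 0.0, x0
          times, counts = [t], [x]
          while t < t_end:
              a1, a2 = k_make, k_deg * x      # reaction propensities
              a0 = a1 + a2
              if a0 == 0.0:
                  break
              t += rng.exponential(1.0 / a0)  # waiting time to the next reaction
              x += 1 if rng.random() < a1 / a0 else -1
              times.append(t)
              counts.append(x)
          return np.array(times), np.array(counts)

      times, counts = ssa()
      print("mean copy number over the run:", counts.mean())  # ~ k_make/k_deg at steady state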

  5. DU Processing Efficiency and Reclamation: Plasma Arc Melting

    Energy Technology Data Exchange (ETDEWEB)

    Imhoff, Seth D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Aikin, Jr., Robert M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Swenson, Hunter [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Solis, Eunice Martinez [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-26

    The work described here corresponds to one piece of a larger effort to increase material usage efficiency during DU processing operations. In order to achieve this goal, multiple technologies and approaches are being tested. These technologies occupy a spectrum of technology readiness levels (TRLs). Plasma arc melting (PAM) is one of the technologies being investigated. PAM utilizes a high-temperature plasma to melt materials. Depending on process conditions, there are potential opportunities for recycling and material reclamation. When last routinely operational, the LANL research PAM showed extremely promising results for recycling and reclamation of DU and DU alloys. The current TRL is lower because the machine has sat idle for nearly two decades and has proved difficult to restart. This report describes the existing results, promising techniques, and the process of bringing this technology back to readiness at LANL.

  6. Thermally Activated Delayed Fluorescence in Polymers: A New Route toward Highly Efficient Solution Processable OLEDs.

    Science.gov (United States)

    Nikolaenko, Andrey E; Cass, Michael; Bourcet, Florence; Mohamad, David; Roberts, Matthew

    2015-11-25

    Efficient intermonomer thermally activated delayed fluorescence is demonstrated for the first time, opening a new route to achieving high-efficiency solution processable polymer light-emitting device materials. External quantum efficiency (EQE) of up to 10% is achieved in a simple fully solution-processed device structure, and routes for further EQE improvement identified. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. GPC Light Shaper for energy efficient laser materials processing

    DEFF Research Database (Denmark)

    Bañas, Andrew Rafael; Palima, Darwin; Villangca, Mark Jayson

    The biggest use of lasers is in materials processing. In manufacturing, lasers are used for cutting, drilling, marking and other machining processes. Similarly, lasers are important in microfabrication processes such as photolithography, direct laser writing, or ablation. Lasers are advantageous ... with steep, well defined edges that would further increase laser cutting precision or allow "single shot" laser engraving of arbitrary 2D profiles, as opposed to point scanning [3,4]. Instead of lossy approaches, GPC beam shaping is achieved with simplified, binary phase-only optics [5] that redistributes ... because they do not wear out, have no physical contact with the processed material, avoid heating or warping effects, and are generally more precise. Since lasers are easier to adapt to different optimized shapes, they can be even more precise and energy efficient for materials processing. The cost...

  8. Efficient analysis of three dimensional EUV mask induced imaging artifacts using the waveguide decomposition method

    Science.gov (United States)

    Shao, Feng; Evanschitzky, Peter; Fühner, Tim; Erdmann, Andreas

    2009-10-01

    This paper employs the Waveguide decomposition method as an efficient rigorous electromagnetic field (EMF) solver to investigate three dimensional mask-induced imaging artifacts in EUV lithography. The major mask diffraction induced imaging artifacts are first identified by applying the Zernike analysis of the mask nearfield spectrum of 2D lines/spaces. Three dimensional mask features like 22nm semidense/dense contacts/posts, isolated elbows and line-ends are then investigated in terms of lithographic results. After that, the 3D mask-induced imaging artifacts such as feature orientation dependent best focus shift, process window asymmetries, and other aberration-like phenomena are explored for the studied mask features. The simulation results can help lithographers to understand the reasons of EUV-specific imaging artifacts and to devise illumination and feature dependent strategies for their compensation in the optical proximity correction (OPC) for EUV masks. At last, an efficient approach using the Zernike analysis together with the Waveguide decomposition technique is proposed to characterize the impact of mask properties for the future OPC process.

  9. Energy-efficient neural information processing in individual neurons and neuronal networks.

    Science.gov (United States)

    Yu, Lianchun; Yu, Yuguo

    2017-11-01

    Brains are composed of networks of an enormous number of neurons interconnected with synapses. Neural information is carried by the electrical signals within neurons and the chemical signals among neurons. Generating these electrical and chemical signals is metabolically expensive. The fundamental issue raised here is whether brains have evolved efficient ways of developing an energy-efficient neural code from the molecular level to the circuit level. Here, we summarize the factors and biophysical mechanisms that could contribute to the energy-efficient neural code for processing input signals. The factors range from ion channel kinetics, body temperature, axonal propagation of action potentials, low-probability release of synaptic neurotransmitters, optimal input and noise, the size of neurons and neuronal clusters, excitation/inhibition balance, coding strategy, cortical wiring, and the organization of functional connectivity. Both experimental and computational evidence suggests that neural systems may use these factors to maximize the efficiency of energy consumption in processing neural signals. Studies indicate that efficient energy utilization may be universal in neuronal systems as an evolutionary consequence of the pressure of limited energy. As a result, neuronal connections may be wired in a highly economical manner to lower energy costs and space. Individual neurons within a network may encode independent stimulus components to allow a minimal number of neurons to represent whole stimulus characteristics efficiently. This basic principle may fundamentally change our view of how billions of neurons organize themselves into complex circuits to operate and generate the most powerful intelligent cognition in nature. © 2017 Wiley Periodicals, Inc.

  10. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, Joule-Thompson Valve, Turbo expander and Compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for getting the maximum liquefaction of the plant considering different constraints of other parameters. The analysis result so obtained gives clear idea in deciding various parameter values before implementation of the actual plant in the field. It also gives an idea about the productivity and profitability of the given configuration plant which leads to the design of an efficient productive plant.

  12. Analysis of hazardous substances released during CFRP laser processing

    Science.gov (United States)

    Hustedt, Michael; Walter, Juergen; Bluemel, Sven; Jaeschke, Peter; Kaierle, Stefan

    2017-02-01

    Due to their outstanding mechanical properties, in particular their high specific strength parallel to the carbon fibers, carbon fiber reinforced plastics (CFRP) have a high potential regarding resource-efficient lightweight construction. Consequently, these composite materials are increasingly finding application in important industrial branches such as aircraft, automotive and wind energy industry. However, the processing of these materials is highly demanding. On the one hand, mechanical processing methods such as milling or drilling are sometimes rather slow, and they are connected with notable tool wear. On the other hand, thermal processing methods are critical as the two components matrix and reinforcement have widely differing thermophysical properties, possibly leading to damages of the composite structure in terms of pores or delamination. An emerging innovative method for processing of CFRP materials is the laser technology. As principally thermal method, laser processing is connected with the release of potentially hazardous, gaseous and particulate substances. Detailed knowledge of these process emissions is the basis to ensure the protection of man and the environment, according to the existing legal regulations. This knowledge will help to realize adequate protective measures and thus strengthen the development of CFRP laser processing. In this work, selected measurement methods and results of the analysis of the exhaust air and the air at the workplace during different laser processes with CFRP materials are presented. The investigations have been performed in the course of different cooperative projects, funded by the German Federal Ministry of Education and Research (BMBF) in the course of the funding initiative "Photonic Processes and Tools for Resource-Efficient Lightweight Structures".

  13. Investigation of Processing, Microstructures and Efficiencies of Polycrystalline CdTe Photovoltaic Films and Devices

    Science.gov (United States)

    Munshi, Amit Harenkumar

    CdTe based photovoltaics have been commercialized at multiple GWs/year level. The performance of CdTe thin film photovoltaic devices is sensitive to process conditions. Variations in deposition temperatures as well as other treatment parameters have a significant impact on film microstructure and device performance. In this work, extensive investigations are carried out using advanced microstructural characterization techniques in an attempt to relate microstructural changes due to varying deposition parameters and their effects on device performance for cadmium telluride based photovoltaic cells deposited using close space sublimation (CSS). The goal of this investigation is to apply advanced material characterization techniques to aid process development for higher efficiency CdTe based photovoltaic devices. Several techniques have been used to observe the morphological changes to the microstructure along with materials and crystallographic changes as a function of deposition temperature and treatment times. Traditional device structures as well as advanced structures with electron reflector and films deposited on Mg1-xZnxO instead of conventional CdS window layer are investigated. These techniques include Scanning Electron Microscopy (SEM) with Electron Back Scattered Diffraction (EBSD) and Energy dispersive X-ray spectroscopy (EDS) to study grain structure and High Resolution Transmission Electron Microscopy (TEM) with electron diffraction and EDS. These investigations have provided insights into the mechanisms that lead to change in film structure and device performance with change in deposition conditions. Energy dispersive X-ray spectroscopy (EDS) is used for chemical mapping of the films as well as to understand interlayer material diffusion between subsequent layers. Electrical performance of these devices has been studied using current density vs voltage plots. Devices with efficiency over 18% have been fabricated on low cost commercial glass substrates

  14. Kinetic analysis of the effects of target structure on siRNA efficiency

    Science.gov (United States)

    Chen, Jiawen; Zhang, Wenbing

    2012-12-01

    RNAi efficiency for target cleavage and protein expression is related to the target structure. Considering the RNA-induced silencing complex (RISC) as a multiple-turnover enzyme, we investigated the effect of target mRNA structure on siRNA efficiency with kinetic analysis. A 4-step model was used to study the target cleavage kinetic process: hybridization nucleation at an accessible target site, RISC-mRNA hybrid elongation along with melting of the mRNA target structure, target cleavage, and enzyme reactivation. In this model, the terms accounting for target accessibility, stability, and the seed and nucleation-site effects are all included. The results are in good agreement with experiments that report differing conclusions about the effect of target structure on siRNA efficiency. They show that siRNA efficiency is influenced by the combined factors of target accessibility, stability, and seed effects. To study off-target effects, a simple model of one siRNA binding to two mRNA targets was designed. With this model, the possibility of diminishing off-target effects through the siRNA concentration was discussed.
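
    A generic mass-action version of such a multi-step turnover scheme (nucleation, elongation, cleavage, enzyme recycling) can be integrated numerically. The rate constants in the sketch below are placeholders chosen for illustration, not the parameter values of the paper.

      # Generic mass-action sketch of a multi-step RISC turnover scheme with
      # placeholder rate constants (illustration only).
      from scipy.integrate import solve_ivp

      k_nuc, k_el, k_cleave = 0.05, 0.5, 0.2          # 1/s, hypothetical

      def rhs(t, y):
          risc, mrna, c1, c2, cleaved = y
          v1 = k_nuc * risc * mrna                    # nucleation at an accessible site
          v2 = k_el * c1                              # hybrid elongation / structure melting
          v3 = k_cleave * c2                          # cleavage and enzyme release
          return [-v1 + v3, -v1, v1 - v2, v2 - v3, v3]

      sol = solve_ivp(rhs, (0.0, 600.0), [0.01, 1.0, 0.0, 0.0, 0.0], max_step=1.0)
      print(f"fraction of target cleaved after 10 min: {sol.y[4, -1]:.2f}")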

  15. Modeling and optimization of processes for clean and efficient pulverized coal combustion in utility boilers

    Directory of Open Access Journals (Sweden)

    Belošević Srđan V.

    2016-01-01

    Full Text Available Pulverized coal-fired power plants should provide higher efficiency of energy conversion, flexibility in terms of boiler loads and fuel characteristics and emission reduction of pollutants like nitrogen oxides. Modification of combustion process is a cost-effective technology for NOx control. For optimization of complex processes, such as turbulent reactive flow in coal-fired furnaces, mathematical modeling is regularly used. The NOx emission reduction by combustion modifications in the 350 MWe Kostolac B boiler furnace, tangentially fired by pulverized Serbian lignite, is investigated in the paper. Numerical experiments were done by an in-house developed three-dimensional differential comprehensive combustion code, with fuel- and thermal-NO formation/destruction reactions model. The code was developed to be easily used by engineering staff for process analysis in boiler units. A broad range of operating conditions was examined, such as fuel and preheated air distribution over the burners and tiers, operation mode of the burners, grinding fineness and quality of coal, boiler loads, cold air ingress, recirculation of flue gases, water-walls ash deposition and combined effect of different parameters. The predictions show that the NOx emission reduction of up to 30% can be achieved by a proper combustion organization in the case-study furnace, with the flame position control. Impact of combustion modifications on the boiler operation was evaluated by the boiler thermal calculations suggesting that the facility was to be controlled within narrow limits of operation parameters. Such a complex approach to pollutants control enables evaluating alternative solutions to achieve efficient and low emission operation of utility boiler units. [Projekat Ministarstva nauke Republike Srbije, br. TR-33018: Increase in energy and ecology efficiency of processes in pulverized coal-fired furnace and optimization of utility steam boiler air preheater by using in

  16. Analysis the Efficiency and Productivity of Indonesian Pharmaceutical Public Companies Using Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Dewi Hanggraeni

    2014-08-01

    Full Text Available As one of the biggest medicine markets in South East Asia, the pharmaceutical industry in Indonesia has a huge potential market. However, the majority of the supply of raw materials is imported. Besides, regulations of the Health Ministry and the Trade Ministry still hamper most players in the Indonesian pharmaceutical industry. Therefore, this study used Data Envelopment Analysis (DEA) models to analyze efficiency and productivity change in the Indonesian pharmaceutical industry between 2006 and 2011 for companies listed on the Indonesia Stock Exchange, supported by applying efficiency financial ratios. This study finds that the decision on the most relatively efficient company differs when using DEA compared to efficiency financial ratios, yet DEA provides a better measurement of efficiency. This is demonstrated by one of the state-owned enterprises, which is evaluated as underperforming by the financial ratio analysis but, unexpectedly, is efficient using the DEA approach. This study has also proposed and tested a hypothesis on average efficiency to check whether the domestic and foreign pharmaceutical companies differ in their efficiency, but the result implies that there is no statistically significant difference among them. This study indicates that firms with a dominant contribution from selling over-the-counter medicines are more efficient than those selling ethical medicines. Lastly, the technological change contribution has more influence on productivity change than pure technical efficiency change in Indonesian pharmaceutical companies.

  17. European regional efficiency and geographical externalities: a spatial nonparametric frontier analysis

    Science.gov (United States)

    Ramajo, Julián; Cordero, José Manuel; Márquez, Miguel Ángel

    2017-10-01

    This paper analyses region-level technical efficiency in nine European countries over the 1995-2007 period. We propose the application of a nonparametric conditional frontier approach to account for the presence of heterogeneous conditions in the form of geographical externalities. Such environmental factors are beyond the control of regional authorities, but may affect the production function. Therefore, they need to be considered in the frontier estimation. Specifically, a spatial autoregressive term is included as an external conditioning factor in a robust order- m model. Thus we can test the hypothesis of non-separability (the external factor impacts both the input-output space and the distribution of efficiencies), demonstrating the existence of significant global interregional spillovers into the production process. Our findings show that geographical externalities affect both the frontier level and the probability of being more or less efficient. Specifically, the results support the fact that the spatial lag variable has an inverted U-shaped non-linear impact on the performance of regions. This finding can be interpreted as a differential effect of interregional spillovers depending on the size of the neighboring economies: positive externalities for small values, possibly related to agglomeration economies, and negative externalities for high values, indicating the possibility of production congestion. Additionally, evidence of the existence of a strong geographic pattern of European regional efficiency is reported and the levels of technical efficiency are acknowledged to have converged during the period under analysis.

  18. Energy use efficiency in the Indian manufacturing sector: An interstate analysis

    International Nuclear Information System (INIS)

    Mukherjee, Kankana

    2008-01-01

    This paper approaches the measurement of energy efficiency from a production theoretic framework and uses Data Envelopment Analysis to measure energy efficiency in the Indian manufacturing sector. Using data from the Annual Survey of Industries for the years 1998-99 through 2003-04, the study compares the energy efficiency in manufacturing across states, based on several models. The results show considerable variation in energy efficiency across states. Comparing the results across our models, we find that the relative pricing of energy does not provide the appropriate incentives for energy conservation. A second-stage regression analysis reveals that states with a larger share of manufacturing output in energy-intensive industries have lower energy efficiency. Also, higher quality labor force associates with higher energy efficiency. Finally, the power sector reforms have not yet had any significant impact on achieving energy efficiency

  19. Clinical process analysis and activity-based costing at a heart center.

    Science.gov (United States)

    Ridderstolpe, Lisa; Johansson, Andreas; Skau, Tommy; Rutberg, Hans; Ahlfeldt, Hans

    2002-08-01

    Cost studies, productivity, efficiency, and quality of care measures, the links between resources and patient outcomes, are fundamental issues for hospital management today. This paper describes the implementation of a model for process analysis and activity-based costing (ABC)/management at a Heart Center in Sweden as a tool for administrative cost information, strategic decision-making, quality improvement, and cost reduction. A commercial software package (QPR) containing two interrelated parts, "ProcessGuide and CostControl," was used. All processes at the Heart Center were mapped and graphically outlined. Processes and activities such as health care procedures, research, and education were identified together with their causal relationship to costs and products/services. The construction of the ABC model in CostControl was time-consuming. However, after the ABC/management system was created, it opened the way for new possibilities including process and activity analysis, simulation, and price calculations. Cost analysis showed large variations in the cost obtained for individual patients undergoing coronary artery bypass grafting (CABG) surgery. We conclude that a process-based costing system is applicable and has the potential to be useful in hospital management.
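
    The mechanics of an activity-based costing allocation are simple: each activity pool's cost divided by its driver volume gives a rate, and a patient pathway is costed by the activities it consumes. The sketch below uses invented figures and is not the QPR ProcessGuide/CostControl model used at the Heart Center.

      # Tiny activity-based costing allocation with invented figures.
      pools = {"operating_theatre": (900000.0, 1500.0),  # (annual cost, driver hours)
               "intensive_care":    (600000.0, 4000.0),
               "ward_nursing":      (450000.0, 9000.0)}

      patient_usage = {"operating_theatre": 4.5,         # hours consumed by one CABG pathway
                       "intensive_care": 20.0,
                       "ward_nursing": 96.0}

      rates = {a: cost / volume for a, (cost, volume) in pools.items()}
      patient_cost = sum(rates[a] * patient_usage[a] for a in patient_usage)
      print({a: round(r, 1) for a, r in rates.items()}, f"pathway cost = {patient_cost:.0f}")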

  20. Global sensitivity analysis of Alkali-Surfactant-Polymer enhanced oil recovery processes

    Energy Technology Data Exchange (ETDEWEB)

    Carrero, Enrique; Queipo, Nestor V.; Pintos, Salvador; Zerpa, Luis E. [Applied Computing Institute, Faculty of Engineering, University of Zulia, Zulia (Venezuela)

    2007-08-15

    After conventional waterflooding processes the residual oil in the reservoir remains as a discontinuous phase in the form of oil drops trapped by capillary forces and is likely to be around 70% of the original oil in place (OOIP). The EOR method so-called Alkaline-Surfactant-Polymer (ASP) flooding has been proved to be effective in reducing the oil residual saturation in laboratory experiments and field projects through reduction of interfacial tension and mobility ratio between oil and water phases. A critical step for the optimal design and control of ASP recovery processes is to find the relative contributions of design variables such as, slug size and chemical concentrations, in the variability of given performance measures (e.g., net present value, cumulative oil recovery), considering a heterogeneous and multiphase petroleum reservoir (sensitivity analysis). Previously reported works using reservoir numerical simulation have been limited to local sensitivity analyses because a global sensitivity analysis may require hundreds or even thousands of computationally expensive evaluations (field scale numerical simulations). To overcome this issue, a surrogate-based approach is suggested. Surrogate-based analysis/optimization makes reference to the idea of constructing an alternative fast model (surrogate) from numerical simulation data and using it for analysis/optimization purposes. This paper presents an efficient global sensitivity approach based on Sobol's method and multiple surrogates (i.e., Polynomial Regression, Kriging, Radial Base Functions and a Weighed Adaptive Model), with the multiple surrogates used to address the uncertainty in the analysis derived from plausible alternative surrogate-modeling schemes. The proposed approach was evaluated in the context of the global sensitivity analysis of a field scale Alkali-Surfactant-Polymer flooding process. The design variables and the performance measure in the ASP process were selected as slug size
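
    First-order Sobol indices of the kind used in the study can be estimated with a pick-freeze scheme. The sketch below applies it to a cheap stand-in function with three inputs; in the study the expensive reservoir simulations are replaced by surrogate models, which is not reproduced here.

      # Pick-freeze estimator of first-order Sobol indices on a cheap stand-in
      # function (the real application would call a simulator or surrogate).
      import numpy as np

      def model(x):                         # hypothetical response of three design variables
          return x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 0] * x[:, 2]

      rng = np.random.default_rng(42)
      n, d = 20000, 3
      A = rng.uniform(0.0, 1.0, size=(n, d))
      B = rng.uniform(0.0, 1.0, size=(n, d))
      fA = model(A)
      var_y = fA.var()

      for i in range(d):
          C = B.copy()
          C[:, i] = A[:, i]                 # freeze variable i at the A sample
          fC = model(C)
          S_i = (np.mean(fA * fC) - fA.mean() * fC.mean()) / var_y
          print(f"first-order Sobol index of x{i}: {S_i:.2f}")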

  1. Efficient Approach for Improving the Performance of Nonhalogenated Green Solvent-Processed Polymer Solar Cells via Ternary-Blend Strategy.

    Science.gov (United States)

    Kranthiraja, Kakaraparthi; Aryal, Um Kanta; Sree, Vijaya Gopalan; Gunasekar, Kumarasamy; Lee, Changyeon; Kim, Minseok; Kim, Bumjoon J; Song, Myungkwan; Jin, Sung-Ho

    2018-04-10

    The ternary-blend approach has the potential to enhance the power conversion efficiencies (PCEs) of polymer solar cells (PSCs) by providing complementary absorption and efficient charge generation. Unfortunately, most PSCs are processed with toxic halogenated solvents, which are harmful to human health and the environment. Herein, we report the addition of a nonfullerene electron acceptor 3,9-bis(2-methylene-(3-(1,1-dicyanomethylene)-indanone))-5,5,11,11-tetrakis(4-hexylphenyl)-dithieno[2,3-d:2',3'-d']-s-indaceno[1,2-b:5,6-b']dithiophene (ITIC) to a binary blend (poly[4,8-bis(2-(4-(2-ethylhexyloxy)3-fluorophenyl)-5-thienyl)benzo[1,2-b:4,5-b']dithiophene-alt-1,3-bis(4-octylthien-2-yl)-5-(2-ethylhexyl)thieno[3,4-c]pyrrole-4,6-dione] (P1):[6,6]-phenyl-C71-butyric acid methyl ester (PC71BM), PCE = 8.07%) to produce an efficient nonhalogenated green solvent-processed ternary PSC system with a high PCE of 10.11%. The estimated wetting coefficient value (0.086) for the ternary blend suggests that ITIC could be located at the P1:PC71BM interface, resulting in efficient charge generation and charge transport. In addition, the improved current density, sustained open-circuit voltage and PCE of the optimized ternary PSCs were highly correlated with their better external quantum efficiency response and flat-band potential value obtained from the Mott-Schottky analysis. In addition, the ternary PSCs also showed excellent ambient stability over 720 h. Therefore, our results demonstrate the combination of fullerene and nonfullerene acceptors in ternary blend as an efficient approach to improve the performance of eco-friendly solvent-processed PSCs with long-term stability.

  2. Lignocellulosic ethanol: Technology design and its impact on process efficiency.

    Science.gov (United States)

    Paulova, Leona; Patakova, Petra; Branska, Barbora; Rychtera, Mojmir; Melzoch, Karel

    2015-11-01

    This review provides current information on the production of ethanol from lignocellulosic biomass, with the main focus on relationships between process design and efficiency, expressed as ethanol concentration, yield and productivity. In spite of unquestionable advantages of lignocellulosic biomass as a feedstock for ethanol production (availability, price, non-competitiveness with food, waste material), many technological bottlenecks hinder its wide industrial application and competitiveness with 1st generation ethanol production. Among the main technological challenges are the recalcitrant structure of the material, and thus the need for extensive pretreatment (usually physico-chemical followed by enzymatic hydrolysis) to yield fermentable sugars, and a relatively low concentration of monosaccharides in the medium that hinders the achievement of ethanol concentrations comparable with those obtained using 1st generation feedstocks (e.g., corn or molasses). The presence of both pentose and hexose sugars in the fermentation broth, the price of cellulolytic enzymes, and the presence of toxic compounds that can inhibit cellulolytic enzymes and microbial producers of ethanol are major issues. In this review, different process configurations of the main technological steps (enzymatic hydrolysis, fermentation of hexose and/or pentose sugars) are discussed and their efficiencies are compared. The main features, benefits and drawbacks of simultaneous saccharification and fermentation (SSF), simultaneous saccharification and fermentation with delayed inoculation (dSSF), consolidated bioprocesses (CBP) combining production of cellulolytic enzymes, hydrolysis of biomass and fermentation into one step, together with an approach combining utilization of both pentose and hexose sugars are discussed and compared with separate hydrolysis and fermentation (SHF) processes. The impact of individual technological steps on final process efficiency is emphasized and the potential for use

  3. Time-Efficiency Analysis Comparing Digital and Conventional Workflows for Implant Crowns: A Prospective Clinical Crossover Trial.

    Science.gov (United States)

    Joda, Tim; Brägger, Urs

    2015-01-01

    To compare time-efficiency in the production of implant crowns using a digital workflow versus the conventional pathway. This prospective clinical study used a crossover design that included 20 study participants receiving single-tooth replacements in posterior sites. Each patient received a customized titanium abutment plus a computer-aided design/computer-assisted manufacture (CAD/CAM) zirconia suprastructure (for those in the test group, using a digital workflow) and a standardized titanium abutment plus a porcelain-fused-to-metal crown (for those in the control group, using a conventional pathway). The start of the implant prosthetic treatment was established as the baseline. Time-efficiency analysis was defined as the primary outcome, and was measured for every single clinical and laboratory work step in minutes. Statistical analysis was performed with the Wilcoxon rank sum test. All crowns could be provided within two clinical appointments, independent of the manufacturing process. The mean total production time, as the sum of clinical plus laboratory work steps, was significantly different. The mean ± standard deviation (SD) time was 185.4 ± 17.9 minutes for the digital workflow process and 223.0 ± 26.2 minutes for the conventional pathway (P = .0001). Therefore, digital processing for overall treatment was 16% faster. Detailed analysis for the clinical treatment revealed a significantly reduced mean ± SD chair time of 27.3 ± 3.4 minutes for the test group compared with 33.2 ± 4.9 minutes for the control group (P = .0001). Similar results were found for the mean laboratory work time, with a significant decrease of 158.1 ± 17.2 minutes for the test group vs 189.8 ± 25.3 minutes for the control group (P = .0001). Only a few studies have investigated efficiency parameters of digital workflows compared with conventional pathways in implant dental medicine. This investigation shows that the digital workflow seems to be more time-efficient than the
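
    A minimal sketch of the kind of two-sample comparison reported above, using SciPy's Wilcoxon rank-sum test on synthetic timing data (the means and standard deviations below only mimic the magnitudes quoted in the abstract and are not the study data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic chair-time samples (minutes) standing in for the two workflows;
# means/SDs are chosen only to mimic the order of magnitude in the abstract.
digital      = rng.normal(loc=27.3, scale=3.4, size=20)
conventional = rng.normal(loc=33.2, scale=4.9, size=20)

# Wilcoxon rank-sum test (equivalent to the Mann-Whitney U test) for two
# independent samples of work-step times.
stat, p_value = stats.ranksums(digital, conventional)
print(f"rank-sum statistic = {stat:.2f}, p = {p_value:.4g}")
```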

  4. Analysis of combustion efficiency in a pelletizing furnace

    Directory of Open Access Journals (Sweden)

    Rafael Simões Vieira de Moura

    The objective of this research is to assess how much improvement in the combustion reaction efficiency can reduce fuel consumption while maintaining the same thermal energy rate provided by the reaction in a pelletizing furnace. The furnace for pelletizing iron ore is a complex thermal machine in terms of energy balance. It involves recirculation of fan gases and constant variations in the process, and the variation of a single process variable can trigger numerous changes in operating conditions. This study demonstrated how the main variables related to combustion in the burning zone influence consumption of fuel (natural gas) in the furnace of the Usina de Pelotização de Fábrica (owned by VALE S/A), without changing process conditions that affect production quality. Variables were analyzed regarding the velocity and pressure of the fuel in the burners, the temperature of the combustion air and reactant gases, the conversion rate and the stoichiometric air/fuel ratio of the reaction. For the analysis, actual data from the furnace in operation were used, and the software Gaseq® was used for the simulation of the chemical reactions. The study showed that adjusting the combustion reaction stoichiometry provides a reduction of 9.25% in fuel consumption, representing savings of US$ 2.6 million per year for the company.
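
    As a worked illustration of the stoichiometric air/fuel ratio the study adjusts, the sketch below approximates natural gas as pure methane; the furnace's actual gas composition and operating ratios are not given in the abstract, so the numbers are assumptions:

```python
# Stoichiometric air requirement for natural gas approximated as pure methane:
#   CH4 + 2 O2 -> CO2 + 2 H2O
O2_PER_FUEL  = 2.0           # mol O2 per mol CH4
O2_IN_AIR    = 0.21          # mole fraction of O2 in dry air
M_AIR, M_CH4 = 28.97, 16.04  # g/mol

air_per_fuel_vol  = O2_PER_FUEL / O2_IN_AIR            # ~9.5 m3 air per m3 CH4
air_per_fuel_mass = air_per_fuel_vol * M_AIR / M_CH4   # ~17.2 kg air per kg CH4

def excess_air(actual_air_to_fuel_vol):
    """Percent excess air for a measured volumetric air/fuel ratio."""
    return 100.0 * (actual_air_to_fuel_vol / air_per_fuel_vol - 1.0)

print(f"stoichiometric A/F: {air_per_fuel_vol:.2f} (vol), {air_per_fuel_mass:.1f} (mass)")
print(f"excess air at A/F = 11.0: {excess_air(11.0):.1f} %")
```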

  5. Process kinetics and digestion efficiency of anaerobic batch fermentation of brewer's spent grains (BSG)

    Energy Technology Data Exchange (ETDEWEB)

    Ezeonu, F.C.; Okaka, A.N.C. [Nnamdi Azikiwe University, Awka (Nigeria). Dept. of Applied Biochemistry

    1996-12-31

    The process kinetics of optimized anaerobic batch digestion of brewer's spent grains (BSG) reveal that biomethanation is essentially a first-order reaction interrupted intermittently by mixed-order reactions. An apparent cellulose degradation efficiency of approximately 60% and a lignin degradation efficiency of about 40% were observed in the optimized process. Using the Chen and Hashimoto model, the operational efficiency of the digester was determined to be 26%. (author)

  6. Infrared pre-drying and dry-dehulling of walnuts for improved processing efficiency and product quality

    Science.gov (United States)

    The walnut industry is faced with an urgent need to improve post-harvest processing efficiency, particularly drying and dehulling operations. This research investigated the feasibility of dry-dehulling and infrared (IR) pre-drying of walnuts for improved processing efficiency and dried product quali...

  7. Innovation and efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Haustein, H D; Maier, H

    1979-01-01

    Innovation, the process of creation, development, use, and diffusion of a new product or process for new or already-identified needs, has become a topic of concern for both developed and developing countries. Although the causes and motivations for the concern differ widely from country to country, the development of effective innovation policies is a universal problem. The International Institute for Applied Systems Analysis (IIASA) has been concerned with this problem for several years. The main purpose of an innovation is to improve the efficiency of the production unit that adopts the innovation, in comparison with the efficiency of the entire production system. To grasp the nature of the innovation process, its impact on the economic performance of the country, and to identify the appropriate managerial actions to shape and stimulate the innovation process, five different stages through which the innovation process usually runs are outlined. The IIASA has been concerned with supplanting the former approach of spontaneous innovation with a systems analysis approach to help implement new forms of social, innovative learning to be beneficial to mankind. 7 references, 2 figures, 1 table. (SAC)

  8. Applying DEA sensitivity analysis to efficiency measurement of Vietnamese universities

    Directory of Open Access Journals (Sweden)

    Thi Thanh Huyen Nguyen

    2015-11-01

    The primary purpose of this study is to measure the technical efficiency of 30 doctorate-granting universities (universities or higher education institutes with PhD training programs) in Vietnam, applying the sensitivity analysis of data envelopment analysis (DEA). The study uses eight sets of input-output specifications obtained by replacement as well as aggregation/disaggregation of variables. The measurement results allow us to examine the sensitivity of the efficiency of these universities to the sets of variables. The findings also show the impact of the variables on their efficiency and its “sustainability”.
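
    A minimal sketch of the input-oriented CCR envelopment model that underlies such DEA scores, solved with scipy.optimize.linprog on toy data (the inputs and outputs below are illustrative, not the study's variable sets):

```python
import numpy as np
from scipy.optimize import linprog

def ccr_input_efficiency(X, Y, o):
    """Input-oriented CCR (constant returns to scale) efficiency of DMU o.

    X: (n_dmu, n_inputs) inputs, Y: (n_dmu, n_outputs) outputs.
    Decision variables: [theta, lambda_1, ..., lambda_n].
    """
    n, m = X.shape
    _, s = Y.shape
    c = np.zeros(n + 1)
    c[0] = 1.0                                   # minimise theta
    # inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    b_in = np.zeros(m)
    # outputs: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    b_out = -Y[o]
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(None, None)] + [(0, None)] * n,
                  method="highs")
    return res.x[0]

# Toy data: 4 universities with 2 inputs (staff, budget) and 1 output (graduates).
X = np.array([[20.0, 300.0], [30.0, 500.0], [25.0, 400.0], [40.0, 600.0]])
Y = np.array([[1000.0], [1300.0], [1200.0], [1400.0]])
for o in range(len(X)):
    print(f"DMU {o}: efficiency = {ccr_input_efficiency(X, Y, o):.3f}")
```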

  9. The Process Synthesis Pyramid: Conceptual design of a Liquefied Energy Chain using Pinch Analysis, Exergy Analysis, Deterministic Optimization and Metaheuristic Searches

    International Nuclear Information System (INIS)

    Aspelund, Audun

    2012-01-01

    Process Synthesis (PS) is a term used to describe a class of general and systematic methods for the conceptual design of processing plants and energy systems. The term also refers to the development of the process flowsheet (structure or topology), the selection of unit operations and the determination of the most important operating conditions. In this thesis an attempt is made to characterize some of the most common methodologies in a PS pyramid and to discuss their advantages and disadvantages as well as where in the design phase they could be used most efficiently. The thesis shows how design tools have been developed for subambient processes by combining and expanding PS methods such as Heuristic Rules, sequential modular Process Simulations, Pinch Analysis, Exergy Analysis, Mathematical Programming using Deterministic Optimization methods and optimization using Stochastic Optimization methods. The most important contributions to the process design community are three new methodologies that include the pressure as an important variable in heat exchanger network synthesis (HENS). The methodologies have been used to develop a novel and efficient energy chain based on stranded natural gas, including power production with carbon capture and sequestration (CCS). This Liquefied Energy Chain consists of an offshore process, a combined gas carrier and an onshore process. This energy chain is capable of efficiently exploiting resources that cannot be utilized economically today, with minor CO2 emissions. Finally, a new Stochastic Optimization approach based on a Tabu Search (TS), the Nelder-Mead method or Downhill Simplex Method (NMDS) and the sequential process simulator HYSYS is used to search for better solutions for the Liquefied Energy Chain with respect to minimum cost or maximum profit. (au)

  11. Analysis of Production Funds Efficiency in the Country’s Crane Sector

    Directory of Open Access Journals (Sweden)

    Edita Valėnaitė

    2012-07-01

    The article deals with methodological aspects of the analysis of production funds. A review of the literature examining basic and transferable funds is presented, and much attention is paid to the analysis of basic production funds in the crane sector. An analysis of the basic production funds in the country's crane sector is presented, with special attention paid to the structure of the crane fleet and the efficiency of crane utilization. The analysis of the basic production funds established the factors that impede their efficiency: the greatest negative impact on the efficiency of enterprises' basic production funds came from the small tower crane fleet, and the limited range of tower cranes in the fleet reduces the technical possibilities for leasing. Based on the analysis and on observation of tower crane installation and dismantling, poor supply of parts to the work place and a lack of competence and motivation among the support staff were also identified. After elimination of these factors it is possible to raise the efficiency of the main production funds. Article in Lithuanian.

  12. Sustainable process design & analysis of hybrid separations

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Befort, Bridgette; Garg, Nipun

    2016-01-01

    Distillation is an energy intensive operation in chemical process industries. There are around 40,000 distillation columns in operation in the US, requiring approximately 40% of the total energy consumption in US chemical process industries. However, analysis of separations by distillation has shown that more than 50% of energy is spent in purifying the last 5-10% of the distillate product. Membrane modules on the other hand can achieve high purity separations at lower energy costs, but if the flux is high, it requires large membrane area. A hybrid scheme where distillation and membrane modules are combined such that each operates at its highest efficiency has the potential for significant energy reduction without significant increase of capital costs. This paper presents a method for sustainable design of hybrid distillation-membrane schemes with guaranteed reduction of energy...

  13. An ultra-efficient nonlinear planar integrated platform for optical signal processing and generation

    DEFF Research Database (Denmark)

    Pu, Minhao; Ottaviano, Luisa; Semenova, Elizaveta

    2017-01-01

    This paper will discuss the recently developed integrated platform AlGaAs-on-insulator and its broad range of nonlinear applications. Recent demonstrations of broadband optical signal processing and efficient frequency comb generation in this platform will be reviewed.

  14. An ontological knowledge based system for selection of process monitoring and analysis tools

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    The growing number of process monitoring and analysis tools for a wide range of operations has made their selection a difficult, time consuming and challenging task. Therefore, an efficient and systematic knowledge base coupled with an inference system is necessary to support the optimal selection of process monitoring and analysis tools, satisfying the process and user constraints. A knowledge base consisting of the process knowledge as well as knowledge on measurement methods and tools has been developed. An ontology has been designed for knowledge representation and management. The developed knowledge base has a dual feature. On the one hand ...; procedures have also been developed to retrieve the data/information stored in the knowledge base.

  15. Event-driven processing for hardware-efficient neural spike sorting

    Science.gov (United States)

    Liu, Yan; Pereira, João L.; Constandinou, Timothy G.

    2018-02-01

    Objective. The prospect of real-time and on-node spike sorting provides a genuine opportunity to push the envelope of large-scale integrated neural recording systems. In such systems the hardware resources, power requirements and data bandwidth increase linearly with channel count. Event-based (or data-driven) processing can here provide a new, efficient means for hardware implementation that is completely activity dependent. In this work, we investigate using continuous-time level-crossing sampling for efficient data representation and subsequent spike processing. Approach. (1) We first compare signals (synthetic neural datasets) encoded with this technique against conventional sampling. (2) We then show how such a representation can be directly exploited by extracting simple time domain features from the bitstream to perform neural spike sorting. (3) The proposed method is implemented in a low power FPGA platform to demonstrate its hardware viability. Main results. It is observed that considerably lower data rates are achievable when using 7 bits or less to represent the signals, whilst maintaining the signal fidelity. Results obtained using both MATLAB and reconfigurable logic hardware (FPGA) indicate that feature extraction and spike sorting can be achieved with comparable or better accuracy than reference methods whilst also requiring relatively low hardware resources. Significance. By effectively exploiting continuous-time data representation, neural signal processing can be achieved in a completely event-driven manner, reducing both the required resources (memory, complexity) and computations (operations). This will see future large-scale neural systems integrating on-node processing in real-time hardware.
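
    A software stand-in for the level-crossing (delta) encoding idea described above, applied to a synthetic trace; the threshold and signal are illustrative and do not reproduce the paper's hardware scheme:

```python
import numpy as np

def level_crossing_encode(signal, delta):
    """Emit (index, direction) events whenever the signal moves by >= delta
    from the last event level -- a software stand-in for level-crossing ADCs."""
    events = []
    level = signal[0]
    for i, x in enumerate(signal[1:], start=1):
        while x - level >= delta:
            level += delta
            events.append((i, +1))
        while level - x >= delta:
            level -= delta
            events.append((i, -1))
    return events

# Synthetic "spike" on a noisy baseline.
t = np.arange(0, 0.01, 1e-5)
sig = 0.05 * np.random.default_rng(0).standard_normal(t.size)
sig[300:320] += np.hanning(20) * 1.5          # injected spike
events = level_crossing_encode(sig, delta=0.2)
print(f"{len(events)} events for {sig.size} samples "
      f"({100 * len(events) / sig.size:.1f}% of the uniform-rate samples)")
```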

  16. Methodological aspects of fuel performance system analysis at raw hydrocarbon processing plants

    Science.gov (United States)

    Kulbjakina, A. V.; Dolotovskij, I. V.

    2018-01-01

    The article discusses the methodological aspects of fuel performance system analysis at raw hydrocarbon (RH) processing plants. Modern RH processing facilities are major consumers of energy resources (ER) for their own needs. Reducing ER consumption, including fuel consumption, and developing a rational fuel system structure are complex and relevant scientific tasks that can only be accomplished using system analysis and complex system synthesis. In accordance with the principles of system analysis, the hierarchical structure of the fuel system, the block scheme for synthesizing the most efficient fuel system alternative using mathematical models, and the set of performance criteria have been developed for the main stages of the study. Results from the introduction of specific engineering solutions for developing in-house energy supply sources at RH processing facilities are provided.

  17. A Moving-Object Index for Efficient Query Processing with PeerWise Location Privacy

    DEFF Research Database (Denmark)

    Lin, Dan; Jensen, Christian S.; Zhang, Rui

    2011-01-01

    Little attention has been paid to enabling so-called peer-wise privacy: the protection of a user's location from unauthorized peer users. This paper identifies an important efficiency problem in existing peer-privacy approaches that simply apply a filtering step to identify users that are located in a query range, but that do not want to disclose their location to the querying peer. To solve this problem, we propose a novel, privacy-policy enabled index called the PEB-tree that seamlessly integrates location proximity and policy compatibility. We propose efficient algorithms that use the PEB-tree for processing privacy-aware range and kNN queries. Extensive experiments suggest that the PEB-tree enables efficient query processing.

  18. Efficiency in the Community College Sector: Stochastic Frontier Analysis

    Science.gov (United States)

    Agasisti, Tommaso; Belfield, Clive

    2017-01-01

    This paper estimates technical efficiency scores across the community college sector in the United States. Using stochastic frontier analysis and data from the Integrated Postsecondary Education Data System for 2003-2010, we estimate efficiency scores for 950 community colleges and perform a series of sensitivity tests to check for robustness. We…
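
    For reference, efficiency scores of this kind are derived from the standard stochastic production frontier specification sketched below (the generic Aigner-Lovell-Schmidt form, not necessarily the authors' exact model):

```latex
% Stochastic production frontier: observed (log) output equals the frontier
% plus symmetric noise minus a one-sided inefficiency term.
\begin{align*}
  \ln y_i &= \mathbf{x}_i^{\top}\boldsymbol{\beta} + v_i - u_i, \\
  v_i &\sim N(0,\sigma_v^2), \qquad u_i \sim N^{+}(0,\sigma_u^2), \\
  \mathrm{TE}_i &= \exp(-u_i) \in (0,1].
\end{align*}
```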

  19. FY 1998 annual summary report on photon measuring/processing techniques. Development of the techniques for high-efficiency production processes; 1998 nendo foton keisoku kako gijutsu seika hokokusho. Kokoritsu seisan process gijutsu kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    The objectives are to develop techniques for energy-efficient laser-aided processing; techniques for high-precision, real-time measurement to improve quality control for production processes and increase their efficiency; and techniques for generating/controlling photons of high efficiency and quality as the laser beam sources therefor, in order to promote energy saving in, and improve the efficiency of, production processes that consume large quantities of energy, e.g., welding, joining, surface treatment and production of fine particles. The R and D themes are: microscopic processing technology: simulation technology for laser welding phenomena; microscopic processing technology: synthesis technology for quantum dot functional structures; in-situ status measuring technology: fine particle element and size measurement technology; high-power all-solid-state laser technology: efficient rod-type LD-pumping laser modules and the pumping chamber of a slab-type laser; tightly-focusing all-solid-state laser technology: improvement of the E/O efficiency of laser diodes, high-quality nonlinear crystal growth technology and fabrication technology for nonlinear crystals; and comprehensive investigation of photonics engineering: high-efficiency harmonic generation technology. (NEDO)

  20. ANALYSIS OF THE PERSONNEL POTENTIAL EFFICIENCY OF THE PHARMACEUTICAL INSTITUTION

    Directory of Open Access Journals (Sweden)

    Марина Юрьевна Клищенко

    2017-03-01

    The study revealed common patterns in the efficiency of the personnel potential of the pharmaceutical institution. It was found that pharmaceutical institutions need qualified professionals with a high level of practical training, ready to immediately join the working process. It was noted that only 47% of graduates are prepared to work in the specialty chosen. We studied the length of employment with one employer. An analysis of the main problems that make pharmaceutical professionals change their place of work is performed. A survey of pharmacy students showed that a large share of students (32.89%) feel uncertain about their future employment, while 47% of respondents intend to work in the profession.

  1. Reliable and Efficient Parallel Processing Algorithms and Architectures for Modern Signal Processing. Ph.D. Thesis

    Science.gov (United States)

    Liu, Kuojuey Ray

    1990-01-01

    Least-squares (LS) estimations and spectral decomposition algorithms constitute the heart of modern signal processing and communication problems. Implementations of recursive LS and spectral decomposition algorithms onto parallel processing architectures such as systolic arrays with efficient fault-tolerant schemes are the major concerns of this dissertation. There are four major results in this dissertation. First, we propose the systolic block Householder transformation with application to the recursive least-squares minimization. It is successfully implemented on a systolic array with a two-level pipelined implementation at the vector level as well as at the word level. Second, a real-time algorithm-based concurrent error detection scheme based on the residual method is proposed for the QRD RLS systolic array. The fault diagnosis, order degraded reconfiguration, and performance analysis are also considered. Third, the dynamic range, stability, error detection capability under finite-precision implementation, order degraded performance, and residual estimation under faulty situations for the QRD RLS systolic array are studied in detail. Finally, we propose the use of multi-phase systolic algorithms for spectral decomposition based on the QR algorithm. Two systolic architectures, one based on a triangular array and another based on a rectangular array, are presented for the multi-phase operations with fault-tolerant considerations. Eigenvectors and singular vectors can be easily obtained by using the multi-phase operations. Performance issues are also considered.

  2. Exergy analysis of wine production: Red wine production process as a case study

    International Nuclear Information System (INIS)

    Genc, Mahmut; Genc, Seda; Goksungur, Yekta

    2017-01-01

    Highlights:
    • Red wine production process was studied thermodynamically by exergy analysis method.
    • The first study on exergetic analysis of a red wine production process.
    • Energetic and exergetic efficiencies are calculated as 57.2 and 41.8%, respectively.
    • Cumulative exergy loss is computed as 2692.51 kW for 1 kg/s grape.
    • Specific exergy loss is found as 5080.20 kW/kg wine.

    Abstract: This paper performs exergy analysis of a red wine production line and defines the exergy destruction rates to assess the system performance in terms of sustainability. A model study with necessary data is chosen for the calculations. The total exergy destruction rate of the overall system was determined to be 344.08 kW while the greatest destruction rate of the exergy in the whole system occurred in the open fermenter (333.6 kW). The system thermal efficiency was obtained to be 57.2% while the exergy efficiency was calculated as 41.8%. The total exergy destruction rate of the overall system increases with the increase both in the grape flow rate and the reference temperature when the reference pressure is assumed as 101.325 kPa. Furthermore, the chemical exergy of streams was found much higher than the physical exergy for each stream. The exergy results were illustrated through the Grassmann diagram. Furthermore, cumulative exergy loss and specific exergy loss values were determined as 2692.51 kW/1 kg/s grape processed and 5080.20 kW/kg wine, respectively.
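
    A minimal sketch of the specific flow-exergy expression that such stream-by-stream accounting rests on; the property values below are illustrative placeholders, not data from the study:

```python
# Specific physical (flow) exergy of a stream relative to the dead state
# (T0, h0, s0):  e_ph = (h - h0) - T0 * (s - s0)
def physical_exergy(h, s, h0=104.9, s0=0.367, T0=298.15):
    """h, h0 in kJ/kg; s, s0 in kJ/(kg*K); returns kJ/kg."""
    return (h - h0) - T0 * (s - s0)

# Illustrative inlet/outlet states of a single process unit -- placeholder
# property values, not numbers from the wine-production study.
e_in, e_out = physical_exergy(250.0, 0.80), physical_exergy(150.0, 0.50)
m_dot = 1.0                                  # kg/s of grape processed
exergy_destruction = m_dot * (e_in - e_out)  # kW, if no work/heat exergy leaves the unit
print(f"e_in = {e_in:.1f} kJ/kg, e_out = {e_out:.1f} kJ/kg, "
      f"destruction ~ {exergy_destruction:.1f} kW")
```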

  3. The Practice of Using the Discriminant Analysis of the Efficiency of Processes of Functioning of Agricultural Enterprises on the Basis of Indicators of the Constituent Parts of Performance

    Directory of Open Access Journals (Sweden)

    Burennikova Nataliia V.

    2018-02-01

    The article considers the practice of using the method of discriminant analysis to study the effectiveness of the processes of functioning of enterprises on the basis of indicators of the constituent parts of performance, using specific agricultural enterprises of the grain products subcomplex as an example. It is underlined that when using benchmarking (as a method of competitive analysis), in many cases when researching the processes of functioning and development of enterprises (in particular, agricultural ones) there is a need to distribute the studied objects into individual groups according to the main strategic priorities. It is specified that one of the methods used for such distribution is classic discriminant analysis, which allows defining the quantitative boundary that distinguishes the group of enterprise leaders from all other enterprises. It has been found that the determining factor in the use of this method is the choice of a number of indicators characterizing the objects and processes identified by benchmarking. This choice, in turn, requires implementation of appropriate algorithms based on simulation. The authors' indicators of efficiency and scale of product, selected as the constituent parts of the performance indicator, serve as these indicators; they characterize any process and its results from both the qualitative and the quantitative points of view. The authors' own approaches to the method of grouping objects and allocating strategically important groups among them are proposed.
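
    A minimal sketch of the kind of discriminant boundary described above, using scikit-learn's LinearDiscriminantAnalysis on synthetic (efficiency, scale-of-product) indicators; the data and group labels are illustrative only:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(42)
# Synthetic (efficiency, scale-of-product) indicators for two groups of farms:
# "leaders" (label 1) and "others" (label 0).  Purely illustrative numbers.
leaders = rng.normal([1.2, 1.1], 0.15, size=(15, 2))
others  = rng.normal([0.8, 0.7], 0.15, size=(25, 2))
X = np.vstack([leaders, others])
y = np.array([1] * len(leaders) + [0] * len(others))

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)
# The fitted coefficients define the quantitative boundary separating the
# leaders from the remaining enterprises; new enterprises are scored against it.
print("discriminant coefficients:", lda.coef_, "intercept:", lda.intercept_)
print("predicted group for (1.0, 0.9):", lda.predict([[1.0, 0.9]]))
```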

  4. Non-Markovian quantum processes: Complete framework and efficient characterization

    Science.gov (United States)

    Pollock, Felix A.; Rodríguez-Rosario, César; Frauenheim, Thomas; Paternostro, Mauro; Modi, Kavan

    2018-01-01

    Currently, there is no systematic way to describe a quantum process with memory solely in terms of experimentally accessible quantities. However, recent technological advances mean we have control over systems at scales where memory effects are non-negligible. The lack of such an operational description has hindered advances in understanding physical, chemical, and biological processes, where often unjustified theoretical assumptions are made to render a dynamical description tractable. This has led to theories plagued with unphysical results and no consensus on what a quantum Markov (memoryless) process is. Here, we develop a universal framework to characterize arbitrary non-Markovian quantum processes. We show how a multitime non-Markovian process can be reconstructed experimentally, and that it has a natural representation as a many-body quantum state, where temporal correlations are mapped to spatial ones. Moreover, this state is expected to have an efficient matrix-product-operator form in many cases. Our framework constitutes a systematic tool for the effective description of memory-bearing open-system evolutions.

  5. Comparative analysis of efficiency in cooking with natural gas and electricity

    International Nuclear Information System (INIS)

    Amell Arrieta, Andres; Cadavid Sierra Francisco Javier; Ospina Ospina, Juan Carlos

    2001-01-01

    Natural gas will see massive application in the Aburra Valley in residential processes such as water heating and cooking, which have historically been performed with electricity. In the study of electricity substitution it is necessary to estimate the gas consumption required to keep satisfying the energy requirements of the different strata, assuming that alimentary habits in these strata have not varied significantly over time. Since the volume of natural gas required for electricity substitution under given conditions depends on the electrical energy consumed before substitution, the efficiency of the electrical equipment, the efficiency of the gas equipment and the heating value of the substituting gas, the determination of these efficiencies is necessary. This work presents the calculation process comparing gas heating and cooking processes with electrical devices, taking into account several schemes and test conditions.
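
    A worked sketch of the substitution estimate described above: the gas volume needed to replace a given electrical consumption follows from the two appliance efficiencies and the gas heating value. The efficiencies and heating value below are assumed, not the study's measured values:

```python
# Gas volume needed to replace a given electrical cooking consumption:
#   useful heat = E_elec * eta_elec ;  gas energy = useful heat / eta_gas
#   gas volume  = gas energy / heating value
E_ELEC_KWH   = 60.0      # monthly electricity used for cooking (illustrative)
ETA_ELECTRIC = 0.75      # assumed electric-appliance efficiency
ETA_GAS      = 0.45      # assumed gas-burner efficiency
HHV_GAS      = 10.3      # kWh per m3 of natural gas (typical, assumed)

useful_heat   = E_ELEC_KWH * ETA_ELECTRIC
gas_energy    = useful_heat / ETA_GAS
gas_volume_m3 = gas_energy / HHV_GAS
print(f"useful heat: {useful_heat:.1f} kWh -> gas input: {gas_energy:.1f} kWh "
      f"-> {gas_volume_m3:.1f} m3 of natural gas per month")
```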

  6. Data acquisition and processing system for reactor noise analysis

    International Nuclear Information System (INIS)

    Costa Oliveira, J.; Morais Da Veiga, C.; Forjaz Trigueiros, D.; Pombo Duarte, J.

    1975-01-01

    A data acquisition and processing system for reactor noise analysis by time correlation methods is described, consisting of one to four data feeding channels (transducer, associated electronics and V/f converter), a sampling unit, a landline transmission system and a PDP 15 computer. This system is being applied to study the kinetic parameters of the 'Reactor Portugues de Investigacao', a 1 MW swimming-pool reactor. The main features that make such a data acquisition and processing system a useful tool to perform noise analysis are: the improved characteristics of the analog-to-digital converters employed to quantize the signals; the use of an on-line computer, which allows a great accumulation and a rapid treatment of data together with an easy check of the correctness of the experiments; and the adoption of the time cross-correlation technique using two detectors, which by-passes the limitation of low-efficiency detectors. (author)
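
    A minimal sketch of the two-detector time cross-correlation such a system computes, applied to synthetic signals sharing a delayed common component (purely illustrative, not reactor data):

```python
import numpy as np

def cross_correlation(x, y, max_lag):
    """Biased estimate of R_xy(tau) for lags -max_lag..max_lag (in samples)."""
    x = x - x.mean()
    y = y - y.mean()
    n = len(x)
    lags = np.arange(-max_lag, max_lag + 1)
    r = np.array([np.sum(x[max(0, -k):n - max(0, k)] *
                         y[max(0, k):n - max(0, -k)]) / n for k in lags])
    return lags, r

# Two synthetic detector signals sharing a common (delayed) fluctuation.
rng = np.random.default_rng(0)
common = rng.standard_normal(5000)
det1 = common + 0.5 * rng.standard_normal(5000)
det2 = np.roll(common, 3) + 0.5 * rng.standard_normal(5000)   # 3-sample delay
lags, r = cross_correlation(det1, det2, max_lag=10)
print("peak correlation at lag:", lags[np.argmax(r)])
```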

  7. Efficiency Analysis of a Wave Power Generation System by Using Multibody Dynamics

    International Nuclear Information System (INIS)

    Kim, Min Soo; Sohn, Jeong Hyun; Kim, Jung Hee; Sung, Yong Jun

    2016-01-01

    The energy absorption efficiency of a wave power generation system is calculated as the ratio of the wave power to the power of the system. Because absorption efficiency depends on the dynamic behavior of the wave power generation system, a dynamic analysis of the wave power generation system is required to estimate the energy absorption efficiency of the system. In this study, a dynamic analysis of the wave power generation system under wave loads is performed to estimate the energy absorption efficiency. RecurDyn is employed to carry out the dynamic analysis of the system, and the Morison equation is used for the wave load model. According to the results, the lower the wave height and the shorter the period, the higher the absorption efficiency of the system.
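
    For reference, the Morison load model mentioned above gives the in-line wave force per unit length on a slender member as the sum of an inertia and a drag term (standard textbook form; the authors' coefficients are not given in the abstract):

```latex
% Morison equation: in-line force per unit length on a slender cylinder of
% diameter D in a flow with velocity u(t) and acceleration \dot{u}(t).
\[
  f(t) = \underbrace{\rho\, C_m \frac{\pi D^2}{4}\, \dot{u}(t)}_{\text{inertia}}
       + \underbrace{\frac{1}{2}\,\rho\, C_d\, D\, u(t)\,\lvert u(t)\rvert}_{\text{drag}}
\]
```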

  9. Energy efficiency measures in China: A three-stage DEA analysis

    Science.gov (United States)

    Cai, Yu; Xiong, Siqin; Ma, Xiaoming

    2017-04-01

    This paper measures the energy efficiency of 30 regions in China during 2010-2014 by using the three-stage data envelopment analysis (DEA) model. The results indicate that environmental factors and random error both have significant impacts on energy efficiency. After eliminating these influences, the results show that energy efficiency in developed regions is generally higher than that in undeveloped or resource-rich regions and that low scale technical efficiency is the main constraining factor in inefficient regions. Based on the efficiency characteristics, this paper divides all regions into four types and provides differentiated energy strategies.

  10. LANL Institutional Decision Support By Process Modeling and Analysis Group (AET-2)

    Energy Technology Data Exchange (ETDEWEB)

    Booth, Steven Richard [Los Alamos National Laboratory

    2016-04-04

    AET-2 has expertise in process modeling, economics, business case analysis, risk assessment, Lean/Six Sigma tools, and decision analysis to provide timely decision support to LANS leading to continuous improvement. This capability is critical during the current tight budgetary environment as LANS pushes to identify potential areas of cost savings and efficiencies. An important arena is business systems and operations, where processes can impact most or all laboratory employees. Lab-wide efforts are needed to identify and eliminate inefficiencies to accomplish Director McMillan’s charge of “doing more with less.” LANS faces many critical and potentially expensive choices that require sound decision support to ensure success. AET-2 is available to provide this analysis support to expedite the decisions at hand.

  11. Badge Office Process Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-12

    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.

  12. Optimizing MRI Logistics: Prospective Analysis of Performance, Efficiency, and Patient Throughput.

    Science.gov (United States)

    Beker, Kevin; Garces-Descovich, Alejandro; Mangosing, Jason; Cabral-Goncalves, Ines; Hallett, Donna; Mortele, Koenraad J

    2017-10-01

    The objective of this study is to optimize MRI logistics through evaluation of MRI workflow and analysis of performance, efficiency, and patient throughput in a tertiary care academic center. For 2 weeks, workflow data from two outpatient MRI scanners were prospectively collected and stratified by value added to the process (i.e., value-added time, business value-added time, or non-value-added time). Two separate time cycles were measured: the actual MRI process cycle as well as the complete length of patient stay in the department. In addition, the impact and frequency of delays across all observations were measured. A total of 305 MRI examinations were evaluated, including body (34.1%), neurologic (28.9%), musculoskeletal (21.0%), and breast examinations (16.1%). The MRI process cycle lasted a mean of 50.97 ± 24.4 (SD) minutes per examination; the mean non-value-added time was 13.21 ± 18.77 minutes (25.87% of the total process cycle time). The mean length-of-stay cycle was 83.51 ± 33.63 minutes; the mean non-value-added time was 24.33 ± 24.84 minutes (29.14% of the total patient stay). The delay with the highest frequency (5.57%) was IV or port placement, which had a mean delay of 22.82 minutes. The delay with the greatest impact on time was MRI arthrography for which joint injection of contrast medium was necessary but was not accounted for in the schedule (mean delay, 42.2 minutes; frequency, 1.64%). Of 305 patients, 34 (11.15%) did not arrive at or before their scheduled time. Non-value-added time represents approximately one-third of the total MRI process cycle and patient length of stay. Identifying specific delays may expedite the application of targeted improvement strategies, potentially increasing revenue, efficiency, and overall patient satisfaction.

  13. Analysis of the Russian Market for Building Energy Efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Lychuk, Taras; Evans, Meredydd; Halverson, Mark A.; Roshchanka, Volha

    2012-12-01

    This report provides analysis of the Russian energy efficiency market for the building sector from the perspective of U.S. businesses interested in exporting relevant technologies, products and experience to Russia. We aim to help U.S. energy efficiency and environmental technologies businesses to better understand the Russian building market to plan their market strategy.

  14. A preferential design approach for energy-efficient and robust implantable neural signal processing hardware.

    Science.gov (United States)

    Narasimhan, Seetharam; Chiel, Hillel J; Bhunia, Swarup

    2009-01-01

    For implantable neural interface applications, it is important to compress data and analyze spike patterns across multiple channels in real time. Such a computational task for online neural data processing requires an innovative circuit-architecture level design approach for low-power, robust and area-efficient hardware implementation. Conventional microprocessor or Digital Signal Processing (DSP) chips would dissipate too much power and are too large in size for an implantable system. In this paper, we propose a novel hardware design approach, referred to as "Preferential Design" that exploits the nature of the neural signal processing algorithm to achieve a low-voltage, robust and area-efficient implementation using nanoscale process technology. The basic idea is to isolate the critical components with respect to system performance and design them more conservatively compared to the noncritical ones. This allows aggressive voltage scaling for low power operation while ensuring robustness and area efficiency. We have applied the proposed approach to a neural signal processing algorithm using the Discrete Wavelet Transform (DWT) and observed significant improvement in power and robustness over conventional design.
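
    A minimal sketch of the discrete wavelet transform step such an approach builds on, using PyWavelets on a synthetic trace; the wavelet choice, decomposition level and threshold are illustrative assumptions, not the paper's hardware parameters:

```python
import numpy as np
import pywt

# Synthetic extracellular trace with one injected spike.
rng = np.random.default_rng(0)
trace = 0.1 * rng.standard_normal(1024)
trace[500:520] += np.hanning(20)

# 4-level DWT with a Daubechies-4 wavelet (illustrative choice); keeping only
# the largest coefficients is the usual compression step before spike sorting.
coeffs = pywt.wavedec(trace, "db4", level=4)
flat = np.concatenate(coeffs)
threshold = np.quantile(np.abs(flat), 0.95)         # keep the top 5 %
kept = np.sum(np.abs(flat) > threshold)
print(f"{kept} of {flat.size} coefficients retained above the threshold")
```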

  15. Advanced Thermoelectric Materials for Efficient Waste Heat Recovery in Process Industries

    Energy Technology Data Exchange (ETDEWEB)

    Adam Polcyn; Moe Khaleel

    2009-01-06

    The overall objective of the project was to integrate advanced thermoelectric materials into a power generation device that could convert waste heat from an industrial process to electricity with an efficiency approaching 20%. Advanced thermoelectric materials were developed with figure-of-merit ZT of 1.5 at 275 degrees C. These materials were not successfully integrated into a power generation device. However, waste heat recovery was demonstrated from an industrial process (the combustion exhaust gas stream of an oxyfuel-fired flat glass melting furnace) using a commercially available (5% efficiency) thermoelectric generator coupled to a heat pipe. It was concluded that significant improvements both in thermoelectric material figure-of-merit and in cost-effective methods for capturing heat would be required to make thermoelectric waste heat recovery viable for widespread industrial application.
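
    For orientation, the quoted figure of merit maps to a maximum conversion efficiency through the standard thermoelectric-generator relation sketched below; the hot- and cold-side temperatures are illustrative assumptions:

```python
from math import sqrt

def te_max_efficiency(zt, t_hot, t_cold):
    """Ideal maximum efficiency of a thermoelectric generator (ZT taken at the
    mean temperature): Carnot factor times a ZT-dependent reduction term."""
    carnot = 1.0 - t_cold / t_hot
    m = sqrt(1.0 + zt)
    return carnot * (m - 1.0) / (m + t_cold / t_hot)

# Illustrative case: hot side at 275 C (548 K), cold side at 25 C (298 K).
print(f"eta_max at ZT=1.5: {100 * te_max_efficiency(1.5, 548.0, 298.0):.1f} %")
```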

  16. GPUmotif: an ultra-fast and energy-efficient motif analysis program using graphics processing units.

    Science.gov (United States)

    Zandevakili, Pooya; Hu, Ming; Qin, Zhaohui

    2012-01-01

    Computational detection of TF binding patterns has become an indispensable tool in functional genomics research. With the rapid advance of new sequencing technologies, large amounts of protein-DNA interaction data have been produced. Analyzing this data can provide substantial insight into the mechanisms of transcriptional regulation. However, the massive amount of sequence data presents daunting challenges. In our previous work, we have developed a novel algorithm called Hybrid Motif Sampler (HMS) that enables more scalable and accurate motif analysis. Despite much improvement, HMS is still time-consuming due to the requirement to calculate matching probabilities position-by-position. Using the NVIDIA CUDA toolkit, we developed a graphics processing unit (GPU)-accelerated motif analysis program named GPUmotif. We proposed a "fragmentation" technique to hide data transfer time between memories. Performance comparison studies showed that commonly-used model-based motif scan and de novo motif finding procedures such as HMS can be dramatically accelerated when running GPUmotif on NVIDIA graphics cards. As a result, energy consumption can also be greatly reduced when running motif analysis using GPUmotif. The GPUmotif program is freely available at http://sourceforge.net/projects/gpumotif/

  18. Modeling Dynamic Systems with Efficient Ensembles of Process-Based Models.

    Directory of Open Access Journals (Sweden)

    Nikola Simidjievski

    Ensembles are a well-established machine learning paradigm, leading to accurate and robust models, predominantly applied to predictive modeling tasks. Ensemble models comprise a finite set of diverse predictive models whose combined output is expected to yield an improved predictive performance as compared to an individual model. In this paper, we propose a new method for learning ensembles of process-based models of dynamic systems. The process-based modeling paradigm employs domain-specific knowledge to automatically learn models of dynamic systems from time-series observational data. Previous work has shown that ensembles based on sampling observational data (i.e., bagging and boosting) significantly improve predictive performance of process-based models. However, this improvement comes at the cost of a substantial increase of the computational time needed for learning. To address this problem, the paper proposes a method that aims at efficiently learning ensembles of process-based models, while maintaining their accurate long-term predictive performance. This is achieved by constructing ensembles with sampling domain-specific knowledge instead of sampling data. We apply the proposed method to and evaluate its performance on a set of problems of automated predictive modeling in three lake ecosystems using a library of process-based knowledge for modeling population dynamics. The experimental results identify the optimal design decisions regarding the learning algorithm. The results also show that the proposed ensembles yield significantly more accurate predictions of population dynamics as compared to individual process-based models. Finally, while their predictive performance is comparable to the one of ensembles obtained with the state-of-the-art methods of bagging and boosting, they are substantially more efficient.

  19. Boiling process modelling peculiarities analysis of the vacuum boiler

    Science.gov (United States)

    Slobodina, E. N.; Mikhailov, A. G.

    2017-06-01

    An analysis of the development of low- and medium-power boiler equipment was carried out, and possible development directions for boiler units aimed at improving energy efficiency were identified. Engineering studies on the application of vacuum boilers are presented. Heat-exchange processes in a vacuum boiler, where boiling water is the working body, are considered. A method of heat-exchange intensification under boiling at the maximum heat-transfer coefficient is examined. As a result of the calculation studies, curves of the heat-transfer coefficient as a function of pressure, calculated through analytical and numerical methodologies, were obtained. A conclusion was drawn about the possibility of applying a numerical computing method (RPI in ANSYS CFX) to describe the boiling process in the boiler vacuum volume.

  20. Accessibility analysis in manufacturing processes using visibility cones

    Institute of Scientific and Technical Information of China (English)

    尹周平; 丁汉; 熊有伦

    2002-01-01

    Accessibility is an important design feature of products, and accessibility analysis has been acknowledged as a powerful tool for solving computational manufacturing problems arising from different manufacturing processes. After exploring the relations among approachability, accessibility and visibility, a general method for accessibility analysis using visibility cones (VC) is proposed. With the definition of the VC of a point, three kinds of visibility of a feature, namely the complete visibility cone (CVC), the partial visibility cone (PVC) and the local visibility cone (LVC), are defined. A novel approach to computing VCs is formulated by identifying C-obstacles in the C-space, for which a general and efficient algorithm is proposed and implemented by making use of visibility culling. Lastly, we discuss briefly how to realize accessibility analysis in numerically controlled (NC) machining planning, coordinate measuring machine (CMM) inspection planning and assembly sequence planning with the proposed methods.

  1. Efficient Strictness Analysis of Haskell

    DEFF Research Database (Denmark)

    Jensen, Kristian Damm; Hjæresen, Peter; Rosendahl, Mads

    1994-01-01

    Strictness analysis has been a living field of investigation since Mycroft's original work in 1980, and is getting increasingly significant with the still wider use of lazy functional programming languages. This paper focuses on an actual implementation of a strictness analyser for Haskell. The analyser uses abstract interpretation with chaotic fixpoint iteration. The demand-driven nature of this iteration technique allows us to use large domains, including function domains in the style of Burn et al. (1986) and Wadler (1987), and retain reasonable efficiency. The implementation, furthermore, allows us...

  2. Secure and Efficient Regression Analysis Using a Hybrid Cryptographic Framework: Development and Evaluation.

    Science.gov (United States)

    Sadat, Md Nazmus; Jiang, Xiaoqian; Aziz, Md Momin Al; Wang, Shuang; Mohammed, Noman

    2018-03-05

    Machine learning is an effective data-driven tool that is being widely used to extract valuable patterns and insights from data. Specifically, predictive machine learning models are very important in health care for clinical data analysis. The machine learning algorithms that generate predictive models often require pooling data from different sources to discover statistical patterns or correlations among different attributes of the input data. The primary challenge is to fulfill one major objective: preserving the privacy of individuals while discovering knowledge from data. Our objective was to develop a hybrid cryptographic framework for performing regression analysis over distributed data in a secure and efficient way. Existing secure computation schemes are not suitable for processing the large-scale data that are used in cutting-edge machine learning applications. We designed, developed, and evaluated a hybrid cryptographic framework, which can securely perform regression analysis, a fundamental machine learning algorithm, using somewhat homomorphic encryption and a newly introduced secure hardware component of Intel Software Guard Extensions (Intel SGX) to ensure both privacy and efficiency at the same time. Experimental results demonstrate that our proposed method provides a better trade-off in terms of security and efficiency than solely secure hardware-based methods. Moreover, there is no approximation error; computed model parameters are identical to the plaintext results. To the best of our knowledge, this kind of secure computation model using a hybrid cryptographic framework, which leverages both somewhat homomorphic encryption and Intel SGX, has not been proposed or evaluated to date. Our proposed framework ensures data security and computational efficiency at the same time.

  3. Evaluation of economic efficiency of process improvement in food packaging

    Directory of Open Access Journals (Sweden)

    Jana Hron

    2012-01-01

    In general, we make gains in a process in three fundamental ways. First, we define or redefine our process in a strategic sense. Second, once defined or redefined, we commence process operations and use process control methods to target and stabilize our process. Third, we use process improvement methods, as described in this paper, along with process control to fully exploit our process management and/or technology. Process improvement is focused primarily on our subprocesses and sub-subprocesses. Process leverage is the key to process improvement initiatives. This means that small improvements in basic manufacturing operations can have (assuming mass repetition of the operation) a big impact on the functioning of the whole production unit. The complexity within even small organizations, in people, products, and processes, creates significant challenges in effectively and efficiently using these initiatives and tools. In this paper we place process purposes in the foreground and initiatives and tools in the background, as facilitators to help accomplish the process purpose. Initiatives and tools are not the ends we are seeking; results/outcomes in physical, economic, timeliness, and customer service performance are what matter. In the paper, process boundaries (in a generic sense) are set by our process purpose and our process definition. Process improvement is initiated within our existing process boundaries. For example, in a fast-food restaurant, if we define our cooking process around a frying technology, then we provide process improvements within our frying technology. On the other hand, if we are considering changing to a broiling technology, then we are likely faced with extensive change, impacting our external customers, and a process redefinition may be required. The results/aims of the paper are based on the example of improving the quality of a food-packaging process. Specifically, the integration of two approaches

  4. Energy Efficient Scheduling of Real Time Signal Processing Applications through Combined DVFS and DPM

    OpenAIRE

    Nogues, Erwan; Pelcat, Maxime; Menard, Daniel; Mercat, Alexandre

    2016-01-01

    This paper proposes a framework to design energy-efficient signal processing systems. The energy efficiency is provided by combining Dynamic Voltage and Frequency Scaling (DVFS) and Dynamic Power Management (DPM). The framework is based on Synchronous Dataflow (SDF) modeling of signal processing applications. A transformation to a single-rate form is performed to expose the application parallelism. An automated scheduling is then performed, minimizing the constraint of...
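
    A toy sketch of why DVFS and DPM compose: dynamic power scales roughly with C·V²·f, so slowing down to just meet the deadline cuts active energy, while DPM removes the remaining idle power. All constants below are illustrative assumptions, not values from the paper:

```python
# Toy active/idle energy model for combining DVFS and DPM on one task.
#   P_dyn ~ C * V^2 * f ;  execution time ~ cycles / f
C_EFF    = 1e-9        # effective switched capacitance (F), illustrative
P_IDLE   = 0.05        # idle power when not power-gated (W), illustrative
CYCLES   = 2e8         # work of the task (cycles)
DEADLINE = 0.5         # seconds

def energy(v, f_hz, use_dpm):
    t_exec = CYCLES / f_hz
    assert t_exec <= DEADLINE, "operating point misses the deadline"
    e_active = C_EFF * v**2 * f_hz * t_exec
    slack = DEADLINE - t_exec
    e_idle = 0.0 if use_dpm else P_IDLE * slack   # DPM gates the slack interval
    return e_active + e_idle

# Race-to-idle at the nominal point vs. a DVFS point that just meets the deadline.
print(f"1.2 V @ 1.0 GHz, DPM off : {energy(1.2, 1.0e9, False):.3f} J")
print(f"1.2 V @ 1.0 GHz, DPM on  : {energy(1.2, 1.0e9, True):.3f} J")
print(f"0.9 V @ 0.4 GHz, DPM on  : {energy(0.9, 0.4e9, True):.3f} J")
```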

  5. High Efficiency Mask Based Laser Materials Processing with TEA-CO2 - and Excimer Laser

    DEFF Research Database (Denmark)

    Bastue, Jens; Olsen, Flemmming Ove

    1997-01-01

    In general, mask based laser materials processing techniques suffer from a very low energy efficiency. We have developed a simple device called an energy enhancer, which is capable of increasing the energy efficiency of typical mask based laser materials processing systems. A short review of the ...... line marking with TEA-CO2 laser of high speed canning lines. The second one is manufactured for marking or microdrilling with excimer laser....

  6. THEORETICAL APPROACHES TO ASSESS EFFICIENCY OF THE TRANSFORMATION OF THE KEY BUSINESS PROCESSES IN THE PUBLISHING AND PRINTING ACTIVITIES IN THE REGION

    Directory of Open Access Journals (Sweden)

    Volodymyr Bazyliuk

    2016-11-01

    Full Text Available The purpose of the paper is the theoretical study and analysis of the basic methodological approaches to assessing the effectiveness of the transformation of key business processes in the PPA (publishing and printing activity) in the region, in order to choose the best option. Methodology. An overview of the main methods for assessing the effectiveness of business processes is provided: EVA (Economic Value Added), ABC (Activity-Based Costing), Tableau de Bord, and BSC (Balanced Scorecard). In order to ensure the formalization of the integrated assessment of the effectiveness of the business process in the publishing and printing activities in the region, it is suggested to apply the methodological apparatus of fuzzy sets. Statistical analysis, comparison and synthesis are necessary to study the efficiency of the transformation of the key business processes in the PPA in the region. Results. A review and analysis of the most common methods for evaluating the effectiveness of the transformation of key business processes were conducted; the basic advantages and disadvantages of each of the proposed methods in the light of the PPA were studied. It was proved that a single business process involves the use of a scorecard that is specific and peculiar to it only, whereas the completeness of its analysis depends on the kind of business process: basic, developmental, managing or providing. The approach to the formalization of the integrated assessment of the effectiveness of a business process in the PPA in the region, based on the theory of fuzzy sets, was formulated. Practical significance. The mathematical formulation of the problem, an integrated assessment of the efficiency of the business process for each of the possible options for its implementation, was developed, and the algorithm for assessing the effectiveness of the business process in the PPA in the region was generated using the apparatus of fuzzy sets. Value/originality. Implementing the

  7. In-Depth Analysis of Energy Efficiency Related Factors in Commercial Buildings Using Data Cube and Association Rule Mining

    Directory of Open Access Journals (Sweden)

    Byeongjoon Noh

    2017-11-01

    Full Text Available Significant amounts of energy are consumed in the commercial building sector, resulting in various adverse environmental issues. To reduce energy consumption and improve energy efficiency in commercial buildings, it is necessary to develop effective methods for analyzing building energy use. In this study, we propose a data cube model combined with association rule mining for more flexible and detailed analysis of building energy consumption profiles using the Commercial Buildings Energy Consumption Survey (CBECS) dataset, which covers over 6700 existing commercial buildings across the U.S.A. Based on the data cube model, a multidimensional commercial-sector building energy analysis was performed using on-line analytical processing (OLAP) operations to assess energy efficiency according to building factors at various levels of abstraction. Furthermore, the proposed analysis system provided useful information representing a set of energy-efficient combinations, obtained by applying the association rule mining method. We validated the feasibility and applicability of the proposed analysis model by structuring a building energy analysis system and applying it to the different building types, weather conditions, composite materials, and heating/cooling systems of the multitude of commercial buildings classified in the CBECS dataset.
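    A data cube over categorical building attributes can be emulated with a pivot/group-by, and simple association rules can be scored by support and confidence. The sketch below uses a small synthetic table with hypothetical column names, not the actual CBECS schema or the authors' mining pipeline.

    ```python
    import pandas as pd

    # Hypothetical records: building type, glazing, HVAC system, and an efficiency label.
    df = pd.DataFrame({
        "building_type": ["office", "office", "retail", "office", "retail", "office"],
        "glazing":       ["double", "single", "double", "double", "single", "double"],
        "hvac":          ["heat_pump", "boiler", "heat_pump", "boiler", "boiler", "heat_pump"],
        "efficient":     [True, False, True, False, False, True],
    })

    # Cube-style aggregation (roll-up over two dimensions), akin to an OLAP operation.
    cube = pd.pivot_table(df, index="building_type", columns="glazing",
                          values="efficient", aggfunc="mean")
    print(cube)

    # Score a simple rule: {glazing=double, hvac=heat_pump} -> efficient.
    antecedent = (df["glazing"] == "double") & (df["hvac"] == "heat_pump")
    support = (antecedent & df["efficient"]).mean()
    confidence = df.loc[antecedent, "efficient"].mean()
    print(f"support={support:.2f}, confidence={confidence:.2f}")
    ```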

  8. QTL analysis by sequencing of Water Use Efficiency (WUE) in potato

    DEFF Research Database (Denmark)

    Kaminski, Kacper Piotr; Sønderkær, Mads; Sørensen, Kirsten Kørup

    2013-01-01

    The traditional approach to potato breeding, the classical “mate and phenotype” approach, is relatively costly, and because phenotyping and growth capacity are limited, it is slowly being replaced by Marker Assisted Selection (MAS) breeding schemes. MAS is based on the presence of DNA polymorphic.......sparsipilum), phenotyped for water use efficiency. This population has also previously been phenotyped for the total glycoalkaloid (TGA) content....... and time consuming process. Here, a novel method for Quantitative Trait Locus (QTL) analysis has been developed that allows for the development of specific markers by use of genomic sequence reads and the recently published reference genome sequence for potato. Prior to sequencing the mapping population

  9. Efficiency and productivity analysis of the interstate bus transportation industry in Brazil

    Directory of Open Access Journals (Sweden)

    Antonio G.N. Novaes

    2010-08-01

    Full Text Available Productivity analysis is an important policy making and managerial control tool for assessing the degree to which inputs are utilized in the process of obtaining desired outputs. Data Envelopment Analysis (DEA) is a non-parametric method based on piecewise linear frontiers estimated with the aid of mathematical programming techniques and used, in this paper, to investigate technical, scale and managerial efficiencies associated with interstate bus companies in Brazil (ISBT). Data have been obtained from the web-site of the Brazilian National Agency of Land Transportation (ANTT). Since production factors in the application are constrained by technical and operational reasons, weight restrictions were introduced into the DEA models. The analysis has shown three groups of non-efficient bus firms, with clear differences in productivity. The relative managerial efficiencies of the firms in the non-efficient groups were also computed and analyzed. Finally, an example of benchmarking a non-efficient firm with DEA is presented.
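    Technical-efficiency scores of the kind used in studies like this are commonly obtained from the input-oriented CCR envelopment model, solved as one small linear program per firm (DMU). The sketch below, with made-up input/output data, is a minimal illustration of that model only; it does not reproduce the authors' weight-restricted formulation.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def ccr_input_efficiency(X, Y, o):
        """Input-oriented CCR efficiency of DMU o.

        X: (m inputs x n DMUs), Y: (s outputs x n DMUs).
        Decision variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
        """
        m, n = X.shape
        s = Y.shape[0]
        c = np.r_[1.0, np.zeros(n)]
        A_in = np.hstack([-X[:, [o]], X])          # sum_j lam_j x_ij - theta x_io <= 0
        A_out = np.hstack([np.zeros((s, 1)), -Y])  # -sum_j lam_j y_rj <= -y_ro
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(m), -Y[:, o]]
        bounds = [(None, None)] + [(0, None)] * n
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        return res.x[0]

    # Hypothetical firms: inputs = (fleet size, fuel), output = passenger-km.
    X = np.array([[20, 35, 50, 28], [100, 160, 260, 150]], dtype=float)
    Y = np.array([[1000, 1500, 2000, 1400]], dtype=float)
    for o in range(X.shape[1]):
        print(f"DMU {o}: efficiency = {ccr_input_efficiency(X, Y, o):.3f}")
    ```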

  10. NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.

    Science.gov (United States)

    Zhang, Bo; Dai, Ji; Zhang, Tao

    2017-11-13

    In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train, and local field potentials analysis, behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. Particularly, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided

  11. Efficient high-throughput biological process characterization: Definitive screening design with the ambr250 bioreactor system.

    Science.gov (United States)

    Tai, Mitchell; Ly, Amanda; Leung, Inne; Nayar, Gautam

    2015-01-01

    The burgeoning pipeline for new biologic drugs has increased the need for high-throughput process characterization to efficiently use process development resources. Breakthroughs in highly automated and parallelized upstream process development have led to technologies such as the 250-mL automated mini bioreactor (ambr250™) system. Furthermore, developments in modern design of experiments (DoE) have promoted the use of definitive screening design (DSD) as an efficient method to combine factor screening and characterization. Here we utilize the 24-bioreactor ambr250™ system with 10-factor DSD to demonstrate a systematic experimental workflow to efficiently characterize an Escherichia coli (E. coli) fermentation process for recombinant protein production. The generated process model is further validated by laboratory-scale experiments and shows how the strategy is useful for quality by design (QbD) approaches to control strategies for late-stage characterization. © 2015 American Institute of Chemical Engineers.

  12. A Foundation for Efficient Indoor Distance-Aware Query Processing

    DEFF Research Database (Denmark)

    Lu, Hua; Cao, Xin; Jensen, Christian Søndergaard

    2012-01-01

    Indoor spaces accommodate large numbers of spatial objects, e.g., points of interest (POIs), and moving populations. A variety of services, e.g., location-based services and security control, are relevant to indoor spaces. Such services can be improved substantially if they are capable of utilizing... a model that integrates indoor distance seamlessly. To enable the use of the model as a foundation for query processing, we develop accompanying, efficient algorithms that compute indoor distances for different indoor entities like doors as well as locations. We also propose an indexing framework... that accommodates indoor distances that are pre-computed using the proposed algorithms. On top of this foundation, we develop efficient algorithms for typical indoor, distance-aware queries. The results of an extensive experimental evaluation demonstrate the efficacy of the proposals....
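    Distance-aware indoor queries ultimately rest on door-to-door distances in a graph whose nodes are doors (and query locations) and whose edges are walkable distances within partitions. The sketch below computes such distances with a plain Dijkstra search on a small hypothetical door graph; the paper's actual model, pre-computation and indexing are not reproduced here.

    ```python
    import heapq

    def shortest_indoor_distances(graph, source):
        """Dijkstra over a door-connectivity graph.

        graph: {node: [(neighbor, walking_distance), ...]}; distances in metres.
        Returns the minimum indoor distance from `source` to every reachable node.
        """
        dist = {source: 0.0}
        heap = [(0.0, source)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float("inf")):
                continue                      # stale queue entry
            for v, w in graph.get(u, []):
                nd = d + w
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    heapq.heappush(heap, (nd, v))
        return dist

    # Hypothetical doors d1..d4 and a POI; edges are walkable distances within partitions.
    doors = {
        "d1": [("d2", 12.0), ("d3", 20.0)],
        "d2": [("d1", 12.0), ("d4", 8.0)],
        "d3": [("d1", 20.0), ("d4", 6.0), ("poi", 9.0)],
        "d4": [("d2", 8.0), ("d3", 6.0), ("poi", 4.0)],
        "poi": [("d3", 9.0), ("d4", 4.0)],
    }
    print(shortest_indoor_distances(doors, "d1"))
    ```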

  13. Urban water metabolism efficiency assessment: integrated analysis of available and virtual water.

    Science.gov (United States)

    Huang, Chu-Long; Vause, Jonathan; Ma, Hwong-Wen; Yu, Chang-Ping

    2013-05-01

    Resolving the complex environmental problems of water pollution and shortage which occur during urbanization requires the systematic assessment of urban water metabolism efficiency (WME). While previous research has tended to focus on either available or virtual water metabolism, here we argue that the systematic problems arising during urbanization require an integrated assessment of available and virtual WME, using an indicator system based on material flow analysis (MFA) results. Future research should focus on the following areas: 1) analysis of available and virtual water flow patterns and processes through urban districts in different urbanization phases in years with varying amounts of rainfall, and their environmental effects; 2) based on the optimization of social, economic and environmental benefits, establishment of an indicator system for urban WME assessment using MFA results; 3) integrated assessment of available and virtual WME in districts with different urbanization levels, to facilitate study of the interactions between the natural and social water cycles; 4) analysis of mechanisms driving differences in WME between districts with different urbanization levels, and the selection of dominant social and economic driving indicators, especially those impacting water resource consumption. Combinations of these driving indicators could then be used to design efficient water resource metabolism solutions, and integrated management policies for reduced water consumption. Copyright © 2013 Elsevier B.V. All rights reserved.

  14. Efficiency assessment of wind farms in China using two-stage data envelopment analysis

    International Nuclear Information System (INIS)

    Wu, Yunna; Hu, Yong; Xiao, Xinli; Mao, Chunyu

    2016-01-01

    Highlights: • The efficiency of China’s wind farms is assessed by data envelopment analysis. • Tobit model is used to analyze the impact of uncontrollable factors on efficiency. • Sensitivity analysis is conducted to verify the stability of evaluation results. • Efficiency levels of Chinese wind farms are relatively high in general. • Age and wind curtailment rate negatively affect the productive efficiency. - Abstract: China has been the world’s leader in wind power capacity due to the promotion of favorable policies. Given the rare research on the efficiency of China’s wind farms, this study analyzes the productive efficiency of 42 large-scale wind farms in China using a two-stage analysis. In the first stage, efficiency scores of wind farms are determined with data envelopment analysis and the sensitivity analysis is conducted to verify the robustness of efficiency calculation results. In the second stage, the Tobit regression is employed to explore the relationship between the efficiency scores and the environment variables that are beyond the control of wind farms. According to the results, all wind farms studied operate at an acceptable level. However, 50% of them overinvest in the installed capacity and about 48% have the electricity-saving potential. The most important factors affecting the efficiency of wind farms are the installed capacity and the wind power density. In addition, the age of the wind farm and the wind curtailment rate have a negative effect on productive efficiency, whereas the ownership of the wind farm has no significant effect. Findings from this study may be helpful for stakeholders in the wind industry to select wind power projects, optimize operational strategies and make related policies.
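    In two-stage DEA studies of this kind, the second stage is typically a Tobit regression of the (censored) efficiency scores on environmental variables. statsmodels does not ship a Tobit estimator, so the sketch below fits one by maximum likelihood with scipy for scores right-censored at 1; the data are synthetic and the variable names hypothetical, not the paper's dataset.

    ```python
    import numpy as np
    from scipy import stats, optimize

    def tobit_right_censored_nll(params, X, y, limit=1.0):
        """Negative log-likelihood of a Tobit model with right censoring at `limit`."""
        beta, log_sigma = params[:-1], params[-1]
        sigma = np.exp(log_sigma)
        xb = X @ beta
        censored = y >= limit
        ll_unc = stats.norm.logpdf((y[~censored] - xb[~censored]) / sigma) - np.log(sigma)
        ll_cen = stats.norm.logsf((limit - xb[censored]) / sigma)
        return -(ll_unc.sum() + ll_cen.sum())

    # Synthetic example: efficiency score explained by wind farm age and curtailment rate.
    rng = np.random.default_rng(1)
    n = 200
    X = np.column_stack([np.ones(n), rng.uniform(0, 10, n), rng.uniform(0, 0.2, n)])
    latent = X @ np.array([1.05, -0.02, -0.5]) + rng.normal(scale=0.05, size=n)
    y = np.minimum(latent, 1.0)                    # DEA scores cannot exceed 1

    x0 = np.r_[np.zeros(X.shape[1]), 0.0]          # start at beta = 0, log(sigma) = 0
    fit = optimize.minimize(tobit_right_censored_nll, x0, args=(X, y), method="BFGS")
    print("beta:", fit.x[:-1], "sigma:", np.exp(fit.x[-1]))
    ```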

  15. Non-parametric analysis of production efficiency of poultry egg ...

    African Journals Online (AJOL)

    Non-parametric analysis of production efficiency of poultry egg farmers in Delta ... analysis of factors affecting the output of poultry farmers showed that stock ... should be put in place for farmers to learn the best farm practices carried out on the ...

  16. Biomass Gasification for Power Generation Internal Combustion Engines. Process Efficiency

    International Nuclear Information System (INIS)

    Lesme-Jaén, René; Garcia-Faure, Luis; Oliva-Ruiz, Luis; Pajarín-Rodríguez, Juan; Revilla-Suarez, Dennis

    2016-01-01

    Biomass is one of the renewable energy sources with the greatest worldwide prospects, owing to its potential and its lower environmental impact compared to fossil fuels. Through different processes and energy conversion technologies it is possible to obtain solid, liquid and gaseous fuels from any biomass. In this paper, the evaluation of the thermal and overall efficiency of the gasification plant of the Integral Forestry Company of Santiago de Cuba is presented; the plant is designed for electricity generation from forest industry waste. The gasifier is a downdraft reactor, COMBO-80 model of Indian manufacture, and the engine is a diesel Leyland model modified to work with producer gas. The evaluation was conducted at different loads (electric power generated) of the engine, from experimental measurements of the flow and composition of the gas supplied to the engine. The results show that the engine operates with a thermal efficiency in the range of 20-32% and an overall efficiency between 12-25%. (author)

  17. THE ANALYSIS OF POSSIBILITIES OF INTERNET RESOURCES TO IMPROVE THE EFFICIENCY THE EDUCATIONAL PROCESS

    Directory of Open Access Journals (Sweden)

    С А Усманов

    2017-12-01

    Full Text Available The article analyzes educational Internet resources and gives a short characterization of their main capabilities. Today it is impossible to imagine the educational process without the use of Internet resources. The modern teacher has to be able to work with the information needed for his or her professional activity and professional tasks, and to have the skills to cooperate with pupils on the basis of information exchange. An educational Internet resource is a complete, named, interconnected, uniformly and systemically organized set that includes general educational, formalized and professionally significant knowledge, the means of organizational and methodological support of the educational process, and the means for their automated storage, accumulation and processing. Internet resources are designed to satisfy the needs of the user in various aspects and spheres of educational activity. Quite often a single resource combines several properties of a medium as complex in structure and functioning as the Internet. Use of the Internet allows feedback to be established between the teacher and pupils when they perform independent work, and allows materials to be distributed and surveys to be conducted. Lessons that use Internet resources represent a fusion of new information technologies with new pedagogical technologies.

  18. Evaluating ecommerce websites cognitive efficiency: an integrative framework based on data envelopment analysis.

    Science.gov (United States)

    Lo Storto, Corrado

    2013-11-01

    This paper presents an integrative framework to evaluate ecommerce website efficiency from the user viewpoint using Data Envelopment Analysis (DEA). This framework is inspired by concepts driven from theories of information processing and cognition and considers the website efficiency as a measure of its quality and performance. When the users interact with the website interfaces to perform a task, they are involved in a cognitive effort, sustaining a cognitive cost to search, interpret and process information, and experiencing either a sense of satisfaction or dissatisfaction for that. The amount of ambiguity and uncertainty, and the search (over-)time during navigation that they perceive determine the effort size - and, as a consequence, the cognitive cost amount - they have to bear to perform their task. On the contrary, task performing and result achievement provide the users with cognitive benefits, making interaction with the website potentially attractive, satisfying, and useful. In total, 9 variables are measured, classified in a set of 3 website macro-dimensions (user experience, site navigability and structure). The framework is implemented to compare 52 ecommerce websites that sell products in the information technology and media market. A stepwise regression is performed to assess the influence of cognitive costs and benefits that mostly affect website efficiency. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  19. Eco-efficiency analysis methodology on the example of the chosen polyolefins production

    OpenAIRE

    K. Czaplicka-Kolarz; D. Burchart-Korol; P. Krawczyk

    2010-01-01

    Purpose: The purpose of this paper is to present an eco-efficiency analysis methodology on the example of the chosen polyolefins production. The article also presents the main tools of eco-efficiency analysis: Life Cycle Assessment (LCA) and Net Present Value (NPV). Design/methodology/approach: On the basis of the LCA and NPV of high density polyethylene (HDPE) and low density polyethylene (LDPE) production, an eco-efficiency analysis is conducted. Findings: In this article the environmental and economic performance of the chosen polyolefins production was presented. The basic phases of the eco-efficiency methodology...

  20. THE EFFICIENCY ANALYSIS OF SINGAPORE REAL ESTATE INVESTMENT TRUSTS

    Directory of Open Access Journals (Sweden)

    Hui Chen Chiang

    2016-11-01

    Full Text Available Since the REIT industry is relatively new in Singapore, the objective of this research is to examine operating efficiency among firms in the industry through the method of Data Envelopment Analysis (DEA). In addition, the method of Tobit regression is applied to investigate the factors affecting efficiency. The results are as follows. First, none of the 14 firms analyzed performs relatively efficiently based on the average efficiency scores over the sample period from 2007 to the first quarter of 2015. Nevertheless, it is found that First REIT and Suntec REIT are the most efficient and least efficient REITs, respectively. Second, ROA is positively correlated with efficiency scores, while a negative relationship is found with the debt ratio. Third, regarding property type, retail REITs perform better than commercial ones on average. However, the most efficient group is “others”, which consists of one hospitality/residential REIT, one healthcare REIT, and three industrial REITs. Fourth, geographical diversification may not affect a REIT’s efficiency. Meanwhile, REITs holding more properties overseas perform better than their counterparts on average. Last, in regard to size, small-size REITs perform significantly better in efficiency than those in other categories. In particular, medium-size and large-size REITs do not differ significantly in average efficiency.

  1. Efficiencies and Physical Principles of Various Solar Energy Conversion Processes Leading to the Photolysis of Water

    Energy Technology Data Exchange (ETDEWEB)

    Bergene, T

    1996-12-31

    In the application of solar energy, hydrogen is likely to be used as an energy carrier and a storage medium. Production of molecular hydrogen and oxygen from water requires energy input, which may come from solar energy in various ways. This thesis begins with a literature survey of the different conversion processes and their efficiencies, which serves as an introduction to a series of enclosed papers. These papers are: (1) Trapping of Minority Charge Carriers at Irradiated Semiconductor/Electrolyte Heterojunctions, (2) Model Calculations on Flat-Plate Solar Heat Collector With Integrated Solar Cells, and (3) Efficiencies and Physical Principles of Photolysis of Water By Microalgae. In the papers, the qualitative features of the “illumination-current” characteristic curve are deduced. The hypothesis is that trapping originates in some specific cases because of confinement, which leads to charge injection into energy states above that corresponding to the band edge. The quantitative features of a certain hybrid photovoltaic/thermal configuration are deduced. An analysis of the theoretical and realizable efficiencies of the photolysis of water by microalgae is given. 151 refs., 18 figs., 1 table

  2. The Matsu Wheel: A Cloud-Based Framework for Efficient Analysis and Reanalysis of Earth Satellite Imagery

    Science.gov (United States)

    Patterson, Maria T.; Anderson, Nicholas; Bennett, Collin; Bruggemann, Jacob; Grossman, Robert L.; Handy, Matthew; Ly, Vuong; Mandl, Daniel J.; Pederson, Shane; Pivarski, James

    2016-01-01

    Project Matsu is a collaboration between the Open Commons Consortium and NASA focused on developing open source technology for cloud-based processing of Earth satellite imagery with practical applications to aid in natural disaster detection and relief. Project Matsu has developed an open source cloud-based infrastructure to process, analyze, and reanalyze large collections of hyperspectral satellite image data using OpenStack, Hadoop, MapReduce and related technologies. We describe a framework for efficient analysis of large amounts of data called the Matsu "Wheel." The Matsu Wheel is currently used to process incoming hyperspectral satellite data produced daily by NASA's Earth Observing-1 (EO-1) satellite. The framework allows batches of analytics, scanning for new data, to be applied to data as it flows in. In the Matsu Wheel, the data only need to be accessed and preprocessed once, regardless of the number or types of analytics, which can easily be slotted into the existing framework. The Matsu Wheel system provides a significantly more efficient use of computational resources over alternative methods when the data are large, have high-volume throughput, may require heavy preprocessing, and are typically used for many types of analysis. We also describe our preliminary Wheel analytics, including an anomaly detector for rare spectral signatures or thermal anomalies in hyperspectral data and a land cover classifier that can be used for water and flood detection. Each of these analytics can generate visual reports accessible via the web for the public and interested decision makers. The resulting products of the analytics are also made accessible through an Open Geospatial Consortium (OGC)-compliant Web Map Service (WMS) for further distribution. The Matsu Wheel allows many shared data services to be performed together to efficiently use resources for processing hyperspectral satellite image data and other, e.g., large environmental datasets that may be analyzed for

  3. Energy Savings Through Thermally Efficient Crucible Technology: Fundamentals, Process Modeling, and Applications

    Science.gov (United States)

    Shi, Wenwu; Pinto, Brian

    2017-12-01

    Melting and holding molten metals within crucibles accounts for a large portion of total energy demand in the resource-intensive nonferrous foundry industry. Multivariate mathematical modeling aided by detailed material characterization and advancements in crucible technologies can make a significant impact in the areas of cost-efficiency and carbon footprint reduction. Key thermal properties such as conductivity and specific heat capacity were studied to understand their influence on crucible furnace energy consumption during melting and holding processes. The effects of conductivity on thermal stresses and longevity of crucibles were also evaluated. With this information, accurate theoretical models using finite element analysis were developed to study total energy consumption and melting time. By applying these findings to recent crucible developments, considerable improvements in field performance were reported and documented as case studies in applications such as aluminum melting and holding.

  4. Survival Processing Enhances Visual Search Efficiency.

    Science.gov (United States)

    Cho, Kit W

    2018-05-01

    Words rated for their survival relevance are remembered better than when rated using other well-known memory mnemonics. This finding, which is known as the survival advantage effect and has been replicated in many studies, suggests that our memory systems are molded by natural selection pressures. In two experiments, the present study used a visual search task to examine whether there is likewise a survival advantage for our visual systems. Participants rated words for their survival relevance or for their pleasantness before locating that object's picture in a search array with 8 or 16 objects. Although there was no difference in search times among the two rating scenarios when set size was 8, survival processing reduced visual search times when set size was 16. These findings reflect a search efficiency effect and suggest that similar to our memory systems, our visual systems are also tuned toward self-preservation.

  5. Java technology for implementing efficient numerical analysis in intranet

    International Nuclear Information System (INIS)

    Song, Hee Yong; Ko, Sung Ho

    2001-01-01

    This paper introduces some useful Java technologies for utilizing the internet in numerical analysis, and suggests an architecture for performing efficient numerical analysis in the intranet by using them. The present work has verified its feasibility by implementing some parts of this architecture with two simple examples. One is based on Servlet-Applet communication, JDBC and Swing. The other adds multi-threading, file transfer and Java remote method invocation to the former. Through this work it has been intended to lay the basis for later, more advanced and practical research that will include efficiency estimates of this architecture and deal with advanced load balancing

  6. Operator-based linearization for efficient modeling of geothermal processes

    OpenAIRE

    Khait, M.; Voskov, D.V.

    2018-01-01

    Numerical simulation is one of the most important tools required for financial and operational management of geothermal reservoirs. The modern geothermal industry is challenged to run large ensembles of numerical models for uncertainty analysis, causing simulation performance to become a critical issue. Geothermal reservoir modeling requires the solution of governing equations describing the conservation of mass and energy. The robust, accurate and computationally efficient implementation of ...

  7. Evaluation of energy efficiency efforts of oil and gas offshore processing

    DEFF Research Database (Denmark)

    Nguyen, Tuong-Van; Voldsund, Mari; Breuhaus, Peter

    2015-01-01

    Several technologies have been proposed to improve the energy performance of these facilities, by decreasing the power and heating requirements and designing more efficient processes: (i) promote energy integration within the oil and gas processing plant, (ii) add an additional pressure extraction level, (iii) implement multiphase expanders, and (iv) install a waste heat recovery system. The present work builds on two case studies located in the North and Norwegian Seas, which differ by the type of oil processed, operating conditions and strategies. The findings suggest that no generic improvement can...

  8. An Efficient Experimental Design Strategy for Modelling and Characterization of Processes

    DEFF Research Database (Denmark)

    Tajsoleiman, Tannaz; Semenova, Daria; Oliveira Fernandes, Ana Carolina

    2017-01-01

    Designing robust, efficient and economic processes is a main challenge for the biotech industries. To achieve a well-designed bioprocess, understanding the ongoing phenomena and the involved reaction kinetics is crucial. By development of advanced miniaturized reactors, a promising opportunity ar...

  9. Targeting for energy efficiency and improved energy collaboration between different companies using total site analysis (TSA)

    International Nuclear Information System (INIS)

    Hackl, Roman; Andersson, Eva; Harvey, Simon

    2011-01-01

    Rising fuel prices, increasing costs associated with emissions of greenhouse gases and the threat of global warming make efficient use of energy more and more important. Industrial clusters have the potential to significantly increase energy efficiency through energy collaboration. In this paper Sweden's largest chemical cluster is analysed using the total site analysis (TSA) method. TSA delivers targets for the amount of utility consumed and generated through excess energy recovery by the different processes. The method enables investigation of opportunities to deliver waste heat from one process to another using a common utility system. The cluster consists of 5 chemical companies producing a variety of products, including polyethylene (PE), polyvinyl chloride (PVC), amines, ethylene, oxygen/nitrogen and plasticisers. The companies already work together by exchanging material streams. In this study the potential for energy collaboration is analysed in order to reach an industrial symbiosis. The overall heating and cooling demands of the site are around 442 MW and 953 MW, respectively. 122 MW of heat is produced in boilers and delivered to the processes. TSA is used to stepwise design a site-wide utility system which improves energy efficiency. It is shown that heat recovery in the cluster can be increased by 129 MW, i.e. the current utility demand could be completely eliminated and a further 7 MW of excess steam could be made available. The proposed retrofitted utility system involves the introduction of a site-wide hot water circuit, increased recovery of low pressure steam and the shifting of heating steam pressure to lower levels in a number of heat exchangers where possible. Qualitative evaluation of the suggested measures shows that 60 MW of the savings potential could be achieved with moderate changes to the process utility system, corresponding to 50% of the heat produced from purchased fuel in the boilers of the cluster. Further analysis showed that after implementation
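    The utility targets that total site analysis builds on come from pinch-style cascade calculations. The sketch below implements the classic problem table algorithm for a single process with hypothetical stream data, giving minimum hot and cold utility targets for a chosen ΔTmin; site-wide TSA extends this idea by cascading heat between processes through the common utility system.

    ```python
    def problem_table(hot, cold, dt_min):
        """Minimum hot/cold utility targets via the problem table (cascade) algorithm.

        hot, cold: lists of (T_supply, T_target, CP) with CP in kW/K.
        Hot streams are shifted down by dt_min/2, cold streams up by dt_min/2.
        """
        shifted = [(ts - dt_min / 2, tt - dt_min / 2, cp, "hot") for ts, tt, cp in hot]
        shifted += [(ts + dt_min / 2, tt + dt_min / 2, cp, "cold") for ts, tt, cp in cold]

        temps = sorted({t for ts, tt, cp, kind in shifted for t in (ts, tt)}, reverse=True)

        deficits = []
        for t_hi, t_lo in zip(temps[:-1], temps[1:]):
            cp_hot = sum(cp for ts, tt, cp, k in shifted
                         if k == "hot" and min(ts, tt) <= t_lo and max(ts, tt) >= t_hi)
            cp_cold = sum(cp for ts, tt, cp, k in shifted
                          if k == "cold" and min(ts, tt) <= t_lo and max(ts, tt) >= t_hi)
            deficits.append((cp_cold - cp_hot) * (t_hi - t_lo))   # heat needed in interval

        # Cascade the surplus downwards; the most negative point sets the hot utility target.
        cascade, heat = [0.0], 0.0
        for d in deficits:
            heat -= d
            cascade.append(heat)
        q_hot_min = -min(cascade)
        q_cold_min = cascade[-1] + q_hot_min
        return q_hot_min, q_cold_min

    # Hypothetical streams: (T_supply in deg C, T_target in deg C, CP in kW/K)
    hot_streams = [(180, 60, 3.0), (150, 30, 1.5)]
    cold_streams = [(30, 135, 2.0), (80, 140, 5.0)]
    print(problem_table(hot_streams, cold_streams, dt_min=10))   # (Q_hot_min, Q_cold_min) in kW
    ```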

  10. Automatic Scaling Hadoop in the Cloud for Efficient Process of Big Geospatial Data

    Directory of Open Access Journals (Sweden)

    Zhenlong Li

    2016-09-01

    Full Text Available Efficient processing of big geospatial data is crucial for tackling global and regional challenges such as climate change and natural disasters, but it is challenging not only due to the massive data volume but also due to the intrinsic complexity and high dimensions of the geospatial datasets. While traditional computing infrastructure does not scale well with the rapidly increasing data volume, Hadoop has attracted increasing attention in geoscience communities for handling big geospatial data. Recently, many studies were carried out to investigate adopting Hadoop for processing big geospatial data, but how to adjust the computing resources to efficiently handle the dynamic geoprocessing workload was barely explored. To bridge this gap, we propose a novel framework to automatically scale the Hadoop cluster in the cloud environment to allocate the right amount of computing resources based on the dynamic geoprocessing workload. The framework and auto-scaling algorithms are introduced, and a prototype system was developed to demonstrate the feasibility and efficiency of the proposed scaling mechanism using Digital Elevation Model (DEM) interpolation as an example. Experimental results show that this auto-scaling framework could (1) significantly reduce the computing resource utilization (by 80% in our example) while delivering similar performance as a full-powered cluster; and (2) effectively handle the spike processing workload by automatically increasing the computing resources to ensure the processing is finished within an acceptable time. Such an auto-scaling approach provides a valuable reference to optimize the performance of geospatial applications to address data- and computational-intensity challenges in GIScience in a more cost-efficient manner.
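    The core of such an auto-scaling mechanism is a feedback loop that compares the observed geoprocessing workload with cluster capacity and adds or removes worker nodes accordingly. The sketch below is a deliberately simplified, hypothetical threshold-based controller; the paper's actual algorithms and the cloud/Hadoop APIs for resizing a cluster are not reproduced here.

    ```python
    from dataclasses import dataclass

    @dataclass
    class ScalingPolicy:
        min_nodes: int = 2
        max_nodes: int = 20
        scale_out_util: float = 0.80   # grow when utilization exceeds this
        scale_in_util: float = 0.30    # shrink when utilization falls below this
        step: int = 2                  # nodes added/removed per decision

    def decide(nodes: int, pending_tasks: int, slots_per_node: int,
               policy: ScalingPolicy) -> int:
        """Return the target cluster size for the next monitoring interval."""
        capacity = nodes * slots_per_node
        utilization = pending_tasks / capacity if capacity else 1.0
        if utilization > policy.scale_out_util:
            nodes = min(nodes + policy.step, policy.max_nodes)
        elif utilization < policy.scale_in_util:
            nodes = max(nodes - policy.step, policy.min_nodes)
        return nodes

    # Simulated workload spike (e.g., a burst of DEM interpolation jobs) followed by a lull.
    policy, nodes = ScalingPolicy(), 4
    for pending in [10, 60, 120, 120, 40, 5, 5]:
        nodes = decide(nodes, pending, slots_per_node=8, policy=policy)
        print(f"pending={pending:3d} -> target nodes={nodes}")
    ```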

  11. Business Process Design Of An Efficient And Effective Literature Review

    Directory of Open Access Journals (Sweden)

    Sayuthi

    2015-08-01

    Full Text Available The objective of this article is to design an organization's business processes efficiently and effectively. Based on our literature review, the design of business processes that is best suited for an organization belongs to Harrington (1992), namely the concept of Business Process Improvement (BPI), which is a systematic framework that helps organizations make significant progress in the implementation of business processes. BPI provides a system that will simplify or streamline business processes to provide an assurance that the internal and external customers of the organization will get a better output. One advantage of the BPI concept suggested by Harrington is continuous improvement, whereas other authors or experts of BPI have not recognized the idea of continuous improvement. With this idea, the products and services offered by the organization become more innovative.

  12. Towards efficient next generation light sources: combined solution processed and evaporated layers for OLEDs

    Science.gov (United States)

    Hartmann, D.; Sarfert, W.; Meier, S.; Bolink, H.; García Santamaría, S.; Wecker, J.

    2010-05-01

    Typically, highly efficient OLED device structures are based on a multitude of stacked thin organic layers prepared by thermal evaporation. For lighting applications these efficient device stacks have to be up-scaled to large areas, which is clearly challenging in terms of high-throughput processing at low cost. One promising approach to meet cost-efficiency, high throughput and high light output is the combination of solution and evaporation processing. Moreover, the objective is to substitute as many thermally evaporated layers as possible by solution processing without sacrificing the device performance. Hence, starting from the anode side, evaporated layers of an efficient white light emitting OLED stack are stepwise replaced by solution-processable polymer and small molecule layers. In doing so, different solution-processable hole injection layers (polymer HILs) are integrated into small molecule devices and evaluated with regard to their electro-optical performance as well as their planarizing properties, meaning the ability to cover ITO spikes, defects and dust particles. Two approaches are followed: in the case of the "single HIL" approach only one polymer HIL is coated, and in the case of the "combined HIL" concept the coated polymer HIL is combined with a thin evaporated HIL. These HIL architectures are studied in unipolar as well as bipolar devices. As a result, the combined HIL approach facilitates better control over the hole current, improved device stability as well as improved current and power efficiency compared to a single HIL as well as pure small molecule based OLED stacks. Furthermore, emitting layers based on guest/host small molecules are fabricated from solution and integrated into a white hybrid stack (WHS). Up to three evaporated layers were successfully replaced by solution processing, showing white light emission spectra comparable to an evaporated small molecule reference stack and lifetime values of several 100 h.

  13. Energy use and implications for efficiency strategies in global fluid-milk processing industry

    International Nuclear Information System (INIS)

    Xu Tengfang; Flapper, Joris

    2009-01-01

    The fluid-milk processing industry around the world processes approximately 60% of total raw milk production to create diverse fresh fluid-milk products. This paper reviews energy usage in existing global fluid-milk markets to identify baseline information that allows comparisons of energy performance of individual plants and systems. In this paper, we analyzed energy data compiled through extensive literature reviews on fluid-milk processing across a number of countries and regions. The study has found that the average final energy intensity of individual plants exhibited significant large variations, ranging from 0.2 to 12.6 MJ per kg fluid-milk product across various plants in different countries and regions. In addition, it is observed that while the majority of larger plants tended to exhibit higher energy efficiency, some exceptions existed for smaller plants with higher efficiency. These significant differences have indicated large potential energy-savings opportunities in the sector across many countries. Furthermore, this paper illustrates a positive correlation between implementing energy-monitoring programs and curbing the increasing trend in energy demand per equivalent fluid-milk product over time in the fluid-milk sector, and suggests that developing an energy-benchmarking framework, along with promulgating new policy options should be pursued for improving energy efficiency in global fluid-milk processing industry.

  14. Thermodynamic, energy efficiency, and power density analysis of reverse electrodialysis power generation with natural salinity gradients.

    Science.gov (United States)

    Yip, Ngai Yin; Vermaas, David A; Nijmeijer, Kitty; Elimelech, Menachem

    2014-05-06

    Reverse electrodialysis (RED) can harness the Gibbs free energy of mixing when fresh river water flows into the sea for sustainable power generation. In this study, we carry out a thermodynamic and energy efficiency analysis of RED power generation, and assess the membrane power density. First, we present a reversible thermodynamic model for RED and verify that the theoretical maximum extractable work in a reversible RED process is identical to the Gibbs free energy of mixing. Work extraction in an irreversible process with maximized power density using a constant-resistance load is then examined to assess the energy conversion efficiency and power density. With equal volumes of seawater and river water, energy conversion efficiency of ∼ 33-44% can be obtained in RED, while the rest is lost through dissipation in the internal resistance of the ion-exchange membrane stack. We show that imperfections in the selectivity of typical ion exchange membranes (namely, co-ion transport, osmosis, and electro-osmosis) can detrimentally lower efficiency by up to 26%, with co-ion leakage being the dominant effect. Further inspection of the power density profile during RED revealed inherent ineffectiveness toward the end of the process. By judicious early discontinuation of the controlled mixing process, the overall power density performance can be considerably enhanced by up to 7-fold, without significant compromise to the energy efficiency. Additionally, membrane resistance was found to be an important factor in determining the power densities attainable. Lastly, the performance of an RED stack was examined for different membrane conductivities and intermembrane distances simulating high performance membranes and stack design. By thoughtful selection of the operating parameters, an efficiency of ∼ 37% and an overall gross power density of 3.5 W/m(2) represent the maximum performance that can potentially be achieved in a seawater-river water RED system with low
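    The theoretical maximum work referred to here is the Gibbs free energy of mixing of the two solutions. The sketch below evaluates it for an ideal dilute NaCl solution (full dissociation, activity coefficients of 1), with representative seawater and river water concentrations chosen for illustration only, not taken from the paper.

    ```python
    import numpy as np

    R = 8.314        # J/(mol K)
    T = 298.15       # K
    C_WATER = 55.4e3 # mol of water per m^3, approximate

    def ideal_solution_g(volume_m3, c_salt):
        """RT * sum(n_i ln x_i) for water plus fully dissociated NaCl (two ions)."""
        n_ions = 2 * c_salt * volume_m3
        n_water = C_WATER * volume_m3
        n = np.array([n_ions, n_water])
        x = n / n.sum()
        return R * T * np.sum(n * np.log(x))

    def gibbs_free_energy_of_mixing(v_sea, c_sea, v_river, c_river):
        """Maximum extractable work (J) when the two volumes mix to a uniform concentration."""
        v_mix = v_sea + v_river
        c_mix = (v_sea * c_sea + v_river * c_river) / v_mix
        g_mixed = ideal_solution_g(v_mix, c_mix)
        g_separate = ideal_solution_g(v_sea, c_sea) + ideal_solution_g(v_river, c_river)
        return -(g_mixed - g_separate)   # positive: work released by mixing

    # Representative (illustrative) values: seawater ~0.6 M, river water ~0.01 M NaCl.
    dG = gibbs_free_energy_of_mixing(v_sea=1.0, c_sea=600.0, v_river=1.0, c_river=10.0)
    print(f"Theoretical maximum work: {dG/1e6:.2f} MJ per 1 m^3 + 1 m^3 mixed")
    ```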

  15. An efficient solution of real-time data processing for multi-GNSS network

    Science.gov (United States)

    Gong, Xiaopeng; Gu, Shengfeng; Lou, Yidong; Zheng, Fu; Ge, Maorong; Liu, Jingnan

    2017-12-01

    Global navigation satellite systems (GNSS) are acting as an indispensable tool for geodetic research and global monitoring of the Earth, and they have been rapidly developed over the past few years with abundant GNSS networks, modern constellations, and significant improvement in mathematic models of data processing. However, due to the increasing number of satellites and stations, the computational efficiency becomes a key issue and it could hamper the further development of GNSS applications. In this contribution, this problem is overcome from the aspects of both dense linear algebra algorithms and GNSS processing strategy. First, in order to fully explore the power of modern microprocessors, the square root information filter solution based on the blocked QR factorization employing as many matrix-matrix operations as possible is introduced. In addition, the algorithm complexity of GNSS data processing is further decreased by centralizing the carrier-phase observations and ambiguity parameters, as well as performing the real-time ambiguity resolution and elimination. Based on the QR factorization of the simulated matrix, we can conclude that compared to unblocked QR factorization, the blocked QR factorization can greatly improve processing efficiency with a magnitude of nearly two orders on a personal computer with four 3.30 GHz cores. Then, with 82 globally distributed stations, the processing efficiency is further validated in multi-GNSS (GPS/BDS/Galileo) satellite clock estimation. The results suggest that it will take about 31.38 s per epoch for the unblocked method. While, without any loss of accuracy, it only takes 0.50 and 0.31 s for our new algorithm per epoch for float and fixed clock solutions, respectively.
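    The square root information filter mentioned here carries the state as an upper-triangular information factor R with right-hand side z, and each measurement update is a QR factorization of the stacked prior and whitened measurement equations; blocking that factorization is what lets matrix-matrix (BLAS-3) kernels dominate. The sketch below shows only the generic SRIF update using NumPy's LAPACK-backed QR on random data, with diagonal measurement noise assumed; it does not reproduce the authors' GNSS-specific parameter handling or ambiguity resolution.

    ```python
    import numpy as np

    def srif_measurement_update(R_prior, z_prior, H, z, sigma):
        """One SRIF update: QR-factorize the stacked prior and (whitened) measurements.

        Returns the updated triangular information factor R and right-hand side z,
        such that the state estimate solves R x = z.
        """
        Hw = H / sigma                      # whiten measurements (diagonal noise assumed)
        zw = z / sigma
        stacked = np.vstack([np.c_[R_prior, z_prior], np.c_[Hw, zw]])
        # LAPACK's blocked Householder QR does the heavy lifting here.
        q, r = np.linalg.qr(stacked)
        n = R_prior.shape[0]
        return r[:n, :n], r[:n, n]

    rng = np.random.default_rng(2)
    n, m = 50, 200                          # states, measurements per epoch
    x_true = rng.normal(size=n)
    R0 = np.eye(n) * 1e-3                   # weak prior information
    z0 = np.zeros(n)                        # prior mean of zero
    H = rng.normal(size=(m, n))
    z = H @ x_true + rng.normal(scale=0.01, size=m)

    R1, z1 = srif_measurement_update(R0, z0, H, z, sigma=0.01)
    x_est = np.linalg.solve(R1, z1)
    print(np.max(np.abs(x_est - x_true)))   # small estimation error
    ```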

  16. CLAMP - a toolkit for efficiently building customized clinical natural language processing pipelines.

    Science.gov (United States)

    Soysal, Ergin; Wang, Jingqi; Jiang, Min; Wu, Yonghui; Pakhomov, Serguei; Liu, Hongfang; Xu, Hua

    2017-11-24

    Existing general clinical natural language processing (NLP) systems such as MetaMap and Clinical Text Analysis and Knowledge Extraction System have been successfully applied to information extraction from clinical text. However, end users often have to customize existing systems for their individual tasks, which can require substantial NLP skills. Here we present CLAMP (Clinical Language Annotation, Modeling, and Processing), a newly developed clinical NLP toolkit that provides not only state-of-the-art NLP components, but also a user-friendly graphic user interface that can help users quickly build customized NLP pipelines for their individual applications. Our evaluation shows that the CLAMP default pipeline achieved good performance on named entity recognition and concept encoding. We also demonstrate the efficiency of the CLAMP graphic user interface in building customized, high-performance NLP pipelines with 2 use cases, extracting smoking status and lab test values. CLAMP is publicly available for research use, and we believe it is a unique asset for the clinical NLP community. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  17. Comparative analysis of technical efficiencies between compound ...

    African Journals Online (AJOL)

    This study was designed to compare the level of technical efficiency in the compound and non compound farms in Imo state. A multi-stage random sampling technique was used to select 120 food crop farmers from two out of the three agricultural zones in Imo state. Using the Chow (1960) analysis of covariance technique ...

  18. Econometric analysis of economic and environmental efficiency of Dutch dairy farms

    NARCIS (Netherlands)

    Reinhard, S.

    1999-01-01

    The Dutch government aims for competitive and sustainable farms, that use marketable inputs efficiently as well as apply environmentally detrimental variables efficiently in the production process. The objective of this research is to define, to estimate and to evaluate environmental

  19. ENERGY EFFICIENCY AS A CRITERION IN THE VEHICLE FLEET MANAGEMENT PROCESS

    Directory of Open Access Journals (Sweden)

    Davor Vujanović

    2010-01-01

    Full Text Available Transport represents an industry sector with intense energy consumption, within which the road transport sector is the dominant subsector. The objective of the research presented in this paper is to define the activities which, when applied within road freight transport companies, contribute to enhancing vehicles' energy efficiency. The effects of the vehicle fleet operation management process on decreasing fuel consumption have been examined. Operating parameters that influence vehicle fuel consumption were analysed. In this sense, a survey was carried out in order to evaluate the impact of the vehicle load factor on specific fuel consumption. Measures for enhancing a vehicle's logistics efficiency have been defined. As a tool for implementing those measures, an algorithm for vehicle fleet operation management was developed, which represented the basis for the development of a dedicated software package to support decisions in the vehicle dispatching process. A set of measures has been recommended and their effects in terms of fuel savings were evaluated.

  20. Statistical Analysis of the First Passage Path Ensemble of Jump Processes

    Science.gov (United States)

    von Kleist, Max; Schütte, Christof; Zhang, Wei

    2018-02-01

    The transition mechanism of jump processes between two different subsets in state space reveals important dynamical information of the processes and therefore has attracted considerable attention in the past years. In this paper, we study the first passage path ensemble of both discrete-time and continuous-time jump processes on a finite state space. The main approach is to divide each first passage path into nonreactive and reactive segments and to study them separately. The analysis can be applied to jump processes which are non-ergodic, as well as continuous-time jump processes where the waiting time distributions are non-exponential. In the particular case that the jump processes are both Markovian and ergodic, our analysis elucidates the relations between the study of the first passage paths and the study of the transition paths in transition path theory. We provide algorithms to numerically compute statistics of the first passage path ensemble. The computational complexity of these algorithms scales with the complexity of solving a linear system, for which efficient methods are available. Several examples demonstrate the wide applicability of the derived results across research areas.
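    For a discrete-time Markov jump process, one of the basic first-passage statistics, the mean first passage time into a target set B, already illustrates the "cost of a linear solve" scaling mentioned above: it satisfies a linear system over the non-target states. A minimal sketch with a hypothetical transition matrix:

    ```python
    import numpy as np

    def mean_first_passage_time(P, target):
        """Mean first passage times E[T_B | X_0 = i] for a discrete-time Markov chain.

        Solves (I - P_AA) t = 1 on the complement A of the target set B,
        where P_AA is the transition matrix restricted to A.
        """
        n = P.shape[0]
        A = np.array([i for i in range(n) if i not in set(target)])
        P_AA = P[np.ix_(A, A)]
        t_A = np.linalg.solve(np.eye(len(A)) - P_AA, np.ones(len(A)))
        t = np.zeros(n)
        t[A] = t_A                     # states in B have first passage time 0
        return t

    # Hypothetical 4-state chain; target set B = {3}.
    P = np.array([
        [0.5, 0.3, 0.2, 0.0],
        [0.2, 0.5, 0.2, 0.1],
        [0.1, 0.2, 0.5, 0.2],
        [0.0, 0.1, 0.4, 0.5],
    ])
    print(mean_first_passage_time(P, target=[3]))
    ```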

  1. EFFICIENCY ANALYSIS OF HASHING METHODS FOR FILE SYSTEMS IN USER MODE

    Directory of Open Access Journals (Sweden)

    E. Y. Ivanov

    2013-05-01

    Full Text Available The article deals with the characteristics and performance of the interaction protocols between the virtual file system and the file system, and their influence on the performance of microkernel operating systems. A user-mode implementation of the ext2 file system for MINIX 3 OS is used to show that in microkernel operating systems file object identification time might increase by up to 26 times in comparison with monolithic systems. Therefore, we present an efficiency analysis of various hashing methods for file systems running in user mode. Studies have shown that using the hashing methods recommended in this paper it is possible to achieve competitive performance of the considered component of the I/O stack in microkernel and monolithic operating systems.

  2. MA-burners efficiency parameters allowing for the duration of transmutation process

    International Nuclear Information System (INIS)

    Gulevich, A.; Zemskov, E.; Kalugin, A.; Ponomarev, L.; Seliverstov, V.; Seregin, M.

    2010-01-01

    Transmutation of minor actinides (MA) means transforming them into fission products. Usually, an MA-burner's transmutation efficiency is characterized by static parameters only, such as the number of neutrons absorbed and the rate of MA feeding. However, the proper characterization of an MA-burner's efficiency additionally requires the consideration of parameters allowing for the duration of the MA transmutation process. Two parameters of that kind are proposed: a) transmutation time τ - the mean time period from the moment a mass of MA is loaded into the burner's fuel cycle to be transmuted to the moment this mass is completely transmuted; b) number of reprocessing cycles n rep - the effective number of reprocessing cycles a mass of loaded MA has to undergo before being completely transmuted. Several types of MA-burners have been analyzed from the point of view of these parameters. It turned out that for all of them the values of these parameters are too high from a practical point of view. It appears that some new approaches to MA-burner design have to be used to significantly reduce the values of these parameters in order to make the large-scale MA transmutation process practically reasonable. Some such approaches are proposed and their potential efficiency is discussed. (authors)

  3. Quality and efficiency successes leveraging IT and new processes.

    Science.gov (United States)

    Chaiken, Barry P; Christian, Charles E; Johnson, Liz

    2007-01-01

    Today, healthcare annually invests billions of dollars in information technology, including clinical systems, electronic medical records and interoperability platforms. While continued investment and parallel development of standards are critical to secure exponential benefits from clinical information technology, intelligent and creative redesign of processes through path innovation is necessary to deliver meaningful value. Reports from two organizations included in this report review the steps taken to reinvent clinical processes that best leverage information technology to deliver safer and more efficient care. Good Samaritan Hospital, Vincennes, Indiana, implemented electronic charting, point-of-care bar coding of medications prior to administration, and integrated clinical documentation for nursing, laboratory, radiology and pharmacy. Tenet Healthcare, during its implementation and deployment of multiple clinical systems across several hospitals, focused on planning that included team-based process redesign. In addition, Tenet constructed valuable and measurable metrics that link outcomes with its strategic goals.

  4. Efficient and stable solution-processed planar perovskite solar cells via contact passivation

    KAUST Repository

    Tan, Hairen; Jain, Ankit; Voznyy, Oleksandr; Lan, Xinzheng; García de Arquer, F. Pelayo; Fan, James Z.; Quintero-Bermudez, Rafael; Yuan, Mingjian; Zhang, Bo; Zhao, Yicheng; Fan, Fengjia; Li, Peicheng; Quan, Li Na; Zhao, Yongbiao; Lu, Zheng-Hong; Yang, Zhenyu; Hoogland, Sjoerd; Sargent, Edward H.

    2017-01-01

    Planar perovskite solar cells (PSCs) made entirely via solution processing at low temperatures (<150°C) offer promise for simple manufacturing, compatibility with flexible substrates, and perovskite-based tandem devices. However, these PSCs require an electron-selective layer that performs well with similar processing. We report a contact-passivation strategy using chlorine-capped TiO2 colloidal nanocrystal film that mitigates interfacial recombination and improves interface binding in low-temperature planar solar cells. We fabricated solar cells with certified efficiencies of 20.1 and 19.5% for active areas of 0.049 and 1.1 square centimeters, respectively, achieved via low-temperature solution processing. Solar cells with efficiency greater than 20% retained 90% (97% after dark recovery) of their initial performance after 500 hours of continuous room-temperature operation at their maximum power point under 1-sun illumination (where 1 sun is defined as the standard illumination at AM1.5, or 1 kilowatt/square meter).

  6. A critical analysis of energy efficiency improvement potentials in Taiwan's cement industry

    International Nuclear Information System (INIS)

    Huang, Yun-Hsun; Chang, Yi-Lin; Fleiter, Tobias

    2016-01-01

    The cement industry is the second most energy-intensive sector in Taiwan, which underlines the need to understand its potential for energy efficiency improvement. A bottom-up model-based assessment is used to conduct a scenario analysis of energy saving opportunities up to the year 2035. The analysis is supported by detailed expert interviews in all cement plants of Taiwan. The simulation results reveal that by 2035, eighteen energy-efficient technologies could result in 25% savings for electricity and 9% savings for fuels under the technical diffusion scenario. This potential amounts to a total of about 5000 TJ/year, of which 91% can be implemented cost-effectively assuming a discount rate of 10%. Policy makers should support a fast diffusion of these technologies. Additionally, policy makers can tap further saving potentials: first, by decreasing the clinker share, which is currently regulated to a minimum of 95%; second, by extending the prohibition on building new cement plants while allowing replacement of existing capacity with new, innovative plants in the coming years; and third, by supporting the use of alternative fuels, which is currently still a niche in Taiwan. - Highlights: •We analyze energy efficiency improvement potentials in Taiwan's cement industry. •Eighteen process-specific technologies are analyzed using a bottom-up model. •Our model systematically reflects the diffusion of technologies over time. •We find energy-saving potentials of 25% for electricity and 9% for fuels in 2035. •91% of the energy-saving potentials can be realized cost-effectively.
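
    As an illustration of the cost-effectiveness screening mentioned above (91% of the potential cost-effective at a 10% discount rate), the following minimal Python sketch computes a cost of conserved energy for a single measure and compares it with an assumed fuel price. All figures are hypothetical placeholders, not values from the study.

        # Hypothetical screening of one efficiency measure's cost-effectiveness,
        # in the spirit of the bottom-up assessment described above.
        def cost_of_conserved_energy(investment, annual_savings_gj, lifetime_yr, discount_rate=0.10):
            """Annualized cost per GJ saved: capital recovery factor * investment / annual savings."""
            crf = discount_rate / (1.0 - (1.0 + discount_rate) ** -lifetime_yr)
            return crf * investment / annual_savings_gj

        # Illustrative (made-up) measure: 1.2 M$ investment, 40,000 GJ/yr fuel savings, 15-year life.
        cce = cost_of_conserved_energy(investment=1.2e6, annual_savings_gj=4.0e4, lifetime_yr=15)
        fuel_price = 8.0  # $/GJ, assumed
        verdict = "cost-effective" if cce < fuel_price else "not cost-effective"
        print(f"CCE = {cce:.2f} $/GJ -> {verdict}")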

  7. An efficient visualization method for analyzing biometric data

    Science.gov (United States)

    Rahmes, Mark; McGonagle, Mike; Yates, J. Harlan; Henning, Ronda; Hackett, Jay

    2013-05-01

    We introduce a novel application for biometric data analysis. This technology can be used as part of a unique and systematic approach designed to augment existing processing chains. Our system provides image quality control and analysis capabilities. We show how analysis and efficient visualization are used as part of an automated process. The goal of this system is to provide a unified platform for the analysis of biometric images that reduces manual effort and increases the likelihood of a match being brought to an examiner's attention from either a manual or lights-out application. We discuss the functionality of FeatureSCOPE™, which provides an efficient tool for feature analysis and quality control of biometric extracted features. Biometric databases must be checked for accuracy for a large volume of data attributes. Our solution accelerates review of features by a factor of up to 100 times. Qualitative results and cost reduction are demonstrated by using efficient parallel visual review for quality control. Our process automatically sorts and filters features for examination, and packs these into a condensed view. An analyst can then rapidly page through screens of features and flag and annotate outliers as necessary.

  8. Conditional analysis of mixed Poisson processes with baseline counts: implications for trial design and analysis.

    Science.gov (United States)

    Cook, Richard J; Wei, Wei

    2003-07-01

    The design of clinical trials is typically based on marginal comparisons of a primary response under two or more treatments. The considerable gains in efficiency afforded by models conditional on one or more baseline responses have been extensively studied for Gaussian models. The purpose of this article is to present methods for the design and analysis of clinical trials in which the response is a count or a point process, and a corresponding baseline count is available prior to randomization. The methods are based on a conditional negative binomial model for the response given the baseline count and can be used to examine the effect of introducing selection criteria on power and sample size requirements. We show that designs based on this approach are more efficient than those proposed by McMahon et al. (1994).
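
    The following is a minimal Python sketch, on simulated data, of the kind of conditional model described above: a negative binomial regression of the follow-up count on treatment with the baseline count entering as a covariate (here via statsmodels GLM). It is not the authors' exact model; the dispersion value, covariate form, and data are assumptions for illustration only.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 400
        baseline = rng.poisson(5, size=n)          # pre-randomization count
        treat = rng.integers(0, 2, size=n)         # 0 = control, 1 = treatment
        # Simulated follow-up counts whose mean depends on baseline and treatment arm
        mu = np.exp(0.2 + 0.12 * baseline - 0.4 * treat)
        y = rng.negative_binomial(n=2.0, p=2.0 / (2.0 + mu))

        df = pd.DataFrame({"y": y, "treat": treat, "baseline": baseline})
        # Conditional model: follow-up count given baseline count and treatment
        model = smf.glm("y ~ treat + baseline", data=df,
                        family=sm.families.NegativeBinomial(alpha=0.5)).fit()
        print(model.summary())
        print("Estimated treatment rate ratio:", np.exp(model.params["treat"]))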

  9. Modified paraffin wax for improvement of histological analysis efficiency.

    Science.gov (United States)

    Lim, Jin Ik; Lim, Kook-Jin; Choi, Jin-Young; Lee, Yong-Keun

    2010-08-01

    Paraffin wax is usually used as an embedding medium for histological analysis of natural tissue. However, it is not easy to obtain a sufficient number of satisfactory sectioned slices because of the difference in mechanical properties between the paraffin and embedded tissue. We describe a modified paraffin wax that can improve the histological analysis efficiency of natural tissue, composed of paraffin and ethylene vinyl acetate (EVA) resin (0, 3, 5, and 10 wt %). Softening temperature of the paraffin/EVA media was similar to that of paraffin (50-60 degrees C). The paraffin/EVA media dissolved completely in xylene after 30 min at 50 degrees C. Physical properties such as the amount of load under the same compressive displacement, elastic recovery, and crystal intensity increased with increased EVA content. EVA medium (5 wt %) was regarded as an optimal composition, based on the sectioning efficiency measured by the number of unimpaired sectioned slices, amount of load under the same compressive displacement, and elastic recovery test. Based on the staining test of sectioned slices embedded in a 5 wt % EVA medium by hematoxylin and eosin (H&E), Masson trichrome (MT), and other staining tests, it was concluded that the modified paraffin wax can improve the histological analysis efficiency with various natural tissues. (c) 2010 Wiley-Liss, Inc.

  10. An analysis of mobile whole blood collection labor efficiency.

    Science.gov (United States)

    Rose, William N; Dayton, Paula J; Raife, Thomas J

    2011-07-01

    Labor efficiency is desirable in mobile blood collection. There are few published data on labor efficiency. The variability in the labor efficiency of mobile whole blood collections was analyzed. We sought to improve our labor efficiency using lean manufacturing principles. Workflow changes in mobile collections were implemented with the goal of minimizing labor expenditures. To measure success, data on labor efficiency measured by units/hour/full-time equivalent (FTE) were collected. The labor efficiency in a 6-month period before the implementation of changes, and in months 1 to 6 and 7 to 12 after implementation, was analyzed and compared. Labor efficiency in the 6-month period preceding implementation was 1.06 ± 0.4 units collected/hour/FTE. In months 1 to 6, labor efficiency declined slightly to 0.92 ± 0.4 units collected/hour/FTE (p = 0.016 vs. preimplementation). In months 7 to 12, the mean labor efficiency returned to preimplementation levels of 1.09 ± 0.4 units collected/hour/FTE. Regression analysis correlating labor efficiency with total units collected per drive revealed a strong correlation (R² = 0.48 for the aggregate data from all three periods), indicating that nearly half of the variation in labor efficiency was associated with drive size. The lean-based changes in workflow were subjectively favored by employees and donors. The labor efficiency of our mobile whole blood drives is strongly influenced by size. Larger drives are more efficient, with diminishing returns above 40 units collected. Lean-based workflow changes were positively received by employees and donors. © 2011 American Association of Blood Banks.
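
    A minimal sketch of the regression described above (labor efficiency in units/hour/FTE against drive size), using scipy on made-up data; the simulated slope and noise levels are assumptions, not the study's data.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        units_collected = rng.integers(10, 80, size=60)                       # drive size (made-up)
        efficiency = 0.5 + 0.012 * units_collected + rng.normal(0, 0.25, 60)  # units/hour/FTE

        res = stats.linregress(units_collected, efficiency)
        print(f"slope = {res.slope:.4f} units/hr/FTE per unit collected, R^2 = {res.rvalue**2:.2f}")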

  11. Analysis of Public Sector Efficiency in Developed Countries

    Directory of Open Access Journals (Sweden)

    Ivan Lovre

    2017-06-01

    Full Text Available The public sector in developed countries went through various forms of transformation in the twentieth century. The expansion of the public sector resulted in high levels of public spending in developed countries. The financial crisis of 2008 led to recessions in the economies of developed countries and to growing public debt, and brought the issue of the public sector's optimal size and efficiency to the fore. This study analysed public sector efficiency in 19 developed countries. The analysis focuses on the relationship between the size of public expenditure and economic growth during the global financial crisis and the measures implemented. The aim of the research in this paper is a comparison of total and partial efficiency of the public sector in developed countries, in order to determine the characteristics of public sector operations. The comparison covers the areas of public sector operations in order to identify sources of inefficiency. Partial and overall efficiency are analysed for countries with different public sector sizes and concepts, to determine the relationship between public sector size, efficiency and the welfare of citizens. The research results clearly indicate unjustified state intervention in developed countries.

  12. Efficient Analysis of Structures with Rotatable Elements Using Model Order Reduction

    Directory of Open Access Journals (Sweden)

    G. Fotyga

    2016-04-01

    Full Text Available This paper presents a novel full-wave technique which allows for a fast 3D finite element analysis of waveguide structures containing rotatable tuning elements of arbitrary shapes. Rotation of these elements changes the resonant frequencies of the structure, which can be used in the tuning process to obtain the S-characteristics desired for the device. For fast computation of the response as the tuning elements are rotated, the 3D finite element method is supported by multilevel model-order reduction, orthogonal projection at the boundaries of macromodels and an operation called macromodel cloning. All the time-consuming steps are performed only once in the preparatory stage. In the tuning stage, only small parts of the domain are updated, by means of a special meshing technique. In effect, the tuning process is performed extremely rapidly. The results of the numerical experiments confirm the efficiency and validity of the proposed method.

  13. Efficient analysis of mode profiles in elliptical microcavity using dynamic-thermal electron-quantum medium FDTD method.

    Science.gov (United States)

    Khoo, E H; Ahmed, I; Goh, R S M; Lee, K H; Hung, T G G; Li, E P

    2013-03-11

    The dynamic-thermal electron-quantum medium finite-difference time-domain (DTEQM-FDTD) method is used for efficient analysis of mode profiles in an elliptical microcavity. The resonance peak of the elliptical microcavity is studied by varying the length ratio. It is observed that at some length ratios, a cavity mode is excited instead of a whispering gallery mode. This indicates that the mode profiles are length-ratio dependent. Through the implementation of the DTEQM-FDTD on a graphics processing unit (GPU), the simulation time is reduced by 300 times as compared to the CPU. This leads to an efficient optimization approach to design microcavity lasers for a wide range of applications in photonic integrated circuits.

  14. Pure sources and efficient detectors for optical quantum information processing

    Science.gov (United States)

    Zielnicki, Kevin

    Over the last sixty years, classical information theory has revolutionized the understanding of the nature of information, and how it can be quantified and manipulated. Quantum information processing extends these lessons to quantum systems, where the properties of intrinsic uncertainty and entanglement fundamentally defy classical explanation. This growing field has many potential applications, including computing, cryptography, communication, and metrology. As inherently mobile quantum particles, photons are likely to play an important role in any mature large-scale quantum information processing system. However, the available methods for producing and detecting complex multi-photon states place practical limits on the feasibility of sophisticated optical quantum information processing experiments. In a typical quantum information protocol, a source first produces an interesting or useful quantum state (or set of states), perhaps involving superposition or entanglement. Then, some manipulations are performed on this state, perhaps involving quantum logic gates which further manipulate or entangle the initial state. Finally, the state must be detected, obtaining some desired measurement result, e.g., for secure communication or computationally efficient factoring. The work presented here concerns the first and last stages of this process as they relate to photons: sources and detectors. Our work on sources is based on the need for optimized non-classical states of light delivered at high rates, particularly of single photons in a pure quantum state. We seek to better understand the properties of spontaneous parametric downconversion (SPDC) sources of photon pairs, and in doing so, produce such an optimized source. We report an SPDC source which produces pure heralded single photons with little or no spectral filtering, allowing a significant rate enhancement. Our work on detectors is based on the need to reliably measure single-photon states. We have focused on

  15. Microbial electrolytic disinfection process for highly efficient Escherichia coli inactivation

    DEFF Research Database (Denmark)

    Zhou, Shaofeng; Huang, Shaobin; Li, Xiaohu

    2018-01-01

    Water quality deterioration caused by a wide variety of recalcitrant organics and pathogenic microorganisms has become a serious concern worldwide. Bio-electro-Fenton systems have been considered as a cost-effective and highly efficient water treatment platform technology. While they have been extensively studied for recalcitrant organics removal, their application potential towards water disinfection (e.g., inactivation of pathogens) is still unknown. This study investigated the inactivation of Escherichia coli in a microbial electrolysis cell based bio-electro-Fenton system (renamed as a microbial electrolytic disinfection process). Hydroxyl radical (•OH) generation was identified as one potential mechanism for disinfection. This study successfully demonstrated the feasibility of the bio-electro-Fenton process for pathogen inactivation, which offers insight for the future development of sustainable, efficient, and cost-effective biological water treatment technology.

  16. Solution-processed core-shell nanowires for efficient photovoltaic cells.

    Science.gov (United States)

    Tang, Jinyao; Huo, Ziyang; Brittman, Sarah; Gao, Hanwei; Yang, Peidong

    2011-08-21

    Semiconductor nanowires are promising for photovoltaic applications, but, so far, nanowire-based solar cells have had lower efficiencies than planar cells made from the same materials, even allowing for the generally lower light absorption of nanowires. It is not clear, therefore, if the benefits of the nanowire structure, including better charge collection and transport and the possibility of enhanced absorption through light trapping, can outweigh the reductions in performance caused by recombination at the surface of the nanowires and at p-n junctions. Here, we fabricate core-shell nanowire solar cells with open-circuit voltage and fill factor values superior to those reported for equivalent planar cells, and an energy conversion efficiency of ∼5.4%, which is comparable to that of equivalent planar cells despite low light absorption levels. The device is made using a low-temperature solution-based cation exchange reaction that creates a heteroepitaxial junction between a single-crystalline CdS core and single-crystalline Cu2S shell. We integrate multiple cells on single nanowires in both series and parallel configurations for high output voltages and currents, respectively. The ability to produce efficient nanowire-based solar cells with a solution-based process and Earth-abundant elements could significantly reduce fabrication costs relative to existing high-temperature bulk material approaches.

  17. Performance analysis of solar energy integrated with natural-gas-to-methanol process

    International Nuclear Information System (INIS)

    Yang, Sheng; Liu, Zhiqiang; Tang, Zhiyong; Wang, Yifan; Chen, Qianqian; Sun, Yuhan

    2017-01-01

    Highlights: • Solar energy integrated with the natural-gas-to-methanol process is proposed. • The two processes are modeled and simulated. • Performance analysis of the two processes is conducted. • The proposed process can cut down greenhouse gas emissions. • The proposed process can save natural gas consumption. - Abstract: Methanol is an important platform chemical. Methanol production using natural gas as a raw material has a short processing route and well-developed equipment and technology. However, natural gas reserves are not large in China. A solar energy power generation system integrated with the natural-gas-to-methanol (NGTM) process is developed, which may provide a technical route for methanol production in the future. The solar energy power generation produces electricity for the reforming unit and for system consumption in the solar-energy-integrated natural-gas-to-methanol system (SGTM). Performance analyses of the conventional natural-gas-to-methanol process and the solar-energy-integrated natural-gas-to-methanol process are presented based on simulation results, considering carbon efficiency, production cost, solar energy price, natural gas price, and carbon tax. Results indicate that the solar-energy-integrated natural-gas-to-methanol process is able to cut down greenhouse gas (GHG) emissions. In addition, solar energy can replace natural gas as fuel, reducing natural gas consumption by an amount equal to 9.2% of the total natural gas consumed. However, it is not economical at the current technology readiness level compared with the conventional natural-gas-to-methanol process.

  18. Evaluating the efficiency of a zakat institution over a period of time using data envelopment analysis

    Science.gov (United States)

    Krishnan, Anath Rau; Hamzah, Ahmad Aizuddin

    2017-08-01

    It is crucial for a zakat institution to evaluate and understand how efficiently it has operated in the past, so that ideal strategies can be developed for future improvement. However, evaluating the efficiency of a zakat institution is a challenging process, as it involves multiple inputs and/or outputs. This paper proposes a step-by-step procedure comprising two data envelopment analysis models, namely the dual Charnes-Cooper-Rhodes model and the slack-based model, to quantitatively measure the overall efficiency of a zakat institution over a period of time. The applicability of the proposed procedure was demonstrated by evaluating the efficiency of Pusat Zakat Sabah, Malaysia from 2007 up to 2015, treating each year as a decision making unit. Two inputs (i.e. number of staff and number of branches) and two outputs (i.e. total collection and total distribution) were used to measure the overall efficiency achieved each year. The causes of inefficiency and strategies for future improvement were discussed based on the results.
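
    The following Python sketch illustrates the dual (envelopment) form of the input-oriented Charnes-Cooper-Rhodes model referred to above, solved with scipy's linear programming routine and treating each DMU (year) as having two inputs and two outputs. The data matrix is a placeholder, not Pusat Zakat Sabah's figures, and the slack-based second stage is omitted.

        import numpy as np
        from scipy.optimize import linprog

        # Placeholder data: rows = DMUs (years), columns = inputs / outputs.
        X = np.array([[30, 5], [32, 5], [35, 6], [36, 6], [40, 7]], dtype=float)   # staff, branches
        Y = np.array([[1.0, 0.9], [1.3, 1.1], [1.5, 1.4], [1.9, 1.7], [2.1, 2.0]]) # collection, distribution

        def ccr_input_efficiency(X, Y, o):
            """Input-oriented CCR efficiency of DMU o via the dual (envelopment) LP."""
            n, m = X.shape          # n DMUs, m inputs
            s = Y.shape[1]          # s outputs
            c = np.r_[1.0, np.zeros(n)]           # minimize theta; variables = [theta, lambdas]
            A_in = np.c_[-X[o], X.T]              # sum_j lam_j * x_ij <= theta * x_io
            A_out = np.c_[np.zeros(s), -Y.T]      # sum_j lam_j * y_rj >= y_ro
            A_ub = np.vstack([A_in, A_out])
            b_ub = np.r_[np.zeros(m), -Y[o]]
            bounds = [(0, None)] * (n + 1)
            res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
            return res.x[0]

        for o in range(len(X)):
            print(f"DMU {o}: CCR efficiency = {ccr_input_efficiency(X, Y, o):.3f}")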

  19. Prioritization of the Factors Affecting Bank Efficiency Using Combined Data Envelopment Analysis and Analytical Hierarchy Process Methods

    Directory of Open Access Journals (Sweden)

    Mehdi Fallah Jelodar

    2016-01-01

    Full Text Available Bank branches have a vital role in the economy of all countries. They collect assets from various sources and put them in the hands of those sectors that need liquidity. Given limited financial and human resources and capital, ever-changing customer needs, and strong competition between banks and financial and credit institutions, the purpose of this study is to determine which of the factors affecting performance, value creation, and shareholder dividends are more important than others, so that managers can pay more attention to them. Therefore, in this study, the factors affecting performance (efficiency) in the areas of management, personnel, finance, and customers were segmented, and the results were ranked using both Data Envelopment Analysis and the Analytical Hierarchy Process. In both methods, the leadership style in the area of management; recruitment and resource allocation in the area of finance; employees' satisfaction, dignity, and self-actualization in the area of personnel; and meeting the new needs of customers received the highest weights.
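
    As a sketch of the Analytical Hierarchy Process side of such a combined method, the snippet below derives priority weights from a pairwise comparison matrix via the principal eigenvector and checks the consistency ratio. The comparison matrix is hypothetical, not the study's survey data.

        import numpy as np

        # Hypothetical pairwise comparison of four factor areas
        # (management, finance, personnel, customers) on Saaty's 1-9 scale.
        A = np.array([[1,   3,   2,   2],
                      [1/3, 1,   1/2, 1],
                      [1/2, 2,   1,   1],
                      [1/2, 1,   1,   1]], dtype=float)

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                                  # priority weights

        n = A.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)          # consistency index
        ri = 0.90                                     # Saaty's random index for n = 4
        print("weights:", np.round(w, 3), "CR =", round(ci / ri, 3))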

  20. Efficiency analysis on a two-level three-phase quasi-soft-switching inverter

    DEFF Research Database (Denmark)

    Geng, Pan; Wu, Weimin; Huang, Min

    2013-01-01

    When designing an inverter, an engineer often needs to select and predict the efficiency beforehand. For standard inverters, many studies analyze the power losses and many software tools are available for efficiency calculation. In this paper, the efficiency calculation for non-conventional inverters with a special shoot-through state is introduced and illustrated through the analysis of a special two-level three-phase quasi-soft-switching inverter. An efficiency comparison between the classical two-stage two-level three-phase inverter and the two-level three-phase quasi-soft-switching inverter is carried out. A 10 kW/380 V prototype is constructed to verify the analysis. The experimental results show that the efficiency of the new inverter is higher than that of the traditional two-stage two-level three-phase inverter.

  1. Sulfide ore looping oxidation : an innovative process that is energy efficient and environmentally friendly

    Energy Technology Data Exchange (ETDEWEB)

    McHugh, L.F.; Balliett, R.; Mozolic, J.A. [Orchard Material Technology, North Andover, MA (United States)

    2008-07-01

    Many sulphide ore processing methods use different types of roasting technologies. These technologies are generally quite effective, however, they represent significant energy use and environmental cost. This paper discussed and validated the use of a two-step looping oxidation process that effectively removes sulphur while producing materials of adequate purity in an energy efficient and environmentally sound manner. This paper described the process in detail and compared it to existing technologies in the area of energy efficiency, and off-gas treatment energy requirements. Validation of the looping oxidation concept was described and the starting chemistries of each chemical were listed. Thermodynamic modeling was used to determine the temperature at which the reaction should begin and to predict the temperature at which the reaction should be complete. The test apparatus and run conditions were also described. It was concluded that there are several critical stages in the looping process where energy recovery is economically attractive and could easily be directed or converted for other plant operations. All reactions were fast and efficient, allowing for reduced equipment size as well as higher throughput rates. 11 refs., 3 tabs., 2 figs.

  2. MULTIPLE LINEAR REGRESSION ANALYSIS FOR PREDICTION OF BOILER LOSSES AND BOILER EFFICIENCY

    OpenAIRE

    Chayalakshmi C.L

    2018-01-01

    Calculation of boiler efficiency is essential if its parameters need to be controlled for either maintaining or enhancing its efficiency. But determination of boiler efficiency using the conventional method is time consuming and very expensive. Hence, it is not recommended to find boiler efficiency frequently. The work presented in this paper deals with establishing the statistical mo...
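
    A minimal sketch of a multiple linear regression for boiler efficiency, fitted with statsmodels on synthetic data; the predictor names (flue gas temperature, excess air, fuel moisture) and the coefficients generating the data are assumptions for illustration only, not the paper's statistical model.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(2)
        n = 200
        df = pd.DataFrame({
            "flue_gas_temp": rng.normal(180, 15, n),   # deg C (assumed predictor)
            "excess_air": rng.normal(20, 5, n),        # %     (assumed predictor)
            "fuel_moisture": rng.normal(10, 3, n),     # %     (assumed predictor)
        })
        # Synthetic "measured" efficiency with noise, only for illustration
        df["efficiency"] = (92 - 0.03 * df.flue_gas_temp - 0.08 * df.excess_air
                            - 0.15 * df.fuel_moisture + rng.normal(0, 0.5, n))

        model = smf.ols("efficiency ~ flue_gas_temp + excess_air + fuel_moisture", data=df).fit()
        print(model.summary())
        new = pd.DataFrame({"flue_gas_temp": [170], "excess_air": [18], "fuel_moisture": [9]})
        print("Predicted efficiency:", model.predict(new).iloc[0])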

  3. Efficiency Analysis of Financial Management Administration of ABC Hospital using Financial Ratio Analysis Method

    Directory of Open Access Journals (Sweden)

    Jonny Jonny

    2016-05-01

    Full Text Available This paper evaluated the financial performance of ABC hospital for the period 2012 to 2013. To address the hospital's problems in measuring and presenting its financial performance, financial ratio analysis was undertaken. These financial ratios were employed to measure the liquidity, asset utilization, long-term solvency and profitability of the hospital. The analysis was conducted in order to establish whether the hospital has been managed efficiently in accordance with the Indonesian Hospital Quality Accreditation, which states in Administration Standard No. 5, Parameter No. 3, that hospital financial management shall be conducted appropriately in order to guarantee efficient operation. The result showed that the overall financial performance of ABC hospital increased considerably over the two years of the analysis. A significant change occurred in its solvency ratio, which decreased from -2% to -8%, indicating loose dependency due to its founder's strong financial support. Therefore, based on this favorable result, the hospital was regarded as having efficient hospital management and thus, together with other standard fulfillment, it was accredited by the Indonesian Health Ministry.
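
    A minimal sketch of the four ratio families mentioned above (liquidity, asset utilization, long-term solvency, profitability), computed from made-up figures rather than ABC hospital's statements.

        # Illustrative ratio calculations on made-up figures (not ABC hospital's data).
        financials = {
            "current_assets": 12_000, "current_liabilities": 7_000,
            "total_assets": 50_000, "total_liabilities": 20_000,
            "revenue": 40_000, "net_income": 3_000,
        }

        ratios = {
            "current_ratio (liquidity)": financials["current_assets"] / financials["current_liabilities"],
            "asset_turnover (utilization)": financials["revenue"] / financials["total_assets"],
            "debt_to_assets (solvency)": financials["total_liabilities"] / financials["total_assets"],
            "return_on_assets (profitability)": financials["net_income"] / financials["total_assets"],
        }
        for name, value in ratios.items():
            print(f"{name}: {value:.2f}")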

  4. Modelling extrudate expansion in a twin-screw food extrusion cooking process through dimensional analysis methodology

    DEFF Research Database (Denmark)

    Cheng, Hongyuan; Friis, Alan

    2010-01-01

    A new phenomenological model is proposed to correlate extrudate expansion and extruder operation parameters in a twin-screw food extrusion cooking process. Buckingham's pi dimensional analysis method is applied to establish the model. Three dimensionless groups, i.e. pump efficiency, water content...
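
    A sketch of the mechanics behind Buckingham's pi method: build a dimensional matrix for a small set of assumed extrusion variables and take its null space to obtain candidate dimensionless groups. The variable set is illustrative and is not the authors' model.

        import numpy as np
        from scipy.linalg import null_space

        # Columns: pressure drop dP, melt density rho, screw speed N, screw diameter D, viscosity mu.
        # Rows: exponents of the base dimensions M, L, T.
        variables = ["dP", "rho", "N", "D", "mu"]
        dim_matrix = np.array([
            [ 1,  1,  0, 0,  1],   # mass M
            [-1, -3,  0, 1, -1],   # length L
            [-2,  0, -1, 0, -1],   # time T
        ], dtype=float)

        # Each null-space vector gives the exponents of one dimensionless (pi) group.
        for vec in null_space(dim_matrix).T:
            vec = vec / np.max(np.abs(vec))          # scale for readability
            group = " * ".join(f"{v}^{e:.2f}" for v, e in zip(variables, vec) if abs(e) > 1e-9)
            print("pi group:", group)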

  5. The efficiency of life insurance and family Takaful in Malaysia: Relative efficiency using the stochastic cost frontier analysis

    Science.gov (United States)

    Baharin, Roziana; Isa, Zaidi

    2013-04-01

    This paper focuses on the Stochastic Cost Frontier Analysis (SFA) approach in an attempt to measure the relationship between efficiency and organizational structure for Takaful and insurance operators in Malaysia's dual financial system. The study applied a flexible cost functional form, i.e. the Fourier flexible functional form, to a sample of 19 firms observed between 2002 and 2010, employing the Battese and Coelli time-invariant efficiency model. The findings show that, on average, there is a significant difference in cost efficiency between the Takaful industry and the insurance industry. It was found that Takaful has lower cost efficiency than conventional insurance, which shows that organizational form has an influence on efficiency. Overall, it was observed that the level of efficiency scores for both life insurance and family Takaful does not vary across time.

  6. Commercial Discount Rate Estimation for Efficiency Standards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fujita, K. Sydny [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-04-13

    Underlying each of the Department of Energy's (DOE's) federal appliance and equipment standards are a set of complex analyses of the projected costs and benefits of regulation. Any new or amended standard must be designed to achieve significant additional energy conservation, provided that it is technologically feasible and economically justified (42 U.S.C. 6295(o)(2)(A)). A proposed standard is considered economically justified when its benefits exceed its burdens, as represented by the projected net present value of costs and benefits. DOE performs multiple analyses to evaluate the balance of costs and benefits of commercial appliance and equipment efficiency standards, at the national and individual building or business level, each framed to capture different nuances of the complex impact of standards on the commercial end user population. The Life-Cycle Cost (LCC) analysis models the combined impact of appliance first cost and operating cost changes on a representative commercial building sample in order to identify the fraction of customers achieving LCC savings or incurring net cost at the considered efficiency levels. Thus, the choice of commercial discount rate value(s) used to calculate the present value of energy cost savings within the Life-Cycle Cost model implicitly plays a key role in estimating the economic impact of potential standard levels. This report is intended to provide a more in-depth discussion of the commercial discount rate estimation process than can be readily included in standard rulemaking Technical Support Documents (TSDs).
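
    A minimal sketch of how a commercial discount rate enters an LCC comparison: the present value of operating (energy) costs is added to the first cost for a baseline and a higher-efficiency unit. The costs, lifetime, and 6% discount rate are hypothetical, not DOE rulemaking values.

        def life_cycle_cost(first_cost, annual_energy_cost, discount_rate, lifetime_yr):
            """First cost plus discounted operating (energy) costs over the equipment lifetime."""
            pv_factor = sum(1.0 / (1.0 + discount_rate) ** t for t in range(1, lifetime_yr + 1))
            return first_cost + annual_energy_cost * pv_factor

        # Hypothetical baseline vs. higher-efficiency unit, 15-year life, 6% commercial discount rate.
        lcc_base = life_cycle_cost(first_cost=10_000, annual_energy_cost=2_000,
                                   discount_rate=0.06, lifetime_yr=15)
        lcc_eff = life_cycle_cost(first_cost=11_500, annual_energy_cost=1_600,
                                  discount_rate=0.06, lifetime_yr=15)
        print(f"LCC savings from the efficient unit: {lcc_base - lcc_eff:,.0f} $")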

  7. Anaerobic digestion of post-hydrothermal liquefaction wastewater for improved energy efficiency of hydrothermal bioenergy processes.

    Science.gov (United States)

    Zhou, Yan; Schideman, Lance; Zheng, Mingxia; Martin-Ryals, Ana; Li, Peng; Tommaso, Giovana; Zhang, Yuanhui

    2015-01-01

    Hydrothermal liquefaction (HTL) is a promising process for converting wet biomass and organic wastes into bio-crude oil. It also produces an aqueous product referred to as post-hydrothermal liquefaction wastewater (PHWW) containing up to 40% of the original feedstock carbon, which reduces the overall energy efficiency of the HTL process. This study investigated the feasibility of using anaerobic digestion (AD) to treat PHWW, with the aid of activated carbon. Results showed that successful AD occurred at relatively low concentrations of PHWW (≤ 6.7%), producing a biogas yield of 0.5 ml/mg COD removed, and ∼53% energy recovery efficiency. Higher concentrations of PHWW (≥13.3%) had an inhibitory effect on the AD process, as indicated by delayed, slower, or no biogas production. Activated carbon was shown to effectively mitigate this inhibitory effect by enhancing biogas production and allowing digestion to proceed at higher PHWW concentrations (up to 33.3%), likely due to sequestering toxic organic compounds. The addition of activated carbon also increased the net energy recovery efficiency of AD with a relatively high concentration of PHWW (33.3%), taking into account the energy for producing activated carbon. These results suggest that AD is a feasible approach to treat PHWW, and to improve the energy efficiency of the HTL processes.

  8. Exergy analysis of the biogas sorption-enhanced chemical looping reforming process integrated with a high-temperature proton exchange membrane fuel cell

    International Nuclear Information System (INIS)

    Kasemanand, Sarunyou; Im-orb, Karittha; Tippawan, Phanicha; Wiyaratn, Wisitsree; Arpornwichanop, Amornchai

    2017-01-01

    Highlights: • A biogas reforming and fuel cell integrated process is considered. • Energy and exergy analyses of the integrated process are performed. • Increasing the nickel oxide-to-biogas ratio decreases the exergy efficiency. • The exergy destruction of the fuel cell increases with increasing cell temperature. • The exergy efficiency of the process is improved when heat integration is applied. - Abstract: A biogas sorption-enhanced chemical looping reforming process integrated with a high-temperature proton exchange membrane fuel cell is analyzed. Modeling of such an integrated process is performed by using a flowsheet simulator (Aspen Plus). The exergy analysis is performed to evaluate the energy utilization efficiency of each unit and that of the integrated process. The effect of steam and nickel oxide to biogas ratios on the exergetic performance of the stand-alone biogas sorption-enhanced chemical looping reforming process is investigated. The total exergy destruction increases as the steam or nickel oxide to biogas ratio increases. The main exergy destruction is found at the air reactor. For the high-temperature proton exchange membrane fuel cell, the main exergy destruction is found at the cathode. The total exergy destruction increases when cell temperature increases, whereas the inverse effect is found when the current density is considered as a key parameter. Regarding the exergy efficiency, the results show the opposite trend to the exergy destruction. The heat integration analysis is performed to improve the exergetic performance. It is found that the integrated process including the heat integration system can improve the exergy destruction and exergy efficiency by 48% and 60%, respectively.
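
    A minimal sketch of the stream-level bookkeeping behind such an exergy analysis: physical (flow) exergy from enthalpy and entropy relative to a dead state, followed by a unit's exergy destruction and a simple exergy efficiency. The stream properties and flow rate are placeholders, and the paper's full flowsheet model is not reproduced.

        # Physical (flow) exergy per unit mass: ex = (h - h0) - T0 * (s - s0)
        T0 = 298.15  # K, dead-state temperature

        def flow_exergy(h, s, h0, s0, T0=T0):
            return (h - h0) - T0 * (s - s0)

        # Hypothetical inlet/outlet states of a single unit (kJ/kg and kJ/kg-K).
        ex_in = flow_exergy(h=3200.0, s=6.8, h0=104.9, s0=0.367)
        ex_out = flow_exergy(h=2800.0, s=7.0, h0=104.9, s0=0.367)

        m_dot = 1.5                                   # kg/s, assumed flow rate
        destruction = m_dot * (ex_in - ex_out)        # kW, exergy destroyed in the unit
        efficiency = ex_out / ex_in                   # simple exergy efficiency of the unit
        print(f"Exergy destruction: {destruction:.1f} kW, exergy efficiency: {efficiency:.2%}")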

  9. Proposal of Optimization of the Process of Credit Analysis to the Self-Employment Segment

    Directory of Open Access Journals (Sweden)

    Armando Pereira-López

    2017-06-01

    Full Text Available As part of the banking system transformation process in Cuba, the granting of credit to persons authorized to engage in self-employment and other forms of non-state management has been provided for since 2011. These new types of credit must be subject to strict risk analysis rules by the banking institution, which has made the process slow and complicated. In view of this situation, an analysis of the credit granted to this segment is carried out at Branch 8312 of the Popular Saving Bank, determining the main limitations and making proposals to improve the process through the process mapping tool proposed by the Microsave methodology, which reduces the response time to clients' requests and contributes to the provision of more efficient services.

  10. COMPARATIVE ANALYSIS OF ECONOMIC AND SOCIAL EFFICIENCY. CUSTOMIZE ON HEALTHCARE SECTOR

    OpenAIRE

    CLAUDIU CICEA

    2011-01-01

    Efficiency in the health service sector is very important because the health sector is a major consumer of resources (especially financial resources). In this paper, the author aims to analyze efficiency in a social sector (the healthcare system) based on cost-benefit analysis.

  11. Gut Transcriptome Analysis Shows Different Food Utilization Efficiency by the Grasshopper Oedaleous asiaticus (Orthoptera: Acrididae).

    Science.gov (United States)

    Huang, Xunbing; McNeill, Mark Richard; Ma, Jingchuan; Qin, Xinghu; Tu, Xiongbing; Cao, Guangchun; Wang, Guangjun; Nong, Xiangqun; Zhang, Zehua

    2017-08-01

    Oedaleus asiaticus B. Bienko is a persistent pest occurring in north Asian grasslands. We found that O. asiaticus feeding on Stipa krylovii Roshev. had higher approximate digestibility (AD), efficiency of conversion of ingested food (ECI), and efficiency of conversion of digested food (ECD), compared with cohorts feeding on Leymus chinensis (Trin.) Tzvel, Artemisia frigida Willd., or Cleistogenes squarrosa (Trin.) Keng. Although this indicated high food utilization efficiency for S. krylovii, the physiological processes and molecular mechanisms underlying these biological observations are not well understood. Transcriptome analysis was used to examine how gene expression levels in the O. asiaticus gut are altered by feeding on the four plant species. Nymphs (fifth-instar female) that fed on S. krylovii had the largest variation in gene expression profiles, with a total of 88 genes significantly upregulated compared with those feeding on the other three plants, mainly including digestive genes for protein, carbohydrate, and lipid digestion. GO and KEGG enrichment also showed that feeding on S. krylovii upregulated nutrition-digestion-related molecular functions, biological processes, and pathways. These changes in transcript levels indicate that the physiological processes of activating nutrition digestive enzymes and metabolism pathways can well explain the high food utilization of S. krylovii by O. asiaticus. © The Authors 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
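
    For reference, the food-utilization indices quoted above (AD, ECI, ECD) follow the standard Waldbauer-style definitions, sketched below on hypothetical dry-mass figures from a feeding trial.

        # Waldbauer-style nutritional indices from a hypothetical feeding trial (dry masses, mg).
        ingested = 120.0     # food ingested (I)
        feces = 40.0         # feces produced (F)
        biomass_gain = 18.0  # insect biomass gained (B)

        AD = (ingested - feces) / ingested * 100          # approximate digestibility
        ECI = biomass_gain / ingested * 100               # efficiency of conversion of ingested food
        ECD = biomass_gain / (ingested - feces) * 100     # efficiency of conversion of digested food
        print(f"AD = {AD:.1f}%, ECI = {ECI:.1f}%, ECD = {ECD:.1f}%")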

  12. Energy efficiency analysis: biomass-to-wheel efficiency related with biofuels production, fuel distribution, and powertrain systems.

    Directory of Open Access Journals (Sweden)

    Wei-Dong Huang

    Full Text Available BACKGROUND: Energy efficiency analysis for different biomass-utilization scenarios would help make more informed decisions for developing future biomass-based transportation systems. Diverse biofuels produced from biomass include cellulosic ethanol, butanol, fatty acid ethyl esters, methane, hydrogen, methanol, dimethyether, Fischer-Tropsch diesel, and bioelectricity; the respective powertrain systems include internal combustion engine (ICE) vehicles, hybrid electric vehicles based on gasoline or diesel ICEs, hydrogen fuel cell vehicles, sugar fuel cell vehicles (SFCV), and battery electric vehicles (BEV). METHODOLOGY/PRINCIPAL FINDINGS: We conducted a simple, straightforward, and transparent biomass-to-wheel (BTW) analysis including three separate conversion elements--biomass-to-fuel conversion, fuel transport and distribution, and respective powertrain systems. BTW efficiency is a ratio of the kinetic energy of an automobile's wheels to the chemical energy of delivered biomass just before entering biorefineries. Up to 13 scenarios were analyzed and compared to a base line case--corn ethanol/ICE. This analysis suggests that BEV, whose electricity is generated from stationary fuel cells, and SFCV, based on a hydrogen fuel cell vehicle with an on-board sugar-to-hydrogen bioreformer, would have the highest BTW efficiencies, nearly four times that of ethanol-ICE. SIGNIFICANCE: In the long term, a small fraction of the annual US biomass (e.g., 7.1%, or 700 million tons of biomass) would be sufficient to meet 100% of light-duty passenger vehicle fuel needs (i.e., 150 billion gallons of gasoline/ethanol per year), through up to four-fold enhanced BTW efficiencies by using SFCV or BEV. SFCV would have several advantages over BEV: much higher energy storage densities, faster refilling rates, better safety, and less environmental burdens.

  13. Energy Efficiency Analysis: Biomass-to-Wheel Efficiency Related with Biofuels Production, Fuel Distribution, and Powertrain Systems

    Science.gov (United States)

    Huang, Wei-Dong; Zhang, Y-H Percival

    2011-01-01

    Background Energy efficiency analysis for different biomass-utilization scenarios would help make more informed decisions for developing future biomass-based transportation systems. Diverse biofuels produced from biomass include cellulosic ethanol, butanol, fatty acid ethyl esters, methane, hydrogen, methanol, dimethyether, Fischer-Tropsch diesel, and bioelectricity; the respective powertrain systems include internal combustion engine (ICE) vehicles, hybrid electric vehicles based on gasoline or diesel ICEs, hydrogen fuel cell vehicles, sugar fuel cell vehicles (SFCV), and battery electric vehicles (BEV). Methodology/Principal Findings We conducted a simple, straightforward, and transparent biomass-to-wheel (BTW) analysis including three separate conversion elements -- biomass-to-fuel conversion, fuel transport and distribution, and respective powertrain systems. BTW efficiency is a ratio of the kinetic energy of an automobile's wheels to the chemical energy of delivered biomass just before entering biorefineries. Up to 13 scenarios were analyzed and compared to a base line case – corn ethanol/ICE. This analysis suggests that BEV, whose electricity is generated from stationary fuel cells, and SFCV, based on a hydrogen fuel cell vehicle with an on-board sugar-to-hydrogen bioreformer, would have the highest BTW efficiencies, nearly four times that of ethanol-ICE. Significance In the long term, a small fraction of the annual US biomass (e.g., 7.1%, or 700 million tons of biomass) would be sufficient to meet 100% of light-duty passenger vehicle fuel needs (i.e., 150 billion gallons of gasoline/ethanol per year), through up to four-fold enhanced BTW efficiencies by using SFCV or BEV. SFCV would have several advantages over BEV: much higher energy storage densities, faster refilling rates, better safety, and less environmental burdens. PMID:21765941

  14. Energy efficiency analysis: biomass-to-wheel efficiency related with biofuels production, fuel distribution, and powertrain systems.

    Science.gov (United States)

    Huang, Wei-Dong; Zhang, Y-H Percival

    2011-01-01

    Energy efficiency analysis for different biomass-utilization scenarios would help make more informed decisions for developing future biomass-based transportation systems. Diverse biofuels produced from biomass include cellulosic ethanol, butanol, fatty acid ethyl esters, methane, hydrogen, methanol, dimethyether, Fischer-Tropsch diesel, and bioelectricity; the respective powertrain systems include internal combustion engine (ICE) vehicles, hybrid electric vehicles based on gasoline or diesel ICEs, hydrogen fuel cell vehicles, sugar fuel cell vehicles (SFCV), and battery electric vehicles (BEV). We conducted a simple, straightforward, and transparent biomass-to-wheel (BTW) analysis including three separate conversion elements--biomass-to-fuel conversion, fuel transport and distribution, and respective powertrain systems. BTW efficiency is a ratio of the kinetic energy of an automobile's wheels to the chemical energy of delivered biomass just before entering biorefineries. Up to 13 scenarios were analyzed and compared to a base line case--corn ethanol/ICE. This analysis suggests that BEV, whose electricity is generated from stationary fuel cells, and SFCV, based on a hydrogen fuel cell vehicle with an on-board sugar-to-hydrogen bioreformer, would have the highest BTW efficiencies, nearly four times that of ethanol-ICE. In the long term, a small fraction of the annual US biomass (e.g., 7.1%, or 700 million tons of biomass) would be sufficient to meet 100% of light-duty passenger vehicle fuel needs (i.e., 150 billion gallons of gasoline/ethanol per year), through up to four-fold enhanced BTW efficiencies by using SFCV or BEV. SFCV would have several advantages over BEV: much higher energy storage densities, faster refilling rates, better safety, and less environmental burdens.
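
    A sketch of the chain-efficiency arithmetic behind a biomass-to-wheel comparison: the BTW efficiency of each scenario is the product of its biomass-to-fuel, distribution, and powertrain efficiencies, and scenarios are reported relative to the first (baseline) entry. The stage values below are illustrative placeholders, not the paper's figures.

        # Biomass-to-wheel efficiency as a product of stage efficiencies (illustrative values only).
        scenarios = {
            "corn ethanol / ICE":         {"biomass_to_fuel": 0.45, "distribution": 0.98, "powertrain": 0.20},
            "bioelectricity / BEV":       {"biomass_to_fuel": 0.40, "distribution": 0.92, "powertrain": 0.75},
            "sugar / SFCV (on-board H2)": {"biomass_to_fuel": 0.85, "distribution": 0.98, "powertrain": 0.45},
        }

        baseline = None
        for name, stages in scenarios.items():
            btw = stages["biomass_to_fuel"] * stages["distribution"] * stages["powertrain"]
            baseline = baseline or btw                    # first entry used as the reference case
            print(f"{name}: BTW = {btw:.2%} ({btw / baseline:.1f}x baseline)")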

  15. Process configuration of Liquid-nitrogen Energy Storage System (LESS) for maximum turnaround efficiency

    Science.gov (United States)

    Dutta, Rohan; Ghosh, Parthasarathi; Chowdhury, Kanchan

    2017-12-01

    The diverse power generation sector requires energy storage due to the penetration of variable renewable energy sources and the use of CO2 capture plants with fossil-fuel-based power plants. Cryogenic energy storage, being a large-scale, decoupled system capable of producing large power in the range of MWs, is one of the options. The drawback of these systems is their low turnaround efficiency, because the liquefaction processes are highly energy intensive. In this paper, the scopes for improving the turnaround efficiency of such a plant based on liquid nitrogen were identified and some of them were addressed. A method using multiple stages of reheat and expansion was proposed, improving the turnaround efficiency from 22% to 47% with four such stages in the cycle. The novelty here is the application of reheating in a cryogenic system and the utilization of waste heat for that purpose. Based on the study, process conditions for a laboratory-scale setup were determined and are presented here.

  16. Research and analysis of modernization processes in food industry enterprises of Ukraine

    Directory of Open Access Journals (Sweden)

    Buzhimska K.O.

    2017-08-01

    Full Text Available The modernization of domestic enterprises is a prerequisite for the integration of Ukraine into the European Union. First of all, this concerns food industry enterprises, because they have the greatest potential for access to European markets with their own products. Accelerated modernization will provide an opportunity to improve the quality and safety of domestic food products and bring them closer to world standards. The methods and methodology of economic and statistical analysis remain the focus of scholars. The analysis of trends, directions of development, and results of activities creates a basis for the adoption of quality management decisions at both the strategic and operational levels. The study of the modernization process is impossible without the use of methods of economic and statistical analysis for a general evaluation of its state and efficiency. The paper proposes relative indicators of the dynamics of asset value, residual value of fixed assets, sales volume, financial result before taxation, and net profit for an overall assessment of the modernization process. It is substantiated that the modernization process is effective if the growth rate of asset value is greater than one, the growth rate of the residual value of fixed assets exceeds the growth rate of assets, the growth rate of sales exceeds the growth rate of the residual value of fixed assets, the growth rate of the financial result before taxation is higher than the growth rate of sales volume, and the growth rate of net profit is higher than the growth rate of the financial result before taxation. Using the Spearman coefficient, the authors obtained the following results: the modernization process was most effective in 2011–2012, and modernization processes in the food industry slowed down sharply during 2013–2015, but due to the already formed potential, they continue, as confirmed by the integral indices of the state and efficiency of
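
    A minimal sketch of checking the growth-rate conditions listed above and computing a Spearman rank correlation between two yearly indicator series; all values are hypothetical, not the paper's data.

        from scipy.stats import spearmanr

        # Hypothetical year-over-year growth rates (index values, 1.0 = no change).
        g = {"assets": 1.06, "residual_fixed_assets": 1.09, "sales": 1.12,
             "pretax_result": 1.15, "net_profit": 1.18}

        conditions = [
            g["assets"] > 1.0,
            g["residual_fixed_assets"] > g["assets"],
            g["sales"] > g["residual_fixed_assets"],
            g["pretax_result"] > g["sales"],
            g["net_profit"] > g["pretax_result"],
        ]
        print("Modernization effective this year:", all(conditions))

        # Rank agreement between two illustrative yearly indicator series.
        state_index = [0.42, 0.55, 0.61, 0.48, 0.40, 0.37]
        efficiency_index = [0.38, 0.57, 0.66, 0.45, 0.41, 0.35]
        rho, p = spearmanr(state_index, efficiency_index)
        print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")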

  17. Fiscal 1999 research report. Research on photonic measurement and processing technology (Development of high-efficiency production process technology); 1999 nendo foton keisoku kako gijutsu seika hokokusho. Kokoritsu seisan process gijutsu kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    This report summarizes the fiscal 1999 research result on R and D of laser processing technology, in-situ measurement technology, and generation and control technology of photon as laser beam source, for energy saving and efficiency improvement of energy-consumptive production processes such as welding, jointing, surface treatment and fine particle fabrication. The research was carried out by a technical center, 9 companies and a university as contract research. The research themes are as follows: (1) Processing technology: simulation technology for laser welding phenomena, synthesis technology for quantum dot functional structures, and fabrication technology for functional composite materials, (2) In-situ measurement technology: fine particle element and size measurement technology, (3) All-solid-state laser technology: efficient rod type LD-pumping laser module, pumping chamber of slab type laser, improvement of E/O efficiency of laser diode, high-quality nonlinear crystal growth technology, fabrication technology for nonlinear crystals, and high-efficiency harmonic generation technology. Comprehensive survey was also made on high-efficiency photon generation technologies. (NEDO)

  18. LOW-CALORIES RAISINS OBTAINED BY COMBINED DEHYDRATION: PROCESS OPTIMIZATION AND EVALUATION OF THE ANTIOXIDANT EFFICIENCY

    Directory of Open Access Journals (Sweden)

    Mariana B. Laborde

    2015-03-01

    Full Text Available A healthy dehydrated food of high nutritional quality and added value was developed: low-calorie raisins obtained by an ultrasonic-assisted combined dehydration with a two-stage osmotic treatment (D3S) complemented by drying. Pink Red Globe grapes produced at Mendoza (Argentina) underwent a substitution of sugar by the natural sweetener Stevia in two osmotic stages under different conditions (treatment with/without ultrasound; sweetener concentration 18, 20, 22% w/w; time 35, 75, 115 minutes), evaluating soluble solids (SS), moisture (M), total polyphenols (PF), antioxidant efficiency (AE) and sugar profile. The multiple optimization of the process by response surface methodology and desirability analysis allowed M to be minimized, SS (Stevia incorporation) to be maximized, and the maximum amount of PF to be preserved. After the first stage, the optimal treatment reduced the major sugars of the grape (sucrose, glucose) by 32%, and by 57% at the end of the dehydration process.

  19. Development of a novel processing system for efficient sour water stripping

    International Nuclear Information System (INIS)

    Kazemi, Abolghasem; Mehrabani-Zeinabad, Arjomand; Beheshti, Masoud

    2017-01-01

    Application of vapor recompression systems can result in enhanced energy efficiency and reduced energy requirements of distillation systems. In vapor recompression systems, the temperature and dew point temperature of the top product of the column are increased through compression. By transferring heat from the top to the bottoms product, the required boil-up and reflux streams for the column are provided. In this paper, a new system is proposed for efficient stripping of sour water based on vapor recompression. Ammonia and H_2S are the contaminants of sour water. Initially, based on certain product specifications, a sour water stripping system is implemented. A novel processing system is then developed and simulated to reduce utility requirements. The two processing systems are economically evaluated with Aspen Economic Evaluation software. There are 89.0% and 83.7% reductions in hot and cold utility requirements for the proposed system in comparison to the base processing system. The new processing system requires new equipment, such as a compressor, and corresponding mechanical work, which increases its capital and operating costs in comparison to the base case. Nevertheless, the results indicate that the proposed system achieves reductions of 11.4% in total annual costs and 14.9% in operating costs. - Highlights: • A novel system was developed for enhancement of the performance of a distillation system based on vapor recompression. • In this system, utility streams are used for providing thermal energy. • A parametric study is carried out on the proposed processing system. • Applying the proposed system resulted in reduction of energy and utility requirements and costs of the separation process. • Environmental performance of the model was investigated.

  20. Efficient and Flexible Climate Analysis with Python in a Cloud-Based Distributed Computing Framework

    Science.gov (United States)

    Gannon, C.

    2017-12-01

    As climate models become progressively more advanced, and spatial resolution further improved through various downscaling projects, climate projections at a local level are increasingly insightful and valuable. However, the raw size of climate datasets presents numerous hurdles for analysts wishing to develop customized climate risk metrics or perform site-specific statistical analysis. Four Twenty Seven, a climate risk consultancy, has implemented a Python-based distributed framework to analyze large climate datasets in the cloud. With the freedom afforded by efficiently processing these datasets, we are able to customize and continually develop new climate risk metrics using the most up-to-date data. Here we outline our process for using Python packages such as XArray and Dask to evaluate netCDF files in a distributed framework, StarCluster to operate in a cluster-computing environment, cloud computing services to access publicly hosted datasets, and how this setup is particularly valuable for generating climate change indicators and performing localized statistical analysis.
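
    A minimal sketch of the kind of workflow described: opening a multi-file netCDF dataset lazily with xarray and dask chunks, computing a simple annual indicator, and writing the result. The file pattern, variable name (tasmax, assumed in Kelvin), and threshold are assumptions, and the dask.distributed client may equally point at a local machine, a cluster, or cloud workers.

        import xarray as xr
        from dask.distributed import Client

        if __name__ == "__main__":
            client = Client()  # local scheduler by default; pass an address to use a cluster

            # Lazily open a (hypothetical) multi-file daily-maximum-temperature dataset.
            ds = xr.open_mfdataset("tasmax_day_*.nc", combine="by_coords",
                                   chunks={"time": 365})

            # Example indicator: number of days per year above 35 degC.
            hot_days = ((ds["tasmax"] > 35.0 + 273.15)
                        .groupby("time.year").sum("time")
                        .rename("hot_days"))

            # Trigger the distributed computation and save the result.
            hot_days.compute().to_netcdf("hot_days_per_year.nc")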

  1. Liveness and Reachability Analysis of BPMN Process Models

    Directory of Open Access Journals (Sweden)

    Anass Rachdi

    2016-06-01

    Full Text Available Business processes are usually defined by business experts who require intuitive and informal graphical notations such as BPMN (Business Process Model and Notation) for documenting and communicating their organization's activities and behavior. However, BPMN has not been provided with a formal semantics, which limits the analysis of BPMN models to using solely informal techniques such as simulation. In order to address this limitation and use formal verification, it is necessary to define a certain “mapping” between BPMN and a formal language such as Communicating Sequential Processes (CSP) or Petri Nets (PN). This paper proposes a method for the verification of BPMN models by defining formal semantics of BPMN in terms of a mapping to Time Petri Nets (TPN), which are equipped with very efficient analytical techniques. After the translation of BPMN models to TPN, verification is done to ensure that some functional properties are satisfied by the model under investigation, namely liveness and reachability properties. The main advantage of our approach over existing ones is that it takes into account the time component in modeling business process models. An example is used throughout the paper to illustrate the proposed method.
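
    As a toy illustration of the state-space analysis such a mapping enables, the sketch below explores the reachability graph of a tiny untimed Petri net by breadth-first search, testing whether a target marking is reachable and listing dead markings. It is not the paper's BPMN-to-TPN translation and it ignores the time component.

        from collections import deque

        # A toy Petri net: places p1..p3, transitions as (consume, produce) place multisets.
        transitions = {
            "t1": ({"p1": 1}, {"p2": 1}),
            "t2": ({"p2": 1}, {"p3": 1}),
            "t3": ({"p3": 1}, {"p1": 1}),
        }
        initial_marking = {"p1": 1, "p2": 0, "p3": 0}

        def enabled(marking, consume):
            return all(marking.get(p, 0) >= n for p, n in consume.items())

        def fire(marking, consume, produce):
            new = dict(marking)
            for p, n in consume.items():
                new[p] -= n
            for p, n in produce.items():
                new[p] = new.get(p, 0) + n
            return new

        def reachable_markings(m0):
            """Breadth-first exploration of the reachability graph (finite for bounded nets)."""
            seen = {tuple(sorted(m0.items()))}
            queue = deque([m0])
            while queue:
                m = queue.popleft()
                yield m
                for consume, produce in transitions.values():
                    if enabled(m, consume):
                        m2 = fire(m, consume, produce)
                        key = tuple(sorted(m2.items()))
                        if key not in seen:
                            seen.add(key)
                            queue.append(m2)

        target = {"p1": 0, "p2": 0, "p3": 1}
        markings = list(reachable_markings(initial_marking))
        print("target reachable:", target in markings)
        print("dead markings:", [m for m in markings
                                 if not any(enabled(m, c) for c, _ in transitions.values())])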

  2. MA-burners efficiency parameters allowing for the duration of transmutation process

    Energy Technology Data Exchange (ETDEWEB)

    Gulevich, A.; Zemskov, E. [Institute of Physics and Power Engineering, Bondarenko Square 1, Obninsk, Kaluga Region 249020 (Russian Federation); Kalugin, A.; Ponomarev, L. [Russian Research Center ' ' Kurchatov Institute' ' Kurchatov Square 1, Moscow 123182 (Russian Federation); Seliverstov, V. [Institute of Theoretical and Experimental Physics ul.B. Cheremushkinskaya 25, Moscow 117259 (Russian Federation); Seregin, M. [Russian Research Institute of Chemical Technology Kashirskoe Shosse 33, Moscow 115230 (Russian Federation)

    2010-07-01

    Transmutation of minor actinides (MA) means transforming them into fission products. Usually, an MA-burner's transmutation efficiency is characterized by static parameters only, such as the number of neutrons absorbed and the rate of MA feeding. However, the proper characterization of an MA-burner's efficiency additionally requires the consideration of parameters allowing for the duration of the MA transmutation process. Two parameters of that kind are proposed: a) transmutation time {tau} - the mean time period from the moment a mass of MA is loaded into the burner's fuel cycle to be transmuted to the moment this mass is completely transmuted; b) number of reprocessing cycles n{sub rep} - the effective number of reprocessing cycles a mass of loaded MA has to undergo before being completely transmuted. Several MA-burner types have been analyzed with respect to these parameters. It turned out that for all of them the values of these parameters are too high from a practical point of view. It appears that some new approaches to MA-burner design have to be used to significantly reduce the value of these parameters in order to make the large-scale MA transmutation process practically reasonable. Some such approaches are proposed and their potential efficiency is discussed. (authors)

  3. Energy-efficient biogas reforming process to produce syngas: The enhanced methane conversion by O_2

    International Nuclear Information System (INIS)

    Chen, Xuejing; Jiang, Jianguo; Li, Kaimin; Tian, Sicong; Yan, Feng

    2017-01-01

    Highlights: • The effect of O_2 content from 0 to 15% on Ni/SiO_2 is studied for biogas reforming. • The presence of O_2 in biogas improves CH_4 conversion and the stability of biogas reforming. • An obvious carbon-resistance effect is observed due to the carbon gasification effect of O_2 in biogas. • The presence of O_2 in biogas greatly helps inhibit catalyst sintering. - Abstract: We report an energy-efficient biogas reforming process with high and stable methane conversions in the presence of O_2. During this biogas reforming process, the effects of various O_2 concentrations in biogas on initial conversions and stability at various temperatures on a Ni/SiO_2 catalyst were investigated in detail. In addition, theoretical energy consumption and conversions were calculated based on the Gibbs energy minimization method for comparison with experimental results. Carbon formation and sintering during the reforming process were characterized by thermal gravimetric analysis, the Brunauer-Emmett-Teller method, X-ray diffraction, and high-resolution transmission electron microscopy to investigate the feasibility of applying this process to an inexpensive nickel catalyst. The results showed that 5% O_2 in biogas improved the CH_4 conversion and the stability of biogas reforming. The enhancement of stability was attributed to the inhibited sintering, our first finding, and to the simultaneously reduced carbon deposition, which sustained a stable conversion of CH_4 and proved the applicability of a base Ni catalyst to this process. Higher O_2 concentrations (⩾10%) in biogas resulted in a severe decrease in CO_2 conversion and greater H_2O production. Our proposed biogas reforming process, with its high and stable conversion of CH_4, reduced energy input, and applicability to an inexpensive base metal catalyst, offers a good choice for biogas reforming with low O_2 concentrations (⩽5%) to produce syngas with high energy efficiency.

  4. Structure model of energy efficiency indicators and applications

    International Nuclear Information System (INIS)

    Wu, Li-Ming; Chen, Bai-Sheng; Bor, Yun-Chang; Wu, Yin-Chin

    2007-01-01

    For the purposes of energy conservation and environmental protection, the government of Taiwan has instigated long-term policies to continuously encourage and assist industry in improving the efficiency of energy utilization. While multiple actions have led to practical energy savings to a limited extent, no strong evidence of improvement in energy efficiency was observed from the energy efficiency indicator (EEI) system, according to the annual national energy statistics and survey. A structural analysis of EEI is needed in order to understand the role that energy efficiency plays in the EEI system. This work uses the Taylor series expansion to develop a structure model for EEI at the level of the process sector of industry. The model is developed on the premise that the design parameters of the process are compared with the operational parameters to capture energy differences. The utilization index of production capability and the variation index of energy utilization are formulated in the model to describe the differences between EEIs. Both qualitative and quantitative methods for the analysis of energy efficiency and energy savings are derived from the model. Through structural analysis, the model showed that, while the performance of EEI is proportional to the process utilization index of production capability, it is possible for energy use to move in a direction opposite to that of EEI. This helps to explain, at least in part, the inconsistency between EEI and energy savings. An energy-intensive steel plant in Taiwan was selected to show the application of the model. The energy utilization efficiency of the plant was evaluated and the amount of energy that had been saved or over-used in the production process was estimated. Some insights gained from the model outcomes are helpful to further enhance energy efficiency in the plant.
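
    The abstract does not reproduce the decomposition itself, but a first-order Taylor expansion of an indicator of the form EEI = E/P (energy input per unit of production) about the design point illustrates the kind of structure involved; attributing the two terms below to the utilization and variation indices is an illustrative assumption, not the authors' exact formulation. Expanding about the design point (E_d, P_d):

        \Delta\mathrm{EEI} \approx \underbrace{-\frac{E_d}{P_d^{2}}\,(P - P_d)}_{\text{utilization of production capability}} \;+\; \underbrace{\frac{1}{P_d}\,(E - E_d)}_{\text{variation of energy utilization}}

    where E_d and P_d are the design energy input and production rate, and E and P are the corresponding operational values.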

  5. ANALYSIS OF CONSTRAINTS IN RESOURCE USE EFFICIENCY IN ...

    African Journals Online (AJOL)

    ANALYSIS OF CONSTRAINTS IN RESOURCE USE EFFICIENCY IN MULTIPLE CROPPING SYSTEM BY SMALL-HOLDER FARMERS IN EBONYI STATE OF ... high cost of modern inputs, lack of adequate finance and lack of collaterals among others served as major constraints, which constituted 29%, 36%, 33% and 22% ...

  6. An Efficient Data Compression Model Based on Spatial Clustering and Principal Component Analysis in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Yihang Yin

    2015-08-01

    Full Text Available Wireless sensor networks (WSNs have been widely used to monitor the environment, and sensors in WSNs are usually power constrained. Because inner-node communication consumes most of the power, efficient data compression schemes are needed to reduce the data transmission to prolong the lifetime of WSNs. In this paper, we propose an efficient data compression model to aggregate data, which is based on spatial clustering and principal component analysis (PCA. First, sensors with a strong temporal-spatial correlation are grouped into one cluster for further processing with a novel similarity measure metric. Next, sensor data in one cluster are aggregated in the cluster head sensor node, and an efficient adaptive strategy is proposed for the selection of the cluster head to conserve energy. Finally, the proposed model applies principal component analysis with an error bound guarantee to compress the data and retain the definite variance at the same time. Computer simulations show that the proposed model can greatly reduce communication and obtain a lower mean square error than other PCA-based algorithms.
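
    The PCA-with-error-bound step described above can be sketched in a few lines; the snippet below is a generic illustration for a single spatial cluster (the clustering step and the adaptive cluster-head selection are omitted), and the variance-retention rule used as the error bound is an assumption for the example, not the authors' exact criterion.

    # Illustrative sketch (not the paper's algorithm): PCA compression of one cluster's
    # readings, keeping just enough components to satisfy a reconstruction-error bound.
    import numpy as np

    def compress_cluster(readings, error_bound=0.05):
        """readings: (n_samples, n_sensors) array collected at the cluster head."""
        mean = readings.mean(axis=0)
        centered = readings - mean
        U, s, Vt = np.linalg.svd(centered, full_matrices=False)      # PCA via SVD
        explained = np.cumsum(s**2) / np.sum(s**2)
        k = int(np.searchsorted(explained, 1.0 - error_bound)) + 1   # retained components
        scores = centered @ Vt[:k].T          # compressed data transmitted to the sink
        return mean, Vt[:k], scores

    def decompress(mean, components, scores):
        return scores @ components + mean

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        t = np.linspace(0, 10, 200)
        data = np.sin(t)[:, None] + 0.05 * rng.standard_normal((200, 4))  # 4 correlated sensors
        mean, comps, scores = compress_cluster(data)
        mse = float(np.mean((data - decompress(mean, comps, scores)) ** 2))
        print(f"kept {comps.shape[0]} of 4 components, mean squared error {mse:.5f}")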

  7. An Efficient Data Compression Model Based on Spatial Clustering and Principal Component Analysis in Wireless Sensor Networks.

    Science.gov (United States)

    Yin, Yihang; Liu, Fengzheng; Zhou, Xiang; Li, Quanzhong

    2015-08-07

    Wireless sensor networks (WSNs) have been widely used to monitor the environment, and sensors in WSNs are usually power constrained. Because inner-node communication consumes most of the power, efficient data compression schemes are needed to reduce the data transmission to prolong the lifetime of WSNs. In this paper, we propose an efficient data compression model to aggregate data, which is based on spatial clustering and principal component analysis (PCA). First, sensors with a strong temporal-spatial correlation are grouped into one cluster for further processing with a novel similarity measure metric. Next, sensor data in one cluster are aggregated in the cluster head sensor node, and an efficient adaptive strategy is proposed for the selection of the cluster head to conserve energy. Finally, the proposed model applies principal component analysis with an error bound guarantee to compress the data and retain the definite variance at the same time. Computer simulations show that the proposed model can greatly reduce communication and obtain a lower mean square error than other PCA-based algorithms.

  8. Efficient Fixed-Offset GPR Scattering Analysis

    DEFF Research Database (Denmark)

    Meincke, Peter; Chen, Xianyao

    2004-01-01

    The electromagnetic scattering by buried three-dimensional penetrable objects, as involved in the analysis of ground penetrating radar systems, is calculated using the extended Born approximation. The involved scattering tensor is calculated using fast Fourier transforms (FFTs). We incorporate in the scattering calculation the correct radiation patterns of the ground penetrating radar antennas by using their plane-wave transmitting and receiving spectra. Finally, we derive an efficient FFT-based method to analyze a fixed-offset configuration in which the location of the transmitting antenna is different...

  9. Urban Land Use Efficiency and Coordination in China

    Directory of Open Access Journals (Sweden)

    Xiaodong Yang

    2017-03-01

    Full Text Available Due to the focused pursuit of economic growth in the process of the large-scale urban development of China, the phenomena of low land use efficiency and discordance of land use induce unwanted economic, social, and environmental costs. This paper presents a comprehensive study of urban land use efficiency and of the degree of land use coordination of 33 cities in China, using theoretical analysis, data envelopment analysis, principal component analysis, the coordination coefficient method, and four-quadrant analysis. The findings of this study suggest a gradually increasing proportion of land use efficiency from eastern to central and western regions of China, coinciding with China’s pattern of socioeconomic development. No correlation was found between high levels of urban land use efficiency and the degree of land use coordination; however, a significant correlation was found between low land use efficiency and low degrees of land use coordination. Rational land use planning and policy design can effectively improve both urban land use efficiency and coordination.

  10. High efficiency grating couplers based on shared process with CMOS MOSFETs

    International Nuclear Information System (INIS)

    Qiu Chao; Sheng Zhen; Wu Ai-Min; Wang Xi; Zou Shi-Chang; Gan Fu-Wan; Li Le; Albert Pang

    2013-01-01

    Grating couplers are widely investigated as coupling interfaces between silicon-on-insulator waveguides and optical fibers. In this work, a high-efficiency, complementary metal-oxide-semiconductor (CMOS) process compatible grating coupler is proposed. The poly-Si layer used as a gate in the CMOS metal-oxide-semiconductor field effect transistor (MOSFET) is combined with a normal fully etched grating coupler, which greatly enhances its coupling efficiency. With optimal structure parameters, simulations indicate that the coupling efficiency can reach as high as ∼70% at a wavelength of 1550 nm. From a fabrication standpoint, all masks and etching steps are shared between MOSFETs and grating couplers, making high-performance grating couplers easy to integrate with CMOS circuits. Fabrication errors such as alignment shift are also simulated, showing that the device is quite tolerant of fabrication variations. (electromagnetism, optics, acoustics, heat transfer, classical mechanics, and fluid dynamics)

  11. TDat: An Efficient Platform for Processing Petabyte-Scale Whole-Brain Volumetric Images.

    Science.gov (United States)

    Li, Yuxin; Gong, Hui; Yang, Xiaoquan; Yuan, Jing; Jiang, Tao; Li, Xiangning; Sun, Qingtao; Zhu, Dan; Wang, Zhenyu; Luo, Qingming; Li, Anan

    2017-01-01

    Three-dimensional imaging of whole mammalian brains at single-neuron resolution has generated terabyte (TB)- and even petabyte (PB)-sized datasets. Due to their size, processing these massive image datasets can be hindered by the computer hardware and software typically found in biological laboratories. To fill this gap, we have developed an efficient platform named TDat, which adopts a novel data reformatting strategy by reading cuboid data and employing parallel computing. In data reformatting, TDat is more efficient than any other software. In data accessing, we adopted parallelization to fully explore the capability for data transmission in computers. We applied TDat in large-volume data rigid registration and neuron tracing in whole-brain data with single-neuron resolution, which has never been demonstrated in other studies. We also showed its compatibility with various computing platforms, image processing software and imaging systems.

  12. Highly efficient red phosphorescent organic light-emitting diodes based on solution processed emissive layer

    International Nuclear Information System (INIS)

    Liu, Baiquan; Xu, Miao; Tao, Hong; Ying, Lei; Zou, Jianhua; Wu, Hongbin; Peng, Junbiao

    2013-01-01

    Highly efficient red phosphorescent organic light-emitting diodes (PhOLEDs) were fabricated based on a solution-processed small-molecule host, 4,4′-bis(N-carbazolyl)-1,1′-biphenyl (CBP), doped with an iridium complex, tris(1-(2,6-dimethylphenoxy)-4-(4-chlorophenyl)phthalazine)iridium (III) (Ir(MPCPPZ)_3). A hole-blocking layer of 1,3,5-tri(1-phenyl-1H-benzo[d]imidazol-2-yl)phenyl (TPBI), which also functions as an electron transporter, was thermally deposited onto the top of the CBP layer. The diode with the structure ITO/PEDOT:PSS (50 nm)/CBP:Ir(MPCPPZ)_3 (55 nm)/TPBI (30 nm)/Ba (4 nm)/Al (120 nm) showed an external quantum efficiency (QE_ext) of 19.3% and a luminous efficiency (LE) of 18.3 cd/A at a current density of 0.16 mA/cm^2, with Commission Internationale de l'Eclairage (CIE) coordinates of (0.607, 0.375). It was suggested that diodes using the TPBI layer exhibited nearly 100% internal quantum efficiency and an order of magnitude enhancement in LE or QE_ext. -- Highlights: • Efficient red PhOLEDs based on a solution-processed small-molecule host were fabricated. • By altering the volume ratio of the chloroform/chlorobenzene solvent, we obtained the best film quality of CBP. • The EQE of the diode was 19.3%, indicating that nearly 100% internal quantum yield was achieved

  13. Tool for efficient intermodulation analysis using conventional HB packages

    OpenAIRE

    Vannini, G.; Filicori, F.; Traverso, P.

    1999-01-01

    A simple and efficient approach is proposed for the intermodulation analysis of nonlinear microwave circuits. The algorithm, which is based on a very mild assumption about the frequency response of the linear part of the circuit, allows for a reduction in computing time and memory requirements. Moreover, it can be easily implemented using any conventional tool for harmonic-balance circuit analysis

  14. Word Recognition Processing Efficiency as a Component of Second Language Listening

    Science.gov (United States)

    Joyce, Paul

    2013-01-01

    This study investigated the application of the speeded lexical decision task to L2 aural processing efficiency. One-hundred and twenty Japanese university students completed an aural word/nonword task. When the variation of lexical decision time (CV) was correlated with reaction time (RT), the results suggested that the single-word recognition…

  15. Using Data Envelopment Analysis to Measure International Agricultural Efficiency and Productivity

    OpenAIRE

    Arnade, Carlos Anthony

    1994-01-01

    Numerous methods for measuring multifactor productivity have been used by economists. This report uses a recently developed approach, data envelopment analysis, to measure productivity. This method can be used not only to calculate productivity but also to divide productivity measures into indices that measure technical efficiency and technical change. Technical efficiency measures the efficiency with which resources are used. Technical change measures changes in output arising from improved t...

  16. Resource efficiency of urban sanitation systems. A comparative assessment using material and energy flow analysis

    Energy Technology Data Exchange (ETDEWEB)

    Meinzinger, Franziska

    2010-07-01

    fertiliser could be substituted by nutrients recovered from wastewater; for the case study of Arba Minch this substitution amounts to a maximum of 16%. Factors such as the transport of source-separated flows or complex nutrient recovery processes can result in an increasing energy demand. However, source separation and recovery processes can also lead to energy reduction, for example, by urine diversion (minus 12% for the case of Hamburg) or by the use of biogas from anaerobic treatment plants (minus 38% for the case of Arba Minch). The energy efficiency depends on determinant parameters, e.g. the amount of co-digested organic waste. The impact of these parameters can be simulated in the model. Source-separating wastewater systems can reduce the use of natural water resources, for example, by reduced flush water consumption or greywater recycling. The integration of cost estimates with material and energy flow analyses allows a cost-effectiveness appraisal of the system developments. Assumptions such as whether the costs refer to a new development or the modification of existing infrastructure have a major impact on the cost comparison. Where the sanitation system is improved, there is invariably an increase in costs when compared to the current situation. But in addition, financial benefits can be generated. For each case study, a discussion of the driving forces, preconditions and starting points for implementation complements the comparative assessment. In addition, potential obstacles to transformation are discussed. The study shows that the method of using combined cost, energy and material flow analysis yields purposeful insights into the resource efficiency of alternative sanitation systems. This can contribute comprehensively to system analysis and decision support. (orig.)

  17. Measuring the relative efficiency of Ilam hospitals using data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Ehsan Fazeli

    2012-08-01

    Full Text Available Measuring relative efficiency is one of the most important issues among hospitals in today's economy. Cost reduction is a necessity for the survival of business owners, and one primary way to reduce expenditures is to increase relative efficiency. This paper first uses output-oriented data envelopment analysis (DEA) to measure the relative efficiencies of nine hospitals. The proposed model uses four types of employees, namely specialists, physicians, technicians and other staff, as input parameters. The model also uses the numbers of surgeries, hospitalized patients and radiography exams as the outputs of the proposed model. Since the implementation of DEA yields more than one efficient unit, we apply the super-efficiency technique to measure the relative efficiency of the efficient units.
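
    For readers unfamiliar with the model, an output-oriented CCR envelopment program of the kind used above can be solved with any LP library. The sketch below is a generic illustration with simulated data (four staff-type inputs, three service outputs, nine units); it is not the study's data or code, and it omits the super-efficiency step.

    # Illustrative output-oriented CCR DEA with scipy's linear programming solver.
    import numpy as np
    from scipy.optimize import linprog

    def dea_output_oriented(X, Y):
        """X: (n, m) inputs, Y: (n, s) outputs. Returns CCR efficiency scores in (0, 1]."""
        n, m = X.shape
        s = Y.shape[1]
        scores = []
        for o in range(n):
            c = np.zeros(n + 1)
            c[0] = -1.0                                   # maximize phi
            A_ub, b_ub = [], []
            for i in range(m):                            # sum_j lambda_j * x_ij <= x_io
                A_ub.append(np.concatenate(([0.0], X[:, i])))
                b_ub.append(X[o, i])
            for r in range(s):                            # phi * y_ro <= sum_j lambda_j * y_rj
                A_ub.append(np.concatenate(([Y[o, r]], -Y[:, r])))
                b_ub.append(0.0)
            res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                          bounds=[(0, None)] * (n + 1), method="highs")
            scores.append(1.0 / res.x[0])                 # efficiency = 1 / phi
        return np.array(scores)

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        X = rng.uniform(10, 100, size=(9, 4))   # hypothetical staffing inputs for 9 hospitals
        Y = rng.uniform(50, 500, size=(9, 3))   # hypothetical outputs (surgeries, inpatients, imaging)
        print(np.round(dea_output_oriented(X, Y), 3))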

  18. Super-Efficiency and Sensitivity Analysis Based on Input-Oriented DEA-R

    Directory of Open Access Journals (Sweden)

    M. R. Mozaffari∗

    2012-03-01

    Full Text Available This paper suggests a method for finding super-efficiency scores and modifying input-oriented models for the sensitivity analysis of decision making units. First, by using DEA-R (ratio-based DEA) models in the input orientation, models of super-efficiency and of super-efficiency modification are suggested. Second, worst-case scenarios are considered in which the efficiency of the test DMU is deteriorating while the efficiencies of the other DMUs are improving. Then, by combining these two ideas, a model is suggested which increases the super-efficiency score and modifies the change ranges in order to preserve the performance class. In the end, the super-efficiency and change intervals of efficient decision making units for 23 branches of Zone 1 of the Islamic Azad University are calculated.

  19. Pilot-scale investigation of the robustness and efficiency of a copper-based treated wood wastes recycling process

    Energy Technology Data Exchange (ETDEWEB)

    Coudert, Lucie [INRS-ETE (Canada); Blais, Jean-François, E-mail: blaisjf@ete.inrs.ca [INRS-ETE (Canada); Mercier, Guy [INRS-ETE (Canada); Cooper, Paul [University of Toronto (Canada); Gastonguay, Louis [IREQ (Canada); Morris, Paul [FPInnovations (Canada); Janin, Amélie; Reynier, Nicolas [INRS-ETE (Canada)

    2013-10-15

    Highlights: • A leaching process was studied for metals removal from CCA-treated wood wastes. • This decontamination process was studied at pilot scale (130-L reactor). • Removals up to 98% of As, 88% of Cr, and 96% of Cu were obtained from wood wastes. • The produced leachates can be treated by chemical precipitation. -- Abstract: The disposal of metal-bearing treated wood wastes is becoming an environmental challenge. An efficient recycling process based on sulfuric acid leaching has been developed to remove metals from copper-based treated wood chips (0 < x < 12 mm). The present study explored the performance and robustness of this technology in removing metals from copper-based treated wood wastes at a pilot plant scale (130-L reactor tank). After 3 × 2 h leaching steps followed by 3 × 7 min rinsing steps, up to 97.5% of As, 87.9% of Cr, and 96.1% of Cu were removed from CCA-treated wood wastes with different initial metal loadings (>7.3 kg m^-3), and more than 94.5% of Cu was removed from ACQ-, CA- and MCQ-treated wood. The treatment of effluents by precipitation-coagulation was highly efficient, allowing removals of more than 93% of the As, Cr, and Cu contained in the effluent. The economic analysis included operating costs, indirect costs and revenues related to remediated wood sales. It concluded that CCA-treated wood waste remediation can lead to a benefit of 53.7 US$ t^-1 or a cost of 35.5 US$ t^-1, and that ACQ-, CA- and MCQ-treated wood waste recycling leads to benefits ranging from 9.3 to 21.2 US$ t^-1.

  20. Biomass gasification for electricity generation with internal combustion engines. Process efficiency

    International Nuclear Information System (INIS)

    Lesme-Jaén, René; Garcia Faure, Luis; Recio Recio, Angel; Oliva Ruiz, Luis; Pajarín Rodríguez, Juan; Revilla Suarez, Dennis

    2015-01-01

    Biomass is a renewable energy source with growing prospects worldwide, owing to its potential and its lower environmental impact compared to fossil fuels. Through conversion processes and energy technologies it is possible to obtain fuels in solid, liquid and gaseous form from any biomass. Biomass gasification is the thermal conversion of biomass into a gas, which can be used for electricity production with internal combustion engines at a certain level of efficiency that depends on the characteristics of the biomass and the engines used. This work presents the evaluation of the thermal and overall efficiency of the gasification plant at the Integrated Forestry Enterprise of Santiago de Cuba, designed to generate electricity from forest industry waste. The plant consists of a downdraft gasifier reactor (COMBO-80 model) and a diesel engine of Indian manufacture (Leyland model) modified to run on producer gas. The evaluation was carried out for different engine loads (electric power generated), based on experimental measurements of the flow and composition of the gas supplied to the engine. The results show that the engine operates with a thermal efficiency in the range of 20-32% and an overall efficiency between 12-25%. (full text)
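
    The two efficiencies reported above are linked by a simple chain: the gasifier's cold-gas efficiency multiplied by the engine's efficiency gives the overall figure. The sketch below illustrates that bookkeeping with invented operating values (gas flow, heating values, electrical output), not measurements from the Santiago de Cuba plant.

    # Back-of-the-envelope efficiency chain for a gasifier + engine genset.
    # All operating values are illustrative assumptions.

    def efficiency_chain(gas_flow_nm3_h, gas_lhv_mj_nm3, electric_power_kw,
                         biomass_kg_h, biomass_lhv_mj_kg):
        """Return (engine efficiency, cold-gas efficiency, overall efficiency)."""
        gas_power_kw = gas_flow_nm3_h * gas_lhv_mj_nm3 * 1000.0 / 3600.0      # MJ/h -> kW
        biomass_power_kw = biomass_kg_h * biomass_lhv_mj_kg * 1000.0 / 3600.0
        eta_engine = electric_power_kw / gas_power_kw
        eta_cold_gas = gas_power_kw / biomass_power_kw
        return eta_engine, eta_cold_gas, eta_engine * eta_cold_gas

    if __name__ == "__main__":
        # Hypothetical point: 150 Nm3/h of producer gas at 5 MJ/Nm3, 55 kWe output,
        # 80 kg/h of wood residues at 16 MJ/kg.
        eng, cg, overall = efficiency_chain(150.0, 5.0, 55.0, 80.0, 16.0)
        print(f"engine {eng:.1%}, cold gas {cg:.1%}, overall {overall:.1%}")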

  1. Women process multisensory emotion expressions more efficiently than men.

    Science.gov (United States)

    Collignon, O; Girard, S; Gosselin, F; Saint-Amour, D; Lepore, F; Lassonde, M

    2010-01-01

    Despite claims in the popular press, experiments investigating whether female observers are more efficient than male observers at processing expressions of emotion have produced inconsistent findings. In the present study, participants were asked to categorize fear and disgust expressions displayed auditorily, visually, or audio-visually. Results revealed an advantage for women in all conditions of stimulus presentation. We also observed more nonlinear probabilistic summation in the bimodal conditions in female than in male observers, indicating greater neural integration of different sensory-emotional information. These findings indicate robust differences between genders in the multisensory perception of emotion expression.

  2. An efficient methodology for the analysis of primary frequency control of electric power systems

    Energy Technology Data Exchange (ETDEWEB)

    Popovic, D.P. [Nikola Tesla Institute, Belgrade (Yugoslavia); Mijailovic, S.V. [Electricity Coordinating Center, Belgrade (Yugoslavia)

    2000-06-01

    The paper presents an efficient methodology for the analysis of primary frequency control of electric power systems. This methodology continuously monitors the electromechanical transient processes, with durations of up to 30 s, occurring after characteristic disturbances. It covers the period of short-term dynamic processes appearing immediately after the disturbance, in which the dynamics of the individual synchronous machines are dominant, as well as the period with uniform movement of all generators and restoration of their voltages. The characteristics of the developed methodology were determined using the example of the real electric power interconnection formed by the electric power systems of Yugoslavia, a part of the Republic of Srpska, Romania, Bulgaria, the former Yugoslav Republic of Macedonia, Greece and Albania (the second UCPTE synchronous zone). (author)

  3. CAMS: OLAPing Multidimensional Data Streams Efficiently

    Science.gov (United States)

    Cuzzocrea, Alfredo

    In the context of data stream research, taming the multidimensionality of real-life data streams in order to efficiently support OLAP analysis/mining tasks is a critical challenge. Inspired by this fundamental motivation, in this paper we introduce CAMS (Cube-based Acquisition model for Multidimensional Streams), a model for efficiently OLAPing multidimensional data streams. CAMS combines a set of data stream processing methodologies, namely (i) the OLAP dimension flattening process, which allows us to obtain dimensionality reduction of multidimensional data streams, and (ii) the OLAP stream aggregation scheme, which aggregates data stream readings according to an OLAP-hierarchy-based membership approach. We complete our analytical contribution by means of experimental assessment and analysis of both the efficiency and the scalability of OLAPing capabilities of CAMS on synthetic multidimensional data streams. Both analytical and experimental results clearly connote CAMS as an enabling component for next-generation Data Stream Management Systems.

  4. Thermodynamic analysis of tar reforming through auto-thermal reforming process

    Energy Technology Data Exchange (ETDEWEB)

    Nurhadi, N., E-mail: nurhadi@tekmira.esdm.go.id; Diniyati, Dahlia; Efendi, M. Ade Andriansyah [R& D Centre for Mineral and Coal Technology, Jln. Jend.Sudirman no. 623, Bandung. Telp. 022-6030483 (Malaysia); Istadi, I. [Department of Chemical Engineering, Diponegoro University, Jln. Jl. Prof. Soedarto, SH, Semarang (Malaysia)

    2015-12-29

    Fixed bed gasification is a simple and suitable technology for small scale power generation. One of the disadvantages of this technology is that it produces tar. So far, tar is not utilized and remains a waste that should be treated into a more useful product. This paper presents a thermodynamic analysis of tar conversion into producer gas through non-catalytic auto-thermal reforming technology. Tar was decomposed into its components, C, H, O, N and S, and then reacted with an oxidant such as air or pure oxygen. The reaction thus occurred auto-thermally and reached chemical equilibrium. The sensitivity analysis showed that the most promising process performance occurred when the air flow rate reached 43% of the stoichiometric value, the process temperature was 1100°C, the addition of pure oxygen was 40% and the oxidant flow was preheated to 250°C. The yield at the most promising process performance was between 11.15-11.17 kmol/h and the cold gas efficiency was between 73.8-73.9%. The results of this study indicate that, thermodynamically, the conversion of tar into producer gas through non-catalytic auto-thermal reforming is promising.

  5. Efficiency of the health extension programme in Tigray, Ethiopia: a data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Lemma Hailemariam

    2010-06-01

    Full Text Available Abstract Background Since 2004, the government of Ethiopia has made a bold decision to strengthen and expand its primary health care system by launching the Health Extension Program (HEP). While the scaling up of the HEP is necessary to achieve the aim of universal access to primary health care, close attention should be paid to the performance of the program. Using data envelopment analysis, this study aimed (i) to estimate the technical efficiency of a sample of health posts in rural Tigray, and (ii) to identify the factors that might explain the efficiency results. Methods Efficiency was measured using a data envelopment analysis model. A Tobit model was performed to identify factors associated with efficiency. Seven rural districts (out of 35) were purposely chosen. Input/output information was collected from the database of the Tigray Health Bureau during July 2007-June 2008. Information was also collected on environmental factors that might influence the efficiency outcomes, through a structured questionnaire administered to the corresponding district health officers. Results Analysis was based on data from 60 health posts. The mean scores for technical and scale efficiency were 0.57 (SD = 0.32) and 0.95 (SD = 0.11), respectively. Out of the 60 health posts, 15 (25.0%) were found to be technically efficient, constituting the best practice frontier. Thirty-eight (63.3%) were operating at their most productive scale size. In the regression analysis, none of the variables was significantly associated with the efficiency outcome. Conclusion There is a need to review the management of the health information system in the region. The findings have also revealed that only a quarter of the health posts are working efficiently and point to the need for improvement. Closer monitoring of the health extension programme is required in order to achieve the best possible performance.

  6. The Efficiency of Split Panel Designs in an Analysis of Variance Model

    Science.gov (United States)

    Wang, Wei-Guo; Liu, Hai-Jun

    2016-01-01

    We consider split panel design efficiency in analysis of variance models, that is, the determination of the optimal proportion of the cross-section series in the full sample, so as to minimize the variances of the best linear unbiased estimators of linear combinations of parameters. An orthogonal matrix is constructed to obtain a manageable expression for the variances. On this basis, we derive a theorem for analyzing split panel design efficiency irrespective of the interest and budget parameters. Additionally, the efficiency of an estimator based on the split panel relative to an estimator based on a pure panel or a pure cross-section is presented. The analysis shows that the gains from the split panel can be quite substantial. We further consider the efficiency of split panel design given a budget, and transform the problem into a constrained nonlinear integer program. Specifically, an efficient algorithm is designed to solve the constrained nonlinear integer program. Moreover, we combine one-at-a-time designs and factorial designs to illustrate the algorithm's efficiency with an empirical example concerning monthly consumer expenditure on food in 1985 in the Netherlands, and the efficient ranges of the algorithm parameters are given to ensure a good solution. PMID:27163447

  7. Thermodynamic modelling and efficiency analysis of a class of real indirectly fired gas turbine cycles

    Directory of Open Access Journals (Sweden)

    Ma Zheshu

    2009-01-01

    Full Text Available Indirectly or externally fired gas turbines (IFGT or EFGT) are a novel technology under development for small and medium scale combined power and heat supply, in combination with micro gas turbine technologies. They are mainly intended for the utilization of the waste heat from the turbine in a recuperative process and for the possibility of burning biomass or 'dirty' fuel, by employing a high temperature heat exchanger to avoid the combustion gases passing through the turbine. In this paper, by assuming that all fluid friction losses in the compressor and turbine are quantified by a corresponding isentropic efficiency and all global irreversibilities in the high temperature heat exchanger are taken into account by an effective efficiency, a one-dimensional model including power output and cycle efficiency formulations is derived for a class of real IFGT cycles. To illustrate and analyze the effect of operational parameters on IFGT efficiency, detailed numerical analysis and figures are produced. The results, summarized in figures, show that IFGT cycles are most efficient in the low compression ratio range (3.0-6.0) and are suited to low power output circumstances, integrating with micro gas turbine technology. The derived model can be used to analyze and forecast the performance of real IFGT configurations.
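
    An air-standard version of such a one-dimensional model is easy to reproduce. The sketch below assumes a recuperated indirectly fired cycle with constant specific heat; the component efficiencies, recuperator effectiveness, and turbine inlet temperature are illustrative values, not those of the paper, but the sketch reproduces the qualitative result that efficiency peaks at low pressure ratios.

    # Minimal air-standard sketch of a recuperated indirectly fired gas turbine cycle.
    # Component efficiencies and temperatures are assumed, not the paper's values.
    def ifgt_efficiency(r, T1=300.0, T3=1100.0, eta_c=0.85, eta_t=0.88,
                        eta_hx=0.90, eps_rec=0.85, gamma=1.4):
        """Cycle thermal efficiency for compressor pressure ratio r (cp constant)."""
        k = (gamma - 1.0) / gamma
        T2 = T1 + (T1 * r**k - T1) / eta_c                  # real compressor exit temperature
        T4 = T3 - eta_t * (T3 - T3 * r**(-k))               # real turbine exit temperature
        T2r = T2 + eps_rec * max(T4 - T2, 0.0)              # recuperator preheats compressed air
        w_net = (T3 - T4) - (T2 - T1)                       # specific work per unit cp
        q_in = (T3 - T2r) / eta_hx                          # heat through the HT heat exchanger
        return w_net / q_in if w_net > 0.0 else 0.0

    if __name__ == "__main__":
        for r in (2, 3, 4, 5, 6, 8, 10):
            print(f"pressure ratio {r:>2}: thermal efficiency {ifgt_efficiency(r):.1%}")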

  8. Integrated electrocoagulation-electrooxidation process for the treatment of soluble coffee effluent: Optimization of COD degradation and operation time analysis.

    Science.gov (United States)

    Ibarra-Taquez, Harold N; GilPavas, Edison; Blatchley, Ernest R; Gómez-García, Miguel-Ángel; Dobrosz-Gómez, Izabela

    2017-09-15

    Soluble coffee production generates wastewater containing complex mixtures of organic macromolecules. In this work, a sequential electrocoagulation-electrooxidation (EC-EO) process, using aluminum and graphite electrodes, was proposed as an alternative for the treatment of soluble coffee effluent. Process operational parameters were optimized, achieving total decolorization as well as 74% COD removal and 63.5% TOC removal. The integrated EC-EO process yielded a highly oxidized (AOS = 1.629) and biocompatible (BOD_5/COD ≈ 0.6) effluent. The molecular weight distribution (MWD) analysis showed that during the EC-EO process, EC effectively decomposed contaminants with molecular weights in the range of 10-30 kDa. In contrast, EO was quite efficient in the mineralization of contaminants with molecular weights higher than 30 kDa. A kinetic analysis allowed determination of the time required to meet Colombian permissible discharge limits. Finally, a comprehensive operational cost analysis was performed. The integrated EC-EO process was demonstrated to be an efficient alternative for the treatment of industrial effluents resulting from soluble coffee production. Copyright © 2017 Elsevier Ltd. All rights reserved.
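
    The operation-time estimate described above can be illustrated with a pseudo-first-order COD decay model; the rate constant, initial COD, and discharge limit in the sketch are placeholders, not the study's fitted values.

    # Hypothetical pseudo-first-order sketch: COD(t) = COD0 * exp(-k t); solve for the
    # electro-oxidation time at which COD reaches the permissible discharge limit.
    import math

    def time_to_limit(cod0_mg_l, k_per_min, limit_mg_l):
        if cod0_mg_l <= limit_mg_l:
            return 0.0
        return math.log(cod0_mg_l / limit_mg_l) / k_per_min

    if __name__ == "__main__":
        # Placeholder values: 3000 mg/L initial COD, k = 0.012 1/min, 400 mg/L limit.
        print(f"estimated operation time: {time_to_limit(3000.0, 0.012, 400.0):.0f} min")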

  9. The lower limit of interval efficiency in Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Bijan Rahmani Parchikolaei

    2015-05-01

    Full Text Available In the data envelopment analysis technique, the relative efficiency of homogeneous decision making units is calculated. These calculations are based on classical linear programming models such as CCR, BCC, etc. Because these models maximize the ratio of the weighted sum of outputs to the weighted sum of inputs of one unit under certain conditions, the efficiency obtained in all of them is the upper limit of the exact relative efficiency. In other words, the efficiency is calculated from an optimistic viewpoint. To calculate the lower limit of efficiency, i.e. the efficiency obtained from a pessimistic viewpoint for certain weights, the existing models cannot compute the exact lower limit, and in some cases there exist models that yield an incorrect lower limit. Through the model introduced in the present study, we can calculate the exact lower limit of the interval efficiency. The model is obtained by minimizing the ratio of the weighted sum of outputs to that of inputs for every unit under certain conditions. The exact lower limit can be calculated in all cases through our proposed model.

  10. Quantitative analysis of forest fire extinction efficiency

    Directory of Open Access Journals (Sweden)

    Miguel E. Castillo-Soto

    2015-08-01

    Full Text Available Aim of study: Evaluate the economic extinction efficiency of forest fires, based on the study of fire combat undertaken by aerial and terrestrial means. Area of study, materials and methods: Approximately 112,000 hectares in Chile. Records of 5,876 forest fires that occurred between 1998 and 2009 were analyzed. The area further provides a validation sector for results, by incorporating databases for the years 2010 and 2012. The criteria used for measuring extinction efficiency were economic value of forestry resources, Contraction Factor analysis and definition of the extinction costs function. Main results: It is possible to establish a relationship between burnt area, extinction costs and economic losses. The method proposed may be used and adapted to other fire situations, requiring unit costs for aerial and terrestrial operations, economic value of the property to be protected and speed attributes of fire spread in free advance. Research highlights: The determination of extinction efficiency in containment works of forest fires and potential projection of losses, different types of plant fuel and local conditions favoring the spread of fire broaden the admissible ranges of a, φ and Ce considerably.

  11. Heat supply analysis of steam reforming hydrogen production process in conventional and nuclear

    International Nuclear Information System (INIS)

    Siti Alimah; Djati Hoesen Salimy

    2015-01-01

    The analysis of heat energy supply in the production of hydrogen by the natural gas steam reforming process has been carried out. The aim of the study is to compare conventional and nuclear heat supply systems. The methodology used in this study is a literature assessment and a comparison-based analysis. The study shows that the fossil-fuel heat source (natural gas) is able to provide optimum operating conditions, with a temperature and pressure of 850-900 °C and 2-3 MPa, and that the heat transfer is dominated by radiation, so the heat flux that can be achieved on the catalyst tube is relatively high (50-80 kW/m^2), providing a high thermal efficiency of about 85%. In the system with nuclear energy, due to safety demands, the process operates at less than optimum conditions, with a temperature and pressure of 800-850 °C and 4.5 MPa, and the heat transfer is dominated by convection, so the heat flux that can be achieved on the catalyst tube is relatively low (10-20 kW/m^2), providing a low thermal efficiency of about 50%. Modifications of the reformer and of heat utilization can increase the heat flux up to 40 kW/m^2, so that the thermal efficiency can reach 78%. Nevertheless, the application of nuclear energy to hydrogen production with the steam reforming process is able to reduce the burning of fossil fuels, which has implications for a potential decrease in the rate of CO2 emissions into the environment. (author)

  12. Efficient Separations and Processing Crosscutting Program. Technology summary

    International Nuclear Information System (INIS)

    1995-06-01

    The Efficient Separations and Processing (ESP) Crosscutting Program was created in 1991 to identify, develop, and perfect separations technologies and processes to treat wastes and address environmental problems throughout the DOE Complex. The ESP funds several multi-year tasks that address high-priority waste remediation problems involving high-level, low-level, transuranic, hazardous, and mixed (radioactive and hazardous) wastes. The ESP supports applied research and development (R and D) leading to demonstration or use of these separations technologies by other organizations within DOE-EM. Treating essentially all DOE defense wastes requires separation methods that concentrate the contaminants and/or purify waste streams for release to the environment or for downgrading to a waste form less difficult and expensive to dispose of. Initially, ESP R and D efforts focused on treatment of high-level waste (HLW) from underground storage tanks (USTs) because of the potential for large reductions in disposal costs and hazards. As further separations needs emerge and as waste management and environmental restoration priorities change, the program has evolved to encompass the breadth of waste management and environmental remediation problems

  13. Reducing Bottlenecks to Improve the Efficiency of the Lung Cancer Care Delivery Process: A Process Engineering Modeling Approach to Patient-Centered Care.

    Science.gov (United States)

    Ju, Feng; Lee, Hyo Kyung; Yu, Xinhua; Faris, Nicholas R; Rugless, Fedoria; Jiang, Shan; Li, Jingshan; Osarogiagbon, Raymond U

    2017-12-01

    The process of lung cancer care from initial lesion detection to treatment is complex, involving multiple steps, each introducing the potential for substantial delays. Identifying the steps with the greatest delays enables a focused effort to improve the timeliness of care delivery without sacrificing quality. We retrospectively reviewed clinical events from initial detection, through histologic diagnosis, radiologic and invasive staging, and medical clearance, to surgery for all patients who had an attempted resection of a suspected lung cancer in a community healthcare system. We used a computer process modeling approach to evaluate delays in care delivery, in order to identify potential 'bottlenecks' in waiting time, the reduction of which could produce greater care efficiency. We also conducted 'what-if' analyses to predict the relative impact of simulated changes in the care delivery process and to determine the most efficient pathways to surgery. The waiting time between radiologic lesion detection and diagnostic biopsy, and the waiting time from radiologic staging to surgery, were the two most critical bottlenecks impeding efficient care delivery (more than 3 times larger compared to reducing other waiting times). Additionally, instituting surgical consultation prior to cardiac consultation for medical clearance, and decreasing the waiting time between CT scans and diagnostic biopsies, were potentially the most impactful measures to reduce care delays before surgery. Rigorous computer simulation modeling, using clinical data, can provide useful information for identifying areas where process engineering can improve the efficiency of care delivery for patients who receive surgery for lung cancer.
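
    In outline, the 'what-if' analysis described above is a discrete-event simulation of the diagnostic pathway. The sketch below is a generic illustration built with the simpy library: the stages follow the abstract, but every arrival rate, waiting-time distribution, and capacity is invented for the example, and only one resource (the biopsy service) is modeled as a queue.

    # Illustrative discrete-event "what-if" on a lesion-to-surgery pathway (not the study's model).
    import random
    import simpy

    def patient(env, biopsy_service, times):
        start = env.now
        yield env.timeout(random.expovariate(1 / 7.0))      # detection to biopsy referral (days)
        with biopsy_service.request() as slot:              # queue for a biopsy slot
            yield slot
            yield env.timeout(random.expovariate(1 / 1.5))  # biopsy and pathology turnaround
        yield env.timeout(random.expovariate(1 / 10.0))     # radiologic and invasive staging
        yield env.timeout(random.expovariate(1 / 14.0))     # clearance and surgery scheduling
        times.append(env.now - start)

    def arrivals(env, biopsy_service, times, mean_interarrival=1.0):
        while True:
            yield env.timeout(random.expovariate(1 / mean_interarrival))
            env.process(patient(env, biopsy_service, times))

    def mean_time_to_surgery(biopsy_capacity, horizon_days=2000, seed=42):
        random.seed(seed)
        env = simpy.Environment()
        service = simpy.Resource(env, capacity=biopsy_capacity)
        times = []
        env.process(arrivals(env, service, times))
        env.run(until=horizon_days)
        return sum(times) / len(times)

    if __name__ == "__main__":
        # "What-if": add biopsy capacity and compare the mean time from detection to surgery.
        print("baseline capacity 2:", round(mean_time_to_surgery(2), 1), "days")
        print("expanded capacity 4:", round(mean_time_to_surgery(4), 1), "days")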

  14. Ownership and technical efficiency of hospitals: evidence from Ghana using data envelopment analysis.

    Science.gov (United States)

    Jehu-Appiah, Caroline; Sekidde, Serufusa; Adjuik, Martin; Akazili, James; Almeida, Selassi D; Nyonator, Frank; Baltussen, Rob; Asbu, Eyob Zere; Kirigia, Joses Muthuri

    2014-04-08

    In order to measure and analyse the technical efficiency of district hospitals in Ghana, the specific objectives of this study were to (a) estimate the relative technical and scale efficiency of government, mission, private and quasi-government district hospitals in Ghana in 2005; (b) estimate the magnitudes of output increases and/or input reductions that would have been required to make relatively inefficient hospitals more efficient; and (c) use Tobit regression analysis to estimate the impact of ownership on hospital efficiency. In the first stage, we used data envelopment analysis (DEA) to estimate the efficiency of 128 hospitals comprising 73 government hospitals, 42 mission hospitals, 7 quasi-government hospitals and 6 private hospitals. In the second stage, the estimated DEA efficiency scores were regressed against the hospital ownership variable using a Tobit model. This was a retrospective study. In our DEA analysis, using the variable returns to scale model, out of 128 district hospitals, 31 (24.0%) were 100% efficient, 25 (19.5%) were very close to being efficient with efficiency scores ranging from 70% to 99.9%, and 71 (56.2%) had efficiency scores below 50%. The lowest-performing hospitals had efficiency scores ranging from 21% to 30%. Quasi-government hospitals had the highest mean efficiency score (83.9%), followed by public hospitals (70.4%), mission hospitals (68.6%) and private hospitals (55.8%). However, public hospitals also included those with the lowest technical efficiency scores (27.4%), implying they have some of the most inefficient hospitals. Regarding regional performance, Northern region hospitals had the highest mean efficiency score (83.0%) and Volta Region hospitals had the lowest mean score (43.0%). From our Tobit regression, we found that while quasi-government ownership is positively associated with hospital technical efficiency, private ownership negatively affects hospital efficiency. It would be prudent for policy-makers to examine the
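
    The second-stage regression can be illustrated with a generic right-censored Tobit fitted by maximum likelihood (DEA scores are capped at 1). The sketch below uses simulated scores and two hypothetical ownership dummies; it is not the study's data, and the likelihood is written from the standard Tobit formula rather than taken from a dedicated package.

    # Generic right-censored Tobit for a second-stage regression of DEA scores (simulated data).
    import numpy as np
    from scipy import optimize, stats

    def tobit_negloglik(params, X, y, upper=1.0):
        beta, sigma = params[:-1], np.exp(params[-1])
        xb = X @ beta
        cens = y >= upper
        ll = np.empty_like(y)
        ll[~cens] = stats.norm.logpdf((y[~cens] - xb[~cens]) / sigma) - np.log(sigma)
        ll[cens] = stats.norm.logsf((upper - xb[cens]) / sigma)   # P(latent score >= upper)
        return -ll.sum()

    def fit_tobit(X, y):
        res = optimize.minimize(tobit_negloglik, np.zeros(X.shape[1] + 1),
                                args=(X, y), method="BFGS")
        return res.x[:-1], np.exp(res.x[-1])

    if __name__ == "__main__":
        rng = np.random.default_rng(3)
        n = 128
        own = rng.integers(0, 2, size=(n, 2))      # hypothetical dummies: quasi-government, private
        X = np.column_stack([np.ones(n), own])
        latent = 0.65 + 0.15 * own[:, 0] - 0.10 * own[:, 1] + 0.15 * rng.standard_normal(n)
        y = np.minimum(latent, 1.0)                # DEA efficiency scores are censored at 1
        beta, sigma = fit_tobit(X, y)
        print("intercept and ownership effects:", np.round(beta, 3), " sigma:", round(sigma, 3))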

  15. Assessment of academic departments efficiency using data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Salah R. Agha

    2011-07-01

    Full Text Available Purpose: In this age of knowledge economy, universities play an important role in the development of a country. As government subsidies to universities have been decreasing, more efficient use of resources becomes important for university administrators. This study evaluates the relative technical efficiencies of academic departments at the Islamic University in Gaza (IUG) during the years 2004-2006. Design/methodology/approach: This study applies Data Envelopment Analysis (DEA) to assess the relative technical efficiency of the academic departments. The inputs are operating expenses, credit hours and training resources, while the outputs are number of graduates, promotions and public service activities. The potential improvements and super efficiency are computed for inefficient and efficient departments respectively. Further, multiple linear regression is used to develop a relationship between super efficiency and input and output variables. Findings: Results show that the average efficiency score is 68.5% and that there are 10 efficient departments out of the 30 studied. It is noted that departments in the faculty of science, engineering and information technology have to greatly reduce their laboratory expenses. The department of economics and finance was found to have the highest super efficiency score among the efficient departments. Finally, it was found that promotions have the greatest contribution to the super efficiency scores while public services activities come next. Research limitations/implications: The paper focuses only on academic departments at a single university. Further, DEA is deterministic in nature. Practical implications: The findings offer insights on the inputs and outputs that significantly contribute to efficiencies so that inefficient departments can focus on these factors. Originality/value: Prior studies have used only one type of DEA (BCC) and they did not explicitly answer the question posed by the inefficient

  16. Efficient Transfer Entropy Analysis of Non-Stationary Neural Time Series

    Science.gov (United States)

    Vicente, Raul; Díaz-Pernas, Francisco J.; Wibral, Michael

    2014-01-01

    Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage and modification. Especially the measure of information transfer, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy from two processes requires the observation of multiple realizations of these processes to estimate associated probability density functions. To obtain these necessary observations, available estimators typically assume stationarity of processes to allow pooling of observations over time. This assumption, however, is a major obstacle to the application of these estimators in neuroscience, as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues theoretically showed that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble of realizations is often readily available in neuroscience experiments in the form of experimental trials. Thus, in this work we combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that is suitable for the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation for a graphics processing unit to handle the most computationally demanding aspects of the ensemble method for transfer entropy estimation. We test the performance and robustness of our implementation on data from numerical simulations of stochastic processes. We also demonstrate the applicability of the ensemble method to magnetoencephalographic data. While we mainly evaluate the proposed method for neuroscience data, we expect it to be applicable in a variety of fields that are concerned with the analysis of information transfer in complex biological, social, and
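
    The ensemble idea, estimating the densities across trials at a fixed time index instead of pooling over time, can be shown with a toy plug-in (histogram) estimator. The sketch below is only an illustration of that idea; it is not the nearest-neighbour/GPU estimator used in the paper, and the quantile binning and one-sample embedding of each process are simplifying assumptions.

    # Toy ensemble transfer-entropy estimator: TE(X -> Y) at time t, estimated across trials
    # with quantile binning. Not the estimator used in the paper.
    import numpy as np

    def discretize(a, bins=4):
        edges = np.quantile(a, np.linspace(0, 1, bins + 1)[1:-1])
        return np.digitize(a, edges)

    def transfer_entropy_ensemble(x, y, t, bins=4):
        """x, y: (n_trials, n_samples). TE from (x_t, y_t) to y_{t+1}, in bits."""
        xs, ys, yf = (discretize(v, bins) for v in (x[:, t], y[:, t], y[:, t + 1]))
        te = 0.0
        for a in range(bins):
            for b in range(bins):
                for c in range(bins):
                    p_abc = np.mean((yf == a) & (ys == b) & (xs == c))
                    if p_abc == 0:
                        continue
                    p_bc = np.mean((ys == b) & (xs == c))
                    p_ab = np.mean((yf == a) & (ys == b))
                    p_b = np.mean(ys == b)
                    te += p_abc * np.log2((p_abc / p_bc) / (p_ab / p_b))
        return te

    if __name__ == "__main__":
        rng = np.random.default_rng(7)
        trials, samples = 500, 50
        x = rng.standard_normal((trials, samples))
        y = np.zeros_like(x)
        y[:, 1:] = 0.8 * x[:, :-1] + 0.2 * rng.standard_normal((trials, samples - 1))
        print("TE(x -> y) ≈", round(transfer_entropy_ensemble(x, y, t=20), 3), "bits")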

  17. Process integration and pinch analysis in sugarcane industry

    Energy Technology Data Exchange (ETDEWEB)

    Prado, Adelk de Carvalho; Pinheiro, Ricardo Brant [UFMG, Departamento de Engenharia Nuclear, Programa de Pos-Graduacao em Ciencias e Tecnicas Nucleares, Belo Horizonte, MG (Brazil)], E-mail: rbp@nuclear.ufmg.br

    2010-07-01

    Process integration techniques, particularly the Pinch Analysis method, were applied to the sugarcane industry. Research was performed on harvest data from an agroindustrial complex that processes in excess of 3.5 million metric tons of sugarcane per year, producing motor fuel grade ethanol and standard quality sugar, and delivering excess electric power to the grid. Pinch Analysis was used to assess internal heat recovery and external utility demand targets, while keeping the lowest economically achievable targets for entropy increase. Energy-use efficiency was evaluated for the plant as found (the base case) as well as for five selected process and/or plant design modifications, always with guidance from the method. The first alternative design (case 2) was proposed to evaluate equipment mean idle time in the base case, to support subsequent comparisons. Cases 3 and 4 were used to estimate the upper limits of combined heat and power generation while keeping the raw material supply of the base case; neither case proved worth implementing. Cases 5 and 6 were devised to deal with the bottleneck of the plant, namely boiler capacity, in order to allow for some production increment. The inexpensive, minor modifications considered in case 5 were found unable to produce a reasonable gain. Nevertheless, proper changes in the cane juice evaporation section (case 6) could allow combined sugar and ethanol production to rise by up to 9.1% relative to the base case, without reducing cogenerated power. (author)
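
    The utility-targeting side of Pinch Analysis reduces to the problem-table (heat cascade) algorithm, sketched below with generic placeholder streams rather than the sugarcane plant's data; the ΔTmin value and the stream table are assumptions for the example.

    # Minimal problem-table (heat cascade) sketch for pinch targeting.
    # Streams are (supply T [C], target T [C], CP [kW/K]); values are placeholders.
    DT_MIN = 10.0
    HOT = [(250.0, 40.0, 0.15), (200.0, 80.0, 0.25)]      # streams to be cooled
    COLD = [(20.0, 180.0, 0.20), (140.0, 230.0, 0.30)]    # streams to be heated

    def pinch_targets(hot, cold, dt_min=DT_MIN):
        # Shift hot streams down and cold streams up by dt_min / 2.
        shifted = [(ts - dt_min / 2, tt - dt_min / 2, cp, -1) for ts, tt, cp in hot] + \
                  [(ts + dt_min / 2, tt + dt_min / 2, cp, +1) for ts, tt, cp in cold]
        bounds = sorted({t for ts, tt, _, _ in shifted for t in (ts, tt)}, reverse=True)
        deficits = []
        for hi, lo in zip(bounds, bounds[1:]):
            net_cp = sum(sign * cp for ts, tt, cp, sign in shifted
                         if min(ts, tt) <= lo and max(ts, tt) >= hi)   # cold demand minus hot supply
            deficits.append(net_cp * (hi - lo))
        cumulative, worst = 0.0, 0.0
        for d in deficits:                 # cascade the surpluses from the top interval down
            cumulative -= d
            worst = min(worst, cumulative)
        q_hot = -worst                     # minimum hot utility closes the largest shortfall
        q_cold = q_hot - sum(deficits)     # energy balance gives the minimum cold utility
        return q_hot, q_cold

    if __name__ == "__main__":
        q_hot, q_cold = pinch_targets(HOT, COLD)
        print(f"minimum hot utility {q_hot:.1f} kW, minimum cold utility {q_cold:.1f} kW")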

  18. Analysis of the efficiency-integration nexus of Japanese stock market

    Science.gov (United States)

    Rizvi, Syed Aun R.; Arshad, Shaista

    2017-03-01

    This paper attempts a novel approach to analysing the Japanese economy through a dual-dimension analysis of its stock market, examining efficiency and market integration. Covering a period of 24 years, this study employs MFDFA and MGARCH to understand how the efficiency and integration of the stock market fared during different business cycle phases of the Japanese economy. The results showed improving efficiency over the time period. For the case of market integration, our findings conform to recent literature on business cycles and stock market integration in that every succeeding recession creates a break in integration levels, resulting in a decrease.

  19. Multilevel Control System of Regional Power Consumption: Analysis of Infrastructure Elements Interconnections, Efficiency Evaluation

    Directory of Open Access Journals (Sweden)

    Marina Nikolaevna Myznikova

    2016-10-01

    Full Text Available Fundamental strategic programs in the spheres of power and economics, adopted at various levels of management, are not always capable of solving the problem of power efficiency. Changes in the systemic connections between economic and power-sector elements are one of the basic problems of management at the regional level. The development of market relations has caused the growth of uncertainty factors at all levels of power consumption management. Insufficient estimation of systemic infrastructural interrelations, together with the individualization and localization of the organizational-economic relations of economic subjects, has generated intersystem conflict in the distribution of power resources and has aggravated the problem of estimating power consumption efficiency at the system level. The restriction on applying traditional management methods, based on the principles of technological efficiency of the processes of energy production and consumption, is connected with information gaps caused by the growth of uncertainty factors and the inconsistency of efficiency criteria. The application of modern methods of power consumption forecasting has a number of essential restrictions. At the present stage, the management of power consumption in multilevel systems is aimed at realising system integrity and the economic coordination of the elements of production, transfer and consumption at the regional level, and demands the development of new effective management methods based on the analysis of system interrelations. The identification of system interrelations depends on the features of development of the electric power sector, the active and passive elements of the consumption structure, and the power balance. The analysis and estimation of the interrelations between the power and economic spheres allow the methodology of power consumption management at the regional level to be improved under conditions of uncertainty.

  20. Analysis Platform for Energy Efficiency Enhancement in Hybrid and Full Electric Vehicles

    Directory of Open Access Journals (Sweden)

    NICOLAICA, M.-O.

    2016-02-01

    Full Text Available The current paper presents a new virtual analysis method that is applied to both hybrid and electric vehicle architectures with the purpose of contributing to the improvement of energy efficiency. The study is based on Matlab modeling and simulation. A set of parameters is considered in order to assess the system performance. The benefit lies in the comparative overview obtained after the completed analysis. The effectiveness of the analysis method is confirmed by a sequence of simulation results combined in several case studies. The impetus for the research is that the automotive market is focusing on wider simulation techniques and better control strategies that lead to more efficient vehicles. Applying the proposed method during design would improve the battery management and control strategy. The advantage of this method is that the system behavior with regard to energy efficiency can be evaluated from an early concept phase. The results contribute to the actual necessity of driving more efficient and more environmentally friendly vehicles.

  1. Process energy analysis

    International Nuclear Information System (INIS)

    Kaiser, V.

    1993-01-01

    In Chapter 2, process energy cost analysis for chemical processing is treated in a general way, independent of the specific form of energy and power production. In particular, energy data collection and treatment, energy accounting (metering, balance setting), specific energy input, and utility energy costs and prices are discussed. (R.P.) 14 refs., 4 figs., 16 tabs

  2. Thermodynamic and energy efficiency analysis of power generation from natural salinity gradients by pressure retarded osmosis.

    Science.gov (United States)

    Yip, Ngai Yin; Elimelech, Menachem

    2012-05-01

    The Gibbs free energy of mixing dissipated when fresh river water flows into the sea can be harnessed for sustainable power generation. Pressure retarded osmosis (PRO) is one of the methods proposed to generate power from natural salinity gradients. In this study, we carry out a thermodynamic and energy efficiency analysis of PRO work extraction. First, we present a reversible thermodynamic model for PRO and verify that the theoretical maximum extractable work in a reversible PRO process is identical to the Gibbs free energy of mixing. Work extraction in an irreversible constant-pressure PRO process is then examined. We derive an expression for the maximum extractable work in a constant-pressure PRO process and show that it is less than the ideal work (i.e., Gibbs free energy of mixing) due to inefficiencies intrinsic to the process. These inherent inefficiencies are attributed to (i) frictional losses required to overcome hydraulic resistance and drive water permeation and (ii) unutilized energy due to the discontinuation of water permeation when the osmotic pressure difference becomes equal to the applied hydraulic pressure. The highest extractable work in constant-pressure PRO with a seawater draw solution and river water feed solution is 0.75 kWh/m^3 while the free energy of mixing is 0.81 kWh/m^3, a thermodynamic extraction efficiency of 91.1%. Our analysis further reveals that the operational objective to achieve high power density in a practical PRO process is inconsistent with the goal of maximum energy extraction. This study demonstrates thermodynamic and energetic approaches for PRO and offers insights on actual energy accessible for utilization in PRO power generation through salinity gradients. © 2012 American Chemical Society
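
    For orientation, the osmotic driving force and the classical constant-pressure power-density expression can be sketched with an idealized calculation (van't Hoff osmotic pressures, W = A(Δπ - ΔP)ΔP, no concentration polarization). The membrane permeability and solution concentrations below are assumed values; this is not the thermodynamic model derived in the paper.

    # Idealized PRO sketch: van't Hoff osmotic pressures and W = A * (d_pi - dP) * dP.
    # Permeability and concentrations are assumptions; concentration polarization is ignored.
    import numpy as np

    R_L_BAR = 0.083145                      # gas constant, L·bar/(mol·K)

    def osmotic_pressure_bar(molarity, temp_k=298.15, ions=2):
        return ions * molarity * R_L_BAR * temp_k

    def power_density_w_m2(dp_bar, d_pi_bar, A_lmh_per_bar=2.0):
        flux_lmh = A_lmh_per_bar * (d_pi_bar - dp_bar)      # water flux, L m^-2 h^-1
        return flux_lmh * dp_bar * 100.0 / 3600.0           # 1 L·bar = 100 J

    if __name__ == "__main__":
        d_pi = osmotic_pressure_bar(0.6) - osmotic_pressure_bar(0.01)   # seawater vs river water
        dps = np.linspace(0.0, d_pi, 200)
        w = power_density_w_m2(dps, d_pi)
        print(f"delta pi ≈ {d_pi:.1f} bar; peak power density {w.max():.1f} W/m^2 "
              f"at dP ≈ {dps[np.argmax(w)]:.1f} bar (≈ delta pi / 2)")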

  3. Development of system analysis code for pyrochemical process using molten salt electrorefining

    International Nuclear Information System (INIS)

    Tozawa, K.; Matsumoto, T.; Kakehi, I.

    2000-04-01

    This report describes the development of a cathode processor calculation code to simulate the mass and heat transfer phenomena of the distillation process, and the development of an analytical model for the cooling behavior of the pyrochemical process cell, both on personal computers. The pyrochemical process using molten salt electrorefining would introduce new technologies for particle oxide, particle nitride and metallic fuels. The cathode processor calculation code with the distillation process was developed. A code validation calculation has been conducted on the basis of the benchmark problem for natural convection in a square cavity. Results obtained with the present code agreed well with the benchmark solution published in a paper for the velocity-temperature fields, the maximum velocity and its location. Functions have been added to improve the realism of the simulation and the efficiency of use. A test run has been conducted using the modified code for an axisymmetric enclosed vessel simulating a cathode processor, and the capability of the code to simulate the distillation process has been confirmed. An analytical model for the cooling behavior of the pyrochemical process cell was developed. The analytical model was selected by comparing a benchmark analysis with a detailed analysis on an engineering workstation. Flow and temperature distributions were confirmed by the results of the steady-state analysis. In the transient cooling analysis, an initial transient peak of temperature occurred at the balanced heat condition of the steady-state analysis. The final gas temperature distribution depended on the gas circulation flow under transient conditions. Thus the final gas temperature distributions differed from those obtained in the steady-state analysis. This phenomenon has a potential for its own metastable condition. Therefore it was necessary to design the gas cooling flow pattern without cooling gas circulation

  4. Toward High-Efficiency Solution-Processed Planar Heterojunction Sb2S3 Solar Cells.

    Science.gov (United States)

    Zimmermann, Eugen; Pfadler, Thomas; Kalb, Julian; Dorman, James A; Sommer, Daniel; Hahn, Giso; Weickert, Jonas; Schmidt-Mende, Lukas

    2015-05-01

    Low-cost hybrid solar cells have made tremendous steps forward during the past decade owing to the implementation of extremely thin inorganic coatings as absorber layers, typically in combination with organic hole transporters. Using only extremely thin films of these absorbers reduces the requirement for single-crystalline high-quality materials and paves the way for low-cost solution processing compatible with roll-to-roll fabrication processes. To date, the most efficient absorber material, except for the recently introduced organic-inorganic lead halide perovskites, has been Sb2S3, which can be implemented in hybrid photovoltaics using a simple chemical bath deposition. Current high-efficiency Sb2S3 devices utilize absorber coatings on nanostructured TiO2 electrodes in combination with polymeric hole transporters. This geometry has so far been the state of the art, even though flat junction devices would be conceptually simpler with the additional potential of higher open circuit voltages due to reduced charge carrier recombination. Besides, the role of the hole transporter is not completely clarified yet. In particular, additional photocurrent contribution from the polymers has not been directly shown, which points toward detrimental parasitic light absorption in the polymers. This study presents a fine-tuned chemical bath deposition method that allows fabricating solution-processed low-cost flat junction Sb2S3 solar cells with the highest open circuit voltage reported so far for chemical bath devices and efficiencies exceeding 4%. Characterization of back-illuminated solar cells in combination with transfer matrix-based simulations further allows us to address the issue of absorption losses in the hole transport material and outline a pathway toward more efficient future devices.

  5. A novel vortex tube-based N2-expander liquefaction process for enhancing the energy efficiency of natural gas liquefaction

    Directory of Open Access Journals (Sweden)

    Qyyum Muhammad Abdul

    2017-01-01

    Full Text Available This research work unfolds a simple, safe, and environment-friendly energy efficient novel vortex tube-based natural gas liquefaction process (LNG). A vortex tube was introduced to the popular N2-expander liquefaction process to enhance the liquefaction efficiency. The process structure and condition were modified and optimized to take a potential advantage of the vortex tube on the natural gas liquefaction cycle. Two commercial simulators ANSYS® and Aspen HYSYS® were used to investigate the application of vortex tube in the refrigeration cycle of LNG process. The Computational fluid dynamics (CFD) model was used to simulate the vortex tube with nitrogen (N2) as a working fluid. Subsequently, the results of the CFD model were embedded in the Aspen HYSYS® to validate the proposed LNG liquefaction process. The proposed natural gas liquefaction process was optimized using the knowledge-based optimization (KBO) approach. The overall energy consumption was chosen as an objective function for optimization. The performance of the proposed liquefaction process was compared with the conventional N2-expander liquefaction process. The vortex tube-based LNG process showed a significant improvement of energy efficiency by 20% in comparison with the conventional N2-expander liquefaction process. This high energy efficiency was mainly due to the isentropic expansion of the vortex tube. It turned out that the high energy efficiency of vortex tube-based process is totally dependent on the refrigerant cold fraction, operating conditions as well as refrigerant cycle configurations.

  6. A novel vortex tube-based N2-expander liquefaction process for enhancing the energy efficiency of natural gas liquefaction

    Science.gov (United States)

    Qyyum, Muhammad Abdul; Wei, Feng; Hussain, Arif; Ali, Wahid; Sehee, Oh; Lee, Moonyong

    2017-11-01

    This research work unfolds a simple, safe, and environment-friendly energy efficient novel vortex tube-based natural gas liquefaction process (LNG). A vortex tube was introduced to the popular N2-expander liquefaction process to enhance the liquefaction efficiency. The process structure and condition were modified and optimized to take a potential advantage of the vortex tube on the natural gas liquefaction cycle. Two commercial simulators ANSYS® and Aspen HYSYS® were used to investigate the application of vortex tube in the refrigeration cycle of LNG process. The Computational fluid dynamics (CFD) model was used to simulate the vortex tube with nitrogen (N2) as a working fluid. Subsequently, the results of the CFD model were embedded in the Aspen HYSYS® to validate the proposed LNG liquefaction process. The proposed natural gas liquefaction process was optimized using the knowledge-based optimization (KBO) approach. The overall energy consumption was chosen as an objective function for optimization. The performance of the proposed liquefaction process was compared with the conventional N2-expander liquefaction process. The vortex tube-based LNG process showed a significant improvement of energy efficiency by 20% in comparison with the conventional N2-expander liquefaction process. This high energy efficiency was mainly due to the isentropic expansion of the vortex tube. It turned out that the high energy efficiency of vortex tube-based process is totally dependent on the refrigerant cold fraction, operating conditions as well as refrigerant cycle configurations.

  7. Methodology for systematic analysis and improvement of manufacturing unit process life-cycle inventory (UPLCI)—CO2PE! initiative (cooperative effort on process emissions in manufacturing). Part 1: Methodology description

    DEFF Research Database (Denmark)

    Kellens, Karel; Dewulf, Wim; Overcash, Michael

    2012-01-01

    This report proposes a life-cycle analysis (LCA)-oriented methodology for systematic inventory analysis of the use phase of manufacturing unit processes, providing unit process datasets to be used in life-cycle inventory (LCI) databases and libraries. The methodology has been developed ... and resource efficiency improvements of the manufacturing unit process. To ensure optimal reproducibility and applicability, documentation guidelines for data and metadata are included in both approaches. Guidance on definition of functional unit and reference flow as well as on determination of system ... the provision of high-quality data for LCA studies of products using these unit process datasets for the manufacturing processes, as well as the in-depth analysis of individual manufacturing unit processes. In addition, the accruing availability of data for a range of similar machines (same process, different ...)

  8. Production yield analysis in food processing. Applications in the French-fries and the poultry-processing industries

    NARCIS (Netherlands)

    Somsen, D.J.

    2004-01-01

    Food processors face increasing demands to improve their raw material yield efficiency. To really understand the raw material yield efficiency of food processing, mass losses need to be divided in wanted (desired) and unwanted ones. The basic approach to increase the raw material yield efficiency is

  9. Spatial econometric analysis of factors influencing regional energy efficiency in China.

    Science.gov (United States)

    Song, Malin; Chen, Yu; An, Qingxian

    2018-05-01

    Increased environmental pollution and energy consumption caused by the country's rapid development have raised considerable public concern and have become a focus for the government and the public. This study employs the super-efficiency slack-based model-data envelopment analysis (SBM-DEA) to measure the total factor energy efficiency of 30 provinces in China. The estimation model for the spatial interaction intensity of regional total factor energy efficiency is based on Wilson's maximum entropy model. The model is used to analyze the factors that affect the potential value of total factor energy efficiency using spatial dynamic panel data for 30 provinces during 2000-2014. The study found that there are differences and spatial correlations in energy efficiency among provinces and regions in China. The energy efficiency of the eastern, central, and western regions fluctuated significantly, mainly because of the influence of industrial structure, energy intensity, and technological progress on energy efficiency. This research is of great significance to China's energy efficiency and regional coordinated development.

  10. Bayesian sensitivity analysis of a 1D vascular model with Gaussian process emulators.

    Science.gov (United States)

    Melis, Alessandro; Clayton, Richard H; Marzo, Alberto

    2017-12-01

    One-dimensional models of the cardiovascular system can capture the physics of pulse waves but involve many parameters. Since these may vary among individuals, patient-specific models are difficult to construct. Sensitivity analysis can be used to rank model parameters by their effect on outputs and to quantify how uncertainty in parameters influences output uncertainty. This type of analysis is often conducted with a Monte Carlo method, where large numbers of model runs are used to assess input-output relations. The aim of this study was to demonstrate the computational efficiency of variance-based sensitivity analysis of 1D vascular models using Gaussian process emulators, compared to a standard Monte Carlo approach. The methodology was tested on four vascular networks of increasing complexity to analyse its scalability. The computational time needed to perform the sensitivity analysis with an emulator was reduced by 99.96% compared to a Monte Carlo approach. Despite the reduced computational time, sensitivity indices obtained using the two approaches were comparable. The scalability study showed that the number of mechanistic simulations needed to train a Gaussian process for sensitivity analysis was of the order O(d), rather than the O(d × 10³) needed for Monte Carlo analysis (where d is the number of parameters in the model). The efficiency of this approach, combined with capacity to estimate the impact of uncertain parameters on model outputs, will enable development of patient-specific models of the vascular system, and has the potential to produce results with clinical relevance. © 2017 The Authors International Journal for Numerical Methods in Biomedical Engineering Published by John Wiley & Sons Ltd.
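
    To make the emulator-accelerated workflow concrete, the following sketch (Python; assumes scikit-learn and SALib are installed, and expensive_model is a hypothetical stand-in for the 1D vascular solver) trains a Gaussian process on a small number of mechanistic runs and then evaluates the Sobol' indices on the cheap emulator instead of the solver:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel
        from SALib.sample import saltelli
        from SALib.analyze import sobol

        def expensive_model(x):
            # Hypothetical stand-in for one run of the mechanistic 1D model.
            return np.sin(x[0]) + 0.5 * x[1] ** 2 + 0.1 * x[0] * x[2]

        d = 3
        problem = {"num_vars": d,
                   "names": [f"p{i}" for i in range(d)],
                   "bounds": [[0.0, 1.0]] * d}

        # Train the emulator on a small, O(d) number of expensive runs.
        rng = np.random.default_rng(0)
        X_train = rng.uniform(0.0, 1.0, size=(10 * d, d))
        y_train = np.array([expensive_model(x) for x in X_train])
        gp = GaussianProcessRegressor(ConstantKernel() * RBF(length_scale=[0.2] * d),
                                      normalize_y=True).fit(X_train, y_train)

        # The large Saltelli sample is evaluated on the emulator only.
        X_sobol = saltelli.sample(problem, 1024)
        Si = sobol.analyze(problem, gp.predict(X_sobol))
        print(Si["S1"], Si["ST"])  # first-order and total-order sensitivity indices

    Only the training set touches the mechanistic model, which mirrors the O(d) scaling of solver calls reported above.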

  11. Efficient processing of MPEG-21 metadata in the binary domain

    Science.gov (United States)

    Timmerer, Christian; Frank, Thomas; Hellwagner, Hermann; Heuer, Jörg; Hutter, Andreas

    2005-10-01

    XML-based metadata is widely adopted across the different communities and plenty of commercial and open-source tools for processing and transforming it are available on the market. However, all of these tools have one thing in common: they operate on plain text encoded metadata, which may become a burden in constrained and streaming environments, i.e., when metadata needs to be processed together with multimedia content on the fly. In this paper we present an efficient approach for transforming such metadata, which is encoded using MPEG's Binary Format for Metadata (BiM), without additional en-/decoding overheads, i.e., within the binary domain. To this end, we have developed an event-based push parser for BiM-encoded metadata which transforms the metadata with a limited set of processing instructions - based on traditional XML transformation techniques - operating on bit patterns instead of cost-intensive string comparisons.

  12. Simple solution-processed CuOX as anode buffer layer for efficient organic solar cells

    International Nuclear Information System (INIS)

    Shen, Wenfei; Yang, Chunpeng; Bao, Xichang; Sun, Liang; Wang, Ning; Tang, Jianguo; Chen, Weichao; Yang, Renqiang

    2015-01-01

    Graphical abstract: - Highlights: • Simple solution-processed CuOX hole transport layer for efficient organic solar cells. • Good photovoltaic performances as hole transport layer in OSCs with P3HT and PBDTTT-C as donor materials. • The device with CuOX as the hole transport layer shows greatly improved stability compared with the device with PEDOT:PSS as the hole transport layer. - Abstract: A simple, solution-processed ultrathin CuOX anode buffer layer was fabricated for high performance organic solar cells (OSCs). XPS measurements demonstrated that the CuOX was a composite of CuO and Cu2O. The CuOX-modified ITO glass exhibits better surface contact with the active layer. The photovoltaic performance of the devices with a CuOX layer was optimized by varying the thickness of the CuOX films through changing the solution concentration. With P3HT:PC61BM as the active layer, we demonstrated an enhanced PCE of 4.14% with the CuOX anode buffer layer, compared with that of a PEDOT:PSS layer. The CuOX layer also exhibits efficient photovoltaic performance in devices with PBDTTT-C:PC71BM as the active layer. The long-term stability of the CuOX device is better than that of the PEDOT:PSS device. The results indicate that the easily solution-processed CuOX film can act as an efficient anode buffer layer for high-efficiency OSCs.

  13. Measuring Eco-efficiency of Production with Data Envelopment Analysis

    NARCIS (Netherlands)

    Kuosmanen, T.K.; Kortelainen, M.

    2005-01-01

    Aggregation of environmental pressures into a single environmental damage index is a major challenge of eco-efficiency measurement. This article examines how the data envelopment analysis (DEA) method can be adapted for this purpose. DEA accounts for substitution possibilities between different

  14. An analysis of factors that influence the technical efficiency of Malaysian thermal power plants

    International Nuclear Information System (INIS)

    See, Kok Fong; Coelli, Tim

    2012-01-01

    The main objectives of this paper are to measure the technical efficiency levels of Malaysian thermal power plants and to investigate the degree to which various factors influence efficiency levels in these plants. Stochastic frontier analysis (SFA) methods are applied to plant-level data over an eight-year period from 1998 to 2005. This is the first comprehensive analysis (to our knowledge) of technical efficiency in the Malaysian electricity generation industry using a parametric method. Our empirical results indicate that ownership, plant size and fuel type have a significant influence on technical efficiency levels. We find that publicly-owned power plants obtain average technical efficiencies of 0.68, which is lower than that of privately-owned power plants, which achieve average technical efficiencies of 0.88. We also observe that larger power plants with more capacity and gas-fired power plants tend to be more technically efficient than other power plants. Finally, we find that plant age and peaking plant type have no statistically significant influence on the technical efficiencies of Malaysian thermal power plants. - Highlights: ► We examine the technical efficiency (TE) levels of Malaysian thermal power plants. ► We also investigate the degree to which various factors influence efficiency levels in these plants. ► Stochastic frontier analysis methods are used. ► The average plant would have to increase its TE level by 21% to reach the efficient frontier. ► Ownership, plant size and fuel type have a significant influence on the TE levels.
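
    As background for the method named above, the stochastic frontier production model is conventionally specified as follows (a generic textbook form, not necessarily the exact specification estimated in this study):

        \ln y_{it} = x_{it}'\beta + v_{it} - u_{it}, \qquad v_{it} \sim N(0, \sigma_v^2), \quad u_{it} \ge 0

    where v is symmetric statistical noise, u is a one-sided inefficiency term (for example half-normal), and technical efficiency is recovered as TE_{it} = \exp(-u_{it}); plant characteristics such as ownership, size, and fuel type can then enter the inefficiency distribution to explain differences in TE.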

  15. Photonic efficiency of the photodegradation of paracetamol in water by the photo-Fenton process.

    Science.gov (United States)

    Yamal-Turbay, E; Ortega, E; Conte, L O; Graells, M; Mansilla, H D; Alfano, O M; Pérez-Moya, M

    2015-01-01

    An experimental study of the homogeneous Fenton and photo-Fenton degradation of 4-acetamidophenol (paracetamol, PCT) is presented. For all the operation conditions evaluated, PCT degradation is efficiently attained by both Fenton and photo-Fenton processes. Also, photonic efficiencies of PCT degradation and mineralization are determined under different experimental conditions, characterizing the influence of hydrogen peroxide (H2O2) and Fe(II) on both contaminant degradation and sample mineralization. The maximum photonic degradation efficiencies for 5 and 10 mg L⁻¹ Fe(II) were 3.9 (H2O2 = 189 mg L⁻¹) and 5 (H2O2 = 378 mg L⁻¹), respectively. For higher concentrations of oxidant, H2O2 acts as a radical scavenger, competing with the pollutant for oxidizing radicals and reducing the reaction rate. Moreover, in order to quantify the consumption of the oxidizing agent, the specific consumption of the hydrogen peroxide was also evaluated. For all operating conditions of both hydrogen peroxide and Fe(II) concentration, the consumption values obtained for the Fenton process were always higher than the corresponding values observed for photo-Fenton. This implies a less efficient use of the oxidizing agent under dark conditions.

  16. Efficient computation of aerodynamic influence coefficients for aeroelastic analysis on a transputer network

    Science.gov (United States)

    Janetzke, David C.; Murthy, Durbha V.

    1991-01-01

    Aeroelastic analysis is multi-disciplinary and computationally expensive. Hence, it can greatly benefit from parallel processing. As part of an effort to develop an aeroelastic capability on a distributed memory transputer network, a parallel algorithm for the computation of aerodynamic influence coefficients is implemented on a network of 32 transputers. The aerodynamic influence coefficients are calculated using a 3-D unsteady aerodynamic model and a parallel discretization. Efficiencies up to 85 percent were demonstrated using 32 processors. The effects of subtask ordering, problem size, and network topology are presented. A comparison to results on a shared memory computer indicates that higher speedup is achieved on the distributed memory system.

  17. Knowledge based decision making method for the selection of mixed refrigerant systems for energy efficient LNG processes

    International Nuclear Information System (INIS)

    Khan, Mohd Shariq; Lee, Sanggyu; Rangaiah, G.P.; Lee, Moonyong

    2013-01-01

    Highlights: • Practical method for finding optimum refrigerant composition is proposed for LNG plant. • Knowledge of boiling point differences in refrigerant component is employed. • Implementation of process knowledge notably makes LNG process energy efficient. • Optimization of LNG plant is more transparent using process knowledge. - Abstract: Mixed refrigerant (MR) systems are used in many industrial applications because of their high energy efficiency, compact design and energy-efficient heat transfer compared to other processes operating with pure refrigerants. The performance of MR systems depends strongly on the optimum refrigerant composition, which is difficult to obtain. This paper proposes a simple and practical method for selecting the appropriate refrigerant composition, which was inspired by (i) knowledge of the boiling point difference in MR components, and (ii) their specific refrigeration effect in bringing a MR system close to reversible operation. A feasibility plot and composite curves were used for full enforcement of the approach temperature. The proposed knowledge-based optimization approach was described and applied to a single MR and a propane precooled MR system for natural gas liquefaction. Maximization of the heat exchanger exergy efficiency was considered as the optimization objective to achieve an energy efficient design goal. Several case studies on single MR and propane precooled MR processes were performed to show the effectiveness of the proposed method. The application of the proposed method is not restricted to liquefiers, and can be applied to any refrigerator and cryogenic cooler where a MR is involved

  18. Separating environmental efficiency into production and abatement efficiency. A nonparametric model with application to U.S. power plants

    Energy Technology Data Exchange (ETDEWEB)

    Hampf, Benjamin

    2011-08-15

    In this paper we present a new approach to evaluate the environmental efficiency of decision making units. We propose a model that describes a two-stage process consisting of a production and an end-of-pipe abatement stage with the environmental efficiency being determined by the efficiency of both stages. Taking the dependencies between the two stages into account, we show how nonparametric methods can be used to measure environmental efficiency and to decompose it into production and abatement efficiency. For an empirical illustration we apply our model to an analysis of U.S. power plants.

  19. IDENTIFYING AND SELECTING THE STRATEGIC PROCESS USING THE CROSS-EFFICIENCY APPROACH BASED ON SATISFACTION LEVEL AND EXTENDED BALANCED SCORECARD

    Directory of Open Access Journals (Sweden)

    Ardeshir Bazrkar

    2018-03-01

    Full Text Available The strategy is a macro-level plan, and it will only be implemented when it is defined in the form of various projects. In order to exploit the benefits of lean six sigma projects, these projects should be in line with the strategic goals of the organization. Organizations should select projects which are compatible with the organization's overall goals and fulfill the strategic requirements of the organization. The purpose of this study is to identify the strategic process among the bank facility processes for use in the lean six sigma methodology in order to improve process performance and efficiency using a combination of cross-efficiency and extended balanced scorecard methods. In the first step, the criteria for selecting the strategic process were identified using the six measures of the balanced scorecard method. In the second step, after collecting information using the cross-efficiency model based on satisfaction level, the bank facility processes are ranked based on the efficiency score. The results show that the ranking of the processes under consideration is carried out without any interference, and one of the processes (process 3) is considered as the strategic process to use in the six sigma methodology.

  20. Energy efficiency determinants: An empirical analysis of Spanish innovative firms

    International Nuclear Information System (INIS)

    Costa-Campi, María Teresa; García-Quevedo, José; Segarra, Agustí

    2015-01-01

    This paper examines the extent to which innovative Spanish firms pursue improvements in energy efficiency (EE) as an objective of innovation. The increase in energy consumption and its impact on greenhouse gas emissions justifies the greater attention being paid to energy efficiency and especially to industrial EE. The ability of manufacturing companies to innovate and improve their EE has a substantial influence on attaining objectives regarding climate change mitigation. Despite the effort to design more efficient energy policies, the EE determinants in manufacturing firms have been little studied in the empirical literature. From an exhaustive sample of Spanish manufacturing firms and using a logit model, we examine the energy efficiency determinants for those firms that have innovated. To carry out the econometric analysis, we use panel data from the Community Innovation Survey for the period 2008–2011. Our empirical results underline the role of size among the characteristics of firms that facilitate energy efficiency innovation. Regarding company behaviour, firms that consider the reduction of environmental impacts to be an important objective of innovation and that have introduced organisational innovations are more likely to innovate with the objective of increasing energy efficiency. -- Highlights: •Drivers of innovation in energy efficiency at firm-level are examined. •Tangible investments have a greater influence on energy efficiency than R&D. •Environmental and energy efficiency innovation objectives are complementary. •Organisational innovation favors energy efficiency innovation. •Public policies should be implemented to improve firms’ energy efficiency
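
    The logit model referred to above is the standard binary-choice specification, shown here in generic form (the regressors stand for this study's firm characteristics, such as size, environmental objectives, and organisational innovation):

        \Pr(\mathrm{EE}_i = 1 \mid x_i) = \Lambda(x_i'\beta) = \frac{\exp(x_i'\beta)}{1 + \exp(x_i'\beta)}

    where EE_i indicates that firm i innovates with the objective of increasing energy efficiency; marginal effects, rather than raw coefficients, are usually reported when interpreting such estimates.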

  1. Processing-Efficient Distributed Adaptive RLS Filtering for Computationally Constrained Platforms

    Directory of Open Access Journals (Sweden)

    Noor M. Khan

    2017-01-01

    Full Text Available In this paper, a novel processing-efficient architecture of a group of inexpensive and computationally constrained small platforms is proposed for a parallely distributed adaptive signal processing (PDASP) operation. The proposed architecture runs computationally expensive procedures, such as the complex adaptive recursive least squares (RLS) algorithm, cooperatively. The proposed PDASP architecture operates properly even if perfect time alignment among the participating platforms is not available. An RLS algorithm with the application of MIMO channel estimation is deployed on the proposed architecture. The complexity and processing time of the PDASP scheme with the MIMO RLS algorithm are compared with those of the sequentially operated MIMO RLS algorithm and the linear Kalman filter. It is observed that the PDASP scheme, operating in parallel, exhibits much lower computational complexity than the sequential MIMO RLS algorithm as well as the Kalman filter. Moreover, for a low Doppler rate, the proposed architecture reduces processing time by 95.83% and 82.29% compared to the sequentially operated Kalman filter and the MIMO RLS algorithm, respectively. Likewise, for a high Doppler rate, the proposed architecture reduces processing time by 94.12% and 77.28% compared to the Kalman and RLS algorithms, respectively.
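
    For reference, the classical RLS recursion mentioned above can be written in a few lines of Python; this is the textbook sequential form with a forgetting factor, not the distributed PDASP scheduling itself, and all names are illustrative:

        import numpy as np

        def rls_update(w, P, x, d, lam=0.99):
            # One recursive least squares step with forgetting factor lam.
            # w: (n, 1) weights, P: (n, n) inverse correlation matrix,
            # x: (n,) input vector, d: scalar desired response.
            x = x.reshape(-1, 1)
            Px = P @ x
            k = Px / (lam + float(x.T @ Px))   # gain vector
            e = d - float(w.T @ x)             # a priori estimation error
            w = w + k * e
            P = (P - k @ (x.T @ P)) / lam
            return w, P, e

        # Toy identification of a 4-tap channel from noisy observations.
        rng = np.random.default_rng(1)
        h = np.array([0.8, -0.4, 0.2, 0.1])
        w, P = np.zeros((4, 1)), np.eye(4) * 100.0   # weak initial prior
        for _ in range(500):
            x = rng.standard_normal(4)
            d = float(h @ x) + 0.01 * rng.standard_normal()
            w, P, _ = rls_update(w, P, x, d)
        print(w.ravel())  # converges towards h

    The per-update cost of this recursion, quadratic in the filter length, is what the PDASP architecture distributes across the cooperating platforms.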

  2. Arx: a toolset for the efficient simulation and direct synthesis of high-performance signal processing algorithms

    NARCIS (Netherlands)

    Hofstra, K.L.; Gerez, Sabih H.

    2007-01-01

    This paper addresses the efficient implementation of high-performance signal-processing algorithms. In early stages of such designs, many computation-intensive simulations may be necessary. This calls for hardware description formalisms targeted for efficient simulation (such as the programming

  3. Towards a utilisation of transient processing in the technology of high efficiency silicon solar cells

    International Nuclear Information System (INIS)

    Eichhammer, W.

    1989-01-01

    The utilization of transient processing in the technology of high-efficiency silicon solar cells is investigated. An ultraviolet laser (an ArF pulsed excimer laser working at 193 nm) is applied. Laser processing induces only a short superficial melting of the material and does not modify the transport properties in the base of the material. This mode of processing is associated with ion implantation to form the junction, as well as an oxide layer grown in an oxygen atmosphere. The volume was left entirely cold in this process. The results of the investigation show: that an entirely cold process of solar cell fabrication needs a thermal treatment at a temperature around 600 °C; that the oxides obtained are not satisfactory as passivating layers; and that the rapid thermal processing (RTP)-induced recombination centers are not directly related to the quenching step but are a consequence of the presence of metal impurities. The utilization of transient processing in the adiabatic regime (laser) and in the rapid isothermal regime (RTP) is possible as two complementary techniques for the realization of high-efficiency solar cells.

  4. Evaluating Technical Efficiency of Nursing Care Using Data Envelopment Analysis and Multilevel Modeling.

    Science.gov (United States)

    Min, Ari; Park, Chang Gi; Scott, Linda D

    2016-05-23

    Data envelopment analysis (DEA) is an advantageous non-parametric technique for evaluating relative efficiency of performance. This article describes use of DEA to estimate technical efficiency of nursing care and demonstrates the benefits of using multilevel modeling to identify characteristics of efficient facilities in the second stage of analysis. Data were drawn from LTCFocUS.org, a secondary database including nursing home data from the Online Survey Certification and Reporting System and Minimum Data Set. In this example, 2,267 non-hospital-based nursing homes were evaluated. Use of DEA with nurse staffing levels as inputs and quality of care as outputs allowed estimation of the relative technical efficiency of nursing care in these facilities. In the second stage, multilevel modeling was applied to identify organizational factors contributing to technical efficiency. Use of multilevel modeling avoided biased estimation of findings for nested data and provided comprehensive information on differences in technical efficiency among counties and states. © The Author(s) 2016.

  5. Reduced Syntactic Processing Efficiency in Older Adults During Sentence Comprehension

    Directory of Open Access Journals (Sweden)

    Zude Zhu

    2018-03-01

    Full Text Available Researchers have frequently reported an age-related decline in semantic processing during sentence comprehension. However, it remains unclear whether syntactic processing also declines or whether it remains constant as people age. In the present study, 26 younger adults and 20 older adults were recruited and matched in terms of working memory, general intelligence, verbal intelligence and fluency. They were then asked to make semantic acceptability judgments while completing a Chinese sentence reading task. The behavioral results revealed that the older adults had significantly lower accuracy on measures of semantic and syntactic processing compared to younger adults. Event-related potential (ERP) results showed that during semantic processing, older adults had a significantly reduced amplitude and delayed peak latency of the N400 compared to the younger adults. During syntactic processing, older adults also showed delayed peak latency of the P600 relative to younger adults. Moreover, while P600 amplitude was comparable between the two age groups, larger P600 amplitude was associated with worse performance only in the older adults. Together, the behavioral and ERP data suggest that there is an age-related decline in both semantic and syntactic processing, with a trend toward lower efficiency in syntactic ability.

  6. Academic Performance and Burnout: An Efficient Frontier Analysis of Resource Use Efficiency among Employed University Students

    Science.gov (United States)

    Galbraith, Craig S.; Merrill, Gregory B.

    2015-01-01

    We examine the impact of university student burnout on academic achievement. With a longitudinal sample of working undergraduate university business and economics students, we use a two-step analytical process to estimate the efficient frontiers of student productivity given inputs of labour and capital and then analyse the potential determinants…

  7. Analysis of scalability of high-performance 3D image processing platform for virtual colonoscopy.

    Science.gov (United States)

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli

    2014-03-19

    One of the key challenges in three-dimensional (3D) medical imaging is to enable the fast turn-around time that is often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth due to the massive amount of data that need to be processed. For this purpose, we previously developed a software platform for high-performance 3D medical image processing, called the HPC 3D-MIP platform, which employs increasingly available and affordable commodity computing systems such as the multicore, cluster, and cloud computing systems. To achieve scalable high-performance computing, the platform employed size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D-MIP algorithms, supported task scheduling for efficient load distribution and balancing, and consisted of layered parallel software libraries that allow image processing applications to share the common functionalities. We evaluated the performance of the HPC 3D-MIP platform by applying it to computationally intensive processes in virtual colonoscopy. Experimental results showed a 12-fold performance improvement on a workstation with 12-core CPUs over the original sequential implementation of the processes, indicating the efficiency of the platform. Analysis of performance scalability based on Amdahl's law for symmetric multicore chips showed the potential for high performance scalability of the HPC 3D-MIP platform when a larger number of cores is available.
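
    As a quick reference for the scalability argument above, the helper below (Python; the 95% parallel fraction is an illustrative assumption, not a figure from the study) evaluates the Amdahl's-law speedup and the corresponding parallel efficiency on a symmetric multicore machine:

        def amdahl_speedup(parallel_fraction, cores):
            # Ideal speedup when only parallel_fraction of the work can use all cores.
            return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

        def parallel_efficiency(parallel_fraction, cores):
            # Speedup per core, i.e. how well the additional cores are utilised.
            return amdahl_speedup(parallel_fraction, cores) / cores

        for n in (2, 4, 12, 48):
            s = amdahl_speedup(0.95, n)  # assumed 95% parallelisable share
            print(f"{n:3d} cores: speedup {s:5.2f}, efficiency {parallel_efficiency(0.95, n):.2f}")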

  8. Increasing a large petrochemical company efficiency by improvement of decision making process

    OpenAIRE

    Kirin Snežana D.; Nešić Lela G.

    2010-01-01

    The paper shows the results of research conducted in a large petrochemical company, in a state under transition, with the aim of "shedding light" on the decision-making process from the aspect of the personal characteristics of the employees, in order to use the results to improve the decision-making process and increase company efficiency. The research was conducted by a survey, i.e. by filling out a questionnaire specially made for this purpose, in real conditions, during working hours. The sample of...

  9. An Efficient Process for Pretreatment of Lignocelluloses in Functional Ionic Liquids

    Directory of Open Access Journals (Sweden)

    Shi-Jia Dong

    2015-01-01

    Full Text Available Background and Aims. The complex structure of the lignocelluloses is the main obstacle in the conversion of lignocellulosic biomass into valuable products. Ionic liquids provide the opportunities for their efficient pretreatment for biomass. Therefore, in this work, pretreatment of corn stalk was carried out in ultrasonic-assisted ionic liquid including 1-butyl-3-methylimidazolium chloride [BMIM]Cl, 1-H-3-methylimidazolium chloride [HMIM]Cl, and 1-(1-propylsulfonic)-3-imidazolium chloride [HSO3-pMIM]Cl at 70°C for 2 h. We compared the pretreatments by ionic liquid with and without the addition of deionized water. Fourier transform infrared spectroscopy (FTIR) and scanning electron microscopy (SEM) were employed to analyze the chemical characteristics of regenerated cellulose-rich materials. Results. [HMIM]Cl and [HSO3-pMIM]Cl were effective in lignin extraction to obtain cellulose-rich materials. FTIR analysis and SEM analysis indicated the effective lignin removal and the reduced crystallinity of cellulose-rich materials. Enzymatic hydrolysis of cellulose-rich materials was performed efficiently. High yields of reducing sugar and glucose were obtained when the corn stalk was pretreated by [HMIM]Cl and [HSO3-pMIM]Cl. Conclusions. Ionic liquids provided the ideal environment for lignin extraction and enzymatic hydrolysis of corn stalk and [HMIM]Cl and [HSO3-pMIM]Cl proved the most efficient ionic liquids. This simple and environmentally acceptable method has a great potential for the preparation of bioethanol for industrial production.

  10. Modeling technical efficiency of inshore fishery using data envelopment analysis

    Science.gov (United States)

    Rahman, Rahayu; Zahid, Zalina; Khairi, Siti Shaliza Mohd; Hussin, Siti Aida Sheikh

    2016-10-01

    The fishery industry contributes significantly to the economy of Malaysia. This study applied Data Envelopment Analysis to estimate the technical efficiency of the fishery in Terengganu, a state on the eastern coast of Peninsular Malaysia, based on multiple outputs, i.e. total fish landings and fishermen's income, and six inputs, i.e. engine power, vessel size, number of trips, number of workers, cost and operation distance. The data were collected by a survey conducted between November and December 2014. The decision-making units (DMUs) involved 100 fishermen from 10 fishery areas. The results showed that the technical efficiencies in Season I (dry season) and Season II (rainy season) were 90.2% and 66.7%, respectively. About 27% of the fishermen were rated as efficient during Season I, while only 13% achieved full efficiency (100%) during Season II. The results also showed a significant difference in efficiency performance between the fishery areas.
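
    For readers who want to see the optimisation underlying such scores, the sketch below (Python with NumPy and SciPy; the data arrays are placeholders, not the Terengganu survey data) solves the input-oriented CCR envelopment problem on which constant-returns DEA efficiency scores are built:

        import numpy as np
        from scipy.optimize import linprog

        def ccr_input_efficiency(X, Y, o):
            # Input-oriented CCR efficiency of DMU o.
            # X: inputs, shape (n_dmus, n_inputs); Y: outputs, shape (n_dmus, n_outputs).
            n, m = X.shape
            s = Y.shape[1]
            c = np.zeros(n + 1)
            c[0] = 1.0                                   # minimise theta
            A_ub, b_ub = [], []
            for i in range(m):                           # sum_j lam_j x_ij <= theta * x_oi
                A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
                b_ub.append(0.0)
            for r in range(s):                           # sum_j lam_j y_rj >= y_or
                A_ub.append(np.concatenate(([0.0], -Y[:, r])))
                b_ub.append(-Y[o, r])
            bounds = [(None, None)] + [(0.0, None)] * n  # theta free, lambdas >= 0
            res = linprog(c, A_ub=np.vstack(A_ub), b_ub=b_ub, bounds=bounds, method="highs")
            return res.fun                               # efficiency score in (0, 1]

        # Placeholder data: 5 boats, 2 inputs (trips, engine power), 1 output (landings).
        X = np.array([[20.0, 40.0], [30.0, 45.0], [25.0, 30.0], [40.0, 60.0], [35.0, 38.0]])
        Y = np.array([[100.0], [120.0], [110.0], [140.0], [125.0]])
        print([round(ccr_input_efficiency(X, Y, o), 3) for o in range(len(X))])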

  11. Efficiency of alfalfa seed processing with different seed purity

    OpenAIRE

    Đokić, Dragoslav; Stanisavljević, Rade; Terzić, Dragan; Milenković, Jasmina; Radivojević, Gordana; Koprivica, Ranko; Štrbanović, Ratibor

    2015-01-01

    This work analyzed the impact of the initial purity of raw alfalfa seed on the amount of seed obtained after processing. Alfalfa is a very important perennial forage legume which is used for fodder and seed production. With alfalfa seed it is possible to achieve high yields and very good financial results. To obtain seed material with good characteristics, complex machines for cleaning and sorting seeds are used. In the processing center of the Institute for forage crop...

  12. Technical- and environmental-efficiency analysis of irrigated cotton-cropping systems in Punjab, Pakistan using data envelopment analysis.

    Science.gov (United States)

    Ullah, Asmat; Perret, Sylvain R

    2014-08-01

    Cotton cropping in Pakistan uses substantial quantities of resources and adversely affects the environment with pollutants from the inputs, particularly pesticides. A question remains regarding the extent to which such environmental impact can be reduced without compromising the farmers' income. This paper investigates the environmental, technical, and economic performances of selected irrigated cotton-cropping systems in Punjab to quantify the sustainability of cotton farming and reveal options for improvement. Using mostly primary data, our study quantifies the technical, cost, and environmental efficiencies of different farm sizes. A set of indicators has been computed to reflect these three domains of efficiency using the data envelopment analysis technique. The results indicate that farmers are broadly environmentally inefficient, which primarily results from technical inefficiency. Based on an improved input mix, the average potential environmental impact reduction for small, medium, and large farms is 9%, 13%, and 11%, respectively, without compromising the economic return. Moreover, the differences in technical, cost, and environmental efficiencies between small and medium and small and large farm sizes were statistically significant. The second-stage regression analysis identifies that farm size significantly affects the efficiencies, whereas exposure to extension and training has positive effects, and the sowing methods significantly affect the technical and environmental efficiencies. Paradoxically, the formal education level is found to affect the efficiencies negatively. This paper discusses policy interventions that can improve the technical efficiency to ultimately increase the environmental efficiency and reduce the farmers' operating costs.

  13. Measuring the efficiency of dental departments in medical centers: a nonparametric analysis approach.

    Science.gov (United States)

    Wang, Su-Chen; Tsai, Chi-Cheng; Huang, Shun-Te; Hong, Yu-Jue

    2002-12-01

    Data envelopment analysis (DEA), a cross-sectional study design based on secondary data analysis, was used to evaluate the relative operational efficiency of 16 dental departments in medical centers in Taiwan in 1999. The results indicated that 68.7% of all dental departments in medical centers had poor performance in terms of overall efficiency and scale efficiency. All relatively efficient dental departments were in private medical centers. Half of these dental departments were unable to fully utilize available medical resources. 75.0% of public medical centers did not take full advantage of medical resources at their disposal. Regarding returns to scale, 56.3% of dental departments in medical centers exhibited increasing returns to scale, because insufficient scale influenced overall hospital operational efficiency. Public medical centers accounted for 77.8% of the institutions affected. The scale of dental departments in private medical centers was more appropriate than that of departments in public medical centers. In the sensitivity analysis, the numbers of residents, interns, and published papers were used to assess teaching and research. Greater emphasis on teaching and research in medical centers has a large effect on the relative inefficiency of hospital operation. Dental departments in private medical centers had a higher mean overall efficiency score than those in public medical centers, and the overall efficiency of dental departments in non-university hospitals was greater than that of departments in university hospitals. There was no information to evaluate the long-term efficiency of each dental department in all hospitals. A different combination of input and output variables, using common multipliers for efficiency value measurements in DEA, may help establish different pioneering dental departments in hospitals.

  14. Efficient Separations and Processing Integrated Program (ESP-IP): Technology summary

    International Nuclear Information System (INIS)

    1994-02-01

    The Efficient Separations and Processing Integrated Program (ESPIP) was created in 1991 to identify, develop and perfect separations technologies and processes to treat wastes and address environmental problems throughout the DOE Complex. These wastes and environmental problems, located at more than 100 contaminated installations in 36 states and territories, are the result of half a century of nuclear processing activities by DOE and its predecessor organizations. The cost of cleaning up this legacy has been estimated to be of the order of hundreds of billions of dollars, and ESPIP's origin came with the realization that if new separations and processes can produce even a marginal reduction in cost then billions of dollars will be saved. The ultimate mission for ESPIP, as outlined in the ESPIP Strategic Plan, is: to provide Separations Technologies and Processes (STPs) to process and immobilize a wide spectrum of radioactive and hazardous defense wastes; to coordinate STP research and development efforts within DOE; to explore the potential uses of separated radionuclides; to transfer demonstrated separations and processing technologies developed by DOE to the US industrial sector; and to facilitate competitiveness of US technology and industry in the world market. Technology research and development currently under investigation by ESPIP can be divided into four broad areas: cesium and strontium removal; TRU and other HLW separations; sludge technology; and other technologies.

  15. Can Intrinsic Fluctuations Increase Efficiency in Neural Information Processing?

    Science.gov (United States)

    Liljenström, Hans

    2003-05-01

    All natural processes are accompanied by fluctuations, characterized as noise or chaos. Biological systems, which have evolved during billions of years, are likely to have adapted, not only to cope with such fluctuations, but also to make use of them. We investigate how the complex dynamics of the brain, including oscillations, chaos and noise, can affect the efficiency of neural information processing. In particular, we consider the amplification and functional role of internal fluctuations. Using computer simulations of a neural network model of the olfactory cortex and hippocampus, we demonstrate how microscopic fluctuations can result in global effects at the network level. We show that the rate of information processing in associative memory tasks can be maximized for optimal noise levels, analogous to stochastic resonance phenomena. Noise can also induce transitions between different dynamical states, which could be of significance for learning and memory. A chaotic-like behavior, induced by noise or by an increase in neuronal excitability, can enhance system performance if it is transient and converges to a limit cycle memory state. We speculate whether this dynamical behavior perhaps could be related to (creative) thinking.

  16. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    Energy Technology Data Exchange (ETDEWEB)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, Francois; Aletti, Pierre [Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France) and DOSIsoft SA, 94230 Cachan (France); Research Laboratory for Innovative Processes (ERPI), Nancy University, EA 3767, 5400 Nancy Cedex (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France); DOSIsoft SA, 94230 Cachan (France); Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy, France and Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France)

    2009-04-15

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (±4% of deviation between the calculated and measured doses) by calculating a control process capability (Cpc) index. The Cpc index was found to be greater than 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should

  17. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    Science.gov (United States)

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (±4% of deviation between the calculated and measured doses) by calculating a control process capability (Cpc) index. The Cpc index was found to be greater than 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the
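
    One of the control charts mentioned above, the EWMA chart, can be computed in a few lines of Python; the sketch below is a generic implementation with common textbook defaults (lambda = 0.2, L = 3), not the exact settings of the study, and the input series stands for per-patient dose deviations in percent:

        import numpy as np

        def ewma_chart(x, lam=0.2, L=3.0):
            # EWMA statistic and control limits for a series of QC deviations (%).
            x = np.asarray(x, dtype=float)
            mu, sigma = x.mean(), x.std(ddof=1)   # baseline estimated from the data
            z, prev = np.empty_like(x), mu
            for t, xt in enumerate(x):
                z[t] = lam * xt + (1.0 - lam) * prev   # exponentially weighted average
                prev = z[t]
            t = np.arange(1, len(x) + 1)
            half = L * sigma * np.sqrt(lam / (2.0 - lam) * (1.0 - (1.0 - lam) ** (2 * t)))
            return z, mu - half, mu + half

        deviations = [0.5, -1.2, 0.8, 1.5, -0.3, 2.1, 1.8, 2.6, 3.0]  # illustrative % values
        z, lcl, ucl = ewma_chart(deviations)
        print([bool(b) for b in (z < lcl) | (z > ucl)])  # flags points drifting out of control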

  18. Efficient Option Pricing under Levy Processes, with CVA and FVA

    Directory of Open Access Journals (Sweden)

    Jimmy Law

    2015-07-01

    Full Text Available We generalize the Piterbarg (2010) model to include (1) bilateral default risk, as in Burgard and Kjaer (2012), and (2) jumps in the dynamics of the underlying asset using general classes of Lévy processes of exponential type. We develop an efficient explicit-implicit scheme for European options and barrier options taking CVA-FVA into account. We highlight the importance of this work in the context of trading, pricing and managing a derivative portfolio given the trajectory of regulations.

  19. Emergency preparedness planning: A process to insure effectiveness and efficiency

    International Nuclear Information System (INIS)

    Schroeder, A.J. Jr.

    1994-01-01

    Prevention is undoubtedly the preferred policy regarding emergency response. Unfortunately, despite best intentions, emergencies do occur. It is the prudent operator that has well-written and exercised plans in place to respond to the full suite of possible situations. This paper presents a planning process to help personnel develop and/or maintain emergency management capability. It is equally applicable at the field location, the district/regional office, or the corporate headquarters. It is not limited in scope and can be useful for planners addressing incidents ranging from fires, explosions, spills/releases, and computer system failures to terrorist threats and natural disasters. By following the steps in the process diagram, the planner will document emergency management capability in a logical and efficient manner which should result in effective emergency response and recovery plans. The astute planner will immediately see that the process presented is a continuing one, fully compatible with the principles of continuous improvement.

  20. Efficient Learning Design

    DEFF Research Database (Denmark)

    Godsk, Mikkel

    This paper presents the current approach to implementing educational technology with learning design at the Faculty of Science and Technology, Aarhus University, by introducing the concept of ‘efficient learning design’. The underlying hypothesis is that implementing learning design is more than...... engaging educators in the design process and developing teaching and learning, it is a shift in educational practice that potentially requires a stakeholder analysis and ultimately a business model for the deployment. What is most important is to balance the institutional, educator, and student...... perspectives and to consider all these in conjunction in order to obtain a sustainable, efficient learning design. The approach to deploying learning design in terms of the concept of efficient learning design, the catalyst for educational development, i.e. the learning design model and how it is being used...

  1. Antimicrobial nanocapsules: from new solvent-free process to in vitro efficiency

    Directory of Open Access Journals (Sweden)

    Steelandt J

    2014-09-01

    Full Text Available Julie Steelandt,1 Damien Salmon,1,2 Elodie Gilbert,1 Eyad Almouazen,3 François NR Renaud,4 Laurène Roussel,1 Marek Haftek,5 Fabrice Pirot1,2 1University Claude Bernard Lyon 1, Faculty of Pharmacy, Fundamental, Clinical and Therapeutic Aspects of Skin Barrier Function, FRIPharm, Laboratoire de Pharmacie Galénique Industrielle, 2Hospital Pharmacy, FRIPharm, Hospital Edouard Herriot, Hospices Civils de Lyon, 3Laboratoire d’Automatique et de Génie des Procédés, University Claude Bernard Lyon 1, 4University Claude Bernard Lyon 1, UMR CNRS 5510/MATEIS, 5University Claude Bernard Lyon 1, Faculty of Pharmacy, Fundamental, Clinical and Therapeutic Aspects of Skin Barrier Function, FRIPharm, Laboratoire de Dermatologie, Lyon, France Abstract: Skin and mucosal infections constitute recurrent pathologies resulting from either inappropriate antiseptic procedures or a lack of efficacy of antimicrobial products. In this field, nanomaterials offer interesting antimicrobial properties (eg, long-lasting activity; intracellular and tissular penetration as compared to conventional products. The aim of this work was to produce, by a new solvent-free process, a stable and easily freeze-dryable chlorhexidine-loaded polymeric nanocapsule (CHX-NC suspension, and then to assess the antimicrobial properties of nanomaterials. The relevance of the process and the physicochemical properties of the CHX-NCs were examined by the assessment of encapsulation efficiency, stability of the nanomaterial suspension after 1 month of storage, and by analysis of granulometry and surface electric charge of nanocapsules. In vitro antimicrobial activities of the CHX-NCs and chlorhexidine digluconate solution were compared by measuring the inhibition diameters of two bacterial strains (Escherichia coli and Staphylococcus aureus and one fungal strain (Candida albicans cultured onto appropriate media. Based on the findings of this study, we report a new solvent-free process for the

  2. Delineated Analysis of Robotic Process Automation Tools

    OpenAIRE

    Ruchi Isaac; Riya Muni; Kenali Desai

    2017-01-01

    In an age when celerity is expected of all sectors of the country, the speed of execution of various processes, and hence efficiency, becomes a prominent factor. To meet the speed demands of these diverse platforms, Robotic Process Automation (RPA) is used. Robotic Process Automation can expedite back-office tasks in commercial industries, remote management tasks in IT industries and conservation of resources in multiple sectors. To implement RPA, many software ...

  3. Efficient Multidisciplinary Analysis Approach for Conceptual Design of Aircraft with Large Shape Change

    Science.gov (United States)

    Chwalowski, Pawel; Samareh, Jamshid A.; Horta, Lucas G.; Piatak, David J.; McGowan, Anna-Maria R.

    2009-01-01

    The conceptual and preliminary design processes for aircraft with large shape changes are generally difficult and time-consuming, and the processes are often customized for a specific shape change concept to streamline the vehicle design effort. Accordingly, several existing reports show excellent results from assessing a particular shape change concept or perturbations of a concept. The goal of the current effort was to develop a multidisciplinary analysis tool and process that would enable an aircraft designer to assess several very different morphing concepts early in the design phase and yet obtain second-order performance results so that design decisions can be made with better confidence. The approach uses an efficient parametric model formulation that allows automatic model generation for systems undergoing radical shape changes as a function of aerodynamic parameters, geometry parameters, and shape change parameters. In contrast to other more self-contained approaches, the approach utilizes off-the-shelf analysis modules to reduce development time and to make it accessible to many users. Because the analysis is loosely coupled, discipline modules like a multibody code can be easily swapped for other modules with similar capabilities. One of the advantages of this loosely coupled system is the ability to use the medium- to high-fidelity tools early in the design stages when the information can significantly influence and improve overall vehicle design. Data transfer among the analysis modules is based on an accurate and automated general purpose data transfer tool. In general, setup time for the integrated system presented in this paper is 2-4 days for simple shape change concepts and 1-2 weeks for more mechanically complicated concepts. Some of the key elements briefly described in the paper include parametric model development, aerodynamic database generation, multibody analysis, and the required software modules, as well as examples for a telescoping wing.

  4. Control system for technological processes in tritium processing plants with process analysis

    International Nuclear Information System (INIS)

    Retevoi, Carmen Maria; Stefan, Iuliana; Balteanu, Ovidiu; Stefan, Liviu; Bucur, Ciprian

    2005-01-01

Integration of a large variety of installations and equipment into a unitary system for controlling the technological process in tritium processing nuclear facilities is a rather complex undertaking, particularly when experimental or new technologies are developed. Ensuring a high degree of versatility that allows easy modification of configurations and process parameters is a major requirement imposed on experimental installations. The large amount of data that must be processed, stored and easily accessed for subsequent analyses requires the development of a large information network based on a highly integrated system containing the acquisition, control and technological process analysis data as well as a database system. On such a basis, integrated computation and control systems able to conduct the technological process can be developed, as well as protection systems for cases of failure or breakdown. The integrated system responds to the control and security requirements, in case of emergency, of the technological processes specific to industries that process radioactive or toxic substances with severe consequences in case of technological failure, as in the case of a tritium processing nuclear plant. In order to lower the risk of technological failure of these processes, an integrated software, database and process analysis system is developed which, based on an identification algorithm for the parameters important to the protection and security systems, will display the process evolution trend. The system was checked on an existing plant that includes a tritium removal unit, ultimately intended for a nuclear power plant, by simulating failure events as well as the normal process. The system will also include a complete database monitoring all the parameters and process analysis software for the main modules of the tritium processing plant, namely isotope separation, catalytic purification and cryogenic distillation

  5. Interrelationships between trait anxiety, situational stress and mental effort predict phonological processing efficiency, but not effectiveness.

    Science.gov (United States)

    Edwards, Elizabeth J; Edwards, Mark S; Lyvers, Michael

    2016-08-01

    Attentional control theory (ACT) describes the mechanisms associated with the relationship between anxiety and cognitive performance. We investigated the relationship between cognitive trait anxiety, situational stress and mental effort on phonological performance using a simple (forward-) and complex (backward-) word span task. Ninety undergraduate students participated in the study. Predictor variables were cognitive trait anxiety, indexed using questionnaire scores; situational stress, manipulated using ego threat instructions; and perceived level of mental effort, measured using a visual analogue scale. Criterion variables (a) performance effectiveness (accuracy) and (b) processing efficiency (accuracy divided by response time) were analyzed in separate multiple moderated-regression analyses. The results revealed (a) no relationship between the predictors and performance effectiveness, and (b) a significant 3-way interaction on processing efficiency for both the simple and complex tasks, such that at higher effort, trait anxiety and situational stress did not predict processing efficiency, whereas at lower effort, higher trait anxiety was associated with lower efficiency at high situational stress, but not at low situational stress. Our results were in full support of the assumptions of ACT and implications for future research are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
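As a companion to the analysis described above, the sketch below shows one way such a three-way moderated regression on processing efficiency (accuracy divided by response time) could be set up; it is not the authors' code, the data are synthetic, and the column names are hypothetical stand-ins for the questionnaire score, the ego-threat manipulation, and the effort rating.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 90
    df = pd.DataFrame({
        "trait_anxiety": rng.normal(0, 1, n),          # centred questionnaire score
        "situational_stress": rng.integers(0, 2, n),   # 0 = low, 1 = high (ego threat)
        "effort": rng.normal(0, 1, n),                 # centred visual-analogue rating
        "accuracy": rng.uniform(0.5, 1.0, n),
        "response_time_s": rng.uniform(1.0, 3.0, n),
    })
    # processing efficiency as defined in the abstract: accuracy divided by response time
    df["efficiency"] = df["accuracy"] / df["response_time_s"]

    # The * operator expands to all main effects and interactions; the highest-order
    # term is the 3-way interaction reported in the abstract.
    model = smf.ols("efficiency ~ trait_anxiety * C(situational_stress) * effort", data=df).fit()
    print(model.summary())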

  6. Process Cycle Efficiency Improvement Through Lean: A Case Study

    Directory of Open Access Journals (Sweden)

    P.V. Mohanram

    2011-06-01

    Full Text Available Lean manufacturing is an applied methodology of scientific, objective techniques that cause work tasks in a process to be performed with a minimum of non-value adding activities resulting in greatly reduced wait time, queue time, move time, administrative time, and other delays. This work addresses the implementation of lean principles in a construction equipment company. The prime objective is to evolve and test several strategies to eliminate waste on the shop floor. This paper describes an application of value stream mapping (VSM. Consequently, the present and future states of value stream maps are constructed to improve the production process by identifying waste and its sources. A noticeable reduction in cycle time and increase in cycle efficiency is confirmed. The production flow was optimized thus minimizing several non-value added activities/times such as bottlenecking time, waiting time, material handling time, etc. This case study can be useful in developing a more generic approach to design lean environment.
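A minimal sketch of the process cycle efficiency metric underlying a value stream map, assuming the usual definition (value-added time divided by total lead time); all times below are placeholders, not data from the case study.

    value_added_min = {"machining": 42.0, "welding": 35.0, "assembly": 28.0}
    non_value_added_min = {"waiting": 310.0, "material_handling": 55.0, "bottleneck_queue": 120.0}

    va = sum(value_added_min.values())
    lead_time = va + sum(non_value_added_min.values())
    pce = va / lead_time
    print(f"Process cycle efficiency: {pce:.1%}")   # ~17.8% for these placeholder times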

  7. Factors affecting energy and nitrogen efficiency of dairy cows: A meta-analysis

    NARCIS (Netherlands)

    Phuong, H.N.; Friggens, N.C.; Boer, de I.J.M.; Schmidely, P.

    2013-01-01

    A meta-analysis was performed to explore the correlation between energy and nitrogen efficiency of dairy cows, and to study nutritional and animal factors that influence these efficiencies, as well as their relationship. Treatment mean values were extracted from 68 peer-reviewed studies, including

  8. Efficient Bayesian inference for ARFIMA processes

    Science.gov (United States)

    Graves, T.; Gramacy, R. B.; Franzke, C. L. E.; Watkins, N. W.

    2015-03-01

    Many geophysical quantities, like atmospheric temperature, water levels in rivers, and wind speeds, have shown evidence of long-range dependence (LRD). LRD means that these quantities experience non-trivial temporal memory, which potentially enhances their predictability, but also hampers the detection of externally forced trends. Thus, it is important to reliably identify whether or not a system exhibits LRD. In this paper we present a modern and systematic approach to the inference of LRD. Rather than Mandelbrot's fractional Gaussian noise, we use the more flexible Autoregressive Fractional Integrated Moving Average (ARFIMA) model which is widely used in time series analysis, and of increasing interest in climate science. Unlike most previous work on the inference of LRD, which is frequentist in nature, we provide a systematic treatment of Bayesian inference. In particular, we provide a new approximate likelihood for efficient parameter inference, and show how nuisance parameters (e.g. short memory effects) can be integrated over in order to focus on long memory parameters, and hypothesis testing more directly. We illustrate our new methodology on the Nile water level data, with favorable comparison to the standard estimators.
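As an illustrative forward model only (the Bayesian inference machinery of the paper is not reproduced here), the sketch below simulates an ARFIMA(0, d, 0) series by applying the fractional integration filter to white noise, using the standard recursion for the filter weights from the binomial expansion of (1 − B)^(−d).

    import numpy as np

    def arfima_0d0(n, d, rng=None):
        rng = np.random.default_rng(rng)
        eps = rng.standard_normal(n)
        psi = np.empty(n)
        psi[0] = 1.0
        for k in range(1, n):
            # psi_k = psi_{k-1} * (k - 1 + d) / k
            psi[k] = psi[k - 1] * (k - 1 + d) / k
        # x_t = sum_k psi_k * eps_{t-k}; full convolution truncated to n samples
        return np.convolve(psi, eps)[:n]

    x = arfima_0d0(4096, d=0.3, rng=42)   # d in (0, 0.5) gives stationary long-range dependence
    print(x[:5])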

  9. [Transmission efficiency analysis of near-field fiber probe using FDTD simulation].

    Science.gov (United States)

    Huang, Wei; Dai, Song-Tao; Wang, Huai-Yu; Zhou, Yun-Song

    2011-10-01

    A fiber probe is the key component of near-field optical technology which is widely used in high resolution imaging, spectroscopy detection and nano processing. How to improve the transmission efficiency of the fiber probe is a very important problem in the application of near-field optical technology. Based on the results of 3D-FDTD computation, the dependence of the transmission efficiency on the cone angle, the aperture diameter, the wavelength and the thickness of metal cladding is revealed. The authors have also made a comparison between naked probe and the probe with metal cladding in terms of transmission efficiency and spatial resolution. In addition, the authors have discovered the fluctuation phenomena of transmission efficiency as the wavelength of incident laser increases.

  10. A combination of HPLC and automated data analysis for monitoring the efficiency of high-pressure homogenization.

    Science.gov (United States)

    Eggenreich, Britta; Rajamanickam, Vignesh; Wurm, David Johannes; Fricke, Jens; Herwig, Christoph; Spadiut, Oliver

    2017-08-01

    Cell disruption is a key unit operation to make valuable, intracellular target products accessible for further downstream unit operations. Independent of the applied cell disruption method, each cell disruption process must be evaluated with respect to disruption efficiency and potential product loss. Current state-of-the-art methods, like measuring the total amount of released protein and plating-out assays, are usually time-delayed and involve manual intervention making them error-prone. An automated method to monitor cell disruption efficiency at-line is not available to date. In the current study we implemented a methodology, which we had originally developed to monitor E. coli cell integrity during bioreactor cultivations, to automatically monitor and evaluate cell disruption of a recombinant E. coli strain by high-pressure homogenization. We compared our tool with a library of state-of-the-art methods, analyzed the effect of freezing the biomass before high-pressure homogenization and finally investigated this unit operation in more detail by a multivariate approach. A combination of HPLC and automated data analysis describes a valuable, novel tool to monitor and evaluate cell disruption processes. Our methodology, which can be used both in upstream (USP) and downstream processing (DSP), describes a valuable tool to evaluate cell disruption processes as it can be implemented at-line, gives results within minutes after sampling and does not need manual intervention.

  11. Quality assurance of the Teaching – Learning Process in the Financial Economic Analysis

    Directory of Open Access Journals (Sweden)

    R.F. Creţu

    2013-06-01

Full Text Available In the current context of economic development, human resources capable of lifelong learning and adaptable to economic change are essential elements of a growth model based on competitiveness, efficiency and quality. In this paper we propose to identify strategies to improve the quality of the teaching-learning process of Financial Economic Analysis in the Bucharest Academy of Economic Studies for students in the first (bachelor's) cycle of education, in their final year of study. Classroom observation is the qualitative method used to monitor the quality of the teaching-learning process. As a complex instrument, classroom observation may take different forms and can play several roles.

  12. Novel Approach to Increase the Energy-related Process Efficiency and Performance of Laser Brazing

    Science.gov (United States)

    Mittelstädt, C.; Seefeld, T.; Radel, T.; Vollertsen, F.

Although laser brazing is well established, the energy-related efficiency of this joining method is quite low. This is because of the low absorptivity of solid-state laser radiation, especially when copper-based braze metals are used. Conventionally, the laser beam is set close to the vertical axis and the filler wire is delivered at a flat angle. Therefore, most of the laser power is reflected and thus left unexploited. To address this situation, an alternative processing concept for laser brazing, in which the laser beam leads the filler wire, has been investigated with the intention of making use of the reflected share of the laser radiation. Process monitoring shows that the reflection of the laser beam can be used purposefully to preheat the substrate, which supports wetting and further increases the efficiency of the process. Experiments address a standard application from the automotive industry, joining zinc-coated steels using CuSi3Mn1 filler wire. Feasibility of the alternative processing concept is demonstrated, showing that higher processing speeds can be attained, reducing the required energy per unit length while maintaining joint properties.

  13. Efficient separations and processing crosscutting program 1996 technical exchange meeting. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

This document contains summaries of technology development presented at the 1996 Efficient Separations and Processing Crosscutting Program Technical Exchange Meeting. This meeting is held annually to promote a free exchange of ideas among technology developers, potential users and other interested parties within the EM community. During this meeting, separation process technologies such as ion exchange, membrane separation, vacuum distillation, selective sorption, and solvent extraction were discussed. Other topics discussed include: waste forms; testing of inorganic sorbents for radionuclide and heavy metal removal; selective crystallization; and electrochemical treatment of liquid wastes. This is the leading abstract; individual papers have been indexed separately for the databases.

  14. Efficient separations and processing crosscutting program 1996 technical exchange meeting. Proceedings

    International Nuclear Information System (INIS)

    1996-01-01

This document contains summaries of technology development presented at the 1996 Efficient Separations and Processing Crosscutting Program Technical Exchange Meeting. This meeting is held annually to promote a free exchange of ideas among technology developers, potential users and other interested parties within the EM community. During this meeting, separation process technologies such as ion exchange, membrane separation, vacuum distillation, selective sorption, and solvent extraction were discussed. Other topics discussed include: waste forms; testing of inorganic sorbents for radionuclide and heavy metal removal; selective crystallization; and electrochemical treatment of liquid wastes. This is the leading abstract; individual papers have been indexed separately for the databases

  15. Analysis of farm household technical efficiency in small-scale ...

    African Journals Online (AJOL)

Data Envelopment Analysis (DEA) was applied to farm-level cross-sectional data collected in mid-2013 after the implementation of CIP activities. Our empirical results indicate that CIP participants and improved farmers (using both traditional and modern hives) had the highest average levels of technical efficiency.

  16. Kinetics and energy efficiency for the degradation of 1,4-dioxane by electro-peroxone process

    Energy Technology Data Exchange (ETDEWEB)

Wang, Huijiao; Bakheet, Belal; Yuan, Shi; Li, Xiang; Yu, Gang [School of Environment, Tsinghua University, Beijing 100084 (China); Murayama, Seiichi [Power and Industrial Systems R&D Center, Toshiba Corporation, Fuchu-shi, Tokyo (Japan); Wang, Yujue, E-mail: wangyujue@tsinghua.edu.cn [School of Environment, Tsinghua University, Beijing 100084 (China)]

    2015-08-30

Highlights: • The E-peroxone process couples electrolysis with ozonation to drive the peroxone reaction for pollutant degradation. • Significant amounts of ·OH can be efficiently produced in the E-peroxone process. • E-peroxone greatly enhances 1,4-dioxane degradation kinetics compared with ozonation and electrolysis. • E-peroxone consumes less energy for 1,4-dioxane mineralization than ozonation and electrolysis. • E-peroxone offers a cost-effective and energy-efficient alternative for degrading 1,4-dioxane. - Abstract: Degradation of 1,4-dioxane by ozonation, electrolysis, and their combined electro-peroxone (E-peroxone) process was investigated. The E-peroxone process used a carbon-polytetrafluoroethylene cathode to electrocatalytically convert O2 in the sparged ozone generator effluent (an O2 and O3 gas mixture) to H2O2. The electro-generated H2O2 then reacts with sparged O3 to yield aqueous ·OH, which can in turn rapidly oxidize pollutants in the bulk solution. Using p-chlorobenzoic acid as an ·OH probe, the pseudo-steady-state concentration of ·OH was determined to be ∼0.744 × 10⁻⁹ mM in the E-peroxone process, approximately 10 and 186 times that in ozonation and in electrolysis using a Pt anode, respectively. Thanks to its higher ·OH concentration, the E-peroxone process eliminated 96.6% of the total organic carbon (TOC) from a 1,4-dioxane solution after 2 h of treatment, with a specific energy consumption (SEC) of 0.376 kWh per g of TOC removed. In comparison, ozonation and electrolysis using a boron-doped diamond anode removed only ∼6.1% and 26.9% of TOC, with SECs of 2.43 and 0.558 kWh per g of TOC removed, respectively. The results indicate that the E-peroxone process can significantly improve the kinetics and energy efficiency of 1,4-dioxane mineralization as compared to the two individual processes. The E-peroxone process may thus offer a highly effective and energy-efficient alternative to treat 1,4-dioxane
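A back-of-the-envelope sketch of the specific-energy-consumption figure of merit quoted above (kWh per gram of TOC removed); the function simply implements the ratio, and all numbers in the example are placeholders, not values from the study.

    def specific_energy_consumption(power_kw, time_h, volume_l, toc0_mg_l, toc_mg_l):
        """SEC in kWh per g of TOC removed."""
        energy_kwh = power_kw * time_h
        toc_removed_g = (toc0_mg_l - toc_mg_l) * volume_l / 1000.0  # mg/L * L -> mg -> g
        return energy_kwh / toc_removed_g

    # Example: 20 W total draw (electrolysis + ozone generation), 2 h, 1 L, TOC 100 -> 3.4 mg/L
    print(specific_energy_consumption(0.020, 2.0, 1.0, 100.0, 3.4))  # ~0.41 kWh per g TOC removed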

  17. The application of short-term efficiency analysis in diagnosing occupational voice disorders

    Directory of Open Access Journals (Sweden)

    Ewa Niebudek-Bogusz

    2015-06-01

Full Text Available Background: An objective determination of the range of vocal efficiency is rather difficult. The aim of the study was to assess the possibility of application of short-term acoustic efficiency analysis in diagnosing occupational voice disorders. Material and Methods: The study covered 98 people (87 women and 11 men) diagnosed with occupational dysphonia through videostroboscopic examination. The control group comprised 100 people (81 women and 19 men) with normal voices. The short-term acoustic analysis was carried out by means of DiagnoScope software, including classical parameters (Jitter group, Shimmer group and the assessment of noise degree NHR), as well as new short-term efficiency parameters determined in a short time period during sustained phonation of the vowel “a.” The results were then compared. Results: The values of all the examined classical parameters were considerably higher in the study group of pathological voices than in the control group of normal voices (p = 0.00). The aerodynamic parameter, maximum phonation time, was significantly shorter by over 0.5 s in the study group than in the control group. The majority of the acoustic efficiency parameters were also considerably worse in the study group of subjects with occupational dysphonia than in the control group (p = 0.00). Moreover, the correlation between the efficiency parameters and most of the classical acoustic parameters in the study group implies that for the voices with occupational pathology the decreased efficiency of the vocal apparatus is reflected in the acoustic voice structure. Conclusions: Efficiency parameters determined during short-term acoustic analysis can be an objective indicator of the decreased phonatory function of the larynx, useful in diagnosing occupational vocal pathology. Med Pr 2015;66(2):225–234
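For orientation, a minimal sketch of two of the classical short-term acoustic parameters mentioned above, local jitter (period perturbation) and local shimmer (amplitude perturbation), computed from hypothetical cycle-to-cycle measurements of a sustained vowel; this is a generic textbook definition, not the DiagnoScope implementation.

    import numpy as np

    def local_jitter(periods_s):
        periods = np.asarray(periods_s, dtype=float)
        return np.mean(np.abs(np.diff(periods))) / np.mean(periods)

    def local_shimmer(peak_amplitudes):
        amps = np.asarray(peak_amplitudes, dtype=float)
        return np.mean(np.abs(np.diff(amps))) / np.mean(amps)

    periods = [0.0050, 0.0051, 0.0049, 0.0050, 0.0052]   # glottal cycle lengths (s), hypothetical
    amps = [0.81, 0.79, 0.80, 0.83, 0.80]                # cycle peak amplitudes (a.u.), hypothetical
    print(f"jitter  = {local_jitter(periods):.2%}")
    print(f"shimmer = {local_shimmer(amps):.2%}")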

  18. Zinc oxide nanostructures and its nano-compounds for efficient visible light photo-catalytic processes

    Science.gov (United States)

    Adam, Rania E.; Alnoor, Hatim; Elhag, Sami; Nur, Omer; Willander, Magnus

    2017-02-01

Zinc oxide (ZnO) in its nanostructured form is a promising material for visible light emission/absorption and utilization in different energy-efficient photocatalytic processes. We first present our recent results on the effect of varying the molar ratio of the synthesis nutrients on visible light emission. Further, we use the optimized conditions from the molar ratio experiments to vary synthesis processing parameters such as stirring time, and the effect of all these parameters on optimizing the efficiency and controlling the emission spectrum is investigated using different complementary techniques. Cathodoluminescence (CL) is combined with photoluminescence (PL) and electroluminescence (EL) as the techniques to investigate and optimize visible light emission from ZnO/GaN light-emitting diodes. We then show and discuss our recent findings on the use of high-quality ZnO nanoparticles (NPs) for efficient photo-degradation of toxic dyes using the visible spectrum, namely wavelengths up to 800 nm. In the end, we show how ZnO nanorods (NRs) are used as the initial template to be converted to bismuth zinc vanadate (BiZn2VO6). The BiZn2VO6 is then used to demonstrate efficient and cost-effective hydrogen production through photoelectrochemical water splitting using solar radiation.

  19. Hard rock tunnel boring machine penetration test as an indicator of chipping process efficiency

    Directory of Open Access Journals (Sweden)

    M.C. Villeneuve

    2017-08-01

Full Text Available The transition from grinding to chipping can be observed in tunnel boring machine (TBM) penetration test data by plotting the penetration rate (distance/revolution) against the net cutter thrust (force per cutter) over the full range of penetration rates in the test. Correlating penetration test data to the geological and geomechanical characteristics of the rock masses through which a penetration test is conducted makes it possible to reveal the efficiency of the chipping process in response to changing geological conditions. Penetration test data can also be used to identify stress-induced tunnel face instability. This research shows that the strength of the rock is an important parameter controlling how much net cutter thrust is required to transition from grinding to chipping. It also shows that the geological characteristics of a rock determine how efficiently chipping occurs once it has begun. In particular, geological characteristics that lead to efficient fracture propagation, such as fabric and mica content, will lead to efficient chipping. These findings will enable a better correlation between TBM performance and geological conditions for use in TBM design, as a basis for contractual payments where penetration rate dominates the excavation cycle, and in further academic investigations into the TBM excavation process.
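The sketch below shows one plausible way to locate the grinding-to-chipping transition from such a plot: fit two straight lines to penetration rate versus net cutter thrust and choose the breakpoint with the lowest total squared error. The data are synthetic and the segmented fit is an illustrative assumption, not the paper's procedure.

    import numpy as np

    thrust = np.linspace(50, 300, 26)                       # kN per cutter (synthetic)
    pen = np.where(thrust < 180, 0.005 * thrust, 0.02 * thrust - 2.7)
    pen += np.random.default_rng(0).normal(0, 0.05, pen.size)

    def sse(x, y):
        # sum of squared residuals of a straight-line fit
        resid = y - np.polyval(np.polyfit(x, y, 1), x)
        return float(resid @ resid)

    candidates = range(3, len(thrust) - 3)                  # keep at least 3 points per segment
    best = min(candidates, key=lambda i: sse(thrust[:i], pen[:i]) + sse(thrust[i:], pen[i:]))
    print(f"estimated grinding-to-chipping transition near {thrust[best]:.0f} kN per cutter")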

  20. A comparative analysis of selected wastewater pretreatment processes in food industry

    Science.gov (United States)

    Jaszczyszyn, Katarzyna; Góra, Wojciech; Dymaczewski, Zbysław; Borowiak, Robert

    2018-02-01

    The article presents a comparative analysis of the classical coagulation with the iron sulphate and adsorption on bentonite for the pretreatment of wastewater in the food industry. As a result of the studies, chemical oxygen demand (COD) and total nitrogen (TN) reduction were found to be comparable in both technologies, and a 29% higher total phosphorus removal efficiency by the coagulation was observed. After the coagulation and adsorption processes, a significant difference between mineral and organic fraction in the sludge was found (49% and 51% for bentonite and 28% and 72% for iron sulphate, respectively).
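A simple sketch of the removal-efficiency arithmetic behind the COD, TN and TP comparisons above; influent and effluent concentrations are hypothetical, not measurements from the study.

    def removal_efficiency(c_in, c_out):
        return (c_in - c_out) / c_in

    cod = removal_efficiency(c_in=3200.0, c_out=1900.0)   # mg O2/L, hypothetical
    tp  = removal_efficiency(c_in=45.0, c_out=12.0)       # mg P/L, hypothetical
    print(f"COD removal: {cod:.0%}, TP removal: {tp:.0%}")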

  1. Assessing the Efficiency of commercial Tunisian Banks using Fuzzy Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Houssine Tlig

    2017-08-01

Full Text Available The banking sector is of great importance to Tunisia's economy. Major commercial banks continue to spend a high proportion of their budgets on new technologies and innovation in order to satisfy their customers and enhance their competitiveness. Consequently, performance analysis has become part of their management practices. This paper aims to evaluate the efficiency of commercial Tunisian banks in terms of both crisp and imprecise data. Two approaches to fuzzy data envelopment analysis (FDEA), the possibility approach and the approach based on relations between fuzzy numbers (BRONF), are used to obtain the efficiency score of each bank. The results show that, in a competitive environment, non-financial inputs and outputs should be taken into account in order to obtain credible and realistic efficiency scores.

  2. Combined microfluidization and ultrasonication: a synergistic protocol for high-efficient processing of SWCNT dispersions with high quality

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Sida, E-mail: s.luo@buaa.edu.cn [Beihang University, School of Mechanical Engineering and Automation (China); Liu, Tao, E-mail: tliu@fsu.edu [Florida State University, High-Performance Materials Institute (United States); Wang, Yong; Li, Liuhe [Beihang University, School of Mechanical Engineering and Automation (China); Wang, Guantao; Luo, Yun [China University of Geosciences, Center of Safety Research, School of Engineering and Technology (China)

    2016-08-15

Highly efficient and large-scale production of high-quality CNT dispersions is necessary to meet the future needs of developing various CNT-based electronic devices. Herein, we have designed novel processing protocols by combining a conventional ultrasonication process with a new microfluidization technique to produce high-quality SWCNT dispersions with improved processing efficiency. One critical factor for judging the quality of SWCNT dispersions is the degree of exfoliation, which can be quantified both by the geometrical dimensions of the exfoliated nanotubes and by the percentage of individual tubes in a given dispersion. In this paper, the synergistic effect of the combined protocols was systematically investigated by evaluating SWCNT dispersions with newly developed characterization techniques, namely the preparative ultracentrifuge method (PUM) and simultaneous Raman scattering and photoluminescence spectroscopy (SRSPL). Both techniques lead to similar conclusions: compared with either process operated separately, a low-pass microfluidization followed by a reasonable duration of ultrasonication can substantially improve the processing efficiency, producing high-quality SWCNT dispersions with average particle length and diameter as small as ~600 and ~2 nm, respectively.

  3. Efficient Cross-Device Query Processing

    NARCIS (Netherlands)

    H. Pirk (Holger)

    2012-01-01

The increasing diversity of hardware within a single system promises large performance gains but also poses a challenge for data management systems. Strategies for the efficient use of hardware with large performance differences are still lacking. For example, existing research on GPU

  4. Elk Valley Rancheria Energy Efficiency and Alternatives Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ed Wait, Elk Valley Rancheria; Frank Ziano & Associates, Inc.

    2011-11-30

Elk Valley Rancheria; Tribe; renewable energy; energy options analysis. The Elk Valley Rancheria, California ('Tribe') is a federally recognized Indian tribe located in Del Norte County, California, in the northwestern corner of the state. The Tribe, its members and Tribal enterprises are challenged by increasing energy costs and undeveloped local energy resources. The Tribe currently lacks an energy program. The Tribal government lacked sufficient information to make informed decisions about potential renewable energy resources, energy alternatives and other energy management issues. To meet this challenge efficiently, the Tribe contracted with Frank Zaino and Associates, Inc. to help it become more energy self-sufficient by reducing its energy costs and promoting energy alternatives that stimulate economic development. Frank Zaino & Associates, Inc. provided a high-level economic screening analysis based on anticipated electric and natural gas rates. This was in an effort to determine which alternative energy system would perform at a higher level, so that the Tribe could reduce its energy use by 30% through alternative fuel sources. The feasibility study will identify suitable energy alternatives and conservation methods that will benefit the Tribe and the tribal community through important cost reductions. The lessons learned from these conservation efforts will yield knowledge that will serve the wider goal of executing energy efficiency measures and practices in Tribal residences and business facilities. Pacific Power is the provider of electrical power to the four properties under review, at $0.08 per kilowatt-hour (kWh). This is a very low energy cost compared to alternative energy sources. The Tribe used baseline audits to assess current and historic energy usage at four Rancheria-owned facilities. Past electric and gas billing statements were retained for review for the four buildings to be audited. A comparative assessment of the various

  5. Efficient CFIE-MOM Analysis of 3-D PEC Scatterers in Layered Media

    DEFF Research Database (Denmark)

    Kim, Oleksiy S.; Jørgensen, E.; Meincke, Peter

    2002-01-01

    This paper presents an efficient technique for analysis of arbitrary closed perfectly conducting (PEC) scatterers in layered media. The technique is based on a method of moments (MoM) solution of the combined field integral equation (CFIE). The high efficiency is obtained by employing an accurate...

  6. Efficient Integrative Multi-SNP Association Analysis via Deterministic Approximation of Posteriors.

    Science.gov (United States)

    Wen, Xiaoquan; Lee, Yeji; Luca, Francesca; Pique-Regi, Roger

    2016-06-02

    With the increasing availability of functional genomic data, incorporating genomic annotations into genetic association analysis has become a standard procedure. However, the existing methods often lack rigor and/or computational efficiency and consequently do not maximize the utility of functional annotations. In this paper, we propose a rigorous inference procedure to perform integrative association analysis incorporating genomic annotations for both traditional GWASs and emerging molecular QTL mapping studies. In particular, we propose an algorithm, named deterministic approximation of posteriors (DAP), which enables highly efficient and accurate joint enrichment analysis and identification of multiple causal variants. We use a series of simulation studies to highlight the power and computational efficiency of our proposed approach and further demonstrate it by analyzing the cross-population eQTL data from the GEUVADIS project and the multi-tissue eQTL data from the GTEx project. In particular, we find that genetic variants predicted to disrupt transcription factor binding sites are enriched in cis-eQTLs across all tissues. Moreover, the enrichment estimates obtained across the tissues are correlated with the cell types for which the annotations are derived. Copyright © 2016 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.

  7. ANALYSIS OF THE ENERGY SYSTEM BALANCE EFFICIENCY PROVIDED WITH THE DIFFERENT GROUPS OF GENERATING PLANTS

    Directory of Open Access Journals (Sweden)

    O. Maksymovа

    2017-12-01

Full Text Available Methods of efficiency analysis are currently being developed and applied on the basis of optimization problems for various plant types and operating modes. The optimization criterion for these problems is usually efficiency, which can be calculated in various ways on which there is no consensus. To search for the best indicators of power plants operated within the system, a target function based on minimization of reduced cost is used, which allows options with the same useful effect to be compared. When the useful effects differ, marginal costs for the difference in useful effect are introduced into the target function. The economic criterion for selecting the best power plant is the difference between the reduced costs of the considered and the baseline options, but this approach does not allow the results to be used for long-term projections. Such an approach depends on the situation and does not reflect the real costs. The value of the target function in the technical-economic optimization method is not "marginal" and does not allow the impact of individual processes on the overall efficiency of an option to be assessed. Therefore, the development of an efficiency criterion that considers the changing needs of the energy system is relevant for analyzing power plants.

  8. Profitability and efficiency of Italian utilities: cluster analysis of financial statement ratios

    International Nuclear Information System (INIS)

    Linares, E.

    2008-01-01

    The last ten years have witnessed conspicuous changes in European and Italian regulation of public utility services and in the strategies of the major players in these fields. In response to these changes Italian utilities have made a variety of choices regarding size, presence in more or less capital-intensive stages of different value chains, and diversification. These choices have been implemented both through internal growth and by means of mergers and acquisitions. In this context it is interesting to try to establish whether there is a nexus between these choices and the performance of Italian utilities in terms of profitability and efficiency. Therefore statistical multivariate analysis techniques (cluster analysis and factor analysis) have been applied to several ratios obtained from the 2005 financial statement of 34 utilities. First, a hierarchical cluster analysis method has been applied to financial statement data in order to identify homogeneous groups based on several indicators of the incidence of costs (external costs, personnel costs, depreciation and amortization), profitability (return on sales, return on assets, return on equity) and efficiency (in the utilization of personnel, of total assets, of property, plant and equipment). Five clusters have been found. Then the clusters have been characterized in terms of the aforementioned indicators, the presence in different stages of the energy value chains (electricity and gas) and other descriptive variables (such as turnover, number of employees, assets, percentage of property, plant and equipment on total assets, sales revenues from electricity, gas, water supply and sanitation, waste collection and treatment and other services). In a second round cluster analysis has been preceded by factor analysis, in order to find a smaller set of variables. This procedure has revealed three not directly observable factors that can be interpreted as follows: i) efficiency in ordinary and financial management

  9. Business process model repositories : efficient process retrieval

    NARCIS (Netherlands)

    Yan, Z.

    2012-01-01

    As organizations increasingly work in process-oriented manner, the number of business process models that they develop and have to maintain increases. As a consequence, it has become common for organizations to have collections of hundreds or even thousands of business process models. When a

  10. Analysis of energy end-use efficiency policy in Spain

    International Nuclear Information System (INIS)

    Collado, Rocío Román; Díaz, María Teresa Sanz

    2017-01-01

The implementation of energy saving and efficiency measures entails the need to evaluate achievements in terms of energy savings and spending. This paper aims at analysing the effectiveness and economic efficiency of the energy saving measures implemented in the Energy Savings and Efficiency Action Plan (2008–2012) (EAP4+) in Spain for 2010. The lack of assessment of the energy savings achieved and the public spending allocated by the EAP4+ justifies the need for this analysis. The results show that the transport and building sectors seem to be the most important from the energy efficiency perspective. Although they did not reach the direct energy savings that were expected, there is scope for reduction with the appropriate energy measures. For the effectiveness indicator, the best performance is achieved by the public service, agriculture and fisheries, and building sectors, while in terms of energy efficiency per monetary unit, the best results are achieved by the transport, industry and agriculture sectors. The authors conclude that central, regional and local administrations need to become involved in order to obtain better estimates of the energy savings achieved and thus to inform the design of future energy efficiency measures at the lowest possible cost to citizens. - Highlights: • Energy end-use efficiency policy is analysed in terms of energy savings and spending. • The energy savings achieved by some measures are not always provided. • The total energy savings achieved by the transport and building sectors are large. • Different levels of administration should get involved in estimating energy savings.
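A sketch of the two indicators discussed above: effectiveness (savings achieved relative to savings expected) and savings per monetary unit of public spending. All figures are placeholders, not values from the Spanish plan.

    measures = {
        #            expected ktoe, achieved ktoe, public spending (M EUR) -- hypothetical
        "transport": (1500.0,        1100.0,        420.0),
        "building":  ( 900.0,         700.0,        310.0),
        "industry":  ( 600.0,         550.0,        150.0),
    }
    for sector, (expected, achieved, spending) in measures.items():
        effectiveness = achieved / expected
        savings_per_euro = achieved / spending           # ktoe per M EUR of public spending
        print(f"{sector:10s}  effectiveness {effectiveness:5.1%}   "
              f"savings/spending {savings_per_euro:5.2f} ktoe/M EUR")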

  11. Inverse Statistics and Asset Allocation Efficiency

    Science.gov (United States)

    Bolgorian, Meysam

In this paper, using inverse statistics analysis, the effect of the investment horizon on the efficiency of portfolio selection is examined. Inverse statistics analysis, also known as the probability distribution of exit times, is a general tool used for determining the distribution of the time at which a stochastic process exits a given zone. This analysis was used in Refs. 1 and 2 for studying financial returns time series. This distribution provides an optimal investment horizon which determines the most likely horizon for gaining a specific return. Using samples of stocks from the Tehran Stock Exchange (TSE) as an emerging market and the S&P 500 as a developed market, the effect of the optimal investment horizon on asset allocation is assessed. It is found that taking the optimal investment horizon into account in the TSE leads to more efficiency for large portfolios, while for stocks selected from the S&P 500, regardless of portfolio size, this strategy does not produce more efficient portfolios; rather, longer investment horizons provide more efficiency.
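A minimal sketch of the inverse-statistics idea described above: for a target return level rho, measure, for every starting day, how long it takes the cumulative log-return to first exceed rho; the mode of the resulting waiting-time distribution is the "optimal investment horizon". The prices here are a synthetic random walk, not TSE or S&P 500 data.

    import numpy as np

    rng = np.random.default_rng(1)
    log_price = np.cumsum(rng.normal(0.0003, 0.01, 5000))   # synthetic daily log-prices

    def exit_times(log_price, rho):
        times = []
        for t in range(len(log_price) - 1):
            gain = log_price[t + 1:] - log_price[t]
            hit = np.flatnonzero(gain >= rho)
            if hit.size:
                times.append(hit[0] + 1)                    # waiting time in days
        return np.asarray(times)

    tau = exit_times(log_price, rho=0.05)
    counts = np.bincount(tau)
    print(f"most likely horizon for a +5% log-return: {counts.argmax()} days")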

  12. Comparative analysis of technical efficiency for different production culture systems and species of freshwater aquaculture in Peninsular Malaysia

    Directory of Open Access Journals (Sweden)

    Abdullahi Iliyasu

    2016-05-01

    Full Text Available This study estimated the bias-corrected technical efficiency (BCTE of different culture systems and species of freshwater aquaculture in Malaysia using bootstrapping data envelopment analysis (DEA. Data were collected from 307 respondents from three states in Peninsular Malaysia using a well-structured questionnaire as well as oral interviews. The findings indicate that all technical efficiency scores for all culture systems and species are below the optimal level (i.e. one. In addition, the results show that farmers’ experience, contact with extension workers and household size have a positive and statistically significant impact on technical efficiency. This implies that farmers who have long tenure in fish farming and also the opportunity to meet with extension workers are operating close to the production frontier (technically efficient. On the other hand, the age of the farmers has a negative and statistically significant impact on technical efficiency. Although educational level and farm status have a positive impact on technical efficiency, they are statistically insignificant. Furthermore, all the inputs used in the production process of different culture systems and species contained slacks and need to be reduced accordingly. Feed, the major input in fish production and constituting over half of the production costs, is equally over-utilized. Thus, the government, in collaboration with research institutes and universities, should design a feeding formula for fish depending on species, culture systems and stages of growth. This could help to reduce production costs, increasing the farmers' income, as well as providing much needed animal protein to consumers at an affordable rate. Keywords: Bootstrapping data envelopment analysis (DEA, Technical efficiency, Technical inefficiency, Freshwater aquaculture, Malaysia
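For readers unfamiliar with the envelopment form of DEA used above, the sketch below solves a plain input-oriented CCR model as a linear program with scipy; it returns the uncorrected technical efficiency of each unit, and the bootstrap bias-correction applied in the paper is not reproduced. The data are hypothetical (two inputs, one output, four farms).

    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[120.0, 8.0], [150.0, 10.0], [90.0, 7.0], [200.0, 15.0]])  # inputs (e.g. feed, labour)
    Y = np.array([[300.0], [330.0], [250.0], [360.0]])                        # output (e.g. fish yield)

    def ccr_efficiency(k, X, Y):
        n, m = X.shape
        s = Y.shape[1]
        # decision variables: [theta, lambda_1..lambda_n]; minimise theta
        c = np.r_[1.0, np.zeros(n)]
        # sum_j lambda_j * x_j <= theta * x_k   ->  -theta*x_k + X^T lambda <= 0
        A_in = np.c_[-X[k].reshape(m, 1), X.T]
        # sum_j lambda_j * y_j >= y_k           ->  -Y^T lambda <= -y_k
        A_out = np.c_[np.zeros((s, 1)), -Y.T]
        res = linprog(c, A_ub=np.r_[A_in, A_out], b_ub=np.r_[np.zeros(m), -Y[k]],
                      bounds=[(0, None)] * (1 + n), method="highs")
        return res.x[0]

    for k in range(len(X)):
        print(f"farm {k}: technical efficiency = {ccr_efficiency(k, X, Y):.3f}")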

  13. Developing an energy efficient steam reforming process to produce hydrogen from sulfur-containing fuels

    Science.gov (United States)

    Simson, Amanda

Hydrogen-powered fuel cells have the potential to produce electricity with higher efficiency and lower emissions than conventional combustion technology. In order to realize the benefits of a hydrogen fuel cell, an efficient method to produce hydrogen is needed. Currently, over 90% of hydrogen is produced from the steam reforming of natural gas. However, for many applications, including fuel cell vehicles, the use of a liquid fuel rather than natural gas is desirable. This work investigates the feasibility of producing hydrogen efficiently by steam reforming E85 (85% ethanol/15% gasoline), a commercially available sulfur-containing transportation fuel. A Rh-Pt/SiO2-ZrO2 catalyst has demonstrated good activity for the E85 steam reforming reaction. An industrial steam reforming process is often run less efficiently, with more water and at higher temperatures, in order to prevent catalyst deactivation. Therefore, it is desirable to develop a process that can operate without catalyst deactivation at more energy-efficient conditions. In this study, the steam reforming of a sulfur-containing fuel (E85) was studied at near-stoichiometric steam/carbon ratios and at 650°C, conditions at which catalyst deactivation is normally measured. At these conditions the catalyst was found to be stable when steam reforming sulfur-free E85. However, the addition of low concentrations of sulfur significantly deactivated the catalyst. The presence of sulfur in the fuel caused catalyst deactivation by promoting ethylene formation, which generates surface carbon species (coke) that mask catalytic sites. The amount of coke increased with time on stream and became increasingly graphitic. The deactivation due to both sulfur adsorption and coke formation was reversible with air treatment at 650°C. Nevertheless, regenerations were found to reduce catalyst life. Air regenerations produce exotherms on the catalyst surface that cause structural changes to the catalyst. During regenerations the

  14. Laser processes and system technology for the production of high-efficient crystalline solar cells

    Science.gov (United States)

    Mayerhofer, R.; Hendel, R.; Zhu, Wenjie; Geiger, S.

    2012-10-01

The laser as an industrial tool is an essential part of today's solar cell production. Due to the ongoing efforts in the solar industry to increase cell efficiency, more and more laser-based processes that have been discussed and tested at lab scale for many years are now being implemented in mass production lines. In order to cope with throughput requirements, standard laser concepts have to be improved continuously with respect to available average power levels, repetition rates and beam profiles. Some of the laser concepts that showed high potential in the past couple of years will be replaced by other, more economical laser types. Furthermore, requirements for processing with smaller heat-affected zones fuel the development of industry-ready ultrashort pulsed lasers with pulse widths even below the picosecond range. In 2011, the German Ministry of Education and Research (BMBF) launched the program "PV-Innovation Alliance", with the aim of supporting the rapid transfer of high-efficiency processes out of development departments and research institutes into solar cell production lines. Here, lasers play an important role as production tools, allowing the fast implementation of high-performance solar cell concepts. We report on the results achieved within the joint project FUTUREFAB, where efficiency optimization, throughput enhancement and cost reduction are the main goals. The presentation focuses on laser processes such as selective emitter doping and ablation of dielectric layers. An indispensable part of the efforts towards cost reduction in solar cell production is the improvement of the wafer handling and throughput capabilities of the laser processing system. Therefore, the presentation also elaborates on new developments in the design of complete production machines.

  15. An efficient and reproducible process for transmission electron microscopy (TEM) of rare cell populations

    Science.gov (United States)

    Kumar, Sachin; Ciraolo, Georgianne; Hinge, Ashwini; Filippi, Marie-Dominique

    2014-01-01

Transmission electron microscopy (TEM) provides ultra-structural details of cells at the sub-organelle level. However, details of the cellular ultrastructure, organization and content of various organelles in rare populations, particularly those in suspension, such as hematopoietic stem cells (HSCs), have remained elusive. This is mainly due to the requirement of millions of cells for TEM studies. Thus, a method that allows TEM studies with low cell numbers of such rare populations is vitally needed. We describe an alternative and novel approach for TEM studies of rare cell populations. Here we performed a TEM study with only 10,000 HSCs with relative ease. In particular, tiny cell pellets were identified by Evans blue staining after PFA-GA fixation. The cell pellet was pre-embedded in agarose in a small microcentrifuge tube and processed for dehydration, infiltration and embedding. Semi-thin and ultra-thin sections showed clusters of numerous cells per section with well-preserved morphology and ultrastructural details of the Golgi complex and mitochondria. Together, this method provides an efficient, easy and reproducible process to perform qualitative and quantitative TEM analysis of limited biological samples, including cells in suspension. PMID:24291346

  16. Role of computational efficiency in process simulation

    Directory of Open Access Journals (Sweden)

    Kurt Strand

    1989-07-01

    Full Text Available It is demonstrated how efficient numerical algorithms may be combined to yield a powerful environment for analysing and simulating dynamic systems. The importance of using efficient numerical algorithms is emphasized and demonstrated through examples from the petrochemical industry.

  17. Analysis of the Chinese Market for Building Energy Efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Sha [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Evans, Meredydd [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Shi, Qing [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-03-20

    China will account for about half of the new construction globally in the coming decade. Its floorspace doubled from 1996 to 2011, and Chinese rural buildings alone have as much floorspace as all of U.S. residential buildings. Building energy consumption has also grown, increasing by over 40% since 1990. To curb building energy demand, the Chinese government has launched a series of policies and programs. Combined, this growth in buildings and renovations, along with the policies to promote green buildings, are creating a large market for energy efficiency products and services. This report assesses the impact of China’s policies on building energy efficiency and on the market for energy efficiency in the future. The first chapter of this report introduces the trends in China, drawing on both historical analysis, and detailed modeling of the drivers behind changes in floorspace and building energy demand such as economic and population growth, urbanization, policy. The analysis describes the trends by region, building type and energy service. The second chapter discusses China’s policies to promote green buildings. China began developing building energy codes in the 1980s. Over time, the central government has increased the stringency of the code requirements and the extent of enforcement. The codes are mandatory in all new buildings and major renovations in China’s cities, and they have been a driving force behind the expansion of China’s markets for insulation, efficient windows, and other green building materials. China also has several other important policies to encourage efficient buildings, including the Three-Star Rating System (somewhat akin to LEED), financial incentives tied to efficiency, appliance standards, a phasing out of incandescent bulbs and promotion of efficient lighting, and several policies to encourage retrofits in existing buildings. In the third chapter, we take “deep dives” into the trends affecting key building components

  18. Integrating Efficiency of Industry Processes and Practices Alongside Technology Effectiveness in Space Transportation Cost Modeling and Analysis

    Science.gov (United States)

    Zapata, Edgar

    2012-01-01

This paper presents past and current work in dealing with indirect industry and NASA costs when providing cost estimation or analysis for NASA projects and programs. Indirect costs, defined as those project costs removed from the actual hands-on hardware or software labor, make up most of the costs of today's complex, large-scale NASA space/industry projects. This appears to be the case across phases, from research to development to production and to the operation of the system. Space transportation is the case of interest here. Modeling and cost estimation as a process rather than a product is emphasized. Analysis as a series of belief systems in play among decision makers and decision factors is also emphasized to provide context.

  19. Application of Optical Coherence Tomography Freeze-Drying Microscopy for Designing Lyophilization Process and Its Impact on Process Efficiency and Product Quality.

    Science.gov (United States)

    Korang-Yeboah, Maxwell; Srinivasan, Charudharshini; Siddiqui, Akhtar; Awotwe-Otoo, David; Cruz, Celia N; Muhammad, Ashraf

    2018-01-01

Optical coherence tomography freeze-drying microscopy (OCT-FDM) is a novel technique that allows the three-dimensional imaging of a drug product during the entire lyophilization process. OCT-FDM consists of a single-vial freeze dryer (SVFD) affixed with an optical coherence tomography (OCT) imaging system. Unlike the conventional techniques used for predicting the product collapse temperature (Tc), such as modulated differential scanning calorimetry (mDSC) and light transmission freeze-drying microscopy, the OCT-FDM approach seeks to mimic the actual product and process conditions during the lyophilization process. However, there is limited understanding of the application of this emerging technique to the design of the lyophilization process. In this study, we investigated the suitability of the OCT-FDM technique for designing a lyophilization process. Moreover, we compared the quality attributes of the resulting lyophilized product manufactured using Tc, a critical process control parameter, as determined by OCT-FDM versus as estimated by mDSC. OCT-FDM analysis revealed the absence of collapse even for the low protein concentration (5 mg/ml) and low solid content formulation (1% w/v) studied. This was confirmed by lab-scale lyophilization. In addition, lyophilization cycles designed using Tc values obtained from OCT-FDM were more efficient, with higher sublimation rates and mass flux than the conventional cycles, since drying was conducted at a higher shelf temperature. Finally, the quality attributes of the products lyophilized using Tc determined by OCT-FDM and mDSC were similar, and product shrinkage and cracks were observed in all batches of freeze-dried products irrespective of the technique employed in predicting Tc.

  20. Analysis of Technical Efficiency of Small Holder Maize Growing ...

    African Journals Online (AJOL)

    The objective of this study was to examine the level of technical efficiency of smallholder maize producers and identify its determinants in Horo Guduru Wollega zone of Oromia Regional State, Ethiopia. A Cobb-Douglass stochastic production function model was used for the analysis. To specify technical inefficiency effects ...

  1. WE-H-BRC-04: Implement Lean Methodology to Make Our Current Process of CT Simulation to Treatment More Efficient

    Energy Technology Data Exchange (ETDEWEB)

    Boddu, S; Morrow, A; Krishnamurthy, N; McVicker, A; Deb, N; Rangaraj, D [Scott & White Hospital, Temple, TX (United States)

    2016-06-15

    Purpose: Our goal is to implement lean methodology to make our current process of CT simulation to treatment more efficient. Methods: In this study, we implemented lean methodology and tools and employed flowchart in excel for process-mapping. We formed a group of physicians, physicists, dosimetrists, therapists and a clinical physics assistant and huddled bi-weekly to map current value streams. We performed GEMBA walks and observed current processes from scheduling patient CT Simulations to treatment plan approval. From this, the entire workflow was categorized into processes, sub-processes, and tasks. For each process we gathered data on touch time, first time quality, undesirable effects (UDEs), and wait-times from relevant members of each task. UDEs were binned per frequency of their occurrence. We huddled to map future state and to find solutions to high frequency UDEs. We implemented visual controls, hard stops, and documented issues found during chart checks prior to treatment plan approval. Results: We have identified approximately 64 UDEs in our current workflow that could cause delays, re-work, compromise the quality and safety of patient treatments, or cause wait times between 1 – 6 days. While some UDEs are unavoidable, such as re-planning due to patient weight loss, eliminating avoidable UDEs is our goal. In 2015, we found 399 issues with patient treatment plans, of which 261, 95 and 43 were low, medium and high severity, respectively. We also mapped patient-specific QA processes for IMRT/Rapid Arc and SRS/SBRT, involving 10 and 18 steps, respectively. From these, 13 UDEs were found and 5 were addressed that solved 20% of issues. Conclusion: We have successfully implemented lean methodology and tools. We are further mapping treatment site specific workflows to identify bottlenecks, potential breakdowns and personnel allocation and employ tools like failure mode effects analysis to mitigate risk factors to make this process efficient.

  2. WE-H-BRC-04: Implement Lean Methodology to Make Our Current Process of CT Simulation to Treatment More Efficient

    International Nuclear Information System (INIS)

    Boddu, S; Morrow, A; Krishnamurthy, N; McVicker, A; Deb, N; Rangaraj, D

    2016-01-01

    Purpose: Our goal is to implement lean methodology to make our current process of CT simulation to treatment more efficient. Methods: In this study, we implemented lean methodology and tools and employed flowchart in excel for process-mapping. We formed a group of physicians, physicists, dosimetrists, therapists and a clinical physics assistant and huddled bi-weekly to map current value streams. We performed GEMBA walks and observed current processes from scheduling patient CT Simulations to treatment plan approval. From this, the entire workflow was categorized into processes, sub-processes, and tasks. For each process we gathered data on touch time, first time quality, undesirable effects (UDEs), and wait-times from relevant members of each task. UDEs were binned per frequency of their occurrence. We huddled to map future state and to find solutions to high frequency UDEs. We implemented visual controls, hard stops, and documented issues found during chart checks prior to treatment plan approval. Results: We have identified approximately 64 UDEs in our current workflow that could cause delays, re-work, compromise the quality and safety of patient treatments, or cause wait times between 1 – 6 days. While some UDEs are unavoidable, such as re-planning due to patient weight loss, eliminating avoidable UDEs is our goal. In 2015, we found 399 issues with patient treatment plans, of which 261, 95 and 43 were low, medium and high severity, respectively. We also mapped patient-specific QA processes for IMRT/Rapid Arc and SRS/SBRT, involving 10 and 18 steps, respectively. From these, 13 UDEs were found and 5 were addressed that solved 20% of issues. Conclusion: We have successfully implemented lean methodology and tools. We are further mapping treatment site specific workflows to identify bottlenecks, potential breakdowns and personnel allocation and employ tools like failure mode effects analysis to mitigate risk factors to make this process efficient.

  3. RAPID PROCESSING OF ARCHIVAL TISSUE SAMPLES FOR PROTEOMIC ANALYSIS USING PRESSURE-CYCLING TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Vinuth N. Puttamallesh

    2017-06-01

    The advent of mass spectrometry-based proteomics has revolutionized our ability to study proteins from biological specimens in a high-throughput manner. Unlike cell line based studies, biomedical research involving tissue specimens is often challenging due to limited sample availability. In addition, investigation of clinically relevant research questions often requires an enormous amount of time for prospective sample collection. Formalin-fixed paraffin-embedded (FFPE) archived tissue samples are a rich source of tissue specimens for biomedical research. However, there are several challenges associated with analysing FFPE samples; protein cross-linking and degradation particularly affect proteomic analysis. We demonstrate that a barocycler, which uses pressure-cycling technology, enables efficient protein extraction and processing of small amounts of FFPE tissue for proteomic analysis. We identified 3,525 proteins from six 10 µm esophageal squamous cell carcinoma (ESCC) tissue sections. The barocycler allows protein extraction and proteolytic digestion of proteins from FFPE tissue sections on par with conventional methods.

  4. Application of Data Envelopment Analysis to Measure Cost, Revenue and Profit Efficiency

    Directory of Open Access Journals (Sweden)

    Kristína Kočišová

    2014-09-01

    The literature analysing the efficiency of financial institutions has developed rapidly over recent years. Most studies have focused on the input side, analysing input technical and cost efficiency. Only a few studies have examined the output side, evaluating output technical and revenue efficiency. Both sides are relevant when evaluating the efficiency of financial institutions. The primary purpose of this paper is therefore to review a number of approaches to efficiency measurement. In particular, the concepts of cost, revenue and profit functions are discussed. We apply Data Envelopment Analysis (DEA) to a sample of Slovak and Czech commercial banks over the years 2009–2013, comparing the efficiencies obtained by minimizing cost with those obtained by maximizing revenue and profit. The results showed that average revenue efficiency was the highest of the three measures and average profit efficiency the lowest. The Czech banks were more cost, revenue and profit efficient than the Slovak banks throughout the analysed period.
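
    As a sketch of how a cost-efficiency score of the kind reported above can be computed, the snippet below solves the standard Farrell cost-minimisation DEA model under constant returns to scale for each unit and divides the minimal attainable cost by the observed cost. The banks, inputs, outputs and prices are toy values, not the Slovak and Czech sample.

        import numpy as np
        from scipy.optimize import linprog

        # Toy data: 4 banks, 2 inputs, 1 output, and the input prices each bank faces
        X = np.array([[2.0, 4.0, 3.0, 5.0],     # input 1 used by each bank
                      [3.0, 2.0, 4.0, 6.0]])    # input 2 used by each bank
        Y = np.array([[1.0, 1.5, 1.2, 1.8]])    # output produced by each bank
        W = np.array([[1.0, 1.0, 1.0, 1.0],     # price of input 1
                      [2.0, 2.0, 2.0, 2.0]])    # price of input 2
        m, n = X.shape
        s = Y.shape[0]

        def cost_efficiency(o):
            """Farrell cost efficiency of bank o (1.0 means fully cost efficient)."""
            # Variables: m cost-minimising input levels, then n intensity weights lambda
            c = np.concatenate([W[:, o], np.zeros(n)])
            A_in = np.hstack([-np.eye(m), X])            # X @ lam <= x
            A_out = np.hstack([np.zeros((s, m)), -Y])    # Y @ lam >= y_o
            res = linprog(c,
                          A_ub=np.vstack([A_in, A_out]),
                          b_ub=np.concatenate([np.zeros(m), -Y[:, o]]),
                          bounds=[(0, None)] * (m + n), method="highs")
            return res.fun / (W[:, o] @ X[:, o])         # minimal cost / observed cost

        for o in range(n):
            print(f"bank {o}: cost efficiency = {cost_efficiency(o):.3f}")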

  5. A method to determine stratification efficiency of thermal energy storage processes independently from storage heat losses

    DEFF Research Database (Denmark)

    Haller, M.Y.; Yazdanshenas, Eshagh; Andersen, Elsa

    2010-01-01

    A new method for the calculation of a stratification efficiency of thermal energy storages based on the second law of thermodynamics is presented. The biasing influence of heat losses is studied theoretically and experimentally. Theoretically, it does not make a difference if the stratification ... process is in agreement with the first law of thermodynamics. A comparison of the stratification efficiencies obtained from experimental results of charging, standby, and discharging processes gives meaningful insights into the different mixing behaviors of a storage tank that is charged and discharged ...
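
    A generic second-law stratification indicator of this kind (an illustrative form only; the paper derives its own definition together with the heat-loss correction) compares the entropy produced by mixing in the real store with the entropy that a fully mixed store would produce during the same charging, standby or discharging process:

        \eta_{strat} = 1 - \frac{\Delta S_{irr}}{\Delta S_{irr,\,mixed}}

    Here \Delta S_{irr} is the entropy production attributable to mixing in the experimental store and \Delta S_{irr,\,mixed} that of the hypothetical fully mixed store, so a perfectly stratified store scores 1 and a fully mixed store scores 0.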

  6. Technical Efficiency Analysis of Container Terminals in the Middle Eastern Region

    Directory of Open Access Journals (Sweden)

    Ebrahim Sharaf Almawsheki

    2015-12-01

    Despite an increasing number of studies on the efficiency of container terminals, their focus has mostly been on advanced and emerging markets. There are few studies on container terminals in developing countries such as those of the Middle Eastern region, which occupy a critical geographic position on the international maritime route between East and West. Information on their potential for development relative to other terminals worldwide is thus not readily available. This study evaluates the technical efficiency of 19 container terminals in the Middle Eastern region. The DEA approach is used to measure technical efficiency, and slack variable analysis identifies potential areas of improvement for inefficient terminals. The results show that the Jebel Ali, Salalah and Beirut container terminals are the most efficient in the region, and that the least efficient is the terminal in Aden. The results provide valuable information for terminal managers, helping them improve resource utilisation and achieve steady gains in operational efficiency.
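
    A minimal sketch of the input-oriented, constant-returns-to-scale DEA model that underlies such terminal rankings is given below, with input slacks read from the optimum of the radial model (a full study would maximise slacks in a second stage). The terminals, inputs and outputs are toy values, not the 19 Middle Eastern terminals.

        import numpy as np
        from scipy.optimize import linprog

        # Toy data: 4 terminals, 2 inputs (quay length in m, cranes), 1 output (annual throughput, thousand TEU)
        X = np.array([[300.0, 500.0, 400.0, 600.0],
                      [4.0, 8.0, 5.0, 9.0]])
        Y = np.array([[200.0, 350.0, 320.0, 380.0]])
        m, n = X.shape
        s = Y.shape[0]

        def ccr_input_oriented(o):
            """Radial technical efficiency theta of terminal o, plus its input slacks."""
            c = np.concatenate([[1.0], np.zeros(n)])          # minimise theta
            A_in = np.hstack([-X[:, [o]], X])                 # X @ lam <= theta * x_o
            A_out = np.hstack([np.zeros((s, 1)), -Y])         # Y @ lam >= y_o
            res = linprog(c,
                          A_ub=np.vstack([A_in, A_out]),
                          b_ub=np.concatenate([np.zeros(m), -Y[:, o]]),
                          bounds=[(None, None)] + [(0, None)] * n, method="highs")
            theta, lam = res.x[0], res.x[1:]
            slacks = theta * X[:, o] - X @ lam                # unused input after radial contraction
            return theta, slacks

        for o in range(n):
            theta, slacks = ccr_input_oriented(o)
            print(f"terminal {o}: theta = {theta:.3f}, input slacks = {np.round(slacks, 2)}")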

  7. [The application of short-term efficiency analysis in diagnosing occupational voice disorders].

    Science.gov (United States)

    Niebudek-Bogusz, Ewa; Just, Marcin; Tyc, Michał; Wiktorowicz, Justyna; Morawska, Joanna; Śliwińska-Kowalska, Mariola

    2015-01-01

    An objective determination of the range of vocal efficiency is rather difficult. The aim of the study was to assess the possibility of applying short-term acoustic efficiency analysis in the diagnosis of occupational voice disorders. The study covered 98 people (87 women and 11 men) diagnosed with occupational dysphonia through videostroboscopic examination. The control group comprised 100 people (81 women and 19 men) with normal voices. The short-term acoustic analysis was carried out by means of DiagnoScope software, including classical parameters (the jitter group, the shimmer group and the noise-to-harmonics ratio, NHR) as well as new short-term efficiency parameters determined over a short time period during sustained phonation of the vowel "a." The results were then compared. Results: The values of all the examined classical parameters were considerably higher in the study group of pathological voices than in the control group of normal voices (p = 0.00). The aerodynamic parameter, maximum phonation time, was significantly shorter, by over 0.5 s, in the study group than in the control group. The majority of the acoustic efficiency parameters were also considerably worse in the study group of subjects with occupational dysphonia than in the control group (p = 0.00). Moreover, the correlation between the efficiency parameters and most of the classical acoustic parameters in the study group implies that, for voices with occupational pathology, the decreased efficiency of the vocal apparatus is reflected in the acoustic voice structure. Efficiency parameters determined during short-term acoustic analysis can be an objective indicator of the decreased phonatory function of the larynx, useful in diagnosing occupational vocal pathology.
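
    For readers unfamiliar with the classical parameters mentioned above, the sketch below shows how local jitter and shimmer are typically computed from cycle-by-cycle pitch periods and peak amplitudes extracted from the sustained vowel. These are the generic textbook definitions, not the DiagnoScope implementation, and the arrays hold toy values.

        import numpy as np

        # Toy cycle-by-cycle measurements extracted from a sustained /a/ phonation
        periods = np.array([5.01, 5.03, 4.98, 5.02, 5.00, 4.97])      # pitch periods, ms
        amplitudes = np.array([0.80, 0.82, 0.79, 0.81, 0.80, 0.78])   # peak amplitudes, arbitrary units

        # Local jitter: mean absolute difference of consecutive periods,
        # relative to the mean period (usually reported in percent)
        jitter_local = np.mean(np.abs(np.diff(periods))) / np.mean(periods) * 100

        # Local shimmer: the analogous measure for cycle peak amplitudes
        shimmer_local = np.mean(np.abs(np.diff(amplitudes))) / np.mean(amplitudes) * 100

        print(f"jitter (local) = {jitter_local:.2f} %")
        print(f"shimmer (local) = {shimmer_local:.2f} %")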

  8. The Design and Analysis of Efficient Learning Algorithms

    Science.gov (United States)

    1991-01-01

    [Abstract not available: this record's description contains only OCR fragments of the source document's bibliography, citing Duda and Hart and the "Occam's Razor" and learnability papers of Blumer, Ehrenfeucht, Haussler and Warmuth.]

  9. Business Process Reengineering as the Method of Process Management

    Directory of Open Access Journals (Sweden)

    O. Honcharova

    2013-09-01

    The article analyses the process management approach. It reviews the main interpretations of process management and defines the notions of process and process management. Methods of business process improvement are also analysed, among them fast analysis solution technique (FAST), benchmarking, reprojecting and reengineering. The main results of applying business process improvement are described in terms of reductions in cycle time, costs and errors. The tasks of business process reengineering are identified and its main stages are outlined. Finally, the main efficiency results of business process reengineering and its success factors are determined.

  10. Analysis of Performance from Processing and Preserving of Fruit and Vegetables

    Directory of Open Access Journals (Sweden)

    APOSTOL CIPRIAN

    2017-04-01

    Given that the world population has been increasing continuously in recent years while natural resources are becoming increasingly scarce, ensuring healthy food is a global challenge. Their nutritional value, palatability and high degree of assimilation by the body make fruit and vegetables recommended and widely used foods, both fresh and preserved. The study compares the activity of manufacturing fruit and vegetable juices with the processing and preserving of fruit and vegetables, in order to highlight which of the two is more efficient from an economic and financial point of view. A descriptive and comparative analysis of specific indicators reflects the main aspects of the performance of the two sectors in Romania. The main source of information is the National Agency for Fiscal Administration. The period analysed runs from 2008, when the financial and economic crisis started in Romania, to 2015, the last year for which the necessary information was available; this makes it possible to follow the evolution of performance not only during the crisis but also after it ended. The study found that the fruit and vegetable processing industry in Romania is quite efficient and has been developing constantly, but mainly in the processing and preserving of fruit and vegetables, with the manufacture of fruit and vegetable juices recording much lower performance.

  11. [Technical efficiency of traditional hospitals and public enterprises in Andalusia (Spain)].

    Science.gov (United States)

    Herrero Tabanera, Luis; Martín Martín, José Jesús; López del Amo González, Ma del Puerto

    2015-01-01

    To assess the technical efficiency of traditional public hospitals, which have no legal identity of their own and are subject to administrative law, and that of public enterprise hospitals, which have their own legal identity and are partly governed by private law, all of them belonging to the taxpayer-funded health system of Andalusia during the period 2005-2008. The study included the 32 publicly owned hospitals in Andalusia during the period 2005-2008. The method consisted of two stages. In the first stage, the technical efficiency indices of the hospitals were calculated using Data Envelopment Analysis, and the change in total factor productivity was estimated using the Malmquist index. The results were compared according to perceived quality, and a sensitivity analysis was conducted through an auxiliary model and bootstrapping. In the second stage, a bivariate analysis was performed between hospital efficiency and organization type. Public enterprises were more efficient than traditional hospitals (on average by over 10%) in each of the study years. Nevertheless, a process of convergence was observed between the two types of organization because, while the efficiency of traditional hospitals increased slightly (by 0.50%) over the study period, the performance of public enterprises declined by over 2%. The possible reasons for the greater efficiency of public enterprises include their greater budgetary and employment flexibility. However, the convergence process observed points to a process of mutual learning that is not necessarily efficient. Copyright © 2014 SESPAS. Published by Elsevier España. All rights reserved.
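
    For reference, the Malmquist index used in the first stage is conventionally built from period-t and period-(t+1) distance functions D^t and D^{t+1} evaluated at a hospital's input-output bundles (x^t, y^t) and (x^{t+1}, y^{t+1}), and factors into an efficiency-change (catch-up) term and a technical-change (frontier-shift) term:

        M_o = \left[ \frac{D^t(x^{t+1},y^{t+1})}{D^t(x^t,y^t)} \cdot \frac{D^{t+1}(x^{t+1},y^{t+1})}{D^{t+1}(x^t,y^t)} \right]^{1/2}
            = \underbrace{\frac{D^{t+1}(x^{t+1},y^{t+1})}{D^t(x^t,y^t)}}_{\text{efficiency change}} \times \underbrace{\left[ \frac{D^t(x^{t+1},y^{t+1})}{D^{t+1}(x^{t+1},y^{t+1})} \cdot \frac{D^t(x^t,y^t)}{D^{t+1}(x^t,y^t)} \right]^{1/2}}_{\text{technical change}}

    With the usual output-oriented convention, values of M_o above one indicate productivity growth and values below one indicate decline.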

  12. Parallel scalability and efficiency of vortex particle method for aeroelasticity analysis of bluff bodies

    Science.gov (United States)

    Tolba, Khaled Ibrahim; Morgenthal, Guido

    2018-01-01

    This paper presents an analysis of the scalability and efficiency of a simulation framework based on the vortex particle method. The code is applied to the numerical aerodynamic analysis of line-like structures and runs on multicore CPU and GPU architectures using the OpenCL framework. The focus of the paper is the parallel efficiency and scalability of the method as applied to an engineering test case, specifically the aeroelastic response of a long-span bridge girder at the construction stage. The aim is to assess the optimal configuration and the required computer architecture, such that it becomes feasible to utilise the method efficiently within the computational resources available to a regular engineering office. The simulations and the scalability analysis are performed on a regular gaming-type computer.
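
    As a reminder of the quantities such a strong-scaling study reports, speed-up and parallel efficiency can be tabulated directly from measured wall-clock times. The timings below are invented placeholders, not the bridge-girder benchmark.

        # Strong scaling: the same problem solved with an increasing number of workers
        timings = {1: 400.0, 2: 210.0, 4: 115.0, 8: 65.0}   # workers -> wall-clock seconds (toy values)

        t_serial = timings[1]
        for workers, t in sorted(timings.items()):
            speedup = t_serial / t                 # S(p) = T1 / Tp
            efficiency = speedup / workers         # E(p) = S(p) / p
            print(f"{workers:2d} workers: speed-up = {speedup:4.2f}, parallel efficiency = {efficiency:4.2f}")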

  13. Evaluation of energy efficiency opportunities of a typical Moroccan cement plant: Part I. Energy analysis

    International Nuclear Information System (INIS)

    Fellaou, S.; Bounahmidi, T.

    2017-01-01

    Highlights: • We have analyzed the degrees of freedom of the overall system. • We validated the redundant measurements by the Lagrange multipliers technique. • We have analyzed the mass and the energy balances by two approaches. • We identified the factors that penalize the energetic performance of the whole plant. • We assessed options to improve the energy efficiency of the entire cement plant. - Abstract: The cement industry is one of Morocco’s most highly energy-intensive economic sectors. It suffers from abnormally high energy supply costs, which represent more than two thirds of the cost of cement; the largest items of expenditure are electricity and fuel, at 40% and 30% respectively. Therefore, much more effort is needed for the cement sector to reach the energy saving targets set by the Moroccan energy efficiency strategy. The present work aims to evaluate the energy performance of an existing Moroccan cement plant based on a detailed mass and energy balance analysis. Redundant measurements were validated by the Lagrange multipliers technique before being used for the calculation of unmeasured variables. The values for energy consumption and related losses through the whole production line are reported, and the results obtained have been used to assess the energy performance of the process. The evaluation was completed by an analysis of possible sources of energy loss and of important solutions described in the international literature to improve the energy efficiency of the entire cement plant.
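
    The redundant-measurement validation mentioned above is, in its usual linear formulation, a weighted least-squares reconciliation subject to balance constraints, and its Lagrange-multiplier solution is closed-form. The sketch below uses a single made-up mass balance over three streams, not the plant's measurements.

        import numpy as np

        # Measured mass flows (t/h) for three streams that should satisfy f1 - f2 - f3 = 0
        x_meas = np.array([100.0, 61.0, 41.5])
        Sigma = np.diag([2.0**2, 1.5**2, 1.0**2])   # measurement error variances
        A = np.array([[1.0, -1.0, -1.0]])           # linear balance constraint A @ x = 0

        # Reconciled values: minimise (x - x_meas)^T Sigma^{-1} (x - x_meas) subject to A @ x = 0.
        # Lagrange-multiplier solution: x_hat = x_meas - Sigma A^T (A Sigma A^T)^{-1} A x_meas
        residual = A @ x_meas
        correction = Sigma @ A.T @ np.linalg.solve(A @ Sigma @ A.T, residual)
        x_hat = x_meas - correction

        print("raw measurements :", x_meas)
        print("reconciled values:", x_hat)
        print("balance residual :", A @ x_hat)   # ~0 after reconciliation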

  14. Formal analysis of design process dynamics

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2010-01-01

    This paper presents a formal analysis of design process dynamics. Such a formal analysis is a prerequisite to come to a formal theory of design and for the development of automated support for the dynamics of design processes. The analysis was geared toward the identification of dynamic design

  16. Joint environmental and cost efficiency analysis of electricity generation

    International Nuclear Information System (INIS)

    Welch, Eric; Barnum, Darold

    2009-01-01

    Fossil-fuel based electricity generation produces the largest proportion of human-related carbon pollution in the United States. Hence, fuel choices by steam plants are key determinants of the industry's impact on national and global greenhouse gas emissions, and key foci for climate change policy. Yet, little research has been done to examine the economic and environmental tradeoffs among the different types of fuels that are used by these plants. This paper applies a Data Envelopment Analysis procedure that incorporates the materials balance principle to estimate the allocations of coal, gas and oil inputs that minimize carbon emissions and costs. Using EIA 906 and FERC 423 data, the paper estimates cost/carbon tradeoffs facing two sets of plants: those that use coal and gas inputs, and those that use coal, gas and oil inputs. Findings for our three-input sample show that there would be a 79% increase in cost for moving from the cost-efficient point to the carbon-efficient point, while there would be a 38% increase in carbon for moving from the carbon-efficient point to the cost-efficient point. These conclusions indicate that, in general, the gap between efficient cost and efficient environmental production is wide, and would require substantial policy intervention, technological change or market adjustment before it could be narrowed. However, our examination of individual plants shows that what is true in general is often not true for specific plants. Some plants that are currently less efficient than those on the production frontier could produce the same amount of electricity with less carbon output and less fuel input. Additionally, many plants on the production frontier could improve both cost and carbon efficiency by changing their mixture of fossil-fuel inputs. (author)

  17. An analysis of stock market efficiency: Developed vs Islamic stock markets using MF-DFA

    Science.gov (United States)

    Rizvi, Syed Aun R.; Dewandaru, Ginanjar; Bacha, Obiyathulla I.; Masih, Mansur

    An efficient market has been theoretically proven to be a key component of effective and efficient resource allocation in an economy. This paper combines econophysics with the Efficient Market Hypothesis to undertake a comparative analysis of the stock markets of Islamic and developed countries by extending the understanding of their multifractal nature. By applying Multifractal Detrended Fluctuation Analysis (MF-DFA) we calculated the generalized Hurst exponents, multifractal scaling exponents and generalized multifractal dimensions for 22 broad market indices. The findings provide a deeper understanding of the markets in Islamic countries, which show traces of highly efficient behaviour, particularly in crisis periods. A key finding is the empirical evidence of the impact of the ‘stage of market development’ on market efficiency. If Islamic countries aim to improve the efficiency of resource allocation, an important area to address is, among others, enhancing the stage of market development.
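
    A stripped-down numpy sketch of the MF-DFA estimate of the generalized Hurst exponents h(q) is given below; it uses forward segments only and simple polynomial detrending, so it is a simplification of the full procedure applied in such studies, shown here only to make the quantities concrete.

        import numpy as np

        def mfdfa_hurst(x, scales, qs, order=1):
            """Stripped-down MF-DFA: generalized Hurst exponents h(q) of series x."""
            profile = np.cumsum(np.asarray(x, float) - np.mean(x))
            hq = []
            for q in qs:
                log_fq = []
                for s in scales:
                    n_seg = len(profile) // s
                    f2 = []                      # squared fluctuation of each segment
                    for v in range(n_seg):
                        seg = profile[v * s:(v + 1) * s]
                        t = np.arange(s)
                        trend = np.polyval(np.polyfit(t, seg, order), t)
                        f2.append(np.mean((seg - trend) ** 2))
                    f2 = np.asarray(f2)
                    if q == 0:                   # q = 0 requires a logarithmic average
                        fq = np.exp(0.5 * np.mean(np.log(f2)))
                    else:
                        fq = np.mean(f2 ** (q / 2.0)) ** (1.0 / q)
                    log_fq.append(np.log(fq))
                slope, _ = np.polyfit(np.log(scales), log_fq, 1)   # h(q) is the scaling slope
                hq.append(slope)
            return np.array(hq)

        # Uncorrelated returns should give h(2) close to 0.5; persistence pushes it above 0.5
        returns = np.random.default_rng(0).standard_normal(4096)
        print(mfdfa_hurst(returns, scales=[16, 32, 64, 128, 256], qs=[-2, 0, 2, 4]))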

  18. Transcriptome Analysis of Maize Immature Embryos Reveals the Roles of Cysteine in Improving Agrobacterium Infection Efficiency

    Science.gov (United States)

    Liu, Yan; Zhang, Zhiqiang; Fu, Junjie; Wang, Guoying; Wang, Jianhua; Liu, Yunjun

    2017-01-01

    Maize Agrobacterium-mediated transformation efficiency has been greatly improved in recent years. Antioxidants such as cysteine can significantly improve maize transformation frequency by improving the Agrobacterium infection efficiency. However, the mechanism underlying the transformation improvement after cysteine exposure has not been elucidated. In this study, we showed that the addition of cysteine to the co-cultivation medium significantly increased the Agrobacterium infection efficiency of hybrid HiII and inbred line Z31 maize embryos. Reactive oxygen species contents were higher in embryos treated with cysteine than in those without cysteine. We further investigated the mechanism behind the cysteine-related increase in infection efficiency using transcriptome analysis. The results showed that the cysteine treatment up-regulated 939 genes and down-regulated 549 genes in both Z31 and HiII. Additionally, more differentially expressed genes were found in HiII embryos than in Z31 embryos, suggesting that HiII was more sensitive to the cysteine treatment than Z31. GO analysis showed that the up-regulated genes were mainly involved in the oxidation-reduction process. The up-regulation of these genes could help maize embryos cope with the oxidative stress stimulated by Agrobacterium infection. The down-regulated genes were mainly involved in cell wall and membrane metabolism, for example aquaporin and expansin genes. Decreased expression of these cell wall integrity genes could loosen the cell wall, thereby improving the entry of Agrobacterium into plant cells. This study offers insight into the role of cysteine in improving Agrobacterium-mediated transformation of maize immature embryos. PMID:29089955

  19. Conceptual design of coke-oven gas assisted coal to olefins process for high energy efficiency and low CO2 emission

    International Nuclear Information System (INIS)

    Man, Yi; Yang, Siyu; Zhang, Jun; Qian, Yu

    2014-01-01

    Highlights: • A novel coke-oven gas assisted coal-to-olefins (GaCTO) process is proposed. • GaCTO has higher energy efficiency and emits less CO2 than the conventional coal-to-olefins process. • GaCTO proposes an idea of using surplus coke-oven gas to produce value-added products. - Abstract: Olefins are among the most important platform chemicals. Developing coal-to-olefins (CTO) processes is regarded as one of the promising alternatives to the oil-to-olefins route. However, CTO suffers from high CO2 emissions due to the high carbon content of coal. In China, about 7 × 10^10 m^3 of coke-oven gas (COG) is produced in coke plants annually, yet most of this hydrogen-rich COG is used as fuel or discharged directly into the air. This wastes a precious hydrogen resource, causes serious economic loss and leads to serious environmental pollution. This paper proposes a novel co-feed process in which COG assists CTO: the CH4 in the COG reacts with CO2 in a Dry Methane Reforming unit to reduce emissions, while the Steam Methane Reforming unit produces H2-rich syngas, and the H2 in the COG can adjust the H/C ratio of the syngas. The analysis shows that the energy efficiency of the co-feed process increases by about 10%, while the life-cycle carbon footprint is reduced by around 85% in comparison with the conventional CTO process. The co-feed process would become economically sustainable if the carbon tax were higher than 150 CNY/t CO2.
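
    To make the H/C-adjustment idea concrete: with idealised full conversion, dry methane reforming (CH4 + CO2 -> 2 CO + 2 H2) yields syngas with an H2/CO ratio of 1, steam methane reforming (CH4 + H2O -> CO + 3 H2) yields a ratio of 3, and hydrogen taken directly from the COG can push the blend further toward the ratio required downstream. The flow rates below are made-up placeholders, not the flowsheet values of the paper.

        # Idealised, full-conversion estimate of the blended syngas H2/CO ratio (toy flows, kmol/h)
        ch4_to_dry_reforming = 100.0     # CH4 + CO2 -> 2 CO + 2 H2
        ch4_to_steam_reforming = 150.0   # CH4 + H2O -> CO + 3 H2
        h2_from_cog = 80.0               # hydrogen taken directly from the coke-oven gas

        co = 2 * ch4_to_dry_reforming + 1 * ch4_to_steam_reforming
        h2 = 2 * ch4_to_dry_reforming + 3 * ch4_to_steam_reforming + h2_from_cog

        print(f"CO = {co:.0f} kmol/h, H2 = {h2:.0f} kmol/h, H2/CO = {h2 / co:.2f}")
        # CO2 consumed by dry reforming (one mole per mole of CH4) is the emission-reduction lever
        print(f"CO2 consumed in dry reforming = {ch4_to_dry_reforming:.0f} kmol/h")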

  20. Analysis and evaluation in the production process and equipment area of the low-cost solar array project

    Science.gov (United States)

    Goldman, H.; Wolf, M.

    1979-01-01

    Analyses of slicing processes and junction formation processes are presented. A simple method for evaluating the relative economic merits of competing process options with respect to the cost of energy produced by the system is described. An energy consumption analysis was developed and applied to determine the energy consumption in the solar module fabrication process sequence, from the mining of the SiO2 to shipping. The analysis shows that current technology practice involves inordinate energy use in the purification step and large wastage of the invested energy through losses, particularly poor conversion in slicing and inadequate yields throughout. The cell process energy expenditures already show a downward trend based on increased throughput rates. Larger improvements, however, depend on the introduction of a more efficient purification process and of acceptable ribbon-growing techniques.
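
    The bookkeeping behind such an energy consumption analysis can be sketched in a few lines: the energy charged to one shipped unit at each step is that step's energy divided by the product of its own and all downstream yields, so losses late in the sequence multiply the energy wasted upstream. The step names, energies and yields below are illustrative placeholders, not the study's figures.

        # Illustrative process sequence: (step, energy spent per unit processed [kWh], step yield)
        steps = [
            ("SiO2 reduction", 50.0, 0.95),
            ("purification", 400.0, 0.90),
            ("crystal growth", 150.0, 0.95),
            ("slicing", 30.0, 0.60),       # poor conversion of ingot to wafers dominates the waste
            ("cell process", 40.0, 0.90),
            ("module assembly", 20.0, 0.98),
        ]

        energy_per_shipped_unit = 0.0
        for i, (name, energy, _) in enumerate(steps):
            # A unit that survives to shipping must pass this step and every later one,
            # so step i's energy is charged against the product of its own and downstream yields.
            downstream_yield = 1.0
            for _, _, y in steps[i:]:
                downstream_yield *= y
            charged = energy / downstream_yield
            energy_per_shipped_unit += charged
            print(f"{name:16s}: {energy:6.1f} kWh spent, {charged:6.1f} kWh charged per shipped unit")

        print(f"total energy charged to one shipped unit: {energy_per_shipped_unit:.0f} kWh")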